javascript, bit-manipulation, bitwise-operators, sharedarraybuffer

JavaScript: SharedArrayBuffer and bitwise operations returning a 32-bit instead of a 16-bit number


JavaScript, 2020: I've never worked with SharedArrayBuffer (or bits) and I have some questions. Basically, I want to store a set of bools and a small counter (4-bit) in a single Int16Array element, but as I manipulate the slot of memory, it looks like it changes to 32 bits.

const cellbinaryMaintain = 15; // should be the last bit position
const bitArray = new SharedArrayBuffer(bufferMemoryAllocation); // double the number of cells
const mainGrid = new Int16Array(bitArray); // 16-bit ints

and down in a for(i) loop:

mainGrid[i] = setBitToOne(mainGrid[i], cellbinaryMaintain);

where the 'setBit' functions are:

function setBitToOne(what, bitPos) {
    return what | (1 << bitPos);
}

function setBitToZero(what, bitPos) {
    const mask = ~(1 << bitPos);
    return what & mask;
}

but that results in:

decimal: -32768

in binary: 11111111111111111000000000000000

when the result I'm seeking is a 16-bit number with only its last bit set:

decimal: 32768

in binary: 1000000000000000

I'm not used to working with bitwise operators; do I need to mask out the upper 16 bits? I've tried passing the index to the function and working without reassigning the value to a new variable, but that still didn't work:

(index, bitPos) // same 11111111111111111000000000000000
{
    console.log(mainGrid[index], "value is"); // value is 0
    console.log(bitPos, "bit is");            // bit is 15
    mainGrid[index] |= (1 << bitPos);
}

and when I console out the mask I'm applying, I of course get:

decimal: 32768

binary: 1000000000000000

exactly what I want. The same goes for the code below:

let num = 0;
let result = (num |= 1 << 15);

console.log(result, "result");
bLog(result); //binary log

which seems to work... So what is happening here? Why does this not work with an Int16Array?
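As a stand-alone repro of what I'm seeing (a plain Int16Array used here for brevity; a SharedArrayBuffer-backed one behaves the same):

```javascript
// The same bit operation on a plain number vs an Int16Array slot:
let num = 0;
num |= 1 << 15;
console.log(num); // 32768 - on a plain number, bit 15 is just a value bit

const arr = new Int16Array(1);
arr[0] |= 1 << 15;
console.log(arr[0]); // -32768 - same stored bits, read back as a signed 16-bit value
```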


Solution

  • The values in mainGrid are still OK: the bits that you put into them are in there, and no extras (after all, there's no room for them to fit). However, the values get printed in a funny way, because loading an element from an Int16Array means (by definition) that the top bit is interpreted as having a value of -32768, instead of the +32768 it would have for a Uint16Array. The practical consequence is that the value gets sign-extended when loaded (the whole Number thing in JavaScript complicates the story, but not in a way that matters here).
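    A minimal sketch of that behavior, with two views over the same buffer (a plain ArrayBuffer is used here for brevity; SharedArrayBuffer behaves identically):

    ```javascript
    const buf = new ArrayBuffer(2);        // one 16-bit slot
    const signed = new Int16Array(buf);
    const unsigned = new Uint16Array(buf); // same bytes, unsigned interpretation

    signed[0] |= (1 << 15);                // set the top bit

    console.log(signed[0]);                // -32768 (top bit read as the sign bit)
    console.log(unsigned[0]);              // 32768 (same 16 bits, read unsigned)
    console.log(signed[0] & 0xFFFF);       // 32768 (manual mask recovers the unsigned value)
    ```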

    Using Uint16Array is less confusing, as no sign-extension occurs. Alternatively you could manually mask out the extra bits before printing, by using:

    value & 0xFFFF
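
    For example, a small binary-print helper in the spirit of the question's bLog (the name toBin16 is made up here) could apply that mask before formatting, so the top bit prints as expected:

    ```javascript
    // Mask to 16 bits, then format as a zero-padded binary string.
    function toBin16(value) {
      return (value & 0xFFFF).toString(2).padStart(16, "0");
    }

    console.log(toBin16(-32768)); // "1000000000000000"
    console.log(toBin16(5));      // "0000000000000101"
    ```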