javascript c# float32

JavaScript Float32Array push changes value


I recently found some weird behaviour in JS, and I could use some insight on it.

I have a JS script that I need to translate into C#. In the JS I have:

values = new Float32Array(n).fill(0) (fill(0) just fills the array with zeros). I compute several values; one of them is 0.48214292933260094 (stored in another Float32Array).

Now when I do a values.push() of this value, the value stored in the array is 0.48214292526245117, which is close but not the same...

When doing the same thing in C#, the value stays the same (whether as float, double, or decimal).

Does anyone have a hint why? The discrepancy appears around the 7th digit, so I guess it has to do with floating-point error, yet I use Float32Array for both objects... (Further on in the code I filter the values depending on whether they are == 0, so this creates divergence between JS and C#.)

I tried different data types in C# (double, float, decimal); C# stays consistent when transferring a value from one list to another, JS is not. The weirdest part is that the JS script works and the C# one doesn't...


Solution

  • C# stay consistent when transfering value from one list to another.

    No — you can try the following code in C#:

    // (double)(float)0.48214292933260094 = 0.48214292526245117
    var v = new List<float> { (float)0.48214292933260094 };
    Console.WriteLine((double)v[0]);
    

    The result you get in JS is because a Float32Array stores each element as a 32-bit float, while ordinary numbers in JS are 64-bit doubles. Storing the number rounds it to 32-bit precision, and reading it back widens the rounded value to a double — which is exactly what the C# cast above does.
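    To get matching behaviour on the JS side, a sketch: Math.fround applies the same 32-bit rounding as the C# (float) cast, so applying it before any comparison (such as the == 0 filter mentioned in the question) keeps both implementations in sync.

    ```javascript
    // Math.fround rounds a 64-bit JS number to the nearest 32-bit float,
    // mirroring the C# cast (float)0.48214292933260094 above.
    const v = 0.48214292933260094;
    const rounded = Math.fround(v);
    console.log(rounded); // 0.48214292526245117 -- matches the C# result
    // Do equality checks on values that have gone through the same
    // 32-bit rounding on both sides, or the two programs will diverge.
    ```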