Tags: javascript, math, floating-point, min, negative-zero

Why does Math.min() return -0 from [+0, 0, -0]?


I know that (-0 === 0) evaluates to true, so I am curious why Math.min behaves as if -0 < 0.
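For context, a quick way to see that -0 and 0 compare as equal yet remain distinct values is to use Object.is or division:

```javascript
// -0 and +0 are equal under === and <, but they are distinguishable:
console.log(-0 === 0);         // true
console.log(-0 < 0);           // false
console.log(Object.is(-0, 0)); // false — Object.is tells them apart
console.log(1 / -0);           // -Infinity — the sign survives division
```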

When I run this code in the Stack Overflow snippet console, it logs 0.

    const arr = [+0, 0, -0];
    console.log(Math.min(...arr));

But when I run the same code in the browser console, it returns -0. Why is that? I have searched on Google but didn't find anything useful. This question may not have a practical use case; I just want to understand how JavaScript calculates it.

    const arr = [+0, 0, -0];
    console.log(Math.min(...arr)); // -0

Solution

  • -0 is not less than 0 or +0; both -0 < 0 and -0 < +0 return false. You're conflating the behavior of Math.min with the result of comparing -0 to 0/+0.

    The specification of Math.min is clear on this point:

    b. If number is -0𝔽 and lowest is +0𝔽, set lowest to -0𝔽.

    Without this exception, the result of Math.min and Math.max would depend on the order of their arguments, which would be surprising: you probably want Math.min(x, y) to always equal Math.min(y, x). That is one plausible justification.

    Note: This exception was already present in the 1997 specification of Math.min(x, y), so it is not something that was added later on.
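To make the spec step concrete, here is a simplified sketch of the Math.min algorithm (an illustration only; the real Math.min also applies ToNumber coercion to each argument), including a check that the special case keeps the result order-independent:

```javascript
// Simplified sketch of the spec's Math.min loop, assuming numeric arguments.
function minSketch(...values) {
  let lowest = Infinity;
  for (const number of values) {
    if (Number.isNaN(number)) return NaN;
    // The spec's special case: if number is -0 and lowest is +0,
    // set lowest to -0, even though -0 < +0 is false.
    if (Object.is(number, -0) && Object.is(lowest, +0)) {
      lowest = -0;
    } else if (number < lowest) {
      lowest = number;
    }
  }
  return lowest;
}

console.log(Object.is(minSketch(+0, 0, -0), -0)); // true
// The special case makes the result independent of argument order:
console.log(Object.is(Math.min(0, -0), Math.min(-0, 0))); // true
```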