I know that JavaScript has both a normal zero 0 (known as positive zero, +0) and a negative zero -0; however, I have never come across a situation where I had to use -0.
There are some existing posts on Stack Overflow about how positive and negative zeros are similar or different, but none of them explain real-life use cases or examples.
Assume we're studying the function y = 1/x and we'd like to know how it behaves when x is small. Let's take x = 1, x = 0.1, x = 0.01, and so on, and calculate the function:
x = 1;
while (x) {
  x /= 10;  // keep shrinking x until it underflows to 0
  document.write(x + ' ' + 1/x + '<br>');
}
As you can see, the function approaches positive infinity: 1/x eventually equals Infinity, because at some point x gets so small that it is indistinguishable from 0, and 1/0 = Infinity. Note that this is the "positive" Infinity, that is, "a very big number".
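If you want to see that underflow directly, here is a quick console check (standard JavaScript behaviour, since numbers are IEEE 754 doubles; nothing beyond built-ins is assumed):

// Number.MIN_VALUE is the smallest representable positive number;
// dividing it by 10 underflows to (positive) zero.
console.log(Number.MIN_VALUE);            // 5e-324
console.log(Number.MIN_VALUE / 10);       // 0
// Dividing by that positive zero gives positive Infinity.
console.log(1 / (Number.MIN_VALUE / 10)); // Infinity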
Now, let's start with x = -1 instead of x = 1:
x = -1;
while (x) {
  x /= 10;
  document.write(x + ' ' + 1/x + '<br>');
}
The answer is now -Infinity, that is, the function approaches negative infinity, "a very small number". Of course this is also correct, but how did the computer get that, when we just learned that 1/0 = (positive) Infinity? The secret is that the zero in the last snippet is actually negative: on the last iteration x is -0, not 0, and 1/-0 gives -Infinity. Without the signed zero, the last snippet would give an incorrect result.
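That is the practical point: -0 compares equal to 0 with ===, yet it still carries the sign, and division is one way to recover it. A short sketch, using only standard built-ins:

console.log(0 === -0);                  // true: strict equality can't tell them apart
console.log(Object.is(0, -0));          // false: Object.is can
console.log(1 / 0);                     // Infinity
console.log(1 / -0);                    // -Infinity
console.log(-Number.MIN_VALUE / 10);    // -0, the same underflow as in the second loop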
Hope that explains it a bit.