In the repository code, in a module developed by another team, I discovered that a price is converted from cents to euros by simply dividing the number by 100.
The code is in JavaScript, so numbers follow the IEEE 754 double-precision (binary64) standard.
I know that handling money values as floating-point numbers is not safe, but before sending the task back to the other team I was wondering whether this particular case is safe.
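The pattern boils down to something like this (the identifiers are hypothetical, not the actual repository code):

    // Hypothetical sketch of the conversion in question.
    const priceInCents = 1999;               // integer cents, e.g. from the backend
    const priceInEuros = priceInCents / 100; // 19.99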
So far, I haven't found any case where dividing an integer by 100 gives an inaccurate result. Let's go further: 100 is just 2*2*5*5. We know that dividing a number by 2 is safe, since (barring underflow) it only decrements the binary exponent, the equivalent of a shift by one position. So we can say that if there exists a number that is not accurately divisible by 5, then division by 100 is not accurate either.
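To see the "shift" claim concretely, here is a bit-level peek (the bits helper is my own, not repository code):

    // Show a double's raw bits: sign (1) | exponent (11) | mantissa (52).
    function bits(x) {
      const view = new DataView(new ArrayBuffer(8));
      view.setFloat64(0, x);
      return view.getBigUint64(0).toString(2).padStart(64, "0");
    }
    console.log(bits(1.23));
    console.log(bits(1.23 / 2)); // identical except the exponent field is one lower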
I ran many tests and never found a number for which the division is inaccurate, but I'm far from a theoretical proof of the claim.
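A sketch of the kind of brute-force check I ran (the bound and the integer-arithmetic reference string are my own choices):

    // Compare float division against a string built with pure integer math.
    for (let cents = 0; cents <= 1_000_000; cents++) {
      const viaFloat = (cents / 100).toFixed(2);
      const whole = (cents - (cents % 100)) / 100;          // exact integer math
      const viaInts = `${whole}.${String(cents % 100).padStart(2, "0")}`;
      if (viaFloat !== viaInts) console.log("inaccurate at", cents);
    }
    // Never logged anything in my runs.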
So, is dividing a number by 100 safe in the IEEE 754 standard?
A decimal floating-point number with 15 significant digits of precision converts to a 64-bit binary floating-point number (Number in JavaScript) and back to decimal without loss of precision. Although the binary number may not store the decimal number exactly, it has more bits of precision (a minimum of 17 significant decimal digits is required to represent a 53-bit mantissa) and converts with rounding back to the original decimal exactly. Those extra binary digits of mantissa are there precisely to keep those 15 significant decimal digits exact in all results of CPU arithmetic. See Number of Digits Required For Round-Trip Conversions for full details.
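A quick console demonstration of both halves of that claim (the toPrecision widths are chosen to match the digit counts above):

    // 17 significant digits expose that the stored double is not exactly 0.1...
    console.log((0.1).toPrecision(17)); // 0.10000000000000001
    // ...but 15 significant digits round-trip to the original decimal exactly.
    console.log((0.1).toPrecision(15)); // 0.100000000000000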
When you divide by 100, the binary result still has 53 bits of precision, with a possible error in the unit in the last place (the lowest bit of the mantissa), unless the result underflows to 0 (see What Every Computer Scientist Should Know About Floating-Point Arithmetic for full details). That binary number still converts with rounding to the correct decimal number within 15 significant decimal digits of precision.
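The underflow exception is easy to trigger, since Number.MIN_VALUE is the smallest positive (subnormal) double:

    // Dividing the smallest subnormal by 100 has no representable nonzero
    // result, so it rounds to 0 and all precision is lost.
    console.log(Number.MIN_VALUE);       // 5e-324
    console.log(Number.MIN_VALUE / 100); // 0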
In other words, if your decimal numbers have no more than 15 significant digits, then dividing them by 100 keeps that precision.
E.g. try 123456789012345 / 100 and 0.000123456789012345 / 100 in your browser console (both these numbers have 15 significant decimal digits of precision); the divisions return the correct decimal numbers within 15 significant decimal digits:

    123456789012345 / 100
    1234567890123.45
    0.000123456789012345 / 100
    0.00000123456789012345
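Past roughly 15 significant digits the guarantee runs out, and the problem appears before the division does: the literal itself is already rounded on parsing (the specific 17-digit value below is just an illustration):

    console.log(Number.isSafeInteger(12345678901234567)); // false: above 2^53
    console.log(12345678901234567 / 100); // last digits no longer match the literal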