When running
console.log(new Intl.NumberFormat('en-US', {
  minimumFractionDigits: 0,
  maximumFractionDigits: 0,
  maximumSignificantDigits: 3,
  minimumSignificantDigits: 1
}).format(10.123456789));
I would expect the output to be 10. Instead, for some reason, it outputs 10.1, which breaks the maximumFractionDigits: 0 constraint. What's going on? Since this constraint is ignored consistently across browsers, it seems to be per the specification, but I just can't fathom a reason for it.
Self-answering this, as roundingPriority has been added since I asked this question.
The fraction-digit properties (minimumFractionDigits/maximumFractionDigits) and the significant-digit properties (minimumSignificantDigits/maximumSignificantDigits) are both ways of controlling how many digits are formatted. If both are specified at the same time, they can conflict.
These conflicts are resolved using the roundingPriority property. By default, this has a value of "auto", which means that if either minimumSignificantDigits or maximumSignificantDigits is specified, the fractional and integer digit properties will be ignored.
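In other words, the question's output is exactly this documented "auto" behaviour. Spelling the default out explicitly (a small sketch, equivalent to the snippet in the question) produces the same 10.1:
console.log(new Intl.NumberFormat('en-US', {
  minimumFractionDigits: 0,
  maximumFractionDigits: 0,
  maximumSignificantDigits: 3,
  minimumSignificantDigits: 1,
  roundingPriority: 'auto' // the default: the significant-digit settings win
}).format(10.123456789));
// "10.1"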
So, using roundingPriority: 'lessPrecision', which keeps whichever result is the less precise of the two, will respect the maximumFractionDigits constraint:
console.log(new Intl.NumberFormat('en-US', {
  minimumFractionDigits: 0,
  maximumFractionDigits: 0,
  maximumSignificantDigits: 3,
  minimumSignificantDigits: 1,
  roundingPriority: 'lessPrecision'
}).format(10.123456789));
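This now logs 10, since the zero-fraction-digit result is the less precise of the two. For contrast, roundingPriority also accepts 'morePrecision', which keeps whichever constraint yields the more precise result; a quick sketch with the same options and only the priority changed:
console.log(new Intl.NumberFormat('en-US', {
  minimumFractionDigits: 0,
  maximumFractionDigits: 0,
  maximumSignificantDigits: 3,
  minimumSignificantDigits: 1,
  roundingPriority: 'morePrecision'
}).format(10.123456789));
// "10.1": the 3-significant-digit result wins, matching the default 'auto' behaviour here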
Browser support for the options.roundingPriority parameter: Chrome 106, Firefox 116, Safari 15.4.