I've read a lot that NSDecimalNumber is the best format to use when working with currency. However, I'm still getting floating point issues. For example:
let a: NSDecimalNumber = 0.07 //0.07000000000000003
let b: NSDecimalNumber = 7.dividing(by: 100) //0.06999999999999999
I know I could use Decimal, and b would be what I'm expecting:
let b: Decimal = 7 / 100 //0.07
I'm using Core Data in my app, so I'm stuck with NSDecimalNumber, unless I want to convert a lot of NSDecimalNumbers to Decimals.
Can someone help me get 0.07?
The problem is that you're effectively doing floating point math (with the problems it has faithfully capturing fractional decimal values in a Double) and creating a Decimal (or NSDecimalNumber) from a Double value that has already introduced this discrepancy. Instead, you want to create your Decimal values before doing your division (or before having a fractional Double value, even if a literal).
So, the following is equivalent to your example, whereby it builds a Double representation (with the limitations that entails) of 0.07, and you end up with a value that is not exactly 0.07:
let value = Decimal(7.0 / 100.0) // or NSDecimalNumber(value: 7.0 / 100.0)
Whereas this does not suffer that problem, because we are dividing a decimal 7 by a decimal 100:
let value = Decimal(7) / Decimal(100) // or NSDecimalNumber(value: 7).dividing(by: 100)
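As a quick sanity check (a self-contained sketch you can run in a playground), both of those forms print exactly 0.07:

import Foundation

let decimalValue = Decimal(7) / Decimal(100)
print(decimalValue) // 0.07

let decimalNumberValue = NSDecimalNumber(value: 7).dividing(by: NSDecimalNumber(value: 100))
print(decimalNumberValue) // 0.07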
Or, other ways to create the 0.07 value while avoiding Double in the process include using strings:
let value = Decimal(string: "0.07") // or NSDecimalNumber(string: "0.07")
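(Note that Decimal(string:) is failable and returns an optional, whereas NSDecimalNumber(string:) returns notANumber rather than nil for malformed input.)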
Or specifying the significand (mantissa) and exponent, where the value is significand × 10^exponent (here, 7 × 10⁻² = 0.07):
let value = Decimal(sign: .plus, exponent: -2, significand: 7) // or NSDecimalNumber(mantissa: 7, exponent: -2, isNegative: false)
Bottom line, avoid Double representations entirely when using Decimal (or NSDecimalNumber), and you won't suffer the problem you described.
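As for the Core Data concern: Decimal and NSDecimalNumber are bridged in Swift, so moving between them is just a cast (or the decimalValue property), with no lossy trip through Double. A minimal sketch, assuming an NSDecimalNumber-backed Core Data attribute:

import Foundation

let price = Decimal(string: "0.07")! // exact decimal value
let boxed = price as NSDecimalNumber // bridge Decimal -> NSDecimalNumber (e.g. to assign to a Core Data property)
let unboxed = boxed.decimalValue // and back again, still exactly 0.07
print(boxed, unboxed) // 0.07 0.07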