Tags: ios, swift, decimal, nsdecimalnumber

How to parse Decimal from String without losing precision?


I am dealing with money amounts in my app, so precision is obviously very important, and Decimal/NSDecimalNumber is supposed to be the way to handle money in Swift. I want to convert between Decimal and String without losing precision (parsing API responses, storing to a DB, ...).

This is my current solution:

private let dotLocaleDictionary = [NSLocale.Key.decimalSeparator: "."]

func apiStringToDecimal(_ string: String) -> Decimal? {
    let number = NSDecimalNumber(string: string, locale: dotLocaleDictionary) as Decimal
    return number.isNaN ? nil : number
}

func decimalToApiString(_ decimal: Decimal) -> String {
    return (decimal as NSDecimalNumber).description(withLocale: dotLocaleDictionary)
}

The locale dictionary takes care of the decimal separator, and I expected the string initializer to be exact. But while playing around in a playground I noticed that beyond a certain input length, the resulting numbers get cut off.

I also tried parsing the strings with a number formatter, but the results were even worse.

My playground code:

let small = "1.135160000500009000100020003000400050061111"
let big = "1234567890123456789000100020003000400061111"
let medium = "123456789123456789.0001000200030004000511111"
let negative = "-123456789123456789.0001000200030004000511111"
let numericStrings = [small, big, medium, negative]

numericStrings.forEach {
    print($0)
    guard let decimal = apiStringToDecimal($0) else {
        print("Failed to parse decimal from: \($0)")
        print("--------------------")
        return
    }

    print(decimal)
    let string = decimalToApiString(decimal)
    print($0 == string ? "match" : "mismatch")

    print("--------------------")
}

let max = Decimal.greatestFiniteMagnitude
print("MAX decimal: \(max)")
print((max as NSDecimalNumber).description(withLocale: dotLocaleDictionary))
print(decimalToApiString(max))

And the output it produces:

1.135160000500009000100020003000400050061111
1.13516000050000900010002000300040005006
mismatch
--------------------
1234567890123456789000100020003000400061111
1234567890123456789000100020003000400060000
mismatch
--------------------
123456789123456789.0001000200030004000511111
123456789123456789.000100020003000400051
mismatch
--------------------
-123456789123456789.0001000200030004000511111
-123456789123456789.000100020003000400051
mismatch
--------------------
MAX decimal: 3402823669209384634633746074317682114550000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
3402823669209384634633746074317682114550000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
3402823669209384634633746074317682114550000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000

Converting from Decimal to String seems to work fine; String to Decimal seems to be the issue. Since my numbers are smaller than the maximum Decimal, I don't think my test numbers are simply too large. I'm pretty sure I have ruled out printing problems, and I have run the code on an actual device, so this is not a playground issue.

Am I doing something wrong? Is there a better way to convert between Decimal and String in Swift?

Or are the strings simply too long? (I have not found any documentation/discussion on why this would not work.)


Solution

  • From NSDecimalNumber

    NSDecimalNumber, an immutable subclass of NSNumber, provides an object-oriented wrapper for doing base-10 arithmetic. An instance can represent any number that can be expressed as mantissa x 10^exponent where mantissa is a decimal integer up to 38 digits long, and exponent is an integer from –128 through 127.

    (emphasis mine)

    While NSDecimalNumber provides decimal arithmetic with high precision, that precision has a limit: the mantissa holds at most 38 significant decimal digits, so longer inputs are rounded on parsing. That is exactly what your output shows — every parsed value is cut off after 38 significant digits.

    You probably need a dedicated arbitrary-precision library, for example BigNum.
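If you only need to *detect* inputs that exceed the 38-digit limit (rather than compute with them), one possible workaround is to round-trip the parsed value back to a string and reject inputs that don't survive. A minimal sketch, assuming API strings in canonical form — the `losslessDecimal` helper below is hypothetical, not part of Foundation, and the round-trip check also rejects otherwise-valid spellings such as trailing zeros ("1.10") or a leading "+":

```swift
import Foundation

private let dotLocale = [NSLocale.Key.decimalSeparator: "."]

// Hypothetical helper: parse the string, then convert back and
// compare with the input to verify no digits were rounded away.
func losslessDecimal(from string: String) -> Decimal? {
    let number = NSDecimalNumber(string: string, locale: dotLocale) as Decimal
    guard !number.isNaN else { return nil }
    let roundTrip = (number as NSDecimalNumber).description(withLocale: dotLocale)
    // Caveat: inputs like "1.10" fail here too, because their
    // canonical description ("1.1") differs from the original.
    return roundTrip == string ? number : nil
}

losslessDecimal(from: "123.45")                          // fits in 38 digits: returns the Decimal
losslessDecimal(from: String(repeating: "9", count: 40)) // 40 digits: rounded, so returns nil
```

This keeps the question's own locale-dictionary approach and simply refuses values that NSDecimalNumber cannot represent exactly, instead of silently truncating them.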