Tags: swift, floating-point, decimal, precision, floating-accuracy

Mystery behind presentation of Floating Point numbers


I was testing a simple solution for my app and ran into a case that raised a question: why is one floating-point number represented in JSON correctly (as I expect) while another one is not?

In this case, converting the number "98.39" from String to Decimal and then to JSON is perfectly predictable from a human point of view, but the number "98.40" doesn't look so beautiful...

My question is: could someone please explain why the conversion from String to Decimal works as I expect for one floating-point number but not for another?

I have read a lot about floating-point error, but I can't figure out how the process from String -> ...binary-based conversion... -> Double ends up with different precision in the two cases.


My playground code:

struct Price: Encodable {
    let amount: Decimal
}

func printJSON(from string: String) {
    let decimal = Decimal(string: string)!
    let price = Price(amount: decimal)

    // Encode the Price struct as Data
    let encodedData = try? JSONEncoder().encode(price)

    // Parse the encoded data back into a JSON object
    var json: Any?
    if let data = encodedData {
        json = try? JSONSerialization.jsonObject(with: data, options: [])
    }

    //Print JSON Object
    if let json = json {
        print("Person JSON:\n" + String(describing: json) + "\n")
    }
}

let stringPriceOK =     "98.39"
let stringPriceNotOK =  "98.40"
let stringPriceNotOK2 = "98.99"

printJSON(from: stringPriceOK)
printJSON(from: stringPriceNotOK)
printJSON(from: stringPriceNotOK2)
/*
 ------------------------------------------------
 // OUTPUT:
 Person JSON:
 {
 amount = "98.39";
 }

 Person JSON:
 {
 amount = "98.40000000000001";
 }

 Person JSON:
 {
 amount = "98.98999999999999";
 }
 ------------------------------------------------
 */

I was trying to figure out what steps are performed to convert "98.39" -> Decimal -> String with the result "98.39", while the same chain of conversions, "98.40" -> Decimal -> String, gives the result "98.40000000000001".

Many thanks for all responses!


Solution

  • This is purely an artifact of how an NSNumber prints itself.

    JSONSerialization is implemented in Objective-C and uses Objective-C objects (NSDictionary, NSArray, NSString, NSNumber, etc.) to represent the values it deserializes from your JSON. Since the JSON contains a bare number with decimal point as the value for the "amount" key, JSONSerialization parses it as a double and wraps it in an NSNumber.

    Each of these Objective-C classes implements a description method to print itself.

    The object returned by JSONSerialization is an NSDictionary. String(describing:) converts the NSDictionary to a String by sending it the description method. NSDictionary implements description by sending description to each of its keys and values, including the NSNumber value for the "amount" key.

    The NSNumber implementation of description formats a double value using the printf specifier %0.16g. (I checked using a disassembler.) About the g specifier, the C standard says

    Finally, unless the # flag is used, any trailing zeros are removed from the fractional portion of the result and the decimal-point wide character is removed if there is no fractional portion remaining.

    The closest double to 98.39 is exactly 98.3900 0000 0000 0005 6843 4188 6080 8014 8696 8994 1406 25. So %0.16g formats that as %0.14f (see the standard for why it's 14, not 16), which gives "98.39000000000000", then chops off the trailing zeros, giving "98.39".

    The closest double to 98.40 is exactly 98.4000 0000 0000 0056 8434 1886 0808 0148 6968 9941 4062 5. So %0.16g formats that as %0.14f, which gives "98.40000000000001" (because of rounding), and there are no trailing zeros to chop off.
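    The rounding described above can be reproduced directly with `String(format:)`, which accepts the same `%0.16g` specifier (a small sketch; the specifier is the one observed in the disassembly above, not a documented API contract of NSNumber):

    ```swift
    import Foundation

    // Format both doubles with the %0.16g specifier that -[NSNumber description] uses.
    // 98.39 rounds to all trailing zeros, which %g strips; 98.40 rounds up in the
    // 14th fractional digit, so the extra digits survive.
    let short = String(format: "%0.16g", 98.39)
    let long = String(format: "%0.16g", 98.40)
    print(short) // "98.39"
    print(long)  // "98.40000000000001"
    ```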

    So that's why, when you print the result of JSONSerialization.jsonObject(with:options:), you get lots of fractional digits for 98.40 but only two digits for 98.39.

    If you extract the amounts from the JSON object and convert them to Swift's native Double type, and then print those Doubles, you get much shorter output, because Double implements a smarter formatting algorithm that prints the shortest string that, when parsed, produces exactly the same Double.
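    You can check the shortest-round-trip behavior directly (a minimal sketch; `98.40000000000001` is the 16-digit string from the NSNumber output above, and it parses to the same Double as `98.4`):

    ```swift
    import Foundation

    // Double.description prints the shortest decimal string that parses back
    // to the exact same Double bit pattern.
    let d = 98.4
    print(String(d))              // "98.4"
    print(Double("98.4")! == d)   // true: the short string round-trips exactly
    print(d == 98.40000000000001) // true: both literals map to the same Double
    ```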

    Try this:

    import Foundation
    
    struct Price: Encodable {
        let amount: Decimal
    }
    
    func printJSON(from string: String) {
        let decimal = Decimal(string: string)!
        let price = Price(amount: decimal)
    
        let data = try! JSONEncoder().encode(price)
        let jsonString = String(data: data, encoding: .utf8)!
        let jso = try! JSONSerialization.jsonObject(with: data, options: []) as! [String: Any]
        let nsNumber = jso["amount"] as! NSNumber
        let double = jso["amount"] as! Double
    
        print("""
        Original string: \(string)
            json: \(jsonString)
            jso: \(jso)
            amount as NSNumber: \(nsNumber)
            amount as Double: \(double)
    
        """)
    }
    
    printJSON(from: "98.39")
    printJSON(from: "98.40")
    printJSON(from: "98.99")
    

    Result:

    Original string: 98.39
        json: {"amount":98.39}
        jso: ["amount": 98.39]
        amount as NSNumber: 98.39
        amount as Double: 98.39
    
    Original string: 98.40
        json: {"amount":98.4}
        jso: ["amount": 98.40000000000001]
        amount as NSNumber: 98.40000000000001
        amount as Double: 98.4
    
    Original string: 98.99
        json: {"amount":98.99}
        jso: ["amount": 98.98999999999999]
        amount as NSNumber: 98.98999999999999
        amount as Double: 98.99
    

    Notice that both the actual JSON (on the lines labeled json:) and the Swift Double versions use the fewest digits in all cases. The lines that use -[NSNumber description] (labeled jso: and amount as NSNumber:) use extra digits for some values.