Tags: swift, casting, time-complexity, overhead

What is the runtime cost of Swift's casts?


What are the different runtime costs incurred by the following type casts?

  1. Numeric cast of constant, e.g.:

    let f = 0.1 as CGFloat
    

    I'd imagine this has zero runtime cost.

  2. Numeric cast of runtime value, e.g.:

    let f = someDoubleValue as CGFloat
    

    I'd imagine this has an extremely small runtime cost.

  3. Upcast, e.g.:

    let dict: [String: Int] = ...
    let anyObj = dict as AnyObject
    

    I'd expect this to have zero runtime cost.

  4. Failable Downcast, e.g.:

    let anyObj: AnyObject = ...
    if let str = anyObj as? String { ... }
    

    I'd expect this to have a runtime cost proportional to the depth of the class hierarchy of the dynamic type of anyObj.

  5. Forced Downcast, e.g.:

    let anyObj: AnyObject = ...
    let str = anyObj as! String
    

    Maybe the cost for a forced downcast is slightly lower?

  6. Forced Downcast of collection, e.g.:

    let dates: [AnyObject] = ...
    for date in dates as! [NSDate] { ... }
    

    What happens here, especially when dates is backed by an NSArray? Is the runtime cost of this cast proportional to the number of its elements? And if I cast to a more deeply nested collection type such as [String: [String: [Int]]], is the whole collection traversed to verify that every element and sub-element satisfies the cast?

For each of the first four cases, are my assertions true?


Solution

  • Sources:

    1. https://github.com/apple/swift/blob/master/stdlib/public/runtime/Casting.cpp
    2. https://github.com/apple/swift/blob/master/stdlib/public/core/ArrayCast.swift
    3. https://github.com/apple/swift/blob/master/stdlib/public/core/HashedCollections.swift.gyb
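
Reading these sources, here is what I'd expect for each case, with small sketches. Cases 1 and 2 (numeric casts): the literal in `0.1 as CGFloat` never exists as a Double at runtime; the literal is built directly as a CGFloat through ExpressibleByFloatLiteral, so the cast itself is free. Converting a runtime Double goes through the CGFloat initializer, which on 64-bit platforms is at most a register move, since CGFloat stores a Double there. A minimal sketch (assuming an Apple platform; the exact codegen is the compiler's business):

    import CoreGraphics

    // Case 1: the literal is typed directly as CGFloat through
    // ExpressibleByFloatLiteral, so no Double value exists and nothing
    // is converted at runtime.
    let f1 = 0.1 as CGFloat

    // Case 2: a runtime Double goes through the CGFloat initializer.
    func toCGFloat(_ someDoubleValue: Double) -> CGFloat {
        // On 64-bit platforms CGFloat stores a Double, so this is at
        // most a register move; on 32-bit it is a single truncation.
        return CGFloat(someDoubleValue)
    }

    print(f1, toCGFloat(0.25))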
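Case 3 (upcasts): a pure class-to-superclass upcast is statically known to succeed and costs nothing at runtime. Note, though, that the example in the question is not a pure upcast: on Apple platforms, `dict as AnyObject` bridges the native Dictionary to an NSDictionary (see source 3), so I would not assume it is free. A sketch of the distinction:

    import Foundation

    class Base {}
    class Derived: Base {}

    // A class-to-superclass upcast is verified at compile time; at
    // runtime the very same reference is reused, so the cost is zero.
    let derived = Derived()
    let base = derived as Base

    // This looks like an upcast but is a bridging conversion: the
    // native Dictionary is bridged to an NSDictionary on Apple
    // platforms (source 3), so it is not necessarily free.
    let dict: [String: Int] = ["answer": 42]
    let anyObj = dict as AnyObject

    print(base, anyObj)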
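Cases 4 and 5 (downcasts): in Casting.cpp (source 1), the conditional and the forced form funnel into the same runtime machinery, swift_dynamicCast; the forced variant simply traps instead of producing nil on failure. So I would expect as! to cost the same as as?, not less. For class targets the check walks the superclass chain of the dynamic type, which matches the intuition in case 4; casts that also involve bridging (e.g. AnyObject to String) additionally go through the bridging machinery. A sketch:

    import Foundation

    let anyObj: AnyObject = "hello" as NSString

    // Conditional downcast: a call into the runtime (swift_dynamicCast
    // in Casting.cpp) that yields nil on failure. For class targets the
    // check walks the superclass chain; String also involves bridging.
    if let str = anyObj as? String {
        print(str)
    }

    // Forced downcast: the same runtime check, followed by a trap if
    // it fails, so its cost should match as?, not undercut it.
    let str2 = anyObj as! String
    print(str2)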
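Case 6 (collection casts): per ArrayCast.swift (source 2), a forced cast between array types whose elements are verbatim-bridged classes, such as [AnyObject] as! [NSDate], can be O(1): the array records the expected element type and defers the per-element check to the point of access. The conditional form, and casts involving non-class elements, have to visit the elements, making them O(n); by the same logic I would expect a nested target like [String: [String: [Int]]] to end up touching every level (see source 3). A sketch of the user-visible behavior (the complexity notes are my reading of the sources, not measurements):

    import Foundation

    let dates: [AnyObject] = [NSDate(), NSDate()]

    // Forced cast to a class element type: per ArrayCast.swift this
    // can be O(1), with the element type check deferred until each
    // element is actually accessed inside the loop.
    for date in dates as! [NSDate] {
        print(date)
    }

    // Conditional cast: every element has to be checked up front, so
    // this is O(n) in the number of elements.
    if let checked = dates as? [NSDate] {
        print(checked.count)
    }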