Swift's String has a String.init(_:radix:) initializer, which seems to let you convert Int values to hexadecimal. However, for values with the high bit set, it displays them as negative hex values.

How do you get a 64-bit integer to display as unsigned hexadecimal? String(format:) seems to only work on the low 32 bits of an Int.
This code:
let hashHex = String(hashValue, radix: 16, uppercase: true)
creates strings like -5DB41312E0C8A0A9 (which presumably uses two's complement, and is not helpful).
String.init(_:radix:uppercase:) works with any BinaryInteger. If the BinaryInteger is signed, it will produce a signed string too.
hashValue is of type Int, which is signed. You can convert it to a UInt using init(bitPattern:). For example,
let hashValue = -1
print(String(UInt(bitPattern: hashValue), radix: 16))
// ffffffffffffffff
Note that it is incorrect to use UInt(hashValue): the initializer without a parameter label will try to create a UInt with the same numeric value, instead of preserving the bit pattern. UInt(-1) will crash, because -1 is not representable by UInt.
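As an aside, String(format:) from Foundation can also print all 64 bits, provided you use the ll (long long) length modifier rather than a bare %X, which formats only a 32-bit value. A minimal sketch, again assuming a 64-bit Int:

```swift
import Foundation

let hashValue = -1
// %llX formats the argument as a 64-bit unsigned hex value;
// plain %X would truncate it to the low 32 bits.
let hex = String(format: "%llX", hashValue)
print(hex)  // FFFFFFFFFFFFFFFF
```

This avoids the explicit UInt(bitPattern:) conversion, at the cost of pulling in Foundation and a C-style format string.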