I've been using strings to represent decoded JSON integers larger than 32 bits. It seems string_of_int is capable of dealing with large integer inputs, so a decoder written in the Json.Decode namespace:
id: json |> field("id", int) |> string_of_int, /* 'id' is string */
successfully deals with integers of at least 37 bits.
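For context, here is a fuller sketch of that decoder; the surrounding record type and decode function are my own scaffolding, not part of the original snippet:

type t = {id: string};

let decode = json =>
  Json.Decode.{id: json |> field("id", int) |> string_of_int};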
Encoding, on the other hand, is proving troublesome for me. The remote server won't accept a string representation, and is expecting an int64. Is it possible to make bs-json support the int64 type? I was hoping something like this could be made to work:
type myData = {id: int64};
let encodeMyData = (data: myData) =>
  Json.Encode.(object_([("id", int64(data.id))]));
Rolling my own encoder wouldn't be nearly as formidable as rolling a decoder, but... I'd rather not.
You don't say exactly what problem you have with encoding. The int encoder does literally nothing except change the type, trusting that the int value is actually valid. So I would assume it's the int_of_string operation that causes problems. But that raises the question: if you can successfully decode it as an int, why are you then converting it to a string?
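For reference, the int encoder is presumably little more than an unchecked cast, something along these lines (a sketch inferred from the description above, not necessarily the library's actual source):

external int: int => Js.Json.t = "%identity";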
The underlying problem here is that JavaScript doesn't have 64-bit integers. The max safe integer is 2^53 - 1. JavaScript doesn't actually have integers at all, only floats, which can represent a certain range of integers but can't efficiently do integer arithmetic unless they're converted to either 32-bit or 64-bit ints. And so, for whatever reason, probably consistent overflow handling, it was decided in the EcmaScript specification that binary bitwise operations should operate on 32-bit integers. That opened the possibility for an internal 32-bit representation, a notation for creating 32-bit integers, and the possibility of optimized integer arithmetic on those.
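A quick way to observe that 32-bit coercion from Reason is to embed a raw JavaScript expression (the value here is just an illustration):

/* 2^37 + 1 is coerced to 32 bits by `| 0`, so only the low bit survives */
let truncated: int = [%raw {|(2 ** 37 + 1) | 0|}];
Js.log(truncated); /* 1 */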
So to your question:
Would it be "safe" to just add external int64 : int64 -> Js.Json.t = "%identity" to the encoder files?
No. Because there's no 64-bit integer representation in JavaScript, int64 values are represented as an array of two Numbers, I believe, but that is also an internal implementation detail that's subject to change. Just casting an int64 to Js.Json.t will not yield the result you expect.
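You can see this for yourself by logging an int64 value; what gets printed is the internal two-number representation, whose exact shape depends on the compiler version:

/* prints something like a pair of numbers, not 137438953472 */
Js.log(Int64.of_string("137438953472"));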
So what can you do then?
I would recommend using float. In most respects this will behave exactly like JavaScript numbers, giving you access to their full range.
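A minimal sketch of the float-based approach, reusing the record and field names from your question:

type myData = {id: float};

let encodeMyData = (data: myData) =>
  Json.Encode.(object_([("id", float(data.id))]));

let decodeMyData = (json): myData =>
  Json.Decode.{id: json |> field("id", float)};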
Alternatively, you can use nativeint, which should behave like floats except for division, where the result is truncated to a 32-bit integer.
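To illustrate the division caveat, a sketch using the standard Nativeint module (the truncation behaviour is assumed from the description above; the values are arbitrary):

let big = Nativeint.of_float(137438953472.0); /* 2^37, well past 32 bits */
let sum = Nativeint.add(big, 1n);  /* fine: behaves like a float */
let quot = Nativeint.div(big, 2n); /* beware: result truncated to 32 bits */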
Lastly, you could also implement your own int_of_string to create an int that is technically out of range, by using a couple of lightweight JavaScript functions directly, though I wouldn't really recommend doing this:
/* parses the string as a float, then floors it into an (out-of-range) int */
let bad_int_of_string = str =>
  str |> Js.Float.fromString |> Js.Math.floor_int;
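Hypothetical usage, with a 37-bit value like the ones your decoder handles:

let id = bad_int_of_string("137438953472");
Js.log(id); /* 137438953472, even though it doesn't fit in 32 bits */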