I am reading JSON from an HTTP response and would like to extract the contents of an array of objects nested inside it. The response can be large, so I am trying to use a streaming approach instead of just json.Unmarshal'ing the whole thing. The JSON looks like this:
{
  "useless_thing_1": { /* etc */ },
  "useless_thing_2": { /* etc */ },
  "the_things_i_want": [
    { /* complex object I want to json.Unmarshal #1 */ },
    { /* complex object I want to json.Unmarshal #2 */ },
    { /* complex object I want to json.Unmarshal #3 */ }
    /* could be many thousands of these */
  ],
  "useless_thing_3": { /* etc */ }
}
The json library provided with Go has json.Unmarshal, which works well for complete JSON documents. It also has json.Decoder, which can unmarshal full values or yield individual tokens. I can use this tokenizer to carefully walk the stream and extract things, but the logic to do so is somewhat complex, and I cannot then easily use json.Unmarshal on an object after I've read it as tokens.
The json.Decoder is buffered, which makes it difficult to read one object (i.e. { /* complex object I want to json.Unmarshal #1 */ }), consume the , myself, and then make a new json.Decoder - the first decoder will already have pulled the comma (and more) into its internal buffer. This is the approach I tried and haven't been able to make work. I'm looking for a better solution to this problem. Here is the broken code from my attempt to manually consume the commas:
// code here that naively looks for `"the_things_i_want": [` and
// puts the next bytes after that in `buffer` - this is the rest of the
// stream starting from `{ /* complex object I want to json.Unmarshal #1 */ },`
in := io.MultiReader(buffer, res.Body)
dec := json.NewDecoder(in)
loop:
for {
	var p MyComplexThing
	err := dec.Decode(&p)
	if err != nil {
		panic(err)
	}
	// steal the comma from in directly - this does not work because the decoder buffers its input
	var b1 [1]byte
	_, err = io.ReadAtLeast(in, b1[:], 1) // returns random data from later in the stream
	if err != nil {
		panic(err)
	}
	switch b1[0] {
	case ',':
		// skip over it
	case ']':
		break loop // we're done (a bare break here would only exit the switch)
	default:
		panic(fmt.Errorf("unexpected result from read %#v", b1))
	}
}
Use Decoder.Token and Decoder.More to decode a JSON document as a stream.
Walk through the document with Decoder.Token to the JSON value of interest, then call Decoder.Decode to unmarshal that value to a Go value. Repeat as needed to slurp up all values of interest.
Here's some code with commentary explaining how it works:
func decode(r io.Reader) error {
	d := json.NewDecoder(r)

	// We expect that the JSON document is an object.
	if err := expect(d, json.Delim('{')); err != nil {
		return err
	}

	// While there are fields in the object...
	for d.More() {
		// Get field name
		t, err := d.Token()
		if err != nil {
			return err
		}
		// Skip value if not the field that we are looking for.
		if t != "the_things_i_want" {
			if err := skip(d); err != nil {
				return err
			}
			continue
		}
		// We expect a JSON array value for the field.
		if err := expect(d, json.Delim('[')); err != nil {
			return err
		}
		// While there are more JSON array elements...
		for d.More() {
			// Unmarshal and process the array element.
			var m map[string]interface{}
			if err := d.Decode(&m); err != nil {
				return err
			}
			fmt.Printf("found %v\n", m)
		}
		// We are done decoding the array.
		return nil
	}
	return errors.New("things I want not found")
}
// skip skips the next value in the JSON document. It counts opening
// and closing delimiters so that entire nested objects and arrays are
// consumed; a scalar value is consumed by the first Token call.
func skip(d *json.Decoder) error {
	n := 0
	for {
		t, err := d.Token()
		if err != nil {
			return err
		}
		switch t {
		case json.Delim('['), json.Delim('{'):
			n++
		case json.Delim(']'), json.Delim('}'):
			n--
		}
		if n == 0 {
			return nil
		}
	}
}

// expect returns an error if the next token in the document is not expectedT.
func expect(d *json.Decoder, expectedT interface{}) error {
	t, err := d.Token()
	if err != nil {
		return err
	}
	if t != expectedT {
		return fmt.Errorf("got token %v, want token %v", t, expectedT)
	}
	return nil
}