I have a json field in a sqlite3 table. My schema looks like:
CREATE TABLE Animals(
    id int,
    sounds json,
    name string
);
I understand the go-sqlite3 driver does not support the json datatype explicitly. However, my json is quite simple, as all fields are json arrays, e.g.:
["bark", "woof", "growl"]
["meow", "hiss", "growl"]
So a full record might be:
id sounds name
1 ["bark", "woof", "growl"] Fido
2 ["meow", "hiss", "growl"] Rufus
Using the package:
_ "github.com/mattn/go-sqlite3"
I am able to extract my json field with:
var id sql.NullInt64
var name sql.NullString
var sounds []uint8
err := db.QueryRow("SELECT id, name, sounds FROM Animals WHERE id = ?;", 1).Scan(&id, &name, &sounds)
if err != nil {
    log.Fatal(err)
}
// sounds holds the raw JSON text; decode it to get a []string
var parts []string
if err := json.Unmarshal(sounds, &parts); err != nil {
    log.Fatal(err)
}
fmt.Println(strconv.FormatInt(id.Int64, 10) + "|" + name.String + "|" + strings.Join(parts, "+"))
// does print correctly:
// 1|Fido|bark+woof+growl
That is, it seems the sqlite3 json column comes back as a []uint8 (the raw bytes of the JSON text), which I can convert to a string with string(). I'm additionally interested in the "+" join operation so I can make a query+string+looking+thing out of this for another application downstream.
However, I'd really like to bundle this all up in JSON, and take advantage of JSON unmarshalling/parsing rather than my ad hoc custom prints. When I try:
type Animal struct {
    id     int            `json:"id"`
    name   sql.NullString `json:"name"`
    sounds []uint8        `json:"sounds"`
}

var a Animal
err := db.QueryRow("SELECT id, name, sounds FROM Animals WHERE id = ?;", 1).Scan(&a.id, &a.name, &a.sounds)
It prints a bona fide array of integers. How can I fold the []uint8-to-strings.Join transformation into my json-enabled type definition?
Additionally, it's not clear to me how to handle the []uint8 when the column holds an empty [] or a true NULL, and how to make the code robust against both.
Your question brings up several topics. But the easiest answer to all of them is probably:
Don't use a relational database.
You seem to want to fetch objects/documents, so using a storage mechanism that natively supports this will prevent the need for kludges everywhere. MongoDB, CouchDB, or some other NoSQL solution is probably the right fit for your desires.
But having said that, there are answers to your specific questions. Put together, they arguably make for something complex and ugly, though.
Create a custom Sounds type, which implements the sql.Scanner interface and unmarshals the JSON value for you:
type Sounds []string

// Scan implements sql.Scanner, so database/sql can decode the
// JSON column directly into the slice.
func (s *Sounds) Scan(src interface{}) error {
    switch t := src.(type) {
    case nil: // a true SQL NULL
        *s = nil
        return nil
    case []byte:
        return json.Unmarshal(t, s)
    default:
        return errors.New("invalid type")
    }
}
Use sqlx for this. It allows you to scan an entire row into a struct much more easily than the standard library does. It uses the db tag to match columns to struct fields.
You can have multiple tags on each struct field. Note that the fields must be exported (capitalized), or neither sqlx nor encoding/json can access them, and the sounds column can use the Sounds type from above so the JSON is decoded during the scan:
type Animal struct {
    ID     int            `db:"id" json:"id"`
    Name   sql.NullString `db:"name" json:"name"`
    Sounds Sounds         `db:"sounds" json:"sounds"`
}