Tags: sql, go, struct, sqlx

Efficiently mapping one-to-many and many-to-many database relationships to structs in Golang


Question

When dealing with a one-to-many or many-to-many SQL relationship in Golang, what is the best (efficient, recommended, "Go-like") way of mapping the rows to a struct?

Taking the example setup below, I have tried to detail some approaches with the pros and cons of each, but I was wondering what the community recommends.

Requirements

Example

For the sake of clarity, I have removed error handling

Models

type Tag struct {
  ID int
  Name string
}

type Item struct {
  ID int
  Tags []Tag
}

Database

CREATE TABLE item (
  id                      INT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY
);

CREATE TABLE tag (
  id                      INT GENERATED BY DEFAULT AS IDENTITY PRIMARY KEY,
  name                    VARCHAR(160),
  item_id                 INT REFERENCES item(id)
);

Approach 1 - Select all Items, then select tags per item

var items []Item
sqlxdb.Select(&items, "SELECT * FROM item")

for i, item := range items {
  var tags []Tag
  sqlxdb.Select(&tags, "SELECT * FROM tag WHERE item_id = $1", item.ID)
  items[i].Tags = tags
}

Pros

Cons

Approach 2 - Construct SQL join and loop through rows manually

var itemTags = make(map[int][]Tag)

var items = []Item{}
rows, _ := sqlxdb.Queryx("SELECT i.id, t.id, t.name FROM item AS i JOIN tag AS t ON t.item_id = i.id")
for rows.Next() {
  var (
    itemID  int
    tagID   int
    tagName string
  )
  rows.Scan(&itemID, &tagID, &tagName)
  if tags, ok := itemTags[itemID]; ok {
    itemTags[itemID] = append(tags, Tag{ID: tagID, Name: tagName})
  } else {
    itemTags[itemID] = []Tag{{ID: tagID, Name: tagName}}
  }
}
for itemID, tags := range itemTags {
  items = append(items, Item{
    ID: itemID,
    Tags: tags,
  })
}

Pros

Cons

Failed approach 3 - sqlx struct scanning

Despite failing, I want to include this approach as it represents my current aim: efficiency paired with development simplicity. My hope was that, by explicitly setting the db tag on each struct field, sqlx could do some advanced struct scanning

var items []Item
sqlxdb.Select(&items, "SELECT i.id AS item_id, t.id AS tag_id, t.name AS tag_name FROM item AS i JOIN tag AS t ON t.item_id = i.id")

Unfortunately this errors out with "missing destination name tag_id in *[]Item", leading me to believe that StructScan is not advanced enough to recursively loop through rows (no criticism - it is a complicated scenario)
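
A rough, untested sketch of a fallback for this approach (the itemTagRow type and the index map are illustrative names, not part of the setup above): scan the join into a flat row struct whose db tags match the aliased columns, then group the rows into Items in Go:

type itemTagRow struct {
  ItemID  int    `db:"item_id"`
  TagID   int    `db:"tag_id"`
  TagName string `db:"tag_name"`
}

var flat []itemTagRow
sqlxdb.Select(&flat, "SELECT i.id AS item_id, t.id AS tag_id, t.name AS tag_name FROM item AS i JOIN tag AS t ON t.item_id = i.id")

// Group the flat rows back into Items, preserving the order in which items first appear.
index := make(map[int]int) // item ID -> position in items
var items []Item
for _, r := range flat {
  i, ok := index[r.ItemID]
  if !ok {
    items = append(items, Item{ID: r.ItemID})
    i = len(items) - 1
    index[r.ItemID] = i
  }
  items[i].Tags = append(items[i].Tags, Tag{ID: r.TagID, Name: r.TagName})
}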

Possible approach 4 - PostgreSQL array aggregators and GROUP BY

While I am fairly sure this will not work as written, I have included this untested option to see whether it could be improved upon so that it does.

var items = []Item{}
sqlxdb.Select(&items, "SELECT i.id as item_id, array_agg(t.*) as tags FROM item AS i JOIN tag AS t ON t.item_id = i.id GROUP BY i.id")

When I have some time I will try and run some experiments here.
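
If I do, one untested sketch I might start from (swapping array_agg for json_agg so the aggregate can be unmarshalled; the tagList and itemRow types are illustrative, and this assumes encoding/json and fmt are imported):

type tagList []Tag

// Scan implements sql.Scanner so sqlx can decode the JSON aggregate column into []Tag.
func (t *tagList) Scan(src interface{}) error {
  switch v := src.(type) {
  case []byte:
    return json.Unmarshal(v, t)
  case string:
    return json.Unmarshal([]byte(v), t)
  default:
    return fmt.Errorf("unsupported type %T for tags", src)
  }
}

type itemRow struct {
  ID   int     `db:"item_id"`
  Tags tagList `db:"tags"`
}

var rows []itemRow
sqlxdb.Select(&rows, "SELECT i.id AS item_id, json_agg(t) AS tags FROM item AS i JOIN tag AS t ON t.item_id = i.id GROUP BY i.id")

// json_agg(t) emits objects keyed by column name (id, name, item_id),
// which encoding/json matches to the Tag fields case-insensitively.
var items []Item
for _, r := range rows {
  items = append(items, Item{ID: r.ID, Tags: r.Tags})
}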


Solution

  • The SQL in Postgres:

    create schema temp;
    set search_path = temp;
    create table item
    (
      id INT generated by default as identity primary key
    );
    
    create table tag
    (
      id      INT generated by default as identity primary key,
      name    VARCHAR(160),
      item_id INT references item (id)
    );
    
    create view item_tags as
    select id,
           (
             select array_to_json(array_agg(row_to_json(taglist.*))) as array_to_json
             from (
               select tag.name, tag.id
               from tag
               where item_id = item.id
             ) taglist
           ) as tags
    from item;
    
    

    Then, from Golang, query this SQL:

    select  row_to_json(row)
    from (
        select * from item_tags
    ) row;
    

    and unmarshal the result into the Go struct:
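
    A minimal sketch of that step, assuming the Item and Tag structs from the question, the standard encoding/json package, and the same omission of error handling as above:

    rows, _ := sqlxdb.Query("select row_to_json(row) from (select * from item_tags) row")
    defer rows.Close()

    var items []Item
    for rows.Next() {
      var payload []byte
      rows.Scan(&payload)

      // The row_to_json keys (id, tags, name) match the struct fields case-insensitively.
      var item Item
      json.Unmarshal(payload, &item)
      items = append(items, item)
    }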

    Pros:

    1. Postgres manages the data relations; data can be added and updated with SQL functions.

    2. Golang manages the business model and logic.

    It's an easy approach.