gogo-map

How to normalize (1:N) a CSV file into a map in Go?


I'm trying to normalize a structure from a CSV file, which is like this:

name, note
'Joe', 5
'Joe', 3
'Ashley', 1
'Ashley', 7
'Ashley', 4

into a map that, after reading the file, is reduced to:

map[string][]string{
    "joe":    {"5", "3"},
    "ashley": {"1", "7", "4"},
}

What's the best approach to do that?

I'm new to Go, and the code I have so far is something like this:

package main

import (
    "fmt"

    "github.com/xuri/excelize/v2"
)

func main() {
    fileName := "new"
    xlsx, err := excelize.OpenFile(fileName + ".xlsx")
    if err != nil {
        fmt.Println(err)
        return
    }
    rows, err := xlsx.Rows("Sheet1")
    if err != nil {
        fmt.Println(err)
        return
    }

    for rows.Next() {
        column, err := rows.Columns()
        if err != nil {
            fmt.Println(err)
        }

        for i := 0; i < 1; i++ {
            // This builds a brand-new, single-entry map on every row
            // instead of accumulating the notes per name.
            m := map[string][]string{
                column[i]: {column[1]},
            }
            fmt.Printf("%v\n", m)
        }
    }
}

Solution

  • It should be pretty straightforward:

    m := map[string][]string{}
    for rows.Next() {
        column, err := rows.Columns()
        if err != nil {
            panic(err)
        }
        if len(column) < 2 {
            panic("row too short")
        }
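        // A missing key yields a nil slice, so append simply starts a new one.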
        m[column[0]] = append(m[column[0]], column[1])
    }