I would like to batch-insert data into our ClickHouse database. Using GORM, I can easily do:
type Audit struct {
    Field1 string `json:"field1"`
    Field2 string `json:"field2"`
}

chDB.Table(tableName).CreateInBatches(audits, 5)
However, if the audit contains an array field, as below, the other fields are still saved to the database; only the array field (Field1 below) is not saved.
type Audit struct {
    Field1 []string `json:"field1"`
    Field2 string   `json:"field2"`
}

chDB.Table(tableName).CreateInBatches(audits, 5)
Error: [error] unsupported data type: &[]
It only works if I prepare a statement and insert rows individually, but then it is no longer a batch insert:
sqlDB, err := db.DB()
tx, err := sqlDB.Begin()
stmt, err := tx.PrepareContext(ctx, `insert into audit_table (field1, field2) values (?, ?)`)
_, err = stmt.ExecContext(ctx, clickhouse.Array(audit.Field1), audit.Field2)
err = tx.Commit()
Can someone help me with this? How can I batch-insert array data into ClickHouse columns?
You may need to wrap the value with clickhouse.Array to insert into an Array column:
type Audit struct {
    Field1 interface{} `json:"field1"` // holds a clickhouse.Array(...) value
    Field2 string      `json:"field2"`
}

audits := []Audit{{
    Field1: clickhouse.Array([]string{"one", "three"}),
    Field2: "two",
}}

chDB.Table(tableName).CreateInBatches(audits, 5)