Instead of using a struct, which is predefined in terms of its number of data fields, their types, and its capacity, having a map, which is expandable and can hold values of several data types, would be more advantageous. For example, on the database side, changes to the column names, column types, or number of columns in a table would not affect the Go code related to querying the database, such as the Go data structure into which you put the rows returned from your database query.
Is there a way, in the Go sql package or related drivers, to know the types of the data in the rows returned by a database query, so that a struct can be defined with the appropriate number of fields and types? If not, how can a map be used to accomplish this, with the column names of the returned rows as keys and the row fields as values?
To do that you'll have to use a map with values of type interface{} so that they can store values of any type. If you also need the column names, you'll have to call rows.Columns() and then extract all the data and types.
This will read all columns from a table and store the values in a map as type interface{} with the column names as keys. Based on that you should be able to work out what you need for your use case.
// Assumes db is an open *sql.DB; needs "fmt", "log", and "reflect" imported.
var myMap = make(map[string]interface{})

rows, err := db.Query("SELECT * FROM myTable")
if err != nil {
    log.Fatal(err)
}
defer rows.Close() // only defer Close after checking err: rows is nil if Query failed

colNames, err := rows.Columns()
if err != nil {
    log.Fatal(err)
}

// Scan needs a pointer per column, so build a slice of values
// and a parallel slice of pointers into it.
cols := make([]interface{}, len(colNames))
colPtrs := make([]interface{}, len(colNames))
for i := 0; i < len(colNames); i++ {
    colPtrs[i] = &cols[i]
}

for rows.Next() {
    if err := rows.Scan(colPtrs...); err != nil {
        log.Fatal(err)
    }
    // Note: the map is reused, so each row overwrites the previous one.
    for i, col := range cols {
        myMap[colNames[i]] = col
    }
    // Do something with the map
    for key, val := range myMap {
        fmt.Println("Key:", key, "Value Type:", reflect.TypeOf(val))
    }
}
if err := rows.Err(); err != nil {
    log.Fatal(err)
}
Using the reflect package you can then get the Type of each column as needed, as demonstrated with the loop at the end, or you can use a type switch to extract the concrete values.
The above is generic and will work with any number of columns and types.
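For instance, a type switch over the scanned values could look like the sketch below. The case list is an assumption about what your driver returns: database/sql drivers are documented to produce only int64, float64, bool, []byte, string, time.Time, or nil, but a given driver may use only a subset (and many return textual columns as []byte rather than string).

// A minimal sketch of a type switch over the map values, assuming the
// driver returns the standard driver.Value types; also needs "time" imported.
for key, val := range myMap {
    switch v := val.(type) {
    case nil:
        fmt.Println(key, "is NULL")
    case int64:
        fmt.Println(key, "is an int64:", v)
    case float64:
        fmt.Println(key, "is a float64:", v)
    case bool:
        fmt.Println(key, "is a bool:", v)
    case []byte:
        fmt.Println(key, "is []byte:", string(v))
    case string:
        fmt.Println(key, "is a string:", v)
    case time.Time:
        fmt.Println(key, "is a time.Time:", v)
    default:
        fmt.Println(key, "has unexpected type:", reflect.TypeOf(v))
    }
}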