I'm working with a database that has yet to be normalized, and one table contains records with over 40 columns.
The following is my Go code for (attempting to) scan the records into a large struct:
type Coderyte struct {
    ID         int    `json:"id"`
    EmrID      int    `json:"emr_id"`
    DftID      int    `json:"dft_id"`
    Fix        int    `json:"fix"`
    ReportDate string `json:"report_date"` // Time?
    Patient    string `json:"patient"`
    // ... etc
}
func ReadCoderyte(res http.ResponseWriter, req *http.Request) {
    rows, err := db.Query("SELECT * FROM coderyte")
    if err != nil {
        http.Error(res, "Error querying database", 500)
        return
    }
    defer rows.Close()

    // Convert rows into a slice of Coderyte structs
    coderytes := make([]*Coderyte, 0)
    for rows.Next() {
        coderyte := new(Coderyte)
        err := rows.Scan(&coderyte) // Expected 42 columns
        if err != nil {
            http.Error(res, "Error converting coderyte object", 500)
            return
        }
        coderytes = append(coderytes, coderyte)
    }
    // ... marshal coderytes to JSON (omitted)
}
When I call this code, Scan complains that it "expected 42 destination arguments, not 1". My understanding is that I would need to address every single field of this large struct inside the Scan call, i.e. Scan(&coderyte.ID, &coderyte.EmrID, etc).
My searches have only turned up this other question, where the suggested answer is to use sqlx. I'm trying to avoid a third-party library if I don't need one.
My question boils down to: is there a way to scan a large database record into a struct without specifying every single field?
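One idea I've been toying with is using the reflect package to build the destination slice automatically, so Scan gets a pointer to each struct field without me typing all 42 out. A minimal sketch of what I mean, assuming the struct's fields are declared in the same order as the SELECT * columns (the database round-trip is simulated here by writing through the pointers, the way Scan would):

    package main

    import (
        "fmt"
        "reflect"
    )

    type Coderyte struct {
        ID    int
        EmrID int
        DftID int
    }

    // scanDest returns a slice of pointers to each field of *dst, in
    // declaration order, suitable for passing as rows.Scan(dest...).
    func scanDest(dst interface{}) []interface{} {
        v := reflect.ValueOf(dst).Elem()
        dest := make([]interface{}, v.NumField())
        for i := 0; i < v.NumField(); i++ {
            dest[i] = v.Field(i).Addr().Interface()
        }
        return dest
    }

    func main() {
        c := new(Coderyte)
        dest := scanDest(c)
        // Simulate what rows.Scan would do: write into the pointers.
        *dest[0].(*int) = 1
        *dest[1].(*int) = 2
        *dest[2].(*int) = 3
        fmt.Println(c.ID, c.EmrID, c.DftID)
    }

Inside the loop this would become rows.Scan(scanDest(coderyte)...), but I'm not sure how fragile the field-order assumption is.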
I should also note that the ultimate goal of this function is to return a JSON array of objects; I left that part out because I don't think it's relevant. If there is a way to bypass Scan entirely and return JSON directly, that would be an appreciated answer as well.
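To illustrate what I mean by bypassing Scan: I gather rows.Columns() returns the column names, so I could Scan into a slice of empty-interface pointers and zip names and values into a map that marshals straight to JSON. A rough sketch with the database part faked out (with a real driver I believe text columns come back as []byte and would need converting first):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // rowToMap pairs column names (from rows.Columns()) with the values
    // Scan filled in, producing a map that json.Marshal can handle.
    func rowToMap(cols []string, vals []interface{}) map[string]interface{} {
        m := make(map[string]interface{}, len(cols))
        for i, c := range cols {
            m[c] = vals[i]
        }
        return m
    }

    func main() {
        // Stand-ins for rows.Columns() and the scanned values.
        cols := []string{"id", "emr_id"}
        vals := []interface{}{1, 7}
        b, _ := json.Marshal(rowToMap(cols, vals))
        fmt.Println(string(b))
    }

This would lose the static typing of the struct, but for a straight table-to-JSON endpoint that might not matter.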