I am writing an API in Go (my first hands-on experience with the language) and I have some MySQL queries running inside loops. On larger result sets a request can take 7-10 seconds or more to reply, and it often times out. Is there a way to make these queries more efficient? Or is it simply a matter of DB structure, or even server spec?
The server is a cheap single-core VPS with 512 MB of RAM.
Go Code for query:
//Vendor search
reply, err := db.Query("SELECT * FROM manuf_ids WHERE id='" + manuf + "';")
if err != nil {
	panic(err.Error())
}
for reply.Next() {
	var manufacturer Manufacturer
	err = reply.Scan(&manufacturer.ID, &manufacturer.Desc)
	if err != nil {
		panic(err.Error())
	}
	replytwo, err := db.Query("SELECT id,descrip FROM dev_ids WHERE manuf='" + manuf + "';")
	if err != nil {
		panic(err.Error())
	}
	for replytwo.Next() {
		var devices Device
		err = replytwo.Scan(&devices.ID, &devices.Desc)
		if err != nil {
			panic(err.Error())
		}
		//Get sub-devices
		replythree, err := db.Query("SELECT id,descrip FROM sub_dev_ids WHERE parent='" + devices.ID + "';")
		if err != nil {
			panic(err.Error())
		}
		for replythree.Next() {
			var sub_devices Sub
			err = replythree.Scan(&sub_devices.ID, &sub_devices.Desc)
			if err != nil {
				panic(err.Error())
			}
			devices.Subs = append(devices.Subs, sub_devices)
		}
		replythree.Close()
		manufacturer.Devs = append(manufacturer.Devs, devices)
	}
	replytwo.Close()
	results = append(results, manufacturer)
}
reply.Close()
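For context on what I have tried to work out so far: the nested loops issue one query per device and one per sub-device (the classic N+1 pattern), so a large manufacturer can mean hundreds of round trips to MySQL. I believe all three levels could be fetched in a single round trip with a LEFT JOIN and regrouped in Go. This is only a sketch under assumptions from my own schema (table names `manuf_ids`, `dev_ids`, `sub_dev_ids`, columns `descrip`, `manuf`, `parent`; the `group`/`fetch` helpers and `row` type are names I made up):

```go
package main

import "database/sql"

type Sub struct{ ID, Desc string }
type Device struct {
	ID, Desc string
	Subs     []Sub
}
type Manufacturer struct {
	ID, Desc string
	Devs     []Device
}

// row mirrors one line of the joined result set. Device and sub-device
// columns can be NULL under a LEFT JOIN, so validity is tracked separately.
type row struct {
	ManufID, ManufDesc string
	DevID, DevDesc     string
	SubID, SubDesc     string
	HasDev, HasSub     bool
}

// group folds the flat, ordered JOIN rows back into the nested structure.
func group(rows []row) []Manufacturer {
	var out []Manufacturer
	for _, r := range rows {
		if len(out) == 0 || out[len(out)-1].ID != r.ManufID {
			out = append(out, Manufacturer{ID: r.ManufID, Desc: r.ManufDesc})
		}
		m := &out[len(out)-1]
		if !r.HasDev {
			continue
		}
		if len(m.Devs) == 0 || m.Devs[len(m.Devs)-1].ID != r.DevID {
			m.Devs = append(m.Devs, Device{ID: r.DevID, Desc: r.DevDesc})
		}
		d := &m.Devs[len(m.Devs)-1]
		if r.HasSub {
			d.Subs = append(d.Subs, Sub{ID: r.SubID, Desc: r.SubDesc})
		}
	}
	return out
}

// fetch issues one JOIN query (with a ? placeholder instead of string
// concatenation) and then groups the result in memory.
func fetch(db *sql.DB, manuf string) ([]Manufacturer, error) {
	const q = `SELECT m.id, m.descrip, d.id, d.descrip, s.id, s.descrip
		FROM manuf_ids m
		LEFT JOIN dev_ids d ON d.manuf = m.id
		LEFT JOIN sub_dev_ids s ON s.parent = d.id
		WHERE m.id = ?
		ORDER BY m.id, d.id, s.id`
	rs, err := db.Query(q, manuf)
	if err != nil {
		return nil, err
	}
	defer rs.Close()
	var flat []row
	for rs.Next() {
		var r row
		var dID, dDesc, sID, sDesc sql.NullString
		if err := rs.Scan(&r.ManufID, &r.ManufDesc, &dID, &dDesc, &sID, &sDesc); err != nil {
			return nil, err
		}
		r.DevID, r.DevDesc, r.HasDev = dID.String, dDesc.String, dID.Valid
		r.SubID, r.SubDesc, r.HasSub = sID.String, sDesc.String, sID.Valid
		flat = append(flat, r)
	}
	return group(flat), rs.Err()
}
```

Would reshaping it like this be the right direction, or does the grouping in Go just move the cost around?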
I have read a little about preparing queries, and I use that in another function for writing to the DB, but I am not sure whether it would help here. I did remove my defers from the loops, which helped, but only marginally.
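In case it matters for the answer, here is how I understand preparing would apply if I kept the loops: prepare the statement once outside the loop and reuse it, with a ? placeholder instead of concatenating the value into the SQL (which I gather is also an injection risk). A sketch of what I mean, with a made-up `fetchSubs` helper around my sub-device query:

```go
package main

import "database/sql"

// Sub matches the struct scanned in the innermost loop above.
type Sub struct{ ID, Desc string }

// The ? placeholder replaces the string concatenation; the driver
// sends the parent value separately, so no quoting/injection issues.
const subQuery = "SELECT id, descrip FROM sub_dev_ids WHERE parent = ?"

// fetchSubs reuses an already-prepared statement for each parent
// device instead of building and parsing a new query string per row.
func fetchSubs(stmt *sql.Stmt, parent string) ([]Sub, error) {
	rows, err := stmt.Query(parent)
	if err != nil {
		return nil, err
	}
	defer rows.Close()
	var subs []Sub
	for rows.Next() {
		var s Sub
		if err := rows.Scan(&s.ID, &s.Desc); err != nil {
			return nil, err
		}
		subs = append(subs, s)
	}
	return subs, rows.Err()
}

// Intended usage: prepare once, then call inside the device loop.
//   stmt, err := db.Prepare(subQuery)
//   ...
//   devices.Subs, err = fetchSubs(stmt, devices.ID)
```

Is that the right way to use a prepared statement in a loop, and would it make a real difference compared to restructuring the queries themselves?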