Story so far. We are thinking about switching away from Perl. The candidates are Go and Node.js. To evaluate them, we wrote simple wrappers in Dancer2, Flask, Node and Go around a long-running database query that we have. I had them all up and running, so I benchmarked a bit under light load. Then I decided to stress the applications. Every framework was able to cope with
ab -n 1000 -c 100 http://localhost:8080/
except Go. If I did not limit connections, I would get a 'too many connections' error; if I limited connections to 100, ab would report a timeout error and quit.
My gist with the Go code: https://gist.github.com/2d8473ce576cab5f7c66. What should I change so I can use the Go server under load?
It looks like you are overloading your database with too many simultaneous connections. Remember, Go is a truly concurrent language.
Have you tried changing db.SetMaxOpenConns(1000) to a much smaller number, say db.SetMaxOpenConns(10)?
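For reference, here is a minimal sketch of how those pool limits fit together; the driver import and DSN are placeholders, not taken from your gist:

package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql" // placeholder driver import
)

func main() {
    // The DSN is a placeholder; use your own connection string
    db, err := sql.Open("mysql", "user:password@/dbname")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    db.SetMaxOpenConns(10) // at most 10 open connections to the database
    db.SetMaxIdleConns(10) // reuse idle connections instead of reopening
}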
Alternatively, you could limit the number of simultaneously running goroutines, like this.
Declare these globally
const maxAtOnce = 50
var limiter = make(chan struct{}, maxAtOnce)
func init() {
    // Fill up with tokens
    for i := 0; i < maxAtOnce; i++ {
        limiter <- struct{}{}
    }
}
And put this at the start of your getTimeSheet handler:
// take a token
<-limiter
// give it back on exit
defer func() {
    limiter <- struct{}{}
}()