I want to put all arriving http.Request objects into a queue and have a separate thread (goroutine?) process these requests and return the appropriate status.

However, the main HTTP request handler completes the request directly, even when the http.Request object is sent asynchronously to a goroutine.

Is there a way to control when the http.Request is completed, and thereby process it asynchronously?
[Update]
I want to implement a producer-consumer model. The main request handler produces the requests and puts them into a queue. A consumer thread (or threads) will read these requests, consume their bodies, and return them.
HTTP handlers are executed in a separate goroutine per request, so if you are simply trying to free up the main serve loop, it's not necessary.
If you are looking to serialize processing of requests, you could use a sync.Mutex and have your handlers lock on it. This would have a similar effect in that the requests would be handled one at a time.

I don't think sync.Mutex is fair, so it may not meet your needs. Also, if you want to keep state between requests, then this is probably not the right solution.
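A minimal sketch of that approach (the handler body and the shared counter are hypothetical, just to show the locking):

package main

import (
	"fmt"
	"log"
	"net/http"
	"sync"
)

var (
	mu      sync.Mutex // serializes request processing across handler goroutines
	counter int        // hypothetical shared state touched by each request
)

func handler(w http.ResponseWriter, r *http.Request) {
	mu.Lock()         // only one request gets past this point at a time
	defer mu.Unlock() // released when the handler returns

	counter++
	fmt.Fprintf(w, "processed request #%d\n", counter)
}

func main() {
	http.HandleFunc("/", handler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}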
As Jorge Marey mentioned, channels would work as well.
Though, I'd suggest you look at golang.org/x/net/context, as it is a package specifically designed for multi-stage processing with timeouts and whatnot.
My guess is you will end up with a channel that passes structs that look like:

type Req struct {
	ctx context.Context
	w   http.ResponseWriter
	r   *http.Request
}
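To connect that to the producer-consumer model from the question, here is a rough sketch of how it could fit together. The buffered queue size, the done field, and the single consumer goroutine are my own assumptions rather than anything from a standard API, and it uses the standard library context package (which superseded golang.org/x/net/context). The key point is that the handler blocks on done, so the request isn't completed until the consumer has finished with it:

package main

import (
	"context"
	"log"
	"net/http"
)

type Req struct {
	ctx  context.Context
	w    http.ResponseWriter
	r    *http.Request
	done chan struct{} // closed by the consumer when processing is finished (assumption)
}

var queue = make(chan Req, 100) // buffered queue of pending requests; size is arbitrary

// consumer drains the queue in a single goroutine, one request at a time.
func consumer() {
	for req := range queue {
		// ... read req.r.Body, do the work, write the result to req.w ...
		req.w.WriteHeader(http.StatusOK)
		close(req.done) // let the producing handler return
	}
}

// handler is the producer: it enqueues the request and waits until it has been handled.
func handler(w http.ResponseWriter, r *http.Request) {
	req := Req{ctx: r.Context(), w: w, r: r, done: make(chan struct{})}
	queue <- req
	<-req.done // keep this goroutine (and the ResponseWriter) valid until the consumer is done
}

func main() {
	go consumer()
	http.HandleFunc("/", handler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}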