I made a small web server using gorilla/mux. I took the small example at the end of the README file on their GitHub page.
r := mux.NewRouter()
r.HandleFunc("/", listUrlsHandler)
r.HandleFunc("/foo", fooHandler)
http.Handle("/", r)
err := http.ListenAndServe(":9090", nil) // set listen port
if err != nil {
	log.Fatal("ListenAndServe: ", err)
}
To emulate long processing, I added a sleep in the handler function.
func fooHandler(w http.ResponseWriter, r *http.Request) {
	w.WriteHeader(http.StatusOK)
	fmt.Fprintln(w, "foo is downloading...")
	log.Println("foo is downloading...")
	time.Sleep(15 * time.Second)
}
I ran /foo twice at the same time:
2018/01/12 09:15:03 foo is downloading...
2018/01/12 09:15:18 foo is downloading...
- Only one is processed at a time; the second one is not handled until the first returns (15 seconds later), and only then does the second start.
- Why? I want them to be processed in parallel...
- This is a deal-breaker: it means clients cannot access the same page at the same moment.
How can I reply to the client and continue the processing afterwards? By adding a goroutine in the handler?
func fooHandler(w http.ResponseWriter, r *http.Request) {
	w.WriteHeader(http.StatusOK)
	fmt.Fprintln(w, "foo is downloading...")
	log.Println("foo is downloading...")
	go time.Sleep(15 * time.Second) // HERE <------------
	// go myCustomFunc() // HERE <------------
}
I had an idea of how it works, but obviously I am totally wrong. I thought the router created a goroutine for each call.
Could you tell me the best practice for doing what I want?
I found the problem using tcpdump: the second request opens a TCP connection that gets closed 150 ms later (FIN from the client). The second request is then sent over the first TCP connection, once the first request is done. This might be Firefox's behavior: it closes the new connection and reuses the previous one.