2019-07-11 01:24


I am writing an HTTP server to handle the Cisco Meraki Scanning API. It is a push API: Cisco periodically calls your endpoint with a POST request and a JSON body. The endpoint must respond to this POST request in less than 500 ms. If it doesn't, Cisco will stop sending you data, and you can't recover that information.

So, I have been looking for ways to handle these requests as fast as I can.

The first thing I did was decouple the processing of the JSON body by using a queue. I take the Body from the request, put it in the queue, and respond. Then, several workers will process the body and store it on S3 asynchronously. I also tried to make the server as simple as possible.

Most requests work in less than 500ms, but some don't. Looking at where I am, the only thing that comes to mind to improve these times is to process the body of the request faster.

The whole server is available at this link: meraki_endpoint.go. This is how I am handling the request body at the moment:

func handleData(w http.ResponseWriter, r *http.Request, jobs chan job) {
  var devicesSeen DevicesSeen
  // I am using "" instead of "encoding/json"
  err := json.NewDecoder(r.Body).Decode(&devicesSeen)
  if err != nil {
    http.Error(w, "Bad request - Can't Decode!", 400)
    return
  }
  // Create Job and push the work into the Job Channel
  go func() {
    jobs <- job{devicesSeen}
  }()
  // Render success
  w.WriteHeader(http.StatusOK)
}

At the moment I am decoding the JSON as I read it from the body, instead of reading it and storing it as a byte slice using ioutil.ReadAll(r.Body). After trying both ways, I couldn't find a significant speed difference.

How can I improve the performance of the server? Or, how can I read the body of the request faster so I can work on it later on the queue?

Edit #1

I moved my stack to another AWS region, closer to the source of the data, and the round-trip time dropped to a fifth of what it was before.


1 Answer

  • ds15812330851 2019-07-11 02:41

It does not look like you could read the body much faster, especially since you already tried a plain ioutil.ReadAll(r.Body) instead of decoding it.

And since, from what you observe, most requests are indeed fast, your problem is probably not in that handleData function.

    Here are some things to try:

    • Play with different "max_workers" settings

If the number of pending requests grows too large, your server may slow down anyway: goroutines are cheap, but they still consume some memory, especially if garbage collection comes into play, which brings us to the next bullet.

    • Try to profile GC, or else play with different settings

Here you have the runtime documentation, including the GOGC variable you can try tuning, plus some flags you can use to profile GC pauses (see the gctrace flag there, for example).

These blog posts might also help; they detail GC issues the authors were seeing at high processing volume and how they worked around them to improve GC performance:

    • Scale horizontally

You don't describe your whole setup, but presumably you have a group of servers behind a load balancer? If volume is high, one possibility is that your current number of servers can't handle the load; in that case, try adding more servers.

