I have a GAE Go app that should be able to handle hundreds of concurrent requests, and for each request, I do some work on the input and then store the result in the datastore.
Using the task queue (via the appengine/delay library) I am getting pretty good performance, but it still seems inefficient to perform a single-row insert for each request (even though the inserts are deferred through the task queue).
If this were not App Engine, I would probably append the output to a file, and every once in a while I would batch-load the file into the DB using a cron job / some other kind of scheduled service.
So my questions are:
- Is there an equivalent scheme I can implement on App Engine? I was thinking I could write some of the rows to memcache, and then every couple of seconds bulk-load all of the rows from there and purge the cache.
- Is this really needed? Can the datastore handle thousands of concurrent writes - one write per HTTP request my app receives?