I am using Go to download files from one server, manipulate them, and then send them to another server. The file sizes vary from 1MB to 200MB.
Currently my code is pretty simple: I am using http.Client and bytes.Buffer.
It takes a lot of time to handle the big files (the 100MB to 200MB ones), and there are a lot of them.
After some quick profiling, I see that most of the time is spent in bytes.(*Buffer).grow.
How can I preallocate a big buffer up front, for example 16MB?
What can I do to improve the efficiency of this code? Any general tips for handling large HTTP requests?
Edit
To explain exactly what I am trying to do: I have CouchDB documents (with attachments) that I am trying to copy to another CouchDB instance. The document sizes range from 30MB to 200MB; copying tiny (2-10MB) documents is really fast.
But sending the large documents over the wire is really slow. I am currently profiling and trying to apply @Evan's answer to see where my problem is.