I am working on a backend that is written purely in Go. I have an API that uploads a file to the Go server, and the server then transfers the file to cloud storage (from the Go server itself). Now, I want the two transfers to be independent, so that the end user does not have to wait for a response until the file reaches cloud storage.
End User -> Golang Server ->[Concurrency/Parallelism] -> Cloud Storage
Now, I thought of two ways:
- Spawn a goroutine as soon as the user finishes the upload, and have it transfer the file to the cloud.
- Insert the file handle into a queue, and have a separate worker read from this queue and transfer files to cloud storage (a multiple-producer, single-consumer model).
I found examples of doing this with goroutines and channels, but I think that approach would create as many goroutines as there are uploads. I want to use the second option, but I am not able to figure out how to go about it in Go. A rough sketch of the worker-pool pattern I have in mind is below.
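Here is a minimal sketch of that pattern, with a buffered channel as the queue and a small, fixed pool of consumer goroutines; the `job` type, queue size, and worker count are just placeholders:

```go
package main

import (
	"fmt"
	"sync"
)

// job carries the path of a locally saved upload that still
// needs to be transferred to cloud storage.
type job struct {
	path string
}

func main() {
	// The buffered channel is the queue: HTTP handlers are the
	// producers, and a fixed pool of goroutines are the consumers,
	// so the goroutine count stays bounded no matter how many
	// uploads arrive.
	queue := make(chan job, 100)

	var wg sync.WaitGroup
	const workers = 4
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for j := range queue {
				// Placeholder for the real cloud transfer.
				fmt.Println("transferring", j.path)
			}
		}()
	}

	// An upload handler would enqueue like this after saving the
	// file locally and responding to the user:
	queue <- job{path: "/tmp/upload-1"}

	close(queue)
	wg.Wait()
}
```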
Also, please do suggest if I am using the wrong approach and there is some other, more efficient method of doing this.
Update
Details about the requirements and constraints:
1. I am using AWS S3 as the cloud storage. If at some point the upload from the Go server to Amazon S3 fails, the file handle should be kept as is, to keep a record of the failed upload. (I am not prioritising this; I might change it based on client feedback.)
2. The file will be deleted from the Go server as soon as the upload to Amazon S3 completes successfully, so as to avoid repeated uploads. Also, if a file with the same name is uploaded, it will replace the existing one on Amazon S3.
3. As pointed out in the comments, I can use a channel as the queue. Is it possible to design the above architecture using Go's channels and goroutines? A rough sketch of what I have in mind is shown after this list.
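Here is a sketch of the whole flow under those constraints, assuming the aws-sdk-go (v1) `s3manager` uploader; the bucket name, region, paths, and the `uploadHandler` wiring are placeholder assumptions, and the multipart parsing is omitted:

```go
package main

import (
	"log"
	"net/http"
	"os"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3/s3manager"
)

// uploadJob holds what a worker needs to move one file to S3.
type uploadJob struct {
	localPath string
	key       string
}

var queue = make(chan uploadJob, 100)

func worker(uploader *s3manager.Uploader, bucket string) {
	for job := range queue {
		f, err := os.Open(job.localPath)
		if err != nil {
			log.Printf("failed upload (open) %s: %v", job.key, err)
			continue
		}
		// Uploading under the same key overwrites the existing
		// object, matching the "replace on same name" requirement.
		_, err = uploader.Upload(&s3manager.UploadInput{
			Bucket: aws.String(bucket),
			Key:    aws.String(job.key),
			Body:   f,
		})
		f.Close()
		if err != nil {
			// Keep a record of the failed upload; the local file
			// is intentionally not deleted (requirement 1).
			log.Printf("failed upload %s: %v", job.key, err)
			continue
		}
		// Delete the local copy only after S3 confirms success
		// (requirement 2).
		os.Remove(job.localPath)
	}
}

func uploadHandler(w http.ResponseWriter, r *http.Request) {
	// ... save the multipart upload to a local file (omitted) ...
	// Placeholder path and key; a real handler derives them from
	// the request.
	queue <- uploadJob{localPath: "/tmp/upload-123", key: "upload-123"}
	w.WriteHeader(http.StatusAccepted) // respond before the S3 transfer
}

func main() {
	sess := session.Must(session.NewSession(&aws.Config{
		Region: aws.String("us-east-1"), // assumption: adjust region
	}))
	uploader := s3manager.NewUploader(sess)
	for i := 0; i < 4; i++ {
		go worker(uploader, "my-bucket") // "my-bucket" is a placeholder
	}
	http.HandleFunc("/upload", uploadHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Responding with `202 Accepted` before the S3 transfer finishes is what decouples the user's wait from the cloud upload; the buffered channel and fixed worker pool keep the goroutine count bounded instead of growing with each upload.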