2019-09-24 19:13 · 169 views

Golang AWS S3 batch object creation

I'm trying to create a bunch of "folders" inside an S3 bucket. I tried to duplicate the approach from https://github.com/aws/aws-sdk-go/blob/master/service/s3/s3manager/batch_test.go#L742, but that helper expects a "Body", which isn't really needed in my case. So far the following code does what I need, but I feel like there is a better "batched" approach that could be implemented.

serv := s3.New(session.New(s3h.Config))

for _, i1 := range []string{"1", "2", "3", "4", "5", "6", "7", "8", "9", "0", "a", "b", "c", "d", "e", "f"} {
    for _, i2 := range []string{"1", "2", "3", "4", "5", "6", "7", "8", "9", "0", "a", "b", "c", "d", "e", "f"} {
        // A zero-byte object whose key ends in "/" shows up as a "folder" in the console.
        req := &s3.PutObjectInput{
            Bucket: aws.String(S3_BUCKET),
            Key:    aws.String(i1 + i2 + "/"),
        }
        if _, err := serv.PutObject(req); err != nil {
            log.Fatal(err)
        }
    }
}

Any pointers?



1 answer

  • dpblwmh5218 2019-09-25 11:35

    There is no such thing as "folders" in S3. At a high level, S3 is a big key/value store. The List operation supports finding subsets of keys that begin with a certain prefix and can stop at a known delimiter (e.g. /), which makes it look like it has file-system semantics.

    Simply putting an object at /a/b/c/d looks like it has created folders a, b, and c, but it has actually created one object with that key. The List operation used in the console stops at / as a delimiter, so it presents a folder-like view.
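    To make the flat key/value model concrete, here is a small standalone sketch (not SDK code; `commonPrefixes` is a hypothetical helper) that mimics what S3's List operation returns as "common prefixes" when the delimiter is "/": the "folders" are just distinct key prefixes computed at list time.

```go
package main

import (
	"fmt"
	"sort"
	"strings"
)

// commonPrefixes mimics S3's List behaviour with a delimiter: keys are flat
// strings, and "folders" are just the distinct prefixes up to the delimiter.
func commonPrefixes(keys []string, delim string) []string {
	seen := map[string]bool{}
	for _, k := range keys {
		if i := strings.Index(k, delim); i >= 0 {
			seen[k[:i+len(delim)]] = true
		}
	}
	out := make([]string, 0, len(seen))
	for p := range seen {
		out = append(out, p)
	}
	sort.Strings(out)
	return out
}

func main() {
	// One object at "a/b/c/d" yields a single common prefix "a/" -- no
	// folders a, b, or c were ever created.
	keys := []string{"a/b/c/d", "a/x", "top.txt"}
	fmt.Println(commonPrefixes(keys, "/")) // [a/]
}
```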

    You can't put multiple S3 objects in a single operation; each one has to be uploaded individually. In Go, I'd recommend a channel onto which you put all the required uploads, with multiple concurrent workers processing them to improve performance. You'd need to do some tuning to find the optimal number of workers. I wouldn't be surprised if there were already a library for this, as it's quite a common S3 use case.
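    The channel-plus-workers pattern described above could be sketched like this. It is a minimal illustration, not a library: `uploadAll` is a hypothetical helper that takes the per-key upload function as a parameter, so with the AWS SDK you would pass in a closure wrapping `serv.PutObject`, while the sketch below uses a stand-in.

```go
package main

import (
	"fmt"
	"sync"
)

// uploadAll feeds every key through a channel to a pool of workers.
// put is the per-key upload function -- with the AWS SDK it would wrap
// something like serv.PutObject(&s3.PutObjectInput{Bucket: ..., Key: aws.String(k)}).
func uploadAll(keys []string, workers int, put func(key string) error) []error {
	ch := make(chan string)
	var wg sync.WaitGroup
	var mu sync.Mutex
	var errs []error
	for i := 0; i < workers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for k := range ch {
				if err := put(k); err != nil {
					mu.Lock()
					errs = append(errs, err)
					mu.Unlock()
				}
			}
		}()
	}
	for _, k := range keys {
		ch <- k
	}
	close(ch) // workers drain the channel and exit
	wg.Wait()
	return errs
}

func main() {
	// Build the 256 two-character hex "folder" keys from the question.
	const hex = "0123456789abcdef"
	var keys []string
	for _, a := range hex {
		for _, b := range hex {
			keys = append(keys, string(a)+string(b)+"/")
		}
	}
	// Stand-in for the real S3 call so the sketch runs without credentials.
	put := func(k string) error { return nil }
	errs := uploadAll(keys, 8, put)
	fmt.Println(len(keys), len(errs)) // 256 0
}
```

    The worker count (8 here) is the tuning knob mentioned above; raising it increases parallel in-flight requests until you hit network or API limits.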

