doushi1473
2015-08-22 03:34

Golang: uploading an HTTP request FormFile to Amazon S3

Accepted

I'm creating a microservice to handle attachment uploads to Amazon S3. What I'm trying to achieve is to accept a file and then store it directly in my Amazon S3 bucket. My current function:

func upload_handler(w http.ResponseWriter, r *http.Request) {
  file, header, err := r.FormFile("attachment")

  if err != nil {
    fmt.Fprintln(w, err)
    return
  }

  defer file.Close()

  fileSize, err := file.Seek(0, 2) // 2 = from end
  if err != nil {
    panic(err)
  }

  fmt.Println("File size : ", fileSize)

  bytes := make([]byte, fileSize)
  // read into buffer
  buffer := bufio.NewReader(file)
  _, err = buffer.Read(bytes)

  auth := aws.Auth{
        AccessKey: "XXXXXXXXXXX",
        SecretKey: "SECRET_KEY_HERE",
  }
  client := s3.New(auth, aws.EUWest)
  bucket := client.Bucket("attachments")

  err = bucket.Put(header.Filename, bytes, header.Header.Get("Content-Type"), s3.ACL("public-read"))

  if err != nil {
    fmt.Println(err)
    os.Exit(1)
  }
}

The problem is that the files stored in S3 are all corrupted. After a small verification, it seems that the file payload is not being read as bytes.

How do I convert the file to bytes and store it correctly in S3?


1 answer

  • duanao2585 · 6 years ago

    Use ioutil.ReadAll:

    bs, err := ioutil.ReadAll(file)
    // ...
    err = bucket.Put(
        header.Filename, 
        bs, 
        header.Header.Get("Content-Type"), 
        s3.ACL("public-read"),
    )
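
    Note that the handler above seeks to the end of the file to measure its size, which leaves the reader positioned at EOF; unless it is rewound first, ReadAll will return zero bytes. A minimal sketch of the rewind:

    // Rewind to the start: the earlier Seek(0, 2) left the cursor at EOF.
    if _, err := file.Seek(0, 0); err != nil {
        panic(err)
    }
    bs, err := ioutil.ReadAll(file)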
    

    Read is a lower-level function which has subtle behavior:

    Read reads data into p. It returns the number of bytes read into p. It calls Read at most once on the underlying Reader, hence n may be less than len(p). At EOF, the count will be zero and err will be io.EOF.

    So what was probably happening was some subset of the file data was being written to S3 along with a bunch of 0s.
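
    For illustration, if you did want to fill a fixed-size buffer through that low-level contract, the read has to be retried until the buffer is full; io.ReadFull from the standard library wraps exactly that loop. A minimal sketch, assuming file has been rewound to the beginning:

    // io.ReadFull keeps calling Read until buf is full, which a single
    // bufio.Reader.Read call does not guarantee.
    buf := make([]byte, fileSize)
    if _, err := io.ReadFull(file, buf); err != nil {
        // io.ErrUnexpectedEOF means the file held fewer than fileSize bytes.
        return
    }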

    ioutil.ReadAll works by calling Read over and over again, filling a dynamically expanding buffer, until it reaches the end of the file (so there's no need for the bufio.Reader either).
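
    As a rough illustration of that mechanism, here is a simplified sketch of the idea (not the actual ioutil source):

    // Grow a buffer and call Read repeatedly until the reader reports io.EOF.
    func readAll(r io.Reader) ([]byte, error) {
        buf := make([]byte, 0, 512)
        for {
            if len(buf) == cap(buf) {
                buf = append(buf, 0)[:len(buf)] // force the buffer to grow
            }
            n, err := r.Read(buf[len(buf):cap(buf)])
            buf = buf[:len(buf)+n]
            if err == io.EOF {
                return buf, nil
            }
            if err != nil {
                return buf, err
            }
        }
    }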

    Also, the Put function will have issues with large files (using ReadAll means the entire file must fit in memory), so you may want to use PutReader instead:

    bucket.PutReader(
        header.Filename, 
        file, 
        fileSize, 
        header.Header.Get("Content-Type"), 
        s3.ACL("public-read"),
    )
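
    The same rewind applies here, since fileSize was measured by seeking to the end, and PutReader also returns an error worth checking. Something like:

    // Rewind before streaming: fileSize was obtained via Seek(0, 2).
    if _, err := file.Seek(0, 0); err != nil {
        panic(err)
    }
    err = bucket.PutReader(
        header.Filename,
        file,
        fileSize,
        header.Header.Get("Content-Type"),
        s3.ACL("public-read"),
    )
    if err != nil {
        fmt.Println(err)
        return
    }

    This streams the upload instead of buffering the whole file in memory.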
    
