doudi8525 2018-02-06 13:52
Views: 245
Accepted

Reading each part in chunks larger than 4096 bytes.

I'm trying to process a multipart file upload in small chunks to avoid storing the entire file in memory. The following function seems to solve this, however when passing a []byte as the destination for the part.Read() method, it reads the part in chunks of 4096 bytes instead of in chunks of the destination size (len([]byte)).

When opening a local file and Read()'ing it into a []byte of the same size, the entire buffer is filled as expected. So I think this behavior is specific to part.Read(). However, I'm unable to find anything documenting a default or maximum read size for that method.

For reference, the function is as follows:

func ReceiveFile(w http.ResponseWriter, r *http.Request) {
  reader, err := r.MultipartReader()
  if err != nil {
    panic(err)
  }
  if reader == nil {
    panic("Wrong media type")
  }
  buf := make([]byte, 16384)
  fmt.Println(len(buf))
  for {
    part, err := reader.NextPart()
    if err == io.EOF {
      break
    }
    if err != nil {
      panic(err)
    }
    var n int
    for {
      n, err = part.Read(buf)
      if err == io.EOF {
        break
      }
      if err != nil {
        panic(err)
      }
      fmt.Printf("Read %d bytes into buf\n", n)
      fmt.Println(len(buf))
    }
    n, err = part.Read(buf)
    fmt.Printf("Finally read %d bytes into buf\n", n)
    fmt.Println(len(buf))
  }
}

2 answers

  • dongmo1708 2018-02-06 19:29

    The part reader does not attempt to fill the caller's buffer as allowed by the io.Reader contract.

    The best way to handle this depends on the requirements of the application.

    If you want to slurp the part into memory, then use ioutil.ReadAll:

    for {
        part, err := reader.NextPart()
        if err == io.EOF {
          break
        }
        if err != nil {
          // handle error
        }
        p, err := ioutil.ReadAll(part)
        if err != nil {
          // handle error
        }
        // p is []byte with the contents of the part
    }
    

    If you want to copy the part to the io.Writer w, then use io.Copy:

    for {
        part, err := reader.NextPart()
        if err == io.EOF {
          break
        }
        if err != nil {
          // handle error
        }
        w := // open a writer
        _, err = io.Copy(w, part)
        if err != nil {
          // handle error
        }
    }
    

    If you want to process fixed size chunks, then use io.ReadFull:

     buf := make([]byte, chunkSize)
     for {
        part, err := reader.NextPart()
        if err == io.EOF {
          break
        }
        if err != nil {
          // handle error
        }
        _, err = io.ReadFull(part, buf)
        if err != nil {
          // handle error
          // Note: ReadFull returns io.EOF if it read no bytes, and
          // io.ErrUnexpectedEOF if it read some but could not fill buf.
        }
        // process the next chunk in buf
    }
    

    If the application data is structured in some way other than fixed-size chunks, then bufio.Scanner might be of help.

    This answer was accepted by the asker.