douzhou7037
2018-12-31 10:21

Performance problem when reading and parsing large XML files

I have a directory containing several large XML files (about 10 GB in total). Is there a way to iterate over the directory, read the files in small chunks (say 50 bytes at a time), and parse the XML with good performance? This is what I have so far:

func (mdc *Mdc) Loadxml(path string, wg sync.WaitGroup) {
    defer wg.Done()
    //var conf configuration
    file, err := os.Open(path)
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()
    scanner := bufio.NewScanner(file)
    buf := make([]byte, 1024*1024)
    scanner.Buffer(buf, 50)
    for scanner.Scan() {
        _, err := file.Read(buf)
        if err != nil {
            log.Fatal(err)
        }
    }

    err = xml.Unmarshal(buf, &mdc)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(mdc)
}
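
Since the files are far too large to hold in memory, one possible sketch (not tested against your data) is to stream each file with encoding/xml's Decoder and decode one repeated element at a time, instead of reading fixed 50-byte chunks and calling xml.Unmarshal on a single buffer. The element name "record", the Record struct, and the "data/*.xml" pattern below are assumptions standing in for the real Mdc layout and directory; adjust them to your schema.

package main

import (
    "encoding/xml"
    "fmt"
    "io"
    "log"
    "os"
    "path/filepath"
    "sync"
)

// Record is a placeholder for one repeated element in the real files;
// replace it with the actual Mdc layout.
type Record struct {
    ID   string `xml:"id,attr"`
    Name string `xml:"name"`
}

// loadXML streams one file and decodes one <record> element at a time,
// so memory use stays bounded no matter how big the file is.
func loadXML(path string, wg *sync.WaitGroup) {
    defer wg.Done()
    file, err := os.Open(path)
    if err != nil {
        log.Println(err)
        return
    }
    defer file.Close()

    dec := xml.NewDecoder(file) // buffers internally; no manual chunking needed
    count := 0
    for {
        tok, err := dec.Token()
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Println(err)
            break
        }
        if se, ok := tok.(xml.StartElement); ok && se.Name.Local == "record" {
            var r Record
            // DecodeElement consumes everything up to the matching end tag.
            if err := dec.DecodeElement(&r, &se); err != nil {
                log.Println(err)
                continue
            }
            count++
        }
    }
    fmt.Println(path, "records:", count)
}

func main() {
    var wg sync.WaitGroup
    files, err := filepath.Glob("data/*.xml") // directory/pattern is an assumption
    if err != nil {
        log.Fatal(err)
    }
    for _, f := range files {
        wg.Add(1)
        go loadXML(f, &wg) // WaitGroup passed by pointer, one goroutine per file
    }
    wg.Wait()
}

Two things in the original code are also worth noting: mixing bufio.Scanner with a direct file.Read on the same file makes both consume from the same offset, so data is skipped, and sync.WaitGroup should be passed by pointer, otherwise Wait never observes the Done calls.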
