dqlb38410 2018-09-03 18:17
294 views
Accepted

Fastest way to read a large file with small RAM in Go

I want to read data from different text, JSON, or CSV files. Which approach should I follow?

I have read the blog posts File read and Read 2GB text file with small RAM, which cover different approaches to reading files.

Different approaches:

* Reading a file in chunks (see the sketch after this list)
* Reading file chunks concurrently
* Reading the entire file into memory
* Splitting a long string into words
* Scanning word by word

I am not able to figure out the fastest way to read a file with small RAM.
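
For concreteness, by "reading a file in chunks" I mean something like the sketch below, where only one fixed-size buffer is held in memory at a time (the file name and chunk size are placeholders):

package main

import (
    "fmt"
    "io"
    "log"
    "os"
)

func main() {
    file, err := os.Open("test.txt") // placeholder file name
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    buf := make([]byte, 64*1024) // 64 KB chunk: the only buffer held in memory
    for {
        n, err := file.Read(buf)
        if n > 0 {
            // process buf[:n] here; this sketch just reports progress
            fmt.Printf("read %d bytes\n", n)
        }
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Fatal(err)
        }
    }
}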


1 answer

  • duanjiu1950 2018-09-03 18:36

    There are basically two ways to parse a file: document parsing and stream parsing.

    Document parsing reads the data from the file and turns it into a big set of objects that you can query, like the HTML DOM in a browser. The advantage is that you have the complete data at your fingertips, which is often simpler. The disadvantage is that you have to store it all in memory.

    dom = parse(stuff)
    
    // now do whatever you like with the dom
    

    Stream parsing instead reads a single element at a time and presents it to you for immediate use, then it moves on to the next one.

    for element := range stream(stuff) {
        ...examine one element at a time...
    }
    

    The advantage is that you don't have to load the whole thing into memory. The disadvantage is that you must work with the data as it goes by. This is very useful for searches or anything else that processes items one at a time.


    Fortunately, Go provides libraries that handle the common formats for you.

    A simple example is handling a CSV file.

    package main

    import (
        "encoding/csv"
        "fmt"
        "io"
        "log"
        "os"
    )

    func main() {
        file, err := os.Open("test.csv")
        if err != nil {
            log.Fatal(err)
        }
        defer file.Close() // release the file handle when main returns

        parser := csv.NewReader(file)

        ...
    }
    

    We can slurp the whole thing into memory as a big [][]string.

    records, err := parser.ReadAll()
    if err != nil {
        log.Fatal(err)
    }
    
    for _, record := range records {
        fmt.Println(record)
    }
    

    Or we can save a bunch of memory and deal with the rows one at a time.

    for {
        record, err := parser.Read()
        if err == io.EOF {
            break // end of input: stop reading
        }
        if err != nil {
            log.Fatal(err)
        }

        fmt.Println(record)
    }
    

    Since every line of a CSV is functionally the same, processing it one row at a time makes the most sense.

    JSON and XML are more complex because they can be large, nested structures, but they can also be streamed. There's an example of streaming in the encoding/json documentation.
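
    Along those lines, here is a minimal sketch of streaming a top-level JSON array with json.Decoder, similar to the stream example in the encoding/json documentation (the file name and the Record type are assumptions for illustration):

    package main

    import (
        "encoding/json"
        "fmt"
        "log"
        "os"
    )

    // Record is a hypothetical element type; match it to your JSON.
    type Record struct {
        Name string `json:"name"`
    }

    func main() {
        file, err := os.Open("test.json") // assumes the file holds a JSON array
        if err != nil {
            log.Fatal(err)
        }
        defer file.Close()

        dec := json.NewDecoder(file)

        // Consume the opening '[' token of the array.
        if _, err := dec.Token(); err != nil {
            log.Fatal(err)
        }

        // Decode one element at a time; only one Record is in memory at once.
        for dec.More() {
            var r Record
            if err := dec.Decode(&r); err != nil {
                log.Fatal(err)
            }
            fmt.Println(r)
        }
    }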


    What if your code isn't a simple loop? What if you want to take advantage of concurrency? Use a channel and a goroutine to feed records to the rest of the program concurrently.

    records := make(chan []string)
    go func() {
        parser := csv.NewReader(file)

        defer close(records) // closing the channel ends the consumer's range loop
        for {
            record, err := parser.Read()
            if err == io.EOF {
                break
            }
            if err != nil {
                log.Fatal(err)
            }

            records <- record
        }
    }()
    

    Now you can pass records to a function that processes them.

    func printRecords(records chan []string) {
        for record := range records {
            fmt.Println(record)
        }
    }
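
    In main, start the goroutine and then call printRecords(records); the range loop exits once the goroutine closes the channel, so no further synchronization is needed.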
    
