douzhou7037
2018-12-31 10:21
227 views

Performance problems reading and parsing large XML files

I have a directory containing several large XML files (about 10 GB in total). Is there a way to iterate through the directory, read the XML files 50 bytes at a time, and parse them with high performance?

func (mdc *Mdc) Loadxml(path string, wg *sync.WaitGroup) {
    defer wg.Done()
    file, err := os.Open(path)
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()
    scanner := bufio.NewScanner(file)
    buf := make([]byte, 1024*1024)
    scanner.Buffer(buf, 50)
    for scanner.Scan() {
        _, err := file.Read(buf)
        if err != nil {
            log.Fatal(err)
        }
    }

    err = xml.Unmarshal(buf, mdc)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(mdc)
}

2 answers

  • douhuan1257 2018-12-31 12:08
    Accepted answer

    You can do something even better: you can tokenize your XML files.

    Say you have an XML file like this:

    <inventory>
      <item name="ACME Unobtainium">
        <tag>Foo</tag>
        <count>1</count>
      </item>
      <item name="Dirt">
        <tag>Bar</tag>
        <count>0</count>
      </item>
    </inventory>
    

    You can model it with the following data types:

    type Inventory struct {
        Items []Item `xml:"item"`
    }
    
    type Item struct {
        Name  string   `xml:"name,attr"`
        Tags  []string `xml:"tag"`
        Count int      `xml:"count"`
    }
    

    Now, all you have to do is use filepath.Walk and do something like this for each file you want to process (a sketch of the walk itself follows, after the token loop):

        decoder := xml.NewDecoder(file)
    
        for {
            // Read tokens from the XML document in a stream.
            t, err := decoder.Token()
    
            // If we are at the end of the file, we are done
            if err == io.EOF {
                log.Println("The end")
                break
            } else if err != nil {
                log.Fatalf("Error decoding token: %s", err)
            } else if t == nil {
                break
            }
    
            // Here, we inspect the token
            switch se := t.(type) {
    
            // We have the start of an element.
            // However, we have the complete token in t
            case xml.StartElement:
                switch se.Name.Local {
    
                // Found an item, so we process it
                case "item":
                    var item Item
    
                    // We decode the element into our data model...
                    if err = decoder.DecodeElement(&item, &se); err != nil {
                        log.Fatalf("Error decoding item: %s", err)
                    }
    
                    // And use it for whatever we want to
                    log.Printf("'%s' in stock: %d", item.Name, item.Count)
    
                    if len(item.Tags) > 0 {
                        log.Println("Tags")
                        for _, tag := range item.Tags {
                            log.Printf("\t%s", tag)
                        }
                    }
                }
            }
        }
    

    Working example with dummy XML: https://play.golang.org/p/MiLej7ih9Jt
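
    The directory walk itself isn't shown above; here is a minimal sketch of it, assuming the token loop is wrapped in a hypothetical processFile helper (not part of the original answer):

        err := filepath.Walk(dir, func(path string, info os.FileInfo, err error) error {
            if err != nil {
                return err
            }
            // Skip directories and anything that isn't an .xml file.
            if info.IsDir() || filepath.Ext(path) != ".xml" {
                return nil
            }
            file, err := os.Open(path)
            if err != nil {
                return err
            }
            defer file.Close()
            // processFile is assumed to run the token loop shown above.
            return processFile(file)
        })
        if err != nil {
            log.Fatal(err)
        }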

  • dphphvs496524 2018-12-31 11:40

    The encoding/xml package provides a medium-level xml.Decoder type. That lets you read through an XML input stream one Token at a time, not unlike Java's streaming StAX pull model. When you find the thing you're looking for, you can jump back into decoder.Decode to run the normal unmarshaling sequence and get individual objects out. Just remember that the token stream might contain several things that are "irrelevant" (whitespace-only text nodes, processing instructions, comments) that you need to skip over, while still watching for things that are "important" (non-whitespace text nodes, unexpected start/end elements).

    As a high-level example, if you're expecting a very large SOAP message with a list of records, you might do a "streaming" parse until you see the <soap:Body> start-element, check that its immediate child (i.e., the next start-element) is the element you expect, and then call decoder.Decode on each of its child elements. If you see the end of the operation element, you can unwind the element tree (you now expect to see </soap:Body></soap:Envelope>). Anything else is an error that you need to catch and process.
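
    For concreteness, here is a minimal sketch of that flow (not from the original answer; the Envelope/Body/Record element names and the Record type are assumptions for illustration, and the usual imports (bytes, encoding/xml, io, log) are assumed). It also shows skipping the whitespace-only character data mentioned above:

    type Record struct {
        ID   int    `xml:"id"`
        Name string `xml:"name"`
    }

    // Hypothetical layout: <Envelope><Body><Record>...</Record>...</Body></Envelope>
    decoder := xml.NewDecoder(r)
    inBody := false
    for {
        t, err := decoder.Token()
        if err == io.EOF {
            break
        } else if err != nil {
            log.Fatal(err)
        }
        switch x := t.(type) {
        case xml.CharData:
            // Whitespace-only text between elements is irrelevant; skip it.
            if len(bytes.TrimSpace(x)) > 0 {
                log.Fatalf("unexpected text %q", string(x))
            }
        case xml.StartElement:
            switch x.Name.Local {
            case "Envelope":
                // Just descend into the envelope.
            case "Body":
                inBody = true
            case "Record":
                if !inBody {
                    log.Fatal("record outside body")
                }
                var rec Record
                if err := decoder.DecodeElement(&rec, &x); err != nil {
                    log.Fatal(err)
                }
                log.Printf("%+v", rec)
            default:
                log.Fatalf("unexpected element <%s>", x.Name.Local)
            }
        case xml.EndElement:
            // </Body> and </Envelope> unwind the element tree.
            if x.Name.Local == "Body" {
                inBody = false
            }
        }
    }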

    The skeleton of an application here might look like

    type Foo struct {
        Name string `xml:"name"`
    }
    
    decoder := xml.NewDecoder(r)
    for {
        t, err := decoder.Token()
        if err == io.EOF {
            break
        } else if err != nil {
            panic(err)
        }
        switch x := t.(type) {
        case xml.StartElement:
            switch x.Name {
            case xml.Name{Space: "", Local: "foo"}:
                var foo Foo
                err = decoder.DecodeElement(&foo, &x)
                if err != nil {
                    panic(err)
                }
                fmt.Printf("%+v
    ", foo)
            default:
                fmt.Printf("Unexpected SE {%s}%s
    ", x.Name.Space, x.Name.Local)
            }
        case xml.EndElement:
            switch x.Name {
            default:
                fmt.Printf("Unexpected EE {%s}%s
    ", x.Name.Space, x.Name.Local)
            }
        }
    }
    

    https://play.golang.org/p/_ZfG9oCESLJ has a complete working example (not of the SOAP case but something smaller).

    XML parsing in Go, like basically everything else, is a "pull" model: you tell the reader what to read and it gets the data from the io.Reader you give it. If you manually create an xml.Decoder you can pull one token at a time from it, and that will presumably call r.Read in digestible chunks, but you can't push tiny increments of data into the parser as you propose.
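
    If you really want to bound how much is read from the file at once, you control that on the reader side rather than by pushing into the parser. A sketch (using the standard bufio and os packages; the 50-byte buffer mirrors the question and is not a recommendation):

    file, err := os.Open(path)
    if err != nil {
        panic(err)
    }
    defer file.Close()
    // Each underlying file.Read fills at most a 50-byte buffer;
    // the decoder still pulls whole tokens out of the stream.
    decoder := xml.NewDecoder(bufio.NewReaderSize(file, 50))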

    I can't specifically speak to the performance of encoding/xml, but a hybrid-streaming approach like this will at least get you better latency to the first output and keep less live data in memory at a time.

