douxian9010 2015-01-11 23:26
80 views
Accepted

golang sync.WaitGroup never completes

I have the code below, which fetches a list of URLs and then conditionally downloads a file and saves it to the filesystem. The files are fetched concurrently, and the main goroutine waits for all the files to be fetched, but the program never exits (and there are no errors) after completing all the requests.

What I think is happening is that the number of goroutines tracked by the WaitGroup is either incremented too much to begin with (via Add) or not decremented enough (a Done call is not happening).

Is there something I am obviously doing wrong? How would I inspect how many goroutines the WaitGroup is currently waiting on so I can better debug what's happening?

package main

import (
    "fmt"
    "io"
    "io/ioutil"
    "net/http"
    "os"
    "strings"
    "sync"
)

func main() {
    links := parseLinks()

    var wg sync.WaitGroup

    for _, url := range links {
        if isExcelDocument(url) {
            wg.Add(1)
            go downloadFromURL(url, wg)
        } else {
            fmt.Printf("Skipping: %v 
", url)
        }
    }
    wg.Wait()
}

func downloadFromURL(url string, wg sync.WaitGroup) error {
    tokens := strings.Split(url, "/")
    fileName := tokens[len(tokens)-1]
    fmt.Printf("Downloading %v to %v 
", url, fileName)

    content, err := os.Create("temp_docs/" + fileName)
    if err != nil {
        fmt.Printf("Error while creating %v because of %v", fileName, err)
        return err
    }

    resp, err := http.Get(url)
    if err != nil {
        fmt.Printf("Could not fetch %v because %v", url, err)
        return err
    }
    defer resp.Body.Close()

    _, err = io.Copy(content, resp.Body)
    if err != nil {
        fmt.Printf("Error while saving %v from %v", fileName, url)
        return err
    }

    fmt.Printf("Download complete for %v 
", fileName)

    defer wg.Done()
    return nil
}

func isExcelDocument(url string) bool {
    return strings.HasSuffix(url, ".xlsx") || strings.HasSuffix(url, ".xls")
}

func parseLinks() []string {
    linksData, err := ioutil.ReadFile("links.txt")
    if err != nil {
        fmt.Printf("Trouble reading file: %v", err)
    }

    links := strings.Split(string(linksData), ", ")

    return links
}

3 answers

  • duanke6249 2015-01-11 23:40

    There are two problems with this code. First, you have to pass a pointer to the WaitGroup to downloadFromURL(); otherwise the WaitGroup is copied, Done() is called on the copy, and the counter in main() is never decremented.

    See:

    func main() {
        ...
        go downloadFromURL(url, &wg)
        ...
    }
    

    Second, defer wg.Done() should be one of the first statements in downloadFromURL(); otherwise, if you return from the function before reaching that statement, the deferred call is never registered and never runs.

    func downloadFromURL(url string, wg *sync.WaitGroup) error {
        defer wg.Done()
        ...
    }
    
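    Putting the two fixes together, a minimal sketch of the corrected code (reusing the functions from the question; only the call site, the signature, and the placement of defer wg.Done() change) might look like this:

    func main() {
        links := parseLinks()

        var wg sync.WaitGroup
        for _, url := range links {
            if isExcelDocument(url) {
                wg.Add(1)
                // pass a pointer so every goroutine shares the same WaitGroup
                go downloadFromURL(url, &wg)
            } else {
                fmt.Printf("Skipping: %v\n", url)
            }
        }
        wg.Wait()
    }

    func downloadFromURL(url string, wg *sync.WaitGroup) error {
        // deferred first, so it runs on every return path, including early error returns
        defer wg.Done()
        ...
    }

    As for inspecting the counter: sync.WaitGroup does not expose it, but runtime.NumGoroutine() reports the total number of live goroutines, which can help you confirm whether the workers are actually finishing.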
    Accepted by the asker as the best answer.