2016-11-21 00:26
Viewed 647 times


I'm trying to process a file which contains 200 URLs and use each URL to make an HTTP request. I need to process at most 10 URLs concurrently at a time (the code should block until those 10 URLs finish processing). I tried to solve it in Go, but the whole file keeps getting processed with 200 concurrent connections created.

for scanner.Scan() { // loop through each url in the file
        // send each url to HTTPrequest in its own goroutine
        go HTTPrequest(scanner.Text(), channel, &wg)
}

What should I do?


2 answers

  • donglunzai4288 2016-11-21 01:33

    A pool of 10 goroutines reading from a channel should fulfill your requirements.

    work := make(chan string)
    // get the original 200 urls
    urlsToProcess := seedUrls()
    // start a pool of 10 goroutines that read urls from the work channel
    for i := 0; i < 10; i++ {
        go func(w chan string) {
            for url := range w {
                // handle one url at a time, e.g. HTTPrequest(url)
            }
        }(work)
    }
    // write urls to the work channel, blocking until a worker goroutine
    // is able to start work
    for _, url := range urlsToProcess {
        work <- url
    }
    close(work)

    Cleanup and collecting the request results are left as an exercise for you. A send on a Go channel will block until one of the worker goroutines is able to read from it.
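    Putting the pieces together, here is a minimal self-contained sketch of this worker-pool pattern. The `fetch` function is a placeholder standing in for the real HTTP request, and the url list is made up for illustration:

    ```go
    package main

    import (
    	"fmt"
    	"sync"
    )

    // fetch is a placeholder standing in for the real HTTP request.
    func fetch(url string) string {
    	return "done: " + url
    }

    // processAll fans urls out to 10 concurrent workers and collects results.
    func processAll(urls []string) []string {
    	work := make(chan string)
    	results := make(chan string, len(urls))
    	var wg sync.WaitGroup

    	// pool of 10 workers reading from the work channel
    	for i := 0; i < 10; i++ {
    		wg.Add(1)
    		go func() {
    			defer wg.Done()
    			for url := range work {
    				results <- fetch(url)
    			}
    		}()
    	}

    	for _, u := range urls {
    		work <- u // blocks once all 10 workers are busy
    	}
    	close(work)
    	wg.Wait()
    	close(results)

    	out := make([]string, 0, len(urls))
    	for r := range results {
    		out = append(out, r)
    	}
    	return out
    }

    func main() {
    	urls := make([]string, 200)
    	for i := range urls {
    		urls[i] = fmt.Sprintf("http://example.com/%d", i)
    	}
    	fmt.Println(len(processAll(urls)))
    }
    ```

    At any instant at most 10 `fetch` calls run concurrently, because only 10 goroutines exist to drain the unbuffered `work` channel.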

  • drn1008 2016-11-22 09:42

    Code like this:

    longTimeAct := func(index int, w chan struct{}, wg *sync.WaitGroup) {
            defer wg.Done()
            defer func() { <-w }() // release the slot when finished
            time.Sleep(1 * time.Second)
    }
    wg := new(sync.WaitGroup)
    ws := make(chan struct{}, 10) // buffered channel acts as a semaphore of size 10
    for i := 0; i < 100; i++ {
            ws <- struct{}{} // blocks once 10 goroutines are already running
            wg.Add(1)
            go longTimeAct(i, ws, wg)
    }
    wg.Wait()
