The problem code is:

    go func() {
        defer wg.Done()
        for {
            task := <-tasks
            if task.Attempts >= .5 { // simulated failure
                tasks <- task // <- blocks here: nothing receives while this goroutine sends
            }
            fmt.Println(task)
        }
    }()
The channel is filled with tasks <- Task{"1", rand.Float64()} in another loop.
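
For context, here is a minimal self-contained version of what I'm running (reconstructed from the snippets above; the exact Task definition is my guess):

    package main

    import (
        "fmt"
        "math/rand"
        "sync"
    )

    // My guess at the Task shape, based on the literal above.
    type Task struct {
        Url      string
        Attempts float64
    }

    func main() {
        var wg sync.WaitGroup
        tasks := make(chan Task) // unbuffered

        wg.Add(1)
        go func() {
            defer wg.Done()
            for {
                task := <-tasks
                if task.Attempts >= .5 {
                    tasks <- task // blocks: this goroutine is the only receiver
                }
                fmt.Println(task)
            }
        }()

        for i := 0; i < 10; i++ {
            tasks <- Task{"1", rand.Float64()} // blocks once the worker is stuck
        }
        wg.Wait() // never returns: the worker loops forever
    }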
And now we've got a deadlock: the program dies with "fatal error: all goroutines are asleep - deadlock!".
Full example: https://play.golang.org/p/s1pnb1Mu_Y
The point of my code is to create a web scraper that retries URLs that failed to parse: make a few attempts, then drop the URL.
Maybe Go has a more idiomatic way to solve this, but I don't know it.
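
One idea I had, though I don't know if it's the idiomatic way: retry inside the worker instead of re-sending on the same channel, so the worker never blocks on a send to itself. A rough sketch (scrape and maxAttempts are placeholders for my real parsing code):

    package main

    import (
        "errors"
        "fmt"
        "math/rand"
        "sync"
    )

    const maxAttempts = 3 // made-up limit

    // scrape stands in for the real parsing; here it fails about half the time.
    func scrape(url string) error {
        if rand.Float64() >= .5 {
            return errors.New("parse failed")
        }
        return nil
    }

    func main() {
        var wg sync.WaitGroup
        tasks := make(chan string)

        wg.Add(1)
        go func() {
            defer wg.Done()
            for url := range tasks { // ends when tasks is closed
                var err error
                for attempt := 1; attempt <= maxAttempts; attempt++ {
                    if err = scrape(url); err == nil {
                        fmt.Println("scraped", url)
                        break
                    }
                }
                if err != nil {
                    fmt.Println("dropping", url, "after", maxAttempts, "attempts")
                }
            }
        }()

        for i := 0; i < 10; i++ {
            tasks <- fmt.Sprintf("url-%d", i)
        }
        close(tasks) // lets the worker's range finish so wg.Wait can return
        wg.Wait()
    }

The trade-off is that retries happen immediately instead of going to the back of the queue. Is that acceptable, or is there a better pattern for deferred retries?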