I needed to call a web service a specific number of times and append the results to a file (order being unimportant). It would be possible to simply spawn a goroutine for every request, but a more elegant solution is the producer-consumer pattern:
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
)

func main() {
	jobs := make(chan bool)
	results := make(chan string)
	done := make(chan bool)
	numberOfWorkers := 5
	numberOfJobs := 1000

	// Start a fixed pool of workers.
	for i := 0; i < numberOfWorkers; i++ {
		go worker(jobs, results, done)
	}

	// Feed the jobs in from a separate goroutine: a send on an
	// unbuffered channel blocks until a worker is ready to receive it.
	go func() {
		for i := 0; i < numberOfJobs; i++ {
			jobs <- true
		}
	}()

	// Collect the results, then close done to release main and all
	// the workers at once (a single send could be received by a
	// worker instead of main, leaving main blocked forever).
	go func() {
		count := 0
		for {
			result := <-results
			fmt.Println(result)
			count++
			if count >= numberOfJobs {
				close(done)
				return
			}
		}
	}()

	<-done
}
func worker(jobs chan bool, results chan string, done chan bool) {
	for {
		select {
		case <-jobs:
			res, err := getResult()
			if err != nil {
				panic(err)
			}
			results <- res
		case <-done:
			return
		}
	}
}
func getResult() (string, error) {
	resp, err := http.Get("http://localhost/foo")
	if err != nil {
		return "", err
	}
	defer resp.Body.Close()
	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		return "", err
	}
	return string(body), nil
}
This solution uses three channels: one to hand out the work still remaining (a bool in this case, but the channel could just as well carry data to vary each request; see the sketch below); one to return results from the workers; and one to signal that all the work is complete. The done channel is closed rather than sent on, because a close releases every goroutine waiting on it, whereas a single send could be received by one of the workers instead of main, deadlocking the program.
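For example, here is a sketch of the worker with the jobs channel carrying the URL to fetch; getResultFor is a hypothetical variant of getResult that takes the URL as a parameter:

func worker(jobs chan string, results chan string, done chan bool) {
	for {
		select {
		case url := <-jobs:
			// Each job now tells the worker exactly what to request.
			res, err := getResultFor(url)
			if err != nil {
				panic(err)
			}
			results <- res
		case <-done:
			return
		}
	}
}

The producer then sends URLs instead of bools, e.g. jobs <- fmt.Sprintf("http://localhost/foo?id=%d", i) (the id parameter is made up for illustration).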
This turned out to be a more interesting problem than it looked. My first attempt used a sync.WaitGroup, but I couldn’t get it to work. Debugging revealed an interesting gap in my mental model of how channels work: I hadn’t fully internalised that a send on an unbuffered Go channel blocks until another goroutine is ready to receive it. Hence the need to push work onto the jobs channel from a separate goroutine: if main did the sends itself, the collector would never be started, the workers would back up on the unbuffered results channel, and everything would deadlock. I had it in my head that channels were more like an Erlang/F# mailbox, despite knowing that buffered channels were a thing.
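In hindsight, a WaitGroup version can be made to work by leaning on buffered channels: fill the jobs channel up front and close it, so neither the producer nor the workers ever block indefinitely. A sketch (this also needs the sync package imported; it reuses getResult from above):

func main() {
	numberOfWorkers := 5
	numberOfJobs := 1000

	jobs := make(chan bool, numberOfJobs)      // buffered: filling it never blocks
	results := make(chan string, numberOfJobs) // buffered: workers never block sending

	// Queue all the work up front, then close the channel so the
	// workers' range loops terminate once it is drained.
	for i := 0; i < numberOfJobs; i++ {
		jobs <- true
	}
	close(jobs)

	var wg sync.WaitGroup
	for i := 0; i < numberOfWorkers; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for range jobs {
				res, err := getResult()
				if err != nil {
					panic(err)
				}
				results <- res
			}
		}()
	}

	// Close results once every worker has finished, so the range
	// below knows when to stop.
	go func() {
		wg.Wait()
		close(results)
	}()

	for result := range results {
		fmt.Println(result)
	}
}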
The error handling could also be improved: currently, if any request fails, the worker simply panics.
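One improvement, sketched below with a struct name (fetchResult) invented for illustration, is to send the error back on the results channel so the collector can decide what to do with failures:

type fetchResult struct {
	body string
	err  error
}

func worker(jobs chan bool, results chan fetchResult, done chan bool) {
	for {
		select {
		case <-jobs:
			res, err := getResult()
			// Report the error rather than panicking; the collector
			// decides what to do with it.
			results <- fetchResult{body: res, err: err}
		case <-done:
			return
		}
	}
}

The collector would then check err on each received fetchResult, for example logging the failure and carrying on rather than bringing the whole program down.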