How to print results from concurrent and recursive functions? [closed]

I've been going through the Go tour, and I've finished the web crawler exercise, but I think the technique I used to print all the results may be inefficient.

Here is my code. I only edited the Crawl and main functions, so I'll just post those. Here is the link to the exercise: http://tour.golang.org/#70

    var used = make(map[string]bool)

    func Crawl(url string, depth int, fetcher Fetcher, results chan string) {
        if depth <= 0 {
            return
        }
        body, urls, err := fetcher.Fetch(url)
        if err != nil {
            results <- fmt.Sprintf("%v", err)
            return
        }
        results <- fmt.Sprintf("\nfound: %s %q\n", url, body)
        for _, u := range urls {
            if used[u] == false {
                used[u] = true
                go Crawl(u, depth-1, fetcher, results)
            }
        }
        return
    }
    //------------------------------------------------------------
    func main() {
        used["http://golang.org/"] = true
        results := make(chan string)
        go Crawl("http://golang.org/", 4, fetcher, results)
        for i := 0; i < len(used); i++ {
            fmt.Println(<-results)
        }
    }

I use the "i < len(used)" condition in main to ensure that a value is read from results only when there is a result to print. I can't just use

    for i := range results

because it is hard to call "close(results)" in the Crawl function since it is recursive, but with my approach I have to evaluate the length of the used map on every iteration.

Is there a better way to do this?

To wait for a collection of goroutines to finish, use a sync.WaitGroup.

I believe you'll find the example in the official documentation very familiar.

http://golang.org/pkg/sync/#example_WaitGroup

Quoting:

    var wg sync.WaitGroup
    var urls = []string{
        "http://www.golang.org/",
        "http://www.google.com/",
        "http://www.somestupidname.com/",
    }
    for _, url := range urls {
        // Increment the WaitGroup counter.
        wg.Add(1)
        // Launch a goroutine to fetch the URL.
        go func(url string) {
            // Fetch the URL.
            http.Get(url)
            // Decrement the counter.
            wg.Done()
        }(url)
    }
    // Wait for all HTTP fetches to complete.
    wg.Wait()

This will block until all the work is done.

If you really want to print the results progressively as you collect them, the simplest way is to do it in the fetcher itself.