Handling concurrent HTTP requests in Golang

I'm trying to process a file that contains 200 URLs, making an HTTP request for each one. I need to process at most 10 URLs concurrently (the code should block until 10 URLs finish processing). I tried to solve it in Go, but the whole file ends up being processed at once, with 200 concurrent connections created.

for scanner.Scan() { // loop through each url in the file
        // send each url to golang HTTPrequest
        go HTTPrequest(scanner.Text(), channel, &wg)
}
fmt.Println(<-channel)
wg.Wait()

What should I do?

A pool of 10 goroutines reading from a channel should fulfill your requirements.

work := make(chan string)

// get the original 200 urls
urlsToProcess := seedUrls()

// start a pool of 10 goroutines that read urls from the work channel
for i := 0; i < 10; i++ {
        go func(w chan string) {
                for url := range w {
                        // make the HTTP request for url here
                        _ = url
                }
        }(work)
}

// write urls to the work channel, blocking until a worker goroutine
// is able to start work
for _, url := range urlsToProcess {
        work <- url
}

// signal the workers that no more urls are coming,
// so their range loops can exit
close(work)

Waiting for the workers to finish and collecting the request results are left as an exercise for you. A send on an unbuffered Go channel blocks until one of the worker goroutines is able to receive.
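For completeness, here is a minimal sketch of what that cleanup and result collection could look like. The fetch helper and the seedUrls stub are hypothetical placeholders for your own HTTP and file-reading logic, not part of the original answer:

package main

import (
        "fmt"
        "net/http"
        "sync"
)

// fetch is a hypothetical helper standing in for your HTTP request
// logic; it returns the status line for a url.
func fetch(url string) string {
        resp, err := http.Get(url)
        if err != nil {
                return url + ": " + err.Error()
        }
        defer resp.Body.Close()
        return url + ": " + resp.Status
}

// seedUrls stands in for reading the 200 urls from your file.
func seedUrls() []string {
        return []string{"https://example.com", "https://golang.org"}
}

func main() {
        work := make(chan string)
        results := make(chan string)

        // pool of 10 workers reading urls until work is closed
        var wg sync.WaitGroup
        for i := 0; i < 10; i++ {
                wg.Add(1)
                go func() {
                        defer wg.Done()
                        for url := range work {
                                results <- fetch(url)
                        }
                }()
        }

        // feed the urls, then close the channels in the right order
        go func() {
                for _, url := range seedUrls() {
                        work <- url
                }
                close(work)
                wg.Wait()
                close(results)
        }()

        // collect results; this loop ends once all workers finish
        for r := range results {
                fmt.Println(r)
        }
}

Closing work lets the workers' range loops end, and closing results only after wg.Wait() guarantees no worker is still sending when the results loop exits.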

You can also use a buffered channel as a counting semaphore, with code like this:

package main

import (
        "fmt"
        "sync"
        "time"
)

func main() {
        // longTimeAct simulates a slow task such as an HTTP request
        longTimeAct := func(index int, w chan struct{}, wg *sync.WaitGroup) {
                defer wg.Done()
                time.Sleep(1 * time.Second)
                fmt.Println(index)
                <-w // release a slot when the task is done
        }
        wg := new(sync.WaitGroup)
        // buffered channel used as a semaphore with 10 slots
        ws := make(chan struct{}, 10)
        for i := 0; i < 100; i++ {
                ws <- struct{}{} // acquire a slot; blocks while 10 tasks run
                wg.Add(1)
                go longTimeAct(i, ws, wg)
        }
        wg.Wait()
}
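Here the buffered channel ws acts as a counting semaphore: the send ws <- struct{}{} claims one of the 10 slots and blocks the loop whenever 10 goroutines are already running, and each goroutine frees its slot with <-w when it finishes. Note that this caps the number of in-flight requests at 10 rather than running them in strict batches of 10, which is usually better for throughput.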