Reading multiple files from S3 using golang

I am a novice in golang. I want to read multiple files from Amazon S3. I am using the s3gof3r library.

The goroutine is as follows:

for i := 1; i <= fileNo; i++ {
    wg.Add(1)
    go test(i, b)
}

func test(i int, b *Bucket) {
    fmt.Println("Loading file no:" + strconv.Itoa(i))
    defer wg.Done()
    r, _, err := b.GetReader("testFile_" + strconv.Itoa(i) + ".htm", nil)
    buf := new(bytes.Buffer)
    buf.ReadFrom(r)
    fmt.Println(err)
    fmt.Println("Completed file no:" + strconv.Itoa(i))
    r.Close()
}

This code works fine if I have about 200 files (i.e. 200 goroutines reading from 200 files), but it crashes if I have to read more. I need to read more than 10,000 files.

The error that I get is:

panic: runtime error: invalid memory address or nil pointer dereference

panic(0x39fde0, 0xc8200100f0)
    /usr/local/go/src/runtime/panic.go:464 +0x3e6
bytes.(*Buffer).ReadFrom(0xc8200d3f18, 0x0, 0x0, 0x0, 0x0, 0x0)
    /usr/local/go/src/bytes/buffer.go:176 +0x239
main.test(0x4c, 0xc8200bc8e0)

The error comes from the call to ReadFrom. Is there a problem with using ReadFrom this way, or is this the wrong way to accomplish the task of reading so many files?

Probably you ran out of memory.

Also note that you discard the error from GetReader before using r:

r, _, err := b.GetReader(...)
if err != nil {
    return // r is nil when err != nil, so ReadFrom would panic
}
n, err := buf.ReadFrom(r)

Check for errors in both calls. The zeroed arguments in the bytes.(*Buffer).ReadFrom frame of your trace suggest that r was nil, i.e. GetReader had already failed. If memory is the limit instead, it will probably be bytes.ErrTooLarge in the ReadFrom call:

ErrTooLarge is passed to panic if memory cannot be allocated to store data in a buffer.
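
Beyond the missing error checks, starting 10,000 goroutines that each buffer an entire file in memory will exhaust memory (and connections) no matter what. The usual fix is to cap how many downloads run at once, for example with a buffered channel used as a semaphore. Below is a minimal sketch of that pattern; the bucket name, the fetch helper, and the cap of 50 are illustrative placeholders, not part of s3gof3r:

package main

import (
    "bytes"
    "fmt"
    "log"
    "strconv"
    "sync"

    "github.com/rlmcpherson/s3gof3r"
)

// fetch downloads one object and checks every error.
// (hypothetical helper for this sketch, not a library function)
func fetch(b *s3gof3r.Bucket, i int) error {
    r, _, err := b.GetReader("testFile_" + strconv.Itoa(i) + ".htm", nil)
    if err != nil {
        return err // r is nil here; calling ReadFrom on it would panic
    }
    defer r.Close()

    buf := new(bytes.Buffer)
    if _, err := buf.ReadFrom(r); err != nil {
        return err
    }
    fmt.Println("Completed file no:" + strconv.Itoa(i))
    return nil
}

func main() {
    keys, err := s3gof3r.EnvKeys() // reads AWS credentials from the environment
    if err != nil {
        log.Fatal(err)
    }
    b := s3gof3r.New("", keys).Bucket("my-bucket") // placeholder bucket name

    const fileNo = 10000
    const maxInFlight = 50 // placeholder cap; tune for your memory and bandwidth

    sem := make(chan struct{}, maxInFlight) // buffered channel as a semaphore
    var wg sync.WaitGroup

    for i := 1; i <= fileNo; i++ {
        wg.Add(1)
        sem <- struct{}{} // blocks while maxInFlight downloads are in flight
        go func(i int) {
            defer wg.Done()
            defer func() { <-sem }()
            if err := fetch(b, i); err != nil {
                log.Printf("file %d: %v", i, err)
            }
        }(i)
    }
    wg.Wait()
}

With the semaphore in place, at most maxInFlight buffers exist at any moment, so memory stays bounded no matter how many files are queued.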