GORM in goroutines freezes when there are too many queries

I've decided to build a service that creates a CSV report.

In use: Go 1.12, GORM (as a PostgreSQL ORM)

func main() {
  ... // init DB connection etc
  defer db.Close()
  go fetch(db)
  for {} // keeps the process alive
}

func fetch(db *gorm.DB) {
  ... // some code: count pages, create the file, etc.
  sqlLimit := 20000 // page size
  for i := 0; i < pages; i++ {
    db.Table("reports_bookings").Debug().Where(sql).Offset(i * sqlLimit).Limit(sqlLimit).Find(&myModels)
    ... // code: write the rows to the file
  }
}

When the code tries to fetch data, it just freezes. If I decrease the limit and set it to 100, for example, it runs the SQL twice and then freezes.

Debug() shows nothing either. As I said, it seems frozen. One CPU core is fully loaded.

It works okay without goroutines. Can you help me figure out why it doesn't work in goroutines?

Thanks in advance.

EDIT

Maybe my approach is bad and you can suggest how to fix it. In the end, the file should be uploaded to S3 (for example).

You need to wait for all goroutines to complete before exiting the program.

func main() {
  ... // init DB connection etc
  wg := sync.WaitGroup{}
  defer db.Close()
  wg.Add(1) // number of goroutines
  go fetch(db, &wg)
  wg.Wait() // wait for goroutines before exiting
}

func fetch(db *gorm.DB, wg *sync.WaitGroup) {
  defer wg.Done() // signal completion so wg.Wait() can return
  ... // some code: count pages, create the file, etc.
  sqlLimit := 20000 // page size
  for i := 0; i < pages; i++ {
    db.Table("reports_bookings").Debug().Where(sql).Offset(i * sqlLimit).Limit(sqlLimit).Find(&myModels)
    ... // code: write the rows to the file
  }
}

Otherwise, your program would exit before your goroutines are finished. Note that fetch must call wg.Done() (added above via defer), or wg.Wait() will block forever.

The for {} busy loop is also what explains the freeze rather than just a wasted core: in Go 1.12, a goroutine can only be preempted at function calls, and an empty loop never makes one. As soon as the runtime needs to stop the world (for example, for a garbage collection triggered by the allocations your queries make), it waits on the spinning goroutine indefinitely and the whole program hangs, which matches the single fully loaded core you observed. wg.Wait() parks the main goroutine cheaply instead.
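
Regarding the EDIT: once fetch has finished writing the file, you can upload it to S3. Here is a minimal sketch using the aws-sdk-go (v1) s3manager package; the uploadReport helper, bucket name, and key are placeholders I made up, not part of your code:

package main

import (
  "os"

  "github.com/aws/aws-sdk-go/aws"
  "github.com/aws/aws-sdk-go/aws/session"
  "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

// uploadReport uploads the finished CSV file at path to S3.
func uploadReport(path string) error {
  f, err := os.Open(path)
  if err != nil {
    return err
  }
  defer f.Close()

  // Credentials and region are read from the environment / shared config.
  sess := session.Must(session.NewSession())
  uploader := s3manager.NewUploader(sess)

  _, err = uploader.Upload(&s3manager.UploadInput{
    Bucket: aws.String("my-reports-bucket"),      // placeholder
    Key:    aws.String("reports/bookings.csv"),   // placeholder
    Body:   f,
  })
  return err
}

You could call uploadReport as the last step of fetch, after the page loop has written all rows; the deferred wg.Done() then releases wg.Wait() only once the upload is done too.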