What is the best way to read large CSV files? At the moment I am reading one record at a time rather than using ReadAll():
reader := csv.NewReader(csvFile)
reader.FieldsPerRecord = -1
for {
    // read just one record at a time
    record, err := reader.Read()
    if err == io.EOF {
        break
    } else if err != nil {
        checkErr(err)
        return
    }
    // ... send the record to the database service ...
    _ = record
}
Is there a better way to do this to save memory?
I am writing each record/row to a database by sending an array over gRPC to a separate service.
Yes, there is one option you can use to improve it: you can allow the reader to reuse the slice it returns on each Read call, instead of allocating a new one per record. To do that, set reader.ReuseRecord = true.

But be careful: with this setting the returned slice may be overwritten by the next call to Read, so copy any fields you need to keep before reading the next record.