Read an entire file of newline-delimited JSON blobs into memory and unmarshal each blob with the fewest conversions in golang?

I'm new to Go, so I don't know a whole lot about the language-specific constructs.

My use case is to first read into memory an input file containing newline-delimited JSON blobs. From this "array" of JSON source, I'd like to unmarshal each element and work with it in Go. The expected structure mapping is already defined.

I typically like to read all lines at once, so ioutil.ReadFile(), as mentioned in How can I read a whole file into a string variable in Golang?, seems like a good choice, and json.Unmarshal takes a byte slice as its source. But if I use ReadFile(), I have a single byte slice for the whole file. How might I extract sub-slices of it such that the newline bytes (the delimiters) are skipped and each sub-slice is one of those JSON blobs? I assume the best technique is one that avoids or at least minimizes data type conversions. The easy hack would be something like the sketch below: convert the byte slice to a string, split the newline-delimited string into a string slice, then convert each element back to bytes to pass to json.Unmarshal. I'd prefer the optimized approach, but I'm not sure how to tackle the implementation details in Go, so I could use some tips here.
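
For concreteness, here is a minimal sketch of that hack; the Record type and the input.txt filename are made-up placeholders, not my actual code:

    package main

    import (
        "encoding/json"
        "fmt"
        "io/ioutil"
        "log"
        "strings"
    )

    // Record is a placeholder for the already-defined structure mapping.
    type Record struct {
        Name string `json:"name"`
    }

    func main() {
        p, err := ioutil.ReadFile("input.txt")
        if err != nil {
            log.Fatal(err)
        }
        // The hack: []byte -> string -> []string -> []byte per element.
        for _, line := range strings.Split(string(p), "\n") {
            if line == "" {
                continue // skip a trailing empty element
            }
            var r Record
            if err := json.Unmarshal([]byte(line), &r); err != nil {
                log.Fatal(err)
            }
            fmt.Printf("%+v\n", r)
        }
    }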

Ideally, I'd like the preprocessing done up front, so that I'm not dealing with newline handling while iterating over the slices. That is, I'd like to preprocess the single byte slice read from the file into a slice of byte slices, with all the newline bytes removed, each element being one of the segments that were delimited by newline.

Use bufio.Scanner to read a line at a time:

    f, err := os.Open(fname)
    if err != nil {
        // handle error
    }
    defer f.Close()

    s := bufio.NewScanner(f)
    for s.Scan() {
        var v ValueTypeToUnmarshalTo
        // The scanner strips the line terminator, so s.Bytes() is exactly one blob.
        if err := json.Unmarshal(s.Bytes(), &v); err != nil {
            // handle error
        }
        // do something with v
    }
    if s.Err() != nil {
        // handle scan error
    }
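
Note that bufio.Scanner caps tokens at bufio.MaxScanTokenSize (64KiB) by default, so Scan fails on longer lines. If any of your JSON blobs can exceed that, give the scanner a larger buffer before the loop; the 1MiB cap below is an arbitrary example, not a recommendation:

    s := bufio.NewScanner(f)
    // Grow the scanner's buffer: start with 64KiB, allow lines up to 1MiB.
    s.Buffer(make([]byte, 0, 64*1024), 1024*1024)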

or use ioutil.ReadFile to slurp up the entire file and bytes.Split to break the file into lines:

    p, err := ioutil.ReadFile(fname)
    if err != nil {
        // handle error
    }
    for _, line := range bytes.Split(p, []byte{'\n'}) {
        var v ValueTypeToUnmarshalTo
        if err := json.Unmarshal(line, &v); err != nil {
            // handle error
        }
        // do something with v
    }
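
One caveat with this approach: if the file ends with a newline (as text files usually do), bytes.Split produces a trailing empty slice, and json.Unmarshal fails on empty input. A guard at the top of the loop body, for example, skips it:

    if len(bytes.TrimSpace(line)) == 0 {
        continue // skip the trailing empty element and any blank lines
    }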

or use the json.Decoder's built-in streaming support to read multiple values from the file:

    f, err := os.Open(fname)
    if err != nil {
        // handle error
    }
    defer f.Close()

    d := json.NewDecoder(f)
    for {
        var v ValueTypeToUnmarshalTo
        if err := d.Decode(&v); err == io.EOF {
            break // done decoding file
        } else if err != nil {
            // handle error
        }
        // do something with v
    }
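
If it helps to see the decoder approach end to end, here is a self-contained variant; the Record type and the in-memory input are stand-ins for your actual struct and the *os.File from os.Open:

    package main

    import (
        "encoding/json"
        "fmt"
        "io"
        "log"
        "strings"
    )

    // Record stands in for ValueTypeToUnmarshalTo; the fields are made up.
    type Record struct {
        Name string `json:"name"`
        Age  int    `json:"age"`
    }

    func main() {
        // Two newline-delimited JSON blobs; strings.NewReader stands in
        // for the file opened with os.Open.
        const data = "{\"name\":\"a\",\"age\":1}\n{\"name\":\"b\",\"age\":2}\n"
        d := json.NewDecoder(strings.NewReader(data))
        for {
            var v Record
            if err := d.Decode(&v); err == io.EOF {
                break // done decoding
            } else if err != nil {
                log.Fatal(err)
            }
            fmt.Printf("%+v\n", v)
        }
    }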


The ioutil.ReadFile approach uses more memory than the other approaches: it holds the entire file in memory at once (one byte for each byte in the file) plus one slice header for each line.

Because the JSON decoder and unmarshaler ignore whitespace around a value, all three approaches handle line terminators, including \r\n.

There are no data conversions in any of these approaches other than those inherent to unmarshalling JSON bytes to Go values.