Currently I convert an object to JSON and gzip it like this:
jsonBytes, _ := json.Marshal(payload)
// gzip the JSON
var body bytes.Buffer
g := gzip.NewWriter(&body)
g.Write(jsonBytes)
g.Close()
This produces a large intermediate byte slice, jsonBytes, whose only purpose is to be fed into the gzip writer.
Is there any way to stream the marshalling of the payload
object so it comes out gzipped in the first place?
Yes, you may use json.Encoder to stream the JSON output, and similarly json.Decoder to decode a streamed JSON input. They take any io.Writer and io.Reader to write the JSON result to / read the JSON from, including gzip.Writer and gzip.Reader.
For example:
var body bytes.Buffer
w := gzip.NewWriter(&body)
enc := json.NewEncoder(w)
payload := map[string]interface{}{
	"one": 1, "two": 2,
}
if err := enc.Encode(payload); err != nil {
	panic(err)
}
// Close flushes any remaining compressed data; without it
// the gzip stream is truncated and unreadable.
if err := w.Close(); err != nil {
	panic(err)
}
To verify that it works, here is how we can decode it:
r, err := gzip.NewReader(&body)
if err != nil {
	panic(err)
}
dec := json.NewDecoder(r)
payload = nil
if err := dec.Decode(&payload); err != nil {
	panic(err)
}
fmt.Println("Decoded:", payload)
This will output (try it on the Go Playground):
Decoded: map[one:1 two:2]