Golang JSON time default layout differs between platforms?

I'm having a strange issue where the JSON-encoded string of a time.Time varies between my development environment (OSX) and production environment (Ubuntu 14.04 x64).

type Thang struct {
  CreatedAt time.Time `json:"created_at"`
}

OSX:

{
  // RFC3339 layout
  "created_at": "2015-04-24T22:39:59Z"
}

Ubuntu 14.04 x64:

{
  // RFC3339Nano layout
  "created_at": "2015-05-21T15:00:46.546000003Z"
}

I've googled around forever. Can't figure this one out. Here's some more info:

  • It's a straightforward web services app
  • Running Go 1.4.1 on my OSX machine
  • I cross compile the app on my OSX machine for deployment like this:
    • GOOS=linux GOARCH=amd64 go build

Would appreciate any insight!
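For what it's worth, here is a minimal program that shows the marshalling in isolation (time.Now() stands in for however CreatedAt is actually populated in my app):

package main

import (
    "encoding/json"
    "fmt"
    "time"
)

type Thang struct {
    CreatedAt time.Time `json:"created_at"`
}

func main() {
    // time.Now() carries whatever resolution the platform clock
    // provides, so the fractional seconds in the output can differ
    // between systems.
    b, err := json.Marshal(Thang{CreatedAt: time.Now()})
    if err != nil {
        panic(err)
    }
    fmt.Println(string(b))
}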

The source for time.Time's MarshalJSON is:

// MarshalJSON implements the json.Marshaler interface.
// The time is a quoted string in RFC 3339 format, with sub-second precision added if present.
func (t Time) MarshalJSON() ([]byte, error) {
        if y := t.Year(); y < 0 || y >= 10000 {
                // RFC 3339 is clear that years are 4 digits exactly.
                // See golang.org/issue/4556#c15 for more discussion.
                return nil, errors.New("Time.MarshalJSON: year outside of range [0,9999]")
        }
        return []byte(t.Format(`"` + RFC3339Nano + `"`)), nil
}

[time.MarshalJSON source on GitHub]

and is the same on all platforms. Note:

[…] with sub-second precision added if present.

And that RFC3339Nano = "2006-01-02T15:04:05.999999999Z07:00". The 9s mean "use up to that many digits, dropping trailing zeros" (the fractional part is omitted entirely when it is zero). So both of your examples match this format. (And when parsing times, Go always accepts fractional seconds no matter what the layout string says.)
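A quick demonstration of both points (a sketch; the timestamps are taken from your examples):

package main

import (
    "fmt"
    "time"
)

func main() {
    whole := time.Date(2015, 4, 24, 22, 39, 59, 0, time.UTC)
    frac := time.Date(2015, 5, 21, 15, 0, 46, 546000003, time.UTC)

    // RFC3339Nano trims trailing zeros and drops a zero fraction entirely.
    fmt.Println(whole.Format(time.RFC3339Nano)) // 2015-04-24T22:39:59Z
    fmt.Println(frac.Format(time.RFC3339Nano))  // 2015-05-21T15:00:46.546000003Z

    // Parsing accepts fractional seconds even though the RFC3339
    // layout string has none.
    t, err := time.Parse(time.RFC3339, "2015-05-21T15:00:46.546000003Z")
    fmt.Println(t, err) // parses without error
}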

Section 5.6 of RFC 3339 makes the fractional seconds optional, saying only that if the decimal point is present it must be followed by at least one digit.

Does the time you're using on one system have only whole-second accuracy for some reason? (E.g. does it come from a filesystem or some other subsystem that only stores whole seconds on one of the OSes?)
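One way to check is to inspect the value where it is produced, before it is ever marshalled (a sketch; reportPrecision is a hypothetical helper):

package main

import (
    "fmt"
    "time"
)

// reportPrecision is a hypothetical helper: call it wherever the time
// originates (time.Now(), a database read, etc.) to see whether the
// value carries sub-second data at all.
func reportPrecision(t time.Time) {
    if t.Nanosecond() == 0 {
        fmt.Println("whole seconds only: marshals without a fractional part")
    } else {
        fmt.Printf("sub-second data present: %d ns\n", t.Nanosecond())
    }
}

func main() {
    reportPrecision(time.Now())
}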

RFC 3339 doesn't say anything about the precision of the time. It seems that on Ubuntu the clock simply supplies another nine decimal digits of precision, which is fine according to RFC 3339.
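If you want identical output on both platforms regardless of clock resolution, one option (my suggestion, not anything the standard library requires) is a wrapper type whose MarshalJSON always emits whole seconds:

package main

import (
    "encoding/json"
    "fmt"
    "time"
)

// SecondTime is a hypothetical wrapper that always marshals with
// whole-second precision, so output is identical on every platform.
type SecondTime struct {
    time.Time
}

func (t SecondTime) MarshalJSON() ([]byte, error) {
    // time.RFC3339 has no fractional-second digits, so the output
    // never includes a fraction, whatever the value holds.
    return []byte(t.Format(`"` + time.RFC3339 + `"`)), nil
}

type Thang struct {
    CreatedAt SecondTime `json:"created_at"`
}

func main() {
    b, _ := json.Marshal(Thang{CreatedAt: SecondTime{time.Now()}})
    fmt.Println(string(b)) // e.g. {"created_at":"2015-05-21T15:00:46Z"}
}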

On iOS / Mac OS X, a single date format parser cannot parse seconds both with and without a decimal point, so I use two formats

"yyyy'-'MM'-'dd'T'HH':'mm':'ssXXXXX"
"yyyy'-'MM'-'dd'T'HH':'mm':'ss.SSSSSSSSSXXXXX"

and try both. (Actually I first had six S characters, because I thought nobody sane would use more than six decimals. That was wrong.)

I think the date format patterns on iOS / Mac OS X are standardised by Unicode (TR35), so this might apply to other parsers as well. For efficiency, I would remember which format worked the last time and try that one first: if one date arrived with nanoseconds, the next date is likely to have nanoseconds as well.
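To illustrate that "remember the last winner" idea in this question's language, here is a rough Go sketch (note that Go's time.Parse already tolerates fractional seconds with the plain RFC3339 layout, so in Go itself you would only need this for other layout variations):

package main

import (
    "fmt"
    "time"
)

// layouts lists the candidate formats, most recently successful first.
// (Not goroutine-safe as written; a real version would need a lock.)
var layouts = []string{
    time.RFC3339,
    time.RFC3339Nano,
}

func parseAny(s string) (time.Time, error) {
    for i, layout := range layouts {
        if t, err := time.Parse(layout, s); err == nil {
            // Move the winner to the front: the next timestamp is
            // likely in the same format, so try it first next time.
            if i > 0 {
                layouts[0], layouts[i] = layouts[i], layouts[0]
            }
            return t, nil
        }
    }
    return time.Time{}, fmt.Errorf("no layout matched %q", s)
}

func main() {
    t, err := parseAny("2015-05-21T15:00:46.546000003Z")
    fmt.Println(t, err)
}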