I'm trying to implement 32-bit random sources (MT19937-32, LFSR113 and LFSR88, among others) in Go, but math/rand's Source interface requires an Int63() int64 method. How do I convert a uint32 to a non-negative (63-bit) int64? Here's my LFSR88 code (some methods and constants omitted):
    type LFSR88 struct {
        s1, s2, s3, b uint32
    }

    // ...

    func (lfsr *LFSR88) Uint32() uint32 {
        lfsr.b = ((lfsr.s1 << 13) ^ lfsr.s1) >> 19
        lfsr.s1 = ((lfsr.s1 & 4294967294) << 12) ^ lfsr.b
        lfsr.b = ((lfsr.s2 << 2) ^ lfsr.s2) >> 25
        lfsr.s2 = ((lfsr.s2 & 4294967288) << 4) ^ lfsr.b
        lfsr.b = ((lfsr.s3 << 3) ^ lfsr.s3) >> 11
        lfsr.s3 = ((lfsr.s3 & 4294967280) << 17) ^ lfsr.b
        return lfsr.s1 ^ lfsr.s2 ^ lfsr.s3
    }
Converting a uint32 to an int64 is quite simple:

    var u32 uint32 = /* some number */
    var i64 int64 = int64(u32)
The problem with this alone is that the conversion zero-extends, so the top 32 bits of the int64 are always 0 and you only get 32 bits of randomness. Since Int63 is expected to produce 63 random bits, you probably want to combine two of them:
    var u1, u2 uint32 = /* two numbers */
    var i64 int64 = int64(u1) | int64(u2&0x7FFFFFFF)<<32

Masking the high word with 0x7FFFFFFF drops its top bit, which keeps the result in the non-negative 63-bit range that Int63 promises.
See a complete example here.