I'm doing some work with integer compression.
I've implemented a variable-byte encoding algorithm in C++ (see the snippet below).
I wonder how to implement it in Go, since I can't reinterpret memory between an integer type and raw bytes the way memcpy() does.
Then I found that binary.Write() in package encoding/binary can do the serializing work: it encodes a uint8 into one byte, a uint16 into 2 bytes, a uint32 into 4 bytes, and so on.
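For reference, here is a minimal sketch of what I mean by that fixed-width behaviour (the little-endian byte order is just an example choice):

package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

func main() {
	var buf bytes.Buffer
	// Each fixed-width type serializes to exactly its own size:
	// uint8 -> 1 byte, uint16 -> 2 bytes, uint32 -> 4 bytes.
	// Error handling omitted: writes of fixed-size types to a bytes.Buffer cannot fail.
	binary.Write(&buf, binary.LittleEndian, uint8(0x12))
	binary.Write(&buf, binary.LittleEndian, uint16(0x3456))
	binary.Write(&buf, binary.LittleEndian, uint32(0x789abcde))
	fmt.Println(buf.Len(), buf.Bytes()) // 7 bytes in total
}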
But how do I encode an integer that is between 2097152 and 268435456 using only 3 bytes? Is there a way to do a conversion similar to the snippet below?
#include <cstring>  // memcpy

// Variable-byte encode: append 1-4 bytes for `value` to code_list at offset len.
// The low bits of the first byte record how many bytes were used.
void encode(int value, char* code_list, int& len) {
    int bit_value = 0;  // tag stored in the low bits of the first byte
    int bit_num = 0;    // number of bytes to emit
    if (value < 128) {            // fits in 7 bits
        bit_num = 1;
    } else if (value < 16384) {   // fits in 14 bits
        bit_num = 2;
        bit_value = 1;
    } else if (value < 2097152) { // fits in 21 bits
        bit_num = 3;
        bit_value = 3;
    } else {                      // up to 28 bits
        bit_num = 4;
        bit_value = 7;
    }
    value <<= bit_num;   // make room for the tag bits
    value += bit_value;  // stamp the tag into the low bits
    memcpy(code_list + len, (char*) &value, bit_num);  // copy the low bit_num bytes (little-endian only)
    len += bit_num;
}
Your encoding is such that the number of trailing 1 bits in the first byte tells you how many bytes the encoded value occupies: zero trailing ones means 1 byte, one means 2 bytes, and so on (a decoder sketch follows the program below).
Here's a Go implementation of your code that avoids depending on endianness (which your C++ version does) and uses an io.Writer rather than something like memcpy.
See it run at: https://play.golang.org/p/jr0NypSnlW
package main

import (
	"bytes"
	"fmt"
	"io"
)

// encode writes n (which must be < 2^28) to w using 1-4 bytes.
// The number of trailing 1 bits in the first byte is one less than
// the total number of bytes used.
func encode(w io.Writer, n uint64) error {
	size := 0
	switch {
	case n < 128: // 7 bits of payload, low bit 0
		size = 1
		n = n << 1
	case n < 16384: // 14 bits of payload, low bits 01
		size = 2
		n = (n << 2) | 1
	case n < 2097152: // 21 bits of payload, low bits 011
		size = 3
		n = (n << 3) | 3
	default: // 28 bits of payload, low bits 0111
		size = 4
		n = (n << 4) | 7
	}
	// Emit the low `size` bytes explicitly, least-significant first,
	// so the output does not depend on the machine's endianness.
	d := [4]byte{
		byte(n), byte(n >> 8), byte(n >> 16), byte(n >> 24),
	}
	_, err := w.Write(d[:size])
	return err
}

func main() {
	xs := []uint64{0, 32, 20003, 60006, 300009}
	var b bytes.Buffer
	for _, x := range xs {
		if err := encode(&b, x); err != nil {
			panic(err)
		}
	}
	fmt.Println(b.Bytes())
}
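For completeness, here's a rough sketch of a matching decoder (just to illustrate the trailing-ones idea; untested beyond the encoder above). It can be pasted into the same file and needs only the io import that is already there.

// decode reads one variable-byte value written by encode.
func decode(r io.ByteReader) (uint64, error) {
	first, err := r.ReadByte()
	if err != nil {
		return 0, err
	}
	// Count trailing 1 bits: 0 trailing ones -> 1 byte total, 1 -> 2 bytes, and so on.
	size := 1
	for b := first; b&1 == 1; b >>= 1 {
		size++
	}
	n := uint64(first)
	for i := 1; i < size; i++ {
		b, err := r.ReadByte()
		if err != nil {
			return 0, err
		}
		n |= uint64(b) << (8 * uint(i))
	}
	// Drop the `size` tag bits at the bottom to recover the original value.
	return n >> uint(size), nil
}

Since bytes.Buffer implements io.ByteReader, calling decode(&b) five times at the end of main should yield the original values 0, 32, 20003, 60006 and 300009.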