I have code like this in Node.js:
// Create Buffer from hex representation
b = new Buffer('002400050200000000320000000003847209cd4450ff94ad8c0000000002c581000001d3', 'hex')
// Read with offset 0
b.readUInt16BE(0) // -> Out: 36
It reads a string which is a hex representation of data. When the first two bytes are read with readUInt16BE, an int (36) is obtained. That is the expected behavior.
I need to replicate this behavior in Go, but I am having some trouble.
1) How do I create a buffer from a string in hex format? 2) How do I implement a readUInt16BE function so that I obtain 36 when the first two bytes are read?
I can create a buffer with 00 24, but I need to be able to use an arbitrary string.
// Creates buffer [00 24]
v := make([]byte, 0, 2)
v = append(v, 0)
v = append(v, 24)
fmt.Println(v) // -> out: [0 24]
Finally, I am a little confused by the function binary.BigEndian.Uint16, which is returning 24 instead of 36.
x := binary.BigEndian.Uint16(v)
fmt.Println(x) // -> out: 24
Can you help me to understand this?
You are appending the decimal value 24 to the buffer, when you should be appending the hexadecimal value 0x24 (36 in decimal):
v := make([]byte, 0, 2)
v = append(v, 0)
v = append(v, 0x24)
fmt.Println(v) // -> out: [0 36]
x := binary.BigEndian.Uint16(v)
fmt.Println(x) // -> out: 36
Converting the original Node.js code to Go would look something like the following:
import (
    "encoding/binary"
    "encoding/hex"
)

// Decode the hex string into a []byte (the equivalent of creating a Buffer from hex).
b, err := hex.DecodeString(`002400050200000000320000000003847209cd4450ff94ad8c0000000002c581000001d3`)
if err != nil {
    // TODO: handle error
}

// Read the first two bytes as a big-endian uint16 (the equivalent of readUInt16BE(0)); this yields 36.
_ = binary.BigEndian.Uint16(b[:2])
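If you want a helper with the same shape as Node's readUInt16BE, a minimal sketch could look like this. The readUInt16BE name, the error-returning signature, and the bounds check are my own additions for illustration; the standard library only provides binary.BigEndian.Uint16, which panics if the slice holds fewer than two bytes.

import (
    "encoding/binary"
    "fmt"
)

// readUInt16BE mimics Node's Buffer#readUInt16BE: it reads a big-endian
// uint16 at the given offset and returns an error instead of panicking
// when the slice is too short. (Illustrative helper, not a stdlib function.)
func readUInt16BE(b []byte, offset int) (uint16, error) {
    if offset < 0 || offset+2 > len(b) {
        return 0, fmt.Errorf("readUInt16BE: offset %d out of range for buffer of length %d", offset, len(b))
    }
    return binary.BigEndian.Uint16(b[offset : offset+2]), nil
}

With the decoded buffer from above, readUInt16BE(b, 0) would return 36, matching the Node.js output.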