I need help.
Debezium (https://debezium.io/) encodes decimal values as a base64 string containing the unscaled integer, with the scale passed as a separate parameter in the JSON. In Java it can be decoded like in this example:
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Base64;

public class HelloWorld {
    public static void main(String[] args) {
        String encoded = "lTBA";
        int scale = 4;
        final BigDecimal decoded = new BigDecimal(new BigInteger(Base64.getDecoder().decode(encoded)), scale);
        System.out.println(decoded);
    }
}
"lTBA" = -700
But in Go I'm having trouble with a similar example:
package main

import (
	"encoding/base64"
	"fmt"
	"math/big"
)

func main() {
	data, _ := base64.StdEncoding.DecodeString("lTBA") // error ignored for brevity
	bigInt := new(big.Int).SetBytes(data).String()
	fmt.Println(bigInt)
}
It returns 9777216 (https://play.golang.org/p/3Xdq9x6B9V-).
That's because big.Int.SetBytes from "math/big" always interprets the bytes as a positive (unsigned) value:
// SetBytes interprets buf as the bytes of a big-endian unsigned
// integer, sets z to that value, and returns z.
func (z *Int) SetBytes(buf []byte) *Int {
	z.abs = z.abs.setBytes(buf)
	z.neg = false
	return z
}
Java's BigInteger constructor expects the number in 2's complement binary representation, while Go's big.Int.SetBytes() expects an unsigned integer value (in big-endian byte order):

SetBytes interprets buf as the bytes of a big-endian unsigned integer, sets z to that value, and returns z.
What we can do is proceed using Int.SetBytes(), but if the number is negative, we have to transform it to the number the binary data represents in 2's complement. We can tell it's negative if its first bit is 1 (which is the highest bit of the first byte).

This transformation is simple: if the input is represented using n bytes, construct a number using n+1 bytes where the first is 1 and the rest are 0 (the max number representable on n bytes, plus 1). Subtracting (Int.Sub()) the input from this number gives the absolute value of the number in 2's complement, so we just have to apply a negative sign on it: Int.Neg().
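To make this concrete, here is a minimal sketch that applies just the sign transformation (no scale yet) to the decoded bytes of "lTBA":

package main

import (
	"fmt"
	"math/big"
)

func main() {
	data := []byte{0x95, 0x30, 0x40} // the decoded bytes of base64 "lTBA"
	v := new(big.Int).SetBytes(data) // unsigned reading: 9777216

	// The highest bit of the first byte is 1, so the value is negative.
	if data[0]&0x80 != 0 {
		// n+1 bytes, first is 1, rest are 0: 0x01000000 = 2^24 = 16777216.
		ref := make([]byte, len(data)+1)
		ref[0] = 1
		v.Sub(new(big.Int).SetBytes(ref), v) // 16777216 - 9777216 = 7000000
		v.Neg(v)                             // apply the sign: -7000000
	}
	fmt.Println(v) // -7000000
}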
Scale can be applied by dividing (Int.Div()) the result by 10^scale. We may construct such a number by appending scale zeros to a "1".
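In this example the divisor is 10^4 = 10000, and -7000000 / 10000 = -700.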
Here is a decode()
function that does all this. It's not optimized for performance, but it does the job:
// decode decodes a Debezium-style base64 2's complement value
// into a big.Int and applies the given scale.
func decode(in string, scale int) (out *big.Int, err error) {
	data, err := base64.StdEncoding.DecodeString(in)
	if err != nil {
		return
	}
	out = new(big.Int).SetBytes(data)
	// Check if negative:
	if len(data) > 0 && data[0]&0x80 != 0 {
		// It's negative.
		// Convert 2's complement negative to abs big-endian:
		data2 := make([]byte, len(data)+1)
		data2[0] = 1
		temp := new(big.Int).SetBytes(data2)
		out.Sub(temp, out)
		// Apply negative sign:
		out.Neg(out)
	}
	// Apply scale ("strings" must also be imported):
	if scale > 0 {
		temp, _ := new(big.Int).SetString("1"+strings.Repeat("0", scale), 10)
		out.Div(out, temp)
	}
	return
}
Example testing it:
n, err := decode("lTBA", 4)
fmt.Println(n, err)
Output (try it on the Go Playground):
-700 <nil>