I'm trying to integrate Avro and Schema Registry into our Kafka data pipelines. Right now the flow inside the Go service looks like:
Get data -> Encode data to JSON -> Write to Kafka
I want to use an Avro schema in the same way:
Generate schema -> Update schema in Schema Registry -> Get data -> Encode to Avro -> Write to Kafka
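To make the intent concrete, here is a rough sketch of the producing path I have in mind, using github.com/linkedin/goavro/v2 for encoding and github.com/segmentio/kafka-go for the producer. The registry URL, topic, subject, and schema are placeholders, and prepending the Confluent wire-format header is an assumption (consumers built on Confluent deserializers expect it):

    package main

    import (
        "bytes"
        "context"
        "encoding/binary"
        "encoding/json"
        "log"
        "net/http"

        "github.com/linkedin/goavro/v2"
        "github.com/segmentio/kafka-go"
    )

    const schema = `{"type":"record","name":"Event","fields":[
        {"name":"id","type":"long"},
        {"name":"payload","type":"string"}]}`

    // registerSchema posts the schema to Schema Registry under the given
    // subject and returns the schema ID the registry assigns.
    func registerSchema(registryURL, subject string) (int, error) {
        body, _ := json.Marshal(map[string]string{"schema": schema})
        resp, err := http.Post(registryURL+"/subjects/"+subject+"/versions",
            "application/vnd.schemaregistry.v1+json", bytes.NewReader(body))
        if err != nil {
            return 0, err
        }
        defer resp.Body.Close()
        var out struct {
            ID int `json:"id"`
        }
        return out.ID, json.NewDecoder(resp.Body).Decode(&out)
    }

    func main() {
        id, err := registerSchema("http://localhost:8081", "events-value")
        if err != nil {
            log.Fatal(err)
        }

        codec, err := goavro.NewCodec(schema)
        if err != nil {
            log.Fatal(err)
        }
        // Encode one record to Avro binary.
        avroBytes, err := codec.BinaryFromNative(nil, map[string]interface{}{
            "id": int64(42), "payload": "hello",
        })
        if err != nil {
            log.Fatal(err)
        }

        // Confluent wire format: magic byte 0x0, 4-byte big-endian schema ID,
        // then the Avro-encoded body.
        msg := append([]byte{0}, make([]byte, 4)...)
        binary.BigEndian.PutUint32(msg[1:5], uint32(id))
        msg = append(msg, avroBytes...)

        w := &kafka.Writer{Addr: kafka.TCP("localhost:9092"), Topic: "events"}
        defer w.Close()
        if err := w.WriteMessages(context.Background(), kafka.Message{Value: msg}); err != nil {
            log.Fatal(err)
        }
    }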
But there are several questions:
There are a few alternatives for generating Go types from Avro schemas. Did you try https://github.com/actgardner/gogen-avro?
Usage: to generate Go source files from one or more Avro schema files, run:

gogen-avro [--package=<package name>] [--containers] <output directory> <avro schema files>
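For example, given a user.avsc like

    {"type": "record", "name": "User", "fields": [
        {"name": "id", "type": "long"},
        {"name": "email", "type": "string"}
    ]}

running gogen-avro --package=avro ./avro user.avsc should generate a User struct in ./avro. In recent gogen-avro versions the generated type exposes a Serialize(io.Writer) method; the field and method names below are assumptions about the generated output, so check the generated files for the exact API:

    package main

    import (
        "bytes"
        "log"

        "yourmodule/avro" // hypothetical import path for the generated package
    )

    func main() {
        // Field names are the Go-exported forms of the Avro field names.
        u := avro.User{Id: 42, Email: "user@example.com"}

        // Serialize writes the record as plain Avro binary (no Schema
        // Registry ID prefix; you still prepend that yourself).
        var buf bytes.Buffer
        if err := u.Serialize(&buf); err != nil {
            log.Fatal(err)
        }
        log.Printf("encoded %d bytes", buf.Len())
    }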
Confluent Schema Registry provides several ways to check compatibility between schemas. You can take a look at their API here. There are other ways to do it, for example using Maven during your integration tests to ensure compatibility between your source and other environments. You can find some information here.
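As a sketch, a pre-deployment compatibility check boils down to one HTTP call against POST /compatibility/subjects/<subject>/versions/latest (the registry URL, subject "events-value", and schema here are placeholders):

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "log"
        "net/http"
    )

    // Candidate schema to validate before registering a new version.
    const newSchema = `{"type":"record","name":"Event","fields":[{"name":"id","type":"long"}]}`

    func main() {
        body, _ := json.Marshal(map[string]string{"schema": newSchema})

        // Ask Schema Registry whether newSchema is compatible with the
        // latest version registered under the subject "events-value".
        resp, err := http.Post(
            "http://localhost:8081/compatibility/subjects/events-value/versions/latest",
            "application/vnd.schemaregistry.v1+json",
            bytes.NewReader(body),
        )
        if err != nil {
            log.Fatal(err)
        }
        defer resp.Body.Close()

        var out struct {
            IsCompatible bool `json:"is_compatible"`
        }
        if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
            log.Fatal(err)
        }
        fmt.Println("compatible:", out.IsCompatible)
    }

Running a check like this in CI and failing the build when is_compatible comes back false gives you roughly the same guarantee as the Maven plugin approach.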