Avro 1.13.0 no longer allows nested record redefining
If I have two .avsc files, the plugin would create distinct objects for the records up until 1.13.0. Since this version, however, the Avro plugin throws a "Can't redefine" error in ParseContext.java.
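A minimal way to reproduce the same failure outside the build is to parse both schemas with a single Schema.Parser, which keeps one set of known names much like the shared parse context the newer plugin versions use across .avsc files. The schema strings below are hypothetical stand-ins for two .avsc files that both inline the same nested record:

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaParseException;

public class RedefineDemo {
    // Hypothetical stand-ins for two .avsc files that both define "Address" inline.
    private static final String SCHEMA_A = """
        {"type":"record","name":"Customer","fields":[
          {"name":"address","type":{"type":"record","name":"Address","fields":[
            {"name":"street","type":"string"}]}}]}""";

    private static final String SCHEMA_B = """
        {"type":"record","name":"Supplier","fields":[
          {"name":"address","type":{"type":"record","name":"Address","fields":[
            {"name":"street","type":"string"}]}}]}""";

    public static void main(String[] args) {
        // One Parser instance keeps a single set of known record names.
        Schema.Parser parser = new Schema.Parser();
        parser.parse(SCHEMA_A);
        try {
            parser.parse(SCHEMA_B);            // second definition of "Address"
        } catch (SchemaParseException e) {
            System.out.println(e.getMessage()); // e.g. "Can't redefine: Address"
        }
    }
}
```

A common workaround is to define the shared record once in its own .avsc file and reference it by name from the other schemas, making sure the plugin parses that file first (for example via its import configuration), so each record name is defined exactly once.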
Enhanced switch for non-public classes
Occasionally a 3rd-party library contains APIs that return non-public classes which you cannot reference directly. One such example is org.apache.avro.generic.GenericRecord.get(), which can sometimes return a java.nio.HeapByteBuffer object. If I want to switch over that class like so, I get a compile error.
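Since java.nio.HeapByteBuffer is package-private, a pattern-matching switch (Java 21+) can match on its public supertype java.nio.ByteBuffer instead. A minimal sketch, with a hypothetical field name and branch bodies:

```java
import java.nio.ByteBuffer;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.util.Utf8;

public class SwitchOverRecordValue {
    // GenericRecord.get() returns Object; the concrete classes (HeapByteBuffer,
    // Utf8, ...) are implementation details, so match on the public supertypes.
    static String describe(GenericRecord record, String field) {
        Object value = record.get(field);
        return switch (value) {
            // case HeapByteBuffer b -> ...  // does not compile: the class is not public
            case ByteBuffer b -> "bytes, length " + b.remaining();
            case Utf8 s       -> "string: " + s;
            case null, default -> "other: " + value;
        };
    }
}
```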
How to write multiple avro objects into a ByteArrayOutputStream
The Avro documentation page has an example that writes multiple Avro objects into a file.
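DataFileWriter.create accepts any OutputStream, not just a File, so the documentation pattern carries over to a ByteArrayOutputStream directly. A minimal sketch; the writeAll helper and the generic records it receives are illustrative, not part of the documentation example:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.file.DataFileWriter;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;

public class WriteToByteArray {
    static byte[] writeAll(Schema schema, Iterable<GenericRecord> records) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // create(Schema, OutputStream) writes the container-file header into the
        // byte stream, then each append() adds one record.
        try (DataFileWriter<GenericRecord> writer =
                 new DataFileWriter<>(new GenericDatumWriter<>(schema))) {
            writer.create(schema, out);
            for (GenericRecord r : records) {
                writer.append(r);
            }
        }
        return out.toByteArray();
    }
}
```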
avro, difference between .getEncoder.encode() and SpecificDatumWriter()
I have an Avro schema. Compiling it generates the model class in Java; let's call it the Test1 class.
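As far as I understand, the two paths differ mainly in framing: the generated class's getEncoder() returns a BinaryMessageEncoder that produces Avro's single-object encoding (a short marker plus a schema fingerprint ahead of the payload), while SpecificDatumWriter with a BinaryEncoder writes only the raw datum bytes. A hedged sketch against the question's generated Test1 class (which will not compile without that generated code):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;

public class EncodeComparison {
    static void compare(Test1 record) throws IOException {
        // 1) Generated helper: single-object encoding, i.e. the payload is
        //    preceded by a marker and the writer schema's fingerprint.
        ByteBuffer framed = Test1.getEncoder().encode(record);

        // 2) SpecificDatumWriter + BinaryEncoder: only the raw datum bytes,
        //    so the reader must already know which schema wrote them.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new SpecificDatumWriter<>(Test1.class).write(record, encoder);
        encoder.flush();
        byte[] raw = out.toByteArray();

        System.out.println("framed: " + framed.remaining() + " bytes, raw: " + raw.length + " bytes");
    }
}
```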
Avro, Schema Evolution, Backward-Compatibility
The Avro specification declares that extending a schema with optional fields is backward-compatible. Unfortunately, it does not work with binary streams for us, and I do not know how to fix it. I have written a simple demo app to demonstrate the problem: the producer creates an instance, serializes it, and saves it to a file; the consumer reads the file and deserializes the stream. If the optional field is added to the schema and the schema is compiled (Maven plugin), then instances based on the previous version of the schema cannot be serialized. The behavior is different when we use DataFileWriter/Reader; in that case it works, but we need binary streams because we use Kafka and the messages contain the serialized data.
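With raw binary encoding nothing in the bytes identifies the writer schema, so schema resolution only happens if the consumer passes both the old writer schema and its own reader schema to the datum reader. DataFileWriter/Reader works because the container file embeds the writer schema; with Kafka this role is usually played by a schema registry. A minimal sketch, assuming a hypothetical generated class Payment for the new schema and a stored copy of the old schema:

```java
import java.io.IOException;
import org.apache.avro.Schema;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;

public class BackwardCompatibleRead {
    // writerSchema: the old schema the bytes were produced with, obtained out of
    // band (e.g. from a schema registry or a stored .avsc file).
    // Payment: hypothetical class generated from the *new* schema; the added
    // optional field is filled with its default during schema resolution.
    static Payment read(byte[] bytes, Schema writerSchema) throws IOException {
        SpecificDatumReader<Payment> reader =
            new SpecificDatumReader<>(writerSchema, Payment.getClassSchema());
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
        return reader.read(null, decoder);
    }
}
```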