Deserialized Avro, using AvroEx, contains incorrect data

When deserializing Avro using AvroEx, I was getting an error, so I decided to start with a simple example to see how things work.

However, while I don't get an error in this case, I do get the wrong data back.

The data is serialized in a Ruby application as follows:

id = 1

# AvroTurf looks schemas up by name under schemas_path ("test" -> test.avsc).
encoder = AvroTurf.new(schemas_path: Rails.root.join("kafka_schemas"))
message = encoder.encode({ "a" => id }, schema_name: "test")
File.write("/tmp/message.avro", message)

and then deserialized as follows in Elixir:

schema = AvroEx.decode_schema!(File.read!("#{:code.priv_dir(:kafka)}/schemas/test.avsc"))

# Raw bytes written by the Ruby app above.
p = File.read!("/tmp/message.avro")
%{"a" => id} = AvroEx.decode!(schema, p)

id # => -40

As you can see, I get a different id back: -40 instead of 1.

The schema both sides are reading is as follows:

{
  "type": "record",
  "name": "test",
  "doc": "Test",
  "fields": [
    {
      "name": "a",
      "type": "int"
    }
  ]
}
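
For what it's worth, the schema and AvroEx look fine on their own. A minimal round trip done entirely in Elixir (my own sanity check, using the same test.avsc as above) gives the value back:

schema = AvroEx.decode_schema!(File.read!("#{:code.priv_dir(:kafka)}/schemas/test.avsc"))

# Encode the same record with AvroEx itself. If I read the spec right, a record
# with a single int field of 1 is just the zig-zag varint for 1, i.e. <<2>>.
encoded = AvroEx.encode!(schema, %{"a" => 1})

AvroEx.decode!(schema, encoded) # => %{"a" => 1}

So AvroEx can decode its own output for this schema; the problem only shows up with the file written by Ruby.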

Is it possible this is due to the encoding of the file rather than raw bytes? I tried using File.binwrite on the Ruby side and it produces the same file (same checksum).
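
One way to check the raw-bytes question directly (a debugging sketch, not something from the app) is to look at the size and leading bytes of the file:

p = File.read!("/tmp/message.avro")

# A bare {"a" => 1} record should be a single byte of plain Avro, so anything
# much larger means the file carries extra framing around the record.
byte_size(p)

# Peek at the leading bytes; an Avro Object Container File starts with "Obj" <> <<1>>.
binary_part(p, 0, min(byte_size(p), 16))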

Many thanks!

I tried using Avrora and it works just fine. I'm not sure, but I think the issue is that AvroEx expects plain Avro binary, without the container-file header or embedded schema that the Ruby side writes, so it was decoding the header bytes as data.
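
If that is the explanation, it also accounts for the -40: the container file's magic bytes are "Obj" <> <<1>>, and the leading "O" is 0x4F (79), which zig-zag-decodes to -40 when read as the int field. For reference, the Avrora call would look roughly like this (a sketch from memory, so double-check the Avrora docs; Avrora also needs to be started first):

# Start Avrora manually (it can also be put under the app's supervision tree).
{:ok, _pid} = Avrora.start_link()

payload = File.read!("/tmp/message.avro")

# Avrora reads the schema embedded in the container file, so no .avsc lookup is needed here.
{:ok, decoded} = Avrora.decode(payload)
IO.inspect(decoded)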