How to encode JSON with non-string numeric precision?

The international standard for medical messages is currently moving to FHIR.

They’ve made certain… poor choices (like not including the version id in any message documents).

One of these is that if a message is delivered in JSON format, floats must be encoded as numbers in the JSON with precision preserved. So, for encoding, I can’t emit the string “5.00” with something like Jason. And I can’t use 5.00 because Elixir (and even JavaScript) would immediately throw out the trailing zeros.
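For example, in iex (Elixir just normalizes the literal away, and Jason faithfully encodes the normalized float):

```elixir
iex(1)> 5.00
5.0
iex(2)> Jason.encode!(%{"value" => 5.00})
"{\"value\":5.0}"
```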

It would be nice if Elixir had a ‘decimal’ data type for storing a number together with its precision.

Currently, I’m thinking of forking the Jason library to treat two-tuples as {float_num, precision} (maybe I could sneak a protocol in…) when encoding.

But I’d rather not fork it :frowning:

Any other thoughts or ideas to deal with this?

Scott S.


And woah - maybe Jason already covered those bases!!!

It looks like the code has a dependency on the Decimal library! (hope springs…)

And alas, no :frowning:
It looks like Jason encodes Decimals as strings – which kind of defeats the purpose of including the library, almost? Hrm…
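That is, as far as I can tell it does this:

```elixir
iex(1)> Jason.encode!(Decimal.new("5.00"))
"\"5.00\""
```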

Oh, he has an issue for it:


And solved with a protocol that uses a FHIR.Decimal struct wrapping Decimal :slight_smile:
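A sketch of the shape of it (my real module has more to it, but roughly):

```elixir
defmodule FHIR.Decimal do
  @moduledoc "Wraps a Decimal so we can give it our own JSON encoding."
  defstruct [:decimal]

  def new(string) when is_binary(string) do
    %__MODULE__{decimal: Decimal.new(string)}
  end
end

defimpl Jason.Encoder, for: FHIR.Decimal do
  # Emit the decimal's exact string form as a bare (unquoted) JSON number.
  # :normal avoids scientific notation, so "5.00" stays "5.00".
  def encode(%FHIR.Decimal{decimal: d}, _opts) do
    Decimal.to_string(d, :normal)
  end
end
```

With that, `Jason.encode!(%{"value" => FHIR.Decimal.new("5.00")})` produces `{"value":5.00}` with the zeros intact.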


I’m curious but confused. How could:

floats must be encoded as numbers in the JSON with precision preserved.

even make sense since floats don’t have explicit precision? For example:

iex(5)> << x :: bitstring >> = << 5.0 :: float >> 
<<64, 20, 0, 0, 0, 0, 0, 0>>
iex(6)> << x :: bitstring >> = << 5.00000000000000 :: float >>
<<64, 20, 0, 0, 0, 0, 0, 0>>

Different requested precision, exactly the same IEEE encoding. Even @michalmuskala in the cited issue remarks:

This encoding is intentional. The Decimal data type is about exact representation of a decimal number - something that cannot be done by a JSON number type, because they are not exact. Encoding decimals as floats would be incorrect and prone to extremely subtle errors - for example, 9007199254740993.0, if encoded as a straight-up JSON number, would be decoded in JavaScript (and most languages) incorrectly as 9007199254740992.0, since 9007199254740993 is not representable by a float.
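That last example is easy to reproduce, since 2^53 + 1 is the first integer a 64-bit float cannot represent:

```elixir
iex(1)> 9007199254740993.0 == 9007199254740992.0
true
iex(2)> trunc(9007199254740993.0)
9007199254740992
```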

Just curious how this standard requirement (floats preserving precision) could ever actually work!


Wrap a decimal struct in a custom struct, then implement the Jason encoding protocol for your custom struct.


@benwilson512 I get that, all good. It’s the standard saying “floats preserving precision”. Perhaps I am interpreting the intent of “float” too literally, in which case the standard should probably have said something more like “arbitrary-precision number”.

Hehe, it’s crazy! :slight_smile:

Basically, since the JSON document is encoded as a string, I can get something like
{"value": 5.000} as a string blob.

Encoding it, I’m using the protocol trick above to use a string as my source and not a real number in the language (Elixir in this case).
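(If I understand it right, Jason.Fragment would be another way to do the raw-output half; it splices pre-encoded iodata into the output verbatim:

```elixir
iex(1)> Jason.encode!(%{"value" => Jason.Fragment.new("5.000")})
"{\"value\":5.000}"
```

so anything holding the original text of the number could be emitted as-is.)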

When I decode it, most languages would truncate the extra zeros as you know.

I have an XSD that tells me what every field should be – so in theory I can use that to preserve floats as strings by not converting them.

I got the encode working with the protocol – now I need to figure out how to always force numbers to be treated as strings, then use my XSD to determine how they should actually be represented.
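For the decode half, newer Jason versions (1.3+, I believe) have a `:floats` option that may do exactly this when the Decimal library is available, parsing non-integer numbers into Decimal structs with the textual precision intact:

```elixir
iex(1)> Jason.decode!(~s({"value": 5.000}), floats: :decimals)
%{"value" => #Decimal<5.000>}
iex(2)> Decimal.to_string(Decimal.new("5.000"))
"5.000"
```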

It sucks because it is against the standards – but it’s a “standard” :frowning:


And yeah – by float I just mean a Decimal type, basically, in which precision (as in significant digits) matters (and not in the internal computer sense of representing numbers). Sorry for the confusion!


This really doesn’t make sense: a float doesn’t have a precision in the JSON spec like that. JSON can also be encoded in a few other forms, not just a string (including binary forms, as some intermediary systems might do), and this supposed ‘precision’ will vanish at that point even though the values are identical in terms of JSON. If precision is important, it must be encoded as a string or some more detailed structure, or it is categorically not JSON. In that case traditional JSON libraries not only should not but must not be used, nor can there be any intermediary middleware, proxies, or anything else that might compress it into a tighter form than textual JSON; it would need a new mimetype and likely a new file extension as well.

It makes no sense. But since the precision can exist in the textual form of a JSON message, the people who made the FHIR standard want it presented as a number in the JSON, decoded into a string, and encoded back out as a number in the document…


This is the issue then: if it is to be decoded as a string, then it must be a JSON string, as JSON numbers do not have stable string forms.

It’s a terrible hack idea brought to us by an international standards body that made incompatible versions of a standard without including a version number or any simple way to derive one.


This is horrifying… ^.^;

Why don’t they just use ASN.1 or similar? It’s well known, stable, efficient, easily versioned, etc., and it’s already used in the medical industry (I deal with it and use it there myself)…


I don’t know! :frowning:

They also came up with their own idea of ‘polymorphic properties’ (I know, the term’s not used correctly), in which the name of a field changes with its data type (birthDate, birthBoolean, birthInteger, etc.).


This is already possible and also well typed in ASN.1!

It’s starting to feel like they are trying to convert an ASN.1 schema into JSON, very poorly, because the two don’t match… >.>

Well, if you have to deal with it, then forking Jason into a library for it (or maybe PR’ing into Jason in a way that does not change performance for the usual path) would be welcome for others? A dedicated FHIR library, perhaps?


I think I will have to fork it to read things as strings. There were several issues/PRs from people seeking similar Decimal support, and @michalmuskala seemed keen on preserving the actual JSON spec, standards, sanity, etc. :slight_smile:

Yeah that one is really the most important reason. ^.^;

But still, following a (broken) standardized medical spec might be a good enough reason to get the feature in now?


Maybe :slight_smile: I got it working locally with a very small hack… and opened an issue offering to do a PR to add an option for evil parsing… so we’ll see!! :slight_smile: Thanks for the suggestions and the affirmation about how crazy this is…!
