Phoenix JSON Encoding

Hi,

I have an Ecto schema like this:

schema "items" do
  field :name, :string
  field :json_data, :map
end

The structure of the :json_data field is very hard to normalize.

My problem is that the :json_data field can be very large, and when serving it through an API the decoding/encoding done by Jason noticeably slows down my response time.

I’m wondering if there’s a way for me to skip the encoding/decoding entirely: store it as JSON in my database, have it come into my app as a JSON string, and then serve it through an API without it having to be re-encoded.
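
To make it concrete, the serving side I have in mind looks roughly like this (module names are hypothetical, and it assumes :json_data already reaches the struct as a ready-encoded JSON binary):

defmodule MyAppWeb.ItemController do
  use MyAppWeb, :controller

  def show(conn, %{"id" => id}) do
    item = MyApp.Repo.get!(MyApp.Item, id)

    # Send the stored JSON verbatim: no decode/encode round trip.
    conn
    |> put_resp_content_type("application/json")
    |> send_resp(200, item.json_data)
  end
end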

You may write a custom Postgrex extension to pass the original iodata down to PostgreSQL with no processing.
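
A minimal sketch of such an extension, assuming a jsonb column and the binary protocol (the module name is illustrative; the length/version framing mirrors what Postgrex’s built-in JSONB extension does):

defmodule MyApp.Postgrex.RawJSON do
  @behaviour Postgrex.Extension

  def init(_opts), do: nil

  # Take over jsonb columns from the default JSON extension.
  def matching(_state), do: [type: "jsonb"]

  def format(_state), do: :binary

  # jsonb binary format: a 4-byte length header, then a version
  # byte (1), then the JSON text, passed through untouched.
  def encode(_state) do
    quote location: :keep do
      json when is_binary(json) ->
        [<<byte_size(json) + 1::32-signed, 1>> | json]
    end
  end

  def decode(_state) do
    quote location: :keep do
      <<len::32-signed, data::binary-size(len)>> ->
        <<1, json::binary>> = data
        json
    end
  end
end

You would then compile it into a types module with Postgrex.Types.define(MyApp.PostgrexTypes, [MyApp.Postgrex.RawJSON], []) and set types: MyApp.PostgrexTypes in the Repo config; the schema field should then be declarable as a plain :string.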

The downside is that for a bad JSON string you may get an exception from PostgreSQL instead of from the JSON library.
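
If that matters, one way to keep the error in the app (a sketch, assuming writes are much rarer than reads) is to decode once in the changeset, so bad input becomes a changeset error rather than a PostgreSQL exception:

defp validate_json(changeset, field) do
  Ecto.Changeset.validate_change(changeset, field, fn ^field, value ->
    # Decode only to validate; the decoded result is thrown away.
    case Jason.decode(value) do
      {:ok, _} -> []
      {:error, _} -> [{field, "is not valid JSON"}]
    end
  end)
end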


Or, if you don’t need to query the data as JSON, you may store it in a plain string column.
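
For example (a sketch with hypothetical names; the column is text, and the field is declared :string so the JSON binary passes through untouched):

# in a migration's change/0
create table(:items) do
  add :name, :string
  add :json_data, :text
end

# in the schema
schema "items" do
  field :name, :string
  field :json_data, :string
end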


Or you may write the data into a temporary table as a string column and use a PostgreSQL function to convert it, with a trigger or a separate job.
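
The trigger variant could look like this (a sketch with hypothetical names, assuming a raw_json text column next to the jsonb one, and PostgreSQL 11+ for EXECUTE FUNCTION):

defmodule MyApp.Repo.Migrations.ConvertRawJson do
  use Ecto.Migration

  def up do
    # The cast from text to jsonb happens inside the database.
    execute """
    CREATE FUNCTION items_convert_json() RETURNS trigger AS $$
    BEGIN
      NEW.json_data := NEW.raw_json::jsonb;
      RETURN NEW;
    END;
    $$ LANGUAGE plpgsql
    """

    execute """
    CREATE TRIGGER items_convert_json
    BEFORE INSERT OR UPDATE ON items
    FOR EACH ROW EXECUTE FUNCTION items_convert_json()
    """
  end

  def down do
    execute "DROP TRIGGER items_convert_json ON items"
    execute "DROP FUNCTION items_convert_json()"
  end
end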


Or you may try JSON processing with a NIF, such as https://github.com/rusterlium/juicy
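
Whichever storage route you take, on the serving side you can also avoid re-encoding when the raw string has to be embedded in a larger payload: Jason.Fragment injects already-encoded iodata verbatim (a sketch, assuming json_data is already a JSON binary):

body =
  Jason.encode!(%{
    id: item.id,
    # the fragment is spliced in as-is, not re-encoded
    data: Jason.Fragment.new(item.json_data)
  })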
