Serializing Ash Structs with Jason

So I am using live_svelte, which lets you use Svelte in your LiveViews. You pass the props to the Svelte component in an `~H` sigil, and live_svelte serializes them using Jason.

I am currently implementing Jason.Encoder manually. Is there a way to derive the encoder for Ash resource structs?

defimpl Jason.Encoder, for: Flame.App.Reactant do
  def encode(value, opts) do
    Jason.Encode.map(Map.take(value, [:id, :identity, :spec]), opts)
  end
end

I think I’d actually suggest not using a protocol-based encoder, primarily because Ash resources can have calculations/aggregates/metadata that you may want to display along with the resource. For instance, if you wanted to show related data or computed properties, you might want something like this:

def encode(resources, opts \\ []) do
  Jason.encode!(sanitize(resources, opts))
end

defp sanitize(records, opts) when is_list(resources) do
  Enum.map(records, &sanitize(&1, opts))
end

defp sanitize(%resource{} = record, opts) do
  if Ash.Resource.Info.resource?(resource) do
    fields = opts[:fields] || public_attributes(record)

    Map.new(fields, fn
      {field, further} ->
        {field, sanitize(Map.get(record, field), further)}

      field ->
        {field, sanitize(Map.get(record, field), [])}
    end)
  else
    record
  end
end

defp sanitize(value, _), do: value

defp public_attributes(%resource{}), do: resource |> Ash.Resource.Info.public_attributes() |> Enum.map(&(&1.name))

Something like the above would let you call encode(record, fields: [:field1, :field2, relationship: [fields: [:field3]]]).

This lets you load data and then serialize it, i.e.:

MyResource
|> Ash.Query.load([:field1, :field2, relationship: [:field3]])
|> MyApi.read!()
|> Encoder.encode(fields: [:field1, :field2, relationship: [fields: [:field3]]])

Wow. This is great. I’m guessing the names are a little off; they should be:

def encode(records, opts \\ []) do
  Jason.encode!(sanitize(records, opts))
end

defp sanitize(records, opts) when is_list(records) do
  Enum.map(records, &sanitize(&1, opts))
end

Cheers!


Alternative solution for those who stumble upon this thread:

Had the same use case with live_svelte and decided to go with a more automatic approach by writing an Ash extension that defines a customizable Jason implementation for a resource based on its fields: ash_jason.

To get some reasonable default behavior, just include AshJason.Extension in the extensions list of the use Ash.Resource call. To customize, use the optional jason section.

It is also possible to write your own extension using ash_jason as a reference point; the library is small, just around a hundred lines across two files.
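For illustration, a resource using the extension might look like this. This is a sketch based on the description above; the module and attribute names are borrowed from the question's earlier example, and the exact defaults are whatever ash_jason ships with:

```elixir
defmodule Flame.App.Reactant do
  # Hypothetical resource definition including ash_jason's extension,
  # as described in the post above.
  use Ash.Resource,
    extensions: [AshJason.Extension]

  attributes do
    uuid_primary_key :id
    attribute :identity, :string
    attribute :spec, :map
  end
end
```

With this in place, `Jason.encode!(reactant)` should work without hand-writing a `defimpl Jason.Encoder` block.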


Is there a newer way to convert Ash resources to and from JSON?

The answer here really is “it depends”. A record can exist in many states, and in general there isn’t always a single “this is how you turn it into JSON”. For example, you don’t necessarily want to encode all loaded relationships every time, etc. So to really answer, I’d have to know why you are encoding/decoding Ash resources to/from JSON :smiley:

It would just be to edit the resource as JSON. I also want to replace MongoDB with PostgreSQL, migrating MongoDB documents to a JSON column in a table.

In general, “editing a resource as JSON” isn’t a super flexible concept. You’re altering the underlying data structure directly. If you model your data using embedded resources or map types etc. you can accomplish pretty much whatever you want.

Why do you want to be able to arbitrarily edit your data model as JSON? In general, application design (IMO) should go through “actions” to edit data in defined and semantic ways, with validations etc. applied.

I would want the JSON to go through actions to edit the data. Some users prefer taking JSON from a textarea to edit it in their editor instead of clicking around a form.

So with an “action”, the concept is that you have a contract for its interface. For example:

update :increment_score do
  argument :amount, :integer, allow_nil?: false

  change atomic_update(:score, expr(score + ^arg(:amount)))
end
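Invoking such an action might look like the following sketch. The `record` variable and `MyApi` module are assumed from the earlier examples in this thread, and the amount is arbitrary:

```elixir
# Hypothetical invocation of the increment_score action defined above:
# the caller supplies only the declared :amount argument, and the action
# decides how that maps onto the underlying :score attribute.
record
|> Ash.Changeset.for_update(:increment_score, %{amount: 5})
|> MyApi.update!()
```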

The only way to allow editing JSON one-to-one with the underlying data model is to effectively do away with that input/data-model abstraction, which is typically quite important. What you can do relatively safely, however, is something like:

update :update do
  argument :stuff, <an embedded type, or a struct/map type w/ fields constraint>

  change fn changeset, _ -> 
    # decide what to do with the `:stuff` argument.
  end
end

Users could edit a type you specify as JSON, and you can map it to its effects in the action. If the thing they are editing is just “a blob” of sorts, then you can write it directly to the corresponding attribute etc.

And, FWIW, if there is an attribute that is just a “user editable data model”, then you can do

attributes do
  attribute :name, <an embedded type, or a struct/map type w/ fields constraint>
end

actions do
  update :update do
    accept [:name]
  end
end

If a map is provided as input to name (i.e. free-form JSON data), it should “just work”.

Could we also pass JSON to AshPhoenix.Form.submit or Ash.Changeset.for_create, like Ecto.Changeset.cast, which takes a map with string keys?

Yep :slight_smile: you can do that.
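For example, string-keyed params decoded from JSON can be passed straight through. This is a sketch; the resource, action, and `MyApi` module names are hypothetical:

```elixir
# Decoded JSON is a map with string keys, which changesets accept
# just like atom-keyed params.
params = Jason.decode!(~s({"name": "some name"}))
# => %{"name" => "some name"}

MyResource
|> Ash.Changeset.for_create(:create, params)
|> MyApi.create!()
```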