Using embedded_schemas for `form_for`

I have a form that lets users set on and off times for a set of machines. This leads to the upsert of potentially several records (a ProductionDay record is created for each machine that is toggled on).
Because I have to update several records from a single form, I decided to use an embedded_schema to manage the user input and then convert it back into ProductionDay changesets before updating. I've seen and used this pattern before, but on this particular form I'm running into lots of issues, I think because there's a mix of persisted data and data that hasn't been persisted yet.

When loading the form I get all Machines and then build a %ProductionDay{} for each of them. If a machine is already set to on, a ProductionDay will exist in the database and be loaded; if it's off, I need to create a ProductionDay struct with some default data to pass to the form.

  def create_changeset_for_location(%{id: id} = _location, day) do
    machines = Machines.all(location_id: id)
    production_days = all(location_id: id, day: day)

    # Pair each machine with its existing ProductionDay, or a default struct
    # when none has been persisted for this day yet.
    production_days =
      Enum.map(machines, fn %{id: machine_id} ->
        case Enum.find(production_days, &(&1.machine_id == machine_id)) do
          %ProductionDay{} = pd -> pd
          _ -> %ProductionDay{...defaults...}
        end
      end)

    ProductionDayUpdateInput.changeset(%ProductionDayUpdateInput{machine_days: production_days}, %{})
  end

This creates a ProductionDayUpdateInput changeset with the ProductionDay structs embedded:

defmodule App.ProductionDayUpdateInput do
  use Ecto.Schema

  import Ecto.Changeset

  alias __MODULE__
  alias App.Scheduling.ProductionDay

  embedded_schema do
    embeds_many(:machine_days, ProductionDay)
  end

  def changeset(%ProductionDayUpdateInput{} = input, params \\ %{}) do
    input
    |> cast(params, [])
    |> cast_embed(:machine_days)
  end
end

Then, whenever the LiveView receives form params, I use

%{assigns: %{production_day_changeset: prod_day}} = socket
ProductionDayUpdateInput.changeset(prod_day.data, params)

to keep the changeset up to date.
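
For context, the surrounding handler is roughly this shape (the "validate" event name, the param key, and the :validate action are illustrative, not lifted from the real code):

def handle_event("validate", %{"production_day_update_input" => params}, socket) do
  %{assigns: %{production_day_changeset: prod_day}} = socket

  changeset =
    prod_day.data
    |> ProductionDayUpdateInput.changeset(params)
    # mark the changeset so the form renders validation feedback
    |> Map.put(:action, :validate)

  {:noreply, assign(socket, production_day_changeset: changeset)}
end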

You may have already noticed the issue: with embeds_many/cast_embed I'm not actually associating the changes with the initial ProductionDay records that were embedded; it creates brand new changesets, because on_replace: :update is not an option for *_many relations. This means that when I go to persist the data I have to cherry-pick between the data on the initial structs and the changes on the new changesets, which feels clunky and like a hacky workaround.
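
To make the cherry-picking concrete, the workaround I'm looking at is roughly this (a sketch only; it assumes a plain ProductionDay.changeset/2 exists and relies on the two lists staying in the same order):

# rough sketch of the workaround: pair the originally loaded structs with the
# freshly cast embedded changesets and re-apply the changes to the originals
embedded_changesets = Ecto.Changeset.get_change(changeset, :machine_days, [])

production_days
|> Enum.zip(embedded_changesets)
|> Enum.map(fn {original, cast_changeset} ->
  # assumes ProductionDay.changeset/2 accepts the changes as params
  ProductionDay.changeset(original, cast_changeset.changes)
end)
|> Enum.each(&Repo.insert_or_update!/1)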

I've tried converting ProductionDay into a separate embedded_schema first (a ProductionDayPlaceholder of sorts), but I always run into the same wall when trying to cast the changes with cast_embed.

This question may seem similar to this one, but I'm not necessarily looking for a way to force on_replace: :update; I'm just trying to figure out the best way to update/insert these ProductionDays, whatever that may be.

Hmm, do you have associations set up between Location and ProductionDay, e.g. Location has many ProductionDays and ProductionDay belongs to Location, perhaps through Machine?

If so, you could use cast_assoc/3's built-in support for partial changes for many-style associations. From the docs:

By preloading an association using a custom query you can confine the behavior of cast_assoc/3. This opens up the possibility to work on a subset of the data, instead of all associations in the database.

Taking the initial example of users having addresses, imagine those addresses are set up to belong to a country. If you want to allow users to bulk edit all addresses that belong to a single country, you can do so by changing the preload query:

query = from MyApp.Address, where: [country: ^edit_country]

User
|> Repo.get!(id)
|> Repo.preload(addresses: query)
|> Ecto.Changeset.cast(params, [])
|> Ecto.Changeset.cast_assoc(:addresses)

This will allow you to cast and update only the association for the given country. The important point for partial changes is that any addresses which were not preloaded won't be changed.

In your case you'd be preloading ProductionDays rather than Addresses, with Location as the parent instead of User, and restricting by :day instead of :country. If I recall correctly, this approach may require leaving the :on_replace option at its default value of :raise.

  # pseudocode
  def preloaded_location_changeset_for_day(id, day) do
    query = from(ProductionDay, where: [day: ^day])

    location =
      Location
      |> Repo.get!(id)
      |> Repo.preload([:machines, production_days: query])

    # Pair each machine with its preloaded ProductionDay, falling back to a
    # default struct when none exists for the given day.
    production_days =
      for %{id: machine_id} <- location.machines do
        Enum.find(location.production_days, %ProductionDay{...defaults...}, &(&1.machine_id == machine_id))
      end

    Location.changeset(%{location | production_days: production_days}, %{})
  end
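
The save side would then be the mirror image: cast the submitted params onto that same preloaded location and let cast_assoc/3 handle the children. Again pseudocode, assuming a Location.changeset/2 that calls cast_assoc(:production_days):

  # pseudocode
  def save_production_days_for_day(%Location{} = preloaded_location, params) do
    preloaded_location
    |> Location.changeset(params)
    # cast_assoc/3 produces update changesets for the preloaded records and
    # insert changesets for the ones without an id
    |> Repo.update()
  end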

This is essentially the approach I ended up taking (100% inspired by your response).
There are a few other places in the codebase where I've noticed the pattern of using the same data for different schemas, effectively only loading/accessing what you need from the database. So I took your approach of querying the Location and preloading :production_days, but I did it with a new module to keep the context separate (since this doesn't actually have much to do with location management):

defmodule App.Scheduling.ProductionDayUpdateInput do
  use App.Schema

  import Ecto.Changeset

  alias __MODULE__
  alias App.Scheduling.ProductionDay

  schema "locations" do
    has_many(:production_days, ProductionDay, foreign_key: :location_id)
  end

  def changeset(%ProductionDayUpdateInput{} = input, params \\ %{}) do
    input
    |> cast(params, [])
    |> cast_assoc(:production_days)
  end
end

Then it’s loaded as a changeset with:

ProductionDayUpdateInput
|> from(as: :pd)
|> where([pd: pd], pd.id == ^id)
|> preload(production_days: ^from(ProductionDay, where: [date: ^day], preload: :machine))
|> Repo.one()
|> ProductionDayUpdateInput.changeset(%{})

Perhaps not the most idiomatic solution, but like I said, I've seen this pattern around our app and wanted to stick with it.
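
For completeness, the submit side ends up as a thin wrapper around that same changeset. Something along these lines (the event name, param key, and assign names are illustrative):

def handle_event("save", %{"production_day_update_input" => params}, socket) do
  %{assigns: %{production_day_changeset: changeset}} = socket

  case changeset.data |> ProductionDayUpdateInput.changeset(params) |> Repo.update() do
    {:ok, _location} ->
      {:noreply, put_flash(socket, :info, "Production days updated")}

    {:error, changeset} ->
      # re-render the form with the errors from the failed update
      {:noreply, assign(socket, production_day_changeset: changeset)}
  end
end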


Yup, that’s a pattern very much encouraged by Ecto. Its approach to schemas makes it easy to tailor the mapped structs for a particular context/use case by decoupling how the data is represented in business logic from the database. While it’s a bit more verbose upfront compared to a traditional ORM, I’ve found the flexibility is worth the tradeoff.

Makes sense! And it’s always nice to see the solution people end up going with so thanks for the update.
