I have a form based on a “Recipe” schema. A recipe has many “Groups”, which are added/removed dynamically, and each group has many “Ingredients”.
<%!-- There can be any number of groups wrapping this ingredient logic (sortable, dynamic) --%>
<.inputs_for :let={ingredient} field={group[:ingredients]}>
  <.input type="select" options={@ingredients} field={ingredient[:ingredient_id]} />
  <.input field={ingredient[:amount]} label="Amount" />
  <%!-- There can be any number of ingredients (sortable, dynamic) --%>
</.inputs_for>
Any change to the group, or to the ingredients in that group, sends the entire list of ingredients (the select options) down the wire. It’s currently close to 250 KiB, which gets a bit much for every validation step on keypress (even when throttled).
Worth mentioning is that both groups and ingredients use the Ecto sort/delete feature (which I love btw).
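For context, this is roughly what the Ecto sort/delete setup being described looks like; a minimal sketch assuming a `Group` schema with an `ingredients` assoc (all names here are illustrative, not from the original post):

```elixir
# Inside a hypothetical Group schema module, with `import Ecto.Changeset`.
def changeset(group, attrs) do
  group
  |> cast(attrs, [:name])
  |> cast_assoc(:ingredients,
    # sort_param/drop_param (Ecto 3.10+) let hidden inputs drive
    # reordering and removal, which is what the LiveView
    # <.inputs_for> sort/drop inputs emit.
    sort_param: :ingredients_sort,
    drop_param: :ingredients_drop
  )
end
```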
Once the diff is received, morphdom does very little, since nothing has really changed. Out of desperation I tried adding a wrapper around the select with phx-update="ignore", but that didn’t work. Is there a way to avoid sending repeated data on every message? Streams wouldn’t work, since each ingredient can be added/removed/sorted; at least I don’t think they would, considering the state is gone once rendered.
I’m not sure it will work in this instance, but you could try extracting everything inside inputs_for into a LiveComponent. LiveComponents do their own tracking of assigns, which is one trick to reduce data over the wire. I believe it is covered in this talk.
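A rough sketch of what that extraction could look like; the module, assign names and the surrounding `MyAppWeb` setup are assumptions, not from this thread:

```elixir
# Hypothetical LiveComponent wrapping one ingredient row.
defmodule MyAppWeb.IngredientRowComponent do
  use MyAppWeb, :live_component

  # Because LiveComponents change-track their own assigns, the large
  # @ingredients options list should only be re-sent when it changes,
  # not on every validation of a sibling row.
  def render(assigns) do
    ~H"""
    <div>
      <.input type="select" options={@ingredients} field={@form[:ingredient_id]} />
      <.input field={@form[:amount]} label="Amount" />
    </div>
    """
  end
end
```

In the parent template it would be rendered per nested form, e.g. `<.live_component module={MyAppWeb.IngredientRowComponent} id={ingredient.id} form={ingredient} ingredients={@ingredients} />` inside the inputs_for.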
You might consider changing the select to a custom component that allows searching and partial loading. I don’t imagine a select with 250 KiB of options is fun to use.
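A hedged sketch of the partial-loading idea: only options matching a query are rendered, so the full list never hits the wire. The event name and the `Ingredients.search/1` function are assumptions for illustration:

```elixir
# Triggered by a text input with phx-keyup="search_ingredient"
# (debounced) next to, or replacing, the select.
def handle_event("search_ingredient", %{"q" => query}, socket) do
  # Ingredients.search/1 is hypothetical; it would return a small
  # page of {label, id} tuples matching the query.
  {:noreply, assign(socket, :ingredient_options, Ingredients.search(query))}
end
```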
I had the same problem with a “block/content editor” for our CMS. It had a potentially HUGE assoc of blocks with child blocks, variables, etc. After tons of tweaking and testing, the best approach by far was to split each block out into its own form with its own validate function. The main entry’s form then has a save event that assembles all the forms into one giant changeset and saves. The difference was huge!
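As I understand that approach, it could look roughly like this; every name (`Block`, `Entries`, the hypothetical `collect_block_params/1` helper) is illustrative:

```elixir
# Each block renders its own small <.form> with its own validate
# event, so a keypress only diffs that one block's form.
def handle_event("validate_block", %{"id" => id, "block" => params}, socket) do
  {:noreply,
   update(socket, :block_forms, fn forms ->
     Map.put(forms, id, to_form(Block.changeset(%Block{}, params), action: :validate))
   end)}
end

# The main entry's save assembles every block's params into one changeset.
def handle_event("save", _params, socket) do
  # collect_block_params/1 is a hypothetical helper that turns the
  # per-block forms back into one "blocks" params list.
  entry_params = %{"blocks" => collect_block_params(socket.assigns.block_forms)}

  case Entries.update_entry(socket.assigns.entry, entry_params) do
    {:ok, entry} -> {:noreply, assign(socket, :entry, entry)}
    {:error, changeset} -> {:noreply, assign(socket, :form, to_form(changeset))}
  end
end
```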
It’s not as terrible as it sounds. It’s big, but every option has a UUID, which really eats up space. It’s meant to be used with a keyboard, and typing to jump to an option is instant.
It might be refactored in the future, but for now I’m more worried about the large messages. I’ll have a look at the video, thanks!
Interesting, but wouldn’t that hit the same issue? I can update the main form just fine, and then the large data is not transferred; it only happens on changes within the groups. Maybe that has to do with nesting or something. I’d have to play around with it a bit.
I’ve been testing some more and found some issues overall. In particular, some of the core components seem to be updated excessively, but the big culprit is inputs_for: it seems to re-render everything inside it on any change within its context.
I mean, I get how that happens, since the data has changed somewhere, but it’s not very optimal if you have inputs_for nested inside inputs_for in large forms. Guess I have to rethink things, a lot.
> Any change to the group or the ingredients in that group will send the entire list of ingredients (the select) down the wire, it’s currently close to 250KiB, which gets a bit much for every validation step on keypress
Question: can you make changes affect only the changeset value held in the socket, and send it to the DB on form submit?
My idea would be that the dynamic changes to groups/ingredients that go through validation are kept as temporary data assigned in the socket, and only stored in the DB on submit. The idea is to move the concern from the DB layer to the Elixir layer and, using changesets, maps and lists, separate and optimize the changes. Not sure if my idea is good; take it with a grain of salt.
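A minimal sketch of that idea, with all module and function names assumed: validate only touches in-memory state, and only the final submit writes to the database.

```elixir
# Validate: build the changeset and stash the raw params in the
# socket; nothing is persisted here.
def handle_event("validate", %{"recipe" => params}, socket) do
  changeset = Recipe.changeset(socket.assigns.recipe, params)

  {:noreply,
   assign(socket,
     form: to_form(changeset, action: :validate),
     pending_params: params
   )}
end

# Save: only now do the accumulated params hit the DB.
def handle_event("save", _params, socket) do
  case Recipes.update_recipe(socket.assigns.recipe, socket.assigns.pending_params) do
    {:ok, recipe} -> {:noreply, assign(socket, :recipe, recipe)}
    {:error, changeset} -> {:noreply, assign(socket, :form, to_form(changeset))}
  end
end
```

Note that this doesn’t by itself shrink the diff: the re-render on each validate is what sends the data, regardless of when the DB write happens.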
Maybe this should be called out in the docs, but this is a known reality with inputs_for. It iterates a list, and lists are only optimized when using streams (which the form abstractions don’t support). This was considered a reasonable tradeoff: before <.form /> and <.inputs_for /> were added, any change anywhere in a form would re-render the whole form; nowadays at least a form’s top-level fields are change-tracked individually and granularly.
Generally, the way to optimize change tracking with large forms is to avoid large forms. I’d argue that at a certain point people won’t be interacting with the whole form anyway, as in changing things at all levels before expecting a save to happen. Make the actual forms smaller and save intermediate state more aggressively. If you need to retain the “commit to a set of changes after a while” behaviour, consider storing e.g. drafts, which can then be promoted to published records.
I think a note in the docs would be nice, but now that I’ve looked at the implementation and read all the great comments in this thread, I understand the situation better.
In this case it’s not the form itself that is large, just the content of the options, and since you can have many groups with many ingredients, I get a larger and larger diff for every added ingredient.
The reason I went for inputs_for is that the ability to dynamically add/delete and get sorting for free (with a simple sortablejs hook) is just really, really good. I get that it’s “just” form inputs, but it’s still a lot to keep track of if you implement it manually. In an earlier version I did part of the adding/deleting/slicing manually in event handlers (the form was less complex then), modifying the changeset for each action.
Another downside of form splitting is that the UI might not make sense without adding extra forms. Say you have:
Form contents
inputs_for
Form contents
inputs_for
To keep that UI you’d have to split it into four separate forms (or drop form elements altogether), since HTML doesn’t allow nested forms.
With LiveComponents I can make the diff affect only the single entity being validated (as in, only the ingredient currently being changed gets all the ingredient options sent as a diff again). That could be good enough for this use case, compared to doing all the previously mentioned things manually.