How to handle nested changesets where duplicated data leads to unique key conflicts

I have a huge nested array of JSON data, and the data already has IDs associated with it.
The problem is that some of the data is duplicated, so when an entity with the same ID is encountered again later, the resulting changeset contains two identical entities with the same ID. That raises a unique key constraint error on insert and the entire operation halts.

Is there a way to just force a huge nested changeset to ignore conflicts?
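Roughly the shape of what I have (a simplified sketch, not my actual code; the Survey schema, the field names, and the MyApp prefix are made up for illustration, only answer_sets comes from the real error):

    defmodule MyApp.AnswerSet do
      use Ecto.Schema
      import Ecto.Changeset

      schema "answer_sets" do
        field :label, :string
        field :survey_id, :id
      end

      def changeset(answer_set, attrs) do
        # The incoming data already carries ids, so the primary key is cast too.
        cast(answer_set, attrs, [:id, :label])
      end
    end

    defmodule MyApp.Survey do
      use Ecto.Schema
      import Ecto.Changeset

      schema "surveys" do
        field :title, :string
        has_many :answer_sets, MyApp.AnswerSet
      end

      def changeset(survey, attrs) do
        survey
        |> cast(attrs, [:title])
        |> cast_assoc(:answer_sets)
      end
    end

    # The same answer set shows up more than once in the payload, so the
    # nested changeset tries to insert id 10 twice and the insert raises.
    params = %{
      "title" => "Example",
      "answer_sets" => [
        %{"id" => 10, "label" => "A"},
        %{"id" => 10, "label" => "A"}
      ]
    }

    %MyApp.Survey{}
    |> MyApp.Survey.changeset(params)
    |> MyApp.Repo.insert()
    #=> ** (Ecto.ConstraintError) ... answer_sets_pkey (unique_constraint)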

If you are using insert_all, then you can provide the on_conflict: :nothing option together with an appropriate :conflict_target.
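For example, sticking with the sketched AnswerSet schema from above (a minimal illustration, not drop-in code):

    rows = [
      %{id: 10, label: "A"},
      %{id: 10, label: "A"}
    ]

    # With on_conflict: :nothing the database skips conflicting rows instead
    # of raising, including duplicates within the same statement. Note that
    # insert_all does not fill in timestamps automatically.
    MyApp.Repo.insert_all(MyApp.AnswerSet, rows,
      on_conflict: :nothing,
      conflict_target: [:id]
    )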

I did try this, but it still raises a constraint error on the nested changesets. I put [:id] as the conflict_target, but it still raises an Ecto.ConstraintError:

    ** (Ecto.ConstraintError) constraint error when attempting to insert struct:

        * answer_sets_pkey (unique_constraint)

In such cases I would consider avoiding nested changesets and managing the bulk inserts manually with insert_all, along the lines of the sketch below.
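A rough sketch of that approach, reusing the assumed names from earlier (surveys_params here stands for the decoded nested payload, so its shape is an assumption): flatten the nested data, drop duplicates by ID, and bulk insert each table separately.

    # Pull every answer set out of the nested payload, dedupe by id,
    # and insert them in one statement, ignoring rows that already exist.
    answer_set_rows =
      surveys_params
      |> Enum.flat_map(&(&1["answer_sets"] || []))
      |> Enum.uniq_by(& &1["id"])
      |> Enum.map(fn attrs -> %{id: attrs["id"], label: attrs["label"]} end)

    MyApp.Repo.insert_all(MyApp.AnswerSet, answer_set_rows,
      on_conflict: :nothing,
      conflict_target: [:id]
    )

If several tables are involved, each insert_all can be wrapped in a Repo.transaction (or composed with Ecto.Multi) so the whole import still succeeds or fails as one unit.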

I suspected this would be what I’d have to do.

Thanks for the confirmation.