I am trying to generate an ID for a set of database inserts, and I need to make sure it is unique.
Is this a good approach?
defp generate_random_unique_id() do
  repeat_id = UUID.uuid4()

  events =
    Event
    |> Event.by_repeat_id(repeat_id)
    |> Repo.all()

  case events do
    [] -> repeat_id
    _ -> generate_random_unique_id()
  end
end
Is there ever a possibility that the process could get stuck here?
Repo.exists? (or Repo.aggregate with :count) would probably be better, since it's faster to ask the database whether a match exists than to fetch all matching records and deserialise them. Or just insert and handle the conflict?
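A minimal sketch of that variant using Repo.exists?, assuming the same Event schema and Event.by_repeat_id query helper from the question:

    defp generate_random_unique_id() do
      repeat_id = UUID.uuid4()

      # Repo.exists?/1 issues a SELECT ... LIMIT 1, so no rows are
      # loaded or deserialised — only a boolean comes back.
      exists? =
        Event
        |> Event.by_repeat_id(repeat_id)
        |> Repo.exists?()

      if exists?, do: generate_random_unique_id(), else: repeat_id
    end

Note this still has the same check-then-insert race as the original; it only makes the check cheaper.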
In your approach there is a potential race condition: two processes could generate the same ID, each see no existing rows, and then both attempt to insert it. It may be better to delegate ID creation to the database, e.g. via a column default or a trigger.
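For example, a migration could let Postgres generate the UUID itself at insert time (a sketch assuming Postgres 13+, where gen_random_uuid() is built in; the events table and repeat_id column names are taken from the question, the migration name is hypothetical):

    defmodule MyApp.Repo.Migrations.DefaultRepeatIdToUuid do
      use Ecto.Migration

      def change do
        alter table(:events) do
          # Postgres generates the UUID atomically as part of the INSERT,
          # so concurrent inserts cannot race on ID generation.
          modify :repeat_id, :uuid, default: fragment("gen_random_uuid()")
        end
      end
    end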
Why not use the database? Create a table which holds the unique value, enforced with a unique constraint, and then reference this table from the multiple rows you are inserting?
Use an Ecto transaction (or Ecto.Multi) to make sure the inserts are atomic.
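A sketch of that idea with Ecto.Multi, assuming a hypothetical RepeatGroup schema that holds the unique value (with a unique/primary-key constraint on id) and Event rows whose repeat_id references it:

    alias Ecto.Multi

    def insert_events(event_attrs_list) do
      # The unique_constraint turns a database-level violation into an
      # {:error, :repeat_group, changeset, _} result instead of a raise.
      group_changeset =
        %RepeatGroup{}
        |> Ecto.Changeset.change(id: UUID.uuid4())
        |> Ecto.Changeset.unique_constraint(:id, name: :repeat_groups_pkey)

      Multi.new()
      |> Multi.insert(:repeat_group, group_changeset)
      |> Multi.insert_all(:events, Event, fn %{repeat_group: group} ->
        Enum.map(event_attrs_list, &Map.put(&1, :repeat_id, group.id))
      end)
      |> Repo.transaction()
    end

If any step fails, the whole transaction rolls back, so you never end up with events pointing at a repeat group that was not committed.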