Hi,
What’s the simplest way of going over an extremely large set of records in a database? Is there something like Rails’ find_in_batches in Ecto?
Ecto.Repo.stream/2. As the docs note, it accepts a :max_rows option that defaults to 500, which plays the same role as the batch size in find_in_batches. Note that Repo.stream/2 must be run inside a transaction:
result =
  Repo.transaction(fn ->
    your_ecto_query_here
    |> Repo.stream()
    |> Enum.each(fn one_record ->
      # do something with one record here
    end)
  end)

case result do
  {:ok, success} -> # ...
  {:error, reason} -> # ...
end
For those of you who want the “batch” part of find_in_batches, pipe through Stream.chunk_every/2 to group the streamed rows into lists. Setting its chunk size equal to :max_rows means each list corresponds to one database fetch:
Repo.transaction(fn ->
  your_ecto_query_here
  |> Repo.stream(max_rows: 100)
  |> Stream.chunk_every(100)
  |> Enum.each(fn batch ->
    # do something with one *batch* here
  end)
end)
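If it helps to see the chunking in isolation: Stream.chunk_every/2 is plain Elixir, so its behavior can be checked without a database. The last chunk may be shorter than the chunk size:

```elixir
# Lazily group a stream into lists of up to 4 elements each.
1..10
|> Stream.chunk_every(4)
|> Enum.to_list()
# => [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10]]
```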