When you build a stream and then call Enum.to_list() on it, everything gets materialized into memory anyway, which throws away the benefit of the stream. You should do your work inside the stream, chunk by chunk, so the whole result is never loaded at once (see the sketch below). Could that disk usage be because the OS ran out of memory and had to kill some processes to stay alive?
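Roughly the difference, just as a sketch (`build_stream/0` and `process_chunk/1` are placeholders, not your actual code):

```elixir
# This materializes every row in memory at once – the stream buys you nothing:
rows = build_stream() |> Enum.to_list()

# This consumes the stream lazily; only one chunk of 1_000 items lives in
# memory at a time, and nothing is accumulated:
build_stream()
|> Stream.chunk_every(1_000)
|> Stream.each(&process_chunk/1)
|> Stream.run()
```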
Have you tried calling :erlang.garbage_collect() between calls? I don’t know the details of how the BEAM GC triggers, but maybe it isn’t running often enough. You can also run each query inside a Task (a separate process) and process it there. When the task exits, all the memory it used is released because the process dies, except for reference-counted binaries it shared with other processes.
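Something like this, again only a sketch (`run_query/1` and `params` stand in for whatever does the heavy lifting):

```elixir
# Do the heavy work in a short-lived process; its entire heap is reclaimed
# the moment the task exits, without waiting for GC in the caller.
result =
  Task.async(fn -> run_query(params) end)
  |> Task.await(:infinity)

# Optionally force a collection in the calling process between iterations.
:erlang.garbage_collect()
```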
Strings are also binaries in Elixir, and any string larger than 64 bytes is reference counted, if I have understood how the VM works correctly.
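If I have that right, the cutoff would look roughly like this:

```elixir
small = String.duplicate("a", 64)  # heap binary: copied into each process that receives it
large = String.duplicate("a", 65)  # refc binary: stored off-heap and reference counted
```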
Maybe the problem is similar to one .NET had: its automatic detection of memory limits, at least under Docker, let it allocate too much because the GC wasn’t aggressive enough, which caused OOM kills. So setting a memory limit when running Elixir could help. A quick search suggests this is the way to do it: Running Elixir with limited heap size - #3 by alvises
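I haven’t checked exactly what that thread recommends, but one knob the VM does expose is a per-process maximum heap size, which you could set on the process doing the queries (the 100 MB figure is just an example):

```elixir
# Limit this process's heap; the VM kills the process (and logs it) if the
# heap would grow past the limit. The size is given in machine words, so
# divide bytes by 8 on a 64-bit VM.
Process.flag(:max_heap_size, %{
  size: div(100 * 1024 * 1024, 8),
  kill: true,
  error_logger: true
})
```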