I am working on a project where users can export data from different PostgreSQL tables as CSV.
Inspired by a great article (https://joaquimadraz.com/csv-export-with-postgres-ecto-and-phoenix), I managed to set up a Stream that queries a table and sends the CSV data to the user in chunks, where it is made available as a download.
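For reference, this is roughly what my current single-table export looks like, following the article's COPY-based approach (simplified; the module name, table argument and filenames are just placeholders for this question):

```elixir
defmodule MyApp.CsvExport do
  import Plug.Conn

  alias MyApp.Repo

  # Streams `COPY <table> TO STDOUT` straight into a chunked HTTP response.
  def send_csv(conn, table) do
    conn =
      conn
      |> put_resp_content_type("text/csv")
      |> put_resp_header("content-disposition", ~s(attachment; filename="#{table}.csv"))
      |> send_chunked(200)

    {:ok, conn} =
      Repo.transaction(
        fn ->
          Ecto.Adapters.SQL.stream(Repo, "COPY #{table} TO STDOUT WITH CSV HEADER", [])
          |> Enum.reduce_while(conn, fn %{rows: rows}, conn ->
            # Each element of the stream is a Postgrex result whose rows are
            # already CSV-formatted lines, so they can be chunked out as-is.
            case chunk(conn, rows) do
              {:ok, conn} -> {:cont, conn}
              {:error, :closed} -> {:halt, conn}
            end
          end)
        end,
        timeout: :infinity
      )

    conn
  end
end
```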
This works great for one table at a time, but today I attempted to expand it so that the user can request multiple tables and receive the data as CSV files zipped (compressed) into a single archive. However, I cannot get my head around how to achieve this.
Zipping data lazily seems counterintuitive to me; is this possible in Elixir/Erlang? It appears I need something that lazily reads rows from a PostgreSQL COPY command, zips them on the fly, and also keeps track of where one file stops and the next one starts.
How can I compress the data coming from an Ecto.Adapters.SQL stream into one package and send it chunked to the user?
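The only approach I have come up with so far is to materialize every CSV in memory first and only then build the archive with Erlang's :zip module, which defeats the whole point of streaming (simplified sketch; the table names are placeholders):

```elixir
# NOT lazy: every CSV is fully materialized in memory before :zip.create/3 runs.
tables = ["users", "orders"]  # placeholder table names

entries =
  for table <- tables do
    {:ok, csv} =
      MyApp.Repo.transaction(
        fn ->
          Ecto.Adapters.SQL.stream(MyApp.Repo, "COPY #{table} TO STDOUT WITH CSV HEADER", [])
          |> Enum.map(fn %{rows: rows} -> rows end)
          |> IO.iodata_to_binary()
        end,
        timeout: :infinity
      )

    # :zip expects charlist file names and binary contents
    {String.to_charlist("#{table}.csv"), csv}
  end

{:ok, {_name, zip_binary}} = :zip.create(~c"export.zip", entries, [:memory])
# zip_binary could now be sent as a single response, but the whole archive
# lives in memory, which is exactly what I want to avoid.
```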
I hope I phrased the question clearly enough; please let me know if more explanation is needed.