Storing a collection for batch insert into the database via Ecto

I receive data from a third-party HTTP API. Each datum arrives as JSON (which I store as a map) at random intervals, and the API delivers all of the data within 10 minutes (upper limit).

I do not want to insert into my Postgres database every time a datum arrives from the API.

Instead, I want to accumulate each datum in a list of maps and then insert them all at once.

What is the best practice in this case?

  1. Use ETS?
  2. Use a GenServer?
  3. Simply prepend each datum to a list of maps (where each map in the list represents a data point)?
  4. An enlightening approach that I am unaware of?
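For concreteness, options 2 and 3 can be combined: a GenServer whose state holds the list of maps. This is only a sketch with illustrative names (`DatumBuffer`, `flush_fun`); in a real app the flush function would call something like `Repo.insert_all/3`:

```elixir
defmodule DatumBuffer do
  # Sketch: a GenServer that accumulates maps and hands the whole
  # batch to a caller-supplied flush function in one go.
  use GenServer

  # `flush_fun` receives the accumulated list, oldest datum first.
  def start_link(flush_fun) do
    GenServer.start_link(__MODULE__, flush_fun, name: __MODULE__)
  end

  def add(datum), do: GenServer.cast(__MODULE__, {:add, datum})
  def flush, do: GenServer.call(__MODULE__, :flush)

  @impl true
  def init(flush_fun), do: {:ok, %{buffer: [], flush_fun: flush_fun}}

  @impl true
  def handle_cast({:add, datum}, state) do
    # Prepend is O(1); reverse once at flush time to restore order.
    {:noreply, %{state | buffer: [datum | state.buffer]}}
  end

  @impl true
  def handle_call(:flush, _from, state) do
    result = state.flush_fun.(Enum.reverse(state.buffer))
    {:reply, result, %{state | buffer: []}}
  end
end
```

Usage: `DatumBuffer.start_link(fn batch -> Repo.insert_all("data", batch) end)`, then call `DatumBuffer.add/1` as each datum arrives and `DatumBuffer.flush/0` when the batch is complete.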

Understanding the motivation for this restriction is important; the alternatives all trade complexity and/or durability for avoiding the simplest approach, which is just inserting each datum as it arrives.

Some things to think about:

  • what should happen if the data doesn’t arrive within 10 minutes? Drop it? Notify someone? Wait longer?
  • what should happen if the system goes down partway through? Will the third party API retry? Does that data matter?
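The first question above (the 10-minute upper bound) can be handled with a deadline timer inside the buffering process. A self-contained sketch, assuming a GenServer approach; the module name, `flush_fun`, and `timeout_ms` are all illustrative, and whether a partial batch is flushed, dropped, or reported is exactly the policy choice raised above:

```elixir
defmodule DeadlineBuffer do
  # Sketch: buffer with a flush deadline armed when the first datum
  # arrives. In production `timeout_ms` would be the API's stated
  # 10-minute upper bound (600_000 ms).
  use GenServer

  def start_link(flush_fun, timeout_ms) do
    GenServer.start_link(__MODULE__, {flush_fun, timeout_ms}, name: __MODULE__)
  end

  def add(datum), do: GenServer.cast(__MODULE__, {:add, datum})

  @impl true
  def init({flush_fun, timeout_ms}) do
    {:ok, %{buffer: [], flush_fun: flush_fun, timeout_ms: timeout_ms}}
  end

  @impl true
  def handle_cast({:add, datum}, %{buffer: []} = state) do
    # First datum of a batch: arm the deadline timer.
    Process.send_after(self(), :deadline, state.timeout_ms)
    {:noreply, %{state | buffer: [datum]}}
  end

  def handle_cast({:add, datum}, state) do
    {:noreply, %{state | buffer: [datum | state.buffer]}}
  end

  @impl true
  def handle_info(:deadline, state) do
    # Deadline reached: here we flush whatever arrived, but dropping
    # the partial batch or alerting instead is a policy decision.
    state.flush_fun.(Enum.reverse(state.buffer))
    {:noreply, %{state | buffer: []}}
  end
end
```

Note that neither a GenServer nor ETS survives a crash or restart, which is the durability half of the trade-off: if the data matters and the third-party API won't retry, the process state is gone, whereas rows inserted immediately are not.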