Resilient and battle-tested event store with PostgreSQL

Have you considered using a purpose-built event store, such as Greg Young’s Event Store?

“Store at around 15,000 writes per second and 50,000 reads per second!”
https://eventstore.org/

I’ve written an Elixir event store that uses PostgreSQL for persistence. On my laptop, its benchmark suite achieves 4,929 events/sec with a single writer and 8,586 events/sec with 50 concurrent writers.

To persist events without blocking the caller, you could run the append in a separate task:

Task.start(fn ->
  # Task.start/1 spawns an unlinked process, so this is fire-and-forget:
  # a failed append won't crash the caller, but it also goes unnoticed
  EventStore.append_to_stream(stream_uuid, :any_version, events, :infinity)
end)

Alternatively, the GenServer could send itself a message so the events are persisted outside of the request/reply cycle:

defmodule Work do
  use GenServer

  def start_link(stream_uuid),
    do: GenServer.start_link(__MODULE__, stream_uuid)

  def init(stream_uuid), do: {:ok, %{stream_uuid: stream_uuid}}

  # Reply immediately; the append happens afterwards, in handle_info/2
  def handle_call({:persist, events}, _from, state) do
    send(self(), {:persist_events, events})
    {:reply, :ok, state}
  end

  def handle_info({:persist_events, events}, %{stream_uuid: stream_uuid} = state) do
    :ok = EventStore.append_to_stream(stream_uuid, :any_version, events, :infinity)
    {:noreply, state}
  end
end
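
For example, assuming a stream_uuid and a list of events are in scope:

{:ok, pid} = Work.start_link(stream_uuid)
# Returns :ok as soon as the message is queued, before the append runs
:ok = GenServer.call(pid, {:persist, events})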

You could even use two GenServers: one accepts requests (e.g. “store these events”) and forwards them to a second process that actually persists them to the event store. The first GenServer can reply immediately, without any blocking, because it offloads the work to the other process.
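
A minimal sketch of that two-process approach, using hypothetical module names Acceptor and Writer and the same EventStore.append_to_stream/4 call as above:

defmodule Writer do
  use GenServer

  def start_link(_opts),
    do: GenServer.start_link(__MODULE__, :ok, name: __MODULE__)

  def init(:ok), do: {:ok, nil}

  # Runs in the writer process, so a slow append never blocks the acceptor
  def handle_cast({:persist, stream_uuid, events}, state) do
    :ok = EventStore.append_to_stream(stream_uuid, :any_version, events, :infinity)
    {:noreply, state}
  end
end

defmodule Acceptor do
  use GenServer

  def start_link(_opts),
    do: GenServer.start_link(__MODULE__, :ok, name: __MODULE__)

  def init(:ok), do: {:ok, nil}

  def store(stream_uuid, events),
    do: GenServer.call(__MODULE__, {:store, stream_uuid, events})

  # Forward the work to the writer and reply straight away
  def handle_call({:store, stream_uuid, events}, _from, state) do
    GenServer.cast(Writer, {:persist, stream_uuid, events})
    {:reply, :ok, state}
  end
end

The trade-off is durability: :ok from Acceptor.store/2 only means the writer has accepted the events, not that they have reached the database.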

How do you deal with storing events when the database is inaccessible? You could push them onto a queue and have one or more consumers take events from the queue and write them to the event store. However, the same problem applies when the queue itself is unavailable.
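
One partial mitigation is a hypothetical BufferedWriter that keeps unpersisted events in its state and retries, assuming append_to_stream returns {:error, reason} rather than raising when the database is unreachable:

defmodule BufferedWriter do
  use GenServer

  @retry_after 5_000

  def start_link(_opts),
    do: GenServer.start_link(__MODULE__, [], name: __MODULE__)

  def init(buffer), do: {:ok, buffer}

  def enqueue(stream_uuid, events),
    do: GenServer.cast(__MODULE__, {:enqueue, stream_uuid, events})

  def handle_cast({:enqueue, stream_uuid, events}, buffer) do
    send(self(), :drain)
    {:noreply, buffer ++ [{stream_uuid, events}]}
  end

  def handle_info(:drain, []), do: {:noreply, []}

  def handle_info(:drain, [{stream_uuid, events} | rest] = buffer) do
    case EventStore.append_to_stream(stream_uuid, :any_version, events, :infinity) do
      :ok ->
        send(self(), :drain)
        {:noreply, rest}

      {:error, _reason} ->
        # Database unreachable: keep everything buffered and retry later
        # (a real implementation would also de-duplicate :drain messages)
        Process.send_after(self(), :drain, @retry_after)
        {:noreply, buffer}
    end
  end
end

Since the buffer lives in process memory, events are still lost if the node dies before the database recovers, so this only narrows the window rather than closing it.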
