External DB connection (redis) and supervised processes

I need to connect to a Redis database (using Redix) that stores external state. My workload comes through Kafka, so I'm listening to the Kafka stream using Kaffe. If every process opens its own connection, that seems really inefficient. Is there a way to flow state through a supervisor, i.e. open the connection when the supervisor starts and pass it down?

The following is the Kafka message handler being called; you'll notice Redix connecting to the database, which I'd rather pass in.

    def handle_messages(messages) do
      # A new connection is opened on every batch of messages -- this is
      # the inefficiency I'd like to avoid.
      {:ok, conn} = Redix.start_link()

      for %{key: key, value: value} = message <- messages do
        IO.inspect(message)

        key
        |> process_item(conn)
        |> IO.puts()
      end

      :ok # Important!
    end
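One way to avoid opening a connection per batch is to start a single named Redix connection under the application's supervision tree and refer to it by name in the handler. A minimal sketch (the `:redix` name and the `process_item/2` change are assumptions, not from the original post):

```elixir
# In application.ex, add a named Redix connection to the supervision tree:
#
#   children = [
#     {Redix, name: :redix},
#     ...
#   ]
#
# Redix commands accept a registered name in place of a pid, so the
# handler no longer opens its own connection.
def handle_messages(messages) do
  for %{key: key, value: value} = message <- messages do
    IO.inspect(message)

    key
    |> process_item(:redix)
    |> IO.puts()
  end

  :ok # Important!
end
```

Inside `process_item/2`, calls like `Redix.command(:redix, ["GET", key])` then reuse the one supervised connection.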

I was thinking about this scenario recently, and this is the perfect place to get feedback.

I view the conn as state; the connection is either connected or not connected. You could use an Agent (or a GenServer, your call) to store the connection state and add a public API to the Agent/GenServer to access data. You will only have a single connection to Redis, though, which will eventually become a bottleneck.
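A minimal sketch of that idea as a GenServer (the module name and API are illustrative assumptions):

```elixir
defmodule RedisConn do
  use GenServer

  # Start under your supervision tree, e.g. children = [RedisConn]
  def start_link(opts \\ []) do
    GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  end

  # Public API: run a Redis command through the shared connection.
  def command(args) do
    GenServer.call(__MODULE__, {:command, args})
  end

  @impl true
  def init(opts) do
    # The connection is the state: opened once at startup, reused for
    # every caller.
    {:ok, conn} = Redix.start_link(opts)
    {:ok, conn}
  end

  @impl true
  def handle_call({:command, args}, _from, conn) do
    {:reply, Redix.command(conn, args), conn}
  end
end
```

The handler would then call `RedisConn.command(["GET", key])` instead of opening its own connection. Note that every caller serializes through this one process, which is the single-connection bottleneck mentioned above.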

Next up is a connection pool, where handle_messages would request a connection and use it for process_item. Or process_item could request the connection itself. If you don't want to write the connection pool yourself, check out Poolboy. This is the approach I was going to use.
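A rough sketch of the Poolboy approach (module names, pool name, and sizes are assumptions): each worker owns one Redix connection, and callers check a worker out for the duration of a command.

```elixir
defmodule RedisPool do
  @pool_name :redix_pool

  # Child spec to place under your application's supervisor.
  def child_spec(_opts) do
    :poolboy.child_spec(
      @pool_name,
      name: {:local, @pool_name},
      worker_module: RedisPool.Worker,
      size: 10,
      max_overflow: 5
    )
  end

  # Check a worker out, run the command, check it back in.
  def command(args) do
    :poolboy.transaction(@pool_name, fn worker ->
      RedisPool.Worker.command(worker, args)
    end)
  end
end

defmodule RedisPool.Worker do
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts)

  def command(pid, args), do: GenServer.call(pid, {:command, args})

  @impl true
  def init(opts) do
    # One Redix connection per pool worker.
    {:ok, conn} = Redix.start_link(opts)
    {:ok, conn}
  end

  @impl true
  def handle_call({:command, args}, _from, conn) do
    {:reply, Redix.command(conn, args), conn}
  end
end
```

With this, handle_messages (or process_item) just calls `RedisPool.command(["GET", key])`; concurrent batches spread across the pool instead of serializing through one connection.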

Very interested in what others suggest :grinning:

You can also use an existing solution from Hex, e.g. https://hex.pm/packages/ex_redis_pool


Thank you both, the Redis pool is exactly what I needed. ex_redis_pool implements a supervisor as well; I'm still getting used to the paradigm shift.