Is creating a PostgreSQL notifications process per user per Phoenix channel a good approach?

Whenever a user connects to our Phoenix channel (users join a channel on login and leave it on logout), we create a Postgrex notifications process and listen for triggers:

    {:ok, pid} = Postgrex.Notifications.start_link(repo_config)
    {:ok, ref} = Postgrex.Notifications.listen(pid, "#{@channel}#{Enum.at(args, 0)}")

When the user logs out or leaves the channel for some other reason (e.g. closing the browser), we unlisten and stop the process:

    Postgrex.Notifications.unlisten(state.pid, state.ref)
    GenServer.stop(state.pid)

Is this a good approach?

Other approaches I have in mind:

  1. Create a single listener GenServer and have it broadcast to the respective user. (This may create a bottleneck.)
  2. Create a pool of GenServers: a main listener GenServer receives from Postgres and hands each message to a worker from the pool, which does some processing and then broadcasts to the user's channel. (The processing is light data manipulation, just adding and checking some fields on the user struct, nothing heavy.)
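The single-listener idea (option 1) could be sketched like this. This is a hedged sketch, not code from the original post: the `MyApp.NotificationListener` module name, the `"user_notifications"` Postgres channel, the JSON payload shape, and the `MyAppWeb.Endpoint` broadcast are all assumptions.

```elixir
defmodule MyApp.NotificationListener do
  @moduledoc """
  One GenServer per node holding a single Postgrex notifications
  connection. It listens on one Postgres channel and fans messages
  out to per-user Phoenix channel topics.
  """
  use GenServer

  # Assumed name of the Postgres channel your triggers NOTIFY on.
  @pg_channel "user_notifications"

  def start_link(repo_config) do
    GenServer.start_link(__MODULE__, repo_config, name: __MODULE__)
  end

  @impl true
  def init(repo_config) do
    # Dedicated notifications connection, separate from the Repo pool.
    {:ok, pid} = Postgrex.Notifications.start_link(repo_config)
    {:ok, ref} = Postgrex.Notifications.listen(pid, @pg_channel)
    {:ok, %{pid: pid, ref: ref}}
  end

  @impl true
  def handle_info({:notification, _pid, _ref, @pg_channel, payload}, state) do
    # Assumes the trigger sends a JSON payload that includes the user id.
    %{"user_id" => user_id} = Jason.decode!(payload)

    # Fan out to the user's channel topic; Phoenix delivers it to
    # whichever channel processes are subscribed to "user:<id>".
    MyAppWeb.Endpoint.broadcast("user:#{user_id}", "db_event", %{payload: payload})
    {:noreply, state}
  end
end
```

With this shape, adding option 2 later is mostly a matter of replacing the body of `handle_info/2` with a dispatch to a worker pool.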

I think every Postgres listener will have its own connection to the database, which might be problematic if you expect a lot of users.

I would start with a single listener, keep an eye on it, and then reach for something more involved if it proves to be a bottleneck.


Are they not using a connection pool? And if they are using a connection pool, is it still problematic?

The notifications process is a dedicated connection; it is NOT using the pool. It fundamentally can't: it needs to reserve the connection to receive messages, and it can't simultaneously do that and be available for queries in the pool.

You'll want to run one of these per node and then do an Elixir-level fan-out of those messages to the other processes that care about them.
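Concretely, "one per node plus fan-out" could look like the sketch below. It is a hedged illustration; the `MyApp.NotificationListener` module, the `MyApp.PubSub` name, the `"user:<id>"` topic, and the `"db_event"` event name are assumptions carried over from nothing in this thread.

```elixir
# In MyApp.Application, start exactly one listener per node
# alongside the endpoint:
children = [
  MyAppWeb.Endpoint,
  {MyApp.NotificationListener, Application.fetch_env!(:my_app, MyApp.Repo)}
]

# Channel processes are already subscribed to their own topic, so an
# Endpoint.broadcast/3 from the listener reaches them. Any OTHER
# process on the node (a LiveView, a worker) can opt in explicitly:
Phoenix.PubSub.subscribe(MyApp.PubSub, "user:42")

# ...and then handle the fanned-out message like any other:
def handle_info(%Phoenix.Socket.Broadcast{event: "db_event", payload: payload}, state) do
  # per-user processing goes here
  {:noreply, state}
end
```

The key point is that only the single listener holds a dedicated Postgres connection; everything downstream is ordinary in-VM message passing.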
