Running an Oban job on every node connected to a central database

What I’d like to do is run a job on every node connected to the same Postgres database. Specifically, I want to refresh ETS tables on every running node. The nodes are not clustered via Erlang, so I’d like to use Postgres as the “signaling” layer.

Is there a facility in Oban to achieve this kind of behavior that I missed?


Not Oban

You can use Postgres LISTEN/NOTIFY; at a low node count it will be fine.
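For reference, here is a minimal sketch of that approach using Postgrex.Notifications. The module name, channel name, and the `refresh_ets_tables/0` helper are illustrative, not from the thread:

```elixir
defmodule MyApp.EtsRefresher do
  use GenServer

  @channel "ets_refresh"

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  @impl true
  def init(conn_opts) do
    # Dedicated connection (e.g. started with MyApp.Repo.config()) that
    # issues LISTEN on our channel.
    {:ok, pid} = Postgrex.Notifications.start_link(conn_opts)
    {:ok, _ref} = Postgrex.Notifications.listen(pid, @channel)
    {:ok, pid}
  end

  @impl true
  def handle_info({:notification, _pid, _ref, @channel, _payload}, state) do
    # Every node running this listener receives the NOTIFY.
    refresh_ets_tables()
    {:noreply, state}
  end

  # Placeholder for the actual ETS rebuild.
  defp refresh_ets_tables, do: :ok
end
```

Any node can then signal all of the others with a plain `SELECT pg_notify('ets_refresh', '')`, e.g. via `Ecto.Adapters.SQL.query(MyApp.Repo, "SELECT pg_notify('ets_refresh', '')")`.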

Yes, but I suspect it can also be done with Oban, for example by setting up a separate queue per node: each node gets its own queue. You need to come up with a naming scheme and also keep track of the nodes/queues you have (see the sketch after the link below).

https://hexdocs.pm/oban/splitting-queues.html
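A hedged sketch of the per-node queue idea, assuming Oban 2.x; `MyApp.RefreshWorker` and `known_nodes/0` are hypothetical names, and tracking node names is exactly the bookkeeping mentioned above:

```elixir
defmodule MyApp.RefreshWorker do
  use Oban.Worker

  @impl Oban.Worker
  def perform(_job) do
    # Rebuild this node's ETS tables here.
    :ok
  end
end

# On boot, each node starts a queue named after itself. The
# local_only: true option keeps the queue from starting on other nodes.
Oban.start_queue(queue: :"refresh_#{node()}", limit: 1, local_only: true)

# To fan out, insert one job per tracked node; only the node running
# that particular queue will execute its job.
for node_name <- known_nodes() do
  %{}
  |> MyApp.RefreshWorker.new(queue: :"refresh_#{node_name}")
  |> Oban.insert()
end
```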

Now, if you are not running in an Erlang cluster but have a functioning Phoenix.PubSub via a Redis backend or another adapter, you can also just use the PubSub mechanism to do that.
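A brief sketch of the PubSub route; `MyApp.PubSub` and the topic name are assumptions, and the Redis adapter referred to is phoenix_pubsub_redis:

```elixir
# Each node subscribes once, e.g. in the init/1 of a GenServer that
# then rebuilds its ETS tables on receiving :refresh_ets in handle_info/2.
Phoenix.PubSub.subscribe(MyApp.PubSub, "ets:refresh")

# Any node can signal every subscriber; with the Redis adapter this
# works without an Erlang cluster.
Phoenix.PubSub.broadcast(MyApp.PubSub, "ets:refresh", :refresh_ets)
```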

I also thought about that, but it makes instrumentation harder, because you then need to track your nodes outside of the code. That means more infrastructure bookkeeping :slight_smile:

Regarding Redis: if I were to employ Redis, I’d probably skip using ETS at that point. The key here is to use as few moving parts as feasible in order to keep complexity at bay.

@evadne yeah, that was also my conclusion, since Oban is basically doing the same thing, at least in the basic setup :smiley:

Thanks for the sanity check!


I’ll leave it “unresolved” for a bit, just in case we all missed an Oban feature that makes this easily possible and someone else wants to chime in :slight_smile:

Oban cron is designed to run on only one node, quite the opposite of what you’re going for. However, there is an undocumented function that plugins use to run on cron schedules.

For example:

```elixir
Oban.Cron.schedule_interval(self(), :refresh, "0 * * * *")
```

That will send the message :refresh to the current process at the top of every hour.
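Assuming the function behaves as described, here is a minimal sketch of wiring it into a process that each node starts on its own; the module and helper names are illustrative:

```elixir
defmodule MyApp.CronRefresher do
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  @impl true
  def init(state) do
    # Asks Oban's cron machinery to send us :refresh at the top of every hour.
    Oban.Cron.schedule_interval(self(), :refresh, "0 * * * *")
    {:ok, state}
  end

  @impl true
  def handle_info(:refresh, state) do
    # Because each node starts its own copy of this process, the refresh
    # runs everywhere, unlike a regular single-node Oban cron job.
    rebuild_ets_tables()
    {:noreply, state}
  end

  # Placeholder for the actual ETS rebuild.
  defp rebuild_ets_tables, do: :ok
end
```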
