Fairness in a message queue system

Hey there,

We are currently using Oban in a multi-tenant application with shared queues.

We are running into a noisy-neighbor problem: one user can flood the queue and take up all the available resources, making everyone else wait.

The problem is described here in more detail:

https://medium.com/thron-tech/multi-tenancy-and-fairness-in-the-context-of-microservices-sharded-queues-e32ee89723fc

We tried out global partitioning with the Smart engine: Oban.Pro.Engines.Smart — Oban Pro v1.5.2
While this works as advertised, it locks the big user to their per-partition limit even when there are no other jobs in the queue, leaving capacity on the table.
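For context, here is a rough sketch of the kind of partitioned config we are running (the queue name and the `:tenant_id` args key are just placeholders; the Pro docs have the authoritative option names):

```elixir
# config/config.exs — sketch of a Smart-engine queue partitioned per tenant
config :my_app, Oban,
  engine: Oban.Pro.Engines.Smart,
  repo: MyApp.Repo,
  queues: [
    shared: [
      # overall concurrency on this node
      local_limit: 20,
      # cap each tenant (partitioned by the :tenant_id job arg) across the cluster
      global_limit: [allowed: 5, partition: [fields: [:args], keys: [:tenant_id]]]
    ]
  ]
```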

Changing the priority when inserting jobs is not going to work; we want to process the big user’s jobs the same way as the small users’. We are looking for fairness, but also full capacity utilization (as far as the queue’s local limits allow).

I also looked at some of the options the article mentions, such as separate queues per customer with a load balancer feeding jobs into the actual queue, but we would end up with a lot of queues, and that seems to be a bad thing.

Has anyone implemented something like this or have some ideas?

I am open to other message queues or systems if they address the concern.

Thank you

Hi @benonymus,

We’ve heard of this issue before. We’re currently slating a “burst” option (for Pro v1.6) that will help keep the queue saturated.
For now, you could rescale the queue manually with Oban.scale_queue/1 if you have periods of lower activity.
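For example, assuming the queue is named :shared and normally runs with a limit of 20 (both placeholders), something like this could run from a scheduled task:

```elixir
# During a quiet window, let the queue use more of the node's capacity.
Oban.scale_queue(queue: :shared, limit: 50)

# When contention picks up again, scale back to the normal limit.
Oban.scale_queue(queue: :shared, limit: 20)
```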
