Is there a way to reject a message while already processing a request?

Hello,

I’m new to Elixir development and I can’t find a way to do what I want.

I have a GenServer service that executes multiple HTTP calls in a row and then waits for the next message to process.

On the other side, I have an API server that exposes this service.

I’d like to instantly reject new incoming messages, replying that something is already in progress.
But every message is stored in the mailbox, waiting for the server to become available again.

Is there a way to limit the mailbox size?

Thank you

Hello and welcome,

Don’t do this at the level of the mailbox; do it in a queue, or in a manager gen_server…

Also some options are available to manage limited resources, like poolboy.
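
For example, here is a rough sketch of the poolboy idea (the pool name :gate_pool and the worker’s {:run, payload} message are assumptions for illustration): checking out a worker without blocking makes poolboy return :full when everything is busy, so you can reject right away.

```elixir
defmodule Gate.Pooled do
  # Hypothetical helper: try to run `payload` on a pooled worker and
  # reject immediately with {:error, :busy} when the pool is exhausted.
  def run(payload) do
    # Second argument `false` means "do not block waiting for a worker".
    case :poolboy.checkout(:gate_pool, false) do
      :full ->
        {:error, :busy}

      worker when is_pid(worker) ->
        try do
          GenServer.call(worker, {:run, payload})
        after
          # Always return the worker to the pool, even if the call fails.
          :poolboy.checkin(:gate_pool, worker)
        end
    end
  end
end
```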


Thank you for your help!

So, I need to manage the “in-progress” state in another GenServer?
But in that case, I can’t mix sync and async messages, can I?

API → GateManager → Gate

API wants an immediate response: “OK” or “KO already in-progress”

GateManager could manage the state, but it would have to send a sync message to Gate and wait for the response before updating the state. During this time, all new incoming messages are stored in the mailbox and will be processed after the first one.

In the end, since all the messages are synchronous, an HTTP timeout could occur, and another request arriving during the first one would queue a new message in the mailbox that would be processed right after.

You can monitor the queue depth with process_info and do different things in different conditions. I think this is what the Logger backends do (a rough sketch follows the list below):

  • when queue is shallow, do async processing, not to hold up the clients
  • when queue is deeper than a threshold, do sync processing to slow down the clients
  • when queue is even deeper, just drop messages
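
A minimal sketch of the depth check itself (the module name and thresholds are made up for illustration; this is not the actual Logger code):

```elixir
defmodule Gate.LoadCheck do
  # Illustrative thresholds only.
  @sync_threshold 32
  @drop_threshold 256

  # Inspect a process's mailbox depth and decide how to handle new work:
  # :async while shallow, :sync once it grows, :drop when overloaded.
  def mode(pid \\ self()) do
    {:message_queue_len, depth} = Process.info(pid, :message_queue_len)

    cond do
      depth >= @drop_threshold -> :drop
      depth >= @sync_threshold -> :sync
      true -> :async
    end
  end
end
```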

Make sure you need/want a separate process doing this, versus relying on the concurrency provided by the web server already (workers in Cowboy, etc). GenServer and “service” have a lot of the same letters, but they aren’t always the same thing.

That said, the key factor in dividing up this functionality is the question “when are responses possible?”.

A single GenServer that synchronously makes HTTP calls won’t respond to messages until after all those requests are handled.

In order to reply while requests are in-flight, you need at least two processes (one to do the requests, one to reply with “no we’re busy”). They can break apart in multiple ways:

  • the single GenServer could use an asynchronous HTTP client (for instance, HTTPoison’s stream_to option) which has a process under the hood; the GenServer remains responsive to messages and tracks the status of the HTTP requests. This makes the code more complicated, as it’s now spread out across handle_info handlers, but satisfies the liveness requirement.
  • keep the existing, synchronous GenServer but put a “minder” GenServer in front of it. The “minder” handles incoming requests and either sends them via cast to the “request” GenServer or replies “nope”, as well as handling the reply (or EXIT) from the “request” process. This splits apart tracking the “busy” state from actually doing the requests (see the sketch after this list).
  • (building on the above) :poolboy (referenced earlier in this thread) implements almost exactly this pattern, but with many “request” GenServers
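
Here is a rough sketch of the “minder” idea, under assumed names (a Gate.Minder in front of a Gate.Worker that still makes its HTTP calls synchronously): the minder answers callers immediately and only forwards work when it isn’t already busy.

```elixir
defmodule Gate.Minder do
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  # Callers get an immediate :ok or {:error, :busy}.
  def request(payload), do: GenServer.call(__MODULE__, {:request, payload})

  @impl true
  def init(_opts), do: {:ok, %{busy?: false}}

  @impl true
  def handle_call({:request, _payload}, _from, %{busy?: true} = state) do
    # Something is already in progress: reject instantly.
    {:reply, {:error, :busy}, state}
  end

  def handle_call({:request, payload}, _from, %{busy?: false} = state) do
    # Hand the slow work off asynchronously and mark ourselves busy.
    GenServer.cast(Gate.Worker, {:run, payload, self()})
    {:reply, :ok, %{state | busy?: true}}
  end

  @impl true
  def handle_info({:done, _result}, state) do
    # The worker tells us when its HTTP calls have finished.
    {:noreply, %{state | busy?: false}}
  end
end

defmodule Gate.Worker do
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  @impl true
  def init(_opts), do: {:ok, %{}}

  @impl true
  def handle_cast({:run, payload, reply_to}, state) do
    # The synchronous HTTP calls live here; `do_http_calls/1` is a placeholder.
    result = do_http_calls(payload)
    send(reply_to, {:done, result})
    {:noreply, state}
  end

  defp do_http_calls(_payload), do: :ok
end
```

For robustness you would also monitor the worker so the minder can clear its busy? flag if the worker crashes mid-request.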

Yes, it was a way to synchronize state, but I didn’t want subsequent messages waiting in the queue.

I think this is exactly what I want to achieve. I wasn’t able to successfully manage sync/async messages. I’ll try this.

Thank you for all of your responses, it’s a little clearer now. 🙂
I’ll keep you posted on my progress.

I’ll look into this if I really need to manipulate the message queue.
Thank you for pointing out that solution.