Async HTTP requests using a single connection

I’m developing an application that needs to make frequent API calls to a web server, but the server only allows one active HTTP connection at a time. Since I need to make several hundred requests every few minutes, I need this to be done asynchronously. I’ve thought about creating a hackney pool with a limit of one connection and using it for the requests, as sketched below. Does that sound like a good solution? How could I optimize performance in such a situation?
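
Roughly what I had in mind (just a sketch, I’m not sure I have the pool options exactly right):

```elixir
# A hackney pool capped at one connection, shared by all requests.
# The pool name, timeout and URL are placeholders.
children = [
  :hackney_pool.child_spec(:api_pool, timeout: 15_000, max_connections: 1)
]

Supervisor.start_link(children, strategy: :one_for_one)

# Each request then goes through that pool:
:hackney.request(:get, "https://example.com/api/resource", [], "", pool: :api_pool)
```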
Thanks in advance.

Does the remote server support HTTP/2? I’d consider looking into Mint (https://elixir-lang.org/blog/2019/02/25/mint-a-new-http-library-for-elixir/); it’s an HTTP client that gives you the kind of low-level control you need for this.
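
For example, here’s a rough sketch of multiplexing several requests over a single HTTP/2 connection with Mint (the hostname and paths are placeholders):

```elixir
# One connection, negotiated as HTTP/2.
{:ok, conn} = Mint.HTTP.connect(:https, "example.com", 443, protocols: [:http2])

# Fire off several requests without waiting; each gets its own request ref.
{:ok, conn, _ref1} = Mint.HTTP.request(conn, "GET", "/api/items/1", [], nil)
{:ok, conn, _ref2} = Mint.HTTP.request(conn, "GET", "/api/items/2", [], nil)

# Responses arrive as messages to the owning process, tagged by request ref,
# so they can complete in any order.
receive do
  message ->
    case Mint.HTTP.stream(conn, message) do
      {:ok, _conn, responses} ->
        # responses is a list of {:status, ref, code}, {:headers, ref, headers},
        # {:data, ref, chunk} and {:done, ref} tuples.
        IO.inspect(responses)

      :unknown ->
        # Not a message for this connection.
        :ignore
    end
end
```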


I think OP’s use of the word “async” might confuse some of us. I think he wants requests processed in order, but by async he means fire off a request, keep working, get a callback/message when the result is available.

If I’m right, then a simple GenServer pulling requests out of its mailbox, sending them and waiting for the result synchronously, then sending a message back with the result, would work. This also has the advantage of isolating clients from the choice of HTTP library, which would allow for trying out different ones. Something like the sketch below.
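
A minimal sketch (`HTTPClient.get/1` is a placeholder for whichever HTTP library you end up using):

```elixir
defmodule RequestWorker do
  use GenServer

  def start_link(opts \\ []), do: GenServer.start_link(__MODULE__, :ok, opts)

  # Fire-and-forget: the caller gets a {:result, url, response} message later.
  def request(pid, url), do: GenServer.cast(pid, {:request, url, self()})

  @impl true
  def init(:ok), do: {:ok, nil}

  @impl true
  def handle_cast({:request, url, caller}, state) do
    # The HTTP call happens here, one request at a time, which naturally
    # respects the single-connection constraint. HTTPClient.get/1 is a stand-in.
    result = HTTPClient.get(url)
    send(caller, {:result, url, result})
    {:noreply, state}
  end
end
```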

Hi, thanks for the response. Actually, there is no need to preserve order; async as async goes, the requests are totally independent and can be processed in whichever order they arrive. What worries me is how to achieve optimal concurrency using only one connection.

This looks very promising indeed. Thanks for the reply, I’ll look into it. As for HTTP/2, I’m not sure, I’ll have to check. In your opinion, would HTTP/1 not allow multiple requests to be fired at the server through a single connection, with the responses awaited afterwards? Would they be treated synchronously by the server? Thanks!

As I understand it, that’s correct: HTTP/1 does not let you multiplex different requests over a single connection.


They must be handled in order. With HTTP/1.1 pipelining you can send request #2 before getting the response to request #1, so you can overlap some I/O, but the responses come back in order, and 99.99% likely the server will process them strictly sequentially. There is basically no concurrency over a single HTTP/1 connection, so you cannot have “optimum concurrency” for your requests: your provider is explicitly forbidding it. To illustrate with a raw-socket sketch (example.com stands in for your server, and many servers won’t even accept pipelined requests):
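
```elixir
# Two requests pipelined over one HTTP/1.1 connection: they can be sent
# back to back, but the responses always come back strictly in order.
{:ok, socket} = :gen_tcp.connect(~c"example.com", 80, [:binary, active: false])

request = fn path ->
  "GET #{path} HTTP/1.1\r\nHost: example.com\r\n\r\n"
end

:ok = :gen_tcp.send(socket, request.("/first"))
:ok = :gen_tcp.send(socket, request.("/second"))

# Whatever the server does internally, the bytes of /first's response arrive
# before /second's -- there is no interleaving on an HTTP/1 connection.
{:ok, data} = :gen_tcp.recv(socket, 0)
IO.puts(data)
```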
