So I’m building a background service that periodically (once a minute) will reach out to a number of machines via HTTP or plain TCP. Each interaction should only take 100-200ms.
I’ve put each machine into its own GenServer and register those globally via Registry, so I can reuse them and ensure that a) I don’t overwhelm a remote machine by accident, and b) requests for each machine are serialized.
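For concreteness, here's a minimal sketch of that setup. The names (`MachineWorker`, `MyApp.MachineRegistry`) are placeholders of mine, and the Registry is assumed to be started elsewhere in the supervision tree with `Registry.start_link(keys: :unique, name: MyApp.MachineRegistry)`:

```elixir
defmodule MachineWorker do
  use GenServer

  # Each worker is addressed by a via-tuple keyed on the machine's host,
  # so there is at most one worker per machine.
  defp via(host), do: {:via, Registry, {MyApp.MachineRegistry, host}}

  def start_link(host) do
    GenServer.start_link(__MODULE__, host, name: via(host))
  end

  # A GenServer processes one message at a time, so all requests for a
  # given host are serialized through its worker.
  def poll(host), do: GenServer.call(via(host), :poll)

  @impl true
  def init(host), do: {:ok, host}

  @impl true
  def handle_call(:poll, _from, host) do
    # ... perform the ~100-200ms HTTP/TCP interaction here ...
    {:reply, :ok, host}
  end
end
```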
However, as the number of machines will scale to around 1000 (generous upper limit), I wonder if I will face the dreaded “out of file descriptors” error.
Since I’m being careful not to keep file descriptors open, I can always just bump the file descriptor limit, but I wonder if there is some other hidden Erlang/Elixir limit I will hit before that.
I know there’s a port limit, but that is high (65k by default) and this VM doesn’t do anything else, so for plain TCP connections using :gen_tcp I should be fine.
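For reference, the port limit (and current usage) can be inspected at runtime; each socket opened via :gen_tcp consumes a port, and the limit can be raised with the `+Q` emulator flag if needed:

```elixir
# Inspect the BEAM's port limit and how many ports are currently in use.
IO.inspect(:erlang.system_info(:port_limit), label: "port limit")
IO.inspect(:erlang.system_info(:port_count), label: "ports in use")
```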
For HTTP, I’m using straight HTTPoison with the default options, which, as far as I can tell, doesn’t use any connection pooling. Now, I’m not sure whether I want to use HTTP keepalive, since I don’t trust the remote HTTP servers to do the right thing with respect to keepalive, and I don’t have control over them. I’ve seen weird bugs in the past.
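If I decide I definitely don’t want keepalive, my understanding (treat this as a sketch, not gospel) is that I can opt out per request by bypassing hackney’s pool and asking the server to close the connection; the URL here is a placeholder:

```elixir
# Opt out of keep-alive for a single request:
# - `pool: false` tells hackney not to check the connection into a pool
# - the Connection: close header asks the server to close after responding
HTTPoison.get(
  "http://example.com/status",
  [{"Connection", "close"}],
  hackney: [pool: false]
)
```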
Any other thing I should be aware of?