I’ve been looking at my application in terms of resource usage. It’s a simple SQS consumer that calls some APIs and is under a very small load, yet the BEAM constantly reports usage of around 70MB of memory.
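For reference, this is roughly how I’m checking it from an `iex` session attached to the app (the actual numbers will of course differ per VM):

```elixir
# Break the total down by allocator category so you can see where it goes:
# processes, atoms, binaries, code, ETS, etc. Values are in bytes.
:erlang.memory()
|> Enum.map(fn {type, bytes} -> {type, Float.round(bytes / 1_048_576, 1)} end)
|> IO.inspect(label: "memory (MB)")
```

The breakdown is more useful than the headline total, since it tells you whether the bulk is process heaps, loaded code, or something else.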
Each module, and every function in every module, is an atom. If you’re loading a few libraries, hitting 22K atoms isn’t hard. I’m not sure what to think of the memory. It’s not so bad compared to other high-level language VMs, though, is it? I’m sure Go or Rust would use a lot less, but a better comparison is Ruby, Python, or Java.
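You can verify the atom count yourself against the default table limit:

```elixir
# Number of atoms currently in the table vs. the configurable limit
# (the default limit is 1,048,576; raise it with the +t emulator flag).
:erlang.system_info(:atom_count)
:erlang.system_info(:atom_limit)

# Bytes of memory the atom table itself is using
:erlang.memory(:atom)
```

So 22K atoms is only about 2% of the default limit, and the table itself is typically well under a megabyte.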
I seem to recall a minimal web/proxy thing I wrote using about 24MB… There is an awful lot of runtime data available in the BEAM. You can easily find out which processes have the largest heaps, etc. I’ve used the base Erlang system info calls, but I bet somebody here can guide you to easier-to-use tools that will let you drill down and find out where the big uses are.
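With just the built-in calls, finding the heaviest processes looks something like this:

```elixir
# Top five processes by total memory (heap + stack + internal structures).
# Process.info/2 returns nil for processes that exit mid-scan, so filter those.
Process.list()
|> Enum.map(fn pid -> {pid, Process.info(pid, :memory)} end)
|> Enum.reject(fn {_pid, info} -> info == nil end)
|> Enum.map(fn {pid, {:memory, bytes}} -> {pid, bytes} end)
|> Enum.sort_by(fn {_pid, bytes} -> bytes end, :desc)
|> Enum.take(5)
```

If you want something nicer than hand-rolling this, the `recon` library wraps the same idea as `:recon.proc_count(:memory, 5)`, and `:observer.start()` gives you the GUI view.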
If you do try building a release, I’m curious to see what the memory usage turns out to be. I made a quick comparison in a project of my own and also in a plain new Phoenix project, and in both cases memory use was much higher in the release. Why do releases use so much more memory than `mix run`?
As I pointed out, though, my comparison may be unfair: not all code was necessarily loaded in the mix version when I checked memory use, while releases load all modules up front. But you should be able to get an accurate comparison.
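A sketch of how you could level the playing field, assuming you run this in the `mix run` session before measuring (module loading on demand is the default in interactive mode, while releases use embedded mode):

```elixir
# How many modules are loaded right now?
before = length(:code.all_loaded())

# Force-load every module of every loaded application and its deps,
# approximating what a release does at boot.
for {app, _desc, _vsn} <- Application.loaded_applications(),
    {:ok, mods} = :application.get_key(app, :modules),
    mod <- mods do
  Code.ensure_loaded(mod)
end

after_count = length(:code.all_loaded())
{before, after_count, :erlang.memory(:code)}
```

If the `code` figure in `:erlang.memory/0` jumps to roughly the release’s number after preloading, that would confirm the gap is mostly eager module loading rather than something release-specific.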