How to troubleshoot/optimise memory usage with Observer?

Hey folks!

I’ve got a basic Cowboy/Plug-centered Elixir app that processes a lot of data inputs. I’m trying to deploy it to Fly.io, which works, but I keep blowing out the available memory, causing the VM to shut down. Rather than keep scaling up the resources, I’d like to figure out what is causing the peak memory consumption.

I’ve been able to observe it running with :observer, but I’m not really able to pin down which call in the app is maxing out memory. Anyone know of a way to instrument this, or catch it with Observer? Hopefully this makes sense.
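
For reference, I can pull the overall numbers with something like this from a remote iex session (roughly what Observer’s System tab shows), but it only tells me which category is big, not which part of the app is responsible:

```elixir
# Quick snapshot of total memory by category (bytes). Tells you *what kind*
# of memory is growing (processes, binary, ets, ...), not which call did it.
:erlang.memory()
|> Enum.sort_by(fn {_type, bytes} -> bytes end, :desc)
|> Enum.each(fn {type, bytes} ->
  IO.puts("#{type}: #{div(bytes, 1_000_000)} MB")
end)
```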

2 Likes

If you are processing a lot of binary data, you could be running into issues with large reference-counted binaries. While :observer is a great tool, I would also take a look at Recon, specifically the :recon.bin_leak/1 function. It can help you pinpoint binary-related memory leaks. Recon has a bunch of other utility functions as well, so definitely review the docs!
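
For example, something along these lines from an attached iex session (assuming recon is already in your deps; the version below is just a placeholder):

```elixir
# In mix.exs (version is a guess): {:recon, "~> 2.5"}
#
# Forces a garbage collection on every process, then reports the N processes
# that released the most reference-counted binaries -- prime suspects when
# large binaries are being held onto longer than expected.
:recon.bin_leak(10)
# Each result is {pid, delta_in_binary_count, info}, where the info list
# includes the registered name / initial call so you can tell who it is.
```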

I would also suggest taking a look at Chapter 7 of Erlang in Anger, as it has a lot of good information about memory leaks on the BEAM and how to diagnose and fix them.

5 Likes

Thanks much. I’m pretty sure it isn’t a memory leak. :smiley:

It’s more a case of having the data in memory for reasons. I can do the old-fashioned IO.puts for each process to try and catch it.
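
Something rough like this is what I had in mind (just dumping per-process memory from an iex session):

```elixir
# Print every process with its registered name and current memory footprint,
# so the big ones stand out. Process.info/2 returns nil for processes that
# exited between Process.list/0 and the call, so skip those.
Process.list()
|> Enum.each(fn pid ->
  case Process.info(pid, [:memory, :registered_name]) do
    nil ->
      :ok

    info ->
      IO.puts("#{inspect(pid)} #{inspect(info[:registered_name])}: " <>
                "#{div(info[:memory], 1024)} KB")
  end
end)
```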

I’ll definitely look at Recon though!

1 Like

Erlang in Anger can be a great resource: https://www.erlang-in-anger.com/

It talks about Recon too iirc.

If it’s memory, Observer will help you see what kind (ETS? process memory? something else?), and that can be very helpful.
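
If it turns out to be ETS, something like this lists the biggest tables (note that :ets.info/2 reports memory in words, not bytes):

```elixir
# Biggest ETS tables by memory. :ets.info(tab, :memory) returns a word count,
# so convert to bytes with the VM's word size. Tables can disappear between
# :ets.all/0 and the info call, in which case :ets.info/2 returns :undefined.
word_size = :erlang.system_info(:wordsize)

:ets.all()
|> Enum.map(fn tab -> {tab, :ets.info(tab, :memory)} end)
|> Enum.reject(fn {_tab, words} -> words == :undefined end)
|> Enum.map(fn {tab, words} -> {tab, words * word_size} end)
|> Enum.sort_by(fn {_tab, bytes} -> bytes end, :desc)
|> Enum.take(10)
```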

Looking at the processes and sorting by their memory usage can also be helpful. Maybe one process is using a lot, or maybe you’ve spawned a ton of processes, all of which use enough that it adds up quickly (the latter has happened to me, and I diagnosed it with Observer!).
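
If you end up pulling in Recon as suggested above, it has one-liners for exactly that kind of thing:

```elixir
# Top 10 processes by current memory usage.
:recon.proc_count(:memory, 10)

# Top 10 processes by memory *growth* over a 5 second sampling window --
# handy when the hog only shows up while requests are being processed.
:recon.proc_window(:memory, 10, 5_000)
```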