Compilation going out of memory

I am trying to dynamically create Elixir modules that have binary file contents embedded in their functions.

It boils down to this:

defmodule EmbeddedFile do
  # the file path is just a placeholder
  contents = File.read!("./some_file.bin")

  def contents do
    # embeds the binary as a literal in the function body
    unquote(contents)
  end
end
When compiling this, I observe that compilation takes very long and often runs out of memory. Memory usage seems to grow far faster than linearly with the size of the binary: even a 5 MB file triggers this, and I have 8 GB of RAM.

Does anyone have an idea of what’s going on here?


Why do this? If you’re trying to send files via a web server use

I wanted to run a webserver as an escript, similar to what erlang PL does, so it needs to embed all “files” inside the escript. I’ve now looked at how they do it and am taking a different approach: embedding the files in the escript’s .zip archive.
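For completeness, reading a file back out of the escript’s zip archive at runtime might look roughly like this (a sketch only: the embedded path is a placeholder, error handling is omitted, and it assumes the escript body is an archive section):

defmodule EscriptFiles do
  # Extract the zip archive section from the currently running escript
  # and read one embedded file from it into memory.
  def read!(path) do
    {:ok, sections} = :escript.extract(:escript.script_name(), [])
    {:archive, archive} = List.keyfind(sections, :archive, 0)

    {:ok, [{_name, contents}]} =
      :zip.unzip(archive, [{:file_list, [String.to_charlist(path)]}, :memory])

    contents
  end
end

This keeps the binaries out of compiled module code entirely, so compilation time and memory are unaffected by the size of the embedded files.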

However, my question still stands: the memory usage is quite surprising. @josevalim mentions something similar in this post; however, there it’s only about the slowness of compilation, not the memory usage.

Fair point! I don’t know offhand; reading very large files in the module body itself is fine, but it seems that placing that value inside a function is what causes the issues.
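To illustrate the distinction (module names and the file path are placeholders): reading the file in the module body only uses compile-time memory that is released after compilation, whereas unquoting the binary into a function body embeds it as a literal in the AST, which is where affected compiler passes blow up.

defmodule FineAtCompileTime do
  # The large binary exists only during compilation; just a small
  # derived value (an integer) is embedded in the function.
  size = byte_size(File.read!("priv/data.bin"))
  def size, do: unquote(size)
end

defmodule TriggersTheBlowup do
  contents = File.read!("priv/data.bin")
  # Unquoting a multi-megabyte binary here makes it a literal in the
  # function's AST, reproducing the memory explosion described above.
  def contents, do: unquote(contents)
end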

AFAIK there is a compiler “bug” that results in binaries being expanded into separate characters, only to be collected back together in a later compiler pass. A fix by @josevalim is scheduled for OTP 20 (and is already in the release candidate).


Thanks for the link, looking forward to OTP 20 now even more :slight_smile:
