Very high memory usage when serving a large file with Phoenix

Hello, I recently began exploring Elixir and Phoenix and decided to build some projects with them.

I need to read a file containing a lot of URLs (currently 1 million for testing) and serve them as JSON to clients.

The controller looks like this:

defmodule FnApiWeb.FetchBlacklist do
  use FnApiWeb, :controller

  # Evaluated at compile time; the full list ends up embedded in the module.
  @blacklist %{
    "sites" =>
      "top1m.bcp"
      |> File.read!()
      |> String.split("\n", trim: true)
      |> Enum.map(fn x -> "*://*." <> x <> "/*" end)
  }

  def index(conn, _params) do
    json(conn, @blacklist)
  end
end

With 500, 1,000, or 10,000 URLs, memory usage and performance are great.
With 1 million, it takes more than 4 GB of RAM just to start the Phoenix server.

I believe I'm doing something wrong and the code should live elsewhere, but I'd like some help with this.

Modules are loaded into memory either on startup or on first use, and that includes your module attribute. If you have a large file, I'd strongly suggest putting the file/JSON within priv/ and using Plug.Conn.send_file/5, which should be far more efficient.
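
A minimal sketch of that approach, assuming the pre-built JSON already lives at priv/blacklist.json and the OTP app is named :fn_api (both hypothetical):

defmodule FnApiWeb.FetchBlacklist do
  use FnApiWeb, :controller

  def index(conn, _params) do
    # Resolve the file inside the app's priv/ directory at runtime
    # (:fn_api is assumed to be the OTP app name here).
    path = Application.app_dir(:fn_api, "priv/blacklist.json")

    conn
    |> put_resp_content_type("application/json")
    # send_file/5 hands the file to the socket instead of loading it
    # onto the BEAM heap, so memory use stays flat regardless of size.
    |> send_file(200, path)
  end
end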

The thing is, I need to build the JSON table from the file first and then send it to the client. Should I do the conversion inside the index function before sending it?

I'd still do the conversion at compile time, but write the result back out to a file instead of storing it on the module.
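
For example, a module along these lines would do the work once at compile time and leave only a static file behind (the module name, output path, and use of Jason are assumptions; Jason is Phoenix's default JSON library):

defmodule FnApi.Blacklist.Build do
  # Everything in a module body runs at compile time. The large list
  # is transformed and written to disk; nothing big is kept in the
  # compiled module itself.

  # Recompile this module whenever the source list changes.
  @external_resource "top1m.bcp"

  patterns =
    "top1m.bcp"
    |> File.read!()
    |> String.split("\n", trim: true)
    |> Enum.map(fn x -> "*://*." <> x <> "/*" end)

  File.write!("priv/blacklist.json", Jason.encode!(%{"sites" => patterns}))
end

The controller can then serve priv/blacklist.json with send_file/5, as in the sketch above.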

Thanks, that worked!
Should this compile-time code live somewhere better, or is the controller the right place for it?