I use webpack to bundle my client-side application, which my build script places inside the priv/static folder. I then route to these bundles based on the subdomain from which they were requested.
Without using hashes in the filenames, I was doing something like:
```elixir
defmodule MyApp.PageController do
  use MyApp.Web, :controller

  # Entry filenames and the resolve logic are illustrative.
  def landing(conn, _params) do
    conn |> put_headers() |> render_entry("landing.html")
  end

  def app(conn, _params) do
    conn |> put_headers() |> render_entry("app.html")
  end

  defp put_headers(conn) do
    conn |> put_resp_header("content-type", "text/html; charset=utf-8")
  end

  defp render_entry(conn, file) do
    path = resolve(file)
    conn |> Plug.Conn.send_file(200, path)
  end

  defp resolve(file) do
    Path.join(Application.app_dir(:my_app, "priv/static"), file)
  end
end
```
But I am not sure how this will be possible when the filenames are not known up front. I suppose you could pattern match on the filenames in the static folder, but that does not seem very efficient.
Just use the :host option in the Phoenix router.
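As a sketch of that idea (the subdomain prefix and route paths are assumptions), a `scope` in the Phoenix router can take a `:host` option so each subdomain hits its own controller action:

```elixir
# Hypothetical router fragment: requests to app.<your-domain> go to :app,
# everything else falls through to :landing.
scope "/", MyApp, host: "app." do
  get "/*path", PageController, :app
end

scope "/", MyApp do
  get "/*path", PageController, :landing
end
```

Route order matters here: the host-scoped routes must come first so they are matched before the catch-all.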
I did solve it, though, by using a regex to match the first, un-hashed part of the filename, since the file is cached on a CDN in any case.
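A minimal sketch of that regex idea (the module name and the `base.<hex>.ext` naming scheme are assumptions about how webpack names the output):

```elixir
defmodule StaticResolver do
  # Matches names like "app.3f9a2c.js": a base name, a hex digest, an extension.
  @hashed ~r/^(?<base>.+)\.(?<hash>[0-9a-f]+)\.(?<ext>\w+)$/

  # Strip the hash segment so the request can be resolved to the
  # un-hashed file on disk; non-hashed names pass through unchanged.
  def unhashed(filename) do
    case Regex.named_captures(@hashed, filename) do
      %{"base" => base, "ext" => ext} -> "#{base}.#{ext}"
      nil -> filename
    end
  end
end

StaticResolver.unhashed("app.3f9a2c.js")  # => "app.js"
```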
These are two ‘solutions’ I can think of right now:
1. Build a system in which the filename is known up front: create a data store (using, for instance, ETS or SQL) containing filename <-> hash pairs. You can then instantly look up which hashed name belongs to which filename (and whether it exists at all).
2. Have people request a file using both the filename and a hash. The hash verifies that they are indeed asking for the proper file (it should be computed with an appended secret known only to you, so the hashes are impossible to predict).
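The first option might be sketched like this (the module name, table name, and `base.<hex>.ext` naming scheme are assumptions); the table is filled once at startup by scanning the static directory:

```elixir
defmodule Static.Manifest do
  @table :static_manifest
  @hashed ~r/^(.+)\.[0-9a-f]+(\.\w+)$/

  # Scan the static directory once and store plain-name => hashed-name pairs.
  def init(static_dir) do
    :ets.new(@table, [:named_table, read_concurrency: true])

    for hashed <- File.ls!(static_dir),
        [_, base, ext] <- [Regex.run(@hashed, hashed)] do
      :ets.insert(@table, {base <> ext, hashed})
    end

    :ok
  end

  # Constant-time-ish lookup: which hashed file serves "app.js"?
  def lookup(plain_name) do
    case :ets.lookup(@table, plain_name) do
      [{^plain_name, hashed}] -> {:ok, hashed}
      [] -> :error
    end
  end
end
```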
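The second option could look something like this (the module name and secret are placeholders): the server derives the hash from the filename plus a server-side secret, so clients cannot predict valid hashes.

```elixir
defmodule Static.SignedName do
  # Placeholder secret; in practice load it from configuration.
  @secret "replace-with-a-real-secret"

  # HMAC the filename with the secret; hex-encode for use in a URL.
  def sign(filename) do
    :crypto.mac(:hmac, :sha256, @secret, filename)
    |> Base.encode16(case: :lower)
  end

  # Prefer a constant-time comparison (e.g. Plug.Crypto.secure_compare/2)
  # in production; plain == is used here to keep the sketch dependency-free.
  def valid?(filename, hash), do: sign(filename) == hash
end
```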
The way Phoenix solves this is by building a .json file (it could be an Elixir file as well) containing the filename => digest mapping. We then load this file when the server starts, for example into an agent, and query the agent for the digest when it is time to serve the file.
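A sketch of that approach, assuming a Jason-style JSON decoder and a simplified manifest of plain `filename => digested filename` pairs (Phoenix's real cache manifest is more structured than this):

```elixir
defmodule Static.Digests do
  use Agent

  # Read and decode the manifest once at boot, then hold it in the Agent.
  def start_link(manifest_path) do
    digests =
      manifest_path
      |> File.read!()
      |> Jason.decode!()

    Agent.start_link(fn -> digests end, name: __MODULE__)
  end

  # Return the digested name, falling back to the plain name if unknown.
  def digested(plain_name) do
    Agent.get(__MODULE__, &Map.get(&1, plain_name, plain_name))
  end
end
```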