Serving images in a production environment

I’m building an ecommerce store on AWS using Nginx and Elixir/Phoenix where an admin can upload pictures for products. There is also a gallery page where an admin can upload any number of pictures. Needless to say, from the second a user visits the site, through viewing products, their cart, and finally checking out, they will be loading and viewing pictures constantly along the way.

I’ve read about a method where you’re constantly running mix phx.digest, and someone has mentioned MongoDB as a good way to handle files. Obviously I could use Photoshop or something else to compress the pictures before upload, but I wouldn’t want the admin user to have to worry about this, and taking care of it programmatically is ideal.

To best optimize my website, how should I handle uploading and serving images with Elixir and Phoenix?

I think you need something like AWS S3.

1 Like

That’s been a thought on my mind; I wasn’t sure if that would handle compression as well. I’ll go read about it.

My preferred method of handling images is to store the raw images somewhere like S3, and then use a resizing image proxy to resize them on the fly. Typically, the resizing parameters are in the image tag’s URL, so they are easy to change without having to regenerate resized images. Resizing image proxies can also change the image format and the crop or aspect ratio.

There are commercial proxies such as Imgix and Cloudinary, and there are open source self-hosted ones such as Imgproxy and Thumbor. They all have different features, advantages, and disadvantages.

When self-hosting, you’ll want to put the proxy behind a CDN such as Fastly or CloudFront.

There are Hex packages for building the image URLs for some of these proxies: imgproxy, thumbor_client, thumbox, imgex, and a bunch for Cloudinary.
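To make that concrete, here is a minimal sketch (not taken from any of those packages) of building an imgproxy-style resize URL by hand, assuming imgproxy’s documented signed-URL layout of /%signature/%processing_options/%base64_source_url. The base URL, module name, and env variable names are placeholders.

    defmodule MyApp.ImageUrl do
      @imgproxy_base "https://img.example.com"

      def resized_url(source_url, width, height) do
        # imgproxy's key and salt are configured as hex strings on the server side
        key = Base.decode16!(System.fetch_env!("IMGPROXY_KEY"), case: :lower)
        salt = Base.decode16!(System.fetch_env!("IMGPROXY_SALT"), case: :lower)

        # the resize options live in the URL, so changing sizes never touches stored files
        path =
          "/rs:fit:#{width}:#{height}/" <>
            Base.url_encode64(source_url, padding: false)

        # sign the path with HMAC-SHA256 over salt <> path, base64url-encoded
        signature =
          :crypto.mac(:hmac, :sha256, key, salt <> path)
          |> Base.url_encode64(padding: false)

        @imgproxy_base <> "/" <> signature <> path
      end
    end

    # in a template:
    # <img src={MyApp.ImageUrl.resized_url("https://my-bucket.s3.amazonaws.com/photo.jpg", 800, 0)}>

Because the size lives in the URL, swapping 800 for another width only changes the markup, never the stored originals.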

Currently, I’m using imgproxy hosted on a Render instance, fronted by CloudFront. Because imgproxy has a Dockerfile and Render makes it easy to set up a web service just by supplying a git repo that contains a Dockerfile, it took me less than 5 minutes to get it running.

12 Likes

The arc library allows you to upload files and set up whatever rules you want to happen when a file gets uploaded. For example, you could have an admin upload a single source image of a product and then configure arc to use ImageMagick to compress that image and also create a few thumbnail sizes of it automatically.

Then you can serve those images directly with nginx. Also, arc lets you configure it to write files directly to S3 instead of the local file system if you prefer that.
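As a rough sketch of what such a definition might look like (the module name, sizes, and quality flag are illustrative, and arc’s storage still has to be configured separately, e.g. config :arc, storage: Arc.Storage.S3):

    defmodule MyApp.ProductImage do
      use Arc.Definition

      # generate a compressed original plus a thumbnail for every upload
      @versions [:original, :thumb]

      # shell out to ImageMagick's convert for each version
      def transform(:original, _), do: {:convert, "-strip -quality 85"}

      def transform(:thumb, _) do
        {:convert, "-strip -thumbnail 250x250^ -gravity center -extent 250x250"}
      end

      # store under a per-product prefix; `scope` is whatever you pass alongside the file
      def storage_dir(_version, {_file, product}), do: "uploads/products/#{product.id}"
    end

    # usage: MyApp.ProductImage.store({%Plug.Upload{} = upload, product})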

3 Likes

Here’s what I came up with:


  def compress_resize_and_upload_to_s3(images_params) do
    Enum.map(images_params, fn %Plug.Upload{filename: filename, path: path} ->
      # prefix the filename with a UUID so uploads can't collide in the bucket
      random = UUID.uuid4()
      unique_filename = "#{random}-#{filename}"
      {:ok, local} = Application.fetch_env(:app, :local_url)
      local_filepath = local <> "/" <> unique_filename
      # resize options for TinyPNG's resize API (not wrapped by Tinify yet)
      {:ok, body} = Poison.encode(%{resize: %{method: "scale", height: 800}})

      case Tinify.from_file(path) do
        {:ok, %{url: url}} ->
          # ask TinyPNG to scale the compressed image, then write it to disk
          with {:ok, %Response{body: binary}} <- request("post", url, body),
               :ok <- File.write(local_filepath, binary) do
            :ok
          end

        {:error, error} ->
          raise error
      end

      # read the resized file back and push it to S3 with a public-read ACL
      {:ok, smaller_image_binary} = File.read(local_filepath)

      ExAws.S3.put_object("S3_BUCKET_NAME", unique_filename, smaller_image_binary,
        acl: :public_read
      )
      |> ExAws.request!()

      # return the public URL that gets stored with the product
      "S3_BUCKET_URL/#{unique_filename}"
    end)
  end

Using Tinify, which is an Elixir API for TinyPNG, I send my image using from_file/1, which returns a URL pointing to the compressed image. I then make another request to their resizing API, which Tinify does not support directly yet, scaling the image down to a height of 800px. After we write the smaller image to a folder on the local system, we save it to AWS S3. From 3 MB image files to less than 200 KB!

There’s some opportunity for optimization here; it takes about 6 seconds for one picture. Instead of saving it locally and then uploading to S3, I could probably just stream and upload. If anyone has any other suggestions, let me know! I appreciate you guys.

1 Like

You could chunk it, so that each packet you receive gets forwarded right back out?

Though a redirect to the S3 bucket (with a short-lived access key, a few seconds long?) would probably be easier, or just linking it directly in the HTML?
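For reference, ExAws can generate that kind of short-lived link with presigned URLs; a minimal sketch, with placeholder bucket and object names:

    # generate a GET URL that expires after 60 seconds
    {:ok, url} =
      ExAws.Config.new(:s3)
      |> ExAws.S3.presigned_url(:get, "my-bucket", "uploads/#{unique_filename}", expires_in: 60)

    # then redirect(conn, external: url), or render the URL straight into the <img> tag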

@Codball Is that done in the background, or does it block the UI? I mean, 6 seconds for each image is a lot.

It does block the UI; it is a bit much :upside_down_face:
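One way to take it off the request path (not from this thread, just a sketch) is to run the whole thing under a Task.Supervisor assumed to be started in the application’s supervision tree, persisting the URLs once the task finishes; Products.attach_image_urls/2 below is a hypothetical persistence call:

    # fire-and-forget: the controller responds immediately while the upload runs in the background
    Task.Supervisor.start_child(MyApp.TaskSupervisor, fn ->
      urls = compress_resize_and_upload_to_s3(images_params)
      # hypothetical function that stores the returned URLs on the product
      Products.attach_image_urls(product_id, urls)
    end)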

I’ve realized I don’t have to save it locally at all and can just use the binary that is returned from the request call.

random = UUID.uuid4()
unique_filename = "#{random}-#{filename}"
# resize options for TinyPNG's resize API
{:ok, body} = Poison.encode(%{resize: %{method: "scale", height: 800}})

case Tinify.from_file(path) do
  {:ok, %{url: url}} ->
    # ask TinyPNG to scale the compressed image and push the returned binary straight to S3
    with {:ok, %Response{body: binary}} <- request("post", url, body) do
      ExAws.S3.put_object("S3_BUCKET_NAME", unique_filename, binary, acl: :public_read)
      |> ExAws.request!()

      "S3_BUCKET_URL/#{unique_filename}"
    end

  {:error, error} ->
    raise error
end

Still pretty slow though :confused:
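Most of that time is the two Tinify round trips plus the S3 upload, so when several images come in at once, one mitigation (not from the thread, just a sketch) is to run them concurrently; upload_one/1 stands in for a hypothetical per-image version of the function above:

    # process up to 4 images at a time, each with its own 30s budget
    images_params
    |> Task.async_stream(&upload_one/1, max_concurrency: 4, timeout: 30_000)
    |> Enum.map(fn {:ok, url} -> url end)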