Local storage vs something like Amazon S3 for files?

Would it be more effective to store fixed media assets (like lessons, videos, or images that I’ve created and aren’t expected to grow dynamically) directly on the server’s local storage instead of using a third-party service like Amazon S3?

I’m not talking about static content served directly, but rather files that are read on client request, rendered, and sent dynamically via LiveView. Since the media is pre-created by me and doesn’t change often or grow due to user input, could storing and serving the content directly from the server simplify the process and potentially improve performance by avoiding the S3 → server → client flow? Also, Phoenix is easily distributed, so I can use that to my advantage if I want to have the media closer to the client in the future.

Why not? I would generally encourage going with local storage by default and only switching to S3 if you need specific features it offers, for example browsing the files from its web interface or having automatic backups.

Performance is obviously better since you aren’t making network calls, but you will have to figure out how to back up your files (if you need to), and that can vary a lot depending on how you deploy your application.


Thanks for the reply! Yes, this does make sense in my mind. I’m just getting started, so these are just some naive questions I have. For example, I want to be able to send an authorized user an image (or even a video) from the server using LiveView. However, as I understand it, LiveView works by diffing changes to the socket assigns.

To achieve this, I think it would require putting a base64-encoded string of the image or video into the assigns, which could result in a lot of data being sent over the WebSocket. My concern is that sockets aren’t really designed to handle large payloads like that, especially for media content, which can be quite large.

Additionally, the media I’m sending is not served via a static path. It’s dynamically fetched from the server based on some kind of user action or authorization, so it needs to be sent on demand rather than being accessible through a public or pre-defined static URL.

So I don’t know if I’m looking at this wrong or just heading in the wrong direction. How would this work for images and videos? Would using assigns for these kinds of files be efficient, or is there a better approach for serving such content dynamically through LiveView?

Nah, the usual way to do this is to just use a normal link. This is why people use S3: you put the content in a private S3 bucket and then use regular signed links with a short expiration.

Overall I don’t think it makes sense to put the content on your servers. S3 is cheap, durable, and easy to use.
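As a rough sketch of generating such a link from Elixir, assuming the `ex_aws` / `ex_aws_s3` packages are installed and configured (the bucket name and module are made up for illustration):

```elixir
# Sketch only: assumes :ex_aws and :ex_aws_s3 are in your deps and AWS
# credentials are configured; the bucket name is illustrative.
defmodule MyApp.Media do
  @bucket "my-app-media"

  # Returns {:ok, url} where the URL stops working after `expires_in` seconds.
  def signed_url(key, expires_in \\ 300) do
    ExAws.Config.new(:s3)
    |> ExAws.S3.presigned_url(:get, @bucket, key, expires_in: expires_in)
  end
end
```

From a LiveView or controller you would then call something like `MyApp.Media.signed_url("lessons/intro.mp4")` and drop the resulting URL into an `img` or `video` tag.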


Thanks for your message :). Do you have a recommended workflow for dealing with signed links in LiveView?

It is the same whether you are using LiveView or not. You usually have records in your database that track what items exist and where on S3 they are stored. You can then use those records to also track authorization and access rules. If someone is allowed access to those records, you use a library to generate a signed link. At that point it’s just a normal link, and you put it in an img tag or whatever the appropriate HTML element is.
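To make that concrete in a LiveView, a sketch might look like the following; `MyApp.Library.get_lesson!/1`, `authorized?/2`, the `current_user` assign, and `MyApp.Media.signed_url/1` are all hypothetical stand-ins for your own context functions and auth setup, not anything prescribed by LiveView:

```elixir
defmodule MyAppWeb.LessonLive do
  use MyAppWeb, :live_view

  # Hypothetical flow: load the DB record, check access, sign a short-lived URL.
  def mount(%{"id" => id}, _session, socket) do
    lesson = MyApp.Library.get_lesson!(id)

    if authorized?(socket.assigns.current_user, lesson) do
      {:ok, url} = MyApp.Media.signed_url(lesson.s3_key)
      {:ok, assign(socket, lesson: lesson, media_url: url)}
    else
      {:ok, socket |> put_flash(:error, "Not allowed") |> redirect(to: "/")}
    end
  end

  def render(assigns) do
    ~H"""
    <h1><%= @lesson.title %></h1>
    <img src={@media_url} alt={@lesson.title} />
    """
  end

  defp authorized?(nil, _lesson), do: false
  defp authorized?(user, lesson), do: lesson.owner_id == user.id
end
```

The important part is that the assigns only ever hold the record and the signed URL, never the media bytes themselves, so the WebSocket payload stays tiny.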


If you simply want to let the user download the file(s), it is as simple as creating a Phoenix controller that serves the binary and then generating a link to it in your LiveViews. This is not only completely flexible (it’s also trivial to make it work with S3), but you get the full benefit of controlling from your application who can access that dynamic link.
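For instance, a bare-bones controller serving local files might look like this; the `current_user` assign and the `MyApp.Library` functions are assumptions about your app, not something from the thread:

```elixir
defmodule MyAppWeb.MediaController do
  use MyAppWeb, :controller

  # Check access in the controller, then stream the file from local storage.
  # Swapping send_file/3 for an S3 download or redirect later is straightforward.
  def show(conn, %{"id" => id}) do
    lesson = MyApp.Library.get_lesson!(id)

    if MyApp.Library.can_access?(conn.assigns.current_user, lesson) do
      path = Path.join(Application.app_dir(:my_app, "priv/media"), lesson.filename)
      send_file(conn, 200, path)
    else
      conn |> put_status(:forbidden) |> text("Forbidden")
    end
  end
end
```

You would route to it with something like `get "/media/:id", MediaController, :show` and render a plain link to that path from your LiveView.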

How you implement it is up to you. Implementing temporary URLs with a Phoenix controller is trivial, but you will most probably need to keep track of them in a database; alternatively, you could use a signed token in the link that encodes an expiration time plus the filename and validate it upon reception.
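A minimal sketch of the signed-token variant using `Phoenix.Token` (the salt, max age, and storage root are illustrative choices, not requirements):

```elixir
defmodule MyAppWeb.DownloadController do
  use MyAppWeb, :controller

  @salt "media-download"
  @max_age 300            # link valid for five minutes
  @root "priv/media"      # illustrative local storage root

  # Elsewhere, once the user is authorized (e.g. in a LiveView), generate the link with:
  #   token = Phoenix.Token.sign(MyAppWeb.Endpoint, "media-download", filename)
  def show(conn, %{"token" => token}) do
    case Phoenix.Token.verify(MyAppWeb.Endpoint, @salt, token, max_age: @max_age) do
      {:ok, filename} ->
        # Path.basename/1 guards against path traversal in the signed payload.
        send_download(conn, {:file, Path.join(@root, Path.basename(filename))})

      {:error, _reason} ->
        conn |> put_status(:forbidden) |> text("Link expired or invalid")
    end
  end
end
```

This avoids the database round trip for tracking temporary URLs, at the cost of the link being valid for its whole lifetime even if you revoke the user’s access in the meantime.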


Thanks for the info guys. Really appreciate it!