Send a blob from JS to LiveView handle_event

Hello everyone, I am having some trouble sending an audio-recording blob to be uploaded through handle_event, and I have no idea how to solve it. I get the audio blob in the hook and I can push it to the handle_event, but after that, how can I store it? Is there some way to handle it like a regular upload entry file? Thank you so much in advance.

Have you looked into LiveView file uploads? Uploads — Phoenix LiveView v0.15.7

@benwilson512 Yes, but the problem is that I can't use the live_file_input helper. The basic idea is to make an audio recording through JS and send it to a LiveView handle_event to be stored. Right now I use hooks to do it, but I can't find a way to send the created blob and store it.

Today you could join a separate channel on the Phoenix socket and send the binary data that way.

The LiveView handle_event callbacks will never receive binary data directly (see type of unsigned_params could be binary by Ljzn · Pull Request #1390 · phoenixframework/phoenix_live_view · GitHub), but my intention has always been that live uploads could eventually support this use case.

Technically we should be able to upload any File/Blob/ArrayBuffer, but there are at least 2 limitations:

  1. We are lacking a client-side API to enqueue/manage upload entries
  2. We will need to decide whether or not we will support uploads of indeterminate length (i.e. streaming)
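
To make the separate-channel approach concrete, here is a minimal server-side sketch. It assumes a topic like "audio:*" and an app named MyApp (all names are illustrative, not from this thread). phoenix.js can push an ArrayBuffer, and on the server the payload arrives as {:binary, data}:

    defmodule MyAppWeb.AudioChannel do
      use Phoenix.Channel

      def join("audio:" <> _id, _params, socket) do
        {:ok, socket}
      end

      # Binary payloads pushed from the JS client arrive as {:binary, data}.
      # Here each chunk is simply appended to a temp file; a real app would
      # validate size/type and move the file somewhere permanent when done.
      def handle_in("chunk", {:binary, data}, socket) do
        path = Path.join(System.tmp_dir!(), "recording.webm")
        File.write!(path, data, [:append])
        {:reply, :ok, socket}
      end
    end

The channel also has to be declared in the user socket (channel "audio:*", MyAppWeb.AudioChannel), and the hook would call channel.push("chunk", arrayBuffer) for each piece of the recording.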

I’m also trying to figure this out and had a few alternative ideas. @erikhu, I think we have the exact same issue - how to capture via the MediaRecorder API and upload it to the server. In my case, I always want to allow file uploads for pre-recorded audio.

Disclaimer: I am not an LV or JS expert. However, here are my ideas:

  • :x: Append a blob as a file to an actual <input type="file">, then let LV handle the rest. Sadly, this is not possible with browser APIs.
  • :x: Could I append my blob to the submit data that LiveView passes through its websocket connection? Reading through phoenix_live_view.js, I couldn’t find a feasible way to do this.
  • :x: Since I plan on uploading to s3 anyway, perhaps I could inject an entry via the uploader…? Yet, this won’t get called unless I somehow mock an entry with some usable reference I could pass to the uploader.
  • :man_shrugging: My current plan is to basically roll my own, with two inputs - a file input handled by LiveView and one that I will handle myself. Basically, I’ll attach my blob to a custom XMLHttpRequest and on success pass the S3 URL to my text input field. There is a good example in the Phoenix docs at the bottom of the page. Then, in my component, grab upload file URLs from the live_file_input entries or from my custom text field. Perhaps my changeset can validate missing attachments. I’d also need to sign my upload requests to S3 somehow (see the sketch right after this list).
  • :-1: For those who can live with less polish, one could also add capture: true to the live_file_input helper - it seems that with this, folks on mobile can both record and upload their recordings. Then you could ask users on desktop to use mobile *cough*. This is probably not a workable solution for most, but might be fine for an internal app or MVP.
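
For the S3-signing step mentioned in the second-to-last idea above, here is a hedged sketch using ExAws. It assumes the ex_aws and ex_aws_s3 packages are installed and configured with credentials; the module, bucket, and key naming are invented:

    # Generates a presigned PUT URL that the browser-side XMLHttpRequest can upload to.
    defmodule MyAppWeb.UploadSigner do
      @bucket "my-audio-bucket"

      def presigned_put_url(filename) do
        key = "uploads/#{System.system_time(:second)}-#{filename}"

        {:ok, url} =
          :s3
          |> ExAws.Config.new()
          |> ExAws.S3.presigned_url(:put, @bucket, key, expires_in: 600)

        %{url: url, key: key}
      end
    end

The map with url and key could then be sent to the hook with push_event, and the hook PUTs the blob straight to that URL.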

I’d strongly welcome other approaches.

P.S. @mcrumm, if you guys do add support, I say skip the indeterminate length. Just my two cents, but I imagine it reduces complexity and could potentially be added later. I also saw this DockYard blog post to help those needing streaming capabilities.


Once you have the file, you can move it to a place where Plug.Static can serve it, or use libraries like Waffle and waffle_ecto.

The first solution should be simple, but you need to know what path to use… and set up Plug.Static in your endpoint.

The path should be relative to the priv dir of the application. You can get it with…

:code.priv_dir(:your_app)
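
As a rough sketch of that first option, assuming the audio binary is already in memory (audio_binary) and an app named :my_app (the uploads directory and the Plug.Static change below are assumptions, not something prescribed in this thread):

    # Write the received binary under priv/static/uploads so Plug.Static can serve it.
    priv_dir = :code.priv_dir(:my_app) |> to_string()
    uploads_dir = Path.join([priv_dir, "static", "uploads"])
    File.mkdir_p!(uploads_dir)
    File.write!(Path.join(uploads_dir, "recording.webm"), audio_binary)

    # In the endpoint, "uploads" has to be allowed by Plug.Static, e.g.:
    #
    #   plug Plug.Static,
    #     at: "/",
    #     from: :my_app,
    #     only: ~w(assets fonts images uploads favicon.ico robots.txt)

The file is then reachable at /uploads/recording.webm.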

The second solution is suitable when you want to apply transformations (like image resizing, etc.) or use the upload as a simple field in a schema.
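
A sketch of the Waffle route, assuming the waffle and waffle_ecto packages (module, table, and field names here are invented):

    # Uploader definition (Waffle).
    defmodule MyApp.AudioUploader do
      use Waffle.Definition
      use Waffle.Ecto.Definition

      def storage_dir(_version, {_file, _scope}), do: "uploads/audio"
    end

    # With waffle_ecto, the upload becomes a simple field in the schema.
    defmodule MyApp.Recording do
      use Ecto.Schema
      use Waffle.Ecto.Schema

      schema "recordings" do
        field :audio, MyApp.AudioUploader.Type
        timestamps()
      end

      def changeset(recording, attrs) do
        recording
        |> Ecto.Changeset.cast(attrs, [])
        |> cast_attachments(attrs, [:audio])
      end
    end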


As Michael said, you’ll need to use a regular channel on the side, which will allow you to push up binary blobs.


I’m trying to find where this is documented. Do you mean the liveSocket can serve regular channels, or would this be a separate socket?

You can base64-encode the blob into a string and send it with pushEvent. Obviously this is not the best use of bandwidth, but for non-performance-critical stuff it should be the simplest way. With a separate channel you need to deal with another server-side process and do another auth.

If the blob is more than a few dozen KB, you may want to chunk it, sending one chunk at a time and waiting for an ack from the server side, to avoid hogging the LiveView channel.
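
A minimal sketch of the receiving side for that base64 approach (these clauses live inside the LiveView module; event and param names are made up, and the hook would pushEvent one base64-encoded chunk at a time):

    # Decode each base64 chunk and append it to a temp file.
    # Returning {:reply, ...} gives the hook an ack via its pushEvent callback,
    # so it can wait before sending the next chunk, as suggested above.
    def handle_event("audio_chunk", %{"chunk" => encoded}, socket) do
      path =
        socket.assigns[:audio_path] ||
          Path.join(System.tmp_dir!(), "recording-#{socket.id}.webm")

      File.write!(path, Base.decode64!(encoded), [:append])
      {:reply, %{ack: true}, assign(socket, :audio_path, path)}
    end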


Thank you so much, I did it. I created a channel just for uploading audio files, in a similar way to how LiveView does it, joining a dedicated channel for each audio file. I think that is the easiest way to resolve this issue. When the audio has been uploaded I send a notification to the LiveView and update everything related to it. Anyway, I agree with you and hope that someday this case will be handled by LiveView itself.
@rio517 Thank you so much for your ideas. I evaluated them and I think they would be harder to implement than just creating a channel and uploading through it. I am not an LV or JS expert, so I am not sure whether that solution was the best.
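
For anyone following the same route, the notification step can be a plain PubSub broadcast from the upload channel to the LiveView. A hedged sketch (topic and message shapes are invented; the LiveView clauses live in the module that displays the recording):

    # In the upload channel, once the file has been fully written:
    Phoenix.PubSub.broadcast(MyApp.PubSub, "audio:#{audio_id}", {:audio_uploaded, audio_id, path})

    # In the LiveView:
    def mount(%{"id" => audio_id}, _session, socket) do
      if connected?(socket) do
        Phoenix.PubSub.subscribe(MyApp.PubSub, "audio:#{audio_id}")
      end

      {:ok, assign(socket, audio_id: audio_id, audio_path: nil)}
    end

    def handle_info({:audio_uploaded, _id, path}, socket) do
      {:noreply, assign(socket, :audio_path, path)}
    end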