Phoenix Sockets: what could be the best approach?

We have a Nuxt application which talks to a Phoenix API. The front page is a LiveView, which basically receives an image from the Phoenix API over sockets in the form of data:image/jpeg;base64,/9j

    async playJpegStream() {
      let data = {
        ip: "1.1.1.1",
        source: "live_view_widget"
      }

      if (this.token) {
        data.token = this.token
      }
      if (this.api_id && this.api_key) {
        data.api_key = this.api_key
        data.api_id = this.api_id
      }

      this.isPlaying = true
      // Keep a reference to the socket so it can be disconnected later
      this.socket = new Socket(process.env.SOCKET_URL, {
        params: data
      })
      this.socket.connect()
      this.channel = this.socket.channel("cameras:" + this.cameraExid, {})
      this.channel.join()
      this.channel.on("snapshot-taken", data => {
        // Each frame arrives as a base64-encoded JPEG in the payload
        this.url = "data:image/jpeg;base64," + data.image
        this.currentSnapshotDate = data.iso_timestamp
        this.isLoading = false
      })
    },
    stopJpegStream() {
      if (this.channel) {
        this.channel.leave()
      }
      // Also close the underlying socket, otherwise it stays connected
      if (this.socket) {
        this.socket.disconnect()
      }
    }

This is how we are getting image data over Phoenix sockets.

My question is: is this the right approach for getting an image from a Phoenix API and updating it client-side in a JS application?

When this LiveView plays, I can see in the browser's network tab that the transferred-resources counter keeps growing, since each image is between 1 MB and 2 MB.

This is how we broadcast from the Phoenix API:

  def broadcast_snapshot(camera_exid, image, timestamp) do
    EvercamMediaWeb.Endpoint.broadcast(
      "cameras:#{camera_exid}",
      "snapshot-taken",
      %{
        image: Base.encode64(image),
        timestamp: timestamp,
        iso_timestamp: convert_unix_to_iso(timestamp, "UTC")
      }
    )
  end

I’d create a controller which can serve the file, and only send the URL via websockets. This will allow you to e.g. put a read-through cache in front, or let the browser cache the resource if it’s shown multiple times.
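On the client this would be a small change — a minimal sketch, assuming the broadcast payload is changed to carry a hypothetical `url` field instead of the base64 image:

```javascript
// Hypothetical client-side handler, assuming the server broadcasts
// %{url: "...", iso_timestamp: "..."} instead of base64 image data.
// The browser then fetches the image itself, so ordinary HTTP caching
// (ETag / Cache-Control set by the controller) applies.
function buildSnapshotHandler(view) {
  return payload => {
    // Pointing the <img> at a URL lets the browser reuse cached copies
    view.url = payload.url
    view.currentSnapshotDate = payload.iso_timestamp
    view.isLoading = false
  }
}

// Usage inside the component:
//   this.channel.on("snapshot-taken", buildSnapshotHandler(this))
```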


We don’t have any URLs for the images; they come straight from the cameras in binary form.

Does streaming help here?

How? Can you explain?

I don’t know much about it myself. Please do check https://www.poeticoding.com/elixir-streams-to-process-large-http-responses-on-the-fly/

Actually, this is totally irrelevant to what I have asked 🙂

Previously I captured data from a webcam and sent it through a Phoenix socket. Back then there were fewer features available, and I resorted to sending the images embedded in JSON as base64-encoded blobs.
It ‘works’, but it did mean that the byte size of the messages being sent was roughly ~133% of the original image size.
Instead, I’d recommend having a separate socket connection just for the images, and encoding them not as JSON but with a different implementation of the Phoenix.Transport.Serializer behaviour. There is an article from a couple of years ago that talks about using Msgpack instead, which might be relevant: https://nerds.stoiximan.gr/2016/11/23/binary-data-over-phoenix-sockets/
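The ~133% figure follows directly from how base64 works: every 3 input bytes become 4 output characters. A quick sketch:

```javascript
// Base64 expands every 3 bytes of input into 4 ASCII characters,
// so the encoded size is ceil(n / 3) * 4 bytes -- roughly 4/3 (~133%)
// of the original, before any JSON framing is added on top.
function base64EncodedSize(rawBytes) {
  return Math.ceil(rawBytes / 3) * 4
}

// A 1.5 MB JPEG frame becomes about 2 MB on the wire:
const raw = 1_500_000
const encoded = base64EncodedSize(raw)
console.log(encoded)                    // 2000000
console.log((encoded / raw).toFixed(2)) // "1.33"
```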


So your approach could definitely work and is perfectly fine at small scale, but you’re essentially serving video, which could also be done by serving an actual video stream. The advantage of a real video stream is that the 1–2 MB per JPEG (which at 10–15 fps of realtime video adds up to roughly 10–30 MB per second) shrinks dramatically, because you only send the changes between frames. (There was a cool talk about how to achieve this, which you can find here: https://www.youtube.com/watch?v=eNe5dmRP9Cc ). The TL;DR is to use ffmpeg, which will accept a stream of bytes and convert it to a video stream in realtime, which you could then play with something like ffplay. A nice headache to actually get working, I guess.

The other streaming/chunked-HTTP-request suggestion above is also not far off the mark, and maybe even preferable because it’s a proven code path and a lot more lightweight. See this plug, which you should be able to use: https://github.com/elixir-vision/picam/blob/master/examples/picam_http/lib/picam_http/streamer.ex
and a thread about it on this forum: Nerves device as an IP Camera?
The idea here is that there is a special content type for this kind of ‘animated JPEG’ that the browser understands: you just create an img and point it at a URL that serves a stream of JPEGs via ‘multipart/x-mixed-replace’.
The browser will keep the connection open and wait for new image frames (in the form of JPEG images) that will be served by your plug/server.
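To make the wire format concrete, here is a minimal sketch of the framing such an endpoint produces; the boundary name `frame` is an arbitrary choice, and the linked picam example does the Elixir equivalent:

```javascript
// How multipart/x-mixed-replace framing looks on the wire. The boundary
// name ("frame") just has to match the one declared in the response's
// top-level header:
//   Content-Type: multipart/x-mixed-replace; boundary=frame
function mjpegFrame(jpegBuffer, boundary = "frame") {
  const header =
    `--${boundary}\r\n` +
    `Content-Type: image/jpeg\r\n` +
    `Content-Length: ${jpegBuffer.length}\r\n\r\n`
  // Each frame is its own mini HTTP-like part; the browser replaces the
  // previously displayed image every time a new part arrives.
  return Buffer.concat([Buffer.from(header), jpegBuffer, Buffer.from("\r\n")])
}

// A server keeps the connection open and writes one frame per snapshot:
//   res.writeHead(200, {
//     "Content-Type": "multipart/x-mixed-replace; boundary=frame"
//   })
//   res.write(mjpegFrame(latestJpeg))
```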

Keep in mind that if your existing code works for you, it’s fine! But this last method should be pretty easy to implement. Good luck!
