Live streaming technologies with Elixir / Phoenix

I’m currently doing some research on live streaming and live video (like Twitch, YouTube Live, etc.). There are a lot of different technologies and tools out there for implementing real-time video and audio on the web and mobile.

I found WebRTC and originally thought it was a good fit. But after learning the details, I ran into a lot of complications, such as TURN servers and the overall need for many different servers and services. As far as I understand it, the sending client’s load can also grow with the number of viewers, which is a no-go for this type of application.
Looking for simpler solutions, I came across protocols like HLS and RTP, but ran into issues with latency and a lack of Elixir implementations for both of them.

Looking around, I also found a tutorial using Node.js and plain WebSockets, just sending the raw images in Base64 (or another encoding) over the socket. I have some performance concerns about this, but I have no data on how fast WebSockets, and more specifically Phoenix Channels, perform.
Since UDP seems to be the better option, I also looked into :gen_udp, but it seems like a lot of the infrastructure would have to be implemented from scratch, since there aren’t any frameworks.
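To make that WebSocket approach concrete, here is roughly what it could look like as a Phoenix Channel (a minimal sketch; the module, topic, and event names are placeholders I made up):

```elixir
defmodule StreamWeb.FrameChannel do
  use Phoenix.Channel

  # The streamer and all viewers join the same stream topic.
  def join("stream:" <> _stream_id, _params, socket) do
    {:ok, socket}
  end

  # The streamer pushes each captured frame as a Base64 string,
  # and it gets rebroadcast to every other subscriber on the topic.
  def handle_in("frame", %{"data" => base64_frame}, socket) do
    broadcast_from!(socket, "frame", %{data: base64_frame})
    {:noreply, socket}
  end
end
```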

Does anybody have experience using Elixir or Phoenix for live video streams and could give me some advice?


You may want to take a look at Mux and their competitors. They’ve solved a lot of these problems and will let you focus on the goals of your product rather than getting bogged down in the details of video streaming.

Also, be sure to do your math. Bandwidth gets very expensive very quickly with video at any level of scale; you don’t want to be hit with a giant bill at the end of the month that your customers’ payments aren’t able to cover.
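For a back-of-the-envelope example (the bitrate and viewer count are just assumptions for illustration):

```elixir
# Rough egress estimate: 1,000 concurrent viewers of a 3 Mbps stream for one hour.
bitrate_mbps = 3
viewers = 1_000
seconds = 3_600

gigabytes_out = bitrate_mbps * viewers * seconds / 8 / 1_000
# => 1350.0 — roughly 1.35 TB of outbound traffic per streamed hour
```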


Hi @Jeykey, Membrane has a pure Elixir implementation of RTP - see the demo - and we recently created a bunch of components for streaming via HLS. We’re now documenting and polishing them, so they should be available shortly :wink: We’re also preparing an article about building a Twitch-like app with Membrane; watch the membrane tag if you’re interested.

Whatever solution you end up with, definitely don’t send raw images, especially in Base64, as it will generate huge traffic and won’t scale at all (unless perhaps you have very low resolution and frame rate).
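To put rough numbers on that (assuming, for illustration, uncompressed 640x480 RGB frames at 30 fps):

```elixir
# One uncompressed 640x480 RGB frame, sent 30 times per second, Base64-encoded.
bytes_per_frame = 640 * 480 * 3   # 921,600 bytes
base64_overhead = 4 / 3           # Base64 inflates the payload by ~33%
bytes_per_second = bytes_per_frame * 30 * base64_overhead
# => ~36.9 million bytes/s (~295 Mbit/s) per viewer, before any protocol overhead
```

Compare that with a typical compressed H.264 stream, which sits at a few Mbit/s.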


From what I’ve read so far, HLS seems to have quite high latency. What’s your experience, and is it suitable for something like Twitch?

I believe Twitch itself uses either DASH or HLS, and so do YouTube and Facebook Live. That’s because of scaling - those protocols basically split the stream into small chunks and serve them via HTTP (there’s a sample playlist after the list below), so you can quite easily set up a CDN in between and handle thousands of viewers. But yes, that also introduces higher latency - usually about 10 seconds - and while I suppose you can get better, it is not going to be even close to RTP’s 200-500 milliseconds. If the lowest possible latency is required, the solution is to use RTP on both sides. I see two options there:

  1. Peer-to-peer - the streamer sends directly to each viewer. This only handles a few receivers and strains the sender, but it is relatively easy to implement with WebRTC: it requires only a signalling server, such as the Membrane WebRTC server, and JS clients.
  2. RTP between the streamer and the server, and RTP between the server and each viewer. If you want browser clients, you still need WebRTC, but this time via a WebRTC gateway on the server. This will not scale as easily as HLS/DASH.
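For context, the HLS playlist that players poll is just a text file listing those chunks, each fetched over plain HTTP (segment names and durations below are made up):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:6.0,
segment_120.ts
#EXTINF:6.0,
segment_121.ts
#EXTINF:6.0,
segment_122.ts
```

Players typically buffer a couple of segments before starting playback, which is where the multi-second latency comes from.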

In the Discord blog I read that they use WebRTC but run all their traffic through a gateway for moderation and to protect the IP addresses of the users. Why would you want to use a technology designed for P2P in a client/server architecture?

For two main reasons:

  • it’s widely supported, in particular by all the popular browsers
  • P2P and client-server scenarios don’t really differ in terms of the intermediary protocol - when you make the server a peer, client-server is still P2P from that point of view

I understand some of the advantages of WebRTC, but I’m still somewhat confused about how to use it in a centralized server architecture, like the one described by Discord.

Routing audio/video through media servers offers other advantages as well, such as moderation.

Are they describing a TURN server, or another solution?
They also talk a lot about WebSockets and using Elixir, at least for their signalling server.
What is the involvement of Elixir in their (or another WebRTC) application?
Could you also transfer video & audio through WebSockets?

Are they describing a TURN server, or another solution?

TURN stands for Traversal Using Relays around NAT, so it seems they have something beyond TURN. Nevertheless, the architecture is probably similar.

They also talk a lot about WebSockets and using Elixir, at least for their signalling server.

WebRTC doesn’t standardize technologies to be used in the signalling layer, so the use of WebSockets is absolutely valid.
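As a rough sketch of what WebSocket-based signalling can look like with a Phoenix Channel (module, topic, and event names here are assumptions, not any particular library’s API), the server only needs to relay SDP offers/answers and ICE candidates between peers:

```elixir
defmodule MyAppWeb.SignallingChannel do
  use Phoenix.Channel

  # Both peers join the same room topic.
  def join("signalling:" <> _room_id, _params, socket) do
    {:ok, socket}
  end

  # Relay SDP offers/answers and ICE candidates to the other peer(s).
  # The server never inspects the payload; it only forwards it.
  def handle_in(event, payload, socket)
      when event in ["sdp_offer", "sdp_answer", "ice_candidate"] do
    broadcast_from!(socket, event, payload)
    {:noreply, socket}
  end
end
```

Once the peers have exchanged that metadata, the media itself flows over WebRTC’s own transport (SRTP over UDP), not over the WebSocket.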

What is the involvement of Elixir in their (or another WebRTC) application?

I don’t really understand this question. It’s possible to implement the entire WebRTC stack in Elixir; we aim to do that at Membrane.

Could you also transfer video & audio through WebSockets?

Video or audio is just a stream of data, so you can transfer (send and receive) it over any transport. The problem is that live streaming is way more complex than just sending and receiving data. You can see this talk by @mspanc for an overview. Proper tooling can handle a significant part of that complexity.


@mat-hek I look forward to your article. What do you think of this blog post’s approach?


Well, that’s the article I was speaking of - it’s by my teammate and I reviewed it :stuck_out_tongue: So personally I think the approach is perfect :smiley: I posted about it in the membrane thread, but it looks like it’s not visible enough there. @AstonJ, can it be separated?


It can be moved to the Elixir Blog Posts thread here if you like. Or, if you post a new thread in the backend blog posts category on devtalk.com, you can get a thread for it there and it will automatically get cross-posted here under your username :003:

This announcement has more details: