I’m currently doing some research on live streaming and live video (like Twitch, YouTube Live, etc.). There are a lot of different technologies and tools out there for implementing real-time video and audio on the web and mobile.
I found WebRTC and originally thought it was a good fit. But after learning the details, I found a lot of complications, such as needing TURN servers and generally a lot of different servers and services. As far as I understand it, in a peer-to-peer setup the broadcasting client’s load grows with the number of viewers, which is a no-go for this type of application.
Looking for simpler solutions, I found protocols like HLS and RTP, but I ran into issues with latency and a lack of Elixir implementations for both of them.
Looking around, I also found a tutorial using NodeJS and plain WebSockets, just sending the raw images as Base64 (or another encoding) over the socket. I have some performance concerns about this, but I have no data on how fast WebSockets, and more specifically Phoenix Channels, actually perform.
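For reference, relaying frames over a Phoenix Channel might look roughly like the sketch below. All names here (`StreamWeb.VideoChannel`, the `"stream:"` topic, the `"frame"` event) are made up for illustration, and it assumes a recent Phoenix version with binary channel payload support, which avoids the roughly 33% size overhead of Base64:

```elixir
# Hypothetical sketch, not from any real project: a channel where one
# client pushes binary video frames and they are relayed to everyone
# else subscribed to the same topic.
defmodule StreamWeb.VideoChannel do
  use Phoenix.Channel

  def join("stream:" <> _stream_id, _params, socket) do
    {:ok, socket}
  end

  # The broadcaster sends a "frame" event with a binary payload;
  # forward it to all other subscribers without echoing it back.
  def handle_in("frame", {:binary, frame}, socket) do
    broadcast_from!(socket, "frame", {:binary, frame})
    {:noreply, socket}
  end
end
```

Even if the transport turns out to be fast enough, a relay like this does no transcoding or buffering, so it only really covers the fan-out part of the problem.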
Since I saw that UDP seems to be the better option, I looked into :gen_udp, but it seems like a lot of the infrastructure would have to be implemented by hand, since there aren’t any frameworks.
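To make that concrete, :gen_udp itself only gives you raw datagrams; everything above that (packetization, ordering, buffering) is on you. A minimal send/receive sketch, with an arbitrary localhost port chosen just for illustration:

```elixir
# Minimal :gen_udp round trip: one socket plays "viewer", one plays
# "broadcaster", and a fake 3-byte frame is sent between them.
{:ok, receiver} = :gen_udp.open(40_001, [:binary, active: false])
{:ok, sender} = :gen_udp.open(0, [:binary])

:ok = :gen_udp.send(sender, {127, 0, 0, 1}, 40_001, <<1, 2, 3>>)

# Block until the datagram arrives (5-second timeout).
{:ok, {_addr, _port, payload}} = :gen_udp.recv(receiver, 0, 5_000)
IO.inspect(payload)

:gen_udp.close(sender)
:gen_udp.close(receiver)
```

This works for a single packet on localhost, but it shows the gap: there is no framing, loss handling, or jitter buffering, which is exactly the infrastructure I would rather get from a framework.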
Does anybody with experience using Elixir or Phoenix for live video streaming have any advice?