Membrane - a new framework for multimedia processing

I’m confused about how an application would mix both Membrane and Janus - what would Janus do that Membrane couldn’t do alone, given that Membrane can mix audio/video and handle the signaling component as well?

Excited to see the starter kit though! I think I will wait for it to start my app :slight_smile:

Membrane is not ready yet to fully act as a WebRTC peer. We’re closer than ever but the realistic estimate is 2-3 months and I am pretty sure some unexpected bugs will show up.

We don’t want to wait that long to release such a tool.

Given that, we decided to create a tool where Janus essentially acts as a source for the Membrane pipeline. In the long run we obviously plan to upgrade it to be 100% Membrane.


Makes sense, thanks for the response!

It is awesome to hear that Membrane could at some point in the future act as a WebRTC peer :slight_smile: Is it in scope to also support WebRTC data channels, or will it be audio/video only? Just out of curiosity.

1 Like

It’s not in the roadmap yet, but it will be possible to add in the future. PRs for that are going to be very welcome once A/V is ready :slight_smile:


What would be the best way to encode a video file and stream it in real time? I am thinking of Membrane. Is WebRTC the way to go?

Made a thread about this here: How would I serve / stream a video file?


It all depends on the target platform and the latency requirements. If you don’t need very low latency, HLS might be good enough. Please note that there are many quirks related to real-time reencoding; not all formats are compatible with it, as some of them store metadata at the end of the file.
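The "metadata at the end of the file" quirk mentioned above is typified by MP4, where the `moov` box (the index a player needs before it can decode anything) is often written last. A rough sketch of how one might detect this, assuming regular fixed-size top-level boxes (the 64-bit and "size 0" box variants are deliberately not handled):

```python
import struct

def top_level_boxes(data: bytes):
    """Yield (box_type, offset) for top-level MP4 boxes."""
    pos = 0
    while pos + 8 <= len(data):
        size = struct.unpack(">I", data[pos:pos + 4])[0]
        box_type = data[pos + 4:pos + 8].decode("ascii", errors="replace")
        yield box_type, pos
        if size < 8:  # size 0 ("to end of file") / size 1 (64-bit) not handled here
            break
        pos += size

def is_streamable(data: bytes) -> bool:
    """True if 'moov' (metadata) comes before 'mdat' (media data)."""
    order = [t for t, _ in top_level_boxes(data)]
    if "moov" in order and "mdat" in order:
        return order.index("moov") < order.index("mdat")
    return False

# Synthetic layouts built from minimal boxes, just to exercise the check:
def box(t: bytes, payload: bytes = b"") -> bytes:
    return struct.pack(">I", 8 + len(payload)) + t + payload

streamable = box(b"ftyp", b"isom") + box(b"moov") + box(b"mdat", b"\x00" * 16)
non_streamable = box(b"ftyp", b"isom") + box(b"mdat", b"\x00" * 16) + box(b"moov")

print(is_streamable(streamable))      # True
print(is_streamable(non_streamable))  # False
```

Tools like FFmpeg can rewrite such files up front (its `-movflags +faststart` option moves `moov` to the beginning), but for a true live source you can't wait for the file to finish, which is what makes real-time reencoding of these formats awkward.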

Could I do HLS with Membrane or would I use something else for that?

We do have an HLS element and it’s working in one of the projects we did commercially, although it doesn’t have a public stable release yet. It will be available within a few weeks.


That’s a pleasure to know! Thank you!

WebRTC will require:

  1. signalling (easiest part)
  2. ICE
  3. DTLS (integrated with SRTP)
  4. SRTP (integrated with DTLS)
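To make the layering above concrete: signalling carries an SDP offer/answer, and it is the SDP that glues the other three together — it holds the ICE credentials and the DTLS certificate fingerprint from which the SRTP keys are later derived (DTLS-SRTP). A minimal, illustrative parser for those attributes (the SDP values below are made-up examples, not from any real session):

```python
def parse_webrtc_attrs(sdp: str) -> dict:
    """Pull out the SDP attributes that tie ICE, DTLS and SRTP together."""
    attrs = {}
    for line in sdp.splitlines():
        if line.startswith("a=ice-ufrag:"):
            attrs["ice_ufrag"] = line.split(":", 1)[1]      # ICE username fragment
        elif line.startswith("a=ice-pwd:"):
            attrs["ice_pwd"] = line.split(":", 1)[1]        # ICE password
        elif line.startswith("a=fingerprint:"):
            algo, digest = line.split(":", 1)[1].split(" ", 1)
            attrs["fingerprint"] = (algo, digest)           # verifies the DTLS cert
        elif line.startswith("a=setup:"):
            attrs["setup"] = line.split(":", 1)[1]          # who acts as DTLS client

    return attrs

offer = "\r\n".join([
    "v=0",
    "m=audio 9 UDP/TLS/RTP/SAVPF 111",
    "a=ice-ufrag:F7gI",
    "a=ice-pwd:x9cml/YzichV2+XlhiMu8g",
    "a=fingerprint:sha-256 D2:FA:0E:C3:22:59:5E:14:95:69:92:3D:13:B4:84:24",
    "a=setup:actpass",
])

print(parse_webrtc_attrs(offer)["ice_ufrag"])  # F7gI
print(parse_webrtc_attrs(offer)["setup"])      # actpass
```

This only covers the "easiest part" (signalling); ICE connectivity checks, the DTLS handshake, and SRTP key derivation are each a protocol implementation of their own, which is exactly why the list above gets harder as it goes down.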

DTLS is the most complicated part; I don’t know if the latest Erlang code can already do this. We have our own implementation of DTLS.

Our Rust code just uses the OpenSSL implementation and it works, but in Flussonic we have an implementation in Erlang. The source code for DTLS is out there somewhere on the internet; it is possible to find it.

2-3 months is very optimistic; you are really cool guys if you can do it in that time =)

If you skip adaptive bitrate control (you should postpone that) and other extended features, it is possible to finish before you get fed up with it.

Of all our protocols (RTMP, RTSP, HLS, DASH, MSS, MPEG-TS, H.323, SIP), WebRTC was the most painful to implement.

1 Like

The summary is correct.

We have run into some issues with DTLS in the latest Erlang and have to rely on some hacks and external C code to make it work. We will consider contributing the changes back to the Erlang core, but we don’t want to wait for that release and force users onto the most recent Erlang version, so for some time we will probably stay as we are.

We started a while ago, and these 2-3 months are essentially the time to assemble all these parts together, not to build everything from scratch. Keep your fingers crossed :slight_smile:


Amazing work with Membrane. I’m just about to start a project and this “starter kit” sounds like the perfect tool to learn more about both Janus and Membrane.

Have you guys released the starter kit yet?

1 Like

Hello, there!
Membrane Core 0.5.1 is out, and stable!


Tomasz Zawada


Hi, I was happy when I read this but I haven’t seen any updates on it. If I missed something, would you share a link? Also, what would I do if I’m not sure of the format of the video file? Could I choose the demuxer depending on the codecs in the file, etc.?

Hi @smolcatgirl, we’ve had a lot on our plate recently, and it’s taking longer than expected… nevertheless, we have just released the HTTP Adaptive Stream plugin that supports streaming via HLS :tada: However, note that it expects the stream to already be muxed into CMAF. The plugin doing the muxing is still being refactored; you can track the progress here.

I’d also like to note that we have just released a libopus-based Opus decoder, yet another component needed for WebRTC.

Also, what would I do if I’m not sure of the format of the video file? Could I choose the demuxer depending on the codecs in the file, etc.?

You could, but Membrane doesn’t provide a transparent way to do that; you’d need to implement the choosing part yourself or do some preprocessing with another tool, like FFmpeg.
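For illustration, the "choosing part" can often be done by sniffing the container from the file's magic bytes and mapping the result to a demuxer. This is a rough sketch under the assumption that the container (rather than the codecs inside) decides the demuxer; the demuxer names in the registry are placeholders, not real Membrane module names:

```python
def sniff_container(header: bytes) -> str:
    """Guess the container format from the first few bytes of a file."""
    if len(header) >= 12 and header[4:8] == b"ftyp":
        return "mp4"                      # ISO BMFF (MP4 / MOV): size + 'ftyp'
    if header.startswith(b"\x1a\x45\xdf\xa3"):
        return "matroska"                 # Matroska / WebM: EBML magic
    if header.startswith(b"FLV"):
        return "flv"
    if header.startswith(b"OggS"):
        return "ogg"
    if header.startswith(b"\x47"):
        return "mpegts"                   # MPEG-TS sync byte 0x47
    return "unknown"

# Placeholder demuxer registry -- map each container to whatever
# demuxer component your pipeline actually uses.
DEMUXERS = {
    "mp4": "MP4Demuxer",
    "matroska": "MatroskaDemuxer",
    "flv": "FLVDemuxer",
    "ogg": "OggDemuxer",
    "mpegts": "MpegTSDemuxer",
}

def pick_demuxer(path: str) -> str:
    with open(path, "rb") as f:
        header = f.read(12)
    return DEMUXERS.get(sniff_container(header), "unsupported")

print(sniff_container(b"\x00\x00\x00\x18ftypisom"))      # mp4
print(sniff_container(b"\x1a\x45\xdf\xa3..."))           # matroska
```

In practice probing with `ffprobe` (or FFmpeg's own probing, which Membrane elements can wrap) is more robust, since some formats need deeper inspection than a magic-byte check.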


Thanks for your reply! :3

1 Like


This may be part of the capture process. We use it to capture SDI cards.

Hi there, it’s finally time to announce that Membrane supports HLS :tada: We finished all the needed plugins and prepared a demo that receives an RTP stream and publishes it via HLS. @kanes115 described how we used it to create a streaming platform in this blog post. Enjoy :slight_smile:


We have a task to recode RTSP live streams into WebRTC live streams.
We were able to do it in Go with this solution.
Can we do it with Membrane?