Connecting via WebRTC to PipeWire

Hi, I’ve been looking at Membrane for a few days now and it seems pretty great. I don’t have any prior experience in this area, so the guides have been very helpful.

I want to relay some WebRTC audio (in both directions) from a browser into a server (e.g. a Membrane server) and then into PipeWire nodes running on the same server. My default approach is to use Membrane for the signalling, and then a Membrane pipeline to turn the packets into raw audio frames, which I will then push into PipeWire through either a couple of PipeWire ‘Unix FIFO’ modules or a ‘Simple Protocol’ module. These modules are then linked to the ‘real’ PipeWire nodes. Does this sound like a sane approach?
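For context, the FIFO leg of this idea is just "write raw PCM into a named pipe that PipeWire reads from the other end". Here is a minimal self-contained sketch of that mechanism, assuming s16le 48 kHz stereo frames; the frame size, node names, and the reader thread standing in for PipeWire’s pipe module are all my assumptions, not anything Membrane- or PipeWire-specific:

```python
import os
import tempfile
import threading

# Hypothetical stand-in for the pipeline's output: 10 ms of s16le
# stereo silence at 48 kHz (2 bytes/sample * 2 channels * 480 samples).
FRAME = b"\x00\x00" * 2 * 480

fifo_path = os.path.join(tempfile.mkdtemp(), "audio.fifo")
os.mkfifo(fifo_path)

received = []

def consumer():
    # In the real setup, this side would be PipeWire's FIFO ('pipe') module.
    with open(fifo_path, "rb") as f:
        while chunk := f.read(len(FRAME)):
            received.append(chunk)

t = threading.Thread(target=consumer)
t.start()

# The "Membrane sink" side: push raw audio frames into the FIFO.
with open(fifo_path, "wb") as f:
    for _ in range(10):  # 100 ms of audio
        f.write(FRAME)

t.join()
print(sum(len(c) for c in received))  # prints 19200 (bytes relayed)
```

The point is just that both sides must agree on the sample format and rate out of band; the FIFO itself carries no metadata.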

Follow up question:

My other thought, when I looked at the available PipeWire modules, was to use the PipeWire RTP modules and just advertise their endpoints during signalling (in Membrane). But I don’t know if this is feasible, as even ex_webrtc seems to want to own the RTP endpoints after setup?

Thx!

Hi @ausimian, the approach sounds valid, but it involves sending raw audio between processes. I don’t think it will ever be a bottleneck, but you may consider reducing the overhead by sending it via RTP, since PipeWire accepts it (WebRTC → Membrane → RTP → PipeWire). BTW, I think you could replace Membrane with Boombox here, both in your current solution and in the RTP approach.
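For the RTP variant, the receiving side would be something like the config fragment below. The module name `libpipewire-module-rtp-source` is real, but the exact argument names, the port, and the node name here are assumptions from memory of the PipeWire module docs, so check them against your PipeWire version:

```
# Sketch for pipewire.conf (or a drop-in under pipewire.conf.d/):
context.modules = [
    { name = libpipewire-module-rtp-source
      args = {
          # Assumed: where Membrane would send its RTP audio.
          source.ip   = "127.0.0.1"
          source.port = 46000
          sess.latency.msec = 100
          stream.props = {
              node.name = "membrane-rtp-in"   # hypothetical node name
          }
      }
    }
]
```

You would then link `membrane-rtp-in` to the real nodes exactly as in the FIFO approach.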

I don’t think it’s possible to make WebRTC’s RTP go to PipeWire directly, mainly because of how WebRTC handles sockets and encryption (the RTP is multiplexed with DTLS on ICE-managed sockets and encrypted as SRTP, so PipeWire couldn’t consume it as-is). And even after magically solving that, there would be a thousand more quirks and corner cases :wink:


Yeah, thanks for the reply. I think I’ll have to look at the SAP module for PipeWire to get the optimal solution working.
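For anyone following along, loading the SAP module looks roughly like this. The module name `libpipewire-module-rtp-sap` is real; the address and port below are the standard SAP defaults as I recall them from the PipeWire docs, but treat the details as assumptions to verify:

```
# Sketch for pipewire.conf: let PipeWire announce its RTP sinks and
# discover incoming RTP sessions via SAP, instead of configuring
# endpoints by hand.
context.modules = [
    { name = libpipewire-module-rtp-sap
      args = {
          sap.ip   = "224.0.0.56"   # assumed default SAP multicast address
          sap.port = 9875           # standard SAP port
      }
    }
]
```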

Thx for the feedback.
