I have a Phoenix application that I have set up to establish an RTP session with the LiveView server. The LiveView currently handles the signaling, and I am receiving the RTP packets in my LiveView.
Now I am trying to figure out the best way to process the media packets. I see two options:
- Use Membrane with a custom source element that takes the RTP/Opus packets and builds up a pipeline to do whatever processing needs to be done (a rough sketch of what I mean is below).
- Use Boombox (this would be my preference), but I don’t see how to create the input that Boombox needs. The WebRTC input takes a signaling server (which I have already taken care of), and I’m not sure how to get the RTP packets into Boombox so it can create the output. Perhaps I could use `Membrane.WebRTC.SignalingChannel` to handle the signaling and then pass that to Boombox. However, I have not figured out how the SignalingChannel gets used. Could I start it in my LiveView and pass the signaling messages from the client to the SignalingChannel? (A second sketch of what I have in mind is below.)
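For the first option, this is roughly what I picture for the custom source: a minimal, untested sketch assuming Membrane Core 1.x, where the module name `MyApp.RTP.Source` and the `{:rtp_packet, packet}` message are just placeholders for however my LiveView would hand the packets over.

```elixir
defmodule MyApp.RTP.Source do
  @moduledoc """
  Push-mode source that re-emits RTP packets sent to it as process messages.
  """
  use Membrane.Source

  def_output_pad :output,
    accepted_format: %Membrane.RemoteStream{type: :packetized},
    flow_control: :push

  @impl true
  def handle_playing(_ctx, state) do
    # Declare the stream format before any buffers are sent.
    {[stream_format: {:output, %Membrane.RemoteStream{type: :packetized}}], state}
  end

  @impl true
  def handle_info({:rtp_packet, packet}, _ctx, state) do
    # Each RTP packet forwarded from the LiveView becomes a buffer on :output.
    {[buffer: {:output, %Membrane.Buffer{payload: packet}}], state}
  end
end
```

The pipeline would then hang an RTP depayloader and Opus decoder off this source, and the LiveView would need the element's pid to `send/2` packets to it (e.g. by having the element notify its parent with `self()` when it starts).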
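And for the Boombox route, this is the kind of wiring I'm imagining in the LiveView: an untested sketch assuming `SignalingChannel.new/0`, `register_peer/2`, and `signal/2` work the way I read the docs, and that Boombox accepts `{:webrtc, signaling}` as input. The `"webrtc_signal"` event name, the JS hook it implies, and the `"out.mp4"` output are just placeholders.

```elixir
defmodule MyAppWeb.MediaLive do
  use MyAppWeb, :live_view

  alias Membrane.WebRTC.SignalingChannel

  @impl true
  def mount(_params, _session, socket) do
    # One channel shared by two peers: this LiveView (relaying to the browser)
    # and Boombox on the other side.
    signaling = SignalingChannel.new()

    # Register this process as a peer. message_format: :json_data is my reading
    # of the docs for accepting plain JSON from the browser - may need adjusting.
    SignalingChannel.register_peer(signaling, message_format: :json_data)

    # Boombox.run/1 blocks until the stream ends, so run it in its own process.
    # The file output is just an example.
    Task.start_link(fn ->
      Boombox.run(input: {:webrtc, signaling}, output: "out.mp4")
    end)

    {:ok, assign(socket, signaling: signaling)}
  end

  # Browser -> server: SDP offers / ICE candidates pushed from the JS hook.
  @impl true
  def handle_event("webrtc_signal", msg, socket) do
    SignalingChannel.signal(socket.assigns.signaling, msg)
    {:noreply, socket}
  end

  # Server -> browser: messages the channel delivers to registered peers
  # (exact tuple shape should be checked against the installed plugin version).
  @impl true
  def handle_info({SignalingChannel, _pid, msg, _metadata}, socket) do
    {:noreply, push_event(socket, "webrtc_signal", %{data: msg})}
  end
end
```

Is that the intended usage, or is the SignalingChannel supposed to live somewhere other than the LiveView process?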