Is it possible to send audio streams between room subscribers using Phoenix channels and web sockets?

Any ideas? Thank you.


Probably not as a binary data stream, as channels use JSON by default.

Though you should be able to base-encode (e.g. Base64) the audio chunks and send them to your client.

There you have to decode them and handle the binary chunks as you would “natively”.
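A minimal sketch of that encode-then-push approach on the client, assuming an already-joined phoenix.js channel and a hypothetical `"audio_chunk"` event name:

```javascript
// Convert a Uint8Array of raw/encoded audio bytes to a Base64 string so it
// survives the JSON serialization used by the channel.
function encodeChunk(bytes) {
  let binary = "";
  for (const b of bytes) binary += String.fromCharCode(b);
  return btoa(binary);
}

// Convert the Base64 payload back into a Uint8Array of audio bytes.
function decodeChunk(b64) {
  const binary = atob(b64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i);
  return bytes;
}

// Sending side (event name is an assumption, not a Phoenix API):
// channel.push("audio_chunk", { data: encodeChunk(chunk) });

// Receiving side:
// channel.on("audio_chunk", ({ data }) => {
//   const bytes = decodeChunk(data);
//   // ...hand the bytes to your audio pipeline
// });
```

Note that Base64 inflates the payload by roughly a third, which is part of why WebSocket-plus-JSON is a poor fit for continuous audio.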


You can swap out the transport protocol as well as the serializer for messages, so it should be possible to replace the parts of the default implementation that are not optimal for your use case.
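On the client, the phoenix.js `Socket` constructor accepts custom `encode`/`decode` options, mirroring the server-side serializer swap. A sketch, assuming the flat-array wire shape Phoenix moved to (the functions here just wrap JSON, but you could plug in any binary format instead):

```javascript
// Custom client-side serializer pair. The message shape
// { join_ref, ref, topic, event, payload } matches what phoenix.js
// hands to `encode` and expects back from `decode`.
function encode(msg, callback) {
  // Serialize as a flat array for space efficiency.
  const packed = [msg.join_ref, msg.ref, msg.topic, msg.event, msg.payload];
  return callback(JSON.stringify(packed));
}

function decode(rawPayload, callback) {
  const [join_ref, ref, topic, event, payload] = JSON.parse(rawPayload);
  return callback({ join_ref, ref, topic, event, payload });
}

// Usage sketch (path is an assumption):
// const socket = new Socket("/socket", { encode, decode });
```

Whatever format you pick here has to match the serializer configured on the Phoenix endpoint, otherwise neither side can parse the other's frames.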


Are you implementing an audio chat, or broadcasting some audio to all participants?

In both cases, WebSocket is likely not the best choice. If you want a one-to-one or few-to-few audio connection, look into WebRTC, a web standard for peer-to-peer audio/video/data communication implemented by all major browsers. You can use Phoenix channels for signaling, and then establish the peer-to-peer audio connection directly between browsers.
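A rough sketch of the "channels for signaling" idea: the channel only carries SDP offers and ICE candidates, while the audio itself flows peer-to-peer. The `"signal"` event name and payload shape are assumptions; `RTCPeerConnection` is the standard browser API.

```javascript
// Start an outgoing call: attach our audio, relay ICE candidates over the
// Phoenix channel, and push the SDP offer for the remote peer.
async function startCall(pc, channel, stream) {
  // Send our audio track(s) to the peer connection.
  for (const track of stream.getTracks()) pc.addTrack(track, stream);

  // Relay ICE candidates to the other peer through the channel.
  pc.onicecandidate = (e) => {
    if (e.candidate) channel.push("signal", { ice: e.candidate });
  };

  // Create and send the SDP offer.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  channel.push("signal", { sdp: pc.localDescription });
}

// The remote side would listen with channel.on("signal", ...) and feed the
// sdp/ice payloads into its own RTCPeerConnection, answering in kind.
```

Once the connection is established, the channel is idle; the browsers exchange audio directly (or via a TURN relay when NATs get in the way).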

Here is a talk on the subject (shameless plug, it's me at OpenFest 2019 last year :slight_smile: ):


I knew that you can swap the transport, but that the serializer can be swapped as well is new to me. Can you point me to some documentation?

See the :serializer key.

Phoenix itself uses two serializers, since it switched from JSON objects to a JSON array for space efficiency in 1.4.


I have managed to send audio data between connected clients over channels after encoding the byte[] into Base64. However, I cannot play the decoded sound in the browser for some reason.
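One common cause: handing the Base64 string (or a plain byte string) to the audio APIs instead of an `ArrayBuffer`. A hedged sketch of the playback side, assuming each received payload is a complete encoded file or segment (e.g. a WAV chunk), since `decodeAudioData` cannot handle arbitrary partial frames:

```javascript
// Turn the Base64 payload back into the ArrayBuffer the Web Audio API expects.
function base64ToArrayBuffer(b64) {
  const binary = atob(b64);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i);
  return bytes.buffer;
}

// Decode and play one chunk via the Web Audio API (browser-only).
async function playChunk(b64) {
  const ctx = new AudioContext();
  // decodeAudioData needs a real ArrayBuffer, not the Base64 string itself.
  const audioBuffer = await ctx.decodeAudioData(base64ToArrayBuffer(b64));
  const source = ctx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(ctx.destination);
  source.start();
}
```

Also note that most browsers require a user gesture before an `AudioContext` may start, so a click handler may be needed before playback works.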

Maybe this will help you


Were you able to get this to work?

Do you have an example repo? I already have the Janus gateway server up in Docker, but I don't actually know how to experiment with Phoenix, since I'm new to Elixir.

I was certainly able to transfer the byte arrays [Java] (I believe after encoding them into Base64), but in the meantime I achieved the result without Phoenix channels. I believe it is theoretically possible, though.