Is it possible to send audio streams between room subscribers using Phoenix channels and WebSockets? Any ideas? Thank you.
Probably not as a raw binary stream, since channels serialize messages as JSON by default.
You should, however, be able to Base64-encode (or similar) the audio chunks and send them to your clients.
There you have to decode them and handle the binary chunks the way you would handle them “natively”.
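As a sketch of that encode/decode round trip (Python here just to illustrate the idea; the `"audio"` payload key is made up, not part of the Phoenix API):

```python
import base64
import json

def encode_chunk(chunk: bytes) -> str:
    # Wrap a binary audio chunk in a JSON-safe channel payload.
    # "audio" is a hypothetical payload field name.
    return json.dumps({"audio": base64.b64encode(chunk).decode("ascii")})

def decode_chunk(payload: str) -> bytes:
    # Reverse step on the receiving client: JSON -> Base64 -> raw bytes.
    return base64.b64decode(json.loads(payload)["audio"])

chunk = bytes(range(16))  # stand-in for a slice of audio data
assert decode_chunk(encode_chunk(chunk)) == chunk
```

The cost of this approach is the usual Base64 overhead (roughly a third more bytes on the wire) plus the encode/decode work on every chunk.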
You can swap out both the transport protocol and the message serializer, so it should be possible to replace the parts that are not optimal in the default implementation.
Are you implementing an audio chat, or broadcasting some audio to all participants?
In both cases, WebSocket is likely not the best choice. If you want a one-to-one or few-to-few audio connection, look into WebRTC, a web standard for peer to peer audio/video/data communication implemented by all major browsers. You can use Phoenix channels for signaling, and then establish the peer to peer audio connection between browsers.
Here is a talk on the subject (shameless plug: it's me at OpenFest 2019): https://www.youtube.com/watch?v=aNdaqo2nlRc
I knew that you can swap the transport, but that we can swap the serializer as well is new to me. Could you point me to some documentation?
Phoenix itself uses two serializers, since it switched from JSON objects to JSON arrays for space efficiency in 1.4.
I have managed to send audio data between connected clients with channels after encoding the bytes into Base64. However, I cannot play the decoded sound in the browser for some reason.
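For what it's worth, Base64 itself is lossless, so if the decoded bytes won't play, the problem is usually the audio format or container rather than the transport. A quick sanity check (a sketch, not your actual code) is to simulate the chunked send and confirm the reassembled bytes are identical to the original:

```python
import base64

def roundtrip(data: bytes, chunk_size: int = 1024) -> bytes:
    # Split audio bytes into chunks, Base64-encode each (as if sent
    # over a channel), then decode and reassemble on the "client" side.
    encoded = [base64.b64encode(data[i:i + chunk_size])
               for i in range(0, len(data), chunk_size)]
    return b"".join(base64.b64decode(c) for c in encoded)

data = bytes(range(256)) * 10  # stand-in for audio file contents
assert roundtrip(data) == data  # transport is byte-identical
```

If that check passes for your real data, look at how the browser consumes the bytes: `AudioContext.decodeAudioData` generally expects a complete, self-contained audio file (headers included), not an arbitrary mid-stream slice, which is a common reason individually decoded chunks fail to play.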
Maybe this will help you https://dockyard.com/blog/2019/09/06/use-phoenix-to-integrate-janus-webrtc-in-web-apps