Just thought that some of you might be interested in a new experimental live coding system I co-developed with the University of Sheffield.
It doesn’t quite have a name yet, but we developed it in collaboration with the band The Black Dog, who have used it to release a number of tracks from their latest EP, Seclusion, as code.
The live coding system was built using Phoenix LiveView, Luerl and WebAudio. It was a huge amount of fun and it was amazing how much power the tools gave me to build it. Soooo much nicer than Qt and C++
If you want to know a little more about how it works, we gave a workshop recently and the material we used is here:
I’ve wondered about using the Web Audio API in combination with LV/Phoenix. Thanks for sharing, Sam! In the past I’ve done some work at the crossroads of (very modest) audio processing and Qt. Nowhere near as fun as Elixir development. I can imagine how much more fun this other stack is for this kind of domain.
I saw you present your work on Sonic Pi a few years ago in London. To this day it still stands as one of the most energetic and inspiring presentations I’ve seen.
I haven’t yet found any direct benefits from WebAudio + LiveView, but WebAudio + Phoenix Channels opens up a huge set of opportunities for collaborative jamming.
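To make the collaborative-jamming idea concrete: one common approach is to timestamp each note event on the server when it's broadcast over a channel, and have every client map that shared timestamp onto its own WebAudio clock so playback lines up. This is just a sketch of that pattern, not how the system above actually works; the function name, parameters, and the idea of measuring a clock offset via a channel ping are all my assumptions.

```javascript
// Sketch: map a shared server timestamp onto a local WebAudio clock so all
// clients in a jam trigger the same note at (roughly) the same moment.
// Assumes each client has already estimated clockOffsetMs, e.g. by pinging
// the server over a Phoenix Channel and halving the round trip (hypothetical).

// eventServerMs:  when the note should sound, in server wall-clock ms
// clockOffsetMs:  estimated (server clock - local clock) in ms
// localNowMs:     this client's current wall-clock time (Date.now())
// audioCtxNowSec: this client's AudioContext.currentTime, in seconds
// Returns the AudioContext time (seconds) at which to schedule the note.
function toAudioTime(eventServerMs, clockOffsetMs, localNowMs, audioCtxNowSec, safetyMs = 50) {
  const localEventMs = eventServerMs - clockOffsetMs; // event time on this machine's clock
  const deltaMs = localEventMs - localNowMs;          // how far in the future it is
  // Never schedule in the past; push late arrivals forward by a small safety margin.
  return audioCtxNowSec + Math.max(deltaMs, safetyMs) / 1000;
}
```

A client receiving a `"note"` broadcast would then do something like `osc.start(toAudioTime(msg.at, offset, Date.now(), ctx.currentTime))`. The safety margin trades a little latency for click-free scheduling when messages arrive late.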
I plan to consolidate my learning from this project and fold it back into Tau5, a ground-up rewrite of Sonic Pi, which will drop Ruby and switch to Luerl + Elixir. I also plan to switch to Phoenix/LiveView for the GUI, running simultaneously in an embedded Qt WebEngine for a standard app feel and in web browsers for a more web-centric approach.