I’m interested in building an interactive, text-based game using Phoenix. That is, I’d like ordinary web users to be able to go to the site using a normal browser and interact with it through a text-based interface. Any suggestions about example code for handling TTY interaction over the web?
An in-browser TTY session can be emulated over a WebSocket connection; after all, it is not that different from a chat box.
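As a rough sketch of that approach (the module names and the `MyGame.Session.handle_line/1` game API here are hypothetical, not from any existing project), a Phoenix channel could receive each line the user types over the WebSocket, feed it to the game logic, and push the game's output back to the browser:

```elixir
# Hypothetical channel: the browser joins "tty:<session_id>" over a
# WebSocket and exchanges plain text lines with the server, chatbox-style.
defmodule MyGameWeb.TTYChannel do
  use Phoenix.Channel

  def join("tty:" <> _session_id, _params, socket) do
    {:ok, socket}
  end

  # Each typed line arrives as an "input" event; run it through the
  # game and push the resulting text back as an "output" event.
  def handle_in("input", %{"line" => line}, socket) do
    output = MyGame.Session.handle_line(line)  # assumed game API
    push(socket, "output", %{text: output})
    {:noreply, socket}
  end
end
```

On the client side, the bundled phoenix.js library would join the topic, call `channel.push("input", {line: ...})` on Enter, and append each `"output"` payload to a scrolling `<pre>` element.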
I started writing a Multi User Dungeon in Elixir quite a while back (and never finished it) called Alchemud. That version of Alchemud connects through a real Telnet connection and does not expose any web interface. Looking back, there is also room for a lot of improvement in the rest of the system, but maybe it can give you some inspiration.
About four years have passed since I posted this question, so I’d like to ask it again. Aside from the “interactive, text-based game” I was talking about then, I’m interested in using Phoenix + LiveView + ??? to create blind-accessible terminal sessions over the web.
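A LiveView take on the same idea (again only a sketch; `MyGame.Session.handle_line/1` is an assumed game API, and the `~H`/`:for` syntax requires a recent LiveView release) could capture each submitted line server-side and append the output to the page with no custom JavaScript:

```elixir
defmodule MyGameWeb.TerminalLive do
  use Phoenix.LiveView

  def mount(_params, _session, socket) do
    {:ok, assign(socket, lines: [])}
  end

  # The form submit sends the current input line to the (assumed)
  # game backend; both the echoed input and the output are appended.
  def handle_event("submit", %{"line" => line}, socket) do
    output = MyGame.Session.handle_line(line)  # hypothetical
    {:noreply, assign(socket, lines: socket.assigns.lines ++ [line, output])}
  end

  def render(assigns) do
    ~H"""
    <div role="log" aria-live="polite">
      <p :for={line <- @lines}><%= line %></p>
    </div>
    <form phx-submit="submit">
      <input type="text" name="line" autocomplete="off" />
    </form>
    """
  end
end
```

The `role="log"` and `aria-live="polite"` attributes matter for the accessibility angle: screen readers announce new children of a live region as they arrive, without moving the user's focus out of the input field.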
By way of explanation, screen readers are able to navigate tables and use assorted HTML tags that add structure to web pages. I’m curious whether this can be used to make interactive terminal apps and tools such as Nushell more blind-friendly.
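For example (a sketch of the markup idea, not a tested screen-reader setup), a web terminal could emit structured output such as Nushell's tables as real HTML tables inside an ARIA live region, instead of whitespace-aligned monospace text:

```html
<!-- New output is announced as it arrives, without stealing focus -->
<div role="log" aria-live="polite">
  <p>ls completed: 1 entry</p>
  <!-- Tabular command output as a real table, navigable cell by cell -->
  <table>
    <caption>ls output</caption>
    <thead><tr><th>name</th><th>size</th></tr></thead>
    <tbody><tr><td>README.md</td><td>1.2 KB</td></tr></tbody>
  </table>
</div>
```

A screen reader can move through this table row by row and column by column, which is exactly the structure that plain terminal output throws away.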