I’d like some ideas on simulating a tactile I/O device (Graphiti), using Phoenix and friends. The purpose of the simulator is to allow software developers to experiment with controlling a Graphiti when they don’t have an actual device on hand.
The Graphiti has a 60×40 array of independently refreshable pins, a variety of pushbuttons, etc. My notion is that all of this could be emulated using a Phoenix-based web page. For example, the pin array could be represented by an HTML table with clickable cells.
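To make the "HTML table with clickable cells" idea concrete, here is a minimal LiveView sketch of the pin grid. All module names are made up, and it assumes a recent Phoenix LiveView (0.18+ for the `:for` attribute in HEEx); pin state is just a map of `{row, col} => boolean`.

```elixir
defmodule GraphitiSimWeb.PinGridLive do
  use Phoenix.LiveView

  @rows 40
  @cols 60

  def mount(_params, _session, socket) do
    # All pins start lowered; state is a map of {row, col} => raised?
    pins = for r <- 0..(@rows - 1), c <- 0..(@cols - 1), into: %{}, do: {{r, c}, false}
    {:ok, assign(socket, pins: pins)}
  end

  def render(assigns) do
    ~H"""
    <table id="pin-grid">
      <tr :for={r <- 0..39}>
        <td
          :for={c <- 0..59}
          phx-click="toggle"
          phx-value-row={r}
          phx-value-col={c}
          class={if @pins[{r, c}], do: "pin-up", else: "pin-down"}
        />
      </tr>
    </table>
    """
  end

  def handle_event("toggle", %{"row" => r, "col" => c}, socket) do
    key = {String.to_integer(r), String.to_integer(c)}
    {:noreply, update(socket, :pins, &Map.update!(&1, key, fn up -> not up end))}
  end
end
```

Clicking a cell toggles it, which stands in for a user pressing a pin down; the same `assign(:pins, ...)` path would also serve for updates driven by incoming device commands.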
The Graphiti is normally connected to a computer using Bluetooth, HDMI, or USB (HID or VCD), but this may be largely irrelevant. In practice, it uses binary commands, sent via serial communication and mediated by a virtual communication port.
It seems like a Phoenix program should be able to set up such a port, allowing assorted client programs to talk to it instead of an actual device. However, I’m not at all sure how this should be done. Help?
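For what it's worth, I don't believe Phoenix (or the BEAM) can conjure up a serial port by itself; one common workaround on Linux/macOS is to have an OS tool create a linked pseudo-terminal pair, then let the simulator open one end and the client program the other. A sketch using `socat` (assuming it is installed):

```shell
# Create two linked ptys; socat stays in the foreground and prints
# the device paths it allocated, e.g. /dev/pts/3 and /dev/pts/4.
socat -d -d pty,raw,echo=0 pty,raw,echo=0

# Then: point the simulator at one path and the client program at the
# other. Bytes written to either end appear on the opposite end.
```

Whether this counts as a proper "virtual communication port" for every client (especially Windows ones) is an open question, but it's cheap to try.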
For our embedded software we are using the Quantum Leaps framework (great stuff, btw; it implements Harel statecharts and Actors in C). This framework also comes with a prototyping toolkit (which I've never used).
A bit of clarification may be in order here. I think I understand how to handle the UI side of things, but I’m less sure about the virtual communication port. For example, assuming that I can use something like circuits_uart to set up a port, how do I connect this to Phoenix? ELI5…
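Here's roughly how I imagine the glue could look, in case it helps the discussion: a GenServer owns the port via `Circuits.UART` (opened with `active: true`, so incoming bytes arrive as messages) and re-publishes them over `Phoenix.PubSub`, where any LiveView can subscribe. The module, PubSub name, topic, and baud rate below are all placeholders, and real code would need to buffer and frame the binary protocol.

```elixir
defmodule GraphitiSim.PortBridge do
  @moduledoc """
  Sketch only: bridges one end of a (virtual) serial port to Phoenix.PubSub.
  `GraphitiSim.PubSub` and the "graphiti:commands" topic are assumptions.
  """
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  # Push bytes back toward the client program (e.g. button-press reports).
  def send_to_host(bytes), do: GenServer.cast(__MODULE__, {:send, bytes})

  @impl true
  def init(opts) do
    {:ok, uart} = Circuits.UART.start_link()
    # `active: true` makes incoming data arrive as messages to this process.
    :ok = Circuits.UART.open(uart, opts[:device], speed: 115_200, active: true)
    {:ok, %{uart: uart}}
  end

  @impl true
  def handle_info({:circuits_uart, _device, data}, state) when is_binary(data) do
    # A real implementation would accumulate bytes and split them into
    # complete Graphiti command frames before broadcasting.
    Phoenix.PubSub.broadcast(GraphitiSim.PubSub, "graphiti:commands", {:graphiti, data})
    {:noreply, state}
  end

  @impl true
  def handle_cast({:send, bytes}, state) do
    Circuits.UART.write(state.uart, bytes)
    {:noreply, state}
  end
end
```

A LiveView would then call `Phoenix.PubSub.subscribe(GraphitiSim.PubSub, "graphiti:commands")` in `mount/3` and update its pin assigns in `handle_info/2`. Is this the shape of solution people would recommend, or is there a more idiomatic bridge?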