I’m playing around with the idea that it might be possible to design a new machine learning model/architecture that could only work on Erlang by leveraging its soft real-time properties. I guess there should be some algorithms that require the system to be responsive and to provide accurate timestamps for events, so that we could apply those algorithms or concepts in a machine learning model.
Although it might seem like a “finding a nail for the hammer” situation, it could still be interesting to know, I guess 😂.
One direction I’m looking at is variants of Markov chains that involve time.
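To make that a bit more concrete for myself, here is a rough Erlang sketch of what I mean (the module `ctmc_sketch`, its function names, and the rate-estimation idea are all just my own illustration, not an established design): a process records state transitions with `erlang:monotonic_time/0` and estimates how quickly each state is exited, which is the kind of continuous-time Markov statistic that only makes sense when the event timestamps are trustworthy.

```erlang
%% ctmc_sketch.erl: a minimal sketch, not a full CTMC library.
%% Assumes states are atoms; we estimate per-state exit rates from
%% observed holding times, one way a "Markov chain with time" could
%% lean on accurate event timestamps.
-module(ctmc_sketch).
-export([new/1, observe/2, exit_rate/2]).

%% Model: current state, timestamp of entering it, and per-state
%% accumulators of {TotalHoldingTimeNative, TransitionCount}.
new(InitialState) ->
    #{current => InitialState,
      entered_at => erlang:monotonic_time(),
      stats => #{}}.

%% Record a transition to NewState, crediting the holding time of the
%% state we are leaving. monotonic_time/0 is used so the measurement
%% is not disturbed by wall-clock adjustments.
observe(NewState, #{current := Old, entered_at := T0, stats := Stats} = M) ->
    Now = erlang:monotonic_time(),
    Held = Now - T0,
    {Total, Count} = maps:get(Old, Stats, {0, 0}),
    M#{current := NewState,
       entered_at := Now,
       stats := Stats#{Old => {Total + Held, Count + 1}}}.

%% Estimate the exit rate for State: transitions observed divided by
%% total time spent in that state, in seconds.
exit_rate(State, #{stats := Stats}) ->
    case maps:get(State, Stats, {0, 0}) of
        {_, 0} ->
            undefined;
        {TotalNative, Count} ->
            case erlang:convert_time_unit(TotalNative, native, second) of
                0 -> undefined;
                Seconds -> Count / Seconds
            end
    end.
```

Usage would look something like `M0 = ctmc_sketch:new(idle)`, then `ctmc_sketch:observe(busy, M0)` on each transition, and `ctmc_sketch:exit_rate(idle, Mn)` once enough dwell time has been observed.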
Algorithms do not care about “real-world time”, so this is a moot question. Real-time is a system requirement, not an algorithmic one.
That is true if we only talk about the mathematical correctness of an algorithm, but system requirements can have an impact on real-world performance, such as efficiency and accuracy. One example is that neural networks as an algorithm were only found to be really powerful after people discovered that GPUs and GPGPU programming could unleash their full potential. There might be algorithms that haven’t yet been properly implemented on a real-time system to show what they could really achieve.