Implementing ML in your Elixir/Phoenix app

Hi guys, sorry if this sounds like a silly question; I am a beginner in general. I would like to know how you are implementing machine learning in your apps. A couple of scenarios I could think of are:

  1. Interacting with an ML service written in Python over REST or similar.
  2. Calling Python scripts using ErlPort or Thrift.

Are there any suggestions or other approaches?

I am trying to build a Phoenix app and planning to use ML to build a custom user feed, recommendations, and a spam filter.

Thanks.

1 Like

Here’s one guide. I’m going to look for the other one that I saw:
https://www.erlang-solutions.com/blog/how-to-build-a-machine-learning-project-in-elixir.html

3 Likes

The above one only works if you are using TensorFlow. This one is a more general Python interop story:

2 Likes

Thanks, checking it out!

1 Like

Hello and welcome! There are a few posts in the forum that you could find interesting:

Hth

3 Likes

Just a caution: unless I’m mistaken, except for Tensorflex the others are still experimental and not ready for production deployment.

1 Like

Thank you for introducing my DeepPipe. I am currently developing DeepPipe2. DP1 was slow and had correctness issues; DP2 utilizes the GPU for practical speed. I have finished implementing CNNs and am also working on RNNs for natural language processing.

5 Likes

Another approach to incorporating machine learning components into Elixir applications is to interoperate with other languages such as Python, e.g. through ports or NIFs.
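
To make the port route concrete, here is a minimal sketch of a GenServer that owns a port to a Python script and exchanges newline-delimited JSON with it. The script path (priv/python/predict.py), the app name :my_app, the message shape, and the Jason dependency are all assumptions for illustration, not a prescribed design:

```elixir
defmodule MyApp.PyModel do
  @moduledoc """
  Hypothetical sketch: run an external Python process behind a GenServer.
  Assumes priv/python/predict.py reads one JSON object per line from stdin
  and writes one JSON object per line to stdout.
  """
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  def predict(features), do: GenServer.call(__MODULE__, {:predict, features})

  @impl true
  def init(_opts) do
    # :my_app and the script location are placeholders.
    script = Application.app_dir(:my_app, "priv/python/predict.py")

    port =
      Port.open({:spawn_executable, System.find_executable("python3")}, [
        :binary,
        {:line, 4096},
        args: [script]
      ])

    {:ok, port}
  end

  @impl true
  def handle_call({:predict, features}, _from, port) do
    # Send one JSON line to the script and wait for one JSON line back.
    Port.command(port, Jason.encode!(features) <> "\n")

    receive do
      {^port, {:data, {:eol, line}}} -> {:reply, {:ok, Jason.decode!(line)}, port}
    after
      5_000 -> {:reply, {:error, :timeout}, port}
    end
  end
end
```

Placed under the application’s supervision tree, this worker gets restarted like any other process if the external script misbehaves.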

Erlang/OTP gives us other means to interface with or incorporate external code, each with varying degrees of ease and robustness guarantees. See the Erlang Interoperability Tutorial for details.

Chapter 15 (Interfacing Techniques) of Programming Erlang by Joe Armstrong, pages 233 through 242, also contains some valuable information on this topic.

Here is an excellent article by @alvises which shows how to do this through Elixir. Elixir Mix ep. #91 goes into this article and related topics in more depth together with its author.

Erlang/OTP has excellent support for interoperability, which is one of its strengths. It might make more sense to leverage the existing Python libraries for machine learning in this way.
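
For example, ErlPort (which the original post already mentions) lets you call into a regular Python module from Elixir. The dependency version, the Python path, and the recommender module and function below are assumptions made for the sake of the sketch:

```elixir
# mix.exs (assumed): {:erlport, "~> 0.10"}
# priv/python/recommender.py (assumed) defines recommend(user_id, limit).

{:ok, py} =
  :python.start_link(
    python: ~c"python3",
    # relative to the project root in dev; adjust for releases
    python_path: ~c"priv/python"
  )

# Calls recommender.recommend(42, 10) in the external Python process.
recommendations = :python.call(py, :recommender, :recommend, [42, 10])
IO.inspect(recommendations, label: "recommendations")

:python.stop(py)
```

ErlPort keeps the Python interpreter alive as an external OS process and marshals terms back and forth over a port, so the per-call overhead is small compared to spawning a fresh script each time.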

You can also develop your machine learning model as a separate application using Python and expose it through an API. TensorFlow comes with its own mechanisms for model serving. Then you can make use of said API within your Elixir application and create an Elixir wrapper if needed. One benefit of using an Elixir port instead is that you can put your machine learning model behind a GenServer and supervise it as you would any other process in Elixir.
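
For the API route, TensorFlow Serving exposes a REST predict endpoint out of the box, and calling it from Elixir is a plain HTTP request. The sketch below assumes TensorFlow Serving is running locally on its default REST port (8501) with a model named "recommender", and that HTTPoison and Jason are available; adjust the names to your setup:

```elixir
defmodule MyApp.ServingClient do
  @moduledoc "Hypothetical client for a TensorFlow Serving REST endpoint."

  # Default TF Serving REST port; the model name is an assumption.
  @url "http://localhost:8501/v1/models/recommender:predict"

  def predict(instances) when is_list(instances) do
    body = Jason.encode!(%{"instances" => instances})
    headers = [{"content-type", "application/json"}]

    with {:ok, %HTTPoison.Response{status_code: 200, body: resp}} <-
           HTTPoison.post(@url, body, headers) do
      {:ok, Jason.decode!(resp)["predictions"]}
    end
  end
end

# Usage (the feature vector is a placeholder):
# {:ok, predictions} = MyApp.ServingClient.predict([[0.1, 0.2, 0.3]])
```

The request and response shapes follow TensorFlow Serving’s REST predict API ("instances" in, "predictions" out).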

Another alternative is to develop and train your machine learning model using Python, and then bring the resulting model artefact into Elixir. You can serve and expose the model through an API written in Elixir. Have a look at Tensorflex by @anshuman23 for inspiration on how this can be accomplished for neural network models through TensorFlow bindings.
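
If the trained artefact ends up being served from within Elixir (for example via Tensorflex bindings, or a port-backed worker like the GenServer sketched earlier), exposing it over an API can be a single Phoenix controller action. Everything below (MyAppWeb, the route, MyApp.PyModel.predict/1) is a placeholder:

```elixir
defmodule MyAppWeb.PredictionController do
  use MyAppWeb, :controller

  # In the router (assumed): post "/api/predictions", PredictionController, :create
  # Expects a JSON body like %{"features" => [...]}.
  # MyApp.PyModel.predict/1 stands in for however the model is actually loaded
  # (a Tensorflex session, a port-backed GenServer, etc.).
  def create(conn, %{"features" => features}) do
    case MyApp.PyModel.predict(features) do
      {:ok, prediction} ->
        json(conn, %{prediction: prediction})

      {:error, reason} ->
        conn
        |> put_status(:service_unavailable)
        |> json(%{error: inspect(reason)})
    end
  end
end
```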

Yet another approach which I have been thinking about lately is to embed the Python interpreter into Erlang/OTP or Elixir, and then use Elixir’s excellent meta-programming facilities (e.g. macros) to create a domain-specific language (DSL) specifically for machine learning. Supervision trees and GenServer could be used to spawn new processes representing instances of the Python interpreter to “cross-compile” the Elixir-based DSL to Python code and execute it. I found an old thread in the Erlang mailing list on this topic and an abandoned project on GitHub. This would likely require a considerable amount of effort, but could be an interesting way to vastly expand the libraries available in Elixir by essentially opening the door to all Python packages out there. Using ports is simpler.
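
As a very rough illustration of the DSL idea (and nothing more than that), a macro could generate Elixir functions that delegate to Python functions via ErlPort. Everything here, from the macro name to the Python module, is hypothetical:

```elixir
defmodule MLDSL do
  @moduledoc "Hypothetical sketch of a macro-based bridge to Python functions."

  # `defpy :train, :trainer` defines train/1 in the calling module, which runs
  # trainer.train(args) in an external Python process via ErlPort.
  defmacro defpy(name, py_module) do
    quote do
      def unquote(name)(args) do
        {:ok, py} = :python.start_link(python: ~c"python3")

        try do
          :python.call(py, unquote(py_module), unquote(name), [args])
        after
          :python.stop(py)
        end
      end
    end
  end
end

defmodule MyModels do
  import MLDSL

  defpy :train, :trainer
  defpy :predict, :trainer
end
```

Spawning an interpreter per call is obviously wasteful; as noted above, keeping a long-lived interpreter behind a supervised process (or plain ports) is the simpler and more practical route.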

Lastly, here are two related forum threads which contain some relevant information:

2 Likes

Thanks for the detailed response.

You’re very welcome, my friend!

Did you try some of those approaches?