Machine Learning & Elixir

Hello folks, is anyone working on a machine learning library of sorts for Elixir/Erlang?

I think that with the upcoming changes to the BEAM, specifically the work on a JIT, and the ease of use of GenStage and Flow, it might become a viable platform for number crunching (and I really mean might).
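To illustrate the "ease of use" point: with the `flow` library as a dependency (a sketch only, assuming `{:flow, "~> 1.2"}` in `mix.exs`), parallelizing a computation across all scheduler cores is just a pipeline:

```elixir
# Squares a million numbers in parallel across BEAM schedulers,
# then sums the results. Flow implements Enumerable, so the
# pipeline ends with a plain Enum call.
1..1_000_000
|> Flow.from_enumerable()
|> Flow.map(fn x -> x * x end)
|> Enum.sum()
```

The per-element work here is trivial, so this wouldn't beat a sequential loop; the point is only how little ceremony the parallel version needs.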

Yes, I’m aware that the BEAM in its current state is not good for things like number crunching, but this recent talk from Lukas Larsson at EEF17 has given me hope of a faster numeric Erlang. :slight_smile:


I feel this old (2009) Stack Overflow thread on numerical computing in Erlang is relevant to the topic. I also think that the main feature a language needs to be suitable for ML is vectorization. Is Elixir good at vectorizing computations?

1 Like

Interesting talk. It seems, however, that BEAMJIT has been on its way for a long time. :neutral_face:


Garrett Smith has been working on a system that wraps the TensorFlow Python libraries and implements an Erlang layer around monitoring the processes. Check out


Is it possible for Elixir to just be the controller for Python processes? Since Python has all the libraries for easily working with data, and it crunches numbers faster than Elixir, could Python scripts do the work and pass results back to an Elixir app that serves as the API?

Easily, yep, or you can go even lower level and access the base C/C++ API via a port as well. But yep, Python interacts wonderfully via a port; I’ve used it many times. :slight_smile:
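As a minimal sketch of the port approach (assuming `python3` is on the PATH; the module name and inline script are invented for illustration), Elixir can drive a long-lived Python process over stdin/stdout:

```elixir
# Spawns a Python process that squares integers read from stdin,
# and talks to it line-by-line through an Erlang port.
defmodule PySquare do
  def run(n) do
    script = "import sys\nfor line in sys.stdin:\n    print(int(line) ** 2, flush=True)"

    port =
      Port.open(
        {:spawn_executable, System.find_executable("python3")},
        [:binary, {:line, 1024}, args: ["-c", script]]
      )

    Port.command(port, "#{n}\n")

    receive do
      {^port, {:data, {:eol, result}}} -> String.to_integer(result)
    after
      5_000 -> :timeout
    end
  end
end
```

Because the port owns the OS process, a crash on the Python side just means the port closes and the owning Elixir process can restart it under a supervisor.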


You mean you’ve accessed Python via a REST call from Elixir? e.g. an API in Django

Seems more like

You might want to check these out as well.


I can second @idi527’s erlport recommendation.

As for Elixir + Python for ML: we are calling Python from Elixir for TensorFlow usage, though we are also evaluating Tensorflex for running the trained networks. Good times :slight_smile:
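For anyone curious what the erlport route looks like, here is a sketch assuming `{:erlport, "~> 0.10"}` in deps and a hypothetical Python module `model.py` (not from this thread) under `priv/python` exposing a `predict(inputs)` function:

```elixir
# Start a Python interpreter owned by a BEAM process,
# call a function in it, then shut it down.
{:ok, py} = :python.start(python_path: ~c"priv/python")

# Calls model.predict([1.0, 2.0, 3.0]) on the Python side;
# erlport converts the arguments and the return value for us.
result = :python.call(py, :model, :predict, [[1.0, 2.0, 3.0]])

:python.stop(py)
```

Each `:python.start/1` gives you an isolated interpreter, so you can run several in parallel or pool them behind a supervisor.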

1 Like

Where I work, we apply Machine Learning/Deep Learning to various problems in the retail space. Leveraging Elixir and BEAM for ML/DL is something I’m also curious about. I’ve been looking at some of the available Elixir wrappers/libraries, such as Tensorflex.

Although Elixir and the BEAM might not be ideal for doing the heavy number-crunching itself, I imagine they could be a great technology for building a platform that facilitates distributed training of neural networks. Some kind of worldwide machine learning orchestration platform/service.

Imagine a peer-to-peer-like platform where people could volunteer idle CPUs and GPUs. Members of the network could then use idle hardware on demand for training their neural nets. One could implement some kind of point-based system for fair time-sharing of hardware resources in the network, etc.
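The point-based time-sharing idea above could be sketched as a small GenServer; everything here (module and function names) is invented purely for illustration:

```elixir
# Tracks compute credits per member: donating idle hardware time
# earns credits, training on the network spends them.
defmodule ComputeCredits do
  use GenServer

  def start_link(_opts), do: GenServer.start_link(__MODULE__, %{}, name: __MODULE__)

  def donate(member, seconds), do: GenServer.call(__MODULE__, {:donate, member, seconds})
  def spend(member, seconds), do: GenServer.call(__MODULE__, {:spend, member, seconds})
  def balance(member), do: GenServer.call(__MODULE__, {:balance, member})

  @impl true
  def init(state), do: {:ok, state}

  @impl true
  def handle_call({:donate, member, s}, _from, state) do
    {:reply, :ok, Map.update(state, member, s, &(&1 + s))}
  end

  def handle_call({:spend, member, s}, _from, state) do
    case Map.get(state, member, 0) do
      bal when bal >= s -> {:reply, :ok, Map.put(state, member, bal - s)}
      _ -> {:reply, {:error, :insufficient_credits}, state}
    end
  end

  def handle_call({:balance, member}, _from, state) do
    {:reply, Map.get(state, member, 0), state}
  end
end
```

In a real peer-to-peer deployment this state would of course need to be replicated (e.g. via distributed Erlang or a CRDT) rather than held in one process.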

The general idea of distributed computing is nothing new, as it has been applied to many different fields (Folding@Home being an excellent example).

Something that might be interesting for interfacing with Python is Pyrlang. However, IIRC it doesn’t work on OTP 20 yet.

There is the Matrex library as well. You can find some discussion in another thread on this forum. According to some benchmarks it beats NumPy in several tests.
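For a flavor of the Matrex API (a sketch only, assuming `{:matrex, "~> 0.6"}` in deps; Matrex is a NIF wrapping CBLAS, so the heavy lifting happens in C):

```elixir
a = Matrex.new([[1, 2], [3, 4]])
b = Matrex.new([[5, 6], [7, 8]])

Matrex.dot(a, b)       # matrix product
Matrex.multiply(a, b)  # element-wise product
Matrex.apply(a, :exp)  # element-wise exponential
```

Since the math runs inside a NIF, long-running operations should be kept small or chunked so they don't block a BEAM scheduler.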


Let’s keep in mind that supervised learning is only one approach to ML, and backpropagation (the heaviest number-crunching part of Deep Learning) is falling out of favor in some circles. DL does not scale well, and Reinforcement Learning (a paradigm distinct from both supervised and unsupervised learning) seems to be expected to be the key path to AGI (artificial general intelligence), according to Richard Sutton and others.

With that in mind, I have been successfully using Matrex for high-dimensional vectorized computation for Multi-Armed Bandits (elementary Reinforcement Learning). You can check it out here. It’s part of The Automata Project. Down the road I may need to use Python or Julia via erlport (or maybe even Docker containers, as used here) for the vectorized parts, but for prototyping, things are going well so far.
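For readers unfamiliar with Multi-Armed Bandits, here is a minimal epsilon-greedy sketch in plain Elixir (no Matrex, and not code from The Automata Project; all names are invented), keeping per-arm pull counts and running mean rewards:

```elixir
defmodule Bandit do
  defstruct counts: %{}, values: %{}, epsilon: 0.1

  def new(arms, epsilon \\ 0.1) do
    %__MODULE__{
      counts: Map.new(arms, &{&1, 0}),
      values: Map.new(arms, &{&1, 0.0}),
      epsilon: epsilon
    }
  end

  # Explore a random arm with probability epsilon,
  # otherwise exploit the best arm seen so far.
  def select(%__MODULE__{} = b) do
    if :rand.uniform() < b.epsilon do
      Enum.random(Map.keys(b.values))
    else
      {arm, _value} = Enum.max_by(b.values, fn {_arm, v} -> v end)
      arm
    end
  end

  # Incremental mean update: v <- v + (reward - v) / n
  def update(%__MODULE__{} = b, arm, reward) do
    n = b.counts[arm] + 1
    v = b.values[arm]

    %{b | counts: Map.put(b.counts, arm, n),
          values: Map.put(b.values, arm, v + (reward - v) / n)}
  end
end
```

Vectorizing this with Matrex would replace the maps with matrices of counts and values, which pays off once the number of arms gets large.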

In my view, neuroevolution with Topology and Weight Evolving Artificial Neural Networks (TWEANN) and Novelty Search is one of the most promising alternative approaches, and Elixir has a head start in this sub-field of ML thanks to Gene Sher’s book.

As far as Python interop goes, something like this looks pretty appealing for scaling ML as well.

The Automata Project is seeking contributors if anyone here is interested.


You should check this out. It looks pretty appealing for scaling ML using Poolboy.
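The Poolboy approach mentioned here might look roughly like this sketch (assuming `{:poolboy, "~> 1.5"}` in deps and a hypothetical `PythonWorker` GenServer that owns one Python OS process; both names are invented):

```elixir
# Pool of 4 Python workers, allowing 2 extra under load.
pool_config = [
  name: {:local, :py_pool},
  worker_module: PythonWorker,
  size: 4,
  max_overflow: 2
]

children = [:poolboy.child_spec(:py_pool, pool_config, [])]
Supervisor.start_link(children, strategy: :one_for_one)

# Borrow a worker, run a job on it, and return it to the pool.
:poolboy.transaction(:py_pool, fn worker ->
  GenServer.call(worker, {:predict, [1.0, 2.0, 3.0]})
end)
```

Bounding the number of concurrent Python processes this way keeps memory use predictable, which matters when each worker loads a large model.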