Using a generated machine learning model in Elixir

Let me preface this question by saying that I’m not an ML expert so this question may or may not make sense.

Machine learning models can be generated with various languages and frameworks, and those same languages and frameworks are usually what serves the models in production. Some export formats are SavedModel (from TensorFlow), TorchScript (which PyTorch can export), ONNX, etc… (I’m probably not even scratching the surface of model export/serialization options).

My question is, assuming the heavy lifting of ML is in the model generation, are there Elixir libraries that can interface with a generated model?

I’m hoping I can generate a model in PyTorch/TensorFlow, export it, and import it into my Phoenix app to make predictions/recommendations/etc…

Can this be done?

Thank you

It doesn’t quite answer your question, but this article by @alvises shows one way to run ML models from Elixir with the actual prediction managed by Python.
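In case it helps to see the shape of that approach, here is a minimal sketch of an Elixir process talking to a Python script over a port, with the Python side doing the actual prediction against the exported model. The module name, the priv/python/predict.py path, the line-delimited JSON protocol, and the Jason dependency are all assumptions for illustration, not necessarily what the article does:

```elixir
defmodule MyApp.Predictor do
  use GenServer

  # Long-lived bridge to a Python process that has already loaded the
  # exported model. The script path and the line-delimited JSON protocol
  # are assumptions for this sketch.

  def start_link(opts \\ []),
    do: GenServer.start_link(__MODULE__, opts, name: __MODULE__)

  def predict(features),
    do: GenServer.call(__MODULE__, {:predict, features})

  @impl true
  def init(_opts) do
    # The Python script is expected to read one JSON object per line on
    # stdin and write one JSON prediction per line on stdout.
    port =
      Port.open({:spawn, "python3 priv/python/predict.py"}, [:binary, {:line, 4096}])

    {:ok, %{port: port}}
  end

  @impl true
  def handle_call({:predict, features}, _from, %{port: port} = state) do
    Port.command(port, Jason.encode!(%{features: features}) <> "\n")

    receive do
      {^port, {:data, {:eol, line}}} -> {:reply, Jason.decode(line), state}
    after
      5_000 -> {:reply, {:error, :timeout}, state}
    end
  end
end
```

A Phoenix controller (or anything else) could then call `{:ok, prediction} = MyApp.Predictor.predict([5.1, 3.5, 1.4, 0.2])`. Keeping the Python process alive means the exported model only has to be loaded once, rather than on every request.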

@sym_num has done a lot of work in this area too, building a library that harnesses the GPU - see https://github.com/sasagawa888/deeppipe2 - though I’m not sure what would be involved in porting or running models with it.
