Elixir and Deep Learning

Hello everyone,

This is a light talk about AlphaZero.

Recently I have been impressed by the latest AI from Google, which learned the game of Go in 21 days and chess in 4 hours. It looks like the world of The Matrix is not so far away.

Most of the tools for deep learning are not written in Erlang/Elixir, because those languages are not really good at number crunching. But they can communicate with external libraries, for example in Python.

It feels like a turning point: the year computers started to learn by themselves.

Any thoughts on the role Erlang/Elixir could play in building the next super AI?

I think that all BEAM languages are currently hamstrung by not having an easy way of using them as front-end LLVM targets. As far as I know, most popular deep learning libraries are written in C/C++, with higher-level language bindings made available via automated compiler toolchain magic. Compile to LLVM and generate the front-end language bindings (bytecode, opcode, whatever).

This may be due to a number of reasons: Data type incompatibilities, BEAM scheduler/thread incompatibility, BEAM isolation semantics, etc.

Dirty schedulers might help to a certain degree. But one would still need to be able to concretely map all low-level types present in C to BEAM bytecode and I don’t know how difficult that is.

This state of affairs bums me out, and not just for deep learning. There seem to be many libs and SDKs available for a dozen languages, and Erlang/Elixir never makes these lists. Presumably because it's too difficult to auto-generate the language binding.

This doesn’t mean that all is lost. There are a number of ways to bridge Elixir with AI-ready languages. For example, Python is an incredibly popular language for building/training deep learning models. Inference can then happen in another front-end language. You can then bridge Elixir to this language. Or keep the small inference logic in Python and the rest of your inference server logic in Elixir. Use Ports to communicate between the Elixir/other_lang processes.
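As a minimal sketch of what the Python side of such a Port bridge could look like, here is a plain stdin/stdout loop. The `score` function is a hypothetical stand-in for real model inference (a loaded model's `predict` call would go there); the one-request-per-line protocol is an assumption, not a standard:

```python
import sys

def score(text):
    # Hypothetical stand-in for real inference (e.g. a loaded
    # TensorFlow/PyTorch model); here we just return the input length.
    return len(text)

def main():
    # An Elixir Port talks to this process over stdin/stdout:
    # read one request line, write back one reply line.
    for line in sys.stdin:
        request = line.rstrip("\n")
        print(score(request), flush=True)

if __name__ == "__main__":
    main()
```

On the Elixir side, `Port.open({:spawn, "python3 script.py"}, [:binary, {:line, 4096}])` with line-mode framing would pair naturally with this loop. The `flush=True` matters: without it, Python buffers stdout when it is not a TTY and the Elixir process would block waiting for a reply.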

Even in this case you really need to like doing “everything else” in Elixir in order to justify the architecture and code overhead. And you still need to deal with data type impedance problems over the Port. That, unfortunately, is where I think Elixir and Deep Learning will be stuck in 2018.

Thank you for this very good technical insight. I had a sense that Erlang/Elixir does not match well with deep learning, but you could explain why.

I was secretly hoping that we could use it for building the next brain, like in the Handbook of Neuroevolution Through Erlang, but I guess it is still better to use C/C++ or Python.

Thanks. :slight_smile: I'm no expert, just did a bit of research out of anger. There are others who can probably give a more correct and nuanced technical explanation of why the BEAM can't have nice things.

I think it boils down to this scenario being the other side of the BEAM coin: the same characteristics that make it better than other VMs for certain things make it worse for this one.
