Introducing Nx, Lambda Days 2021

Long expected :slight_smile:


I know nothing about this subject but every time I listen to José I’m impressed by him.
José, you're the man!


I had absolutely zero knowledge in machine learning, and that was actually a pretty good introductory class, despite its goal being only to show Nx features.


For those having some trouble compiling EXLA on Ubuntu 20.04, I found these commands to be helpful.

$ sudo apt install python3-pip
$ pip3 install numpy
$ cd /usr/bin
$ sudo ln -s python3 python

$ asdf plugin add bazel
$ asdf install bazel 3.1.0
$ asdf global bazel 3.1.0

The symlink is needed because the build looks for python, not python3.

UPDATE: This is not the recommended way, see next post. It’s just a hack to get EXLA compilation working.

Bazel is also available as a plugin for asdf-vm. You need to use 3.1.0 instead of the latest 4.0.0.

And now with all in place, I need to find what to do with it :slight_smile:
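A quick sanity check once everything is in place (a minimal sketch; it assumes the Nx dep from this thread is already in your mix project, and is meant to be run inside `iex -S mix`):

```elixir
# Build a small tensor and run a couple of element-wise ops.
t = Nx.tensor([[1, 2], [3, 4]])

t
|> Nx.add(1)   # broadcasts the scalar: [[2, 3], [4, 5]]
|> Nx.sum()    # reduces to a scalar tensor: 14
|> IO.inspect()
```

If that prints a tensor without errors, the build is working.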


For macOS Mojave, asdf was the simplest solution for bazel too.


I have worked with Python for a few years and I can absolutely say this is not a good idea:

$ cd /usr/bin
$ sudo ln -s python3 python # :(

You should instead make a virtual environment and associate it with whatever directory you occupy while playing with nx.

For example, here is how I do it using pyenv (GitHub - pyenv/pyenv: Simple Python version management):

$ mkdir nx_fun
$ cd nx_fun
$ pyenv virtualenv 3.9-dev nx
$ pyenv local nx
$ pip install numpy

If you have set up pyenv properly then being in the nx_fun directory means python automatically points to the python3 shim for your virtual environment.

It is dangerous to override system python files, which are often used under-the-hood by a surprising number of packages. I advise never touching your system python files (let apt or yum or apk or brew handle those) and always using a virtual environment when you want to pip install. For context, these are the deps being compiled here:

{:exla, "~> 0.1.0-dev", github: "elixir-nx/nx", sparse: "exla"},
{:nx, "~> 0.1.0-dev", github: "elixir-nx/nx", sparse: "nx", override: true}

The bazel instructions are good though. I also recommend asdf.

Update: I got a chance to try this and I want to add something. If using pyenv, you need to do pyenv global nx (if you followed the instructions above) in order for mix deps.compile exla to find numpy. I still recommend doing it this way since you can easily switch your global python version back when you are finished playing with nx. It’s still compiling for me… :clock1:


I am not a python guy, I will follow your advice :slight_smile:

Thanks for reporting.


It took 7 hours for EXLA to compile on my laptop…

Ouch… that hurts. I had to go through at least 7 attempts before successfully compiling EXLA, each time with different errors. As a last hope I kept retrying mix compile, and after three rounds of it I could get it compiled.
I’m on Ubuntu 20.04; the successful compilation attempt used bazel 3.2.0 through bazelisk (installed with npm), and I used the python-is-python3 apt package to alias python.
An attempt to compile with CUDA support destroyed a first successful build (:scream:) and I had to try half the afternoon to get it compiled back again. Each compilation run takes about 20 minutes on my machine.



If you all can report to me some of your compilation problems and troubleshoot steps, I can put together a comprehensive guide for building EXLA so there are no more issues. Also please note that we fortunately no longer have the NumPy dependency :slight_smile:


Is there a good article about the mechanisms used in the demo? Why do we use dot, why is the loss function implemented that way, why is the grad implemented that way, why do we add a single hidden layer, and so on.

I can find a lot of tutorials that explain how to code it, just as José did, but not an article that explains the choices in a not-too-complicated way. Then there are the hard-math papers. I mean, I am not a maths person; I don’t mind diving into maths when it is necessary, but all the docs I found expect me to be able to reduce integrals and multiply matrices like I breathe.

I understand that maybe this topic assumes the developers have a decent math knowledge, but in the end, the code is able to predict numbers (i.e. it works) with just 2-3 math functions so that should not be that hard to grasp with a good explanation.

Neural Networks and Deep Learning by Michael Nielsen is a good primer.

Chapter 1 (Using neural nets to recognize handwritten digits) walks through the specific use case on the MNIST dataset (with code near the bottom of the page).


If you like math and don’t mind reading a textbook:

If you don’t like math and you’d rather learn the programming concepts:

Or if you’re willing to wait we can put some resources out there with explanations of Nx code :slight_smile:


Thank you @joddm, this looks perfect.

@seanmor5 Thank you for the links, I will look into them. I am absolutely willing to wait for resources, but I am not sure I understand your comment correctly. I coded along with José while watching the video (with a two-hour pause to compile XLA before the end :smiley: ) and it was easy; thanks to the great work you did with him, the library is really easy to use. But as José said in the video’s disclaimer, the video was not about the underlying concepts, and he would not explain the choices of mathematical functions (except maybe for the softmax, which is easy to get). So if you’re saying that you want to put out resources to explain that, then yes, I’ll wait anytime.
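Since softmax came up, here is roughly what it looks like as a numerical definition (a sketch only; `MySoftmax` is a made-up module name, and subtracting the max is the standard numerical-stability trick, not something specific to Nx):

```elixir
defmodule MySoftmax do
  import Nx.Defn

  # softmax(t) = exp(t) / sum(exp(t))
  # Subtracting the max first avoids overflow in exp/1 for large entries;
  # the shift cancels out mathematically, so the result is unchanged.
  defn softmax(t) do
    e = Nx.exp(t - Nx.reduce_max(t))
    e / Nx.sum(e)
  end
end
```

The output is a tensor of the same shape whose entries are positive and sum to 1, which is why it is used to turn the final layer into digit probabilities.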

I’ll add one more book recommendation. I gave it a quick read-through last year and it helps with the overall concepts. Now I need to go back through it and work the examples to get them to stick in my brain.


There’s also currently an interview and AMA with the author over on Devtalk - anyone commenting/taking part in the thread will get entered into a draw where the winner can win one of his books :003:

(He’s also the author of Metaprogramming Ruby which I thought was an amazing book and incredibly well written!)


I really loved reading this book! Perfect for beginners!

Turns out this is one of the titles that Prag Prog has published on Medium. You can read it for free with your Medium membership: Programming Machine Learning | by The Pragmatic Programmers | The Pragmatic Programmers | Jan, 2021 | Medium.


Interesting… some serious potential for expressive neural network definition here.

Not sure if I’ll be using it in the near future as the tooling built for pytorch/tf is pretty well developed - but I’ll certainly be keeping an eye on Nx :slight_smile:

I’m excited to see the work by @versilov on torchx since, if my reading is correct, this opens up some exciting opportunities for text analysis, which would help reinvigorate the work on my text library.