Project Nx - predictions and potential use cases

Soon we’ll have Nx - numerical computing, GPU acceleration, ML, tensor operations, etc.

What will be the best applications for Nx? Simplest and most valuable techniques?

Post your predictions!

My guess: alerts, signals and event recognition from telemetry data.

10 Likes

Quantum supremacy!

6 Likes

Which should be done outside of the application anyway.

In general - AI (NLP, NN, etc.) and engineering computations - the stuff that NumPy/SciPy and Julia are commonly used for. However, I worry that it will be really hard to win over scientists’ hearts. Julia has trouble winning hearts over Python despite being designed from day one for exactly these use cases. I do not think that Elixir will succeed in that, but prove me wrong.

5 Likes

Outside of the typical AI and ML space, here are some examples of Jax (one of Nx’s inspirations) thriving in a broad range of spaces:

It also seems that libraries like Jax are growing increasingly popular for research. See DeepMind’s Jax Ecosystem.

Of course, we are in our early stages, but hopefully we get to a point where people are comfortable adopting Elixir for some of their numerical work rather than shelling out to Python, Julia, R, MatLab, etc.

There are a lot of possibilities, it is only just the beginning! :slight_smile:

16 Likes

Given the relative popularity of bitcoin traders in the Elixir community, no doubt some will be out chasing alpha for fun with Nx.

I know I’ve had some mild interest in using the BEAM for games, if GC latency and jitter are low enough and float performance can be dramatically improved. Between Nx and BeamAsm, this might even become realistic for less-intensive games.

3 Likes

Personally, as a non-data scientist, I’m really excited about NumPy/Scikit behavior provided in the Elixir ecosystem. It means that on my next personal/side project (think startups) I won’t have to add another server to my infrastructure or another build stage to my Dockerfile, etc. This extends my “simple” infrastructure timeline much farther into the future, giving me the opportunity to focus on business goals. Maybe if monorepos can now do both web and data science, the odds of success are improved.

I have been debating for a while about the best way to include data science in an app: a server over HTTP, one Dockerfile with multi-stage builds, hand-off to SageMaker, or forgetting Elixir and just building with Flask or Django… Now I actually get to rethink this solely in the Elixir context. Huge win. So yeah, hoping for NumPy/SciKit-type behavior.

Apps grow and change. If the app grows enough to warrant hiring Data people and extracting the DataSci part to Python or Julia etc… it’s still a success that Elixir helped attain.

13 Likes

It took Elixir going into Machine Learning for my first post here! Kudos to Jose, Sean and everybody else who helped launch this in record time.

The fact that Nx will be like NumPy and have some feature parity will be awesome. Just FYI: NumPy has been around since 2006, and it wasn’t until 2011 (five years later) that it could spread its wings and become a productive tool (at least for me). That speaks volumes about what Jose and team accomplished in three months!

On the Thinking Elixir podcast, Jose kept talking about tensors. For math geeks, tensors are really generalized matrices: multi-dimensional arrays that sit at the intersection of optimization, linear algebra, and graph theory. They have many applications in geometry, topology, combinatorial optimization, network theory, coding theory, and machine learning. That’s where I would look for use cases and applications.

Given what Jose was discussing around softmax classifiers, one could deduce he is talking about deep-learning-based computer vision use cases like image classification and object detection (e.g., bounding boxes). Computer vision has one of the richest sets of use cases in AI:

  • Protein structure prediction for medicine and vaccines
  • Human pose estimation
  • Image transformations like Snapchat filters
  • Converting 2D images into 3D models
  • Medical diagnostic imaging analysis
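
Coming back to those softmax classifiers Jose mentioned: from what has been shown publicly, a numerical definition in Nx might look roughly like this (a minimal sketch; the exact API could still change before release):

```elixir
defmodule MyClassifier do
  import Nx.Defn

  # Exponentiate, then normalize so the outputs sum to 1 and can be
  # read as class probabilities.
  defn softmax(t) do
    Nx.exp(t) / Nx.sum(Nx.exp(t))
  end
end

MyClassifier.softmax(Nx.tensor([1.0, 2.0, 3.0]))
```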

I have many more questions around the hardware side:

  • How will Nx interface with CPUs, GPUs, and TPUs (Tensor Processing Units)? I got excited when Jose talked about how there are flags for GPU-based operations.

  • Will Nx have support for GPU Acceleration using CUDA (proprietary) and OpenCL (open source)?

  • What other tensor compilers are supported in the initial release of Nx?

  • Today, Keras (a higher-level syntactic-sugar wrapper for TensorFlow) is used mostly because TensorFlow is hard. So how does Nx play with Keras or TensorFlow, or even the next incumbent, PyTorch?

  • The next question would be: are we talking about small datasets or large ones?

Also, I’m interested in seeing a comparison akin to a Python vs. Elixir showdown (all things equal) for the same training and inference workloads on the same hardware.

Sorry for the long post. I am so hyped and excited for Nx, it feels like the dam burst for me.

I live in Toronto, the spiritual home of deep learning and of the 2019 NBA Champions, aka the Toronto Raptors.

You have my vote to fork a new topic or category page for Nx in this forum.

17 Likes

How will Nx interface with CPUs, GPUs, and TPUs (Tensor Processing Units)? I got excited when Jose talked about how there are flags for GPU-based operations.

As José mentioned in the podcast, Nx can interface with CPUs and GPUs. It’s all backend-dependent. As an example, the XLA backend handles all of the device stuff. It’ll be easier to make sense of when everything gets released. On that note, XLA is most optimized for compiling to TPUs, so it’s certainly something that could be supported in the future; however, we haven’t gotten to that point yet. TPUs are also a bit harder to test on, considering they’re only available in GCP. If anybody has the time or resources and wants to help contribute on that front, that’s also welcome :slight_smile:

Will Nx have support for GPU Acceleration using CUDA (proprietary) and OpenCL (open source)?

Backends are flexible, so there’s no reason it can’t have support for both.
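
To give a rough idea of what that pluggability might look like from user code (module and option names here are illustrative, not a final API):

```elixir
# Hypothetical configuration, for illustration only; the actual backend
# and compiler names/options depend on what ships.

# Allocate and operate on tensors via a GPU-capable XLA backend...
Nx.default_backend({EXLA.Backend, client: :cuda})

# ...and/or compile numerical definitions through XLA instead of the
# pure-Elixir interpreter.
Nx.Defn.default_options(compiler: EXLA)
```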

What other tensor compilers are supported in the initial release of Nx?

XLA is the only one right now, but anything that can be surfaced to Elixir can probably be supported.

Today, Keras (a higher-level syntactic-sugar wrapper for TensorFlow) is used mostly because TensorFlow is hard. So how does Nx play with Keras or TensorFlow, or even the next incumbent, PyTorch?

Nx can’t interface directly with TensorFlow or PyTorch, although something like that would be possible with the addition of DLPack support.

The next question would be: are we talking about small datasets or large ones?

Jax (thanks to XLA) holds the record for training ResNet-50 on ImageNet (~14 million images) in 29 seconds. I’d say this is probably more hardware/backend dependent than specifically Nx dependent.

It’s nice to see people so excited; I hope that answered your questions! I also recommend checking out the LambdaDays talk for more :smiley:

9 Likes

Being able to write a data processing pipeline using GenStage/Flow/Broadway or similar, and then not needing to dump the data somewhere for Python or R to process it, is something I could see being useful.
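
Something along these lines, say (a rough sketch; the data, the batching, and the numerical step are all made up for illustration):

```elixir
# Stream fake sensor readings, batch them, and run the numerical step
# with Nx on the same BEAM node instead of dumping the data for
# Python or R to pick up later.
1..10_000
|> Stream.map(fn _ -> :rand.uniform() end)
|> Stream.chunk_every(100)
|> Flow.from_enumerable()
|> Flow.map(fn batch ->
  batch
  |> Nx.tensor()
  |> Nx.mean()
  |> Nx.to_number()
end)
|> Enum.to_list()
```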

10 Likes

Very helpful response Sean. It answers a lot of my questions. Thank you!

Subsequent thought:

If Nx reaches significant feature parity with Python’s NumPy and Jax (which is NumPy on steroids) in its first few releases, it would be a massive game changer for existing and new users of Elixir who today still have to use Python’s data analysis tools like NumPy. Basically, in the future, there is a possibility Elixir could do it all.

One final question:

I got excited when Jose talked about the “graph” (which is typically used for computations on tensor data structures in deep learning frameworks). Just curious whether ‘eager execution’ will be supported in Nx?

I know in PyTorch you define the graph at runtime, which allows you to go back and forth between planning and execution very easily. The ability to evaluate operations immediately, without compiling graphs explicitly, is eager execution.

The reason I mention this is that I find eager execution lets you prototype faster and create new types of architectures, but at the cost of speed. This is akin to the difference between compiled and interpreted languages. It opens the door to other types of use cases.

This used to be a big deal a few years ago, since TensorFlow used static graphs back then, requiring you to define the entire graph before pushing data through. However, both PyTorch and TensorFlow now support eager execution by default, and it has since been adopted as the go-to industry standard. Just wondering if Nx will follow along.

… I know I should be more patient until LambdaDays, but February 17 cannot come soon enough!

I am looking to take some already-built pre-trained models, test them with Nx, and share my results here with the community.

1 Like

No way. Miners are built with the lowest-level abstractions possible to squeeze out every ounce of performance. Yeah, some are written in Go etc., but they’re talking to C-based low-level OpenCL drivers or even, sometimes, inline GCN assembly on AMD. Elixir + XLA will be very fast but is likely to show 5-10% of performance gone for abstraction friendliness, and that’s much too much for miners, because their power/capital costs are big and their revenues only a bit bigger. In that scenario, 5% faster can mean double the profit. Remember, miners have no use for what the BEAM brings along, other than possibly supervisors. They’re generally just one process doing one mega-simple thing as fast as possible. Mining is not an architecturally complex problem. It’s a code-craft complex problem. Low-level programming.

Personally, at the nexus of Phoenix/BEAM/Nx there is some fantastic stuff to be done. Any single-user use case or batch-style workflow (training) is still likely to be better served by Python, because it’s just got such a lead. But if you’re doing anything that looks like compute-as-a-service, this is fantastic, fantastic news. Personally, I plan to build financial-market yield-curve-fitting algos as a service. Fitting a 7-parameter yield curve using some functional forms takes many seconds even with AVX-enabled NumPy, and that’s too long when data is streaming in at you dozens of times a second. Obviously Python can do GPU, but you can’t easily manage access to it as a service. Python is really a stone-age runtime, honestly. With GPU-enabled Nx we can be just as fast as Python, get the fit done in around 100 milliseconds, and pass it back to the user “semi-real time”. And have the data streaming into the same stack, with all the advantages the BEAM offers for that. I’ve been struggling for some time to get a scalable Elixir-to-Python solution that would “fit” the BEAM paradigm, with Pyrlang being my prime candidate, but it’s so dodgily maintained. This is going to let me throw that frustrating architecture out and just do Elixir. I can’t thank @josevalim enough.
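
To make that concrete, here is the kind of thing I have in mind, sketched against how I understand defn and grad will work (a simple Nelson-Siegel form rather than my real 7-parameter curve; the module and function names are mine, not Nx’s):

```elixir
defmodule CurveFit do
  import Nx.Defn

  # Nelson-Siegel functional form evaluated over a tensor of maturities.
  defn nelson_siegel(params, maturities) do
    {b0, b1, b2, tau} = params
    x = maturities / tau
    decay = Nx.exp(-x)
    slope = (1 - decay) / x
    b0 + b1 * slope + b2 * (slope - decay)
  end

  # Mean squared error between fitted and observed yields.
  defn loss(params, maturities, yields) do
    diff = nelson_siegel(params, maturities) - yields
    Nx.mean(diff * diff)
  end

  # One plain gradient-descent step; a real fitter would use something smarter.
  defn step(params, maturities, yields, lr) do
    {b0, b1, b2, tau} = params
    {g0, g1, g2, gt} = grad(params, &loss(&1, maturities, yields))
    {b0 - lr * g0, b1 - lr * g1, b2 - lr * g2, tau - lr * gt}
  end
end
```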

7 Likes

Miners and traders are not the same thing, my man. :grinning:

(and I’m aware of the lengths miners go to, including custom ASICs)

2 Likes

I don’t know much about IoT, but I think Nx will be great in IoT with Nerves for edge computing.

3 Likes

Nx supports two pluggable parts:

  1. Backends - this is responsible for dispatching functions to the tensor wherever it is allocated (memory, GPU, etc.)

  2. Compilers - this allows you to receive a graph (expression tree) which you compile down somewhere

Backends are, in a way, what controls the eager execution mode. Compilers are the lazy mode. The only backend right now is pure Elixir, so you do have eager mode, but it will be slow. The compilers we have today are the evaluating compiler (which simply invokes the backend) and the XLA compiler.

In other words, as soon as we have a fast backend (I have been thinking about libtorch’s ATen bindings), we should have fast eager (backend) and lazy (compiled) modes.
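
A minimal sketch of the two modes side by side (the numbers and the function are arbitrary):

```elixir
# Eager: each call dispatches immediately to the backend where the
# tensor is allocated (today, the pure-Elixir one).
a = Nx.tensor([1.0, 2.0, 3.0])
Nx.add(Nx.exp(a), 1)

# Lazy: defn builds an expression tree that a compiler (for example
# the XLA one) can optimize and lower as a whole.
defmodule Math do
  import Nx.Defn

  defn softplus(t), do: Nx.log(Nx.exp(t) + 1)
end

Math.softplus(a)
```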

8 Likes

With apologies for the colon (my target audience tends to be numerical Python), I’m pitching Nx into my financial coding network:

2 Likes

Thank you for the quick response! (The Elixir community rocks!) … If anything, I am NOW MORE EXCITED!

My response to ATen bindings for libtorch (PyTorch) in Elixir: YES PLEASE, with fries! (By fries I mean some kind of OpenAI Gym integration.)

In all honesty, this would be a super big deal. I know there is already a Rust binding for PyTorch, so why not Elixir!

I believe catering to both the PyTorch and TensorFlow deep learning frameworks will ultimately expand the Elixir community to academia, companies with existing Elixir code, and those on the fence about making the jump to Elixir.

Repeating an opinion that I wrote in another topic: I am SUPER excited about Nx. I think Elixir is a great language for expressing numeric computation such as linear algebra. The problem used to be that such great expressivity was not backed by the necessary low-level support to achieve good performance, but Nx promises to fix this.

Machine learning seems to be the most obvious application (not only neural networks, rather a good chunk of the whole field), but linear algebra and multi-dimensional vector computations have much wider applications than that: think signal processing, information theory, graphics, cryptography, graph algorithms, control theory, etc.

I would personally love to implement some ideas for recommender systems that I previously experimented with in other languages. I am also eager to see what the community will come up with :slight_smile:

3 Likes

Agreed big time.

I’m a huge Elixir advocate and have lots of experience with ML. Despite that… I can’t see myself venturing away from the Python stack until there is an incredibly compelling case to do so.

The ecosystem built around Python for ML is absolutely massive, and it’ll take a while for anyone to come close to matching it.

I’d like to try Nx out for a research project - but I worry about things like distributed training, unexpected bottlenecks, etc.

1 Like

It is important to set our expectations and goals accordingly. I believe it is completely unrealistic for Nx’s success to be measured by winning data scientists’ hearts or by matching Python’s ecosystem.

As I mentioned in the podcast, I would already consider a success if we change the answer to “Can Elixir be used for Machine Learning?” from a clear “No” to a “Maybe”. Maybe you are doing a prototype in Elixir and you can quickly hook an existing model instead of having to learn another language and tooling. Or maybe you are integrating ML models with Nerves to run on the edge. I believe those are attainable and it will already be a big success, simply because it was not possible to do any of this a month ago. Once we get to a “Maybe”, we can start discussing if, when, and how to make that a “Yes”.

As we saw in Elixir’s early days, the best way to adopt the language was to try it on small, well-defined problems and get a feel for it. I don’t think Nx is going to be any different. Try it on small and reasonably well-defined problems, so you can get an initial look and give us feedback, and decide how to move forward accordingly. Hopefully this will happen enough times for us to continue improving. :slight_smile:

27 Likes