Run Elixir/Livebook on Google Colab

Google Colab is a great resource for learning and practicing machine learning. Rather than running an ML stack on our local machines, we can use Colab, and it's mostly free. (There are other services out there, such as Kaggle, that offer something similar.)

I did a bit of searching but couldn't find any references to running Elixir on Google Colab or to using Livebook there. Does anyone know if there are any plans in this direction?

I know Fly.io offers Livebook hosting, but I'm guessing we can't pick the hardware it runs on (e.g. a GPU/TPU) the way we can with Google Colab.

Here is a discussion about the general topic of sandboxing Elixir for Livebook:

I’ve had some success running Livebook on Colab and Kaggle. The trick is to start Livebook, then expose the port it serves on (8080 by default) using ngrok. I’ll share a link to the notebooks when I’m back at my desktop.

The main downside is how long it takes to install everything, but the precompiled EXLA should speed this up.
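For anyone trying this, here’s a minimal sketch of what the Mix.install could look like once you rely on the precompiled packages from Hex instead of building everything from source (the package versions and the XLA_TARGET value are placeholders, so match them to whatever CUDA version the Colab/Kaggle image ships with):

Mix.install(
  [
    # Hex releases of Nx/EXLA download a precompiled XLA archive,
    # so the heavy XLA build is skipped entirely.
    {:nx, "~> 0.6"},
    {:exla, "~> 0.6"}
  ],
  system_env: [
    # Placeholder: pick the CUDA build matching the runtime's CUDA version.
    {"XLA_TARGET", "cuda118"}
  ],
  # Make EXLA the default backend for Nx tensors.
  config: [nx: [default_backend: EXLA.Backend]]
)

EXLA still compiles its NIF locally, but the XLA binary itself is downloaded rather than built, which should cut the install time considerably.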


@woohaaha, check this Kaggle example out.

Funnily enough, this thread inspired me to try this out last year (thanks!); I just never got back to it.


Here’s the Colab example; I’m not sure it still works, as it’s much older than the Kaggle example.


Importantly, you need to provide CUDA options in your Livebook when installing dependencies:

Mix.install(
  [
    {:axon, "~> 0.1.0-dev", github: "elixir-nx/axon", branch: "main"},
    {:exla, "~> 0.2.0", github: "elixir-nx/nx", sparse: "exla", override: true},
    {:nx, "~> 0.2.0", github: "elixir-nx/nx", sparse: "nx", override: true},
    {:scidata, "~> 0.1.5"}
  ],
  system_env: [
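    # Target the CUDA (GPU) builds instead of the CPU-only defaults;
    # "cuda111" here means CUDA 11.x.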
    {"XLA_TARGET", "cuda111"},
    {"EXLA_TARGET", "cuda"},
    {"LIBTORCH_TARGET", "cuda"}
  ]
)

Lastly, you can import this Livebook autoencoder gist for testing.


If this helps you out and you find a free way to persist the notebooks on Kaggle, please share.
