Erlang_python - Run Python across your Elixir cluster

Because it runs on the BEAM, you inherit Erlang/Elixir distribution for free.
Run Python on any node:

# Execute on remote node
:rpc.call(:"worker@host", :py, :call, [:numpy, :dot, [matrix_a, matrix_b]])

# Local call
{:ok, result} = :py.call(:sklearn, :fit, [model, data])

# Async task
{:ok, ref} = :py_event_loop.create_task(:heavy_compute, :run, [data])
# ... do other work ...
{:ok, result} = :py_event_loop.await(ref)

Key features:

  • Distributed by default via rpc:call
  • Async Task API (uvloop-inspired)
  • Channel API for bidirectional streaming
  • OWN_GIL subinterpreters (Python 3.12+)
  • Virtual environment management
  • ASGI/WSGI support
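The ASGI/WSGI bullet presumably means the runtime can host standard Python web callables. The announcement doesn't show the hosting API, but for reference this is the shape of a minimal WSGI app (PEP 3333) that such a feature would accept; a plain-Python sketch, independent of erlang_python:

```python
# A minimal WSGI application (PEP 3333): the kind of Python callable an
# embedding runtime advertising "WSGI support" would be expected to host.
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    body = f"Hello from {environ.get('PATH_INFO', '/')}".encode()
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Exercise it with a stdlib-populated environ (no sockets needed).
environ = {}
setup_testing_defaults(environ)   # fills in PATH_INFO, SERVER_NAME, etc.
captured = {}

def start_response(status, headers):
    captured["status"] = status
    captured["headers"] = headers

result = b"".join(app(environ, start_response))
print(captured["status"], result.decode())
```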

erlang_python 2.1 is out with async tasks and channel-based streaming.

Hex: erlang_python
GitHub:

Apache 2.0 licensed.

9 Likes

I’m curious about how this is different from pythonx?

1 Like

erlang_python embeds Python with true parallelism via sub-interpreters (each with its own GIL) or free-threading (no GIL). You get bidirectional communication: channels, async/await, and Python calling back into Erlang. Pythonx is designed for Livebook/Elixir with a single Python interpreter and shared GIL, which is enough for notebooks but limited for concurrent production workloads.

AFAIK erlang_python is built to run Python applications inside the Erlang VM so they inherit its efficiency.

2 Likes

Nice library!

I tried to read the code, but the project is huge and the NIF part alone is around 15k lines of C code (that’s around 5% of the size of the whole OTP C codebase).

I think it was generated, given that the project grew to 90k+ lines in a month from you alone. Given that, it is strange to see it is already at version 2.1.0. To me it feels more like a pre-1.0.0 project: it was developed rapidly and, I assume, has not been used by anybody except you yet.

Do you have any benchmarks? I am curious how it compares to Snex, which uses Erlang ports to spawn multiple interpreters and has a very tiny NIF surface. I would expect erlang_python to be faster.

Do you use it in production? What’s your use-case?

2 Likes

You can run the benchmarks in the examples folder:

➜  erlang-python git:(main) ✗ escript examples/bench_channel_async.erl

========================================
Channel Benchmark: Sync vs Async
========================================

System Information:
  Erlang/OTP: 28
  Python: 3.9.6 (default, Dec  2 2025, 07:27:58)
[Clang 17.0.0 (clang-1700.6.3.2)]

Python channel helpers ready.

--- Sync Channel Benchmark ---
(Erlang send + NIF try_receive - pure Erlang)

    Size |   Throughput |     Avg (us)
--------------------------------------
      64 |     10416667 |         0.10
    1024 |      6180470 |         0.16
   16384 |       833472 |         1.20

--- Async Task API Benchmark ---
(py_event_loop:create_task + await using stdlib)

      Operation |   Throughput |     Avg (us)
--------------------------------------------
      math.sqrt |       101174 |         9.88
     concurrent |       374111 |         2.67

--- Sync vs Async Comparison ---
(Channel operations: NIF sync vs py:call)

Message size: 1024 bytes, Iterations: 1000

         Method |    Time (ms) |   Throughput
---------------------------------------------
       NIF sync |         0.20 |      5102041
   py:call sync |         4.38 |       228154
     async task |         8.10 |       123426
    spawn batch |         3.38 |       295508

NIF sync is 22.4x faster than py:call
NIF sync is 41.3x faster than async task
Spawn batch is 2.4x faster than sequential async

========================================
Benchmark Complete
========================================
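One way to read the table: the Throughput and Avg (us) columns are reciprocals, and the 4.38 ms on the py:call row is the total for 1000 iterations, i.e. about 4.4 µs per call. A quick cross-check of the reported numbers:

```python
# Sanity-check the benchmark output above: ops/sec and average
# microseconds per op are reciprocals, so 1e6 / throughput should
# reproduce the reported avg column for each row.
rows = [
    # (label, reported throughput in ops/sec, reported avg in us)
    ("sync channel, 1 KiB",   6_180_470, 0.16),
    ("sync channel, 16 KiB",    833_472, 1.20),
    ("async math.sqrt task",    101_174, 9.88),
    ("py:call sync, 1 KiB",     228_154, 4.38),  # 4.38 ms / 1000 iters
]

for label, throughput, avg_us in rows:
    implied = 1e6 / throughput
    assert abs(implied - avg_us) < 0.05, (label, implied)
    print(f"{label}: 1e6/{throughput} = {implied:.2f} us "
          f"(reported {avg_us} us)")
```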

It’s used in an upcoming product, and you can find it used in barrel_embed and some other projects like hornbeam. Recently I have mostly been merging and fixing things. I also bump the version when I introduce a breaking change. The moment I start using it in supported products, it becomes 1.0.

1 Like

Oh, so it takes around 5µs to do py:call(math, sqrt, [2.0])! That’s very impressive. By the way, https://hornbeam.dev/ looks very cool and promising too!

I will try to play with it locally on my machine some time soon

1 Like

This is really cool — there’s a lot of Python+Elixir crossover happening right now.

I built a Python FFI for Elixir: https://github.com/nshkrdotcom/snakebridge

Then put together a sample project to test whether pooling Python workers from the BEAM side buys anything over straight Python. Turns out non-GIL Python is pretty capable on its own: https://github.com/nshkrdotcom/slither

That said, the whole stack is still a prototype — glued together with gRPC and architecturally rough. Feel free to steal anything useful. Really glad to see your project.

1 Like