Are Elixir and its ecosystem suitable for general-purpose use as a Go replacement?

Hey all! I’m new to Elixir and I absolutely love what I’ve seen so far. I’ve been looking for a functional programming language replacement for Go for a while and I’d like to get more opinions from people who’ve been using Elixir out in the wild.

My use cases are:

  • CLI apps
  • General purpose scripting for tasks like data processing, scraping etc.

I’ve read some posts/articles that mention pain points regarding deployment and building executables; however, most of these posts are several years old, and it looks like there are new tools that help alleviate those issues.

I’d love to hear your opinions.

Scripting and REST APIs are no problem. CLI apps are also possible (via escript), but the target system needs to have Erlang/Elixir installed. I don’t think that’s such a big problem, though; people these days don’t hesitate to install Ruby just to use Jekyll.
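For anyone new to the escript route, here is a minimal sketch (the project and module names are hypothetical); the trick of keeping the logic in a pure function separate from `main/1` makes the tool easy to test:

```elixir
# mix.exs would contain:  escript: [main_module: MyTool.CLI]
# and `mix escript.build` then produces a single ./mytool file
# (which still needs Erlang installed on the target to run).
defmodule MyTool.CLI do
  # Entry point required by escript.
  def main(argv) do
    argv |> run() |> IO.puts()
  end

  # Pure core, easy to test: parses flags and returns the output string.
  def run(argv) do
    {opts, args, _invalid} = OptionParser.parse(argv, strict: [upcase: :boolean])
    text = Enum.join(args, " ")
    if opts[:upcase], do: String.upcase(text), else: text
  end
end
```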

Not really. Go, C, Rust, and even Python are much better for that, for a simple reason: you do not need a whole VM to run such a tool (and Python is commonly available out of the box on many platforms). Additionally, startup times on the BEAM aren’t great, which means short-lived CLI tools make little sense, as startup is often the main reason such tools feel “slow”.

Again, not really, for the same reason: the slow startup.

If these tasks will be run as long-running processes, then probably yes. There are Elixir tools that make writing such applications much simpler and easier than in some other languages.

Anywhere you need a long-running, IO-heavy workload that can be split into a set of small, independent, and isolated tasks (a REST API is a perfect example of that), Erlang/Elixir will be a great choice.

Most of these exist because a lot of people come to Elixir from languages like Ruby or Python and compare it to the Go or Rust deployment cycle, where you either just dump the sources on the server or build one executable and dump that. Deploying Elixir is more like building C# assemblies (which resemble Erlang applications) and deploying them to Java application servers. For me it isn’t hard; you just need to understand that it is done in a slightly different way than you are used to. Erlang releases are more like Docker containers than Ruby on Rails deployments.
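For what it’s worth, `mix release` has been built into Elixir since 1.9 and smooths most of this over. A minimal release configuration in mix.exs (app name hypothetical) looks roughly like this; it is a config fragment, not a runnable program:

```elixir
# mix.exs: `MIX_ENV=prod mix release` then produces
# _build/prod/rel/my_app/bin/my_app, a self-contained directory
# that bundles ERTS, so the target machine does not need
# Erlang/Elixir installed (same OS/architecture required).
defmodule MyApp.MixProject do
  use Mix.Project

  def project do
    [
      app: :my_app,
      version: "0.1.0",
      releases: [
        my_app: [
          include_executables_for: [:unix],
          strip_beams: true
        ]
      ]
    ]
  end
end
```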

Erlang does not really have “executables”. The nearest thing we can get is an escript, but those are more like JARs: you have all the code needed for the application to run, but you still need a VM to run it.

Or a whole web browser to edit text. I call these people a “problem”, as they are the reason for software bloat in recent years.


As someone who builds CLI apps with Elixir, I cannot confirm a slow startup time. Maybe it’s noticeable if you execute them a thousand times one after another, but if it’s meant to be executed by a person, I don’t see a problem here.


If the application is run once, then this is not much of a problem, but a lot of CLI tools are designed for batch workloads and are often run many times (for example via find's -exec flag). Of course it is possible to write CLI tools in Elixir; I am just saying that for many of the workloads CLI tools are used for, it isn’t the best choice, as Elixir is focused more on client-server applications than batch processing.


Compared to a CLI app built in C, Rust, Go, etc., it’s a lot slower since it has to boot the whole VM before starting. On my machine a “Hello World” escript takes about 0.5 s.

Totally agree: if you are executing the script repeatedly in a hot path then it’s no good, but if it’s executed by a person it can be fine. For me it comes down to developer productivity, maintainability, familiarity, and the ability to leverage the ecosystem (Ecto, for example). Typically I can get non-trivial CLIs written much faster in Elixir.


I built my own static site generator using Elixir. I also often build little tools to convert files from one format into another, etc.

I once wrote a Golang CLI app that took a JSON file as input and generated a bunch of other files. Half a year later I discovered a bug (the JSON file structure had changed; thanks, Postman team) and was not able to fix it: following the data flow was impossible.


Me too, and I often do not need anything more than AWK, sed, and shell. I understand that for some people Elixir can be easier, but at the same time the language wasn’t designed for this. This koan is about Vim, but I think it fits here as well:

One night there was a storm, and Master Wq’s house collapsed. The next morning he began to build it again using his old tools. His novice came to help him, and they built for a while and were making good progress. As they worked, the novice began to tell Master Wq of his latest accomplishments.

“Master, I have developed a wonderful Vim script to give all sorts of useful information about a document. It counts the words, the sentences, the paragraphs, and even tells you what kind of document it is using the syntax highlighting rules. I use it in my pipelines all the time. It is a thing of beauty, and I am very proud. Truly, Vim is the greatest tool!”

Master Wq did not reply. Thinking he had unwittingly angered his master, the novice fell silent and continued his work.

The novice finished aligning two beams and had positioned a nail ready for beating into the wood, but found the hammer was out of reach.

“Would you pass me the hammer, master?”

Master Wq handed the novice a saw.

At once, the novice was enlightened.

This marks a few things:

  • Go is a bad language, and I do not like it (to put it nicely)
  • NIH is a problem; there is jq, which would probably let you achieve the same thing with greater ease
  • Unbounded mutability is bad, as it makes following the data flow hard at the very least

I use Elixir for scripting extensively, both within a project and standalone. The VM bootstrap is negligible in every case except the find -exec one mentioned above.

I’m giving a talk on Tuesday about this. I’ve also written a module that makes scripting much easier, handling arguments, dispatch, and help in a simple and compact way. It’s a work in progress, and I will follow up here on its availability if there’s interest.
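The module above isn’t released yet, so as a rough idea of the kind of argument handling and dispatch such a helper wraps, here is a hand-rolled sketch (module and subcommand names are hypothetical):

```elixir
defmodule ScriptKit do
  # Dispatch: the first argv element picks the subcommand,
  # the rest are its arguments; anything else prints help.
  def dispatch(["greet" | rest]), do: greet(rest)
  def dispatch(["sum" | rest]), do: sum(rest)
  def dispatch(_), do: help()

  defp greet([name]), do: "Hello, #{name}!"

  defp sum(numbers) do
    numbers
    |> Enum.map(&String.to_integer/1)
    |> Enum.sum()
    |> to_string()
  end

  defp help, do: "usage: script <greet NAME | sum N...>"
end
```

From an escript's `main/1` you would simply pipe `argv` into `ScriptKit.dispatch/1` and print the result.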


My recent use case was to load huge JSON logfiles, iterate over each line, parse the JSON, do something with the data (removing fields, generating hashes, etc.), and write to a new file. I guess it can be done with a combination of other tools, but I am not sure how maintainable and expressive that would be. Seeing the input format of some of these magic Linux tools, I start to shake.
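That workflow maps nicely onto `Stream`. A sketch, assuming the Jason library for JSON (not in the standard library) and hypothetical field names; `File.stream!/1` keeps memory flat no matter how huge the logfile is:

```elixir
defmodule LogScrub do
  # Lazily rewrites a JSON-lines logfile: drop secrets, hash the user id.
  def run(input_path, output_path) do
    input_path
    |> File.stream!()                        # lazy, line by line
    |> Stream.map(&Jason.decode!/1)
    |> Stream.map(&scrub/1)
    |> Stream.map(&[Jason.encode!(&1), "\n"])
    |> Stream.into(File.stream!(output_path))
    |> Stream.run()
  end

  # Drop sensitive keys; replace "user_id" with a hash if present.
  def scrub(entry) do
    entry
    |> Map.drop(["password", "token"])
    |> Map.replace_lazy("user_id", fn id ->
      :crypto.hash(:sha256, to_string(id)) |> Base.encode16()
    end)
  end
end
```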

The goal was to convert a Postman collection structure into API Blueprint documentation. I don’t think jq would do that. If I had known Elixir back then, it would have been so much easier to accomplish. Also, EEx is much better than the clunky Golang template engine.

Interesting. For my static site generator, I very much copied most of the Mix CLI logic.

Release announcement of my scripting module:
scripting release announcement


Elixir is still quite a specialized language, though not nearly as much as Erlang; at least for now, all the decisions I make in Elixir are BEAM-centric. This is, at the moment, my personal opinion.

Great. Writing them in Elixir always seems like a good idea.

Data processing?
I see 2 use cases:

  1. Stream processing / constant processing in an instance (data processing service?)
  2. Ecosystem-based decision: you’re processing something that comes from an Elixir app and goes to another Elixir app?

Otherwise I don’t see why you would use BEAM for this.


  1. Does it some way benefit from OTP architecture?
  2. Will it be called from a bigger BEAM app? Can it be part of it? If not, is the cost of running it on BEAM worth the joy writing Elixir gives me?

YES! :heart:


Massively. You can use Flow or even Broadway to make several small apps communicate with each other: App A can find links and post HTTP download commands to a queue, App B can do the download and hand the results to App C, which does the parsing and then hands them to App D, which indexes or stores the data…
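A stdlib-only sketch of that download/parse/store pipeline shape, using `Task.async_stream` (Flow and Broadway generalize the same shape with back-pressure, partitioning, and batching; the stage functions here are hypothetical placeholders):

```elixir
defmodule Pipeline do
  # Each stage fans out across cores; results stay in input order
  # because Task.async_stream is ordered by default.
  def run(urls, download, parse, store) do
    urls
    |> Task.async_stream(download, max_concurrency: 8, timeout: :infinity)
    |> Stream.map(fn {:ok, body} -> body end)
    |> Task.async_stream(parse, max_concurrency: System.schedulers_online())
    |> Stream.map(fn {:ok, parsed} -> parsed end)
    |> Enum.map(store)
  end
end
```

In a real crawler `download` would be an HTTP call, `parse` an HTML extractor, and `store` a DB insert; here they can be any functions.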

The transparent and very lightweight processes of the BEAM/OTP, plus the fault tolerance (demonstrated by the use of Supervisors and their various restart policies), make it an ideal fit for workflows like a persistently running crawler, a report generator, a PDF exporter, and many more.
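That supervision in miniature, with hypothetical module names: a crashed worker is simply restarted according to the strategy, with no action from the rest of the system.

```elixir
defmodule Crawler.Worker do
  use GenServer

  def start_link(opts), do: GenServer.start_link(__MODULE__, :ok, opts)

  @impl true
  def init(:ok), do: {:ok, %{}}
end

defmodule Crawler.App do
  def start do
    children = [
      {Crawler.Worker, name: Crawler.Worker}
    ]

    # :one_for_one restarts only the crashed child; other strategies
    # (:one_for_all, :rest_for_one) restart siblings too. The restart
    # budget below gives up after 5 crashes within 10 seconds.
    Supervisor.start_link(children,
      strategy: :one_for_one,
      max_restarts: 5,
      max_seconds: 10
    )
  end
end
```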


I believe @hauleth summarised it perfectly, so I’ll just echo some points:

  1. CLI apps. Eh, no. I’d reach for Go or Rust when writing those. I don’t appreciate the startup pauses of the dynamic languages, Erlang and Elixir included. If you expect the CLI app to get very complex and big and want the expressiveness and ease of management of the FP languages like Elixir, then use Elixir. If not, just use Go or Rust.

  2. General-purpose scripting is very far from data processing / scraping. Addressing them separately:

    • General-purpose scripting: IMO don’t. It really depends on what you want to do, but so far I haven’t found a use for Elixir as a quick scripting language. This is really 50/50 advice though; if you give us more examples, you’ll get a more precise answer.

    • Data processing / scraping: definitely yes. The OTP architecture lends itself very well to multi-stage workflows with a lot of parallelism involved. Not many other languages do this as well.

  3. REST APIs, GraphQL, generic web apps, API gateways: Elixir is a perfect fit in my experience. Elixir apps almost never lag, they are robust and mostly self-recovering (with a little extra effort on your side), and they are very fast, which is counter-intuitive when you consider that the language is dynamic. I rewrote two Rails apps in Phoenix and they sped up by 20x to 100x. I am not exaggerating; one of the two apps had a median response time of 280 - 320 ms, and when I finished the Phoenix rewrite its median response time dropped to 2.5 - 3.5 ms. Crazy.

  4. Deployment is still not as brain-dead easy as compiling a single executable in Go or Rust, but it’s definitely not as hard as many impatient bloggers out there make it sound. Yes, you will have to sink one weekend into it. After that you’ll do it in your sleep. :slight_smile:

Hope this helps you make an informed decision.


I don’t think all scrapers benefit from this. That’s a quite specific case.

In that scenario, you could run those commands from a long-running iex shell. Using a few utilities you can readily parallelize the process. For my IoT data processing I connect to a BEAM server on my machine, sync data, and run my conversion commands using a few Streams built from canned utility functions. Much nicer in the long run if you’re running batched workloads. Throw in DETS/Mnesia or a KV store and it’s easy to track progress too. It’s often worth the few extra characters compared to a find command.
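A sketch of that iex-driven batch style, with progress tracking so re-runs skip finished work (the table and conversion function are hypothetical; swapping the ETS set for DETS or Mnesia would persist progress across VM restarts):

```elixir
defmodule Batch do
  # `table` is a shared :public ETS set created once in the iex session;
  # each finished path is recorded there, so calling run/3 again only
  # processes the paths that haven't been converted yet.
  def run(table, paths, convert_fun) do
    paths
    |> Stream.reject(&:ets.member(table, &1))
    |> Task.async_stream(
      fn path ->
        result = convert_fun.(path)
        :ets.insert(table, {path, :done})
        result
      end,
      max_concurrency: System.schedulers_online()
    )
    |> Enum.map(fn {:ok, result} -> result end)
  end
end
```

From iex you would create the table once (`table = :ets.new(:progress, [:set, :public])`) and then call `Batch.run(table, paths, &convert/1)` as often as needed.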
