Cost of running Gigalixir in Production

Hello all, I am wondering if anyone here is using Gigalixir in production and:

  1. What your setup is
  2. How much you pay
  3. What level of user activity/usage it supports (concurrent users, for example)

Let’s take a basic chat application as an example using Phoenix and websockets.

The free tier supports 1 instance, 1 database (I read somewhere this means a pool size of 1), and 200 MB of memory.

Here is the pricing: [pricing table screenshot]

How would we estimate the cost for a chat app (and the hardware needed according to Gigalixir pricing model) that has:

  1. 100 concurrent users sending 1 message per 5 minutes (connected through Phoenix Websockets)
  2. 1000 concurrent users sending 1 message / 5 minutes
  3. 1000 concurrent users sending 1 message / 10 minutes
  4. 10,000 concurrent users sending 1 message / 5 minutes
  5. 100,000 concurrent users sending 1 message / 5 minutes
  6. 1M concurrent users sending 1 message / 5 minutes

This would help me get a much better understanding of how it works than the current pricing estimator which just dials up the hardware and cost.
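For perspective, the aggregate message rate implied by each scenario can be worked out with simple division. A quick sketch (purely illustrative; the per-user rates come from the list above):

```python
def messages_per_second(users, minutes_per_message):
    """Aggregate message rate if every user sends 1 message per interval."""
    return users / (minutes_per_message * 60)

# The six scenarios from the list above:
for users, minutes in [(100, 5), (1_000, 5), (1_000, 10),
                       (10_000, 5), (100_000, 5), (1_000_000, 5)]:
    rate = messages_per_second(users, minutes)
    print(f"{users:>9,} users @ 1 msg/{minutes} min -> {rate:>8,.1f} msg/s")
```

Even the 1M-user case works out to only about 3,333 messages per second in aggregate, which is why holding the connections open (memory) tends to dominate over message throughput (CPU) in this kind of estimate.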


Doesn’t this depend entirely on your application’s performance, though? If the messages are simply routed to end users, that will require very few resources at each of those levels; but if you’re also persisting everything to the DB, that imposes additional work, and if you’re doing highly complex processing, that may require yet more hardware for the same number of users.


I was wondering the same thing. I came across Gigalixir while I was reading the Elixir and Elm Tutorial book. I ran into an issue while deploying to Heroku, and searching for a solution led me to Gigalixir. I’ve heard that Heroku gets expensive.

Good point. Let’s assume for this exercise that all the messages are persisted to the DB, but there’s no very complex processing. I’m just trying to get a general estimate/range here, not an exact number. I know the numbers will vary depending on the situation.

The Phoenix post might give a sense of the memory usage required to keep that many connections open.


I’ve read this: 128 GB and 40 cores to reach 2M connections. It doesn’t say how many messages are being sent or how often. Or am I missing something?

Anyway, if we do straight division, that works out to 64 MB per 1k connections.

Gigalixir charges based on database size, number of replicas, and memory. So if all we do is dial up the memory without changing the other two, it would look like this:

  1. Free for 1k connections (64 MB < 200 MB)
  2. $50/month for 10k connections (640 MB < 1 GB)
  3. $400/month for 100k connections (6.4 GB < 8 GB)
  4. $1,600/month for 500k connections (32 GB)
  5. Contact them for higher amounts of memory

Again, we don’t know how many messages were being sent in the 2M-concurrent example, so we can’t tell whether it matches the scenarios in my first post.
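To make the back-of-envelope math above explicit, here’s a sketch. It assumes the ~64 MB per 1k connections figure (128 GB / 2M from the benchmark), a 200 MB free tier, memory tiers that double (1, 2, 4, 8, 16, 32 GB), and a rate of roughly $50 per GB per month inferred from the prices quoted above:

```python
MB_PER_1K = 64    # ~128 GB / 2M connections, from the benchmark
FREE_MB = 200     # free-tier memory
USD_PER_GB = 50   # implied by the $50/1 GB ... $1,600/32 GB tiers

def estimate(connections):
    """Return (memory_mb, monthly_cost_usd) for a connection count."""
    mb = connections / 1000 * MB_PER_1K
    if mb <= FREE_MB:
        return mb, 0
    gb = 1
    while gb * 1024 < mb:  # memory tiers double: 1, 2, 4, 8, 16, 32 GB
        gb *= 2
    return mb, gb * USD_PER_GB
```

For example, `estimate(100_000)` gives 6,400 MB and $400/month, matching item 3 in the list above. Obviously this ignores database size, replicas, and actual per-connection memory, so treat it as a rough floor, not a quote.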


I was looking at their cost from a database perspective. I like the idea of “Heroku for Phoenix,” but they seem to be targeting companies that can pay, rather than early product development where you don’t yet know if you have a market. I’m building a product that aggregates a number of external APIs and gives the user a dashboard, so there are lots of background jobs and the data could grow. The data is served from a JSON API, and the web server serves a single page to clients and multiple pages to admins.

My comparison:

  • Heroku: easy setup and enough room in the free Postgres add-on to get started. I’ll need to upgrade to the $7/month plan to host the URL.
  • Gigalixir: the free Postgres is really small and doesn’t allow many connections at all. I’d need to bump up to at least the $25/month tier from the start.
  • Digital Ocean: a droplet with 20 GB is $5/month. It’s really underpowered, but Phoenix uses resources so well that I’m interested in giving it a try.

I wouldn’t compare DO to Heroku or Gigalixir because they are very different products. DO provides very affordable, flexible access to solid infrastructure but it takes work to make the most of it. A PaaS does the heavy lifting for you and makes it a breeze to start deploying, but you get less oomph for your money.

You’ll always get more out of a VPS like Digital Ocean than a PaaS. I’ve been running a small Phoenix app with modest traffic on a $5 per month DO droplet and it barely moves the needle.

Getting a deployment setup sorted can still be intimidating, and it might take some practice to get right; but once it’s done, the result is just as easy as using a platform, and you’re free to use any infrastructure/hardware you like. (Just my tuppence.)


Nice. I definitely will give DO a try.

For what it’s worth, I’ve been running on Gigalixir’s free plan for a few weeks now.
It’s a Nerves app that streams a base64 JPEG over Phoenix channels about 25 times a second and broadcasts it to however many clients are connected at once. Not efficient or scalable at all, but I have yet to hit any hiccups.


As an update, I’ve been procrastinating a ton on getting my app deployed to a server.

As someone who has only used Heroku in the past, I decided to just give the free tier of Gigalixir a try and I must say that it was a total breeze to deploy to compared to what I needed to get running on my Digital Ocean cluster. Having my app online now is super encouraging and gives me a small sense of accomplishment to keep going and learning more.

My current plan is to continue to use the free tier of Gigalixir until my app exceeds 10k connections (or until I notice any performance roadblocks/limitations).

I actually don’t think setting up Digital Ocean is that laborious; it just feels risky, and I’m concerned something will go wrong since I haven’t worked with it before.


I’ve had nothing but good experiences with Gigalixir. Great support on Slack too.

I also got things working on Digital Ocean, a little more work but not too bad.