How much RAM does your dev machine have?

After seeing some of the responses in this thread, I’m curious how much RAM you’re opting for and whether there was a specific reason for choosing that amount - if so, please let us know in the thread.

How much RAM does your main developer machine have?
  • 2GB (or less)
  • 4GB
  • 6GB
  • 8GB
  • 12GB
  • 16GB
  • 24GB
  • 32GB
  • 48GB
  • 64GB
  • 96GB
  • 128GB
  • 256GB
  • 512GB
  • More than 512GB
  • Other - please say in thread
0 voters

I’ve tried to include the most common options, but if yours is missing please select other and let us know in the thread.

Maybe more than one choice should be allowed. For example, I have two dev machines.

5 Likes

Go with what’s in your main/preferred machine :023:

1 Like

I need an option that says 36

4 Likes

I should add that RAM is usually the first thing I upgrade, but I had to buy this (16GB) machine out of necessity as a make-do, as I’ve got a lot to catch up on and my old Mac had just become unusable (thanks Apple!). I figured I could sell my old Mac for around the same amount, so I would have gotten it for free anyway. However, I have been very impressed with the machine for general use and general dev use, so if anyone needs a machine just for that and wanted to try a Mac, don’t be afraid of getting a 16GB model :023: (I will probably revert to a 64GB machine when I need to work with graphical software - though again, this 16GB machine is even ok at that).

Ah sorry! I looked on Crucial’s site and the Apple site and added all the options I saw. You could pick 32GB if you feel it is close enough. The poll is mainly to give people a general idea of what others are using or able to use without issue, to help them with their own purchasing decision.

2 Likes

In Gentoo, depending on the package, compiling can require as much as 2GB of RAM per CPU thread. In the worst case, on a Ryzen Threadripper PRO 7995WX (96 cores, 192 threads), that means at least 384 GB of RAM (6 x 64 GB). :exploding_head:

Of course you also need to keep a few GB free for the DE and other apps/services, but on the other hand you can limit the number of parallel compile jobs … :bulb:
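
For example, a minimal /etc/portage/make.conf sketch of capping the job count (the numbers are just placeholders - tune them to your own core count and RAM, roughly 2 GB per job):

```
# /etc/portage/make.conf (sketch - numbers assume ~2 GB of RAM per make job)
MAKEOPTS="-j8 -l8"                                # at most 8 parallel make jobs, back off above load 8
EMERGE_DEFAULT_OPTS="--jobs=2 --load-average=8"   # limit how many packages emerge builds in parallel
```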

3 Likes

Finally, a place in the polls for one of my unorthodox configurations. :smiling_face_with_three_hearts:

I had to get 48 GB of RAM… 32 wasn’t enough. I use NeoVim + tmux and a lot of tmux panes… ~400 MB per NeoVim instance (due to the Elixir LSP running on each one) really adds up. Plus, after leaving Firefox open, the RAM usage starts to crawl up and up, hovering around ~36 GB total usage after a while. But I still haven’t needed to upgrade to 64 GB yet. :wink:

1 Like

I have 16GB and I almost never hit the memory limit. Most of the time usage stays below 40%. I develop some fairly big backends with LSPs in different languages, I run lots of different browsers simultaneously, and I use memory-leaking and inefficient desktop apps (like telegram-desktop). Sometimes I run neural networks on an eGPU, but they mostly use graphics memory, which is unrelated to the topic. I also have a QubesOS-like setup where I run some apps (like browsers or video games) in virtual machines, but even then memory stays within a 2GB-per-VM limit.

I used to work with a 4GB machine and it was just fine, though sometimes I would have to close old tabs or reload some leaking websites.

I have no idea why anyone would need a setup that consumes more than 16GB (even with neural networks).

For example, right now I have ~20 tabs open in Firefox, 3 tabs in Chrome, 2 big Elixir projects with LSP servers and one PHP project with the phpactor LSP server, plus mpv playing streaming music and ayugram-desktop. Total memory consumption is 4.5GB (2GB of which is consumed by awful websites like Slack, Linear, Notion, Bitbucket).

1 Like

For those of you with more than 64GB of RAM (looking at you @cmo @DaAnalyst @kokolegorille @patrickdm @Sorc96 @sbuttgereit) what made you opt for that much? Was there any specific reason?

1 Like

I do a lot of SCADA and PLC work, i.e. running software that likes CPU and RAM. I often have several VMs running. It’s probably more than I need but better than being near the limit of what I need.

I think my Windows laptop has 16gb and M3 MacBook has 32gb.

1 Like

In the last three years I switched most of my activities from web development to bioinformatics, where RAM availability is a primary factor for many kinds of analysis.
I originally had 32 GB in my dev PC, which was nearly enough but often limiting.

1 Like

I use a MacBook Pro with 18GB of unified memory.

I chose that amount because it seemed sufficient for my development workloads at the time of purchase. I develop a Phoenix server and some Docker containers for testing, and I’ve run out of memory once or twice when memory-hungry applications consumed too much. :blush:

1 Like

Several reasons, but mostly the following:

  1. Never need a swap file
  2. Can run a server instance or two in parallel while developing
  3. 32GB allocated to a VM running Windows for some native apps that don’t exist for Linux

1 Like

I bought 32GB for this MacBook Pro because it was double what I had before in my Surface Book 2. For my next machine, I would probably get at least double again, maybe even 128 GB if I could afford that much at the time. But, other than running some local LLMs or VMs, I don’t really need much more than I currently have.

1 Like

There are several rationales (rationalizations, perhaps) that have gotten me into larger than normal machines.

In my consulting work, I’ll oftentimes be working on multiple projects with multiple clients with differing technical demands (from non-technical management work to data-heavy technical work). So one thing I like to do is set up a virtual machine for each client where I can keep data, administrative work products, etc. reasonably segregated and secured, since sometimes the data can be sensitive. There are times when I’m working with multiple clients at once, so having multiple client VMs running and ready to switch to is helpful; and since I often want each environment to be robust in its own right, the bare metal has to be substantial enough to service those environments.

When I am doing technical work, I’m often working with largish data sets… at least large as compared to personal-workstation standards. For example, I’ll be called on to do ERP data migrations in support of new ERP implementations or corporate mergers where systems are merging. While the final data migrations are not typically done on my equipment, my data conversion activities in advance of cutover are, unless the data is simply too large. In the last migration I did, for example, I needed to create a combined data migration and bi-directional data integration layer between a legacy SQL Server version of their application and a new PostgreSQL-based version. All of the real development work happened on my workstation, where I set up a SQL Server VM, a PostgreSQL server VM, and a VM to host the integration software I was using; a reasonable equivalent of the final cloud production environment, but one where I had access to more knobs/dials/logs to really understand the dynamics. The dataset (to that point) was 10-20GB. I was able to get a reasonable end-to-end picture and reasonable modelling of whole-system performance (which was important) prior to bringing it all to the cloud PaaS brain damage that would complicate things.

There are non-development reasons, too. All of my formal education, and even my work for a time when I was young (many years ago), is in music composition (specifically film music composition). So… I will sometimes use this workstation as a DAW along with various other composition tools like notation software… the various sample libraries I use can be very RAM hungry. Also, I like being able to have Blender available, though I’ve only really toyed with it. Some of this is admittedly aspirational right now since I don’t have enough free time, but assuming a long enough dry spell comes my way, I’ll have some major works of art (LOL) in progress.

Anyway, there’s also a strong case that I really, really don’t need all of this and I’m just satiating my “I never want to buy another computer again” urges.

5 Likes

I too have a MacBook Pro with an odd 18 GB of RAM; it was the minimum option when going for the M3 Pro chip.

2 Likes

I upgraded from 64 to 128 because I test a lot of AI models locally.

Client data should never be exposed to the cloud, so I need a good setup just to run different AI pipelines.

4 Likes

Thanks for the responses everyone - very interesting!

Ah that’s awesome! I used to have a little home-studio when I lived with my parents and it’s something I’ve wanted to get back into for a while. Which DAW do you use?

Do you find much difference between models below/above 64GB?

When @TimButterfield and I were testing DeepSeek models, I felt that the 4GB model produced better results than the 8GB version. Curious whether you’ve noticed much difference between the 64GB vs 128GB models?

1 Like

For DAW work, I use Reaper. Sometimes it feels like the musical equivalent of Emacs (for better and worse), but the feature set for the price makes the value proposition very difficult to beat. For notation I use Dorico… I was a long-time Finale user (from version 1 in the late ’80s), but that product was discontinued several months ago.

1 Like

Pretty much just because it’s possible. I went from an ancient ThinkPad with one DDR3 slot to a Framework, which has two DDR5 slots, and my boss just thought it would be funny to get two 48GB sticks. I haven’t really found a use for all that memory.

1 Like