Best practice for distributing largely static data from a database?

In an effort to decrease how frequently a database containing essentially static data is touched, we build an in-memory copy of the data when the app initializes and pass it around in our code. However, this feels like an antipattern. What are some examples of a more effective strategy, if any?

You can just let it hit the db, cache it in an ETS table, or pop it in persistent_term. Note that there is a caveat to persistent_term: updating or deleting a term triggers a global GC across all processes, so it's best suited for data that rarely changes.
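A minimal sketch of the persistent_term approach (module and key names here are hypothetical): load the static rows once during application start and read them anywhere without copying.

```elixir
defmodule MyApp.StaticData do
  @key {__MODULE__, :rows}

  # Call once during application start. Calling this again later would
  # trigger the global GC mentioned above, so avoid frequent updates.
  def load(rows) when is_list(rows) do
    :persistent_term.put(@key, Map.new(rows))
  end

  # Reads are cheap: :persistent_term.get/1 does not copy the term
  # into the caller's heap.
  def fetch(id) do
    Map.fetch(:persistent_term.get(@key), id)
  end
end
```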

There are caching libraries (Cachex, ConCache, etc.), and it is simple to write your own basic ETS-based cache.
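For reference, a bare-bones ETS cache sketch (all names hypothetical): a GenServer owns the table so it lives as long as the server, while reads and writes go straight to ETS and bypass the server process entirely.

```elixir
defmodule MyApp.Cache do
  use GenServer

  @table :my_app_cache

  def start_link(opts \\ []) do
    GenServer.start_link(__MODULE__, opts, name: __MODULE__)
  end

  @impl true
  def init(_opts) do
    # :public + read_concurrency lets callers read/write without
    # serializing through this process.
    table = :ets.new(@table, [:named_table, :set, :public, read_concurrency: true])
    {:ok, table}
  end

  def put(key, value), do: :ets.insert(@table, {key, value})

  def get(key) do
    case :ets.lookup(@table, key) do
      [{^key, value}] -> {:ok, value}
      [] -> :error
    end
  end
end
```

Libraries like Cachex add TTLs, eviction, and stats on top of this pattern; roll your own only when your needs stay this simple.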


I’m not sure how this is an anti-pattern.

I’ve recently moved 80% of the data from PostgreSQL into SQLite. Having the data in SQLite also means that a copy can be deployed on every app server, so the load on the main database is decreased. This is an easy way to improve performance.
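One way to wire that up, as a sketch: point a second Ecto repo at a local SQLite file on each app server (this assumes the `ecto_sqlite3` adapter; the module and path names are made up for illustration).

```elixir
# A separate read-only repo for the static data, distinct from the
# main PostgreSQL repo.
defmodule MyApp.StaticRepo do
  use Ecto.Repo,
    otp_app: :my_app,
    adapter: Ecto.Adapters.SQLite3
end

# config/runtime.exs — each app server ships with its own copy of the file.
config :my_app, MyApp.StaticRepo,
  database: "/var/lib/my_app/static_data.db"
```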

I think any strategy really depends on the size / access pattern of the data though.


“Anti-pattern” is an expression used by consultants selling books and courses! A good percentage of the time you should not listen to them.

Semi-joking aside (because obviously there’s also a lot of solid programming advice out there), if it feels like it’ll do the job, there is nothing “anti-pattern” about it. I knew a guy who was reading data from CSV files and converting it to HTML and JSON. He made 50K EUR in 3 months because he knew the right people who needed that. A lot of programmers would cringe at his code. Well, good thing that nobody asked them. :003:

So yeah, man, do whatever is best for your project! Don’t be shy to ask for concrete advice (as you did before) and just produce the code that makes your life easier.
