Lookup functions via metaprogramming vs ETS?

Hi alchemists!

I’m working on a library where I need to incorporate some static data from a text file (50,000 - 100,000 entries). The data is used to look up information. As mentioned, the data is static, so I don’t need the ability to change it at runtime. You can think of the data as a simple dictionary.

Now I’ve two possible solutions to handle this job:
The first solution, and my preferred one, is to use metaprogramming to build functions with a static argument in order to use pattern matching. I’ve tested this approach with just 1000 entries and it seems to work very well. There is already a similar topic (Using metaprogramming to create lookup functions) started by @uri with some additional information, but it doesn’t reach a final conclusion on whether this solution is better. :wink:
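To make the first approach concrete, here is a minimal sketch of generating one function clause per entry at compile time. The module name and the three inline entries are made up for illustration; in the real library the entries would be read from the text file (e.g. via `File.stream!/1`, with `@external_resource` so the module recompiles when the file changes):

```elixir
defmodule StaticLookup do
  # Stand-in data so the sketch compiles on its own; a real module
  # would build this list from the text file at compile time.
  @entries [{"apple", "fruit"}, {"carrot", "vegetable"}, {"basil", "herb"}]

  # One function clause per entry; the compiler turns these clauses
  # into an efficient pattern-match dispatch.
  for {key, value} <- @entries do
    def lookup(unquote(key)), do: unquote(value)
  end

  # Catch-all clause for unknown keys.
  def lookup(_key), do: nil
end
```

The `unquote/1` calls inside `def` are "unquote fragments", the standard Elixir idiom for injecting compile-time values into function heads.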

The other solution is to load the data into an ETS table. As I have already said, I don’t need to add/modify values during runtime.
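For comparison, a sketch of the ETS variant; the table name, the GenServer owner, and passing the entries to `start_link/1` are assumptions for illustration:

```elixir
defmodule EtsLookup do
  use GenServer

  @table :static_lookup

  def start_link(entries) do
    GenServer.start_link(__MODULE__, entries, name: __MODULE__)
  end

  @impl true
  def init(entries) do
    # read_concurrency: true suits a table that is written once
    # and then only read, possibly from many processes.
    table = :ets.new(@table, [:named_table, :set, read_concurrency: true])
    :ets.insert(table, entries)
    {:ok, table}
  end

  # Reads go straight to the named table, bypassing the GenServer.
  def lookup(key) do
    case :ets.lookup(@table, key) do
      [{^key, value}] -> value
      [] -> nil
    end
  end
end
```

The table needs an owning process (here the GenServer) because ETS tables are destroyed when their owner dies; lookups themselves don't go through the process, so reads stay concurrent.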

As I prefer the metaprogramming solution, I wonder whether there is an upper limit for this kind of generated function, and what kind of “strings” to use: charlists or ordinary Elixir strings (binaries)? What do you think is the better solution?

A simple benchmark of ETS vs Map lookups (Benchmarking lookup time for map vs ets) shows that maps are faster. Maybe this also applies to “static functions”?


Benchmarking it should give you the right answer. You can check the topic Defmap - embed maps into a module for faster/easier lookups. Feedback please for more discussion on exactly this.
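As a starting point, here is a rough, self-contained way to compare map vs ETS lookup cost using only `:timer.tc/1` from the standard library (a proper tool like Benchee gives far more reliable numbers; the entry count, keys, and table name below are made up):

```elixir
# Build 50k entries and load them into both a map and an ETS table.
entries = for i <- 1..50_000, do: {"key#{i}", i}

map = Map.new(entries)
table = :ets.new(:bench_lookup, [:set, read_concurrency: true])
:ets.insert(table, entries)

# Time 100k repetitions of a single lookup, in microseconds.
run = fn fun ->
  {micros, _result} = :timer.tc(fn -> Enum.each(1..100_000, fn _ -> fun.() end) end)
  micros
end

map_us = run.(fn -> Map.get(map, "key25000") end)
ets_us = run.(fn -> :ets.lookup(table, "key25000") end)

IO.puts("map: #{map_us}µs, ets: #{ets_us}µs")
```

Note that ETS lookups also copy the stored term into the calling process, which is part of why map (and pattern-match) lookups tend to win for read-only data.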


It depends on how often you call it. We had some performance problems initializing 10k DateTime structs from Timex.to_datetime with a timezone, because each call had to do timezone lookups in ETS.

Timex uses the tzdata library for that; an older version of tzdata used a compiled-in solution, while the newer version switched to an ETS table.

See https://github.com/lau/tzdata/issues/31 for the performance difference.
