My colleagues and I have run into challenges with atoms and tuples in our Elixir codebases, especially when trying to model domain states or structured data clearly.
Atoms are handy but can also be annoying. They’re global, easy to typo, and need extra tests to ensure correctness. On top of that, converting strings to atoms safely isn’t straightforward (and can be unsafe or unreliable even with String.to_existing_atom/1).
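To illustrate the conversion problem (a minimal sketch; the string values are arbitrary):

```elixir
# String.to_atom/1 creates a new atom for every unseen input. The atom table
# is limited and never garbage collected, so calling this on untrusted
# input can eventually crash the VM.
_dynamic = String.to_atom("user_supplied_value")

# String.to_existing_atom/1 only succeeds if the atom already exists,
# which protects the atom table but raises ArgumentError otherwise —
# and "already exists" can depend on module load order.
:ok = String.to_existing_atom("ok")
```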
I had an idea to address this and built a PoC library, but to be honest I’m not entirely sure whether the community would like it. In particular, I’m curious: if you saw this in a codebase, would it make the code easier or harder to understand?
Here is my PoC (it is called Enuma and is available on Hex.pm if you want to try it out):
defmodule Shape do
  use Enuma

  defenum do
    item :circle, args: [float()]             # Circle with radius
    item :rectangle, args: [float(), float()] # Rectangle with width and height
    item :triangle
  end
end
circle = Shape.circle(5.0) # => {:circle, 5.0}

require Shape

case shape do
  # Usable in pattern matching because they are compile-time macros
  Shape.circle(r) -> :math.pi() * r * r
  # The clause above is equivalent to:
  # {:circle, r} -> :math.pi() * r * r
  Shape.rectangle(w, h) -> w * h
  Shape.triangle() -> raise "Triangle Area Not Supported!"
end
Generally, what Enuma does is generate macros based on the enum definition. It also generates is_* guards that can be used in “when” clauses.
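For readers unfamiliar with guard generation, here is a hand-rolled sketch of the kind of guard such a macro could expand to (the is_circle name and the tagged-tuple shape are my assumptions, not necessarily Enuma’s exact output):

```elixir
defmodule ShapeGuards do
  # Roughly what a generated is_circle/1 guard might expand to:
  # a guard-safe check that the value is a {:circle, _} tagged tuple.
  defguard is_circle(value)
           when is_tuple(value) and tuple_size(value) == 2 and
                  elem(value, 0) == :circle
end

defmodule Area do
  import ShapeGuards

  # The guard lets callers dispatch in a `when` clause instead of
  # matching the raw tuple shape inline.
  def of(shape) when is_circle(shape) do
    {:circle, r} = shape
    :math.pi() * r * r
  end
end
```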
Back to the question
Here’s the thing: Enuma does slightly alter how you’d typically write Elixir code, which might make some developers uncomfortable or lead to mixed styles in codebases.
I’m curious:
Do you think libraries like Enuma enhance clarity, or do they risk complicating Elixir’s simplicity?
Would Enuma fit well into your projects, or does it feel too different?
How do you currently handle structured domain modeling in your code?
If you compare it with Ecto.Enum, would you see useful benefits for your use cases?
I am not sure you should do that to begin with. What is the use case where you are required to convert strings to atoms on a regular basis?
I’m not a big fan of using functions generated by macros in application code; it doesn’t play well with the LSP, and as the modules become more complex it becomes a source of confusion. Maybe you could redesign the code to work with plain function calls instead:
circle = Enuma.new(Shape, :circle, [5.0])

case shape do
  Enuma.match(Shape, :circle, [radius]) -> ....
end
The way I did it a few times when I required enums was with functions:
def circle(radius) when is_float(radius), do: {:circle, radius}

case shape do
  {:circle, radius} -> ....
end
While the macro version looks more organized, I think simple functions are superior: you don’t tie your enum functions to a module, refactors become easier, and you can cram an unlimited number of functions into a module. I am not a big fan of how functionality like defstruct couples with the module, but at the same time I am not entirely sure what a readable alternative would look like.
As it currently stands, it seems too alien for most of the codebases I worked with. Nonetheless, I think such a library might be a great addition to the ecosystem.
Good news for you all my fellow alchemists - it is built in:
defmodule Shape do
  import Record

  defrecord :circle, radius: nil
  defrecord :rectangle, width: nil, height: nil
  defrecord :triangle, []
end
circle = Shape.circle(radius: 5.0) # => {:circle, 5.0}
require Shape
case shape do
  # Can use them in pattern matching as they are compile-time macros
  Shape.circle(radius: r) -> :math.pi() * r * r
  Shape.rectangle(width: w, height: h) -> w * h
  Shape.triangle() -> raise "Triangle Area Not Supported!"
end
Yours kind of does, but I am not a fan of the dot syntax. I don’t see other possibilities, though. It might indeed be something that confuses the LSP; no idea, I haven’t tried. It just feels off; it’s forced onto a language and runtime that were not made for it.
It’s not too different at all; it’s just about making people use something else, which turns out to be super difficult with Elixir. A lot of people are already comfortable doing things a certain way. I would give it an honest go, however, once or twice.
Structs + strict validations. When you are NOT on the edge of your system, it then becomes a simple matter of pattern-matching on a struct, because all the functions that produce them (i.e. after parsing complex JSON trees) ensure many properties. Works pretty well. Though I do agree the lack of sum types in Elixir sticks out like a sore thumb.
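A minimal sketch of that struct-plus-constructor approach (the module and field names are just illustrative, not from any particular codebase):

```elixir
defmodule User do
  @enforce_keys [:name, :age]
  defstruct [:name, :age]

  # The only sanctioned constructor: it validates at the edge, so any
  # %User{} floating around the system is known to be well-formed.
  def new(name, age) when is_binary(name) and is_integer(age) and age >= 0 do
    {:ok, %__MODULE__{name: name, age: age}}
  end

  def new(_name, _age), do: {:error, :invalid_user}
end
```

Downstream code can then pattern-match on %User{} without re-validating the fields.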
I don’t think they are rivals, I’d attempt to use both. But have to admit, I’d default to Ecto.Enum first.
Your idea is great, and as a Rust fan I really, really want sum types in Elixir. But I just don’t see how that would even work with the current realities of the BEAM VM. As much as I like the idea, I’d lean towards enforcing stricter and more exhaustive pattern-matching clauses comprised of tagged tuples.
I also agree on most points here. I’m also surprised to see that you can use records in pattern matches too. (Makes me wonder whether they are also implemented with macros underneath.)
On top of that, when the type definitions get added to the language most of these issues will go away anyways.
Indeed, but it seems that records natively support being used in matches:
handle(Msg, State) when Msg =:= #msg{to=void, no=3} ->
If I remember correctly, from the last time I had to convert records from diameter, Elixir doesn’t support the record # syntax, so you have to improvise with these workarounds.
In Elixir, records are managed by a set of three macros: name/0 (create a new record with default values), name/1 (create a record with the given values, match against a record, or get a field’s index in the underlying tuple), and name/2 (read or update fields in a record).
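A quick sketch of those three macros in action (the :msg record mirrors the Erlang example above; the module name is just illustrative):

```elixir
defmodule Messages do
  import Record

  # defrecord generates msg/0, msg/1 and msg/2 macros over a plain
  # tagged tuple: {:msg, to, no}.
  defrecord :msg, to: :void, no: 0
end

import Messages

m = msg(to: :alice, no: 3)  # msg/1 as a constructor -> {:msg, :alice, 3}
to = msg(m, :to)            # msg/2 as a field accessor -> :alice
m2 = msg(m, no: 4)          # msg/2 as an update -> {:msg, :alice, 4}
msg(to: who) = m            # msg/1 as a pattern; binds who to :alice
1 = msg(:to)                # msg/1 with an atom -> the field's tuple index
                            # (index 0 is the :msg tag)
```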
I think this is a problem that is best solved by a type system. TypeScript’s string literal types solve a similar issue. Once you have type inference/checking for atoms, the problems you describe should go away.