Because that's how one starts. Also, in my opinion, the variant using Enum.group_by/2, Enum.map/2 and Enum.sum/1 reads much better, as it clearly shows the intent.
Your variant looks very obfuscated to untrained eyes and takes a lot of brainwork to decipher.
But yes, perhaps I’d end up with your variant as well, once the many iterations using group_by have been identified as a bottleneck.
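To make the trade-off concrete, here is a minimal sketch of both variants side by side. The data and field names (`:category`, `:amount`) are made up for illustration; the OP's actual problem may differ.

```elixir
# Hypothetical data: total the amounts per category.
transactions = [
  %{category: :food, amount: 10},
  %{category: :food, amount: 5},
  %{category: :travel, amount: 20}
]

# Variant using Enum.group_by/2, Enum.map/2 and Enum.sum/1
# (multiple passes, but each stage names what it does):
totals_grouped =
  transactions
  |> Enum.group_by(& &1.category)
  |> Enum.map(fn {category, entries} ->
    {category, entries |> Enum.map(& &1.amount) |> Enum.sum()}
  end)
  |> Map.new()

# Single-pass variant using Enum.reduce/3:
totals_reduced =
  Enum.reduce(transactions, %{}, fn %{category: c, amount: a}, acc ->
    Map.update(acc, c, a, &(&1 + a))
  end)

# Both yield %{food: 15, travel: 20}
```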
> Because that's how one starts. Also, in my opinion, the variant using Enum.group_by/2, Enum.map/2 and Enum.sum/1 reads much better, as it clearly shows the intent.
> Your variant looks very obfuscated to untrained eyes and takes a lot of brainwork to decipher.
Fair point, though arguably intent can also be shown via naming. In production code I'd naturally split it into private functions with descriptive names.
I think the issue is how many people are introduced to higher-order functions these days: by mapping over an array.
That doesn’t help anybody appreciate the fundamental importance of reduce/foldl and in turn recursion.
In my experience one has to go through the training to see how everything relates, i.e.:
Processing lists with recursion
Implementing reduce/foldl through recursion
Implementing map and filter in terms of reduce
I find that only once it becomes clear how all of these relate can one fully appreciate the flexibility of higher-order functions (and the usefulness of reduce/foldl and recursion).
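The progression above can be sketched in a few lines of Elixir. The module name is made up, and this is a teaching sketch rather than a replacement for Enum:

```elixir
defmodule FoldSketch do
  # Step 1/2: reduce (foldl) implemented with plain recursion over a list.
  def reduce([], acc, _fun), do: acc
  def reduce([head | tail], acc, fun), do: reduce(tail, fun.(head, acc), fun)

  # Step 3: map expressed in terms of reduce.
  # Prepending and reversing keeps the whole thing linear.
  def map(list, fun) do
    list
    |> reduce([], fn x, acc -> [fun.(x) | acc] end)
    |> Enum.reverse()
  end

  # Step 3: filter expressed in terms of reduce.
  def filter(list, pred) do
    list
    |> reduce([], fn x, acc -> if pred.(x), do: [x | acc], else: acc end)
    |> Enum.reverse()
  end
end

# FoldSketch.map([1, 2, 3], &(&1 * 2))            #=> [2, 4, 6]
# FoldSketch.filter([1, 2, 3, 4], &(rem(&1, 2) == 0)) #=> [2, 4]
```

Once one has written this, it is much harder to see map and filter as unrelated "collection tools" rather than specializations of one idea.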
But throwing a one-liner at the OP's feet without any explanation doesn't help either.
That's why I tried to push the OP gently towards a solution that works for now. Then we could have talked about it and perhaps optimised it for speed, readability or memory consumption, whatever turned out to be necessary.
I see reduce at the core of the problem - not as an optimization. Personally, group_by is usually an afterthought for me, not the first thing I reach for. I also see the likes of Enum.sum/1 as an impediment to becoming familiar with reduce.
A gentle push usually isn't enough for people unfamiliar with reduce - likely because it takes a bit of work to get past the initial confusion - which is why one-liners aren't the solution either.
When working in a team, explicitness, readability and clarity of intent are more important than picking the perfect FP construct.
Just today we had a small team discussion where I insisted on replacing Enum.filter(& &1) with Enum.reject(&is_nil/1) because it makes the intent clearer.
It's also highly unlikely that one will ever hit a performance bottleneck in a typical Elixir app. Better to opt for a middle ground that accelerates code review.
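Worth noting that the two spellings are not even strictly equivalent, which is part of why the explicit version won the discussion. A quick sketch:

```elixir
values = [1, nil, 2, false, 3]

# Enum.filter(& &1) keeps truthy values, so it silently drops false as well:
Enum.filter(values, & &1)
#=> [1, 2, 3]

# Enum.reject(&is_nil/1) states the intent - and only drops nil:
Enum.reject(values, &is_nil/1)
#=> [1, 2, false, 3]
```

For lists that can never contain booleans the results coincide, but `reject(&is_nil/1)` both reads better and is the safer contract.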
That’s not the issue at all - it’s about recognizing reduce (or fold) as a generalized transformational problem solving approach. Also in this particular context we’re talking about a “teaching moment”.
People from a non-functional background will approach Enum as the equivalent of “stuff to work with collections”. They’ll scan over the functions, look at the ones they recognize by name. It would be very easy to use a number of the specialized functions without ever understanding the generalized notion of reduce (it’s all over the place in Enum’s source).
By knowing how to use Enum.reduce/3 (or List.foldl/3) well, you already know how to get most of the value.
The specialized functions have their place. They are less verbose and clearer, but their existence can prevent newcomers from "getting" the generalized problem-solving approach.
Sometimes the use of specialized functions can lead to long pipelines of successive micro-reductions - "can't see the forest for the trees" - so one shouldn't miss opportunities to collapse stages by using for-purpose reducer functions.
> because it makes the intent clearer.
Reduce can be made much clearer by avoiding inline functions as reducers, opting instead for named, statically defined functions or for anonymous functions bound to an intention-revealing name.
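A small sketch of that idea; the module, field names and helper are all hypothetical:

```elixir
defmodule Checkout do
  # The reduce call reads as a sentence: fold line items into a total.
  def total(line_items) do
    Enum.reduce(line_items, 0, &add_line_total/2)
  end

  # Named reducer: the name tells the reader what each step contributes.
  defp add_line_total(%{price: price, qty: qty}, acc), do: acc + price * qty
end

# Checkout.total([%{price: 5, qty: 2}, %{price: 3, qty: 1}])
#=> 13
```

The same effect can be had inline by binding the anonymous function first, e.g. `add_line_total = fn %{price: p, qty: q}, acc -> acc + p * q end`, and then passing `add_line_total` to `Enum.reduce/3`.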