# How to sum values per group from `Enum.group_by`

I group a list of maps with the following function.

```elixir
def group_by_sum do
  m01 = %{id: 1, price: 120}
  m02 = %{id: 2, price: 110}
  m03 = %{id: 2, price: 100}
  m04 = %{id: 3, price: 80}

  Enum.group_by([m01, m02, m03, m04], &(&1.id))
end
```

This is the resulting map:

```elixir
%{
  1 => [%{id: 1, price: 120}],
  2 => [%{id: 2, price: 110}, %{id: 2, price: 100}],
  3 => [%{id: 3, price: 80}]
}
```

I would like to get the sum of `price` for each `id`.

You can use the `Enum` functions on a map.

Youâll see each element as a tuple of key and value.

```elixir
m01 = %{id: 1, price: 120}
m02 = %{id: 2, price: 110}
m03 = %{id: 2, price: 100}
m04 = %{id: 3, price: 80}

[m01, m02, m03, m04]
|> Enum.group_by(&(&1.id)) # groups as specified
|> Enum.map(fn {id, data} ->
  # do the magic to sum up prices
  {id, calculated_sum}
end)
|> Enum.into(%{})
```

Some things that might help in the `fn` passed to `Enum.map/2` are `Enum.map/2`, `Enum.reduce/3` and `Enum.sum/1`, though not necessarily all of them at the same time.


I have no idea how to sum in `Enum.map`. I don't know how to extract the price data inside the function.

You have a list of maps, each of which has a `:price` that you want to sum.

If you had a list of plain values, you'd do this:

```elixir
Enum.sum([1, 2, 3]) # => 6
```

Well, now you have a list of maps that you first need to turn into a list of plain values. This can easily be done with `Enum.map/2`:

```elixir
Enum.map([%{price: 1}, %{price: 2}, %{price: 3}], &(&1.price)) # => [1, 2, 3]
```

Gluing these together is left as an exercise.
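For reference, the two snippets above glue together like this (a minimal sketch on a flat list of maps):

```elixir
[%{price: 1}, %{price: 2}, %{price: 3}]
|> Enum.map(& &1.price) # extract the plain values
|> Enum.sum()           # then sum them
# => 6
```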


I cannot extract the price data.

```elixir
def group_by_sum do
  m01 = %{id: 1, price: 120}
  m02 = %{id: 2, price: 110}
  m03 = %{id: 2, price: 100}
  m04 = %{id: 3, price: 80}

  Enum.group_by([m01, m02, m03, m04], &(&1.id))
  |> Enum.map(fn {_id, data} -> data end)
end
```

It returns the following value:

```elixir
[
  [%{id: 1, price: 120}],
  [%{id: 2, price: 110}, %{id: 2, price: 100}],
  [%{id: 3, price: 80}]
]
```

The following function returns `[nil, nil, nil]`:

```elixir
def group_by_sum do
  m01 = %{id: 1, price: 120}
  m02 = %{id: 2, price: 110}
  m03 = %{id: 2, price: 100}
  m04 = %{id: 3, price: 80}

  Enum.group_by([m01, m02, m03, m04], &(&1.id))
  |> Enum.map(fn {_id, data} -> data end)
  |> Enum.map(fn data -> data[:price] end)
end
```

How can I extract the price data from a map with multiple keys?

Sorry about that. I updated the previous code.

```elixir
Enum.group_by([m02, m03], &(&1.id))      # => %{2 => [%{id: 2, price: 110}, %{id: 2, price: 100}]}
|> Enum.map(fn {_id, data} -> data end)  # => [[%{id: 2, price: 110}, %{id: 2, price: 100}]]
|> Enum.map(fn data -> data[:price] end) # => there is no :price in a list of lists
```

The trick is to nest the calls. Take a look at my initial example; I placed a comment exactly where you have to do the calculation:

```elixir
|> Enum.map(fn {id, data} ->
  # do the magic to sum up prices
  {id, calculated_sum}
end)
```
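Filled in, that skeleton might look like this (a sketch using the sample data from the thread; `sums` is just an illustrative name):

```elixir
m01 = %{id: 1, price: 120}
m02 = %{id: 2, price: 110}
m03 = %{id: 2, price: 100}
m04 = %{id: 3, price: 80}

sums =
  [m01, m02, m03, m04]
  |> Enum.group_by(& &1.id)
  |> Enum.map(fn {id, maps} ->
    # inner Enum.map extracts the prices, Enum.sum adds them up
    {id, maps |> Enum.map(& &1.price) |> Enum.sum()}
  end)
  |> Enum.into(%{})

sums
# => %{1 => 120, 2 => 210, 3 => 80}
```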

Not sure what you need `Enum.group_by/2` for.

```elixir
values = [
  %{id: 1, price: 120},
  %{id: 2, price: 110},
  %{id: 2, price: 100},
  %{id: 3, price: 80}
]

Enum.reduce(values, %{}, &Map.merge(&2, %{&1.id => (&2[&1.id] || 0) + &1.price}))
# => %{1 => 120, 2 => 210, 3 => 80}
```

Because that's how one starts. Also, in my opinion the variant using `Enum.group_by/2`, `Enum.map/2` and `Enum.sum/1` reads much better, as it clearly shows the intent.

Your variant looks very obfuscated to the untrained eye and needs a lot of brainwork to decipher.

But yes, perhaps I'd end up with your variant as well, once the many iterations using `group_by` had been identified as a bottleneck.


> Because that's how one starts. Also, in my opinion the variant using `Enum.group_by/2`, `Enum.map/2` and `Enum.sum/1` reads much better, as it clearly shows the intent.
>
> Your variant looks very obfuscated to the untrained eye and needs a lot of brainwork to decipher.

Fair point, although perhaps it could be argued that intent can be shown via naming. In production code I'd naturally split it into private functions with descriptive names.


Alternatively, it can be written in the form:

```elixir
Enum.reduce(values, %{}, fn %{id: id, price: price}, acc ->
  Map.update(acc, id, price, &(&1 + price))
end)
```

Or equivalently:

```elixir
for %{id: id, price: price} <- values, reduce: %{} do
  acc -> Map.update(acc, id, price, &(&1 + price))
end
```
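Both reducer variants produce the same map as the `group_by` pipeline. A self-contained sketch of the comprehension form, assuming the `values` list from earlier in the thread:

```elixir
values = [
  %{id: 1, price: 120},
  %{id: 2, price: 110},
  %{id: 2, price: 100},
  %{id: 3, price: 80}
]

totals =
  for %{id: id, price: price} <- values, reduce: %{} do
    # the first price seen for an id becomes the initial value,
    # later prices are added to the running total
    acc -> Map.update(acc, id, price, &(&1 + price))
  end

totals
# => %{1 => 120, 2 => 210, 3 => 80}
```

Note that the `reduce:` option for comprehensions requires Elixir 1.8 or later.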

I think the issue is how many people are introduced to higher-order functions these days: by mapping over an array.

That doesn't help anybody appreciate the fundamental importance of reduce/foldl and, in turn, recursion.

In my experience one has to go through the training to see how everything relates, i.e.:

• Processing lists with recursion
• Implementing reduce/foldl through recursion
• Implementing map and filter in terms of reduce

I find that only once it becomes clear how all of these relate one can more fully appreciate the flexibility of higher order functions (and the usefulness of reduce/foldl and recursion).
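The progression above can be sketched in a few lines of toy code (illustrative only, not how the standard library actually implements these):

```elixir
defmodule Toy do
  # 1. reduce/foldl implemented through recursion over the list
  def reduce([], acc, _fun), do: acc
  def reduce([head | tail], acc, fun), do: reduce(tail, fun.(head, acc), fun)

  # 2. map expressed in terms of reduce (prepend, then reverse)
  def map(list, fun) do
    list
    |> reduce([], fn x, acc -> [fun.(x) | acc] end)
    |> Enum.reverse()
  end

  # 3. filter expressed in terms of reduce
  def filter(list, pred) do
    list
    |> reduce([], fn x, acc -> if pred.(x), do: [x | acc], else: acc end)
    |> Enum.reverse()
  end
end

Toy.reduce([1, 2, 3], 0, &+/2)   # => 6
Toy.map([1, 2, 3], &(&1 * 2))    # => [2, 4, 6]
Toy.filter([1, 2, 3], &(&1 > 1)) # => [2, 3]
```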

Learning How to Loop in Elixir Through Recursion


But throwing a one-liner at the OP's feet without any explanation doesn't help either.

That's why I tried to push the OP gently towards a solution that works for now. Then we could have talked about it and perhaps optimised for speed, readability or memory consumption, whatever is necessary.


I see `reduce` at the core of the problem - not as an optimization. Personally, `group_by` is usually an afterthought for me, not the first thing I reach for. I also see the likes of `Enum.sum/1` as an impediment to becoming familiar with reduce.

A gentle push usually isn't enough for people unfamiliar with reduce - likely because it takes a bit of work to push past the initial confusion - which is why one-liners aren't the solution either.

When working in a team, explicitness, readability and clarity of intent are more important than picking the perfect FP construct.

Just today we had a small team discussion where I insisted on replacing `Enum.filter(& &1)` with `Enum.reject(&is_nil/1)` because it makes the intent clearer.

Itâs also highly unlikely one to ever hit a performance bottleneck in your typical Elixir app. Better opt for a middle ground that accelerates code review.

Thatâs not the issue at all - itâs about recognizing reduce (or fold) as a generalized transformational problem solving approach. Also in this particular context weâre talking about a âteaching momentâ.

People from a non-functional background will approach `Enum` as the equivalent of "stuff to work with collections". They'll scan over the functions and look at the ones they recognize by name. It would be very easy to use a number of the specialized functions without ever understanding the generalized notion of reduce (it's all over the place in `Enum`'s source).

By knowing how to use `Enum.reduce/3` (or `List.foldl/3`) well, you already know how to get most of the value.

The specialized functions have their place. They are less verbose and clearer, but their existence can prevent newcomers from "getting" the generalized problem-solving approach.

Sometimes the use of specialized functions can lead to long pipelines of successive micro-reductions - "can't see the forest for the trees" - so one shouldn't miss opportunities to collapse stages by using for-purpose reducer functions.

> because it makes the intent clearer.

Reduce can be made much clearer by avoiding inline functions as reducers, opting instead for named, statically defined functions or anonymous functions bound to an intention revealing name.
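As a sketch of that style, the reducer from earlier in the thread bound to an intention-revealing name (the names here are made up for illustration):

```elixir
values = [
  %{id: 1, price: 120},
  %{id: 2, price: 110},
  %{id: 2, price: 100},
  %{id: 3, price: 80}
]

# the reducer gets a descriptive name instead of living inline
add_price_to_group_total = fn %{id: id, price: price}, totals ->
  Map.update(totals, id, price, &(&1 + price))
end

totals = Enum.reduce(values, %{}, add_price_to_group_total)
# => %{1 => 120, 2 => 210, 3 => 80}
```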


Sure, but thatâs a long journey. Takes a while to practice and understand to arrive at that point.

Most people just want to get over a hurdle they can't solve at the moment. And that's okay; that's why there are help sections in the forum.

100% agreed. It really improves readability and communicates intent much better.