How to sum values per group from group_by

I have the following function that groups a list of maps:

  def group_by_sum do
    m01 = %{id: 1,  price: 120}
    m02 = %{id: 2,  price: 110}
    m03 = %{id: 2,  price: 100}
    m04 = %{id: 3,  price: 80}
    Enum.group_by([m01, m02, m03, m04], &(&1.id))
  end

This is the resulting map:

%{
  1 => [%{id: 1, price: 120}],
  2 => [%{id: 2, price: 110}, %{id: 2, price: 100}],
  3 => [%{id: 3, price: 80}]
}

I would like to get the sum of the prices for each id.

How can I do this? Please give me some advice.

You can use the Enum functions on a map.

You’ll see each element as a tuple of key and value.

m01 = %{id: 1,  price: 120}
m02 = %{id: 2,  price: 110}
m03 = %{id: 2,  price: 100}
m04 = %{id: 3,  price: 80}

[m01, m02, m03, m04]
|> Enum.group_by(&(&1.id)) # groups as specified
|> Enum.map(fn {id, data} ->
  # do the magic to sum up prices
  {id, calculated_sum}
end)
|> Enum.into(%{})

Some things that might help in the fn passed to Enum.map/2 are Enum.map/2, Enum.reduce/3, and Enum.sum/1, but not necessarily all of them at the same time :wink:

3 Likes

I have no idea how to sum inside Enum.map. I don’t know how to extract the price data inside the function.

You have a list of maps, each of which has a :price that you want to sum.

If you had a list of plain values, you’d do this:

Enum.sum([1,2,3]) # => 6

Well, now you have a list of maps that you first need to turn into a list of plain values. This can easily be done with Enum.map/2:

Enum.map([%{price: 1}, %{price: 2}, %{price: 3}], &(&1.price)) # => [1, 2, 3]

Gluing these together is left as an exercise.

1 Like

I cannot extract the price data.

  def group_by_sum do
    m01 = %{id: 1,  price: 120}
    m02 = %{id: 2,  price: 110}
    m03 = %{id: 2,  price: 100}
    m04 = %{id: 3,  price: 80}
    Enum.group_by([m01, m02, m03, m04], &(&1.id))
    |> Enum.map(fn {_id, data} -> data end)
  end

It returns the following value.

[
  [%{id: 1, price: 120}],
  [%{id: 2, price: 110}, %{id: 2, price: 100}],
  [%{id: 3, price: 80}]
]

The following function returns [nil, nil, nil]:

  def group_by_sum do
    m01 = %{id: 1,  price: 120}
    m02 = %{id: 2,  price: 110}
    m03 = %{id: 2,  price: 100}
    m04 = %{id: 3,  price: 80}
    Enum.group_by([m01, m02, m03, m04], &(&1.id))
    |> Enum.map(fn {_id, data} -> data end)
    |> Enum.map(fn data -> data[:price] end)
  end

How can I extract the price data from a map with multiple keys?

Sorry for that. I updated the previous code.

Enum.group_by([m02, m03], &(&1.id)) # => %{2 => [%{id: 2,  price: 110}, %{id: 2,  price: 100}]}
|> Enum.map(fn {_id, data} ->  data end) # => [[%{id: 2,  price: 110}, %{id: 2,  price: 100}]]
|> Enum.map(fn (data) -> data[:price] end) # => There is no `:price` in a list of lists.

The trick is to nest the calls; take a look at my initial example, where I placed a comment exactly where you have to do the calculations:

|> Enum.map(fn {id, data} ->
  # do the magic to sum up prices
  {id, calculated_sum}
end)
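Putting the hints together, one way to fill in that placeholder is to map each group to its prices and sum them (just one possible sketch, not the only solution):

```elixir
maps = [
  %{id: 1, price: 120},
  %{id: 2, price: 110},
  %{id: 2, price: 100},
  %{id: 3, price: 80}
]

maps
|> Enum.group_by(& &1.id)
|> Enum.map(fn {id, group} ->
  # sum the :price of every map in this group
  {id, group |> Enum.map(& &1.price) |> Enum.sum()}
end)
|> Enum.into(%{})
# => %{1 => 120, 2 => 210, 3 => 80}
```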

Not sure what you need Enum.group_by/2 for.

> values = [
  %{id: 1, price: 120},
  %{id: 2, price: 110},
  %{id: 2, price: 100},
  %{id: 3, price: 80}
]

> Enum.reduce(values, %{}, & Map.merge(&2, %{&1.id => (&2[&1.id] || 0) + &1.price}))
%{1 => 120, 2 => 210, 3 => 80}

4 Likes

Because that’s how one starts. Also, in my opinion, the variant using Enum.group_by/2, Enum.map/2 and Enum.sum/1 reads much better, as it clearly shows the intent.

Your variant looks very obfuscated to untrained eyes and needs a lot of brainwork to decipher.

But yes, perhaps I’d end up with your variant as well, once the many iterations using group_by have been identified as a bottleneck.

3 Likes

> Because that’s how one starts. Also, in my opinion, the variant using Enum.group_by/2, Enum.map/2 and Enum.sum/1 reads much better, as it clearly shows the intent.
>
> Your variant looks very obfuscated to untrained eyes and needs a lot of brainwork to decipher.

Fair point, although perhaps it could be argued that intent can be shown via naming. In production code I’d naturally split it into private functions with descriptive names.

2 Likes

Alternatively it can be written in form:

Enum.reduce(values, %{}, fn %{id: id, price: price}, acc ->
  Map.update(acc, id, price, & &1 + price)
end)

Or equivalently:

for %{id: id, price: price} <- values, reduce: %{} do
  acc -> Map.update(acc, id, price, & &1 + price)
end

4 Likes

I think the issue is how many people are introduced these days to higher order functions - by mapping over an array.

That doesn’t help anybody appreciate the fundamental importance of reduce/foldl and in turn recursion.

In my experience one has to go through the training to see how everything relates, i.e.:

  • Processing lists with recursion
  • Implementing reduce/foldl through recursion
  • Implementing map and filter in terms of reduce

I find that only once it becomes clear how all of these relate one can more fully appreciate the flexibility of higher order functions (and the usefulness of reduce/foldl and recursion).
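As a rough illustration of those steps, here is a sketch of reduce written via recursion, with map and filter built on top of it (the module and function names are mine, not from the thread):

```elixir
defmodule MyEnum do
  # reduce/foldl implemented through plain recursion
  def reduce([], acc, _fun), do: acc
  def reduce([head | tail], acc, fun), do: reduce(tail, fun.(head, acc), fun)

  # map in terms of reduce: build the result reversed, then reverse it
  def map(list, fun) do
    list
    |> reduce([], fn x, acc -> [fun.(x) | acc] end)
    |> :lists.reverse()
  end

  # filter in terms of reduce: keep only elements the predicate accepts
  def filter(list, pred) do
    list
    |> reduce([], fn x, acc -> if pred.(x), do: [x | acc], else: acc end)
    |> :lists.reverse()
  end
end

MyEnum.map([1, 2, 3], &(&1 * 2))
# => [2, 4, 6]
```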

Learning How to Loop in Elixir Through Recursion

3 Likes

But throwing some one-liner in front of the OP’s feet without any explanation doesn’t help either.

That’s why I tried to push the OP gently towards a solution that works for now. Then we could have talked about it and perhaps optimise for speed, readability or memory consumption, whatever is necessary.

2 Likes

I see reduce at the core of the problem - not an optimization. Personally, group_by is usually an afterthought for me, not the first thing I go for. I also see the likes of Enum.sum/1 as an impediment to becoming familiar with reduce.

A gentle push usually isn’t enough for people unfamiliar with reduce - likely because it takes a bit of work to push past the initial confusion - which is why one-liners aren’t the solution either.

When working in a team, explicitness, readability and clarity of intent are more important than picking the perfect FP construct.

Just today we had a small team discussion where I insisted on replacing Enum.filter(& &1) with Enum.reject(&is_nil/1) because it makes the intent clearer.
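For the record, the two are not strictly equivalent, which is another argument for the explicit version:

```elixir
list = [1, nil, 2, false, 3]

Enum.filter(list, & &1)       # drops nil *and* false => [1, 2, 3]
Enum.reject(list, &is_nil/1)  # drops only nil => [1, 2, false, 3]
```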

It’s also highly unlikely that one will ever hit a performance bottleneck in a typical Elixir app. Better to opt for a middle ground that accelerates code review.

That’s not the issue at all - it’s about recognizing reduce (or fold) as a generalized transformational problem solving approach. Also in this particular context we’re talking about a “teaching moment”.

People from a non-functional background will approach Enum as the equivalent of “stuff to work with collections”. They’ll scan over the functions, look at the ones they recognize by name. It would be very easy to use a number of the specialized functions without ever understanding the generalized notion of reduce (it’s all over the place in Enum's source).

By knowing how to use Enum.reduce/3 (or List.foldl/3) well, you already know how to get most of the value.

The specialized functions have their place. They are less verbose and clearer but their existence can prevent newcomers from “getting” the generalized problem solving approach.

Sometimes the use of specialized functions can lead to long pipelines of successive micro-reductions - “can’t see the forest for the trees” - one shouldn’t miss opportunities to collapse stages by using for-purpose reducer functions.

> because it makes the intent clearer.

Reduce can be made much clearer by avoiding inline functions as reducers, opting instead for named, statically defined functions or anonymous functions bound to an intention-revealing name.
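A sketch of what that might look like for this thread’s example (module and function names are hypothetical):

```elixir
defmodule PriceTotals do
  def totals_by_id(values) do
    # the named reducer makes the intent of the fold readable at the call site
    Enum.reduce(values, %{}, &add_price_to_total/2)
  end

  # intention-revealing named reducer instead of an inline fn
  defp add_price_to_total(%{id: id, price: price}, totals) do
    Map.update(totals, id, price, &(&1 + price))
  end
end

PriceTotals.totals_by_id([
  %{id: 1, price: 120},
  %{id: 2, price: 110},
  %{id: 2, price: 100},
  %{id: 3, price: 80}
])
# => %{1 => 120, 2 => 210, 3 => 80}
```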

2 Likes

Sure, but that’s a long journey. Takes a while to practice and understand to arrive at that point.

Most people just want to get over a hurdle they can’t solve at the moment. And that’s okay, that’s why there are help sections in the forum.

100% agreed. It really improves readability and communicates intent much better.