Ecto sum of all items in each category

Hi, I am new to Phoenix and Ecto. I am building a cashflow record system where each cashflow_entry can be assigned a cashflow_category.

Here are the schemas:

  schema "cashflow_entries" do
    field :amount, :decimal
    field :date, :date
    field :entry_type, Ecto.Enum, values: [:income, :expense]
    field :note, :string
    belongs_to :cashflow_category, Epic.CashflowCategories.CashflowCategory
    belongs_to :user, Epic.Accounts.User

    timestamps()
  end

  schema "cashflow_categories" do
    field :description, :string
    field :entry_type, Ecto.Enum, values: [:income, :expense]
    field :title, :string
    has_many :cashflow_entries, Epic.CashflowEntries.CashflowEntry
    belongs_to :user, Epic.Accounts.User

    timestamps()
  end

I am trying to query all cashflow_categories together with the SUM of all related cashflow_entries (those belonging to the category), so I can display them as follows:

  1. Category 1, total 35 (this is the SUM of all cashflow_entry.amount with category 1)
  2. Category 2, total 125

I wanted to do it the Ecto way, but I have almost no idea how to accomplish this.

Currently I am doing this, but I don't know where to go from here. Do I need a subquery, an aggregate, or something else I don't know about?

  def list_cashflow_categories_with_total(user) do
    CashflowCategory
    |> where(user_id: ^user.id)
    |> preload([:cashflow_entries])
    # |> group_by([cc], cc.id)
    |> select(
      [cc],
      {cc.id, cc.title, cc.description, cc.entry_type, cc.cashflow_entries} 
    )
    |> Repo.all()
  end

I wanted to display the SUM of cashflow_entries for each category, as shown above.

You can probably use a combination of join, group_by and sum, which should look something like:

    CashflowCategory
    |> where(user_id: ^user.id)
    |> join(:left, [cc], ce in assoc(cc, :cashflow_entries))
    |> group_by([cc], cc.id)
    |> select(
      [cc, ce],
      {cc.id, cc.title, cc.description, cc.entry_type, sum(ce.amount)} 
    )
    |> Repo.all()

All fields present in the select that are not inside an aggregate function should also be present in the group_by expression, IIRC. :thinking:

One exception / special case to that rule (at least on modern Postgres): grouping by the primary key means you don't need to mention the other columns of that table, since there can only be one cc.description value for a given cc.id, for instance.
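As a sketch of that rule against the schemas above (assuming `import Ecto.Query` and the `CashflowCategory` module from the original post):

```elixir
import Ecto.Query

# On modern Postgres, grouping by the primary key alone is valid:
# cc.title and cc.description are functionally dependent on cc.id.
query =
  CashflowCategory
  |> join(:left, [cc], ce in assoc(cc, :cashflow_entries))
  |> group_by([cc], cc.id)
  |> select([cc, ce], {cc.title, cc.description, sum(ce.amount)})

# On databases without that functional-dependency rule, every
# non-aggregated column would need to be listed instead:
#   group_by([cc], [cc.id, cc.title, cc.description])
```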


Thanks @sabiwara, this works. But in the Phoenix template I was unable to loop through the results; I got an error like "1st argument not an atom", so I had to change it a bit:

    CashflowCategory
    |> where(user_id: ^user.id)
    |> join(:left, [cc], ce in assoc(cc, :cashflow_entries))
    |> group_by([cc], cc.id)
    |> select(
      [cc, ce],
      %{
        id: cc.id,
        title: cc.title,
        description: cc.description,
        entry_type: cc.entry_type,
        total: sum(ce.amount)
      } # here! is this called a struct?
    )
    |> Repo.all()

One more question :smiling_face_with_three_hearts:

If I want to filter the cashflow_entries by a specific date, for example: the SUM of all entries, but only within the current month.
total: sum(ce.amount) # only SUM if within the current month
Does this require a subquery?

Postgres supports filtering aggregates without a subquery - you can generate that SQL via Ecto's filter/2.

Wouldn’t you just use a where clause on date here?


+1, I think this would be the simplest approach here:

    |> join(:left, [cc], ce in assoc(cc, :cashflow_entries))
    |> where([cc, ce], ce.date > ^date)
    |> group_by([cc], cc.id)
   ...

@loon regarding your question

here! is this called a struct?

This would be a map, not a struct (the difference is explained in these links).
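For a quick illustration of the difference in plain Elixir (the `Category` module here is just a made-up example):

```elixir
# A map is an ad-hoc key-value container; any keys are allowed.
category = %{id: 1, title: "Groceries", total: 35}

# A struct is a map with a fixed set of keys, defined by a module:
defmodule Category do
  defstruct [:id, :title, :total]
end

cat = %Category{id: 1, title: "Groceries"}

# Under the hood a struct is still a map, carrying a __struct__ key:
cat.__struct__                  # => Category
Map.get(category, :__struct__)  # => nil, a plain map has none
```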

Thanks, it seems like I saw it somewhere before but can't remember how it works. Mind sharing an example?

I didn't know it would be this simple :joy: I thought it would be something more complex. Thanks anyway, it works as expected.

Thanks for the links!

For instance, you could write something like:

    total: filter(sum(ce.amount), fragment("date_part('month', ?) = ?", ce.date, ^target_month))
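For context, plugging that into the earlier query might look something like this (a sketch; `target_month` is assumed to be an integer such as `Date.utc_today().month`):

```elixir
import Ecto.Query

target_month = Date.utc_today().month

CashflowCategory
|> where(user_id: ^user.id)
|> join(:left, [cc], ce in assoc(cc, :cashflow_entries))
|> group_by([cc], cc.id)
|> select([cc, ce], %{
  id: cc.id,
  title: cc.title,
  # Only amounts whose date falls in target_month contribute to the sum.
  total: filter(sum(ce.amount), fragment("date_part('month', ?) = ?", ce.date, ^target_month))
})
|> Repo.all()
```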