Grouping by region: (grouping_error) column must appear in the GROUP BY clause or be used in an aggregate function

I want to group by region, but I was getting this error:

(Postgrex.Error) ERROR 42803 (grouping_error) column "" must appear in the GROUP BY clause or be used in an aggregate function

I have a "locations" table and I want to query it and group by country_region:

 id  | country_region | details
 510 | Algeria        | ...
 499 | Australia      | ...
 470 | Australia      | ...
 472 | Australia      | ...
 487 | Australia      | ...

So I wrote this:

def grouped_by_region do
    Location
    |> group_by([e], e.country_region)
    |> select([e], %{region: e.country_region, count: count(e.id), items: fragment("ARRAY_AGG(?)", e.id)})
    |> order_by([e], asc: e.country_region)
    |> Repo.all()
end

And now I get this:

  %{region: "Algeria", count: 1, items: [510]},
  %{region: "Australia", count: 5, items: [499, 470, 471, 472, 487]},

I do not want items: [499, 470, 471, ...] to contain just the ids. I want items: to contain the actual records, something like this:

  %{region: "Algeria", count: 1, items: [%Location{id: 510, ...}, ...]},
  %{region: "Australia", count: 5, items: [%Location{id: 499, ...}, %Location{id: 470, ...}, ...]},

I have tried the following fragments:

fragment("ARRAY_AGG(?)", e.id)    # returns a list of ids
fragment("ARRAY_AGG(?)", e)       # error
fragment("ARRAY_AGG(?)", [e])     # error
fragment("ARRAY_AGG(?)", {e})     # error

How do I fix the query? Thanks.

If you want the full records, I'm not sure you gain a lot by doing the aggregation at the SQL level. I'd just load the locations, use Enum.group_by to group them, and then build that final data structure.
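As a minimal sketch of that approach (using plain maps in place of loaded %Location{} structs; the sample data is illustrative):

```elixir
# Group already-loaded records by region, then shape each group
# into the %{region:, count:, items:} map the question asks for.
locations = [
  %{id: 510, country_region: "Algeria"},
  %{id: 499, country_region: "Australia"},
  %{id: 470, country_region: "Australia"}
]

locations
|> Enum.group_by(& &1.country_region)
|> Enum.map(fn {region, items} ->
  %{region: region, count: length(items), items: items}
end)
|> Enum.sort_by(& &1.region)
```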


Further Investigation
Using fragment("JSON_AGG(?)", e) instead of fragment("ARRAY_AGG(?)", e.id) returns maps with stringified keys.
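For instance, the earlier query could be adapted like this (a sketch; JSON_AGG and the overall query shape come from this thread, the rest is assumed and not runnable without the app's Repo and Location schema):

```elixir
# Sketch: JSON_AGG aggregates whole rows per group, so each item in
# :items comes back as a string-keyed map, e.g. %{"id" => 510, ...},
# rather than a %Location{} struct.
Location
|> group_by([e], e.country_region)
|> select([e], %{
  region: e.country_region,
  count: count(e.id),
  items: fragment("JSON_AGG(?)", e)
})
|> order_by([e], asc: e.country_region)
|> Repo.all()
```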

Can't I somehow return the schema struct from within the fragment?

Related articles: it seems like other people have discussed this.

Ben's advice: keep it simple. I did exactly what he suggested; here is the solution:

    def group_by_region do
        results =
            Location
            |> Repo.all()
            |> Enum.group_by(fn item -> item.country_region end)

        results
        |> Map.keys()
        |> Enum.map(fn key ->
            %{}
            |> Map.put_new(:region, key)
            |> Map.put_new(:count, length(Map.get(results, key)))
            |> Map.put_new(:items, Map.get(results, key))
        end)
    end

Not easily, no, although you could cast the JSON into the schema shape by using Repo.load on the result. I'm still fairly sure it's faster to do the grouping in memory rather than hogging a database connection while Postgres does it, and it's roughly the same amount of data either way.
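A sketch of what that cast might look like, assuming the string keys in the decoded JSON match the Location schema's field names (the sample map is illustrative):

```elixir
# Repo.load/2 takes a schema module and a map of fields and returns
# a struct; fields missing from the map are left as nil.
row = %{"id" => 510, "country_region" => "Algeria"}
location = Repo.load(Location, row)
```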


This is interesting. I wasn't aware of Repo.load. So in the future I could be in a situation where I take that JSON and pass in a custom schema to cast it from the JSON. Nice tip.