All places that use filter should have the same behavior

I’m upgrading one of my projects from Ash 2.0 to 3.0 and one of the things that changed was this:

resource.actions.read.filter can now be specified multiple times. Multiple filters will be combined with and.
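
For example, a read action can now declare the filter twice and both conditions are applied (a minimal sketch, assuming a hypothetical students resource with active and inserted_at attributes):

    read :current_students do
      # In Ash 3.0 both filters are combined with `and`
      filter expr(active == true)
      filter expr(inserted_at >= fragment("date_trunc('month', now())"))
    end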

I do like this change, but it seems that this is only available for read actions; if I try to do the same with my aggregates, I get the following error:

Multiple values for key `:filter`

Of course, the documentation explicitly says that this is only for read.filter, but I think this ends up creating more confusion, since we now have the filter keyword behaving differently depending on where it is used.

Is there some technical reason why an aggregate (and other places that use filter) can't have the same behavior as read.filter and allow filter to be specified multiple times, like this?

    count :total_students_this_month, :students do
      filter expr(inserted_at >= fragment("date_trunc('month', now())"))
      filter expr(inserted_at < fragment("date_trunc('month', now()) + interval '1 month'"))
    end

Be sure to also update spark to its latest version, and let me know if the issue persists after that.

By latest version, do you mean the main branch on GitHub or the latest release on Hex? If the latter, I'm already using it (version 2.2.4).

I just meant the latest version on Hex :+1:

Ah, sorry, reading more closely: filter cannot currently be stated twice in aggregates, only in actions. Please open a feature request for this on ash. :bowing_man:
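
In the meantime, combining both conditions into a single filter with and should work as a workaround (a sketch reusing the aggregate from above):

    count :total_students_this_month, :students do
      # One filter expression covering both month boundaries
      filter expr(
        inserted_at >= fragment("date_trunc('month', now())") and
          inserted_at < fragment("date_trunc('month', now()) + interval '1 month'")
      )
    end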

Here it is: Support multiple filters in aggregations · Issue #1265 · ash-project/ash · GitHub
