Best inputs for searching methods (e.g. list_items)

This is more of a general question, but it really hit home as I read through the “Craft GraphQL APIs” Absinthe book. I noticed this pattern:

defmodule MyContext do
  import Ecto.Query

  def list_items(args) do
    args
    |> Enum.reduce(Item, fn
      {:order, order}, query ->
        query |> order_by({^order, :name})
      {:filter, filter}, query ->
        query |> filter_with(filter)
    end)
    |> Repo.all()
  end

  defp filter_with(query, filter) do
    Enum.reduce(filter, query, fn
      {:name, name}, query ->
        from q in query, where: ilike(q.name, ^"%#{name}%")
      {:priced_above, price}, query ->
        from q in query, where: q.price >= ^price
      {:priced_below, price}, query ->
        from q in query, where: q.price <= ^price
      {:category, category_name}, query ->
        from q in query,
          join: c in assoc(q, :category),
          where: ilike(c.name, ^"%#{category_name}%")
      {:tag, tag_name}, query ->
        from q in query,
          join: t in assoc(q, :tags),
          where: ilike(t.name, ^"%#{tag_name}%")
    end)
  end
end

That seems like a brilliant and expressive way of structuring dynamic search criteria, and it has the benefit of tying in well with the Absinthe resolvers, but my questions are:

  1. What exactly IS that function expecting? Is it something like this: args = [{name: "Some name"}, {order: :asc}]
  2. How could you “feed” into this structure with a standard REST endpoint? I’m having a hard time wrapping my head around how you could send a JSON body (let alone standard form-encoded inputs) that would properly represent linked lists.
  3. I have noticed that for standard Phoenix controllers, the submitted params arrive as a map with binary (string) keys, but here within the GraphQL resolvers, the inputs use atom keys. Would it be a good idea to normalize both forms of input?

Thanks – sorry if this is rambling.

Yes, that’s exactly what it’s expecting: one of Elixir’s built-in data structures, a keyword list. You might not recognize it because it has a few different representations, which are all equivalent:

list_items([{:name, "Some name"}, {:order, :asc}])
list_items([name: "Some name", order: :asc])
list_items(name: "Some name", order: :asc)
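Since a keyword list is literally a list of two-element tuples with atom keys, you can confirm the equivalence yourself in iex; a quick check (plain Elixir, nothing Absinthe-specific assumed):

```elixir
# Both literal forms build the exact same data structure:
# a list of {atom, value} tuples.
explicit = [{:name, "Some name"}, {:order, :asc}]
sugar = [name: "Some name", order: :asc]

explicit == sugar
# => true

# Because it is just a list, the Keyword module (and Enum) work on it:
Keyword.get(sugar, :order)
# => :asc
```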

You could do something like this:

def list(conn, params) do
  # params = %{"order" => "asc", "name" => "Some Name"}

  args =
    Enum.map(params, fn
      # order_by expects an atom direction, so map the known values explicitly
      {"order", "asc"} -> {:order, :asc}
      {"order", "desc"} -> {:order, :desc}
      # list_items/1 above expects the name filter nested under :filter
      {"name", name} -> {:filter, %{name: name}}
    end)

  MyContext.list_items(args)
end

What the code above is doing is expecting very specific keys for this endpoint and converting the data from an external form (map with string keys) to an internal form (keyword list). Yes, in a sense this is “duplicating” a specification of the expected keys, but this validation/conversion from external data to internal data is very important for a robust and secure application. And if you find yourself doing this often then you can use some tools to help you like Ecto (using embedded schemas), or a library like vex or optimal.
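Before reaching for a library, that conversion can also live in one small hand-rolled module. A sketch (the module and function names here are made up, not from the thread), using an explicit whitelist so no atoms are ever created from user input:

```elixir
defmodule ParamNormalizer do
  # Explicit whitelist mapping external string keys to internal atom keys.
  # Anything not listed here is silently dropped.
  @allowed %{"order" => :order, "name" => :name}

  # Converts a Phoenix-style params map (string keys) into a keyword list.
  # `atom_key = @allowed[key]` doubles as a filter: nil (unknown key)
  # skips the entry entirely.
  def normalize(params) when is_map(params) do
    for {key, value} <- params, atom_key = @allowed[key] do
      {atom_key, value}
    end
  end
end

ParamNormalizer.normalize(%{"name" => "Some Name", "page" => "2"})
# => [name: "Some Name"]
```

The same idea scales up to Ecto embedded schemas when you also need type casting and validation errors.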

Yes, it is good to normalize the input, which is what I was getting at above: you are converting untrusted external input into trusted (or semi-trusted) internal input. Absinthe can do the initial validation step for you because you provided it a GraphQL schema stating exactly which arguments, and which types, you expect; if a required argument is missing or has the incorrect type, your resolver function is never even called.

Also, in the case of a Phoenix controller, it would be unwise to blindly convert an incoming params map (with string keys) into atom keys, because you would end up calling a function like String.to_atom/1, which (as the docs state) should not be called with untrusted external input: created atoms are not garbage collected, which means an attacker could eventually bring your system down.


Thank you @axelson for the thorough explanation. Conceptually, a map makes more sense to me, but that’s probably just because I’ve never worked in a language that had keyword lists available. Maps can be easily represented in HTML web forms, and I discovered that Enum.reduce() can traverse them in the same way as a list.

In other words,

Enum.reduce(filter, "", fn x, _acc -> IO.inspect(x) end)

produces the same output whether the filter is represented as a map or a keyword list:

# Same output via Enum.reduce()
filter = %{some_field: "some_value", other_field: "other value"}
filter = [{:some_field, "some_value"}, {:other_field, "other value"}]

In either case, the elements that are enumerated come through the same way:

{:some_field, "some_value"}
{:other_field, "other value"}
# ... etc...

So it turns out that the repackaging of the map is only really necessary because we need to normalize the types of the keys (e.g. to atoms). I’ll take a look at vex or optimal as a way to help with this process.


For maps, the order in which key-value-pairs are iterated is not guaranteed. For lists it is though.

So the output is only the same by accident.
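If you do want to treat a map and a keyword list interchangeably, normalizing the order first (e.g. by sorting) makes the comparison deterministic; a small sketch:

```elixir
filter_map = %{some_field: "some_value", other_field: "other value"}
filter_kw = [some_field: "some_value", other_field: "other value"]

# Map iteration order is an implementation detail of the runtime,
# so compare (or traverse) in a sorted order instead of relying on it.
Enum.sort(filter_map) == Enum.sort(filter_kw)
# => true
```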

I checked out Optimal and I really like it (I did submit a PR to add some more examples and fix some of the broken code in the existing examples), but I feel it would be even better if I could define my validation rules in one place and have that handle the normalization between string and atom keys. It feels smelly to have to do that in my controllers…


Does anyone have a good example of how to provide BOTH sorting direction (asc/desc) AND the sorting column?

This bit here only lets you specify one… and I can’t think of any good alternatives…

{:order, order}, query ->
  query |> order_by({^order, :name})

If you pass a keyword list to order_by it would work (for example with order_by == [asc: :name]); note that the value has to be pinned with ^:

{:order_by, order_by}, query ->
  query |> order_by(^order_by)

Or just separate out the clauses:

{:order_asc, field}, query ->
  query |> order_by(asc: ^field)

{:order_desc, field}, query ->
  query |> order_by(desc: ^field)

Does that accomplish what you’re trying to do? If not a larger code snippet would probably be helpful.


As per the docs you can pass in a list of things to order by:

query |> order_by([c], asc: c.name, desc: c.population)

It can even be dynamic:

values = [asc: :name, desc_nulls_first: :population]
query |> order_by([c], ^values)

That’s brilliant! Yeah, that will totally work – I just want to support a UI where the user can click a column header to sort by that column.

I have been staring at that, but figuring out how to set values for BOTH the column(s) and the direction via a request (think: a web form) has eluded me. I would probably need to write a helper to translate those into a single list that I could pass to the order_by() function…
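That helper can stay quite small. A sketch under assumptions not in the thread (a form submitting sort_by and sort_dir fields, and hypothetical whitelists), translating both values into a single keyword list that can be passed to order_by via ^:

```elixir
defmodule SortParams do
  # Whitelists: unknown directions/columns fall back to the default,
  # and no atoms are ever created from untrusted input.
  @directions %{"asc" => :asc, "desc" => :desc}
  @columns %{"name" => :name, "price" => :price}

  # Builds e.g. [desc: :name] from %{"sort_by" => "name", "sort_dir" => "desc"}.
  def to_order_by(params, default \\ [asc: :name]) do
    with dir when not is_nil(dir) <- @directions[params["sort_dir"]],
         col when not is_nil(col) <- @columns[params["sort_by"]] do
      [{dir, col}]
    else
      _ -> default
    end
  end
end

SortParams.to_order_by(%{"sort_by" => "name", "sort_dir" => "desc"})
# => [desc: :name]
```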

You can even do it with multiple order_by clauses as I recall; I think they get combined in order. It all depends on how you are giving the information to it. ^.^