Elixir: Using Absinthe to query Dgraph, a graph database. GraphQL to GraphQL+ mapping

I am using Absinthe to build a GraphQL API. The datastore is Dgraph, which uses GraphQL+ as its query language. It is similar to GraphQL but not identical.

This would in theory put me in a wonderful situation. A GraphQL query like

query {
  user {
    id
    username
    posts {
      title
      text
      comments {
        text
      }
    }
  }
}

could also be just one query in Dgraph. It would look almost identical:

{
  users(func: has(type_user))
  {
    id
    username
    posts {
      title
      text
      comments {
        text
      }
    }
  }
}

This power of graph databases to load complex relations in one go is something I would like to use. The problem is: in Absinthe the schema is supposed to be composable. The schema would have one :user object with a :posts field that is a list_of(:post), then a :post object, and so on.

To help prevent N+1 queries you would use dataloader or batch loading.
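
For reference, the composable schema described above could look roughly like this. This is only a sketch: the field names are taken from the queries above, MyApp.Blog is an assumed Dataloader source, and the schema would additionally need the usual context/1 and plugins/0 setup for Dataloader.

defmodule MyApp.Schema.Types do
  use Absinthe.Schema.Notation

  import Absinthe.Resolution.Helpers, only: [dataloader: 1]

  object :user do
    field :id, :id
    field :username, :string
    # Batched via Dataloader to avoid N+1 queries; :comments on :post
    # could be resolved the same way.
    field :posts, list_of(:post), resolve: dataloader(MyApp.Blog)
  end

  object :post do
    field :title, :string
    field :text, :string
    field :comments, list_of(:comment)
  end

  object :comment do
    field :text, :string
  end
end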

Now I could just load everything in one go. I could for example write a resolver that does just that:

defmodule MyApp.Resolvers.User do

  alias MyApp.Users

  def users(_, _args, _) do
    {:ok, Users.all()}
  end

end
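
Wired into the schema, that could look like this (a sketch; the :user_queries object name is an assumption):

object :user_queries do
  field :users, list_of(:user) do
    resolve &MyApp.Resolvers.User.users/3
  end
end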

And the user context that actually queries the database:

defmodule MyApp.Users do

  alias MyApp.Users.User

  def all do
    query = """
      {
        users(func: has(type_user))
        {
          id
          username
          posts {
            title
            text
            comments {
              text
            }
          }
        }
      }
    """

    case ExDgraph.query(conn(), query) do
      {:ok, msg} ->
        %{result: %{users: users}} = msg
        Enum.map(users, fn x -> struct(User, x) end)

      {:error, _error} ->
        []
    end
  end
  
end

The issue here is that I overfetch. I ALWAYS query everything, even if I only want a list of users. This works but is not very good performance-wise. And I lose the composability.

A solution would be to have access to the query inside the resolver, so I could see which fields are requested. I could then use pattern matching to build the Dgraph query and send it off. I could even have one central resolver and many query builders. But I would need to hook into the incoming query and parse it directly.
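
As a rough illustration of that idea, one central resolver could take a per-type query builder and defer to it. Everything here is made up for the sketch: the builder's build/1 callback and MyApp.Dgraph.run/1 stand in for whatever actually builds and executes the query.

defmodule MyApp.Resolvers.Central do
  # One central resolver: it hands the Absinthe resolution to a per-type
  # query builder and runs the resulting query string against Dgraph.
  # `builder.build/1` and `MyApp.Dgraph.run/1` are placeholders.
  def resolve_with(builder) do
    fn _parent, _args, resolution ->
      resolution
      |> builder.build()
      |> MyApp.Dgraph.run()
    end
  end
end

A query field would then use something like resolve MyApp.Resolvers.Central.resolve_with(MyApp.QueryBuilders.User).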

Is something like that possible? Any idea where I could find something interesting to solve this? Maybe with Absinthe middleware?

Thanks!


Yeah, it’s possible! ^^

I’ve been working on a Dgraph library that does exactly that (not open source atm, but hopefully soon, if this pull is accepted). Here is the parser:

defmodule Vortex.Query.Builder do
  @moduledoc """
  A helper module that builds a valid Dgraph query based on Absinthe projection info
  """
  alias Absinthe.Blueprint.Document.Field
  alias Absinthe.Blueprint.Input.Argument

  alias Vortex.Query
  alias Vortex.Parser.State

  @valid_languages Application.fetch_env!(:okami, :valid_languages)
  @fallback_language Application.fetch_env!(:okami, :fallback_language)

  @doc """
  Converts the given Absinthe Info into a valid Dgraph query based on the given Vertex.

  The given `language` is used to translate every string
  """
  def build_query(vertex, info, language \\ @fallback_language) do
    build_query_with_fields(vertex, Absinthe.Resolution.project(info), language)
  end

  @doc """
  Converts the given Absinthe Field list into a valid Dgraph query based on the given Vertex
  """
  def build_query_with_fields(vertex, fields, language \\ @fallback_language) do
    fields
    |> map_fields(vertex, language)
    |> Query.stringify_fields()
  end

  # Get the fields info and translate it into a common "schema" called `Vortex.Parser.State`
  defp map_fields(fields, vertex, language) when is_list(fields) do
    fields
    |> Enum.map(&map_field(vertex, &1, language))
    |> Enum.reject(&is_nil(&1.name))
  end

  # Convert the given field metadata into `Vortex.Parser.State`
  defp map_field(
         vertex,
         %Field{name: name, selections: childs, arguments: args},
         language
       ) do
    field_name = name_to_atom(name)
    translatable? = is_translatable(vertex, field_name)
    language = resolve_language(language)

    %State{
      name: resolve_name(vertex, field_name, translatable?, language),
      func: resolve_func_args(vertex, args, field_name),
      childs: resolve_childs(vertex, field_name, childs, language),
      resolve: field_resolve(vertex, field_name)
    }
  end

  # TODO: Would it be better to do this via the Field struct properties?
  defp is_translatable(vertex, name) do
    case vertex.__schema__(:field, name) do
      %{type: :string} ->
        true

      _ ->
        false
    end
  end

  # Check the given language is a valid one
  defp resolve_language(language) when language not in @valid_languages, do: @fallback_language
  defp resolve_language(language), do: language

  # Check if the field is translatable (is string) and apply the language,
  # If not just alias the name with the real field name
  # If don't exist return nil (and don't process)
  defp resolve_name(vertex, name, is_translatable?, language) do
    on_fields? = name in vertex.__schema__(:fields)

    cond do
      on_fields? and is_translatable? ->
        "#{name}: #{vertex.__schema__(:name)}.#{name}@#{language}:."

      on_fields? ->
        "#{name}: #{vertex.__schema__(:name)}.#{name}"

      true ->
        nil
    end
  end

  # Get the function arguments, based on the Absinthe Input, ignore non-valid ones
  defp resolve_func_args(_vertex, [], _field_name), do: nil

  defp resolve_func_args(vertex, args, field_name) do
    # TODO: Handle input_object arguments (nested arguments)

    Argument.value_map(args)
    |> Map.to_list()
    |> Enum.map(fn {arg_name, value} ->
      vertex.__schema__(:argument, field_name, arg_name, value)
    end)
    |> Enum.reject(&is_nil/1)
  end

  # Resolve the field childs
  defp resolve_childs(_vertex, _name, [], _language), do: nil

  defp resolve_childs(vertex, field_name, childs, language) do
    case vertex.__schema__(:field, field_name) do
      %Vortex.Field{child: nil} ->
        nil

      %Vortex.Field{child: model} ->
        map_fields(childs, model, language)

      _ ->
        nil
    end
  end

  defp field_resolve(vertex, field_name) do
    vertex.__schema__(:resolve, field_name)
  end

  # Convert the name into a valid atom
  defp name_to_atom(name) do
    Regex.replace(~r/([A-Z])/, name, "_\\0")
    |> String.downcase()
    |> String.to_existing_atom()
  end
end

So basically you can get the Absinthe request data with the project function. The info parameter is the one given to the resolve function in Absinthe; it gives you everything you need to build the query by parsing the Field structs (see the map_field function in my code example), and you can get the childs by recursion ^^
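
For completeness, a minimal sketch of that inside a plain resolver (nothing library-specific here; it only shows what project/1 hands you, and the module name is made up):

defmodule MyApp.Resolvers.Inspect do
  def users(_parent, _args, resolution) do
    # Absinthe.Resolution.project/1 returns the child fields requested for
    # the current field as %Absinthe.Blueprint.Document.Field{} structs.
    fields = Absinthe.Resolution.project(resolution)

    # Each field carries :name, :arguments and nested :selections, so a
    # query builder can recurse over them.
    IO.inspect(Enum.map(fields, & &1.name))

    {:ok, []}
  end
end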

I was really excited about Dgraph’s official GraphQL support, which could simplify a lot of this, but it seems like we’ll have to wait quite a while for that.

Hope that helps


Thank you for your input! I will have a look at it. I’m not sure yet if it solves my issue, but I have already looked at Absinthe’s project function.

I also replied to your PR on my client.

I think I found a solution. Let’s take a resolver that gets you a list of users. Something like this:

object :user_queries do
  field :users, list_of(:user) do
    resolve fn _parent, _, resolution ->
      IO.inspect(resolution.definition.selections)
      {:ok, Users.all()}
    end
  end
end

If you hit it with a nested query like:

{
  users {
    id
    name
    posts {
      id
      title
      text
      comments {
        id
        text
      }
    }
  }
}

you will see a complex nested structure in the terminal, and it contains all the info we need. The selections contain the fields, each field has a name, and if a field has a nested object it in turn contains a list of selections.

Now we only have to recurse our way down the rabbit hole, collect the info for our Dgraph query, and build it accordingly. And since the request has already passed Absinthe’s validation, this looks like a good option.
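
A rough sketch of that recursion, assuming every selection is a plain field (fragments, aliases and arguments are ignored here, and the module name is made up):

defmodule MyApp.DgraphQueryBuilder do
  alias Absinthe.Blueprint.Document.Field

  # Build the users query from the selections found in
  # resolution.definition.selections.
  def users_query(selections) do
    "{ users(func: has(type_user)) { #{render_fields(selections)} } }"
  end

  defp render_fields(selections) do
    selections
    |> Enum.map(&render_field/1)
    |> Enum.join(" ")
  end

  defp render_field(%Field{name: name, selections: []}), do: name

  defp render_field(%Field{name: name, selections: childs}),
    do: "#{name} { #{render_fields(childs)} }"
end

In the resolver above this would be called as MyApp.DgraphQueryBuilder.users_query(resolution.definition.selections), and the resulting string goes to ExDgraph.query/2 just like before.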

It will be interesting to see whether an approach like that leads to problems with the optimizations Absinthe comes with, like in-memory caching of already fetched objects…


Well, that solution is exactly what the code I gave you does. If you check the map_field function, it takes an Absinthe.Blueprint.Document.Field, checks the name and, if it has selections (childs), maps them as well. The rest of the code is there to support some functionality of my library, but your solution is essentially the same as mine ^^

@ospaarmann Hey, I’m thinking of using Dgraph for a social network type web application.

I want to know how it’s working out for you. How do we handle authentication in Dgraph?

Any idea when ExDgraph will be close to production ready? Not that I’m impatient or anything. LOL. I just stumbled into Dgraph and it looks really interesting. Gonna dig into the docs.

Hey there, I am still missing transaction support. And there are two issues that I haven’t been able to resolve so far (see GitHub). After that it should be good to go. If you want it quicker you can always contribute :wink:
