Absinthe serialization vs field type mismatch?

I’ve been running into an issue with Absinthe where the type of the data returned in a query doesn’t match up with the type specified in the field.

As far as I can tell, for many of the basic scalars (integer, float, boolean), Absinthe passes along whatever is returned by the field’s resolver function without attempting to cast to the declared type.

Just to make sure, I started a fresh project and followed the How To GraphQL example through the first query at the bottom of https://www.howtographql.com/graphql-elixir/2-queries/ and then changed the fields to:

  object :link do
    field :id, non_null(:float)
    field :url, non_null(:integer)
    field :description, non_null(:boolean)
  end

and resolver to

  def all_links(_root, _args, _info) do
    {:ok, [%{id: "Hello", url: ["World", "!"], description: %{:ok => :error}}]}
  end

which happily returned the query result

  "data": {
    "allLinks": [
      {
        "id": "Hello",
        "url": ["World", "!"],
        "description": {
          "ok": "error"
        }
      }
    ]
  }
Does outgoing type validation have to be pushed down into the resolvers? Allowing arbitrary types to be returned in a field breaks the contract the schema makes with the client; it feels like Absinthe should try to cast the data to the declared type and raise an error when it can't.
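In the meantime, one workaround I can think of is a custom scalar whose `serialize/1` callback validates the outgoing value itself, since Absinthe does invoke `serialize/1` when building the result. This is only a sketch (the module name, scalar name, and error message are mine, not from the tutorial):

```elixir
defmodule MyAppWeb.Schema.StrictTypes do
  use Absinthe.Schema.Notation

  # A strict integer scalar: unlike the built-in :integer, its serialize/1
  # refuses non-integer values instead of passing them through.
  scalar :strict_integer, name: "StrictInt" do
    parse fn
      %Absinthe.Blueprint.Input.Integer{value: value} -> {:ok, value}
      _ -> :error
    end

    serialize fn
      value when is_integer(value) -> value
      other -> raise ArgumentError, "expected an integer, got: #{inspect(other)}"
    end
  end
end
```

Then declaring `field :id, non_null(:strict_integer)` would at least fail loudly when a resolver returns the wrong type, rather than silently leaking it to the client.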


Hi @jfeng,
did you come up with a solution?

Also similar threads:

Ecto Changeset vs Absinthe Scalar Type Validation?

Absinthe Serialization & Validation

Nope, but judging from what was said in the Absinthe Serialization & Validation thread, it looks like a fix is coming!