Having trouble deserializing data from Postgres to be compatible with a GraphQL query

I’m having trouble formulating what I want to ask, so I’ll give you a bit of backstory:

I’m working on a browser-based 3D VR game. Using Babylon.js I’m able to render 3D objects such as floors, walls (meshes), lights, etc. in the browser via WebGL. These things are part of a “scene”, which is a collection of all the things we can interact with or render in 3D, purely on the client side.

Now I want to create a GraphQL API that defines a spec for creating scenes, so GraphQL can give client-side type validation for a scene, but I just want to persist the entire scene data as a string or jsonb in Postgres (the backend doesn’t need a separate Ecto schema backing each type).

Here is a small snippet of the GraphQL schema:

  mutation do
    field :create_scene, :scene do
      arg(:input, non_null(:scene_input))
      # resolver module name assumed; the function is shown further down
      resolve(&VRMeetWeb.Resolvers.Rooms.create_scene/3)
    end
  end

  @desc "Input for creating scene"
  input_object :scene_input do
    field :name, non_null(:string)
    field :floors, list_of(:floor_input)
    # TODO... add more lists of other types
    # TODO... can we have a list of a polymorphic or union type?
  end

  @desc "A scene contains all 3D objects that are loaded together to set the stage"
  object :scene do
    field :id, :id
    field :slug, :string
    field :name, :string
    field :description, :string
    field :floors, list_of(:floor)
  end

  object :floor do
    field :name, :string
    field :pattern, :floor_pattern
    field :width, :float
    field :length, :float
  end

  enum :floor_pattern do
    # the error message further down shows at least this value exists
    value(:grid)
  end

But it occurred to me that I would be adding many different kinds of object types to the scene, and I didn’t want to create a new column in the database for each new type. I just wanted one data blob. So my table schema looks like this:

defmodule VRMeet.Rooms.Scene do
  use Ecto.Schema
  import Ecto.Changeset

  @primary_key {:id, :binary_id, autogenerate: true}
  @foreign_key_type :binary_id
  schema "scenes" do
    # a :map field's default should be a map, not a string; the
    # NOT NULL constraint belongs in the migration, not the schema
    field :data, :map, default: %{}
    field :description, :string
    field :name, :string
    field :slug, :string
  end
end

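For completeness, a minimal migration sketch for this table (not from the post; table and column names are assumed from the schema above, and the `NOT NULL` constraint lives here on the database side):

```elixir
defmodule VRMeet.Repo.Migrations.CreateScenes do
  use Ecto.Migration

  def change do
    create table(:scenes, primary_key: false) do
      add :id, :binary_id, primary_key: true
      add :name, :string
      add :slug, :string
      add :description, :string
      # :map columns are stored as jsonb on Postgres
      add :data, :map, null: false, default: %{}

      timestamps()
    end
  end
end
```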

And the resolver for the mutation looks like this:

  def create_scene(_, %{input: params}, _) do
    # I just stuffed the params into the data field
    new_params = %{name: params.name, data: params}

    case Rooms.create_scene(new_params) do
      {:error, _} ->
        {:error, "Could not create scene"}

      {:ok, _} = success ->
        success
    end
  end
This seems to persist the correct JSON into the “data” field, but pulling the data back out with GraphQL queries doesn’t quite work as expected.

First of all, the Repo query returns the data field with string keys, and I think Absinthe wants atom keys. But even if I atomize all the keys in the data field, a query like this will fail:

  scenes {
    floors {
      pattern
    }
  }

It fails on the pattern enum with:

 (KeyError) key "grid" not found in: %{grid: %Absinthe.Type.Enum.Value{__reference__: %{location: %{file: "/home/homan/Documents/vrmeet/lib/vrmeet_web/schema.ex", line: 44}, module: VRMeetWeb.Schema}, deprecation: nil, description: nil, enum_identifier: :floor_pattern, name: "GRID", value: :grid}}

I think it’s something around how the enum is represented. In the DB it is persisted as “grid”, but maybe Absinthe is expecting the atom :grid.

Is this approach wrong? I’m just starting out, and totally new to Absinthe, any advice is appreciated.


The second question I have about this design: eventually I want to store more and more different types (shields, bats, guns, cars, etc.). Can GraphQL support a polymorphic input type, where everything inherits from, say, a Mesh type, but each subclass can have different attributes? A Scene would then just be a giant list of polymorphic types. And again, I don’t think the backend persistence needs to care about the types at all.
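For what it’s worth, GraphQL (and therefore Absinthe) supports polymorphism on *output* types via interfaces and unions, but the spec has no polymorphic *input* types, so :scene_input would need a workaround (for example, one input object per concrete type, each as an optional list field). A sketch of the output side, where the :shield type, its fields, and the "type" discriminator key are invented for illustration:

```elixir
  interface :mesh do
    field :name, :string

    # pick the concrete object type from a discriminator stored
    # in the data blob (the "type" key is an assumption)
    resolve_type(fn
      %{type: "floor"}, _ -> :floor
      %{type: "shield"}, _ -> :shield
      _, _ -> nil
    end)
  end

  object :shield do
    interface(:mesh)
    # implementing objects must re-declare the interface fields
    field :name, :string
    field :durability, :integer
  end
```

:floor would likewise declare `interface(:mesh)`, and :scene could then expose `field :meshes, list_of(:mesh)` instead of one list per type.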


@benwilson512 any advice?

Sorry, not familiar with Absinthe specifically.

Can you change your function that fetches from the db to turn all the string keys into atoms before returning?
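Something like this recursive pass could do it (a sketch; the module and function names are mine, and it assumes every map key is a string, as loaded from jsonb):

```elixir
defmodule VRMeet.Rooms.SceneData do
  @moduledoc """
  Sketch: convert the string-keyed map loaded from the jsonb column
  into the atom-keyed shape Absinthe expects.
  """

  # Uses String.to_existing_atom/1 so unknown keys raise
  # instead of leaking new atoms into the VM.
  def atomize(map) when is_map(map) do
    Map.new(map, fn {k, v} -> {String.to_existing_atom(k), atomize(v)} end)
  end

  def atomize(list) when is_list(list), do: Enum.map(list, &atomize/1)
  def atomize(other), do: other
end
```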

Yes, I tried that. It works, but not for enum fields, because I think Absinthe needs those values to be atoms too.
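One way to handle that enum case (a sketch, untested against this schema, assuming the blob stores the pattern as the string "grid"): give :pattern a field-level resolver that converts the stored string into an existing atom before Absinthe serializes it:

```elixir
  object :floor do
    field :name, :string

    field :pattern, :floor_pattern do
      resolve(fn floor, _args, _resolution ->
        case Map.get(floor, :pattern) do
          value when is_binary(value) ->
            # "grid" -> :grid; raises on unknown strings, which is
            # probably what you want for a fixed enum
            {:ok, String.to_existing_atom(value)}

          value ->
            # nil, or already an atom
            {:ok, value}
        end
      end)
    end

    field :width, :float
    field :length, :float
  end
```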