How do I load Ecto structs from a PostgreSQL function?

I’ve defined a table in PostgreSQL to store a NonBinarySearchTree. The table is named nodes.
Each row contains an id and, optionally, a parent_id (a self-referencing foreign key).
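
For context, a minimal schema for such a table could look like this (module and association names here are illustrative):

    defmodule MyApp.Node do
      use Ecto.Schema

      # UUID primary key plus a self-referencing parent_id foreign key.
      @primary_key {:id, :binary_id, autogenerate: true}
      @foreign_key_type :binary_id
      schema "nodes" do
        belongs_to :parent, __MODULE__, foreign_key: :parent_id
        has_many :children, __MODULE__, foreign_key: :parent_id
      end
    end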

For ease and convenience I’ve defined two PostgreSQL functions, all_ancestors_of(id) and all_descendants_of(id).

The migration defining the ancestors function looks like this:

    def up do
      execute """
      create function all_ancestors_of(node_id UUID) returns SETOF nodes AS
      'WITH RECURSIVE ancestors AS (
        SELECT
          *
        FROM
          nodes
        WHERE
          nodes.id = all_ancestors_of.node_id

        UNION

        SELECT
          parents.*
        FROM
          nodes AS parents
        INNER JOIN ancestors a ON a.parent_id = parents.id
      ) SELECT * FROM ancestors WHERE id != all_ancestors_of.node_id;
      ' LANGUAGE SQL;
      """
    end
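
The matching down/0 simply drops the function again (a sketch; adjust to however you handle rollbacks):

    def down do
      execute "drop function all_ancestors_of(UUID);"
    end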

This works beautifully in SQL, but I’m struggling to use these functions via Ecto.

How do I get the list of nodes (I have a schema for them) without resorting to

    res = Ecto.Adapters.SQL.query!(Repo, "SELECT * from all_ancestors_of($1)", [uuid])
    Enum.map(res.rows, &Repo.load(Node, {res.columns, &1}))

?

I was hoping to be able to do something like

    query = from a in fragment("SELECT * FROM all_ancestors_of(?)", ^uuid)
    nodes = Repo.all(query)
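
As far as I can tell, the closest working spelling still needs an explicit select, because Ecto cannot infer a schema from a fragment source (a sketch, assuming Node only has id and parent_id):

    import Ecto.Query

    query =
      from a in fragment("SELECT * FROM all_ancestors_of(?)", type(^uuid, Ecto.UUID)),
        select: %Node{id: a.id, parent_id: a.parent_id}

    nodes = Repo.all(query)

That avoids the raw SQL string, but it still means duplicating the field list for every such function.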

Hello, could Arbor (the lib or its source code) be helpful?

Arbor

Ecto adjacency list and tree traversal using CTEs. Arbor uses a parent_id field and CTEs to create simple deep tree-like SQL hierarchies. […]

Thanks! I haven’t looked at Arbor’s source yet, and I’m reluctant to use it off the shelf; it would be complete overkill for my use case.

Leaving my specific use case aside: does anyone have a neat approach for loading a function’s result set as structs of a given schema using Ecto?

Some (partially outdated) info:

UPDATE: for Ecto 3 you can use

    __MODULE__.load(model, fields)
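
Alternatively, a way to avoid raw SQL (and load/2) entirely is to join the schema onto the function’s result set, so Repo.all/1 returns %Node{} structs directly (a sketch, assuming the Node schema and the all_ancestors_of function from earlier in the thread):

    import Ecto.Query

    ancestors =
      Repo.all(
        from n in Node,
          join: a in fragment("SELECT id FROM all_ancestors_of(?)", type(^uuid, Ecto.UUID)),
          on: n.id == a.id
      )

The extra join is an id-to-id match against the already-computed result set, so it should stay cheap.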