Using an Ash Resource with a Dynamic Repository

Hi, we are writing an application that will have its own single, static database, as well as needing to manage connections to an arbitrary number of other databases. Ecto supports this, and I wonder if there are any Ash-specific twists to the approach?
If it simplifies things, I think some resources would always come from a preconfigured database, while some other resources would always come from dynamic repos.

The way this generally works with Ash is that you set a context in your action to set (or override) the default repo. For example:

read :read do
  prepare fn query, _ -> 
    repo = figure_out_repo()
    Ash.Query.set_context(query, %{data_layer: %{repo: repo}})
  end
end

You can abstract this into a module-backed preparation and attach it via the global preparations block in the resource, so it runs on all queries:

preparations do
  prepare SetDynamicRepo
end
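For example, such a module-backed preparation could look roughly like this (a hedged sketch: `SetDynamicRepo` and `figure_out_repo/0` are hypothetical names, and `prepare/3` is the `Ash.Resource.Preparation` callback):

```elixir
defmodule MyApp.Preparations.SetDynamicRepo do
  use Ash.Resource.Preparation

  @impl true
  def prepare(query, _opts, _context) do
    # Decide which repo to use, however makes sense for your app,
    # then stash it in the data layer context for this query.
    repo = figure_out_repo()
    Ash.Query.set_context(query, %{data_layer: %{repo: repo}})
  end

  defp figure_out_repo do
    # Placeholder: look up the tenant's repo module here.
    MyApp.Repo
  end
end
```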

Not necessarily a full breakdown, but that should give you a starting point šŸ™‚


It seems that preparations are for read actions only. Is there something more generic that would allow the correct repo to be used for actions that write to the database?

In our preparation we have tried the process-scoped setting Ecto.Repo.put_dynamic_repo/1 (this works from IEx, such that MyApp.DynamicRepo.query!(some_sql) is sent to the correct database); however, queries generated by Ash give a permission-denied error. It looks like Ash runs the query in a different process from the one that ran the preparation with put_dynamic_repo.

Is there some way to intercept a query in the process that runs it, in a way that applies to all action types, not just read? If I could put_dynamic_repo there I think it would work.

So, there are two relevant answers here.

Changes are the equivalent of preparations for other action types

Preparations are for read actions only, but changes can do the same thing for all other action types.

changes do
  change SetDynamicRepo
end
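A sketch of what that change module might look like (hypothetical names, mirroring the preparation; `change/3` is the `Ash.Resource.Change` callback, and `Ash.Changeset.set_context/2` is the changeset counterpart of `Ash.Query.set_context/2`):

```elixir
defmodule MyApp.Changes.SetDynamicRepo do
  use Ash.Resource.Change

  @impl true
  def change(changeset, _opts, _context) do
    # Same idea as the preparation, but for create/update/destroy actions.
    repo = figure_out_repo()
    Ash.Changeset.set_context(changeset, %{data_layer: %{repo: repo}})
  end

  defp figure_out_repo do
    # Placeholder: look up the tenant's repo module here.
    MyApp.Repo
  end
end
```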

Keeping process context

For this, you need to add an Ash.Tracer. Ash.Tracer can "move" process context from one process to any process that Ash starts.

defmodule YourApp.DynamicRepoTracer do
  use Ash.Tracer

  @impl Ash.Tracer
  def get_span_context() do
    YourApp.Repo.get_dynamic_repo()
  end

  @impl Ash.Tracer
  def set_span_context(repo) do
    YourApp.Repo.put_dynamic_repo(repo)
  end
end

And then you can configure the tracer statically:

config :ash, tracer: [YourApp.DynamicRepoTracer]

YMMV on using put_dynamic_repo, but I'll be interested to hear how it goes if you go that route.


Thanks for the advice @zachdaniel, it didn't pan out that way but we did find something that works.

Ash.Tracer has get_span_context/0 and set_span_context/1 however the problem remained of finding a nice way to execute custom code to fetch the desired repo from the span context in the querying process.

I also looked into supplying a function instead of a module name to the repo declaration in the postgres block of our resources, however that function only receives the resource and the operation type, not the queryable which is what we need to decide which repo to use.

My colleague wrote a module that reimplements various functions generated by use Ecto.Repo; each one checks for the data placed by our preparation, uses that to get the right repo, and passes it on to Ecto.Repo.Queryable.

I wasn't comfortable with that approach but, unable to find another way, settled on something similar. I wrote a wrapper module whose __using__ macro walks module_info(:exports) of the target module, wrapping every user function so that if the first argument has the repo identifier within it, the repo is obtained and put_dynamic_repo/1 is called with it. Then the original function is called unconditionally.
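Roughly, that wrapper could be sketched as below. This is a hypothetical reconstruction, not the actual code: the module name, the `:target` option, and the `%{context: %{dynamic_repo: pid}}` shape of the first argument are all invented for illustration.

```elixir
defmodule Dyndb.RepoWrapper do
  # Usage (hypothetical): `use Dyndb.RepoWrapper, target: Dyndb.Repo`
  defmacro __using__(opts) do
    target = Macro.expand(Keyword.fetch!(opts, :target), __CALLER__)

    # Generate one delegating function per export of the target module.
    for {name, arity} <- target.module_info(:exports),
        name not in [:module_info, :__info__] do
      args = Macro.generate_arguments(arity, __MODULE__)

      quote do
        def unquote(name)(unquote_splicing(args)) do
          # If the first argument carries a dynamic repo pid, bind it
          # in the calling process before delegating.
          case unquote(args) do
            [%{context: %{dynamic_repo: pid}} | _] when is_pid(pid) ->
              unquote(target).put_dynamic_repo(pid)

            _ ->
              :ok
          end

          # Call the original function unconditionally.
          unquote(target).unquote(name)(unquote_splicing(args))
        end
      end
    end
  end
end
```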

This late binding of the indicated repo to the process dictionary of the process calling the repo functions is the only thing we've found that works.

This definitely sounds unideal and I feel confident we could find a way to do this better. Would it be at all possible for you to make a small example repo that has one or two resources and has the same dynamic repo set up?

I'll see what I can do, maybe on the weekend. For multi-tenancy in Ash there is the PostgreSQL schema approach, but what we're trying for is a different database per tenant (and so each requires a different connection/process, and the whole put_dynamic_repo/1 rabbit hole). I know database-per-tenant has its own trade-offs, but we're integrating with a pre-existing system, so we're stuck with it, for some resources, at least for now.

Thanks for your input and more importantly thanks for Ash!

Totally, makes sense. We ultimately want Ash to be able to support all of those models too, even though database-per-tenant will naturally be a bit more challenging than the alternatives. I could even see a case for building database-per-tenant multi-tenancy into ash_postgres as a first-class thing at some point.


Tragically, the fix I found worked in Ash v2.18.2 and Ash Postgres v1.4.0, but upgrading to v2.19.x and v1.5.x respectively broke it.

It seems that the refactoring done in Ash to improve performance and run queries async has led to calls to in_transaction? being made with no arguments present that indicate which database to use.

I've created a git repository that doesn't attempt this macrology to wrap the repo, and instead just tries calling put_dynamic_repo/1 from a preparation, which of course does not work:

Perfect, I will look at this this week, thanks.


BTW we've tried this in a preparation:
Ash.Query.set_context(query, %{data_layer: %{repo: dynamic_repo_pid}})

However, it leads to an ArgumentError (1st argument: not an atom) from AshPostgres.DataLayer.run_query/2. I believe this is because repo is expected to be an atom naming a repo module, not a pid corresponding to a connection using a repo module.

[edit: that's a non-starter, but it led on to a possible solution…]

I found a minimal change to Ash.ProcessHelpers that makes carrying the Ecto dynamic repo info around possible:

That enables this preparation to work:

      prepare fn query, _context ->
        case query do
          %{arguments: %{tenant: tenant}} ->
            pid = Dyndb.Repo.get_connection!(tenant)
            Dyndb.Repo.put_dynamic_repo(pid)
            Ash.set_context(%{dynamic_repo_module: Dyndb.Repo})

          _ ->
            :ok
        end

        query
      end

So, I added a comment to the PR you linked, but I believe the preparation you added plus a custom tracer ought to do what you need. The custom tracer example is here: Using an Ash Resource with a Dynamic Repository - #10 by zachdaniel


I've tried that again and I'm running into the same problem as before: in the preparation I set the span context (which does a Process.put) and set the tracer in the query context, but then get_span_context (which would do a Process.get and then call put_dynamic_repo) is never called.

      prepare fn query, _context ->
        case query do
          %{arguments: %{tenant: tenant}} ->
            pid = Dyndb.Repo.get_connection!(tenant)
            Ash.Tracer.set_span_context(Dyndb.DynamicRepoTracer, {Dyndb.Repo, pid})
            Ash.Query.set_context(query, %{tracer: Dyndb.DynamicRepoTracer})

          _ ->
            query
        end
      end

…and the tracer…

defmodule Dyndb.DynamicRepoTracer do
  use Ash.Tracer

  @impl Ash.Tracer
  def get_span_context() do
    {repo_module, repo_pid} = Process.get(:dynamic_repo_tracer)
    repo_module.put_dynamic_repo(repo_pid)
  end

  @impl Ash.Tracer
  def set_span_context({_repo_module, _repo_pid} = dynamic_repo) do
    Process.put(:dynamic_repo_tracer, dynamic_repo)
  end
end

Did you configure your app to use the tracer?

config :ash, tracer: [Dyndb.DynamicRepoTracer]

Apologies if I forgot to mention that.


Hey, I got it working with the tracer; the problem was that I hadn't used tracers before and couldn't find examples of their use. Now I'm worried that the tracer is being called everywhere: is there a way to only turn on the tracer for the duration/scope of a dynamic repo query?

Thanks for all your help!

The preparation:

      prepare fn query, _context ->
        case query do
          %{arguments: %{tenant: tenant}} ->
            pid = Dyndb.Repo.get_connection!(tenant)
            Dyndb.DynamicRepoTracer.set_span_context({Dyndb.Repo, pid})

          _ ->
            :ok
        end

        query
      end

…and the custom part of the tracer…

  @impl Ash.Tracer
  def get_span_context() do
    Process.get(:dynamic_repo_tracer)
  end

  @impl Ash.Tracer
  def set_span_context({repo_module, repo_pid} = dynamic_repo) do
    Process.put(:dynamic_repo_tracer, dynamic_repo)
    repo_module.put_dynamic_repo(repo_pid)
    :ok
  end

The GitHub repo has been updated.

I don't think there is currently a way to conditionally enable the tracer, or at least not one that I think would be seamless for you. However, Process.get/1 is very cheap, so if your set_span_context just handles the case where that value is nil, it shouldn't be a concern.
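Concretely, that nil-guard could look like this, extending the Dyndb.DynamicRepoTracer shown earlier (the extra nil clause is my addition, not code from the thread):

```elixir
defmodule Dyndb.DynamicRepoTracer do
  use Ash.Tracer

  @impl Ash.Tracer
  def get_span_context() do
    Process.get(:dynamic_repo_tracer)
  end

  @impl Ash.Tracer
  def set_span_context({repo_module, repo_pid} = dynamic_repo) do
    Process.put(:dynamic_repo_tracer, dynamic_repo)
    repo_module.put_dynamic_repo(repo_pid)
    :ok
  end

  # When get_span_context/0 ran in a process that never set a dynamic
  # repo, it returns nil; accept that instead of crashing on the match.
  def set_span_context(nil), do: :ok
end
```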
