In my system, during a process step, I generate an identical copy of table A's schema in a new table, B.
After this I want to insert data into B, but since I don't have an Elixir resource linked to that table, I have to do it with a plain `Repo.insert`. That's bothersome because my custom types (uuid, atom, etc.) won't be cast automatically.
So I was wondering: is it possible, at runtime, to use the Ash resource for table A but write to table B?
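For context, here is roughly what I mean. This is a hedged sketch assuming AshPostgres honors a `table` override passed through the `data_layer` context (the mechanism polymorphic resources use); whether `schema` can be overridden the same way is exactly what the rest of this post is about. The attribute name and `Ash.create!` call are illustrative:

```elixir
# Sketch: reuse the Entity resource (backed by table A) but write to table B.
# The :name attribute is a placeholder; adjust to the real resource.
Pacman.Markets.Entity
|> Ash.Changeset.for_create(:create, %{name: "example"})
|> Ash.Changeset.set_context(%{data_layer: %{table: "temp_entities"}})
|> Ash.create!()
```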
When I use that action in a bulk_create call, it raises an error saying it couldn't find the temp_entities table (even though the table exists inside the temp schema), so only the schema override is not being applied.
If I add `schema "temp"` to the `postgres` block, it works fine.
@zachdaniel I just noticed that `Ash.Query` has the same problem: setting the table works, but setting the schema doesn't.
```
* Context: resolving data on fetch Pacman.Markets.Entity.read_test

** (Postgrex.Error) ERROR 42P01 (undefined_table) relation "public.temp_entities" does not exist
```
The action:

```elixir
read :read_test do
  prepare fn query, _ ->
    Ash.Query.set_context(query, %{data_layer: %{table: "temp_entities", schema: "temp"}})
  end
end
```
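For completeness, this is how I trigger it (illustrative; `Ash.read!` as in Ash 3.x):

```elixir
# Runs the :read_test action above; the table override is applied,
# but the query is still issued against the "public" schema.
Pacman.Markets.Entity
|> Ash.Query.for_read(:read_test)
|> Ash.read!()
```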