Altering database structure at runtime

I have an app where users can create custom data stores, and after trying many different approaches (MongoDB, a JSON column, a single table as a key-value store…) I came to the conclusion that having actual database tables was the best approach.

My issue with Ecto now is how I change the database schema. I can use SQL directly, but writing ALTER TABLE SQL by hand is a bit painful, so I was wondering if there was a way to use the Ecto migration code to generate the SQL.

I tried something like:


|> Multi.run(:sync, fn repo, %{table: table} = ops ->
  Ecto.Migration.create Ecto.Migration.table(:"tbl_#{table.name}") do
    Ecto.Migration.timestamps()
  end
end)

but I get ** (RuntimeError) could not find migration runner process for #PID<0.523.0>

so before I dig further I’d like to get your advice on this.

According to my brief read of the code & docs, Ecto.Migrator is the module designed for executing migrations at run-time. However, there’s no real discussion of assembling a migration at run-time to execute.
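
For a migration module that already exists, running it at run-time looks roughly like this (untested sketch - MyApp.Repo, the module name and the version number are all placeholders):

defmodule MyApp.Migrations.AddWidgetsTable do
  use Ecto.Migration

  def change do
    create table(:widgets) do
      timestamps()
    end
  end
end

# Somewhere at run-time; the version just has to be unique per repo.
Ecto.Migrator.up(MyApp.Repo, 20_240_101_120_000, MyApp.Migrations.AddWidgetsTable)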

Migrations are also designed to run in sequence and to be reversible, so you may get into a tangle if they are only temporarily available.

If I were approaching this, I’d probably dig around the migration source code, find the convenience functions that assemble the DDL from the migration definition, and then bypass the rest of the migration infrastructure and execute the DDL directly. It looks like most of this mapping happens in the ecto_sql package in lib/ecto/migration/runner.ex - check out the private command functions.

Thanks to @mindok’s pointers, I examined the source and found a way to do what I want.

The function I need is execute_ddl/1, which will turn command structs into SQL. This function takes commands which can be built either manually or with a few helpers.

I have not tried everything yet (index…), but I can create a table with the following code:

      cols =
        [
          {:add, "_id", :bigint, []},
          {:add, "_version", :bigint, []},
          {:add, "_inserted_at", :utc_datetime, []}
        ] ++
          Enum.map(cols, fn {_k, v} ->
            # db_name is a string, db_type is an atom, and the last element is a list of options
            {:add, v.db_name, v.db_type, []}
          end)

      {:create, Ecto.Migration.table(:"#{db.name}_#{table.name}"), cols}
      |> Ecto.Adapters.Postgres.Connection.execute_ddl()
      |> IO.iodata_to_binary()
      |> repo.query()
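
For an index, I would expect the same pattern to work with the struct returned by Ecto.Migration.index/2, but I have not tried it yet (untested sketch, the column name is just an example):

      {:create, Ecto.Migration.index(:"#{db.name}_#{table.name}", ["_id"])}
      |> Ecto.Adapters.Postgres.Connection.execute_ddl()
      |> IO.iodata_to_binary()
      |> repo.query()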

Of course I fully realize those APIs are private, but they do the job. I know my use case must be quite marginal, as this is a weird/bad/really specific practice, but maybe they could be exposed as a public API.

I have not found how to do |> Ecto.Adapters.Postgres.Connection.execute_ddl() in a database-agnostic way (i.e. derived from the repo).
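
The closest thing I can think of is deriving the connection module from the repo’s adapter - the ecto_sql adapters seem to follow the convention of naming it <Adapter>.Connection, but that is a private naming convention rather than a public API (untested sketch):

      # e.g. Ecto.Adapters.Postgres -> Ecto.Adapters.Postgres.Connection
      connection = Module.concat(repo.__adapter__(), Connection)

      {:create, Ecto.Migration.table(:"#{db.name}_#{table.name}"), cols}
      |> connection.execute_ddl()
      |> IO.iodata_to_binary()
      |> repo.query()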

I’m not sure it’s such a weird use case. I’d imagine it’s quite common in highly configurable platform software like ERPs or data visualisation platforms.

I also wonder if there may be a requirement to somehow get postgrex to rescan database types (in case you added or modified a type) - postgrex does quite a bit of work at application start, so I wonder how that will interplay. I’m curious and interested in what your experience ends up being on this.

Came across this post while looking for a way to create new schemas and tables in a multi-tenant setup. We needed a way to create a new schema on the fly whenever a user creates an organisation. We have settled on using raw SQL queries to achieve this.

Leaving our solution here for reference and improvements.

  def create_organisation(attrs \\ %{}) do
    %Organisation{}
    |> Organisation.changeset(attrs)
    |> Repo.insert()
    |> create_schema()
    |> create_devices_table()
    |> create_events_table()
    |> broadcast(:org_created)
  end

  defp create_schema({:ok, %Organisation{} = organisation}) do
    # Create an org specific schema - this will be owned by the default db user
    schema_name = organisation.id
    sql = "CREATE SCHEMA \"#{schema_name}\";"
    Repo.query!(sql, [])
    {:ok, organisation}
  end

  defp create_devices_table({:ok, %Organisation{} = organisation}) do
    schema_name = organisation.id
    sql = """
      CREATE TABLE \"#{schema_name}\".devices (
        id VARCHAR PRIMARY KEY,
        inserted_at TIMESTAMP WITHOUT TIME ZONE NOT NULL DEFAULT NOW(),
        updated_at TIMESTAMP WITHOUT TIME ZONE NOT NULL DEFAULT NOW()
      );
    """
    Repo.query!(sql)
    {:ok, organisation}
  end

  defp create_events_table({:ok, %Organisation{} = organisation}) do
    schema_name = organisation.id
    sql = """
      CREATE TABLE \"#{schema_name}\".events (
        id uuid DEFAULT uuid_generate_v4() PRIMARY KEY,
        device_id VARCHAR NOT NULL,
        data JSONB NOT NULL,
        inserted_at TIMESTAMP WITHOUT TIME ZONE NOT NULL DEFAULT NOW(),
        updated_at TIMESTAMP WITHOUT TIME ZONE NOT NULL DEFAULT NOW(),
        CONSTRAINT fk_device
          FOREIGN KEY(device_id)
        REFERENCES \"#{schema_name}\".devices(id)
      );
    """
    Repo.query!(sql)
    {:ok, organisation}
  end
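
One possible improvement (untested sketch, only the schema step shown): since DDL is transactional in Postgres, the insert and the CREATE statements could run in a single Ecto.Multi transaction, so a failed CREATE also rolls back the organisation row.

  def create_organisation(attrs \\ %{}) do
    Ecto.Multi.new()
    |> Ecto.Multi.insert(:organisation, Organisation.changeset(%Organisation{}, attrs))
    |> Ecto.Multi.run(:schema, fn repo, %{organisation: org} ->
      # repo.query/1 already returns {:ok, result} | {:error, reason},
      # which is what Multi.run expects
      repo.query(~s(CREATE SCHEMA "#{org.id}";))
    end)
    |> Repo.transaction()
  end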