Best practice integrating Phoenix with existing database

I’m rather new to Elixir and Phoenix, so the answer might be obvious. All the Phoenix examples I have seen seem to assume a totally new project where the db migrations etc. are handled by Phoenix. I have a project running on Node which works well, so I have no need to replace it all. But my idea was to use Phoenix for building the admin system for this project (and keep Node + Svelte for the client-facing part).

What is the best practice for doing that? Just do everything like normal but skip the migration part? Is there any possibility to “introspect” the db in order to generate the different routes etc. needed?

2 Likes

Hello and welcome,

You can skip migrations, but you need to write the corresponding schemas and contexts.

Then write the UI. The routes depend on whether you choose normal controllers or LiveView.
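As a rough sketch of what a hand-written schema plus a small context over an existing table might look like (all module, table, and field names here are invented for illustration):

```elixir
# A minimal schema mapped to an existing table; no migration needed.
defmodule Admin.Accounts.User do
  use Ecto.Schema

  schema "users" do
    field :email, :string
    field :name, :string
  end
end

# A small context wrapping the queries the admin UI will need.
defmodule Admin.Accounts do
  import Ecto.Query
  alias Admin.Repo
  alias Admin.Accounts.User

  def list_users do
    Repo.all(from u in User, order_by: u.name)
  end
end
```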

Following to see some answers, since I’m just behind you in terms of getting things up and running, but yes, I have noticed this same issue. Actually it’s not even just Phoenix-related; almost any framework these days that includes migrations just assumes you’ll be using it to manage the app from soup to nuts. I have no data to back this up, but I suspect a good number of systems are more like the ones you describe.

Unless the app is very, very simple, I don’t see how you could infer what routes should exist from the structure of the database alone. You could maybe generate some of the Ecto schemas from database introspection, but routing is a matter of mapping external inputs, through business logic, to the database. The database won’t tell you what inputs to expect or how to match them with business logic.

3 Likes

I created the migration files that correspond to the existing database tables. However, I modify the schema_migrations table by hand in the production database to select which migrations I want to be applied.

This allows me to create development and test databases and to apply selected migrations to the production database.
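The hand-editing step can be done from any SQL client, or via the repo itself. A sketch of the idea, assuming Postgres (the repo name and version number are illustrative):

```elixir
# Mark a migration as already applied without running it, by inserting
# its version into schema_migrations by hand. Ecto will then skip it
# on the next `mix ecto.migrate`.
MyApp.Repo.query!(
  "INSERT INTO schema_migrations (version, inserted_at) VALUES ($1, NOW())",
  [20240101120000]
)
```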

4 Likes

I recently migrated a Python/MySQL app to Phoenix and the only thing I really needed to do was use create_if_not_exists.

create_if_not_exists table(:thing)

Then the migrations work fine for tests but don’t do anything in production.

Because I had to use MySQL, I also had to wrap index creation in an environment check, but if you’re using Postgres, I think create_if_not_exists should work fine with indexes too.
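A sketch of what such a migration might look like (the table, column, and module names are made up):

```elixir
defmodule MyApp.Repo.Migrations.CreateThings do
  use Ecto.Migration

  def change do
    # No-op against the production database, where the table already
    # exists; creates the table in fresh dev/test databases.
    create_if_not_exists table(:things) do
      add :name, :string
    end

    # MySQL has no `CREATE INDEX IF NOT EXISTS`, so guard the index by
    # environment. On Postgres, `create_if_not_exists index(...)` can
    # be used instead of this check.
    unless Mix.env() == :prod do
      create index(:things, [:name])
    end
  end
end
```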

I had about 30 tables to create; it wasn’t exactly a fun task, but it only needed doing once, and going through all the detail by hand was quite beneficial to understanding everything.

5 Likes

If you only want to connect to that database, then just skip the migration part. You still need to define your schema, though, like

# Keep the module name short and meaningful
defmodule MyApp.Users.User do
  use Ecto.Schema

  # You need this if the primary key field isn't `id`,
  # or the type is not `:integer`,
  # or it is not auto generated by database.
  @primary_key {:the_primary_key_field_name, :integer, autogenerate: true}

  # The actual table name can be messy,
  # but usually you won't need it in your business logic.
  schema "the_user_table_name_in_the_existing_db" do
    # The `:source` option specifies the name of the actual db column name
    field :full_name, :string, source: "name"
  end
end

For more info, check Ecto.Schema — Ecto v3.7.1
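With the `:source` mapping in place, the rest of the app only sees the clean field name. A query might look like this (the repo and module names follow the example above and are assumptions):

```elixir
import Ecto.Query

# `full_name` transparently maps to the underlying `name` column.
query = from u in MyApp.Users.User, where: like(u.full_name, "%smith%")
MyApp.Repo.all(query)
```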

3 Likes

Chances are if you write tests that you’re going to need at least some migrations to bring the test db up to where it should be to be able to, well, test stuff. Luckily, when connecting to an existing db, you have several options to make migrations only for test purposes, and only the ones you actually need for the part of the db your app interacts with. One is to put them into a different folder, say priv/repo/test_migrations:

mix ecto.gen.migration <YourMigrationName> --migrations-path priv/repo/test_migrations

Then you’ll have to specify that path when running tests.
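One way to wire that in (a sketch, assuming the priv/repo/test_migrations folder above and the usual mix aliases):

```elixir
# In mix.exs: run the test-only migrations before the test suite.
defp aliases do
  [
    test: [
      "ecto.create --quiet",
      "ecto.migrate --quiet --migrations-path priv/repo/test_migrations",
      "test"
    ]
  ]
end
```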
In your use case, considering you’re building the admin for an existing app in Phoenix: I don’t know if auth exists in your Node app (if it does, it won’t be easy to use hashed passwords generated in the Node app to log in to the admin from Phoenix :slight_smile: ), but chances are you’ll either need a user table in your existing db that your Phoenix app uses to persist user data, or your Phoenix app will connect to both the existing db and its own user db. How you connect to two databases (don’t worry, it’s easy) is maybe a little out of scope here, but you can create and apply migrations to selected databases only:

mix ecto.gen.migration <YourMigrationName> -r <RepoName>
mix ecto.migrate -r <RepoName>

1 Like

It seems like mix ecto.dump and mix ecto.load are far better suited for this. Do a dump from the existing database, then before tests do a load, and that’s it, no need to manage extra migrations.
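A possible wiring for that (a sketch; the alias shape follows the usual Phoenix mix.exs):

```elixir
# In mix.exs: `mix ecto.dump` is run once against the existing
# database to produce priv/repo/structure.sql; the test alias then
# loads that structure instead of running migrations.
defp aliases do
  [
    test: ["ecto.create --quiet", "ecto.load --quiet --skip-if-loaded", "test"]
  ]
end
```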

9 Likes

Nice one I wasn’t aware of.

I haven’t tried it, but isn’t there a chance that it’s going to take a while to dump and load a complete db, especially if the project is large?

Also, wouldn’t it be preferable to start tests with an empty db? Just asking.

The dump and load tasks only dump the structure of the database, not its contents.

4 Likes

Fantastic, again something I wasn’t aware of!

For your first migration, run “create if not exists” DDL queries for your existing tables. Then proceed as usual.

I’m not seeing how this is possible if the other application continues to create new migrations. You’d have to regularly mirror those migrations with create-if-not-exists-style changes, which isn’t particularly usual or always easy.

The easiest and safest option is to designate one of the applications as the official DB migrator; the other should always just work off of dumps of the SQL structure.

Oh definitely, if you’re going to have heterogeneous languages/migration tools, you’ll have to designate one as being in charge of migrations. If the admin system is going to be the source of truth and implemented in Elixir, then you might as well go with Ecto migrations and drop Knex or whatever is being used on the Node side.

1 Like

Thanks for all the replies! They helped me better understand how everything is connected. As it looks now, I might actually just build everything using Phoenix, because damn, the developer speed is so much faster than with my previous setup (SvelteKit + Node backend).

4 Likes