What is the standard way to work with existing databases and tables in Ecto in 2025?

I need to communicate with an existing database. I would like to have a schema for the table that I will query. Do I need to create the schema by hand? Or is there a standard tool to do this?

My searches show old libraries that are officially no longer supported. Other suggestions that encourage manually writing the schema also seem pretty old. If doing it manually is the way, I am fine with that; I am just trying to double-check that that is the standard solution for this problem.
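For reference, by "writing the schema by hand" I mean something like this (a minimal sketch assuming an existing `users` table; `MyApp` and the column names are placeholders, and the `ecto` dependency is required):

```elixir
defmodule MyApp.User do
  use Ecto.Schema

  # Maps onto the pre-existing "users" table; no migration is involved.
  # If the table does not follow Ecto's conventions (e.g. non-integer or
  # non-"id" primary key), @primary_key would need to be set accordingly.
  schema "users" do
    field :email, :string
    field :name, :string
    field :inserted_at, :naive_datetime
  end
end
```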

2 Likes

Back in the day I hand-wrote a script to turn a pg_dump into a bunch of Ecto files. It worked, but that was 8+ years ago so I don’t think I have it anymore. Today though? I’d chuck a schema-only pg_dump at an LLM and then review it afterward. Text transformations are its bread and butter.
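For concreteness, a schema-only dump looks something like this (`mydb` and `users` are placeholder names):

```shell
# DDL only, no data, for the whole database
pg_dump --schema-only --no-owner --no-privileges mydb > schema.sql

# ...or restricted to a single table
pg_dump --schema-only --table=users mydb > users_schema.sql
```

The resulting `CREATE TABLE` statements are what you would hand to the LLM (or a script) to turn into Ecto schema modules.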

5 Likes

There are the old ones (plsm comes to mind), and then LLMs. Also worth noting: Ash can generate resources from an existing DB — ash_postgres/documentation/tutorials/set-up-with-existing-database.md at main · ash-project/ash_postgres · GitHub. Even if you don’t need Ash, it might generate structured data for you (or for an LLM).

5 Likes

Thanks for the answers.

I went with a mixed solution: I got the LLM to generate a script that creates the schema for a single table.
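In case it helps anyone later, the core of such a script can be sketched like this. Everything here is an assumption on my part (`MyApp`, the small type map, the naive singularization); in real use, the `{column_name, data_type}` pairs would come from querying `information_schema.columns` rather than being hard-coded:

```elixir
defmodule SchemaGen do
  # Very small Postgres -> Ecto type map; extend it for your own tables.
  @type_map %{
    "integer" => ":integer",
    "bigint" => ":integer",
    "character varying" => ":string",
    "text" => ":string",
    "boolean" => ":boolean",
    "numeric" => ":decimal",
    "timestamp without time zone" => ":naive_datetime"
  }

  # `columns` is a list of {column_name, postgres_data_type} tuples,
  # e.g. the rows returned by:
  #   SELECT column_name, data_type
  #   FROM information_schema.columns
  #   WHERE table_name = $1;
  def generate(table, columns) do
    fields =
      Enum.map(columns, fn {name, pg_type} ->
        "    field :#{name}, #{Map.get(@type_map, pg_type, ":string")}"
      end)

    # Naive singularization: "users" -> "User". Good enough for a sketch.
    module = table |> String.trim_trailing("s") |> Macro.camelize()

    """
    defmodule MyApp.#{module} do
      use Ecto.Schema

      schema "#{table}" do
    #{Enum.join(fields, "\n")}
      end
    end
    """
  end
end

IO.puts(SchemaGen.generate("users", [{"email", "character varying"}, {"age", "integer"}]))
```

The generated source can then be written to a file and tidied up by hand (or by the LLM) for types the map doesn't cover.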

1 Like

You might also try with dynamic introspection if you ever need it: Endo — Endo v0.1.24

Not sure the library is very actively maintained anymore, but if it is feature-complete then that does not matter.

There is also this: GitHub - cogini/ecto_extract_migrations: Elixir library to generate Ecto migrations from a PostgreSQL schema SQL file. Uses NimbleParsec and macro-style code generation.

Never tried it, and it has not been updated in a long time, but if you have the time, you might give it a shot.

Neither of these two addresses your need directly, but they may be useful if you run into a slightly different scenario in the future.

1 Like