hi, i have a lot of tables in my db, is there any tool that automatically generates a schema based on the table columns (with column types, etc)?
something like: i just need to define the credentials and database, and the tool will generate schemas for all tables
@jihantoro It’s not so simple. For full generation you would need to write it yourself, with your own custom mapping to contexts and module names:
defmodule MyApp.SchemaGenerator.Rules do
  @context_a_tables ["table_a1", "table_a2", "table_a3", …]
  …
  def context_for(table) when table in @context_a_tables, do: MyApp.AContext
  …
  def schema_module_part("table_a1"), do: :TableA1
  …
end
and even that’s not all. You could later add some virtual fields (if any are needed) … Also don’t forget about custom types, migrations and much more …
Creating a full generator that works for every project is not possible. You could write a library, but people would change the generated results anyway, so I don’t think it would make much sense.
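To illustrate the kind of custom mapping involved, here is a minimal sketch of a database-type-to-Ecto-type mapping. The module name and the set of mappings are assumptions (written for Postgres); any real project would need to extend and adjust this:

```elixir
# Hypothetical sketch: map Postgres `information_schema` data_type strings
# to Ecto schema types. Incomplete by design; extend per project.
defmodule MyApp.SchemaGenerator.TypeMap do
  def ecto_type("character varying"), do: :string
  def ecto_type("text"), do: :string
  def ecto_type("integer"), do: :integer
  def ecto_type("bigint"), do: :integer
  def ecto_type("boolean"), do: :boolean
  # This is where a naive generator goes wrong: numeric must be :decimal, not :float
  def ecto_type("numeric"), do: :decimal
  def ecto_type("date"), do: :date
  def ecto_type("time without time zone"), do: :time
  def ecto_type("timestamp without time zone"), do: :naive_datetime
  def ecto_type("timestamp with time zone"), do: :utc_datetime
  def ecto_type("jsonb"), do: :map
  # Arrays need a separate lookup for the element type; :string is a placeholder
  def ecto_type("ARRAY"), do: {:array, :string}
  # Fail loudly rather than silently generating a wrong field
  def ecto_type(other), do: raise("unmapped database type: #{other}")
end
```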
A quick Google shows me this tool: https://github.com/alexandrubagu/ecto_generator/blob/master/README.md
I’ve not used it, so I’m not sure of its effectiveness, but it may be a good start; you could then edit the generated files to fill in the holes.
Firstly, it uses templates; I think a simple macro would be much better here, as the template code is harder to read. Secondly, it supports a really low number of types: I can see only `DateTime`, but no `Date` or `Time`. There is no `array` support, nor `json`/`jsonb` support. Thirdly, it generates a field with the `float` type for the `decimal` database type. As said, such a tool of course does not support lots of other things which @jihantoro would need to change anyway. There could be a good `scenic`-style tool which asks what to do in special cases, but this generator is too limited, and there is no simple solution for all edge cases. Of course we could say that’s enough at the start, but if you have a really big number of different tables, then ensuring that every `decimal` has a proper declaration is not so easy either, while the project could still compile, which causes lots of confusion. For example, think about how to properly support an enum inside a `json`/`jsonb` column.
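As a rough sketch of that last case, one way to handle an enum inside a `jsonb` column is a custom `Ecto.Type` over the whole map that validates the embedded enum value. The module name, the `"status"` key, and the allowed values are all assumptions for illustration:

```elixir
# Hypothetical custom type: a jsonb map that must carry a valid
# "status" enum value. cast/load/dump all reject invalid values,
# which a plain :map field would silently accept.
defmodule MyApp.Types.SettingsWithStatus do
  use Ecto.Type

  @statuses ~w(active inactive archived)

  # The underlying database column is jsonb, represented as :map
  def type, do: :map

  def cast(%{"status" => s} = map) when s in @statuses, do: {:ok, map}
  def cast(_), do: :error

  def load(%{"status" => s} = map) when s in @statuses, do: {:ok, map}
  def load(_), do: :error

  def dump(%{"status" => s} = map) when s in @statuses, do: {:ok, map}
  def dump(_), do: :error
end
```

No generator can know that this validation is wanted, which is exactly why the generated files always need hand editing.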
I was looking for a tool like this too, something that worked much like Django’s `inspectdb`. I ended up writing some one-off scripts and using Phoenix’s `mix phx.gen.schema`.
You can obtain information about your database using something like:
select *
from INFORMATION_SCHEMA.TABLES
where table_schema = 'database';
and
select *
from INFORMATION_SCHEMA.COLUMNS
where TABLE_NAME = 'table_name';
I still ended up doing plenty of manual work, but this did save a lot of time. Since it was a one-off thing, it didn’t bother me too much. I don’t know how much that helps, but it’s another option.
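As a sketch of that kind of one-off script, the rows returned by the `INFORMATION_SCHEMA.COLUMNS` query can be turned into `field` declarations mechanically. Everything here is an assumption for illustration: the `MyApp.Repo` module, the module name, and the (deliberately partial) type mapping:

```elixir
# Hypothetical one-off helper (assumes Postgres and an existing MyApp.Repo):
# prints `field :name, :type` lines for one table's columns.
defmodule MyApp.FieldPrinter do
  def print_fields(table) do
    {:ok, %{rows: rows}} =
      Ecto.Adapters.SQL.query(
        MyApp.Repo,
        "SELECT column_name, data_type FROM information_schema.columns WHERE table_name = $1",
        [table]
      )

    for [name, db_type] <- rows do
      IO.puts("field :#{name}, #{inspect(ecto_type(db_type))}")
    end
  end

  # Very partial mapping; unmapped types are flagged for manual review
  defp ecto_type("integer"), do: :integer
  defp ecto_type("character varying"), do: :string
  defp ecto_type("numeric"), do: :decimal
  defp ecto_type(other), do: {:unmapped, other}
end
```

The output still needs hand editing, but it removes the most repetitive part of writing schemas for many tables.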
For note, I wouldn’t really consider such a tool very useful: schemas should match the query patterns, which may not necessarily match the tables. In my case I do have a schema ‘near’ matching each table, but I also have a lot more schemas for all the return data from the queries, to keep everything typed and clean. (Except in a few cases where I return maps/lists/tuples/whatever for dynamic or pipelining reasons.)