Abstract model / Polymorphism?

Having some trouble wrapping my head around how I could use polymorphic associations like this.

Say I have an abstract Posts schema.
Then say I have all kinds of variations of posts, like VideoPosts or ImagePosts, etc.
I want to use the Posts schema as the common abstract record and then create the respective record with its more specific values.

That way I can run a simple query on all Posts whenever I need the common values, regardless of their specific type, without writing complex UNION queries.

The common examples use comments on posts. The issue I’m running into is that this is backwards for my type of association, and I’m not sure how to flip it.

Ideally I’d be able to create a record for any of the Complex schemas and, in doing so, create an Abstract record that holds the “common fields” for that Complex schema, with a relation between the two records. That would let me query across all the Complex schemas (i.e. just look up the Abstract table) without building a large, difficult-to-manage UNION query to gather the common fields from the different Complex schemas.

Issues I’d like to avoid:

  • Having to create a large manifest of belongs_to/has_many associations for N columns across however many Complex schemas my system ends up with.
  • Having to create and manage a large, complex UNION query over all of the Complex schemas to get an aggregation across them (i.e. looking up counts for all Complex schemas, etc.).
  • Coupling logic of the abstract layer to the complex layer. Say one Complex schema has a wildly different need than another; I don’t want that concern to bleed into the abstract layer.
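Roughly, the shape I’m imagining is something like this (module and field names here are just placeholders, not from my actual app):

```elixir
# Placeholder sketch: one abstract schema holding the common fields, and each
# concrete schema pointing back at its abstract row via belongs_to.
defmodule MyApp.Post do
  use Ecto.Schema

  schema "posts" do
    field :title, :string         # a common field
    field :concrete_type, :string # which Complex schema owns this row
  end
end

defmodule MyApp.VideoPost do
  use Ecto.Schema

  schema "video_posts" do
    field :video_url, :string     # a Complex-schema-specific field
    belongs_to :post, MyApp.Post  # relation back to the Abstract record
  end
end

# Repo.all(MyApp.Post) would then cover every post type without a UNION.
```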

I also created an example repo that I’m gonna use to explore this issue.



Although I don’t fully understand, I think what you want to do is table inheritance.
The easiest way to go is Single Table Inheritance, where you need a posts table that has all the columns VideoPost and ImagePost will have. Imitating your example:

The migration will look like:

create table(:posts) do
  add :concrete_table_type, :string # , null: false <- if Post is not used directly
  add :common_field1, :string
  add :common_field2, :string
  add :foo, :string  # cannot be null: false
  add :bar, :string  # cannot be null: false
  add :baz, :string  # cannot be null: false
end

The schemas will be:

defmodule Post do
  use Ecto.Schema

  schema "posts" do
    field :concrete_table_type
    field :common_field1
    field :common_field2
  end

  # abstract logic here
  def abstract_logic(%Post{} = post), do: ...

  defmodule ConcretePostBehaviour do
    @callback table_type() :: String.t()
  end
end

defmodule VideoPost do
  use Ecto.Schema
  import Ecto.Changeset

  @behaviour Post.ConcretePostBehaviour

  schema "posts" do       # Use the same table as above
    field :concrete_table_type
    field :common_field1
    field :common_field2
    field :foo
  end

  @impl Post.ConcretePostBehaviour
  def table_type, do: "VideoPost"

  def changeset(post, attrs) do
    post
    |> cast(attrs, [:common_field1, :common_field2, :foo])
    |> put_change(:concrete_table_type, table_type())
    # NOT NULL on foo cannot be defined in STI, so validate it here
    |> validate_required([:foo, :concrete_table_type])
  end
end

With this you can

def list_posts, do: Post |> Repo.all()  # Gets all the abstract data

def list_video_posts do
  VideoPost                                    # assumes import Ecto.Query
  |> where(concrete_table_type: "VideoPost")
  |> Repo.all()
end

Because these are not objects with inheritance, you cannot call Post.abstract_logic/1 on a %VideoPost{}, but you can always create a Protocol for the concrete types, or create a macro inside the abstract module and require it.
Well, I mean you can do video_post |> Post.abstract_logic() if you drop the guard on that function, but it is always safer to define a protocol for that :slight_smile:
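A minimal sketch of that protocol approach, self-contained with plain structs standing in for the Ecto schemas (names here are illustrative):

```elixir
# Sketch: dispatch shared "abstract" logic over concrete post types with a
# protocol instead of a guarded function on Post.
defprotocol PostLogic do
  @doc "Common logic every concrete post type must support."
  def summary(post)
end

defmodule VideoPost do
  defstruct [:common_field1, :foo]
end

defmodule ImagePost do
  defstruct [:common_field1, :bar]
end

defimpl PostLogic, for: VideoPost do
  def summary(post), do: "video: #{post.common_field1}"
end

defimpl PostLogic, for: ImagePost do
  def summary(post), do: "image: #{post.common_field1}"
end

# PostLogic.summary(%VideoPost{common_field1: "intro"}) returns "video: intro"
```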


Side-stepping the Elixir-specifics for a bit since they are not really important here:

In any relational database there are basically two ways you can ‘emulate’ polymorphism (plus one non-relational escape hatch):

  1. Have one table with all fields of the different variants of structures. Advantage: Everything is in one place. Disadvantage: Many fields will be empty in each structure.
  2. Have one table with the common fields and for each special structure a separate table with the structure-specific fields that references the former. Advantage: Clear separation of structure data. Disadvantage: Many tables; extra fetching logic is needed.
  3. not really a relational solution: Pretending the database is not relational, and dumping all non-relational data in one column (as e.g. JSON or another format), which makes querying nigh impossible. (Although some databases like Postgres have some non-relational features that make this somewhat feasible in some cases).
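As a sketch of option 2, assuming Ecto migrations and made-up table names, the common table plus one referencing table per variant could look like:

```elixir
# Hypothetical migration: common columns in `posts`, variant-specific columns
# in `video_posts`, which references its `posts` row.
defmodule MyApp.Repo.Migrations.CreatePostTables do
  use Ecto.Migration

  def change do
    create table(:posts) do
      add :common_field1, :string
      add :common_field2, :string
    end

    create table(:video_posts) do
      add :post_id, references(:posts, on_delete: :delete_all), null: false
      add :foo, :string, null: false  # variant-specific; NOT NULL is enforceable here
    end

    # one variant row per abstract row
    create unique_index(:video_posts, [:post_id])
  end
end
```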

I think there’s one more:
Have all the common fields, plus only their own fields, in each table.

Pros:

  • This leads to no NULL columns (unless you accept them)
  • Fetching each logical table can be done easily

Cons:

  • Common fields live across many tables, so unique constraints must be enforced at the application level.
  • The polymorphism must be done mostly at the application level.
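Assuming Ecto migrations again (table and column names made up), this variant just repeats the common columns in each concrete table:

```elixir
# Hypothetical sketch: each concrete table carries its own copy of the common
# columns, so no table ends up with NULL-only columns.
create table(:video_posts) do
  add :common_field1, :string
  add :common_field2, :string
  add :foo, :string, null: false
end

create table(:image_posts) do
  add :common_field1, :string
  add :common_field2, :string
  add :bar, :string, null: false
end

# Aggregating across the types then requires a UNION or an application-level merge.
```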

You can use a behaviour module that defines the common stuff (fields, changeset functions, etc.) and let the using modules define the specifics in a flexible embed; we have discussed it here: Generalization & Specialization and Changesets


Mapping Objects to Relational Databases: O/R Mapping In Detail (2002)
2. Mapping Inheritance Structures

i.e. polymorphic entities

2.6 Comparing The Strategies


Thank you all this was very helpful. :smile: