Generalization & Specialization and Changesets

I am having a hard time finding the best approach to my problem.

I have a table called accounts, which is pretty much the generalization table. The values of many of its fields actually depend on the specialized record: type, for example, is an ENUM whose allowed values depend on the specialization, and the same goes for status.

defmodule CreateAccounts do
  use Ecto.Migration

  def change do
    create table(:accounts) do
      add(:username, :string, null: false)
      add(:encrypted_password, :string, null: false)

      add(:uid, :string, null: false, size: 36)
      add(:type, :string, null: false, size: 25) # it will depend on the specialization
      add(:status, :string, null: false, size: 25) # it will depend on the specialization

      add(:data, :map)

      timestamps()
    end

    create(index(:accounts, [:username], unique: true))
    create(index(:accounts, [:uid], unique: true))
  end
end

So basically now I will have a bunch of specialized schemas like Customer, Admin, Company or whatever I want, especially since data is just a :map, so I can put whatever I want inside as JSON and don't have to deal with extra table columns.

So far so good, but I am confused about how I should manage my workflows and changesets. I want to be able to call Customer.register_customer(attrs) and have it validate the data according to the Customer specialization, but I don't know whether I should replicate the schema in my Customer module and keep only the common functions around the generalization in the Account module. Should I use embedded_schema or a regular schema in my specializations?

I am struggling because if I duplicate the schema definition then I will have to declare all the fields again and so on, but if I don't, then I guess I need to transform a Customer changeset into an Account changeset at some point?

Sorry for the confusion; it's hard to explain what I don't understand. I would love to hear your feedback on how you handle generalization & specialization and your workflows + changesets.

You can have multiple schemas in multiple contexts all dealing with the same table but using different fields. With your map field you can use embedded, context-specific schemas and cast them in each of your schemas; these can also have nested embeds if needed.

So nothing is stopping you from using a database table and a map field in whatever way you want (in different ways for different parts of the app), if I understood your issue correctly.
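
For example, something roughly like this could work (the context, module and field names are just made up for illustration, and most of the accounts columns are left out):

defmodule MyApp.Customers.Customer do
  use Ecto.Schema
  import Ecto.Changeset

  # Maps the shared "accounts" table, but only the fields this context cares about.
  schema "accounts" do
    field :username, :string
    field :type, :string
    field :status, :string

    # The :data map column is cast into a customer-specific embedded schema.
    embeds_one :data, CustomerData, on_replace: :delete do
      field :company_name, :string
      field :vat_number, :string
    end

    timestamps()
  end

  def changeset(customer, attrs) do
    customer
    |> cast(attrs, [:username, :type, :status])
    |> validate_required([:username, :type, :status])
    |> cast_embed(:data, with: &customer_data_changeset/2, required: true)
  end

  defp customer_data_changeset(data, attrs) do
    data
    |> cast(attrs, [:company_name, :vat_number])
    |> validate_required([:company_name])
  end
end

An Admin schema in another context could map the same accounts table with its own subset of fields and its own embedded :data schema, and each context validates only what it needs.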

Just start small: create a couple of contexts with a couple of schemas and nested embedded schemas, do some validation / casting, and see what you get in the database; it will be clearer then.

Got it. Now, should I put some validations in the generalization module, or should I duplicate code instead of trying to reuse as much as I can? (I know how it will work today, but I am afraid of it becoming a mess tomorrow.)

To be honest, I have never had any general module, but I think I'd move the general stuff (like a changeset function, and all kinds of special validations that more than one context needs) into a helper module that anyone can use; no duplication needed, since everything is just functions and sets of functions (modules).
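
For instance, something along these lines (module and function names are made up, adapt them to your own fields):

defmodule MyApp.Accounts.CommonValidations do
  import Ecto.Changeset

  # Shared validations that any specialization's changeset can pipe through.
  def validate_username(changeset) do
    changeset
    |> validate_required([:username])
    |> validate_length(:username, min: 3, max: 50)
    |> unique_constraint(:username)
  end

  def validate_status(changeset, allowed_statuses) do
    changeset
    |> validate_required([:status])
    |> validate_inclusion(:status, allowed_statuses)
  end
end

# In a specialization's changeset (with `alias MyApp.Accounts.CommonValidations`):
#
#   customer
#   |> cast(attrs, [:username, :status])
#   |> CommonValidations.validate_username()
#   |> CommonValidations.validate_status(["active", "pending"])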

Update: I think I get what you mean by a generalization module. We had a case where we needed content elements, each having an id, an integer order, etc., plus a map field with element-specific stuff. I implemented it with a __using__ macro (kind of like GenServer), so you can keep the common schema part in the behaviour module and let the using modules define the specifics. Not sure if that is exactly your case, but you might want to consider the possibility.

Along those lines …

Here’s some simplified code, maybe it’ll help.

Behavior module:

defmodule App.Contents.Element do
  alias Ecto.Changeset

  @doc """
  Mandatory implementation of the embed data changeset.
  """
  @callback data_changeset(struct(), map()) :: Changeset.t()

  @doc """
  Optional implementation of the whole embed changeset, this is the wrapper
  changeset that uses above data changeset but also handles files.

  Only override if some special tweaking is required on this level.
  """
  @callback embed_changeset(struct(), map()) :: Changeset.t()

  # other callbacks

  @doc false
  defmacro __using__(opts) do
    quote [location: :keep] do
      use Ecto.Schema

      import Ecto.Changeset
      
      @behaviour App.Contents.Element
      @opts unquote(opts)

      unless @opts[:data] do
        raise("Please define the fields as data options key")
      end

      @doc """
      Default implementation of the embed changeset.
      """
      @impl true
      @spec embed_changeset(struct(), map()) :: Ecto.Changeset.t()
      def embed_changeset(struct, params) do
        struct
        |> data_changeset(params)
        |> cast_uuids(params)
        |> handle_uuids()
        |> handle_files(params)
      end

      # other default callback implementations

      defoverridable App.Contents.Element

      schema "contents" do
        field :order, :integer
        field :type, :string

        # other globally needed fields

        embeds_one :data, Data, [on_replace: :delete] do
          # no @opts available since technically it's a declaration of another
          # (child) module
          el_opts = unquote(opts)
          for {key, type} <- el_opts[:data] do
            field(key, type)
          end

          # further things like files etc
        end

        timestamps()
      end

      # private (not overridable) helper functions such as cast_uuids/2,
      # handle_uuids/1 and handle_files/2 go here
    end
  end
end

An example module that uses it:

defmodule AppWeb.Elements.Headline do
  @moduledoc """
  A headline content element.
  """

  alias AppWeb.ElementView

  use App.Contents.Element, [
    data: [
      {:level, :integer},
      {:text, :string}
    ]
  ]

  @impl true
  def data_changeset(struct, params) do
    struct
    |> cast(params, [:level, :text])
    |> validate_required([:level, :text])
    |> validate_inclusion(:level, 1..6)
  end
  
  # other callbacks go here
end

So basically each “using” module is expanded into a whole schema / embedded schema with the duplicated code, but you only have to write it once and let the macro duplicate it for you :slight_smile:
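
To make the expansion concrete, the Headline module above ends up roughly equivalent to this once the macro has run (a sketch, leaving out the injected default callbacks and private helpers):

defmodule AppWeb.Elements.Headline do
  use Ecto.Schema
  import Ecto.Changeset

  @behaviour App.Contents.Element

  schema "contents" do
    field :order, :integer
    field :type, :string

    embeds_one :data, Data, on_replace: :delete do
      field :level, :integer # generated from the :data option
      field :text, :string   # generated from the :data option
    end

    timestamps()
  end

  @impl true
  def data_changeset(struct, params) do
    struct
    |> cast(params, [:level, :text])
    |> validate_required([:level, :text])
    |> validate_inclusion(:level, 1..6)
  end

  # ...plus the default embed_changeset/2 and the other injected functions
end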

I like that. I will use the metaprogramming for the schema definition only, but I like this approach. Thank you very much.
