Use Ecto schema directly or convert to a custom struct?

Hello,

I have a question that probably is somewhat opinion-based.

Basically, when you have a schema for some persisted data in your database, do you use it directly in your code or do you convert it to another struct first?

Example

Say, I have a table to store financial data:

schema "candles" do
  field :timestamp, :utc_datetime_usec
  field :open, :decimal
  field :close, :decimal
  field :high, :decimal
  field :low, :decimal
  field :volume, :decimal
end

Its typespec would look like this:

@type t :: %__MODULE__{
        __meta__: Ecto.Schema.Metadata.t(),
        id: integer | nil,
        timestamp: DateTime.t() | nil,
        open: Decimal.t() | nil,
        close: Decimal.t() | nil,
        high: Decimal.t() | nil,
        low: Decimal.t() | nil,
        volume: Decimal.t() | nil
      }

Normally what I do is create a struct like this in another module:

defmodule Candle do
  defstruct [:timestamp, :open, :close, :high, :low, :volume]
end

And whenever I read from the database, I always convert from the schema to the struct I created, and that struct is what is used in the rest of the program.
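For concreteness, a minimal sketch of what that conversion could look like (the `from_schema/1` function and the `CandleSchema` module name are my own naming for the example, not from any library):

```elixir
defmodule Candle do
  defstruct [:timestamp, :open, :close, :high, :low, :volume]

  # Copy only the business fields from the Ecto schema struct,
  # implicitly dropping :id and :__meta__ along the way.
  def from_schema(%CandleSchema{} = schema) do
    fields = Map.take(schema, [:timestamp, :open, :close, :high, :low, :volume])
    struct!(__MODULE__, fields)
  end
end
```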

Do you think this is necessary or just a waste of CPU cycles? Can you see any advantages/disadvantages from each approach?

You’ve described what you’re doing, but not why. What advantage were you aiming for when setting up this pattern?

IMO this feels like boilerplate: you’re literally winding up with the same maps modulo a couple keys.

If you were feeling really fancy you could even do it with key-tweaking:

%CandleSchema{timestamp: ..., etc etc}
|> Map.from_struct()
|> Map.drop([:__meta__])
|> then(&struct!(Candle, &1))

Great question. This is similar to many discussions I’ve had at work across a few teams.

On the surface it seems like extra work for not much gain, especially as Ecto is super flexible, allowing us to define a schema that only uses some of the fields on the table, and allowing virtual fields.

However, I too use this approach, with one adjustment: I make the structs embedded schemas, so all of them are Ecto structs.

Why? So that I have a clear separation between business logic and the database.
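A minimal sketch of that adjustment, reusing the field names from the candle example above (the module name is just for illustration):

```elixir
defmodule Candle do
  use Ecto.Schema

  # An embedded schema is not backed by a table and has no
  # __meta__ field, but still gives you casting and validation
  # via changesets.
  @primary_key false
  embedded_schema do
    field :timestamp, :utc_datetime_usec
    field :open, :decimal
    field :close, :decimal
    field :high, :decimal
    field :low, :decimal
    field :volume, :decimal
  end
end
```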

It took me a while to get used to the approach; it’s not necessary for every application, and it adds some overhead.

But if you do it well, where the data is stored becomes an implementation detail, allowing you to easily pull things out and move them around without touching the domain logic.

I could have domain fields made from several db tables or even from different APIs.

That kind of approach takes influence from DDD and hexagonal architecture, so there could be more pros/cons in that literature.


Sorry, indeed I forgot to list my reason haha

The main advantage I see is that the custom struct is less “noisy”, because it only has the data that the business logic is interested in. Most of the system doesn’t need to see or use the __meta__ or id fields, for example.

Your code snippet is literally what I do, but I remove the id too (and I have other schemas with more associations, etc., which need more work to do the conversion).
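To illustrate what I mean by associations needing more work, here is a rough sketch. The `trades` field and the `Trade`/`TradeSchema` modules are made up for the example, and it assumes the association has already been preloaded:

```elixir
def from_schema(%CandleSchema{} = schema) do
  %Candle{
    timestamp: schema.timestamp,
    open: schema.open,
    close: schema.close,
    high: schema.high,
    low: schema.low,
    volume: schema.volume,
    # Each preloaded associated record gets converted in turn,
    # so the conversion logic grows with the schema graph.
    trades: Enum.map(schema.trades, &Trade.from_schema/1)
  }
end
```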

I also wrote a small lib called ecto_morph that helps with that sort of thing, if you are interested. It handles nested embeds and associations automatically.

I will take a look, thanks a lot!

Hi Adzz,
Could you please give a more detailed example? I read the book “Functional Web Development with Elixir, OTP, and Phoenix” and I’m wondering how to combine plain Elixir structs for business-logic design with Ecto schemas.