Because of how I’ve designed ibGib, I create an inordinate number of duplicate records. This is unavoidable and by design. However, I accrue a lot of log messages like the following:
postgres | ERROR: duplicate key value violates unique constraint "ibgibs_ib_gib_index"
postgres | DETAIL: Key (ib, gib)=(query, 5ED058551DCFEE680FE114B0425129B469477A2842979C19D0E93B97E36436A1) already exists.
postgres | STATEMENT: INSERT INTO "ibgibs" ("data","gib","ib","rel8ns","inserted_at","updated_at") VALUES ($1,$2,$3,$4,$5,$6) RETURNING "id"
Here is the changeset validation code, in case it helps:
def changeset(content, params \\ :empty) do
  content
  |> cast(params, @required_fields, @optional_fields)
  |> validate_required([:ib, :gib, :rel8ns])
  |> validate_length(:ib, min: @min_id_length, max: @max_id_length)
  |> validate_length(:gib, min: @min_id_length, max: @max_id_length)
  |> validate_change(:rel8ns, &ValidateHelper.do_validate_change(&1, &2))
  |> validate_change(:data, &ValidateHelper.do_validate_change(&1, &2))
  |> unique_constraint(:ib, name: :ibgibs_ib_gib_index)
end
From my understanding, the changeset cannot check the unique constraint without actually hitting the db and failing. That’s fine, but the log messages are piling up! Is there a way to configure ecto and/or postgrex to filter out this particular message only? I value the utility of the logging overall and I use it constantly, so I don’t want to disable logging completely. It’s just that this particular message doesn’t provide any value for my use case.
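For reference, my insert path looks roughly like this (the module and function names here are illustrative, not my exact code), which shows where the `unique_constraint/3` error surfaces:

```elixir
# Hypothetical sketch: IbGib.Data.Repo and insert_ibgib/1 are illustrative names.
def insert_ibgib(params) do
  %IbGib.Data.Schemas.IbGibModel{}
  |> changeset(params)
  |> IbGib.Data.Repo.insert()
  |> case do
    {:ok, model} ->
      {:ok, model}

    # Thanks to unique_constraint/3, the duplicate key becomes a changeset
    # error here instead of raising, and I can treat it as a no-op. But by
    # this point Postgres has already emitted the ERROR line to its own log.
    {:error, changeset} ->
      {:error, changeset}
  end
end
```

So the application side handles the duplicate gracefully; it’s only the Postgres-side log line that I’d like to suppress.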
Thanks!