I am looking for a good way to enhance the strictness of the unique constraint in my changeset. Basically I have a customer schema in which I want to check the name to make sure no duplicates are created. With a unique constraint this works if the name matches exactly, but not if there is an extra space, the name is misspelled, an underscore is used, etc.
Something like what you would use when searching for a customer by name.
Maybe you can “hash” the data that needs to be unique and then execute an exists? on the hash.
The hash can remove all spaces, underscores, etc.
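A minimal sketch of that idea, assuming a hypothetical MyApp.Repo and a Customer schema with a name_hash column (all names here are illustrative). The “hash” is just the name downcased with spaces and underscores stripped:

```elixir
defmodule MyApp.Customers do
  import Ecto.Query
  alias MyApp.{Repo, Customer}

  # Downcase and strip whitespace/underscores to get a comparable value.
  def name_hash(name) do
    name
    |> String.downcase()
    |> String.replace(~r/[\s_]+/u, "")
  end

  # Check whether another customer already has the same hashed name.
  # Note the gap between this query and the later insert — that race is
  # exactly the caveat raised in the reply below.
  def name_taken?(name) do
    hash = name_hash(name)
    Repo.exists?(from c in Customer, where: c.name_hash == ^hash)
  end
end
```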
It sounds like an MDM (master data management) need. I don’t know of any library that can help you with that, but I wouldn’t use the BEAM VM to perform that kind of behaviour for a large amount of data.
Doing it Elixir-side will put you right back into the “unsafe” part of unsafe_validate_unique.
The usual approach would be to normalize the value to a separate column, then enforce uniqueness on that.
For instance, if you wanted to enforce case-and-punctuation-insensitive uniqueness on a field called company_name, you’d create a unique index on a normalized_company_name column containing company_name downcased with punctuation characters removed.
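A sketch of the migration side of that, assuming a customers table and these (illustrative) column and index names:

```elixir
defmodule MyApp.Repo.Migrations.AddNormalizedCompanyName do
  use Ecto.Migration

  def change do
    alter table(:customers) do
      add :normalized_company_name, :string
    end

    # The unique index lives on the normalized column, not the display column.
    create unique_index(:customers, [:normalized_company_name],
             name: :customers_normalized_company_name_index
           )
  end
end
```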
I believe (but have not tried) that correctly passing the name of that unique index to unique_constraint will work, ensuring that a constraint violation on normalized_company_name adds an error to company_name.
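A sketch of what the changeset side might look like under the same assumptions: populate normalized_company_name from company_name, then point unique_constraint at the index created above while attaching the error to :company_name.

```elixir
defmodule MyApp.Customer do
  use Ecto.Schema
  import Ecto.Changeset

  schema "customers" do
    field :company_name, :string
    field :normalized_company_name, :string
    timestamps()
  end

  def changeset(customer, attrs) do
    customer
    |> cast(attrs, [:company_name])
    |> validate_required([:company_name])
    |> put_normalized_company_name()
    # Attach the error to :company_name even though the index is on the
    # normalized column, by passing the index name explicitly.
    |> unique_constraint(:company_name,
      name: :customers_normalized_company_name_index
    )
  end

  defp put_normalized_company_name(changeset) do
    case get_change(changeset, :company_name) do
      nil ->
        changeset

      name ->
        # Downcase and strip anything that isn't a letter or digit; adjust to taste.
        normalized = String.replace(String.downcase(name), ~r/[^a-z0-9]+/u, "")
        put_change(changeset, :normalized_company_name, normalized)
    end
  end
end
```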