Today I ran into something that sounded really simple, but I haven't found an efficient solution yet. Yesterday I added a bunch of new array columns to one table in the DB; unfortunately, the old rows don't have values for these fields, so they are all null. I use Absinthe as a small wrapper around this data, so the FE gets all of the new array fields as null. They are upset with me, so I said I would change these nulls to [].
I found some solutions, but none of them feel efficient:

1. Update the return value of every resolver that uses this object (there are tons of these)
2. Create a resolver for each new list_of field (roughly what the sketch after the object below shows)
3. Migrate the null data in the DB to [] (my last choice)
object :a do
  field :b, list_of(:string)
  field :c, list_of(:string)
  field :d, list_of(:string)
  field :e, list_of(:string)
end
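For context, option 2 written by hand would mean repeating a small resolver on every new field, roughly like this (just a sketch; the field name and the [] fallback are placeholders for my case):

field :b, list_of(:string) do
  resolve(fn parent, _args, _resolution ->
    # fall back to an empty list while old rows are still null
    {:ok, parent.b || []}
  end)
end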
Well, the fundamental question is whether null is a valid value for that column or not. If it is a valid value, then the front end needs to deal with it. If it isn't a valid value, then you should set a database-level default and add a non-null constraint.
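A minimal sketch of that as an Ecto migration could look like the following (the table name :items and the migration module are made up here; the fragment sets a Postgres '{}' array default, and the backfill runs before the not-null constraint is added):

defmodule MyApp.Repo.Migrations.DefaultEmptyArrays do
  use Ecto.Migration

  def up do
    # backfill old rows first so the constraint can be added safely
    execute("UPDATE items SET b = '{}' WHERE b IS NULL")

    alter table(:items) do
      modify :b, {:array, :string}, default: fragment("'{}'"), null: false
    end
  end

  def down do
    alter table(:items) do
      modify :b, {:array, :string}, default: nil, null: true
    end
  end
end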
Thank you, really interesting article; I came up with lots of use cases for this approach. Back to my problem: migrating data in a production DB is always a critical operation, and I definitely want to avoid the risk, so I put it as the last option. Moreover, the required default value for old records may change, which would mean another migration.
Normally, I think the FE is the best fit for this case. They can change without downtime and with less risk.
At the end of the day, I went with the second solution, using a macro.
defmacro field!(name, type, default_value: default_value) do
  quote do
    field unquote(name), unquote(type) do
      resolve(fn
        # substitute the given default when the column is still null
        %{unquote(name) => nil}, _, _ -> {:ok, unquote(default_value)}
        %{unquote(name) => v}, _, _ -> {:ok, v}
      end)
    end
  end
end
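With that macro in scope, the object above can use it for the new fields, something like this (a sketch assuming the macro lives in a helper module, here called MyAppWeb.Schema.Helpers, imported into the schema module):

import MyAppWeb.Schema.Helpers

object :a do
  field! :b, list_of(:string), default_value: []
  field! :c, list_of(:string), default_value: []
  field! :d, list_of(:string), default_value: []
  field! :e, list_of(:string), default_value: []
end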