Hi, does anyone know how to do lazy loading with multiple data sources using Absinthe Dataloader?
object :user do
  # always known
  field :id, non_null(:id), description: "User ID"
  # resolved using the API
  field :username, non_null(:string), description: "Username of the user"
  field :avatar, :string, description: "Avatar of the user"
  # resolved using the DB
  field :last_seen, :datetime, description: "Last seen date of the user"
end
I have the object above, where certain fields have to be loaded from an external API.
My idea was to use Dataloader to fetch the user from the API once and reuse the result across the query.
So username and avatar need field-level resolvers (both should use Dataloader to fetch the user by ID and return the appropriate field). Sometimes the relevant data is already present on the parent, in which case no API call is needed.
But as far as I remember, calling Dataloader.run directly does not batch requests. How can I use Dataloader with batching, so that user_id = 1 is loaded only once during the whole query?
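For the source side, what I have in mind is roughly a Dataloader.KV source: as I understand it, the callback receives the batch key plus a MapSet of every ID queued during one resolution pass, so a single bulk request can cover them all. This is just a sketch; MyApp.ExternalApi.fetch_users_bulk/1 is a placeholder for the real bulk endpoint:

```elixir
defmodule MyApp.UserApiSource do
  # Sketch of a Dataloader.KV source backed by the external API.
  # The callback must return a map of id => result for every queued id.
  def new do
    Dataloader.KV.new(fn :user, ids ->
      ids
      |> MapSet.to_list()
      |> MyApp.ExternalApi.fetch_users_bulk()      # hypothetical bulk call
      |> Map.new(fn user -> {user.id, user} end)   # index results by id
    end)
  end
end
```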
My current code:
def resolve_username(%{username: username}, _args, _resolution)
    when is_binary(username) do
  {:ok, username}
end

def resolve_username(parent, _args, resolution) do
  loader = resolution.context.loader
  user_id = parent.id
  # what to do here to load the user from the API with batching?
end
# on_load/2 comes from Absinthe.Resolution.Helpers
import Absinthe.Resolution.Helpers, only: [on_load: 2]

def load_user_by_id(loader, id) do
  loader
  |> Dataloader.load(:user, :user, id)
  |> on_load(fn loader ->
    case Dataloader.get(loader, :user, :user, id) do
      %UsersBulkResponse.User{} = user ->
        {:ok, user}

      %UsersBulkResponse.Error{error: err} ->
        AbsintheErrors.error(err)
    end
  end)
end
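For completeness, my schema wiring looks roughly like this; as I understand it, on_load only batches when the Absinthe.Middleware.Dataloader plugin is installed on the schema (MyApp.UserApiSource is my assumed source module from above):

```elixir
# In the schema module: build the loader per request and enable the
# Dataloader middleware so on_load callbacks are batched per pass.
def context(ctx) do
  loader =
    Dataloader.new()
    |> Dataloader.add_source(:user, MyApp.UserApiSource.new())

  Map.put(ctx, :loader, loader)
end

def plugins do
  [Absinthe.Middleware.Dataloader] ++ Absinthe.Plugin.defaults()
end
```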