Hello, we’ve been investigating the performance of one of our queries - basically it’s used to list products and categories, like you’d see on an e-commerce site. On our dev environment it takes roughly 10s, and with tracing I can see that the first ~2 seconds are spent fetching the main body of data (DB queries etc.). After that, Absinthe.Phase.Document.Execution.Resolution takes around 6-7s and Absinthe.Phase.Document.Result takes around 500ms.
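For reference, here’s a minimal sketch of one way to capture similar timings with Absinthe’s built-in telemetry events (the handler id is ours; note these events report per-operation and per-field durations rather than the per-phase breakdown our tracing shows):

```elixir
# Attach one handler to Absinthe's operation- and field-level stop events.
:telemetry.attach_many(
  "absinthe-timings",
  [
    [:absinthe, :execute, :operation, :stop],
    [:absinthe, :resolve, :field, :stop]
  ],
  fn event, %{duration: duration}, _metadata, _config ->
    # Durations arrive in native time units; convert for readability.
    ms = System.convert_time_unit(duration, :native, :millisecond)
    IO.inspect({event, ms}, label: "absinthe timing")
  end,
  nil
)
```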
We’re a bit surprised that the bulk of the request time is spent resolving the fields. To check that our resolvers aren’t the problem, we moved all the processing and logic into the context function, so that the fields are just resolved by Map.get/2.
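For illustration, the shape we ended up with is roughly the following (a simplified sketch, not our real schema; module and field names are made up). Fields without an explicit resolver fall back to Absinthe’s default Absinthe.Middleware.MapGet, which just reads the key off the parent map:

```elixir
defmodule MyAppWeb.Schema.ProductTypes do
  use Absinthe.Schema.Notation

  object :product do
    # No resolve/1 here: Absinthe's default middleware fetches each field
    # from the parent map, roughly fn parent, _, _ -> {:ok, Map.get(parent, :id)} end
    field :id, :id
    field :name, :string
    field :description, :string
  end
end
```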
Is it just expected for the resolution phase to take this long? It is a big query, to be fair (if I prettify the response, it takes up 87,056 lines).
I’m not an expert on Absinthe or GraphQL, but in my experience field resolution always takes the most time.
If I put an expensive operation in a field resolver, its cost multiplies when that field is resolved across 100 entities.
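A sketch of what I mean (all names made up): the resolver body below runs once per parent product during the resolution phase, so its cost scales linearly with the number of entities.

```elixir
defmodule MyAppWeb.Schema.PricingTypes do
  use Absinthe.Schema.Notation

  object :product do
    # If compute/1 takes ~50ms, resolving 100 products adds ~5s here.
    field :pricing, :pricing do
      resolve fn product, _args, _resolution ->
        {:ok, MyApp.Pricing.compute(product)} # hypothetical expensive call
      end
    end
  end
end
```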
Hey there @volroom, to sanity-check a few things here: you’re saying that all of the actual IO to load the data has already been done?
Some additional context that would be helpful:

- How big is the query itself? Can you share it?
- How big is the result JSON (in bytes)?
- Are you using the Persistent Term backend? It helps a lot with memory efficiency, which can help here (see the sketch below).
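If you aren’t using it yet, enabling it is a small change on the schema module. A minimal sketch (the module name and placeholder field are ours):

```elixir
defmodule MyAppWeb.Schema do
  use Absinthe.Schema

  # Store the compiled schema in :persistent_term instead of the default
  # backend (available in Absinthe 1.5+, requires OTP >= 21.2).
  @schema_provider Absinthe.Schema.PersistentTerm

  query do
    # ...your real fields; a placeholder is shown so the sketch compiles
    field :health, :string do
      resolve fn _args, _resolution -> {:ok, "ok"} end
    end
  end
end
```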
Hi @benwilson512
Yes, we moved all IO & data wrangling right to the beginning.
Here’s the query for context:
```graphql
query Categories($id: UUID, $slug: String, $fulfillmentDatetime: UTCTimestamp, $fulfillmentType: FulfillmentType) {
  store(id: $id, slug: $slug) {
    categories(
      fulfillmentDatetime: $fulfillmentDatetime
      fulfillmentType: $fulfillmentType
    ) {
      id
      name
      description
      products {
        id
        allergens
        calorieData {
          caloriesPerServing
        }
        defaultVariantId
        description
        images {
          original
          standard
          thumb
        }
        inStock
        limit
        dietaryRequirements
        modifierGroups {
          id
          name
          maximum
          minimum
          modifiers {
            id
            allergens
            image {
              original
              standard
              thumb
            }
            inStock
            name
            price {
              base
              discounted
              reduction
            }
            restrictions {
              alcoholic
            }
          }
        }
        name
        options {
          name
          values
        }
        pricing {
          lowestVariant {
            base
            discounted
            reduction
          }
          maximum {
            base
            discounted
            reduction
          }
          minimum {
            base
            discounted
            reduction
          }
        }
        promotion {
          amount
          discountType
        }
        quickAddAllowed
        seoDescription
        slug
        restrictions {
          alcoholic
        }
        variants {
          allergens
          id
          inStock
          name
          limit
          pricing {
            absolute {
              base
              discounted
              reduction
            }
            relative {
              base
              discounted
              reduction
            }
          }
          restrictions {
            alcoholic
          }
          options {
            name
            value
          }
        }
      }
    }
  }
}
```
That query returns an output of 3,071.81 KB.
We aren’t using the Persistent Term backend; we’ll give that a go.
Worth pointing out that the times I quoted in my OP are on a very CPU-constrained test environment, not production.
The Persistent Term backend doesn’t seem to have had a significant impact.
3 megs is a TON of data to ship via GraphQL. Unlike just shipping regular JSON, every return value is type-checked against the schema, errors are tracked at the node level, and so on. While there are hypothetical optimizations I’ve wanted to make in the resolution phase, they’d involve a level of refactoring I just haven’t had time for to date.
Overall, I’d find a way to paginate products, if that’s the big field.
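A minimal sketch of the kind of thing I mean, assuming simple limit/offset args over the already-loaded product list (the field, arg, and module names here are illustrative, not from your schema):

```elixir
defmodule MyAppWeb.Schema.CategoryTypes do
  use Absinthe.Schema.Notation

  object :category do
    field :id, :id
    field :name, :string

    # Hypothetical limit/offset pagination over the preloaded product list;
    # only the requested page goes through resolution and result building.
    field :products, list_of(:product) do
      arg :limit, :integer, default_value: 20
      arg :offset, :integer, default_value: 0

      resolve fn category, %{limit: limit, offset: offset}, _resolution ->
        {:ok, category.products |> Enum.drop(offset) |> Enum.take(limit)}
      end
    end
  end
end
```

Cursor-based (Relay-style) connections via absinthe_relay are another option if you need stable paging.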