Calculate the query complexity based on how nested a field is (depth limitation) #96
BTW, I forked this repo since I thought I might find some kind of clue there and open a PR, but I would also love to hear what you know about this.

So my first question: now I know that I have access to something called ... But at runtime we have access to it; this object is the ...
I had some progress: now I know I can calculate a field's nesting level using ... But given a query like this:

```graphql
{
  getPosts {
    id
    author {
      posts {
        id
      }
    }
  }
}
```

I cannot differentiate the first occurrence of a field from a deeper one with the same name (e.g. the two `id` selections above).
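For reference, here is one way to tell identically named selections apart: compute the depth per AST node instead of per field name. This is only a sketch of mine using graphql-js's `visit` (not code from the comment above); because every selection in the document is a distinct `FieldNode`, the two `id` fields above each get their own depth.

```ts
import { parse, visit, Kind } from 'graphql';
import type { ASTNode, FieldNode } from 'graphql';

// Map every field node of a query to its nesting depth (top-level fields are 0).
export function fieldDepths(query: string): Map<FieldNode, number> {
  const depths = new Map<FieldNode, number>();
  visit(parse(query), {
    Field(node, _key, _parent, _path, ancestors) {
      // Ancestors contain both AST nodes and the arrays that hold them;
      // the nesting depth is simply the number of enclosing Field nodes.
      const depth = ancestors.filter(
        (ancestor): ancestor is FieldNode =>
          !Array.isArray(ancestor) && (ancestor as ASTNode).kind === Kind.FIELD,
      ).length;
      depths.set(node, depth);
    },
  });
  return depths;
}
```

On the query above this would map `getPosts` to 0, `author` to 1, the nested `posts` to 2, and the two `id` nodes to 1 and 3 respectively.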
Had some progress: https://stackoverflow.com/q/79297272/8784518
So I managed to do it. Here is how I did it: I have a ... Of course I am not using DataLoader, so I have the N+1 issue. But if you're interested, check out my code here.

Off-topic: any comment on these?
BTW, I was digging into this topic of depth limitation and I read here that this lib checks the complexity of a query by assigning a number to each field. When I looked at my own code I realized that most likely I do not even need that lib, since I am already calculating the depth of a field and incorporating it into the field's complexity score. So maybe we can add this to the lib, so that people know how they can limit the complexity of a query by its depth + a static score. My current implementation looks like this:

```ts
@Field(() => UserDto, {
  complexity({ childComplexity, node }: ComplexityEstimatorArgs) {
    const depth = fieldDepth(node);
    const complexity = 1 * depth + childComplexity;
    return complexity;
  },
})
author: UserDto;
```

And here is the code for `fieldDepth`:

```ts
import { FieldNode } from 'graphql';
import { fieldDepthQueryNormalizer } from './field-depth-query-normalizer.util';

/**
 * @description Calculates the depth of a field.
 */
export function fieldDepth(node: Readonly<FieldNode>) {
  if (!node.loc) {
    throw 'EmptyNodeLocation';
  }
  if (!node.name.loc) {
    throw 'EmptyNodeNameLocation';
  }
  const normalizedSourceBody = fieldDepthQueryNormalizer(
    node.loc.source.body,
  );
  // Count the opening braces that precede this field in the normalized query.
  // Subtracting 2 removes the extra entry that split() produces plus the
  // operation's outermost brace, so top-level fields end up at depth 0.
  return (
    normalizedSourceBody.slice(0, node.name.loc.start).split('{')
      .length - 2
  );
}
```

And finally the `fieldDepthQueryNormalizer`:

```ts
import {
  DefinitionNode,
  DocumentNode,
  Kind,
  parse,
  print,
} from 'graphql';

/**
 * @description Strips fragment definitions from the query, keeping only the
 * operation definitions.
 */
export function fieldDepthQueryNormalizer(
  query: Readonly<string>,
): string {
  const ast = parse(query);
  const definitionNodes: DefinitionNode[] = [];
  const definitions = (ast as DocumentNode).definitions;
  for (const definition of definitions) {
    if (definition.kind !== Kind.OPERATION_DEFINITION) {
      continue;
    }
    definitionNodes.push(definition);
  }
  return definitionNodes
    .map((definitionNode) => print(definitionNode))
    .join(' ');
}
```
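For anyone who wants to poke at this, here is a tiny harness (my own sketch, not part of the code above; the `./field-depth.util` import path is made up) that runs `fieldDepth` over every field of the query from earlier in the thread:

```ts
import { parse, visit } from 'graphql';
import { fieldDepth } from './field-depth.util'; // hypothetical path to the helper above

const query = `{
  getPosts {
    id
    author {
      posts {
        id
      }
    }
  }
}`;

// Walk the parsed query and log each field name with its computed depth.
// Assuming the source formatting survives the parse/print round-trip in the
// normalizer, this should log: getPosts 0, id 1, author 1, posts 2, id 3.
visit(parse(query), {
  Field(node) {
    console.log(node.name.value, fieldDepth(node));
  },
});
```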
So I was trying to see how others calculate their query complexity and ended up on the Hygraph website; as you can see in their docs, they say they calculate the query complexity like this:

I tried to find out how I can do the same with this lib, but it seems it is not possible with the current API. I also want to point out that I feel it is not completely impossible, but I have no clue how to go about this part: "multiply their complexity times the level of nesting in the query."

Any comment?

Side note: in NestJS we can see the same recommendation about scalar fields, i.e. to add one point to the overall complexity per scalar field (ref).
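For what it's worth, here is a rough sketch of how that Hygraph-style rule ("multiply their complexity times the level of nesting") could be wired up globally instead of per `@Field`. I am assuming the estimator API of graphql-query-complexity here (which is where `ComplexityEstimatorArgs` comes from), and the `fieldDepth` import path is hypothetical; treat it as an illustration, not a drop-in:

```ts
import { GraphQLSchema, parse } from 'graphql';
import { getComplexity, ComplexityEstimator } from 'graphql-query-complexity';
import { fieldDepth } from './field-depth.util'; // hypothetical path to the helper above

// Score every field as (base score) x (1-based nesting level), plus whatever
// its children already cost. A top-level scalar field thus costs exactly 1 point.
const depthEstimator: ComplexityEstimator = ({ node, childComplexity }) => {
  const base = 1;
  const level = fieldDepth(node) + 1; // treat top-level fields as level 1
  return base * level + childComplexity;
};

export function queryComplexity(schema: GraphQLSchema, query: string): number {
  return getComplexity({
    schema,
    query: parse(query),
    variables: {},
    estimators: [depthEstimator],
  });
}
```

With a base score of 1 this also matches the NestJS side note above: each top-level scalar field adds one point to the overall complexity, and deeper fields cost proportionally more.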