
Fix issue with recursive input object definitions in SDL #4705

Closed
benjie wants to merge 2 commits into 16.x.x from fix-sdl-recursive-default

Conversation

benjie (Member) commented May 1, 2026

Prior to this, the following would throw "Maximum call stack size exceeded":

```js
import { buildSchema } from './npmDist/index.mjs';
const sdl = `
type Query {
  depth(person: PersonInput): Int
}

input PersonInput {
  parent: PersonInput = {parent: null}
}
`;

buildSchema(sdl);
```

The equivalent in native code would not:

```js
import {
  GraphQLSchema,
  GraphQLInt,
  GraphQLObjectType,
  GraphQLInputObjectType,
  printSchema,
} from './npmDist/index.mjs';

const PersonInput = new GraphQLInputObjectType({
  name: 'PersonInput',
  fields: () => ({
    parent: {
      type: PersonInput,
      defaultValue: { parent: null },
    },
  }),
});

const schema = new GraphQLSchema({
  query: new GraphQLObjectType({
    name: 'Query',
    fields: {
      depth: {
        args: {
          person: {
            type: PersonInput,
          },
        },
        type: GraphQLInt,
      },
    },
  }),
});

console.log(printSchema(schema));
```

This means that if you printSchema a valid schema and try to rebuild it from SDL, the rebuild might fail (it will for the above schema). The reason is somewhat related to https://rfcs.graphql.org/rfcs/793/: in code we treat the defaultValue as already coerced, but in SDL we perform the coercion of the default value ourselves, which requires us to know the definition of the types, which requires us to know the default values, which requires us to know the types, which [...infinite recursion...]
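The cycle can be reduced to a few lines of plain JavaScript. Here is a toy registry and coercion function (names like `buildFields` and `coerceDefault` are illustrative only, not graphql-js internals) showing how coercing a self-referential default value recurses without a base case:

```js
// Toy sketch (not graphql-js internals): coercing a default needs the
// type's fields, and building the fields needs the coerced defaults of
// those same fields.
function buildFields(typeName, registry) {
  const def = registry[typeName];
  const fields = {};
  for (const [name, field] of Object.entries(def)) {
    fields[name] = {
      type: field.type,
      // Coercing the default requires the target type's fields...
      defaultValue:
        field.default === undefined
          ? undefined
          : coerceDefault(field.default, field.type, registry),
    };
  }
  return fields;
}

function coerceDefault(value, typeName, registry) {
  if (value === null) return null;
  // ...which calls back into buildFields: for a self-referential input
  // type this never terminates.
  const fields = buildFields(typeName, registry);
  const out = {};
  for (const [name, field] of Object.entries(fields)) {
    out[name] = name in value ? value[name] : field.defaultValue;
  }
  return out;
}

// input PersonInput { parent: PersonInput = {parent: null} }
const registry = {
  PersonInput: { parent: { type: 'PersonInput', default: { parent: null } } },
};

let error;
try {
  buildFields('PersonInput', registry);
} catch (e) {
  error = e;
}
// A V8 stack overflow ("Maximum call stack size exceeded") is a RangeError.
console.log(error instanceof RangeError); // true
```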

v17 solves this properly but required a breaking change. To address the issue in v16 I've made it so that the fields() thunk will not allow recursion (it returns the currently-being-resolved value instead); everything except the defaultValue is written to the being-resolved object first, and finally an "apply default value" pass adds the default values in.
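As a rough sketch of that approach (assumed names and shapes, not the actual patch): a re-entry guard in the fields thunk hands back the partially-built field map, and default values are attached only in a second pass once every field exists:

```js
// Sketch of the two-pass idea (illustrative, not graphql-js source).
function defineInputType(fieldsThunk) {
  const type = {};
  type.getFields = () => {
    if (type._fields) return type._fields;         // fully built
    if (type._inProgress) return type._inProgress; // re-entry guard
    type._inProgress = {};
    const pendingDefaults = [];
    // Pass 1: write everything except defaultValue.
    for (const [name, cfg] of Object.entries(fieldsThunk())) {
      type._inProgress[name] = { name, type: cfg.type };
      if ('default' in cfg) pendingDefaults.push([name, cfg]);
    }
    // Pass 2: coerce and attach default values; re-entrant getFields()
    // calls now see the partially-built fields instead of recursing.
    for (const [name, cfg] of pendingDefaults) {
      type._inProgress[name].defaultValue = coerce(cfg.default, cfg.type);
    }
    type._fields = type._inProgress;
    return type._fields;
  };
  return type;
}

// Toy input-object coercion: fill omitted fields from their defaults.
function coerce(value, inputType) {
  if (value === null || typeof inputType.getFields !== 'function') return value;
  const out = {};
  for (const [name, field] of Object.entries(inputType.getFields())) {
    if (name in value) out[name] = coerce(value[name], field.type);
    else if (field.defaultValue !== undefined) out[name] = field.defaultValue;
  }
  return out;
}

// Self-referential: input PersonInput { parent: PersonInput = {parent: null} }
const PersonInput = defineInputType(() => ({
  parent: { type: PersonInput, default: { parent: null } },
}));

const fields = PersonInput.getFields(); // no "Maximum call stack size exceeded"
console.log(fields.parent.defaultValue); // { parent: null }
```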

I could not come up with SDL that constructs in the current v16 (i.e. doesn't throw the max call stack error) for which this would cause a divergence in behavior, so I think it's safe, but I could do with another pair of eyes.


benjie marked this pull request as draft May 1, 2026 10:36
benjie (Member, Author) commented May 1, 2026

Hmmm... no, it's not safe, because it can lead to unexpected outputs:

```js
import { buildSchema, printSchema } from './npmDist/index.mjs';

const sdl = `
type Query {
  field(in: A): Int
}

input A {
  b: B = {}
  label: String = "A"
}

input B {
  a: A = {b: null}
  tail: String = "B"
}
`;

const schema = buildSchema(sdl);
console.log(printSchema(schema));
```

Produces:

```graphql
type Query {
  field(in: A): Int
}

input A {
  b: B = {a: {b: null}, tail: "B"}
  label: String = "A"
}

input B {
  a: A = {b: null}
  tail: String = "B"
}
```

I.e. the default value for `B.a` does not include the expected `label: "A"`
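For reference, a toy spec-style coercion (illustrative only, not graphql-js) shows where that `label: "A"` should come from: coercing an input object fills each omitted field from that field's own default:

```js
// Toy default tables mirroring the SDL above.
const types = {
  A: { b: { type: 'B', default: {} }, label: { type: 'String', default: 'A' } },
  B: { a: { type: 'A', default: { b: null } }, tail: { type: 'String', default: 'B' } },
};

// Fill omitted fields from their defaults, recursing into input objects.
function coerce(value, typeName) {
  if (value === null || typeof value !== 'object') return value;
  const out = {};
  for (const [name, field] of Object.entries(types[typeName] ?? {})) {
    if (name in value) out[name] = coerce(value[name], field.type);
    else if (field.default !== undefined) out[name] = coerce(field.default, field.type);
  }
  return out;
}

// B.a's literal default {b: null} coerced against A should gain label: "A":
console.log(coerce({ b: null }, 'A')); // { b: null, label: 'A' }
// And A.b's literal default {} coerced against B picks up both defaults:
console.log(coerce({}, 'B')); // { a: { b: null, label: 'A' }, tail: 'B' }
```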

Worse, when you start adding non-nulls, things get even weirder:

```js
import { buildSchema, printSchema } from './npmDist/index.mjs';
const sdl = `
type Query {
  field(in: A): Int
}

input A {
  b: B = {}
  label: String! = "A"
}

input B {
  a: A! = {b: null}
  tail: String = "B"
}
`;

const schema = buildSchema(sdl);
console.log(printSchema(schema));
```

Produces:

```graphql
type Query {
  field(in: A): Int
}

input A {
  b: B
  label: String! = "A"
}

input B {
  a: A!
  tail: String = "B"
}
```

The defaultValues have gone missing! This is due to the `field.defaultValue !== undefined` check in `valueFromAST`: the default value for this field simply hasn't been resolved yet at that point.
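A stripped-down illustration of the failure mode (assumed field shape, not the graphql-js source): while a default is still pending resolution, an `!== undefined` check cannot distinguish it from "no default at all":

```js
// The sentinel check used to decide whether a field has a default.
function hasDefault(field) {
  return field.defaultValue !== undefined;
}

// During the intermediate build state the default is not yet attached:
const unresolvedField = { name: 'a', type: 'A!' }; // defaultValue still pending
console.log(hasDefault(unresolvedField)); // false — the default "goes missing"

// After resolution the very same check works as intended:
const resolvedField = { name: 'a', type: 'A!', defaultValue: { b: null } };
console.log(hasDefault(resolvedField)); // true
```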

benjie (Member, Author) commented May 1, 2026

I think we're better off leaving the behavior as-is and waiting for v17 to fix it; the current schema definitions will remain invalid.

(It might be worth turning the "Maximum call stack size exceeded" error into a proactive error, but that's a separate concern.)



Development

Successfully merging this pull request may close these issues.

RangeError for combination of recursive input and default value
