Optional but non-nullable fields in GraphQL - node.js

In an update to our GraphQL API, only the model's _id field is required, hence the ! in the SDL below. Other fields, such as name, don't have to be included in an update, but they also cannot have a null value. Currently, excluding the ! from the name field lets the end user omit the name in an update, but it also lets them pass in a null value for the name, which cannot be allowed.
A null value lets us know that a field needs to be removed from the database.
Below is an example of a model where this would cause a problem - the Name custom scalar doesn't allow null values but GraphQL still allows them through:
type language {
  _id: ObjectId
  iso: Language_ISO
  auto_translate: Boolean
  name: Name
  updated_at: Date_time
  created_at: Date_time
}
input language_create {
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name!
}
input language_update {
  _id: ObjectId!
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name
}
When a null value is passed in, it bypasses our scalars, so we cannot throw a user input validation error if null isn't an allowed value.
I am aware that ! means non-nullable and that the lack of the ! means the field is nullable; however, it is frustrating that, as far as I can see, we cannot restrict the allowed values of a field that is not required / optional. This issue only occurs on updates.
Are there any ways to work around this through custom scalars, without having to start hardcoding logic into each update resolver, which seems cumbersome?
EXAMPLE MUTATION THAT SHOULD FAIL
mutation tests_language_create($input: language_update!) {
  language_update(input: $input) {
    name
  }
}
Variables
{
  "input": {
    "_id": "1234",
    "name": null
  }
}
UPDATE 9/11/18: for reference, I can't find a way around this as there are issues with using custom scalars, custom directives and validation rules. I've opened an issue on GitHub here: https://github.com/apollographql/apollo-server/issues/1942

What you're effectively looking for is custom validation logic. You can add any validation rules you want on top of the "default" set that is normally included when you build a schema. Here's a rough example of how to add a rule that checks for null values on specific types or scalars when they are used as arguments:
const { ApolloServer } = require('apollo-server')
const { specifiedRules } = require('graphql/validation')
const { GraphQLError } = require('graphql/error')

const typesToValidate = ['Foo', 'Bar']

// This returns a "Visitor" whose properties get called for
// each node in the document that matches the property's name
function CustomInputFieldsNonNull(context) {
  return {
    Argument(node) {
      const argDef = context.getArgument()
      if (!argDef || !argDef.astNode) return
      // Unwrap a non-null wrapper so we can read the underlying named type
      const typeNode = argDef.astNode.type.kind === 'NonNullType'
        ? argDef.astNode.type.type
        : argDef.astNode.type
      const typeName = typeNode.kind === 'NamedType' ? typeNode.name.value : null
      if (typeName && typesToValidate.includes(typeName) && node.value.kind === 'NullValue') {
        context.reportError(
          new GraphQLError(`Type ${typeName} cannot be null`, node),
        )
      }
    },
  }
}

// We're going to override the validation rules, so we want to grab
// the existing set of rules and just add on to it
const validationRules = specifiedRules.concat(CustomInputFieldsNonNull)

const server = new ApolloServer({
  typeDefs,
  resolvers,
  validationRules,
})
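For instance, assuming the schema defines a field that takes a Foo-typed argument directly (a hypothetical something(foo: Foo) field), an operation that inlines the null literal would now be rejected before any resolver runs:
query {
  # a NullValue literal on a Foo-typed argument is reported by CustomInputFieldsNonNull
  something(foo: null)
}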
EDIT: The above only works if you're not using variables, which isn't going to be very helpful in most cases. As a workaround, I was able to utilize a FIELD_DEFINITION directive to achieve the desired behavior. There's probably a number of ways you could approach this, but here's a basic example:
const { SchemaDirectiveVisitor } = require('graphql-tools')
const { defaultFieldResolver } = require('graphql')
const _ = require('lodash')

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const { args: { paths } } = this
    field.resolve = async function (...resolverArgs) {
      // resolverArgs = [parent, args, context, info]; we only need the args
      const fieldArgs = resolverArgs[1]
      for (const path of paths) {
        if (_.get(fieldArgs, path) === null) {
          throw new Error(`${path} cannot be null`)
        }
      }
      return resolve.apply(this, resolverArgs)
    }
  }
}
Then in your schema:
directive @nonNullInput(paths: [String!]!) on FIELD_DEFINITION

input FooInput {
  foo: String
  bar: String
}

type Query {
  foo(input: FooInput!): String @nonNullInput(paths: ["input.foo"])
}
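With that directive applied, a request that nulls out input.foo through variables is now rejected by the wrapped resolver instead of slipping past the scalar. For example, using the Query.foo field above:
query NullFoo($input: FooInput!) {
  foo(input: $input)
}
with variables { "input": { "foo": null, "bar": "ok" } }, the resolver throws "input.foo cannot be null".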
Assuming that the "non null" input fields are the same each time the input is used in the schema, you could map each input's name to an array of field names that should be validated. So you could do something like this as well:
const nonNullFieldMap = {
  FooInput: ['foo'],
}

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const visitedTypeArgs = this.visitedType.args
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      visitedTypeArgs.forEach(arg => {
        // Strip the non-null marker so the lookup matches the map keys
        const argType = arg.type.toString().replace("!", "")
        const nonNullFields = nonNullFieldMap[argType] || []
        nonNullFields.forEach(nonNullField => {
          const path = `${arg.name}.${nonNullField}`
          if (_.get(fieldArgs, path) === null) {
            throw new Error(`${path} cannot be null`)
          }
        })
      })
      return resolve.apply(this, resolverArgs)
    }
  }
}
And then in your schema:
directive @nonNullInput on FIELD_DEFINITION

type Query {
  foo(input: FooInput!): String @nonNullInput
}
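In either variant the directive class still has to be registered when the schema is built. A minimal sketch, assuming Apollo Server 2 with graphql-tools-style schema directives and the typeDefs/resolvers from above:
import { ApolloServer } from 'apollo-server';

// typeDefs, resolvers and NonNullInputDirective are the definitions shown above
const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    // the key must match the directive name used in the SDL (@nonNullInput)
    nonNullInput: NonNullInputDirective,
  },
});

server.listen().then(({ url }) => console.log(`Server ready at ${url}`));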

Related

Can a type definition have a default value in Gatsby?

Reading the docs on Customizing the GraphQL Schema, I'm trying to work something out. If I have this frontmatter:
---
title: Sample Post
date: 2019-04-01
fooId:
---
is it possible to set a default value for fooId? If I leave it empty in the markdown file I get:
Cannot query field "fooId" on type "MdxFrontmatter".
If you don't expect "fooId" to exist on the type "MdxFrontmatter" it is most likely a typo. However, if you expect "fooId" to exist there are a couple of solutions to common problems:
If you added a new data source and/or changed something inside gatsby-node/gatsby-config, please try a restart of your development
server.
You want to optionally use your field "fooId" and right now it is not used anywhere.
It is recommended to explicitly type your GraphQL schema if you want
to use optional fields.
Attempt
exports.createSchemaCustomization = ({ actions, schema }) => {
  const { createTypes } = actions
  const typeDefs = [
    'type MarkdownRemark implements Node { frontmatter: Frontmatter }',
    schema.buildObjectType({
      name: 'Frontmatter',
      fields: {
        tags: {
          type: '[String!]',
          resolve(source) {
            const { fooId } = source
            if (fooId === null) return 'foo'
            return fooId
          },
        },
      },
    }),
  ]
  createTypes(typeDefs)
}
When I implement the above code I still get the same error in the terminal. Is there a way in gatsby-node.js to give fooId a default?
Try it like this:
exports.createSchemaCustomization = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}
This is not a "default" value per se, as you mention, but with type definitions you can customize what the node is expected to contain when it is fetched. By default, (mostly) all inferred fields are treated as non-nullable (in the case above, String!). With the type definition above, you are declaring fooId as nullable, i.e. not required, by omitting the exclamation mark (!), which is what marks non-nullability, so fooId is allowed to be empty.
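If what is wanted is an actual fallback value rather than just a nullable field, one option is to pair the type with a field resolver, along the lines of the buildObjectType attempt in the question. A rough sketch (the 'default-foo-id' placeholder and the assumption that the inferred MdxFrontmatter type can be extended this way are mine, not part of the original answer):
exports.createSchemaCustomization = ({ actions, schema }) => {
  const { createTypes } = actions
  createTypes([
    schema.buildObjectType({
      name: 'MdxFrontmatter',
      fields: {
        fooId: {
          type: 'String',
          // fall back to a placeholder whenever the frontmatter omits fooId
          resolve: (source) => source.fooId || 'default-foo-id',
        },
      },
    }),
  ])
}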
Just wanted to point out that if you use exports.sourceNodes in Gatsby 4.19.2:
exports.sourceNodes = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}
you'll get a deprecation warning for the approach that was originally posted; to prevent this, use createSchemaCustomization instead:
exports.createSchemaCustomization = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}

How to create multilevel nested queries in nestjs/graphql using @ResolveField?

Hello. I can't figure out how to create multiple levels of nested queries with @ResolveField. I hope for your help. 🙏
What I'm doing and Context:
I have a product. The product has a provider. A provider-specific product contains product variants. Variants contain options.
I need to make a request in 4 levels:
Product
ProductHasProvider
Product Variants
Variant Options
I use the "code first" approach and created an ObjectType for each entity. Next, I create a "Product" resolver.
Creating the second level, "ProductHasProvider", with @ResolveField:
When adding a @ResolveField ("Providers"), it appears inside the main "Product" resolver and resolves the Providers ObjectType. Okay, it works; I can make requests at the 2nd level correctly.
@ResolveField('Providers', () => [ProductHasProvider])
async getProductProviders(@Parent() product: Product) {
  const { id } = product;
  return await this.productListService.ProductsProviders({ id });
}
I want to make a third level, where ProductHasProvider has Variants. I decorate ProductHasProvider as the parent.
@ResolveField('variants', () => [Variant])
async getVariants(@Parent() productHasProvider: ProductHasProvider) {
  const { id } = productHasProvider;
  return await this.productListService.getVariants({ id });
}
In this case, this @ResolveField defines the ObjectType for [Variant], but for some reason at the first level. That is, in Apollo Studio the field is displayed on "Product", and I can't query variants on ProductHasProvider.
query Products {
  getProducts {
    Providers {
      id
    }
    variants {
      id
      options {
        id
      }
    }
  }
}
Expected behavior:
I add a new @ResolveField(() => [Variant]) with "ProductHasProvider" as the parent (which is already a @ResolveField for Product), and I can make 3rd- and 4th-level queries.
query Products {
  getProducts {
    id
    Providers {
      id
      variants {
        id
        options {
          id
        }
      }
    }
  }
}
Please tell me what I'm doing wrong and how to achieve what I want. Thank you.🙏
@ResolveField is to be put in a Resolver, to specify how to return a specific field for a specific entity.
In your case, you have a Resolver for Products, in which you specify a @ResolveField for the field Providers.
I'm guessing that you are adding another @ResolveField in the same Resolver, and it will specify how to return another field of Products.
What you want is to create another Resolver, for Providers, in which you specify how to return the field variants.
Here is how it works for me:
@Resolver('Product')
export class ProductsResolver {
  @ResolveField('Providers', () => [ProductHasProvider])
  async getProductProviders(@Parent() product: Product) {
    const { id } = product;
    return await this.productListService.ProductsProviders({ id });
  }
}

@Resolver('Provider')
export class ProvidersResolver {
  @ResolveField('variants', () => [Variant])
  async getVariants(@Parent() provider: ProductHasProvider) {
    const { id } = provider;
    return await this.variantsService.getVariantForProvider({ id });
  }
}
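Extending the same pattern one level further gives the fourth level (Variant -> options). The Option type and the OptionsService below are hypothetical placeholders for however options are actually loaded:
import { Resolver, ResolveField, Parent } from '@nestjs/graphql';

@Resolver('Variant')
export class VariantsResolver {
  // OptionsService is a hypothetical provider; inject whatever actually loads options
  constructor(private readonly optionsService: OptionsService) {}

  // resolves the "options" field whenever a Variant appears anywhere in the graph
  @ResolveField('options', () => [Option])
  async getOptions(@Parent() variant: Variant) {
    const { id } = variant;
    return await this.optionsService.getOptionsForVariant({ id });
  }
}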

Wrong data from client passes GraphQL validation

I've made a simple CRUD app with React and Apollo Client on a NestJS server with a GraphQL API.
I have these simple mutations:
schema.gql:
type Mutation {
  # CreateUserInput type: see user.input.ts below
  createUser(input: CreateUserInput!): User!
  updateUser(id: ID!, input: UpdateUserInput!): User!
  deleteUser(id: ID!): User!
}
user.input.ts:
import { InputType, Field } from "@nestjs/graphql";
import { EmailScalar } from "../email.scalar-type";

@InputType()
export class CreateUserInput {
  // EmailScalar is a custom scalar GraphQL type that I took from the internet and it worked well
  @Field(() => EmailScalar)
  readonly email: string;

  @Field()
  readonly name: string;
}
The "EmailScalar" type basically checks that the "email" input has a *@*.* format.
When I make a createUser query against the GraphQL API directly, an invalid email cannot pass validation (because the Email type works fine). But when the query is sent from the client, it passes validation and shows up in the NestJS server log (from the code below):
users.resolver.ts:
@Mutation(() => User)
async createUser(@Args('input') input: CreateUserInput) { // type from user.input.ts
  Logger.log(input); // if execution gets here, the input passed validation
  return this.usersService.create(input); // usersService makes requests to MongoDB
}
And it gets into MongoDB.
Here is the client-side part:
App.tsx:
...
// The CreateUserInput class is not imported into App.tsx (it lives on the server side), but that seems to be fine
const ADD_USER = gql`
  mutation AddMutation($input: CreateUserInput!) {
    createUser(input: $input) {
      id
      name
      email
    }
  }
`
function App(props: any) {
  const { loading, error, data } = useQuery(GET_USERS);
  const [addUser] = useMutation(ADD_USER, {
    update: (cache: any, { data: { createUser } }: any) => {
      const { users } = cache.readQuery({ query: GET_USERS });
      cache.writeQuery({
        query: GET_USERS,
        data: {
          users: [createUser, ...users],
        },
      })
    }
  });
  ...
  if (loading) return <p>Loading...</p>;
  if (error) return <p>Error :(</p>;
  return <UserTable users={data.users} addUser={addUser} updateUser={updateUser} deleteUser={deleteUser} />;
}
Can someone please explain to me how the client query passes validation and what I have done wrong? Even two empty strings can pass through.
I've never worked with NestJS, Apollo, React or GraphQL before, so I'm kind of lost.
For full code:
https://github.com/N238635/nest-react-crud-test
This is how your custom scalar's methods are defined:
parseValue(value: string): string {
  return value;
}
serialize(value: string): string {
  return value;
}
parseLiteral(ast: ValueNode): string {
  if (ast.kind !== Kind.STRING) {
    throw new GraphQLError('Query error: Can only parse strings got a: ' + ast.kind, [ast]);
  }
  // Regex taken from: http://stackoverflow.com/a/46181/761555
  var re = /^([\w-]+(?:\.[\w-]+)*)@((?:[\w-]+\.)*\w[\w-]{0,66})\.([a-z]{2,6}(?:\.[a-z]{2})?)$/i;
  if (!re.test(ast.value)) {
    throw new GraphQLError('Query error: Not a valid Email', [ast]);
  }
  return ast.value;
}
parseLiteral is called when parsing literal values inside the query (i.e. literal strings wrapped in double quotes). parseValue is called when parsing variable values. When your client sends the query, it sends the value as a variable, not as a literal value. So parseValue is used instead of parseLiteral. But your parseValue does not do any kind of validation -- you just return the value as-is. You need to implement the validation logic in both methods.
It would also be a good idea to implement the serialize method so that your scalar can be used for both input and response validation.
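A minimal sketch of what that could look like, sharing one validation routine across all three methods (the scalar and constant names here are illustrative, not the poster's exact code):
import { GraphQLScalarType, GraphQLError, Kind, ValueNode } from 'graphql';

// same regex as in the original parseLiteral
const EMAIL_RE = /^([\w-]+(?:\.[\w-]+)*)@((?:[\w-]+\.)*\w[\w-]{0,66})\.([a-z]{2,6}(?:\.[a-z]{2})?)$/i;

function validateEmail(value: unknown): string {
  if (typeof value !== 'string' || !EMAIL_RE.test(value)) {
    throw new GraphQLError(`Query error: Not a valid Email: ${String(value)}`);
  }
  return value;
}

export const EmailScalar = new GraphQLScalarType({
  name: 'Email',
  description: 'Email address with shared validation',
  serialize: validateEmail,   // outgoing values in responses
  parseValue: validateEmail,  // incoming values supplied through variables
  parseLiteral(ast: ValueNode) {
    if (ast.kind !== Kind.STRING) {
      throw new GraphQLError(`Query error: Can only parse strings, got: ${ast.kind}`);
    }
    return validateEmail(ast.value); // incoming values written inline in the query document
  },
});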

`parseValue` is not called for the input parameter of a customised scalar type

I define a schema like this:
const query = new GraphQLObjectType({
  name: 'Query',
  fields: {
    quote: {
      type: queryType,
      args: {
        id: { type: QueryID },
      },
    },
  },
});

const schema = new GraphQLSchema({
  query,
});
The QueryID is a customised scalar type.
const QueryID = new GraphQLScalarType({
  name: 'QueryID',
  description: 'query id field',
  serialize(dt) {
    // value sent to the client
    return dt;
  },
  parseLiteral(ast) {
    if (ast.kind === 'IntValue') {
      return Number(ast.value);
    }
    return null;
  },
  parseValue(v) {
    // value from the client
    return v;
  },
});
Client query:
query {
  quote(id: 1)
}
I found that the parseValue method is not called when clients send a query to my server. I can see that parseLiteral is called correctly.
In most of the documentation I can find, gql is used to define the schema and `scalar QueryID` has to be added to the schema definition. But in my case I am using a GraphQLSchema object for the schema. Is this the root cause? If so, what is the best way to make it work? I don't want to switch to the gql format because I need to construct my schema at runtime.
serialize is only called when sending the scalar back to the client in the response. The value it receives as a parameter is the value returned in the resolver (or if the resolver returned a Promise, the value the Promise resolved to).
parseLiteral is only called when parsing a literal value in a query. Literal values include strings ("foo"), numbers (42), booleans (true) and null. The value the method receives as a parameter is the AST representation of this literal value.
parseValue is only called when parsing a variable value in a query. In this case, the method receives as a parameter the relevant JSON value from the variables object submitted along with the query.
So, assuming a schema like this:
type Query {
  someField(someArg: CustomScalar): String
  someOtherField: CustomScalar
}
serialize:
query {
  someOtherField
}
parseLiteral:
query {
  someField(someArg: "something")
}
parseValue:
query ($myVariable: CustomScalar) {
  someField(someArg: $myVariable)
}
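Applied to the schema in the question: quote(id: 1) passes the id as an inline literal, so only parseLiteral runs. To see parseValue fire, the client has to send the argument through a variable, for example:
query GetQuote($id: QueryID) {
  quote(id: $id)
}
with variables { "id": 1 }.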

@IsPhoneNumber() npm class-validator: how to add multiple country codes

In a NestJS DTO I want to validate a user's mobile number against multiple countries' formats. How can I do this?
@IsPhoneNumber('IN', {
  message: (args: ValidationArguments) => {
    if (args.value.length !== 10) {
      throw new BadRequestException(`${args.value} Wrong Phone Number`);
    } else {
      throw new InternalServerErrorException();
    }
  },
})
Different countries have different phone number lengths, and my suggestion is to keep a list of country codes instead of a custom regex. It's easier to maintain and more readable. So the solution is:
parse the phone number
if it's valid, check the country code
if it's valid, pass it on to the next built-in decorator
So I've created my own decorator with libphonenumber-js.
Usage in DTO:
export class PhoneDto {
  @ToPhone
  @IsString({ message: 'must be a valid number' })
  readonly phone!: string;
}
Implementation:
import { Transform } from 'class-transformer';
import { parsePhoneNumberFromString } from 'libphonenumber-js';

const validCountries = ['US', 'UK'];

export const ToPhone = Transform(
  (value: any) => {
    if (typeof value !== 'string') return undefined;
    const parsed = parsePhoneNumberFromString(value);
    if (!parsed) return undefined;
    if (!validCountries.includes(parsed.country)) return undefined;
    return parsed.number;
  },
  { toClassOnly: true },
);
And yes, this solution adds one more library and could be slower because of the parsing (in practice it depends on your country list), but as I said before, it's more readable and maintainable.
Passing a restricted set of regions to @IsPhoneNumber(region: string) is currently not supported.
Your only option is to pass "ZZ" as the region, which forces users to enter numbers with the international prefix (see the docs).
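A minimal sketch of that second approach (the DTO name is illustrative); with "ZZ" as the region, class-validator only accepts numbers written in international format, e.g. +14155552671:
import { IsPhoneNumber, IsString } from 'class-validator';

export class InternationalPhoneDto {
  // "ZZ" fixes no region, so the number must carry its own country prefix
  @IsPhoneNumber('ZZ', { message: 'phone must be a valid international number' })
  @IsString()
  readonly phone!: string;
}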
