GraphQL custom directive without declaring it explicitly in the schema - node.js

I am trying to implement a custom GraphQL directive. My understanding is that if my SchemaDirectiveVisitor subclass implements static getDirectiveDeclaration(directiveName, schema) then I don't have to manually declare the directive in my SDL (Schema Definition Language).
Because AuthDirective implements getDirectiveDeclaration, it’s no longer necessary for the schema author to include the directive @auth ... declaration explicitly in the schema. The returned GraphQLDirective object will be used to enforce the argument types and default values, as well as enabling tools like GraphiQL to discover the directive using schema introspection. Additionally, if the AuthDirective class fails to implement visitObject or visitFieldDefinition, a helpful error will be thrown.
Source: https://blog.apollographql.com/reusable-graphql-schema-directives-131fb3a177d1
and
However, if you’re implementing a reusable SchemaDirectiveVisitor for public consumption, you will probably not be the person writing the SDL syntax, so you may not have control over which directives the schema author decides to declare, and how. That’s why a well-implemented, reusable SchemaDirectiveVisitor should consider overriding the getDirectiveDeclaration method
Source: https://www.apollographql.com/docs/apollo-server/features/creating-directives.html
In my code, despite having implemented static getDirectiveDeclaration(directiveName, schema), I still have to declare the directive in SDL.
Shouldn't it work without manually declaring in SDL?
Full Example Code:
const { ApolloServer, gql, SchemaDirectiveVisitor } = require('apollo-server');
const { DirectiveLocation, GraphQLDirective, defaultFieldResolver } = require("graphql");

class UpperCaseDirective extends SchemaDirectiveVisitor {
  static getDirectiveDeclaration(directiveName, schema) {
    console.log("inside getDirectiveDeclaration", directiveName)
    return new GraphQLDirective({
      name: directiveName,
      locations: [
        DirectiveLocation.FIELD_DEFINITION,
      ],
      args: {}
    });
  }

  visitFieldDefinition(field) {
    console.log("inside visitFieldDefinition")
    const { resolve = defaultFieldResolver } = field;
    field.resolve = async function (...args) {
      const result = await resolve.apply(this, args);
      if (typeof result === 'string') {
        return result.toUpperCase();
      }
      return result;
    };
  }
}
const books = [
  {
    title: 'Harry Potter and the Chamber of Secrets',
    author: 'J.K. Rowling',
  },
  {
    title: 'Jurassic Park',
    author: 'Michael Crichton',
  },
];
const typeDefs = gql`
  #########################################
  # ONLY WORKS WITH THIS LINE UNCOMMENTED #
  #########################################
  directive @upper on FIELD_DEFINITION

  type Book {
    title: String
    author: String @upper
  }

  type Query {
    books: [Book]
  }
`;
const resolvers = {
  Query: {
    books: () => books,
  },
};

const server = new ApolloServer({
  typeDefs,
  resolvers,
  schemaDirectives: {
    upper: UpperCaseDirective
  }
});

server.listen().then(({ url }) => {
  console.log(`🚀 Server ready at ${url}`);
});

I have the same problem and was able to find this comment from graphql-tools issue #957.
From the changelog:
NOTE: graphql 14 includes breaking changes. We're bumping the major version of graphql-tools to accommodate those breaking changes. If you're planning on using graphql 14 with graphql-tools 4.0.0, please make sure you've reviewed the graphql breaking changes list.
This is likely caused by the fact that graphql-js now requires you to define your directives in your schema before you attempt to use them. For example:

directive @upper on FIELD_DEFINITION

type TestObject {
  hello: String @upper
}
You can likely work around this by pre-defining your directives in your schema, but I'd like to confirm this. If this works, we'll need to update the docs.
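Concretely, that workaround is the line the question had to leave uncommented: keep getDirectiveDeclaration for introspection and argument checking, but also pre-declare the directive in the SDL so that graphql 14's validation accepts its use:

directive @upper on FIELD_DEFINITION

type Book {
  title: String
  author: String @upper
}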


Can a type definition have a default value in Gatsby?

Reading the docs on Customizing the GraphQL Schema, I'm trying to figure something out: if I have frontmatter like this:
---
title: Sample Post
date: 2019-04-01
fooId:
---
Is it possible to set a default value for fooId? If I leave it empty in the markdown file I get:
Cannot query field "fooId" on type "MdxFrontmatter".
If you don't expect "fooId" to exist on the type "MdxFrontmatter" it is most likely a typo. However, if you expect "fooId" to exist there are a couple of solutions to common problems:

- If you added a new data source and/or changed something inside gatsby-node/gatsby-config, please try a restart of your development server.
- You want to optionally use your field "fooId" and right now it is not used anywhere.

It is recommended to explicitly type your GraphQL schema if you want to use optional fields.
Attempt
exports.createSchemaCustomization = ({ actions, schema }) => {
  const { createTypes } = actions
  const typeDefs = [
    'type MarkdownRemark implements Node { frontmatter: Frontmatter }',
    schema.buildObjectType({
      name: 'Frontmatter',
      fields: {
        tags: {
          type: '[String!]',
          resolve(source) {
            const { fooId } = source
            if (fooId === null) return 'foo'
            return fooId
          },
        },
      },
    }),
  ]
  createTypes(typeDefs)
}
When I implement the above code I still get the same error in the terminal. Is there a way in gatsby-node.js I can default fooId?
Try it like this:
exports.createSchemaCustomization = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}
Is not a "default" value per se as you mention but using type definitions you are able to customize the expected outcome of the Node when fetched. By default, all (mostly) the values are set as non-nullable (in the case above as String!). Using the previous type definition, you are setting the fooId as a nullable value, meaning that is not required, without the exclamation mark, !, what represents the nullability/non-nullability, allowing the fooId to be empty.
Just wanted to point out that if you use exports.sourceNodes in Gatsby 4.19.2:
exports.sourceNodes = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}
you'll get a deprecation warning, so to prevent this you should use createSchemaCustomization instead:
exports.createSchemaCustomization = ({ actions }) => {
  const { createTypes } = actions
  const typeDefs = `
    type MdxFrontmatter implements Node {
      fooId: String
    }
  `
  createTypes(typeDefs)
}

How can I use the fields in a GraphQL query to perform nested reads with Prisma?

I'm using Prisma to implement a GraphQL interface to expose some data stored in a PostgreSQL database. My code is inspired by the GraphQL Tools (SDL-first) example. This logic is pretty inefficient though and I'd like to improve it.
Here is a minimal piece of code to show the problem and ask for a solution. My real code is of course more complicated.
My GraphQL schema
type Query {
  allUsers: [User!]!
}

type User {
  name: String!
  posts: [Post!]!
}

type Post {
  text: String!
  author: User!
}
My resolver object, in the Node.JS code
const resolvers = {
  Query: {
    allUsers: () => prisma.users.findMany()
  },
  User: {
    posts: (user) => prisma.posts.findMany({ where: { author: user.id } })
  }
};
Problems
This code works but it's inefficient. Imagine you're running the query { allUsers { posts { text } } }:

- My code runs N+1 queries against PostgreSQL to fetch the whole result: one to fetch the list of users, then N more, one for each user. A single query, using a JOIN, should be enough.
- My code selects every column from every table it queries, even though I only need user.id and don't need user.name or anything else.
Question
I know that Prisma supports nested reads (the include and select options), which could fix both problems. However, I don't know how to build that options object from the GraphQL query.
How can I extract from the GraphQL query the list of fields that are requested? And how can I use that list to create the options object to perform an optimal nested read with Prisma?
This package can help you parse the request info: https://www.npmjs.com/package/graphql-parse-resolve-info
Then you need to transform it to a usable parameter that you can use in your ORM.
Here is an example with NestJS:
import { createParamDecorator, ExecutionContext } from '@nestjs/common';
import { GqlExecutionContext } from '@nestjs/graphql';
import { GraphQLResolveInfo } from 'graphql';
import { parseResolveInfo, ResolveTree } from 'graphql-parse-resolve-info';

export type PrismaSelect = {
  select: {
    [key: string]: true | PrismaSelect;
  };
};

export const Relations = createParamDecorator(
  (data: unknown, ctx: ExecutionContext) => {
    const info = GqlExecutionContext.create(ctx).getInfo<GraphQLResolveInfo>();
    const ast = parseResolveInfo(info);
    return astToPrisma(Object.values((ast as ResolveTree).fieldsByTypeName)[0]);
  },
);

export const astToPrisma = (ast: {
  [str: string]: ResolveTree;
}): PrismaSelect => {
  return {
    select: Object.fromEntries(
      Object.values(ast).map(field => [
        field.name,
        Object.keys(field.fieldsByTypeName).length === 0
          ? true
          : astToPrisma(Object.values(field.fieldsByTypeName)[0]),
      ]),
    ),
  };
};
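For instance, for a query like { usersWithRelationsResolver { name posts { text } } } (field names following the question's schema), astToPrisma would produce a select object of roughly this shape:

{
  select: {
    name: true,
    posts: { select: { text: true } }
  }
}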
Then you do:
import { Parent, Query, ResolveField, Resolver } from '@nestjs/graphql';
import { PrismaService } from '../services/prisma.service';
import { User } from '../entities/user.entity';
import { Relations } from 'src/decorators/relations.decorator';
import { Prisma } from '@prisma/client';

@Resolver(() => User)
export class UserResolver {
  constructor(public prisma: PrismaService) {}

  @Query(() => [User])
  async usersWithRelationsResolver(
    @Relations() relations: { select: Prisma.UserSelect },
  ): Promise<Partial<User>[]> {
    return this.prisma.user.findMany({
      ...relations,
    });
  }
}
Alternatively, if you want to solve the N+1 problem you can use Prisma's built-in findUnique method, as sketched below. See https://www.prisma.io/docs/guides/performance-and-optimization/query-optimization-performance#solving-the-n1-problem
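A minimal sketch of that approach for the resolvers above: Prisma batches findUnique calls issued in the same tick through its dataloader, so resolving the relation through the fluent API avoids one findMany per user (this assumes the question's users model has a posts relation field):

const resolvers = {
  Query: {
    allUsers: () => prisma.users.findMany(),
  },
  User: {
    // Batched by Prisma's dataloader instead of one query per user
    posts: (user) => prisma.users.findUnique({ where: { id: user.id } }).posts(),
  },
};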

graphql: Must provide Source. Received: {kind: "Document", definitions: ...}

I'm really new to GraphQL (just since yesterday, actually). I am "playing" around and trying the various tools of the ecosystem (apollo-server, graphql.js, etc.).
For the sake of experimenting, I am trying to call a query from within Node.js (and not from a client in the browser, such as a React application).
First of all, this is my simple schema along with its resolvers:
export const mySchema = gql`
  type User {
    id: ID!
    name: String
    surname: String
  }

  # root query has been defined in another file
  extend type Query {
    users: [User]
    test: [User]
  }
`
export const myResolvers = {
  users: () => [ __array_of_users__ ],
  test: () => null /* this is where I would like to re-invoke the 'users' query */
}
Using the makeExecutableSchema function, I create a schema object with my types and my resolvers, and I export this schema into the apollo-server application. Everything works fine so far.
Now, following this Stack Overflow suggested solution, I created a helper function which should allow me to invoke a query defined in my schema, as follows:
import { graphql } from "graphql";
import { schema } from "./my-schema";

export const execute = str => {
  return graphql(schema, str);
};
With this helper function, my resolvers become:
import { gql } from "apollo-server-express";
import { execute } from '__path_to_helper_function__';

export const myResolvers = {
  users: () => [ __array_of_users__ ],
  test: () => execute(gql`
    query users {
      name
    }
  `)
}
But in the playground, when I try the query:
{
  test {
    name
  }
}
I get the error shown in the title ("Must provide Source. Received: {kind: "Document", ...}").
I don't even know if what I am trying to do (calling a query from within Node) can be done. Any suggestion will be greatly appreciated.
Thanks
graphql-tag takes a string and parses it into a DocumentNode object. This is effectively the same as passing a String to the parse function. Some functions exported by the graphql module, like execute, expect to be passed in a DocumentNode object -- the graphql function does not. It should be passed just a plain String as the request, as you can see from the signature:
graphql(
  schema: GraphQLSchema,
  requestString: string,
  rootValue?: ?any,
  contextValue?: ?any,
  variableValues?: ?{[key: string]: any},
  operationName?: ?string
): Promise<GraphQLResult>
So, just drop the gql tag. You can see an (incomplete) API reference here.
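Applied to the question's code, a minimal sketch of the fix: pass a plain string, and remember that graphql() resolves to a result object whose data property holds the payload, not to the list itself (the corrected operation also has to select name through the users field):

import { graphql } from "graphql";
import { schema } from "./my-schema";

// Plain string, no gql tag
export const execute = str => graphql(schema, str);

export const myResolvers = {
  users: () => [ __array_of_users__ ],
  test: async () => {
    const result = await execute(`{ users { name } }`);
    return result.data.users;
  }
}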

Optional but non-nullable fields in GraphQL

In an update to our GraphQL API, only the model's _id field is required, hence the ! in the SDL below. Other fields, such as name, don't have to be included on an update, but they also cannot have a null value. Currently, excluding the ! from the name field allows the end user to omit name in an update, but it also allows them to pass in a null value for name, which cannot be allowed.
A null value lets us know that a field needs to be removed from the database.
Below is an example of a model where this would cause a problem - the Name custom scalar doesn't allow null values but GraphQL still allows them through:
type language {
  _id: ObjectId
  iso: Language_ISO
  auto_translate: Boolean
  name: Name
  updated_at: Date_time
  created_at: Date_time
}

input language_create {
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name!
}

input language_update {
  _id: ObjectId!
  iso: Language_ISO!
  auto_translate: Boolean
  name: Name
}
When a null value is passed in, it bypasses our scalars, so we cannot throw a user input validation error when null isn't an allowed value.
I am aware that ! means non-nullable and that the lack of ! means the field is nullable; however, it is frustrating that, as far as I can see, we cannot constrain the allowed values of a field that is optional. This issue only occurs on updates.
Are there any ways to work around this issue through custom Scalars without having to start hardcoding logic into each update resolver which seems cumbersome?
EXAMPLE MUTATION THAT SHOULD FAIL

mutation tests_language_create($input: language_update!) {
  language_update(input: $input) {
    name
  }
}

Variables

input: {
  _id: "1234",
  name: null
}
UPDATE 9/11/18: for reference, I can't find a way around this as there are issues with using custom scalars, custom directives and validation rules. I've opened an issue on GitHub here: https://github.com/apollographql/apollo-server/issues/1942
What you're effectively looking for is custom validation logic. You can add any validation rules you want on top of the "default" set that is normally included when you build a schema. Here's a rough example of how to add a rule that checks for null values on specific types or scalars when they are used as arguments:
const { specifiedRules } = require('graphql/validation')
const { GraphQLError } = require('graphql/error')

const typesToValidate = ['Foo', 'Bar']

// This returns a "Visitor" whose properties get called for
// each node in the document that matches the property's name
function CustomInputFieldsNonNull(context) {
  return {
    Argument(node) {
      const argDef = context.getArgument();
      const checkType = typesToValidate.includes(argDef.astNode.type.name.value)
      if (checkType && node.value.kind === 'NullValue') {
        context.reportError(
          new GraphQLError(
            `Type ${argDef.astNode.type.name.value} cannot be null`,
            node,
          ),
        )
      }
    },
  }
}
// We're going to override the validation rules, so we want to grab
// the existing set of rules and just add on to it
const validationRules = specifiedRules.concat(CustomInputFieldsNonNull)

const server = new ApolloServer({
  typeDefs,
  resolvers,
  validationRules,
})
EDIT: The above only works if you're not using variables, which isn't going to be very helpful in most cases. As a workaround, I was able to utilize a FIELD_DEFINITION directive to achieve the desired behavior. There's probably a number of ways you could approach this, but here's a basic example:
// Assumes SchemaDirectiveVisitor and defaultFieldResolver are imported as in
// the earlier examples, and that _ is lodash
class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const { args: { paths } } = this
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      for (const path of paths) {
        if (_.get(fieldArgs, path) === null) {
          throw new Error(`${path} cannot be null`)
        }
      }
      return resolve.apply(this, resolverArgs)
    }
  }
}
Then in your schema:
directive @nonNullInput(paths: [String!]!) on FIELD_DEFINITION

input FooInput {
  foo: String
  bar: String
}

type Query {
  foo(input: FooInput!): String @nonNullInput(paths: ["input.foo"])
}
Assuming that the "non null" input fields are the same each time the input is used in the schema, you could map each input's name to an array of field names that should be validated. So you could do something like this as well:
const nonNullFieldMap = {
  FooInput: ['foo'],
}

class NonNullInputDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition(field) {
    const { resolve = defaultFieldResolver } = field
    const visitedTypeArgs = this.visitedType.args
    field.resolve = async function (...resolverArgs) {
      const fieldArgs = resolverArgs[1]
      visitedTypeArgs.forEach(arg => {
        const argType = arg.type.toString().replace("!", "")
        // Guard against argument types that have no entry in the map
        const nonNullFields = nonNullFieldMap[argType] || []
        nonNullFields.forEach(nonNullField => {
          const path = `${arg.name}.${nonNullField}`
          if (_.get(fieldArgs, path) === null) {
            throw new Error(`${path} cannot be null`)
          }
        })
      })
      return resolve.apply(this, resolverArgs)
    }
  }
}
And then in your schema:
directive @nonNullInput on FIELD_DEFINITION

type Query {
  foo(input: FooInput!): String @nonNullInput
}
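With either variant, a query passing an explicit null for a guarded path (hypothetical values below) is now rejected at resolve time instead of being silently accepted:

query {
  foo(input: { foo: null, bar: "ok" })
}

# resolver throws: "input.foo cannot be null"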

Node.js Testing with Mongoose. unique gets ignored

I'm having a little trouble with an integration test for my mongoose application. The problem is that my unique setting gets constantly ignored. The schema looks more or less like this (so, no fancy stuff in there):
const RealmSchema: Schema = new mongoose.Schema({
  Title: {
    type: String,
    required: true,
    unique: true
  },
  SchemaVersion: {
    type: String,
    default: SchemaVersion,
    enum: [SchemaVersion]
  }
}, {
  timestamps: {
    createdAt: "Created",
    updatedAt: "Updated"
  }
});
It looks like basically all the rules set in the schema are being ignored. I can pass in a Number/Boolean where a String was required. The only thing that is working: fields that have not been declared in the schema won't be saved to the db.
First probable cause:
I have the feeling that it might have to do with the way I test. I have multiple integration tests. After each one, my database gets dropped (so I have the same conditions for every test and precondition the database in that test).
Is it possible that the reason is my indices being dropped with the database and not being reinitiated when the next test creates the database and collection again? And if this is the case, how could I make sure that after every test I get an empty database that still respects all my schema settings?
Second probable cause:
I'm using TypeScript in this project. Maybe there is something wrong in how I define the Schema and the Model. This is what I do:
1. Create the Schema (code from above)
2. Create an interface for the model (where IRealmM extends the interface for use in mongoose)
import { SpecificAttributeSelect } from "../classes/class.specificAttribute.Select";
import { SpecificAttributeText } from "../classes/class.specificAttribute.Text";
import { Document } from "mongoose";

interface IRealm {
  Title: String;
  Attributes: (SpecificAttributeSelect | SpecificAttributeText)[];
}

interface IRealmM extends IRealm, Document {
}

export { IRealm, IRealmM }
3. Create the model
import { RealmSchema } from '../schemas/schema.Realm';
import { Model } from 'mongoose';
import { IRealmM } from '../interfaces/interface.realm';

// Apply Authentication Plugin and create Model
const RealmModel: Model<IRealmM> = mongoose.model('realm', RealmSchema);

// Export the Model
export { RealmModel }
The unique option is not a validator. Check out this link from the Mongoose docs.
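In other words, unique only tells Mongoose to create a unique index in MongoDB, and violations surface as a duplicate key error from the database (code 11000) once that index exists, not as a Mongoose validation error. A minimal sketch of catching it, reusing the model from the question (inside an async function):

try {
  await RealmModel.create({ Title: "Middle-earth" });
  await RealmModel.create({ Title: "Middle-earth" }); // same Title again
} catch (err) {
  // E11000 duplicate key error, raised by MongoDB rather than Mongoose
  if (err.code === 11000) {
    console.log("Title must be unique");
  }
}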
OK, I finally figured it out. The key issue is described here:
Mongoose Unique index not working!
Solstice333 states in his answer that ensureIndex is deprecated (a warning I had been getting for some time now; I thought it was still working, though).
After adding .createIndexes() to the model, leaving me with the following code, it works (at least as long as I'm not testing; more on that after the code):
// Apply Authentication Plugin and create Model
const RealmModel:Model<IRealmM> = mongoose.model('realm', RealmSchema);
RealmModel.createIndexes();
Now the problem with this is that the indexes are set when your connection is first established, but not if you drop the database in your process (which, at least for me, occurs after every integration test).
So in my tests, the resetDatabase function looks like this to make sure all the indexes are set:
const resetDatabase = done => {
  if (mongoose.connection.readyState === 1) {
    mongoose.connection.db.dropDatabase(async () => {
      await resetIndexes(mongoose.models);
      done();
    });
  } else {
    mongoose.connection.once('open', () => {
      mongoose.connection.db.dropDatabase(async () => {
        await resetIndexes(mongoose.models);
        done();
      });
    });
  }
};

const resetIndexes = async (Models: Object) => {
  let indexesReset: any[] = [];
  for (let key in Models) {
    indexesReset.push(Models[key].createIndexes());
  }
  // Return the combined promise so callers can actually wait for the
  // indexes to be recreated (the original fired it off without awaiting)
  return Promise.all(indexesReset);
}
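For completeness, a sketch of how this might be wired into a test suite, assuming a mocha-style hook with a done callback:

// Run before every integration test so each one starts from an empty
// database that still has its unique indexes in place
beforeEach(done => {
  resetDatabase(done);
});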
