How to implement isTypeOf method? - node.js

Given this schema:
interface INode {
id: ID
}
type Todo implements INode {
id: ID
title: String!
}
type Query {
node(id: ID!): INode
}
Given this class:
export default class Todo {
constructor (public id: string, public title: string) { }
isTypeOf(value: any): Boolean {
return value instanceof Todo;
}
}
Given this resolver:
type NodeArgs = {
id: string
}
export const resolver = {
node: ({ id }: NodeArgs) => {
return new Todo('1', 'Todo 1');
}
}
When I call the query:
query {
node(id: "1") {
id
... on Todo {
title
}
}
}
Then I get the return below:
{
"errors": [
{
"message": "Abstract type INode must resolve to an Object type at runtime for field Query.node with value { id: \"1\", title: \"Todo 1\" }, received \"undefined\". Either the INode type should provide a \"resolveType\" function or each possible type should provide an \"isTypeOf\" function.",
"locations": [
{
"line": 2,
"column": 3
}
],
"path": [
"node"
]
}
],
"data": {
"node": null
}
}
As you can see, I've implemented the isTypeOf function but I am still getting the error message.
What am I doing wrong?
Notes:
I am using TypeScript, Express, and express-graphql.

isTypeOf is a function that is passed to the constructor of a GraphQLObjectType when you create your schema programmatically. The same goes for resolveType functions on unions and interfaces. If you use SDL and create your schema using buildSchema, there is no way to inject those functions into the resulting schema, just like you don't have a way to provide resolvers for fields on types other than Query and Mutation.
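For reference, here is roughly what that looks like when building a schema programmatically with graphql-js (a sketch only; the import path and the interface wiring are assumptions, not your code):

import { GraphQLObjectType, GraphQLNonNull, GraphQLID, GraphQLString } from 'graphql'
import Todo from './Todo' // hypothetical path to the Todo class above

const TodoType = new GraphQLObjectType({
  name: 'Todo',
  // isTypeOf is supplied here, as a constructor option, rather than as a method on your class
  isTypeOf: value => value instanceof Todo,
  fields: {
    id: { type: GraphQLID },
    title: { type: new GraphQLNonNull(GraphQLString) },
  },
  // interfaces: [INodeInterface] would also need to be listed here
})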
You have a couple of options. One option is to utilize the default resolveType behavior. This checks for a __typename property on the object, and falls back to calling isTypeOf on every implementing type until it matches. That means if you're using a class, it should be sufficient to do something like this:
export default class Todo {
get __typename() {
return 'Todo'
}
}
The better option would be to drop buildSchema and use makeExecutableSchema from graphql-tools. Then you can define your resolveType and/or isTypeOf functions directly in your resolvers. For example:
const resolvers = {
Query: {
node: (obj, args, context, info) => {
return new Todo('1', 'Todo 1')
}
},
INode: {
__resolveType: (obj, context, info) => {
if (obj instanceof Todo) return 'Todo'
},
}
}
Not only can you easily define isTypeOf or resolveType this way, you can also easily add resolvers for fields of any type and add custom scalars without any hassle. You cannot do any of that (easily) if you're using just buildSchema.
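For completeness, here is a minimal sketch of wiring that together with express-graphql (assumptions: the SDL string is just the schema from the question, the resolvers import path is hypothetical, and graphqlHTTP is a named export in recent express-graphql versions but the default export in older ones):

import express from 'express'
import { graphqlHTTP } from 'express-graphql'
import { makeExecutableSchema } from 'graphql-tools'
import { resolvers } from './resolvers' // hypothetical path to the resolver map shown above

const typeDefs = `
  interface INode {
    id: ID
  }
  type Todo implements INode {
    id: ID
    title: String!
  }
  type Query {
    node(id: ID!): INode
  }
`

// The executable schema already contains INode.__resolveType from the resolver map
const schema = makeExecutableSchema({ typeDefs, resolvers })

const app = express()
app.use('/graphql', graphqlHTTP({ schema, graphiql: true }))
app.listen(4000)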
Edit:
If you prefer to utilize isTypeOf instead of resolveType, the resolvers would look something like this:
const resolvers = {
Query: {
node: (obj, args, context, info) => {
return new Todo('1', 'Todo 1')
}
},
Todo: {
__isTypeOf: (obj, context, info) => {
return obj instanceof Todo
},
}
}
Only one or the other is necessary. Either write a resolveType function for every abstract type you use, or write an isTypeOf function for every object type that can be returned where an abstract type is expected.

Related

"Abstract type X must resolve to an Object type at runtime for field Query.user with value

This is my code:
schema
gql`
type Query {
user: X!
}
type User {
name: String!
}
type Time {
age: Int!
}
union X = User | Time
`;
resolvers
{
X: {
__resolveType: obj => {
if (obj.name) return { name: "Amasia" };
if (obj.age) return { age: 70 };
return null;
}
},
Query: {
user: () => {
return {
name: "Amasia"
};
}
}
}
request
query {
user{
... on User {
name
}
... on Time {
age
}
}
}
When I make the request, I get this error:
"Abstract type X must resolve to an Object type at runtime for field Query.user with value { name: \"Amasia\" }, received \"{ name: \"Amasia\" }\". Either the X type should provide a \"resolveType\" function or each possible type should provide an \"isTypeOf\" function."
What is the reason?
The resolveType function should return a string with the name of the concrete type the abstract type should resolve to. You are returning an object, not a string. In this case, you should return "User" or "Time".
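A corrected version of the resolvers from the question would look something like this (only the __resolveType return values change):

const resolvers = {
  X: {
    __resolveType: obj => {
      // Return the name of the concrete type as a string, not an object
      if (obj.name) return "User";
      if (obj.age) return "Time";
      return null;
    }
  },
  Query: {
    user: () => {
      return {
        name: "Amasia"
      };
    }
  }
};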
Simply add the __typename to the object to resolve it:
{
Query: {
user: () => {
return {
__typename: 'User',
name: "Amasia"
};
}
}
}

Cyclic schema giving error as "Error: Schema must contain unique named types but contains multiple types named "Node"."

I am new to GraphQL with Node.js. I am stuck on bi-directional schema mapping (posts <-> authors), using the graphql and graphql-relay modules.
Following are the two schemas we are using.
--posts.js // here we are requiring authors.js
const {
AuthorType,
schema: AuthorSchema,
AuthorsConnection
} = require('./authors');
class Post {}
const {
nodeInterface,
nodeField
} = nodeDefinitions(
globalId => {
const {
type,
id
} = fromGlobalId(globalId);
// return based on the id
return DataSource['posts'][0];
},
obj => {
console.log(" : PostType : ", PostType);
// type to be return
return Post;
}
);
const PostType = new GraphQLObjectType({
"name": "PostType",
"description": "Posts type and it's relevant fields",
"fields": () => ({
"id": globalIdField('Post'),
"title": {
"type": GraphQLString
},
"body": {
"type": GraphQLString
},
"author": {
"type": AuthorsConnection,
"resolve": (parent, argument, root, currentSdl) => {
console.log("v1, v2, v3, v4 :", parent);
if (parent.author)
return connectionFromArray(DataSource['authors'], {})
return [];
}
}
}),
isTypeOf: Post,
interfaces: [nodeInterface]
});
const {
connectionType: PostsConnection,
edgeType: GQLPostEdge
} = connectionDefinitions({
name: "Post",
nodeType: PostType
});
module.exports = exports = {
PostType,
PostsConnection,
schema: {
post: nodeField,
posts: {
type: PostsConnection,
resolve: (root, v2, v3) => {
return connectionFromArray(DataSource['posts'], {});
}
}
}
};
--authors.js // here we have required posts.js
const {
PostType,
PostsConnection
} = require('./posts');
class Author {}
const {
nodeInterface,
nodeField
} = nodeDefinitions(
globalId => {
const {
type,
id
} = fromGlobalId(globalId);
// return based on the id
return DataSource['authors'][0];
},
obj => {
console.log(" : Authorype : ", Authorype);
// type to be return
return Author;
}
);
const AuthorType = new GraphQLObjectType({
"name": "AuthorType",
"description": "Author type and it's relevant fields",
"fields": () => ({
"id": globalIdField('Author'),
"firstName": {
"type": GraphQLString
},
"lastName": {
"type": GraphQLString
},
authorPosts: {
type: PostsConnection,
resolve: (parent, args, root, context) => {
return connectionFromArray(DataSource['posts'], {});
}
}
}),
isTypeOf: null,
interfaces: [nodeInterface]
});
const {
connectionType: AuthorsConnection,
edgeType: GQLAuthorEdge
} = connectionDefinitions({
name: "Author",
nodeType: AuthorType
});
module.exports = exports = {
AuthorType,
AuthorsConnection,
schema: {
author: nodeField,
authors: {
type: AuthorsConnection,
resolve: (root, v2, v3) => {
return connectionFromArray(DataSource['authors'], {});
}
}
}
};
Once I merge the above schemas, I get the following error:
Error: Schema must contain unique named types but contains multiple types named "Node".
I tried to debug this issue and observed the following:
If I change the "author" field in the posts schema to a type other than AuthorsConnection, it starts working.
Or, if I remove the "author" field from the posts schema, it starts working.
Please let me know what the issue is here. Is it related to the nodeDefinitions function?
It is indeed related to the nodeDefinitions function. From the graphql-relay docs:
nodeDefinitions returns the Node interface that objects can implement, and returns the node root field to include on the query type. To implement this, it takes a function to resolve an ID to an object, and to determine the type of a given object.
You're calling this twice, which is resulting in the Node type being defined twice, and you're referencing one of each:
schema: {
post: nodeField,
// ...
schema: {
author: nodeField,
This is causing the error: there are now two independent instances of Node, which is invalid.
The solution is to only call nodeDefinitions once, and then pass the reference to the generated nodeField and nodeInterface to the relevant places. Then your globalId => {...} function will need to look at the type to figure out how to get the relevant record, be it an author or a post.
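A sketch of what that could look like, assuming a shared module (the file name node.js, the DataSource lookups, and the shape-based type check are illustrative assumptions based on the code in the question):

// node.js -- calls nodeDefinitions exactly once
const { nodeDefinitions, fromGlobalId } = require('graphql-relay');
const DataSource = require('./data-source'); // hypothetical path to the question's DataSource

const { nodeInterface, nodeField } = nodeDefinitions(
  globalId => {
    const { type, id } = fromGlobalId(globalId);
    // Fetch the record according to the decoded type and id
    if (type === 'Post') return DataSource['posts'].find(post => post.id === id);
    if (type === 'Author') return DataSource['authors'].find(author => author.id === id);
    return null;
  },
  obj => {
    // Determine the GraphQL type name for a fetched object
    // (a shape-based check so this module doesn't need to require posts.js/authors.js;
    // older graphql versions expect the GraphQLObjectType itself instead of its name)
    return obj.title !== undefined ? 'PostType' : 'AuthorType';
  }
);

module.exports = { nodeInterface, nodeField };

Both posts.js and authors.js would then do const { nodeInterface, nodeField } = require('./node'); instead of calling nodeDefinitions themselves.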
Along with the answer given by Benjie above, here is how I overcame the issues that resulted in the error Error: Schema must contain unique named types but contains multiple types named "Node".
Following are the key points to check when building a GraphQL schema in a modular way:
Don't create new instances of a type. For example, const PostType = new GraphQLObjectType({}) should always be exported and reused as a single object rather than constructed anew each time.
Use nodeDefinitions only once.
Check for cyclic require issues, a common JavaScript pitfall.
Thanks.

Graphiql variables not being passed to server

I'm building an Apollo Server. I have one simple endpoint communicating with Mongo. There's a collection of announcements.
export const typeDefs = gql`
type Query {
announcements: [Announcement]
announcementsByAuthor(author: String!): [Announcement]
}
type Announcement {
_id: ID!
msg: String!
author: String!
title: String
}
`;
export const resolvers = {
Query: {
announcements: () => {
return new AnnouncementController().getAnnouncements();
},
announcementsByAuthor: (author: string) => {
console.log('RESOLVER: ', author);
return new AnnouncementController().getAnnouncementsByAuthor(author);
}
},
}
In my GraphiQL interface, the announcements query works correctly:
{
announcements {
msg
author
}
}
The announcementsByAuthor query does not seem to be accepting the string argument, either from a variable or when hardcoded into the query.
query($author: String!){
announcementsByAuthor(author: $author) {
msg
author
}
}
Variables:
{
"author":"Nate"
}
I've logged the value from inside the resolver, and an empty string is being passed in instead of the specified value for the author variable. I'm new to GraphQL and I'm hoping someone can enlighten me as to what I'm sure is a simple oversight.
Try this instead:
announcementsByAuthor: (doc, {author}) => {
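Resolver functions are called with (parent, args, context, info), so the field arguments arrive as the second parameter and author has to be destructured from there. A fuller sketch, reusing AnnouncementController from the question (the import path and TypeScript annotations are illustrative):

import { AnnouncementController } from './AnnouncementController' // hypothetical path

export const resolvers = {
  Query: {
    announcements: () => {
      return new AnnouncementController().getAnnouncements();
    },
    // The first argument is the parent value; the field arguments come second
    announcementsByAuthor: (_doc: unknown, { author }: { author: string }) => {
      console.log('RESOLVER: ', author);
      return new AnnouncementController().getAnnouncementsByAuthor(author);
    },
  },
};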

GraphQL: How can I get field arguments of a type in a resolver?

I use the graphql-tools library and its makeExecutableSchema function to build my schema, passing the type definitions and resolvers to it.
Here is my schema:
type Trip {
code: String!
driver: User!
vehicle: Vehicle!
destination: Location!
passengers(page: Int, count: Int): [User!]!
}
type Query {
trip(id: String!): Trip
}
and here is my resolver:
// some imports...
export default {
Query: {
async trip(_, { id }, ctx, info) {
const trip = await Trip.findById(id);
// const page = ???, count = ???
// work on fetch data...
return result;
},
},
};
How can I get page and count, which are defined as arguments on the nested passengers field?
You should define a resolver for the type Trip, such as:
export default {
Query: {
async trip(_, { id }, ctx, info) {
const trip = await Trip.findById(id);
// const page = ???, count = ???
// work on fetch data...
return result;
},
},
Trip: {
async passengers(trip, { page, count }, ctx, info) {
...
},
}
};
In GraphQL there is no real concept of "nested fields of a type", only fields whose values have their own types. The trip field of the Query type returns the Trip type, so passengers should be treated as a field of the Trip type, not as a nested field of the Query type.

Dynamic default scopes in sequelize.js

I am building a kind of multitenancy using sequelize.js. Technically, I need to filter all queries by a predefined column and a dynamic value from the current context. The general idea was to use defaultScope to filter out other contexts, something like:
var context = () => { return "some current context id"; }
connection.define('kid', {
firstName: Sequelize.STRING,
photoUrl: Sequelize.STRING,
context: {
type: Sequelize.STRING,
defaultValue: context // this part works, it accepts function
}
}, {
defaultScope: {
where: {
context: context // this does not work, it does not accept function and values is defined only once
}
}
});
However, this does not work because the defaultScope is evaluated only once, at application start.
What is the right way to do this?
The problem is that Sequelize scopes are defined on the model but you need to apply the scope just before the query because that's when you have context such as the user and role.
Here's a slightly modified copy of the scope merge function from Sequelize, which you can use in your hooks such as beforeFind().
// Feel free to write a more fp version; mutations stink.
const {assign, assignWith} = require('lodash')
const applyScope = ({scope, options}) => {
if (!scope) {
throw new Error('Invalid scope.')
}
if (!options) {
throw new Error('Invalid options.')
}
assignWith(options, scope, (objectValue, sourceValue, key) => {
if (key === 'where') {
if (Array.isArray(sourceValue)) {
return sourceValue
}
return assign(objectValue || {}, sourceValue)
}
else if (['attributes', 'include'].indexOf(key) >= 0
&& Array.isArray(objectValue)
&& Array.isArray(sourceValue)
) {
return objectValue.concat(sourceValue)
}
return objectValue ? objectValue : sourceValue
})
}
In your model:
{
hooks: {
beforeFind(options) {
// Mutates options...
applyScope({
scope: this.options.scopes.user(options.user)
, options
})
return options
}
}
, scopes: {
user(user) {
// Set the scope based on user/role.
return {
where: {
id: user.id
}
}
}
}
}
Finally in your query, set an option with the context that you need.
const user = {id: 12, role: 'admin'}
YourModel.findOne({
attributes: [
'id'
]
, where: {
status: 'enabled'
}
, user
})
I'm not sure it will help, but you can override a model's default scope at any time.
let defaultScope = {
where: {
context: ""
}
};
defaultScope.where.context = context();
model.addScope('defaultScope',defaultScope,{override: true});
Maybe too late here, but scopes can take arguments if they are defined as functions. From the Sequelize scope docs, if the scope is defined as:
{
scopes: {
accessLevel (value) {
return {
where: {
accessLevel: {
[Op.gte]: value
}
}
}
}
},
sequelize,
modelName: 'project'
}
You can use it like Project.scope({ method: ['accessLevel', 19] }).findAll(), where 19 is the dynamic value the scope will use.
As for defaultScope, I'm not sure it can be defined as a function.
