Trouble migrating from graphql-import to just graphql-tools with ApolloServer, directives cease to work - node.js

My plight began as a simple desire to expand my GraphQL schema from a single .graphql file to multiple files so I could better organize the schema and it wouldn't grow into one huge, unmanageable file.
My original layout was very straightforward and I had a working schema in a schema.graphql file. I was able to parse it into a string using importSchema('server/schema.graphql') from the graphql-import library, which is now deprecated: https://github.com/ardatan/graphql-import
They mention that it has been merged into graphql-tools in the newest version and provide a migration tutorial here: https://www.graphql-tools.com/docs/migration-from-import The tutorial seems very straightforward since their first example pretty much illustrates exactly what my code looks like (except I don't use ES6 import but old-fashioned require):
import { importSchema } from 'graphql-import';
import { makeExecutableSchema } from 'graphql-tools';
const typeDefs = importSchema(join(__dirname, 'schema.graphql'));
const resolvers = {
  Query: {...}
};
const schema = makeExecutableSchema({ typeDefs, resolvers });
And then they say that to migrate, you simply make these changes:
import { loadSchemaSync } from '@graphql-tools/load';
import { GraphQLFileLoader } from '@graphql-tools/graphql-file-loader';
import { addResolversToSchema } from '@graphql-tools/schema';
const schema = loadSchemaSync(join(__dirname, 'schema.graphql'), { loaders: [new GraphQLFileLoader()] });
const resolvers = { Query: {...} };
const schemaWithResolvers = addResolversToSchema({
  schema,
  resolvers,
});
I made those changes, but the vital difference is that they no longer use makeExecutableSchema() in their example, which is pretty important for me since I need to include the directives. What do I do now with the schema? How do I declare the directives? Their documentation for directives still uses makeExecutableSchema, but I can't use it anymore since the new loadSchemaSync function returns a GraphQLSchema object instead of the string I would need to pass as typeDefs to makeExecutableSchema.
I am using apollo-server, so it seemed a possible workaround was to declare the directives in the ApolloServer constructor and pass in this new schemaWithResolvers as the schema, like so:
const server = new ApolloServer({
  schema, // this now contains the return value of addResolversToSchema()
  schemaDirectives: {
    auth: AuthDirective,
    authRole: AuthRoleDirective
  },
  context: ({req}) => { /* do stuff */ }
});
This allows my server to run, and I can perform queries and mutations; however, my directives are no longer working, and I no longer have authentication on protected queries.
I would like a way to import my .graphql file and parse it into a string so I can use it inside typeDefs as I used to with importSchema(), or a way to declare my directives without using makeExecutableSchema() so that they work again!
I have gone up and down the documentation and looked at other libraries, and so far I keep coming up short. Any tips or guidance is greatly appreciated.

makeExecutableSchema is still part of graphql-tools and you can continue to use it as shown here in the docs. The issue with the example shown in the docs is that it's not equivalent to what you were doing before. You should use loadTypedefsSync instead:
import { loadTypedefsSync } from '@graphql-tools/load';
import { GraphQLFileLoader } from '@graphql-tools/graphql-file-loader';
import { makeExecutableSchema } from '@graphql-tools/schema';
const sources = loadTypedefsSync(join(__dirname, 'schema.graphql'), { loaders: [new GraphQLFileLoader()] });
const documentNodes = sources.map(source => source.document);
const resolvers = { Query: {...} };
const schema = makeExecutableSchema({ typeDefs: documentNodes, resolvers });
Alternatively, if you go the loadSchema route, you should be able to apply the directives to your schema after loading it:
import { SchemaDirectiveVisitor } from "@graphql-tools/utils";
import { loadSchemaSync } from '@graphql-tools/load';
import { GraphQLFileLoader } from '@graphql-tools/graphql-file-loader';
import { addResolversToSchema } from '@graphql-tools/schema';
const schema = loadSchemaSync(join(__dirname, 'schema.graphql'), { loaders: [new GraphQLFileLoader()] });
const resolvers = { Query: {...} };
const schemaWithResolvers = addResolversToSchema({
  schema,
  resolvers,
});
SchemaDirectiveVisitor.visitSchemaDirectives(schemaWithResolvers, schemaDirectives);

I tried this way but I couldn't solve the problem. The only solution that worked for me was the following approach:
const { ApolloServer, makeExecutableSchema, gql} = require('apollo-server-express')
const { loadTypedefsSync } = require('@graphql-tools/load')
const { GraphQLFileLoader } = require('@graphql-tools/graphql-file-loader')
const path = require('path')
const sources = loadTypedefsSync(
  path.resolve(__dirname, '../schema/root.graphql'),
  { loaders: [new GraphQLFileLoader()] }
)
const typeDefs = sources.map(source => source.document)
const schema = makeExecutableSchema({
  typeDefs: gql`${typeDefs[0]}`,
  resolvers,
})

I had the same issue: I loaded the schema via .graphql files and wanted to add the graphql-constraint-directive. My solution was to load the schema with loadSchemaSync and then use wrapSchema to apply the transform functions. You also need to add the directive declarations to one of your .graphql files:
import { addResolversToSchema, wrapSchema } from 'graphql-tools';
import { loadSchemaSync } from '@graphql-tools/load';
import { GraphQLFileLoader } from '@graphql-tools/graphql-file-loader';
import resolvers from './resolver';
const schema = loadSchemaSync('./**/*.graphql', {
  loaders: [new GraphQLFileLoader()],
});
const schemaWithResolver = addResolversToSchema({
  schema,
  resolvers
});
const { constraintDirective } = require('graphql-constraint-directive')
const schemaConstrain = wrapSchema({
  schema: schemaWithResolver,
  transforms: [constraintDirective()]
})
Documentation to Schema Wrapping

Related

How to search or find data from multiple collection in mongoose using Node Js

I am building a MERN movie app, where I have to implement a search field, but I am stuck on how to run a search query through multiple collections in MongoDB and get the data based on the search in Node.js.
controllers/movies.js
import MoviePost from "../../models/movies/moviesSchema.js";
import mongoose from 'mongoose'
export const getAllMov=async(req,res)=>{
  try {
    const data=await MoviePost.find().sort({_id:-1})
    res.status(201).json(data)
  } catch (error) {
    res.status(401).json('error')
  }
}
export const getMovById=async(req,res)=>{
  const {id}=req.params
  try {
    const data=await MoviePost.findById(id)
    res.status(201).json(data)
  } catch (error) {
    res.status(401).json('error')
  }
}
export const getMov=async(req,res)=>{
  const limit=10
  const page=1
  try {
    const data=await MoviePost.find().sort({_id:-1}).limit(limit * 1).skip((page-1)*limit)
    res.status(201).json(data)
  } catch (error) {
    res.status(401).json('error')
  }
}
export const createMov=async(req,res)=>{
  const data=req.body
  const movie=new MoviePost(data)
  try {
    await movie.save()
    res.status(201).json(movie)
  } catch (error) {
    res.status(401).json('error')
  }
}
export const updateMov=async(req,res)=>{
  const {id}=req.params
  const updateMovie=req.body
  if(!mongoose.Types.ObjectId.isValid((id))) return res.status(401).json('no data with that id')
  const updatedMovie=await MoviePost.findByIdAndUpdate(id,updateMovie,{new:true})
  res.status(201).json(updatedMovie)
}
export const deleteMov=async(req,res)=>{
  const {id}=req.params
  if(!mongoose.Types.ObjectId.isValid((id))) return res.status(401).json('no data with that id')
  await MoviePost.findByIdAndRemove(id)
  res.status(201).json('data Deleted')
}
export const getDataBySearch=async(req,res)=>{
  const {searchQuery}=req.query
  try {
    const title=new RegExp(searchQuery,'i')
    const data=await MoviePost.find({title})
    res.status(201).json(data)
  } catch (error) {
    res.status(404).json({message:error.message})
  }
}
In models there are three mongoose models with the same schema, named as in the code:
Movies
Trending
Webshow
I am putting movies.js here; the same applies to all models.
models/movies.js
import mongoose from 'mongoose'
const MovieSchema=mongoose.Schema({
  poster:String,
  youtube:String,
  title:String,
  genre:[String],
  director:String,
  duration:String,
  quality:String,
  release:String,
  imdb:String,
  name:String,
  description:String,
  detailtitle:String,
  screenshots:[String],
  createdAt:{
    type:Date,
    default:Date.now // pass the function, not new Date(), so each document gets its own timestamp
  }
})
var WebShowPost=mongoose.model('Webshow',MovieSchema)
export default WebShowPost
As for the models, the routes are all similar.
routes/movies.js
import express from 'express'
const router=express.Router()
import { createMov, deleteMov, getAllMov, getMov, getMovById, updateMov,getDataBySearch } from '../../controllers/movies/movies.js'
router.get('/search',getDataBySearch)
router.get('/',getMov)
router.get('/all',getAllMov)
router.get('/:id',getMovById)
router.post('/',createMov)
router.patch('/:id',updateMov)
router.delete('/:id',deleteMov)
export default router
in main server.js
import bodyParser from 'body-parser'
import cors from 'cors'
import dotenv from 'dotenv'
import mongoose from 'mongoose'
import express from 'express'
import movieRoutes from './routes/routes.js'
import AdminRoutes from './routes/Admin.js'
import trendingCardRouter from './routes/movies/Trending.js'
import movieCardRouter from './routes/movies/movies.js'
import WebShowRouter from './routes/movies/WebShow.js'
const app=express()
dotenv.config({path:'./.env'})
app.use(bodyParser.json({limit:'4gb',extended:true}))
app.use(bodyParser.urlencoded({limit:'4gb',extended:true}))
app.use(cors())
app.use('/movies',movieRoutes)
app.use('/admin',AdminRoutes)
app.use('/trendCard',trendingCardRouter)
app.use('/movieCard',movieCardRouter)
app.use('/webshowCard',WebShowRouter)
app.get('/',(req,res)=>{
res.send('hello')
})
mongoose.connect(process.env.MONGO_URI,{useNewUrlParser:true,useUnifiedTopology:true})
.then(()=>app.listen('5000',()=> console.log('connected')))
.catch((err)=>console.log(err))
mongoose.set('useFindAndModify',false)
And please also tell me how to send the data back directly to the frontend.
For searching the database you should send the query to each collection, and I recommend using Promise.all to make your search faster. For example, if you want to search in two collections, users and orders, do it like this:
Promise.all([
  Users.find(query),
  Orders.find(query)
])
.then(results => {
  var users = results[0] // first call in Promise.all
  var orders = results[1] // second call in Promise.all
})
.catch(errors => {
  console.log('errors:', errors)
})
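Applied to the three collections from the question (Movies, Trending, Webshow), the same Promise.all idea can power one Express handler that sends the combined result straight back to the frontend with res.json(). This is only a sketch: searchAll and the source tag are names invented here, and .lean() is assumed so the mongoose documents spread into plain objects.

```javascript
// Search the same field across several mongoose models in parallel and tag
// each hit with the collection it came from. `searchAll` and the `source`
// tag are hypothetical names, not part of the original code.
async function searchAll(models, searchQuery) {
  const title = new RegExp(searchQuery, 'i');
  const perModel = await Promise.all(
    Object.entries(models).map(async ([source, model]) => {
      const docs = await model.find({ title }).lean(); // .lean() -> plain objects
      return docs.map(doc => ({ source, ...doc }));
    })
  );
  return perModel.flat(); // one flat array is easiest to consume on the frontend
}

// Hypothetical route: res.json() is all that's needed to send data back.
// router.get('/searchAll', async (req, res) => {
//   const data = await searchAll(
//     { Movies: MoviePost, Trending: TrendingPost, Webshow: WebShowPost },
//     req.query.searchQuery
//   );
//   res.status(200).json(data);
// });
```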
You can use an aggregation pipeline: in the first stage you can merge documents from different collections with $unionWith (available in MongoDB 4.4+), and then in the second stage you can do the search on some field with $match.
Collection_1.aggregate([
  {
    "$unionWith": {
      "coll": "collection_2"
    }
  },
  {
    "$match": {
      "field_name": {
        "$regex": "search_query"
      }
    }
  }
])

TypeError when trying to create GraphQL Directive

I'm using Apollo-Server/TypeScript and utilizing graphql-tools' makeExecutableSchema() to set up the schema/directives.
I'm currently getting this error when trying to add a barebones simple GraphQL Directive:
TypeError: Class constructor SchemaDirectiveVisitor cannot be invoked without 'new' at new AuthDirective
(/home/node/app/src/api/directives/AuthDirective.ts:58:42)
Here is the setup for the schema:
import AuthDirective, { authTypeDefs } from "./directives/AuthDirective";
import { makeExecutableSchema } from "graphql-tools";
const schema = makeExecutableSchema({
  resolvers: [...],
  typeDefs: [...], // authTypeDefs is included here
  schemaDirectives: {
    auth: AuthDirective,
  },
});
export default schema;
export default schema;
The AuthDirective file:
import { SchemaDirectiveVisitor } from "graphql-tools";
import { defaultFieldResolver } from "graphql";
export default class AuthDirective extends SchemaDirectiveVisitor {
  public visitFieldDefinition(field) {
    console.log("VISIT FIELD: ", field);
    const { resolve = defaultFieldResolver } = field;
    field.resolve = async function (...args) {
      return resolve.apply(this, args);
    };
  }
}
export const authTypeDefs = `
  enum AppRole {
    USER
    ADMIN
  }
  directive @auth(
    requires: AppRole! = USER
  ) on FIELD_DEFINITION
`;
I've been following the documentation here. Everything seems to be in order but I could be overlooking something.
What's odd, though, is that the error points to line 58 in the AuthDirective file, even though the file is only 23/24 lines long (the stack trace is presumably referring to the compiled output rather than the TypeScript source).
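For what it's worth, a "Class constructor X cannot be invoked without 'new'" TypeError typically shows up when TypeScript transpiles a subclass down to ES5 while the base class (here SchemaDirectiveVisitor) ships as a native ES2015 class, and a stack-trace line number beyond the end of the source file is consistent with the trace pointing at compiled output. So, independent of the fix the author eventually used, it's worth checking that the compile target is at least ES2015. This tsconfig fragment is an assumption about the project setup, not taken from the question:

```json
{
  "compilerOptions": {
    "target": "ES2017"
  }
}
```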
Fixed the issue. I changed the implementation to use graphql-tools directive resolvers instead of the class-based SchemaDirectiveVisitor approach. Documentation here
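For anyone landing here, the directive-resolvers style boils down to plain wrapper functions, so no class (and no new) is involved. A minimal sketch, with the caveat that the context.user shape and the role names are assumptions for illustration, not taken from the question:

```javascript
// Sketch of graphql-tools `directiveResolvers`: each entry is a plain function
// that wraps the field's original resolver (`next`). The `context.user` shape
// and the role names here are assumptions for illustration.
const directiveResolvers = {
  auth(next, source, args, context) {
    if (!context.user) throw new Error('Not authenticated');
    return next(); // call through to the original field resolver
  },
  authRole(next, source, { requires }, context) {
    if (!context.user || context.user.role !== requires) {
      throw new Error(`Must have role: ${requires}`);
    }
    return next();
  },
};
```

They are then passed to makeExecutableSchema({ typeDefs, resolvers, directiveResolvers }) instead of schemaDirectives.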

Firestore where 'IN' query on google-cloud firestore

(Using typescript for better readability. Vanilla js is always welcome)
Nodejs App, using these imports:
import { FieldPath, DocumentReference } from '@google-cloud/firestore';
and this function
async getByIds(ids: DocumentReference[]) {
  const collection = this.db.client.collection('authors');
  const query = await collection.where(FieldPath.documentId(), 'in', ids).get();
  return query.docs.map(d => ({ id: d.id, ...d.data() }));
}
returns this very specific error:
The corresponding value for FieldPath.documentId() must be a string or a DocumentReference.
The debugger confirms that ids is in fact a DocumentReference array.
Maybe the @google-cloud/firestore package isn't aligned with the firebase one?
EDIT:
as noted by Doug in his comment, I forgot to include the code for this.db.client. Here it is:
export class DatabaseProvider {
  private _db: Firestore;
  get client(): Firestore {
    return this._db;
  }
  constructor() {
    this._db = new Firestore({
      projectId: ...,
      keyFilename: ...
    });
  }
}
And used as
const db = new DatabaseProvider();
It seems like what you're trying to do is a batch get, which is available via a different method: getAll(). I think you want this:
async getByIds(ids: DocumentReference[]) {
  return this.db.client.getAll(...ids);
}
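If you do want to keep the where(..., 'in', ...) route instead of getAll(), two caveats apply: with FieldPath.documentId() it is common to pass plain document ID strings, and Firestore limits 'in' queries to 10 values, so longer lists need chunking. A sketch under those assumptions (chunk and getByIdStrings are names invented here):

```javascript
// Split an array into pieces of at most `size` elements.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) out.push(arr.slice(i, i + size));
  return out;
}

// Hypothetical variant of getByIds that takes document ID strings and works
// around the 10-value cap on 'in' queries by issuing one query per chunk.
async function getByIdStrings(collection, FieldPath, ids) {
  const snapshots = await Promise.all(
    chunk(ids, 10).map(part =>
      collection.where(FieldPath.documentId(), 'in', part).get()
    )
  );
  return snapshots.flatMap(snap =>
    snap.docs.map(d => ({ id: d.id, ...d.data() }))
  );
}
```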

Unknown type "Upload" in Apollo Server 2.6

I want to upload a file through GraphQL, and followed this article.
Here's my schema:
extend type Mutation {
  bannerAdd(
    title: String!
    image: Upload
  ): ID
}
However when I run the app, this gives me this error:
Unknown type "Upload". Did you mean "Float"?
Following the above article, Apollo Server should automatically generate the Upload scalar, so why is this happening?
Also, defining the Upload scalar manually doesn't work either:
scalar Upload
...
Gives me this error:
Error: There can be only one type named "Upload".
Seems there's nothing wrong with my code. Is there anything that I missed? Using Node@10.14.2, Apollo Server@2.6.1, Apollo Server Express@2.6.1 and polka@0.5.2.
Any advice will very appreciate it.
I fixed this problem by using Apollo Server's GraphQLUpload to create a custom scalar called FileUpload.
Server setup with Apollo Server:
const {ApolloServer, gql, GraphQLUpload} = require('apollo-server');
const typeDefs = gql`
  scalar FileUpload
  type File {
    filename: String!
    mimetype: String!
    encoding: String!
  }
  type Query {
    uploads: [File]
  }
  type Mutation {
    singleUpload(file: FileUpload!): File!
  }
`;
const resolvers = {
  FileUpload: GraphQLUpload,
  Query: {
    uploads: (parent, args) => {},
  },
  Mutation: {
    singleUpload: async (_, {file}) => {
      const {createReadStream, filename, mimetype, encoding} = await file;
      const stream = createReadStream();
      // Rest of your code: validate file, save in your DB and static storage
      return {filename, mimetype, encoding};
    },
  },
};
const server = new ApolloServer({
  typeDefs,
  resolvers,
});
server.listen().then(({url}) => {
  console.log(`🚀 Server ready at ${url}`);
});
Client Setup with Apollo Client and React.js:
You need to install the apollo-upload-client package too.
import React from 'react';
import ReactDOM from 'react-dom';
import { ApolloClient, InMemoryCache, ApolloProvider, gql, useMutation } from '@apollo/client';
import { createUploadLink } from 'apollo-upload-client';
const httpLink = createUploadLink({
  uri: 'http://localhost:4000'
});
const client = new ApolloClient({
  link: httpLink,
  cache: new InMemoryCache()
});
const UPLOAD_FILE = gql`
  mutation uploadFile($file: FileUpload!) {
    singleUpload(file: $file) {
      filename
      mimetype
      encoding
    }
  }
`;
function FileInput() {
  const [uploadFile] = useMutation(UPLOAD_FILE);
  return (
    <input
      type="file"
      required
      onChange={({target: {validity, files: [file]}}) =>
        validity.valid && uploadFile({variables: {file}})
      }
    />
  );
}
function App() {
  return (
    <ApolloProvider client={client}>
      <div>
        <FileInput/>
      </div>
    </ApolloProvider>
  );
}
ReactDOM.render(
  <React.StrictMode>
    <App/>
  </React.StrictMode>,
  document.getElementById('root')
);
Here's the solution: what I did was add a custom scalar named "FileUpload" and set GraphQLUpload as its resolver, like this:
import { GraphQLUpload } from 'graphql-upload';
export const resolvers = {
FileUpload: GraphQLUpload
};
It works great, but it may not be a perfect solution. I hope Apollo fixes this soon.
P.S. To upload a file from your browser, you also need to set up the upload link in Apollo Client properly. Here's my code:
import { ApolloLink, split } from 'apollo-link';
import { createHttpLink } from 'apollo-link-http';
import { createUploadLink } from 'apollo-upload-client';
// Create HTTP Link
const httpLink = createHttpLink({
uri: ...,
credentials: 'include'
});
// Create File Upload Link
const isFile = value =>
  (typeof File !== 'undefined' && value instanceof File) ||
  (typeof Blob !== 'undefined' && value instanceof Blob);
const isUpload = ({ variables }) => Object.values(variables).some(isFile);
const uploadLink = createUploadLink({
  uri: ...,
  credentials: 'include'
});
const terminatingLink = split(isUpload, uploadLink, httpLink);
const link = ApolloLink.from([<Some Other Link...>, <Another Other Link...>, terminatingLink]);
const apolloClient = new ApolloClient({
link,
...
});
This issue can be caused by passing an executable schema (schema option) when initializing your server instead of the newer API of passing typeDefs and resolvers separately.
Old:
const server = new ApolloServer({
schema: makeExecutableSchema({ typeDefs, resolvers })
})
New:
const server = new ApolloServer({
typeDefs,
resolvers,
})
Or as explained in the docs:
Note: When using typeDefs, Apollo Server adds scalar Upload to your schema, so any existing declaration of scalar Upload in the type definitions should be removed. If you create your schema with makeExecutableSchema and pass it to ApolloServer constructor using the schema param, make sure to include scalar Upload.

Need to find the error with connecting subscription with schema stitching

I am using apollo-server-express for graphql back-end. I am going to process only mutations there, but I want to redirect query and subscription on hasura by means of schema stitching with introspection. Queries through apollo-server to hasura are working fine and returning the expected data.
But subscriptions are not working and I am getting this error: " Expected Iterable, but did not find one for field subscription_root.users".
Besides, the hasura server is receiving the events, but apollo-server never sends hasura's answer back to the client. It is not the first day I have suffered with this, and I cannot understand what the problem is.
In the hasura editor, subscriptions work.
Link to full code
If you need any additional info, I will gladly provide it to you.
import {
  introspectSchema,
  makeExecutableSchema,
  makeRemoteExecutableSchema,
  mergeSchemas,
  transformSchema,
  FilterRootFields
} from 'graphql-tools';
import { HttpLink } from 'apollo-link-http';
import nodeFetch from 'node-fetch';
import { resolvers } from './resolvers';
import { hasRoleResolver } from './directives';
import { typeDefs } from './types';
import { WebSocketLink } from 'apollo-link-ws';
import { split } from 'apollo-link';
import { getMainDefinition } from 'apollo-utilities';
import { SubscriptionClient } from 'subscriptions-transport-ws';
import * as ws from 'ws';
import { OperationTypeNode } from 'graphql';
interface IDefinitionsParams {
  operation?: OperationTypeNode,
  kind: 'OperationDefinition' | 'FragmentDefinition'
}
const wsurl = 'ws://graphql-engine:8080/v1alpha1/graphql';
const getWsClient = function (wsurl: string) {
  const client = new SubscriptionClient(wsurl, {
    reconnect: true,
    lazy: true
  }, ws);
  return client;
};
const wsLink = new WebSocketLink(getWsClient(wsurl));
const createRemoteSchema = async () => {
  const httpLink = new HttpLink({
    uri: 'http://graphql-engine:8080/v1alpha1/graphql',
    fetch: (nodeFetch as any)
  });
  const link = split(
    ({ query }) => {
      const { kind, operation }: IDefinitionsParams = getMainDefinition(query);
      console.log('kind = ', kind, 'operation = ', operation);
      return kind === 'OperationDefinition' && operation === 'subscription';
    },
    wsLink,
    httpLink,
  );
  const remoteSchema = await introspectSchema(link);
  const remoteExecutableSchema = makeRemoteExecutableSchema({
    link,
    schema: remoteSchema
  });
  const renamedSchema = transformSchema(
    remoteExecutableSchema,
    [
      new FilterRootFields((operation, fieldName) => {
        return (operation === 'Mutation') ? false : true; // && fieldName === 'password'
      })
    ]
  );
  return renamedSchema;
};
export const createNewSchema = async () => {
  const hasuraExecutableSchema = await createRemoteSchema();
  const apolloSchema = makeExecutableSchema({
    typeDefs,
    resolvers,
    directiveResolvers: {
      hasRole: hasRoleResolver
    }
  });
  return mergeSchemas({
    schemas: [
      hasuraExecutableSchema,
      apolloSchema
    ]
  });
};
Fixed by installing graphql-tools version 4. It turns out the editor did not even notice that I did not have this dependency and simply took the version from node_modules that had been installed by some other package. The problem was with version 3.x. This pull request is where the bug was fixed.
I had the same problem, with a different cause and solution.
My subscription was working well until I introduced the 'resolve' key in my subscription resolver.
Here is the 'Subscription' part of my resolver:
Subscription: {
  mySubName: {
    resolve: (payload) => {
      console.log('In mySubName resolver, payload:', payload)
      return payload;
    },
    subscribe: () => pubSub.asyncIterator(['requestsIncomplete']),
  },
},
The console.log proved the resolve() function was being called with a well-structured payload (shaped the same as my schema definition: an object with a key named after the GraphQL subscription field, pointing to an array, and an array is an iterable):
In mySubName resolver, payload: { mySubName:
  [ { id: 41,
      ...,
    },
    {...},
    {...},
    ...
  ] }
Even though I was returning that same unadulterated object, it caused the error expected Iterable, but did not find one for field "Subscription.mySubName"
When I commented out that resolve function all together, the subscription worked, which is further evidence that my payload was well structured, with the right key pointing to an iterable.
I must be mis-using the resolve field. From https://www.apollographql.com/docs/graphql-subscriptions/subscriptions-to-schema/
When using subscribe field, it's also possible to manipulate the event
payload before running it through the GraphQL execution engine.
Add resolve method near your subscribe and change the payload as you wish
so I am not sure how to properly use that function (specifically, what shape of object to return from it), but using it as above breaks the subscription in the same manner you describe in your question.
I was already using graphql-tools 4.0.0, I upgraded to 4.0.8 but it made no difference.
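Putting the two answers together, the behavior makes sense: by default GraphQL resolves a subscription field as payload[fieldName], but once a custom resolve is supplied, its return value is used as the field value itself. So with a payload shaped like { mySubName: [...] }, the resolve has to unwrap it rather than return the whole payload object. A sketch of that fix (makeSubscriptionResolvers is an invented wrapper so pubSub can be injected; the event name comes from the answer above):

```javascript
// Sketch: a custom `resolve` must return the field's value (the iterable),
// not the whole published payload. `makeSubscriptionResolvers` is a
// hypothetical wrapper so the pubSub instance can be passed in.
function makeSubscriptionResolvers(pubSub) {
  return {
    Subscription: {
      mySubName: {
        subscribe: () => pubSub.asyncIterator(['requestsIncomplete']),
        // Default behavior is roughly payload => payload.mySubName; a custom
        // resolve overrides that, so it must do the unwrapping itself:
        resolve: payload => payload.mySubName,
      },
    },
  };
}
```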
