Is there a way to implement custom logic right after the GraphQL query has been parsed, but before any of the resolvers have executed?
Given this query schema
type Query {
products(...): ProductConnection!
productByHandle(handle: String!): Product
}
How can I accomplish the task of logging the info object for the products and productByHandle queries, before their resolvers have had a chance to execute?
I'm basically looking to "hook up" to an imaginary event like query:parsed, but it doesn't appear to exist. I'm using the express-graphql package.
Props to @xadm for figuring this out.
The express-graphql package accepts a custom `execute` function, which is the function that gets called after the query has been parsed. Its return value is what gets returned from the /graphql endpoint.
import { graphqlHTTP } from 'express-graphql'
import { execute } from 'graphql'

app.use('/graphql', graphqlHTTP((req, res) => {
  return {
    // ...your other options (schema, etc.) go here
    async customExecuteFn(args) {
      // `args` are the graphql-js ExecutionArgs: they include the parsed
      // document (the query AST), the schema, and the variable values —
      // no resolver has run yet at this point
      const result = await execute(args)
      // `result` has the shape { data: {...}, errors: [...] }
      return result
    }
  }
}))
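Building on that, if you specifically want to log something for the products and productByHandle queries before their resolvers run, you can inspect the parsed document first. A minimal sketch (the AST shape comes from graphql-js; it assumes the top-level selections are plain fields, not fragments):

async customExecuteFn(args) {
  // Walk the parsed document and collect the top-level field names
  for (const definition of args.document.definitions) {
    if (definition.kind !== 'OperationDefinition') continue
    for (const selection of definition.selectionSet.selections) {
      const fieldName = selection.name && selection.name.value // e.g. 'products'
      if (fieldName === 'products' || fieldName === 'productByHandle') {
        console.log('about to execute:', fieldName)
      }
    }
  }
  return execute(args)
}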
I will still leave this here, as it might be useful for something more specific, but you should probably use the code above.
// This returns an object whose keys are the query names and whose values are the field definitions (name, resolve, etc.)
const queryFields = graphqlSchema.getQueryType().getFields()
// They can then be iterated, and the original `resolve` method can be monkey-patched
for (const queryName in queryFields) {
const queryInfo = queryFields[queryName]
// Grab a copy of the original method
const originalResolve = queryInfo.resolve
// Overwrite the original `resolve` method
queryInfo.resolve = function patchedResolve(src, args, context, info) {
// Your custom logic goes here
console.log(info);
// Call the original `resolve` method, preserving the context and
// passing in the arguments
return originalResolve.apply(this, arguments)
}
}
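If you only care about the two queries from the question, you can skip everything else at the top of the loop:

// At the top of the for...in loop above, before patching:
if (queryName !== 'products' && queryName !== 'productByHandle') continue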
I'm testing my component with Jest.
Inside it, I have a custom component that I stub:
function mountComponent(propsData, data) {
const wrapper = mount(Upload, {
props: propsData,
global: {
stubs: {
myCustomComponent: true,
},
plugins: [router],
},
data,
});
return wrapper;
}
My usage of the custom component is:
<my-custom-component
@upload="uploadMethod"
></my-custom-component>
I saw I can trigger the method uploadMethod by:
const upload = wrapper.find('component-stub');
upload.trigger('uploadMethod');
but my method uploadMethod has both parameters and a return value.
My question is: how can I set the parameters, and how can I get the return value?
my question is how can I set the parameters...
Exactly like calling uploadMethod(...args) directly, except you add the event name as the first parameter. Note that the event name is `upload` (from `@upload="uploadMethod"`), not the name of the handler:
const args = [param1, param2, param3];
const upload = wrapper.find('component-stub');
upload.trigger('upload', ...args);
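Be aware that in Vue Test Utils, trigger officially takes a single options object as its second argument, so spreading several positional arguments may not reach the handler the way a direct uploadMethod(...args) call would. An alternative is to emit the event straight from the stubbed component (a sketch, assuming Vue Test Utils v2's findComponent / vm.$emit):

const upload = wrapper.findComponent({ name: 'myCustomComponent' });
// Emits 'upload' with exactly the arguments uploadMethod should receive
await upload.vm.$emit('upload', param1, param2, param3);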
... and how can I get the return value?
You don't, because the subcomponent doesn't either. You expect() that whatever should have happened in the parent component when uploadMethod is called actually happened.
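For example (a sketch; `uploadResult` is a hypothetical data property that your uploadMethod is assumed to write to):

await upload.vm.$emit('upload', param1, param2);
// Assert the side effect instead of the return value
expect(wrapper.vm.uploadResult).toEqual(expectedResult); // hypothetical property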
I'm trying to migrate an existing function to use it inside an Apify actor.
Originally, the function loads a given URL, reads its JSON response, and according to some supplied parameters, extracts some data and returns an object with results.
If you ask, it's not scraping anything "final" at this point. Its results are temporary and will be used to create other URLs which will be scraped then (with another crawler) for actual, useful results.
The current function that executes the crawler is something like this:
let url = new URL('/content', someBaseURL);
url.searchParams.set('search', someKeyword);
const reqList = new apify.RequestList({
sources: [ { url: url.toString() } ]
});
await reqList.initialize();
const crawler = new apify.BasicCrawler({
requestList: reqList,
handleRequestFunction: reqHandler
});
// How do I set the inputs for reqHandler() here ?
await crawler.run();
// How do I get the output from reqHandler() here ?
And the reqHandler code is something like this:
async function reqHandler(options) {
const response = await apify.utils.requestAsBrowser({
url: options.request.url
});
// How do I read parameters from the caller here ?
let searchResults = JSON.parse(response.body);
// ... result object creation logic goes here ...
// How do I return a result to the caller here ?
}
I am pretty new to this Apify thing and lost in the documentation.
Thanks for your help.
handleRequestFunction doesn't take any external input or produce any outputs. Simply use it as a closure and capture inputs from the surrounding code, or wrap it in a different function.
Normally we do it like this:
const context = {}; // put your inputs here
const crawler = new apify.BasicCrawler({
requestList: reqList,
handleRequestFunction: async () => {
// use context here
// output data
await Apify.pushData(results);
}
});
EDIT: I forgot to mention a use case for how to pass input. You need to do it via the request.userData object when adding a request to a queue or a list.
// The same userData is available in request list.
await requestQueue.addRequest({
url: 'https://example.com',
userData: { myInput: 'any-data' }
});
// Then in handleRequestFunction
handleRequestFunction: async ({ request }) => {
    const { myInput } = request.userData;
    // ...
}
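Putting both together for the original question: inputs can be captured from the surrounding scope (or via userData), and outputs can be pushed into a closed-over variable and read after crawler.run() resolves. A sketch reusing the question's reqHandler logic:

const searchParams = { keyword: someKeyword }; // inputs captured by the closure
const allResults = []; // outputs collected by the closure

const crawler = new apify.BasicCrawler({
    requestList: reqList,
    handleRequestFunction: async ({ request }) => {
        const response = await apify.utils.requestAsBrowser({ url: request.url });
        const searchResults = JSON.parse(response.body);
        // ... result object creation logic goes here, using searchParams ...
        allResults.push(searchResults); // "returns" a result to the caller
    }
});

await crawler.run();
// allResults now holds one entry per handled request
console.log(allResults);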
Context: I'm not too experienced with TypeScript, as we don't use it at work; I'm just attempting to build a little portfolio piece for personal exposure.
So, to start with, this is my code:
import { Request, Response } from 'express';
import { Neighborhood as NeighborhoodType } from '../interfaces/neighborhood.interface';
import Neighborhood from '../models/neighborhood';
const fetchNeighborhoods = async (request: Request, response: Response): Promise<void> => {
try {
const neighborhoods: NeighborhoodType[] = await Neighborhood.paginate();
response.status(200).send(neighborhoods);
} catch (error) {
throw error;
}
};
I'm attempting to fetch the neighborhoods from the DB, and am receiving the error Type 'PaginateResult<Neighborhood>' is missing the following properties from type 'Neighborhood[]': length, pop, push, concat, and 26 more. on this line: const neighborhoods: NeighborhoodType[] = await Neighborhood.paginate();
If I remove the NeighborhoodType[] then the method will work fine. The neighborhood interface is literally an object with a string.
export interface Neighborhood extends Document {
name: string,
}
Is it an issue with MY code or is it an issue with one of the dependencies?
For anyone who encounters this issue:
The problem stems from trying to set the return type. Since Mongoose will always return one document, an array of documents, or an empty array (unless you use orFail()), the return type can be inferred, so there is no need to add NeighborhoodType[].
The PaginateResult type is not itself an array, if I'm not mistaken: it wraps the returned documents along with pagination metadata, so annotating the result as Neighborhood[] makes TypeScript expect all of the array methods, which PaginateResult does not have.
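If you still want the documents themselves typed, a minimal sketch (assuming the mongoose-paginate-v2 plugin, where the documents live on the result's docs property):

const result = await Neighborhood.paginate();
// The docs array is what matches NeighborhoodType[]
const neighborhoods: NeighborhoodType[] = result.docs;
response.status(200).send(result);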
I am trying to:
Poll a public API every 5 seconds
Store the resulting JSON in a variable
Store the next query to this same API in a second variable
Compare the first variable to the second
Print the second variable if it is different from the first
Else: Print the phrase: 'The objects are the same' if they haven't changed
Unfortunately, the comparison part appears to fail. I am realizing that this implementation is probably lacking the appropriate variable scoping but I can't put my finger on it. Any advice would be highly appreciated.
// Initial placeholder for the previous response (assumed shape; the start of this snippet was truncated)
let previousResponse = {
  data: {
    chatters: {
      viewers: {},
    },
  },
};
//prints out pretty JSON
function prettyJSON(obj) {
console.log(JSON.stringify(obj, null, 2));
}
// Gets Users from Twitch API endpoint via axios request
const getUsers = async () => {
try {
return await axios.get("http://tmi.twitch.tv/group/user/sixteenbitninja/chatters");
} catch (error) {
console.error(error);
}
};
// Intended to display the viewers and compare them with the previous response
const displayViewers = async (previousResponse) => {
const usersInChannel = await getUsers();
if (usersInChannel.data.chatters.viewers === previousResponse){
console.log("The objects are the same");
} else {
if (usersInChannel.data.chatters) {
prettyJSON(usersInChannel.data.chatters.viewers);
const previousResponse = usersInChannel.data.chatters.viewers;
console.log(previousResponse);
intervalFunction(previousResponse);
}
}
};
// polls display function every 5 seconds
const interval = setInterval(function () {
// Calls Display Function
displayViewers()
}, 5000);
The issue is that you are using the equality operator === on objects. Two objects are equal under === only if they are the same reference, while you want to know whether they are structurally identical. Check this:
console.log({} === {})
For your use case you might want to store a stringified version of previousResponse and compare it with the stringified version of the new object (usersInChannel.data.chatters.viewers), like:
console.log(JSON.stringify({}) === JSON.stringify({}))
Note: There can be issues with this approach too, if the order of properties in the response changes; in that case, you'd have to compare the individual properties within the response objects.
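Applied to the polling code above, a minimal sketch (same endpoint, with the snapshot kept in the enclosing scope instead of being re-declared inside the function):

let previousViewers = null; // stringified snapshot from the last poll

const displayViewers = async () => {
  const usersInChannel = await getUsers();
  if (!usersInChannel || !usersInChannel.data.chatters) return;

  const viewers = JSON.stringify(usersInChannel.data.chatters.viewers);
  if (viewers === previousViewers) {
    console.log('The objects are the same');
  } else {
    prettyJSON(usersInChannel.data.chatters.viewers);
    previousViewers = viewers; // remember for the next poll
  }
};

setInterval(displayViewers, 5000);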
Alternatively, you could use an npm package such as the following:
https://www.npmjs.com/package/@radarlabs/api-diff
I am using DynamoDB streams to do some work on records as they are added to or modified in my table. I am also using Dynamoose models in my application.
The DynamoDB stream event passes an event object to my Node.js lambda handler that includes the objects record.dynamodb.NewImage and record.dynamodb.OldImage. However, these objects are in DynamoDB's AttributeValue format, including all of the data types ('S' for string), rather than a normal JavaScript object. So record.id becomes record.id.S.
Dynamoose models allow you to instantiate a model from an object, like so: new Model(object). However, it expects that argument to be a normal object.
I know that Dynamoose has a DynamoDB parser, I think it's Schema.prototype.dynamodbparse(). However, that doesn't work as expected.
import { DYNAMODB_EVENT } from '../../constant';
import _get from 'lodash/get';
import { Entry } from '../../model/entry';
import { applyEntry } from './applyEntry';
async function entryStream(event) {
await Promise.all(
event.Records.map(async record => {
// If this record is being deleted, do nothing
if (DYNAMODB_EVENT.Remove === record.eventName) {
return;
}
// What I've tried:
let entry = new Entry(record.dynamodb.NewImage);
// What I wish I could do
entry = new Entry(Entry.schema.dynamodbparse(record.dynamodb.NewImage));
await applyEntry(entry);
})
);
}
export const handler = entryStream;
So is there a way to instantiate a Dynamoose model from DynamoDB's AttributeValue format? Has anyone else done this?
The alternative, is that I simply extract the key from the record, and then make a trip to the database using Model.get({ id: idFromEvent }); But I think that would be inefficient, since the record was just handed to me from the stream.
I solved it by using AWS.DynamoDB.Converter.unmarshall to parse the object before passing to Dynamoose.
import { DYNAMODB_EVENT } from '../../constant';
import _get from 'lodash/get';
import { Entry } from '../../model/entry';
import { applyEntry } from './applyEntry';
// https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/Converter.html#unmarshall-property
var AWS = require('aws-sdk');
var parseDynamo = AWS.DynamoDB.Converter.unmarshall;
async function entryStream(event) {
await Promise.all(
event.Records.map(async record => {
// If this record is being deleted, do nothing
if (DYNAMODB_EVENT.Remove === record.eventName) {
return;
}
const entry = new Entry(parseDynamo(record.dynamodb.NewImage));
await applyEntry(entry);
})
);
}
export const handler = entryStream;
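For reference, unmarshall simply converts an AttributeValue map into a plain JavaScript object, e.g.:

// Converts typed attributes to plain values: S -> string, N -> number, etc.
const plain = parseDynamo({ id: { S: 'abc123' }, count: { N: '5' } });
// => { id: 'abc123', count: 5 }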