Responding to a callback from a child message in node.js

I've run into a problem with node.js and can't figure out the correct way to handle this situation.
I have a worker process that handles all the data for a leaderboard. When a request comes in for the leaderboard, I forward it to the worker, and the worker sends back the response via child_process messaging.
My problem is how to efficiently get the response to the right callback. This is my first attempt, but it won't work, because I keep rebinding the 'message' event to a different callback.
Manager.setup_worker = function () {
    Manager.worker = require('child_process').fork("./workers/leaderboard");
}

Manager.process_request = function (request, callback) {
    Manager.worker.on("message", function (response) {
        callback(response);
    })
    Manager.worker.send(request);
}

Related

How to get returned data when working with a NestJS Bull queue?

Currently, I am working on a NestJS project with a Bull queue. In my controller, I have a GET handler that receives requests from the front end. Based on the request, I send a gRPC call to retrieve data from another microservice, and I would like that gRPC call to go through the Bull queue. So, in the GET handler, I hand the gRPC call to a producer, and it is executed in the consumer. However, after the gRPC call runs in the consumer, I cannot find a way to return the retrieved data to the original GET handler so that I can send the data back to the front end.
Any help would be appreciated.
You won't be able to; the main purpose of using queues is to avoid blocking incoming requests.
What you can do is return the Bull job id, so the front-end dev can track the response with it, or use an event-driven approach or a websocket so you can tell the client to refresh the response.
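The job-id pattern from this answer can be sketched as follows. This is a toy simulation: the "queue" is just a Map and the work runs synchronously, whereas with Bull you would add a job, return `job.id`, and back the status endpoint with `queue.getJob(id)`. All names here (`enqueue`, `handleGet`, `handleStatus`) are illustrative.

```javascript
const jobs = new Map();
let nextJobId = 1;

function enqueue(workFn) {
  const id = nextJobId++;
  jobs.set(id, { status: 'pending', result: undefined });
  // A real queue runs this later in a consumer; done inline for the demo.
  jobs.set(id, { status: 'completed', result: workFn() });
  return id;
}

// GET handler: enqueue the work and return only the id.
function handleGet() {
  return { jobId: enqueue(() => 'grpc-data') }; // stand-in for the gRPC call
}

// Status endpoint the front end polls until status is 'completed'.
function handleStatus(jobId) {
  return jobs.get(jobId);
}

const { jobId } = handleGet();
console.log(handleStatus(jobId)); // { status: 'completed', result: 'grpc-data' }
```

The point is that the GET request never waits on the gRPC work; the client exchanges the id for the result once the consumer has finished.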
You can, actually!
Here is an example:
import { Process, Processor } from '@nestjs/bull';
import { Job } from 'bull';

@Processor('myProcessor')
export class MyProcessor {
    @Process('myProcess')
    async handleMyProcess(job: Job<{ myInput: string }>) {
        await new Promise((resolve) => setTimeout(resolve, 5000));
        return 'hello world!';
    }
}
Then in your service :
const compressJob = await this.myQueue.add('myProcess', {
    myInput: 'foo',
});
const test = await compressJob.finished();
console.log(compressJob, test);

Acknowledge Response sent too late; NodeJS Express Asynchronous Response

I am using NodeJS with express.
I receive a request from an information publisher. I must send back an acknowledge response as soon as I receive the request, then process the request and push back the real response.
According to their logs, I sent the acknowledge response too late, and also after the push of the real response. This is wrong and not what I want. I want to follow the model in this picture:
So I have a router.js file with the following code:
app.post('/import/message.io', Importer.handleMessage);
In the importer controller the handleMessage function handles the request:
function handleMessage(req, res) {
    let otaRequestBuffer = [];
    req.on('readable', () => {
        let data;
        while (data = req.read()) {
            otaRequestBuffer.push(data.toString());
        }
    });
    req.on('end', () => {
        let otaRequest = otaRequestBuffer.join('');
        try {
            let parser = new DOMParser();
            let xmlDoc = parser.parseFromString(otaRequest, "text/xml");
            let soapHeader = parseSoapHeader(xmlDoc);
            const soapAction = req.headers['soapaction'].toString();
            // Acknowledge the request; the soapHeader is used here too, which is why this comes after the parsing above
            res.writeHead(200);
            res.write(
                `<?xml version='1.0' encoding='utf-8'?> ...
                `
            );
            res.end();
            // Kick off a new job by adding it to the work queue; the response is sent in this job's consumer.js
            let job = jobs.create('worker-job', args);
            job.on('complete', function (result) {
                console.log('Job completed');
                // Other job.on code would be here
            });
        } catch (error) {
            console.log(error);
        } finally {
            console.log('DevInfo: Finally called here');
        }
    });
}
Other insights:
So the acknowledge response is in the right format.
When I test locally I receive the acknowledge immediately
The logs of the information publisher are here
According to the logs, it seems like the actual response (Line 5) is pushed to them before the acknowledge response (Line 1).
Is this possible with the code example I pasted above?
Thanks for the help.
It looks to be entirely possible that the real response (push) is sent before the acknowledgement is received by the client. Since response.end() is asynchronous (i.e. the call will return before the response is actually sent), we're effectively racing
res.end();
with
let job = jobs.create('worker-job', args);
This is the kind of annoying thing one tends to see only once one goes to production!
Perhaps you could try queuing your job once the end() callback is called. e.g.
res.end(() => {
    console.log("End callback, response stream is complete, starting job...");
    // Kick off a new job by adding it to the work queue; the response is sent in this job's consumer.js
    let job = jobs.create('worker-job', args);
    job.on('complete', function (result) {
        console.log('Job completed');
        // Other job.on code would be here
    });
});
Since (from the Node.js docs)...
If callback is specified, it will be called when the response stream is finished.
https://nodejs.org/api/http.html#http_response_end_data_encoding_callback
(This is the method that Express wraps with response.end)
http://expressjs.com/en/5x/api.html#res.end
I'd also suggest logging detailed timestamps down to the millisecond on your server, so you can see exactly when calls are happening.
The setTimeout approach is certainly worth a try; it might get the results you wish:
// Set delay as appropriate
const delayMs = 2000;
setTimeout(() => {
    console.log("Timeout complete, starting job...");
    // Kick off a new job by adding it to the work queue; the response is sent in this job's consumer.js
    let job = jobs.create('worker-job', args);
    job.on('complete', function (result) {
        console.log('Job completed');
        // Other job.on code would be here
    });
}, delayMs);

IORedis (or node_redis) callback not firing after calling custom Redis commands/modules

When using a Redis client (ioredis or node_redis) inside a websocket's message event in a Node.js app, the callback for any command is not fired immediately (the operation does take place on the Redis server, though).
What is strange is that the callback for the first command fires after I send a second message, and the callback for the second fires after I send a third.
wss.on('connection', (socket, request) => {
    socket.on('message', (data) => {
        console.log("will send test command")
        this.pubClient.hset("test10", "f1", "v1", (err, value) => {
            // callback not firing first time
            console.log("test command reply received")
        })
    })
})
The Redis command works as expected in other parts of the app, though, and even when placed directly inside the connection handler, like below.
wss.on('connection', (socket, request) => {
    console.log("will send test command")
    this.pubClient.hset("test10", "f1", "v1", (err, value) => {
        // callback fires
        console.log("test command reply received")
    })
    socket.on('message', (data) => {})
})
UPDATE:
I had this all wrong. The reason for the weird callback behavior was that one of my custom Redis modules did not return a reply. This seems to have caused every subsequent callback to fire with a one-step delay.

Node.js child process fork return response -- Cannot set headers after they are sent to the client

situation:
I have a function that does an expensive operation, such as fetching a large query from MongoDB and then performing a lot of parsing and analysis on the response. I have offloaded this expensive operation to a child process fork, and I wait for the worker to finish before sending the response, in order not to block the main event loop.
current implementation:
I have an API endpoint GET {{backend}}/api/missionHistory/flightSummary?days=90&token={{token}}
api entry point code:
missionReports.js
const cp = require('child_process');
// if reportChild is initialized here: "Can't send headers after they were sent"
const reportChild = cp.fork('workers/reportWorker.js');

exports.flightSummary = function (req, res) {
    let search_query = req.query;
    // if initialized here, there is no error.
    const reportChild = cp.fork('workers/reportWorker.js');
    logger.debug(search_query);
    let payload = {'task': 'flight_summary', 'search_params': search_query};
    reportChild.send(payload);
    reportChild.on('message', function (msg) {
        logger.info(msg);
        if (msg.worker_status === 'completed') {
            return res.json(msg.data);
        }
    });
};
worker code:
reportWorker.js
process.on('message', function (msg) {
    process.send({'worker_status': 'started'});
    console.log(msg);
    switch (msg.task) {
        case 'flight_summary':
            findFlightHours(msg.search_params, function (response) {
                logger.info('completed');
                process.send({'worker_status': 'completed', 'data': response});
            });
            break;
    }
});
scenario 1: reportChild (fork) is initialized at the beginning of the module definitions. The API call works once and returns correct data. On the second call, it crashes with "Cannot set headers after they've been sent". I stepped through the code, and it definitely only sends the response once per API call.
scenario 2: if I initialize reportChild inside the API definition, it works perfectly every time. Why is that? Is the forked child process not killed unless it's redefined? Is this the standard implementation of child processes?
This is my first attempt at threading in node.js; I am trying to move expensive operations off of the main event loop into different workers. Let me know what the best practice is for this situation. Thanks.

Triggering the fulfillment webhook asynchronously from an intent?

I have some intents that need to trigger the fulfillment webhook but don't care about the response. The webhook takes longer than the timeout to respond, so I'd like the intent to simply reply with "Thanks for chatting" and close the conversation while still triggering the webhook.
Feels easy, but I'm missing something. I'm also new to the Dialogflow stuff.
I can do this in any language, but here's an example in Javascript:
fdk.handle(function (input) {
    // Some code here that takes 20 seconds.
    return {'fulfillmentText': 'i can respond but I will never make it here.'}
});
EDIT 1 - Trying async
When I use an async function, the POST request never happens. So in the following code:
fdk.handle(function (input) {
    callFlow(input);
    return { 'fulfillmentText': 'here is the response from the webhook!!' }
});

async function callFlow(input) {
    console.log("input is --> " + input)
    var url = "some_url"
    console.log("Requesting " + url)
    request(url, { json: true, headers: {'Access-Control-Allow-Origin': '*'} }, (err, res, body) => {
        if (err) { return console.log(err); }
        console.log("body is...")
        console.log(body)
    });
}
I see the two console.log outputs in the logs, but nothing from the request. And the request doesn't seem to happen at all, because I don't see it arrive at my endpoint either.
SOLUTION
Thanks Prisoner for the tip. It seems I needed to return the fulfillment JSON back through the callFlow() and handle() functions. Now Google Home doesn't time out, and both the HTTP call and the response are generated.
const fdk = require('@fnproject/fdk');
const request = require('request');

fdk.handle(function (input) {
    return callFlow(input);
});

async function callFlow(input) {
    var searchPhrase = input || "cats"
    var url = "some url"
    return new Promise((resolve, reject) => {
        request.post(url, {
            headers: { 'content-type': 'application/x-www-form-urlencoded' },
            body: searchPhrase
        },
        function (err, resp, body) {
            if (err) { return console.log(err) }
            r = { 'fulfillmentText': `OK I've triggered the flow function with search term ${searchPhrase}` }
            resolve(r)
        });
    });
}
You cannot trigger the fulfillment asynchronously. In a conversational model, it is expected that the fulfillment will perform some logic that determines the response.
You can, however, perform an asynchronous operation in the fulfillment that does not complete before you return the result.
If you are using a sufficiently modern version of node (version 8 and up), you can do this by declaring a function as an async function, but not calling it with the await keyword. (If you did call it with await, it would wait for the asynchronous operation to complete before continuing.)
So something like this should work, given your example:
async function doSomethingLong() {
    // This takes 20 seconds
}

fdk.handle(function (input) {
    doSomethingLong();
    return {'fulfillmentText': 'This might respond before doSomethingLong finishes.'}
});
Update 1 based on your code example.
It seems odd that you report the call to request doesn't appear to happen at all, but there are a few things about the code that may be causing it.
First, request itself isn't an async function. It uses a callback model, and async functions don't automatically wait for those callbacks to be called. So your callFlow() function calls console.log() a couple of times, calls request(), and returns before the callbacks are invoked.
You probably should replace request with something like the request-promise-native package and await the Promise that you get from the call. This makes callFlow() truly asynchronous (and you can log when it finishes the call).
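The "wrap it in a Promise and await it" advice can be sketched without any real HTTP traffic. In the sketch below, fakeRequest stands in for the callback-style request() (or for the Promise that request-promise-native would give you); the URL and all names are made up for illustration.

```javascript
// Stand-in for request(url, cb): invokes the callback asynchronously,
// with no network involved.
function fakeRequest(url, cb) {
  setImmediate(() => cb(null, { statusCode: 200 }, `body for ${url}`));
}

// Promisify the callback-style call so it can be awaited.
const requestAsync = (url) =>
  new Promise((resolve, reject) =>
    fakeRequest(url, (err, res, body) => (err ? reject(err) : resolve(body))));

// Now callFlow only resolves after the "HTTP" work actually finishes,
// so the handler can return its result (or await it) safely.
async function callFlow(input) {
  const body = await requestAsync(`https://example.invalid/${input}`);
  return { fulfillmentText: `got: ${body}` };
}

callFlow('cats').then((r) => console.log(r.fulfillmentText));
```

This is the same shape as the SOLUTION code above: the Promise makes the completion of the callback observable, which is what await (and the fdk handler) needs.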
Second, I'd point out that the code you showed doesn't do a POST operation. It does a GET by default. If you, or the API you're calling, expect a POST, that may be the source of the error. However, I would have expected the err parameter to be populated, and your code does look like it checks for, and logs, this.
The one unknown in the whole setup, for me, is that I don't know how fdk handles async functions, and my cursory reading of the documentation hasn't educated me. I've done this with other frameworks, and this isn't a problem, but I don't know if the fdk handler times out or does other things to kill a call once it sends a reply.
