
How to stop a process in Express when the request times out?
Install 'connect-timeout':

import timeout from 'connect-timeout';

// halt_on_timeout.js
module.exports = function haltOnTimedout(req, res, next) {
  if (!req.timedout) next();
};
The expensive process takes longer than 30 seconds.
The route times out, but expensive_long_operation never stops running.
route.post(
  'upload_alot_of_content/',
  timeout('30s'),
  haltOnTimedout,
  async (req, res, next) => {
    const result = await expensive_long_operation();
    if (req.timedout) {
      next('error!');
    }
    res.json({ /* .... */ });
  }
);

There is no way to arbitrarily stop an asynchronous operation in Node.js, but there are a couple of options you can implement to achieve what you want.
You could run expensive_long_operation in a forked Node.js process and kill that process on timeout, as sketched below.
You could write expensive_long_operation as an asynchronous C++ module and build in a mechanism that lets you break out of the operation.
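Here is a minimal sketch of the fork-and-kill approach, assuming expensive_long_operation lives in a hypothetical worker.js that sends its result back with process.send():

const { fork } = require('child_process');

route.post('upload_alot_of_content/', timeout('30s'), haltOnTimedout, (req, res, next) => {
  // worker.js is a hypothetical script that runs expensive_long_operation
  // and calls process.send(result) when it finishes.
  const child = fork('./worker.js');

  // Kill the child if it outlives the route timeout.
  const killer = setTimeout(() => child.kill('SIGTERM'), 30 * 1000);

  child.on('message', (result) => {
    clearTimeout(killer);
    if (!req.timedout) res.json(result);
  });

  child.on('exit', (code, signal) => {
    clearTimeout(killer);
    if (signal === 'SIGTERM' && !res.headersSent) {
      next(new Error('expensive_long_operation timed out and was killed'));
    }
  });
});

The same idea works with worker_threads, where worker.terminate() plays the role of child.kill().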

Related

Forked process in expressjs causes server to restart

I have an Express server and it uses a fork inside a route to make sure that the main loop isn't blocked (the work is somewhat compute-intensive). Every time this route is called, the server restarts. I've debugged this problem and found that the forking is what causes this behaviour, but I don't understand why. The route is defined as follows:
const { fork } = require('child_process');

module.exports = async function someComputingIntensiveFunction(req, res) {
  try {
    // some stuff
    const childProcess = fork('../path/to/file.js');
    childProcess.on('message', (data) => {
      res.status(201).json(data).end();
    });
  } catch (error) {
    res.status(500).end();
  }
};
Inside this file is
process.on('message', (data) => {
  // do some stuff with data
  // based on whatever the result is
  process.send(result);
  process.exit(result.status);
});
Am I forgetting a crucial part of forking that causes the Express server to restart? Thanks in advance for any help.

Why do we need async callbacks in Node.js, since the Event Loop offers a Worker Pool to handle expensive tasks?

I was studying how Node.js improves performance for multiple concurrent requests. After reading a couple of blogs I found out that:
When any request comes in, an event is triggered and the corresponding callback function is placed in the Event Queue.
An Event Loop (main thread) is responsible for handling all requests in the Event Queue. The Event Loop processes a request and sends back the response if the request uses non-blocking I/O.
If a request contains blocking I/O, the Event Loop internally assigns it to an idle worker from the Worker Pool, and when the worker sends back the result the Event Loop sends the response.
My question is: since the Event Loop passes heavy blocking work internally to the Worker Pool via libuv, why do we need asynchronous callbacks?
For better understanding, please see the code below:
const express = require('express')
const app = express()
const port = 3000

function readUserSync(milliseconds) {
  var currentTime = new Date().getTime();
  while (currentTime + milliseconds >= new Date().getTime()) {
  }
  return "User"
}

async function readUserAsync(milliseconds) {
  var currentTime = new Date().getTime();
  while (currentTime + milliseconds >= new Date().getTime()) {
  }
  return "User"
}

app.get('/sync', (req, res) => {
  const user = readUserSync(80)
  res.send(user)
})

app.get('/async', async (req, res) => {
  const user = await readUserAsync(80)
  res.send(user)
})

app.listen(port, () => {
  console.log(`Example app listening at http://localhost:${port}`)
})
I checked the performance of both endpoints using the Apache Benchmark (ab) tool, assuming each I/O operation takes 80 ms.
ab -c 10 -t 5 "http://127.0.0.1:3000/async/"
ab -c 10 -t 5 "http://127.0.0.1:3000/sync/"
And surprisingly, the endpoint with the async handler had a higher number of requests per second.
So how do the Event Loop, the Worker Pool, and async/await work internally to handle more concurrent requests?
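For context on the distinction the question is drawing: a busy-wait loop like readUserAsync blocks the event loop even inside an async function, whereas work that libuv actually dispatches to the Worker Pool (for example crypto.pbkdf2) does not. A minimal sketch of a handler that genuinely offloads to the pool, assuming the same Express app as above and a hypothetical /pooled route:

const crypto = require('crypto');

app.get('/pooled', (req, res) => {
  // crypto.pbkdf2 runs on the libuv thread pool, so the event loop stays
  // free to accept other requests while the hashing is in flight.
  crypto.pbkdf2('password', 'salt', 100000, 64, 'sha512', (err, derivedKey) => {
    if (err) return res.status(500).end();
    res.send('User');
  });
});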

How do you programmatically exit a node.js/express route on an event?

I want to be able to exit execution of a post route when an event is sent from the client-side. I'm using socket.io but I'm not sure it can do what I want. I am using the uploads route to process a file, but if the user deletes the file, I want the app.post execution to end, similar to either a res.end() or return statement.
My front-end app receives a file from the user and immediately sends it to the post route for processing. If the user deletes the file and uploads a new one, the previous post route is still going. I want to make sure the previous one is terminated, cancelled, etc.
I'm currently using socket.io to communicate front-end to back-end.
How can I achieve this?
app.post('/uploads', async (req, res) => {
  // async func1
  // async func2
  // if we receive an event from the front end while processing here, how can I exit the post route?
  // async func3
});
You can add a UUID to each request you make and return it to the front-end. The request is resolved immediately with a 202 ACCEPTED status code, meaning it was accepted and is being handled, while the actual work continues in the background.
Now you can implement a resourceManagerService that allows APIs (HTTP or WS) to change the state of a resource (like canceling it).
app.post('/uploads', async (req, res) => {
  const resourceUuid = resourceManagerService.createResource();
  res.status(202); // ACCEPTED
  res.send({ uuid: resourceUuid });

  // start business logic
  await function1();
  if (resourceManagerService.isCanceled(resourceUuid)) {
    // cleanup
    return; // stop request handling
  }

  await function2();
  if (resourceManagerService.isCanceled(resourceUuid)) {
    // cleanup
    return; // stop request handling
  }

  await function3();
  if (resourceManagerService.isCanceled(resourceUuid)) {
    // cleanup
    return; // stop request handling
  }
});

app.delete('/uploads/:resourceUuid', async (req, res) => {
  resourceManagerService.cancel(req.params.resourceUuid);
  res.end(); // handle response
});
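The answer doesn't spell out resourceManagerService; a minimal in-memory sketch (hypothetical, with no persistence or expiry) could look like this:

// resource-manager.js -- hypothetical in-memory implementation
const { randomUUID } = require('crypto'); // Node 14.17+

// Tracks cancellation state per resource UUID.
const resources = new Map();

const resourceManagerService = {
  createResource() {
    const uuid = randomUUID();
    resources.set(uuid, { canceled: false });
    return uuid;
  },
  cancel(uuid) {
    const entry = resources.get(uuid);
    if (entry) entry.canceled = true;
  },
  isCanceled(uuid) {
    const entry = resources.get(uuid);
    return entry ? entry.canceled : false;
  },
};

module.exports = resourceManagerService;

Because it lives in process memory, this only works for a single server instance; a shared store such as Redis would be needed if the app runs on multiple instances.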
I guess that you are using Express. Take a look at express-async-handler.
You can invoke it like this:
const asyncHandler = require('express-async-handler')

app.post('/upload', asyncHandler(async (req, res) => {
  await firstfunc()
  await secondfunc()
}))
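For context, express-async-handler is essentially a thin wrapper that forwards a rejected promise to next(); a simplified sketch of the idea (not the library's exact source):

// Any rejection from the async handler is passed to Express's error handling.
const wrapAsync = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);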

How to catch async errors in Node.js/Express outside the request context?

I have a simple express server where a POST request kicks off a long-running job and returns a job ID. The server monitors the job status and the client can query for the job status over time.
I do this with child_process.spawn, and have callbacks for the usual events on the child_process.
Sometimes an exception will happen during the job's execution, long after the initial "start job" request has returned. That calls my error callback, but then what? I can't throw an ApiError there, because Express won't handle it; I'll get an UnhandledPromiseRejectionWarning (which in a future Node.js version will terminate the process).
Is there any way to set up a "global error handler" for express that would put a try/catch around the whole server?
A simple example would be something like this:
app.post('/testing', (req, res) => {
  setTimeout(() => { throw new ApiError('oops!') }, 1000)
})
Straight from the Express docs:
"You must catch errors that occur in asynchronous code invoked by route handlers or middleware and pass them to Express for processing. For example:"
app.get('/testing', function (req, res, next) {
  setTimeout(function () {
    try {
      throw new Error('BROKEN')
    } catch (err) {
      next(err)
    }
  }, 100)
})
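Anything passed to next(err) ends up in Express's error-handling middleware, which you can register once after all routes as a de facto global handler; a minimal sketch, assuming ApiError carries a statusCode:

// Error-handling middleware: Express recognizes the 4-argument signature.
// Register it after all routes and other middleware.
app.use((err, req, res, next) => {
  const status = err.statusCode || 500; // assumes ApiError exposes statusCode
  res.status(status).json({ error: err.message });
});

Note that this only covers errors routed through next(err); a rejection that happens entirely outside any request handler (for example in a child_process callback) still needs its own catch, since there is no req/res for Express to attach it to.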

Express: hold a request until the last request is finished

So I'm writing an application in Node.js + Express in which I want to achieve the following goal.
The user POSTs many requests at nearly the same time (for example with curl ... &, where the & makes it run in the background).
Process each request one at a time and hold the other requests until the current one is finished. The order is determined by request arrival time; if two arrive at the same time, pick one randomly. So if I POST 5 requests to add things to the database at nearly the same time, the first request should be added to the database first, while the other requests are held (not responded to yet) until the first one has been processed and answered with a 200 code, and then the second request is processed, and so on.
Is it possible to achieve this with Express, so that when I send a couple of requests at one time, no issue occurs like something not being added to MongoDB properly?
You can set up middleware before and after your routes to queue up and dequeue requests if one is in progress. As people have mentioned, this is not really best practice, but it is a way to do it within a single process (it will not work for serverless models).
const queue = [];
let inprogress = null;

app.use((req, res, next) => {
  if (inprogress) {
    queue.push({ req, res, next });
  } else {
    inprogress = res;
    next();
  }
});

app.get('/your-route', (req, res, next) => {
  // run your code
  res.json({ some: 'payload' });
  next();
});

app.use((req, res, next) => {
  inprogress = null;
  if (queue.length > 0) {
    const queued = queue.shift();
    inprogress = queued.res;
    queued.next();
  }
  next();
});
