I have a Node.js app that uses Express.
For a specific GET request I would like to set a timeout, and if that timeout is reached, I want to end the request completely and redirect to a timeout page.
I tried the following in my route.js file:
app.get('/exec', isLoggedIn, function(req, res) {
    var customTimeout = 10000;

    req.connection.setTimeout(customTimeout, function() {
        console.log("TIMED!");
        res.render('timeout.ejs', {
            user: req.user,
        });
    });

    // execution of the GET request
    res.render('success.ejs', {
        user: req.user,
    });
});
After 10 seconds, I can see the "TIMED!" message in the logs, but I'm not redirected to the timeout page, and the request is still running in the background...
Can someone help me deal with this?
This works for me:
app.get('/exec', isLoggedIn, function(req, res) {
    var customTimeout = 10000;

    res.setTimeout(customTimeout, function() {
        console.log("TIMED!");
        res.render('timeout.ejs', { user: req.user });
    });

    res.render('success.ejs', { user: req.user });
});
I'm assuming that instead of the last res.render(), you're executing some sort of operation that may take a lot of time to finish (and after 10 seconds you want to notify the user that the operation timed out).
If that operation isn't cancellable somehow, eventually it will finish and also try to send back a response, in which case you can run into errors (most likely "Can't set headers after they are sent").
So before sending a response, you need to check whether one has already been sent by the timeout handler. You can use res.headersSent for that.
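For example, here is a minimal sketch, assuming the real work happens in some long-running operation (represented below by a hypothetical doLongOperation callback):

app.get('/exec', isLoggedIn, function(req, res) {
    var customTimeout = 10000;

    res.setTimeout(customTimeout, function() {
        console.log("TIMED!");
        res.render('timeout.ejs', { user: req.user });
    });

    // hypothetical long-running operation
    doLongOperation(function() {
        // only respond if the timeout handler hasn't already sent a response
        if (!res.headersSent) {
            res.render('success.ejs', { user: req.user });
        }
    });
});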
I have created a to-do list app using Node, Express and Mongoose:
To delete a task, the user hits the cross button on the right hand side. This sends a POST request with the task ID to the /delete_task endpoint. The router for this endpoint is /routes/delete_task.js:
var express = require('express');
const Task = require('../models/task');
var router = express.Router();

express.json();

router.post('/', async (req, res, next) => {
    const deleted_task = await Task.findByIdAndDelete(req.body.taskID);
    console.log('Deleted task: \n', deleted_task);
    res.redirect('..');
});

module.exports = router;
The router performs a findByIdAndDelete, and then redirects to the home directory. The router for the home directory renders a view of all the existing tasks in the collection, and looks like:
var express = require('express');
const Task = require('../models/task');
var router = express.Router();

/* GET home page. */
router.get('/', function(req, res, next) {
    Task.find({}, function (err, result) {
        if (err) { console.error(err); }
        if (result) {
            return res.render('index', { title: 'To-do list', tasks: result });
        }
    });
});

module.exports = router;
My problem is that when deleting a task, the findByIdAndDelete successfully deletes the task, but this is not reflected in the redirected home page. The deleted task only disappears once I refresh the page. This suggests that it's some kind of async issue, and that the redirect is happening before the findByIdAndDelete query has finished executing.
To address this, I have made the router.post() callback an async function and am using await on the findByIdAndDelete, and I have also tried placing the res.redirect('..') in a callback function of the findByIdAndDelete, which also does not fix the problem:
router.post('/', (req, res, next) => {
    Task.findByIdAndDelete(req.body.taskID, (err, result) => {
        if (err) {
            console.error(err);
        }
        if (result) {
            console.log(result);
        }
        res.redirect('..');
    });
});
I have looked for other questions on stackoverflow, all of which seem to suggest that this is an async issue caused by the redirect happening before the query has finished executing. The suggested solutions I have found were to make the router.post(...) callback an async function and await the result of the Mongoose query, or to place the res.redirect('..') in the callback of the findByIdAndDelete so that the redirect happens after the query has finished executing. I have tried both of these but the problem remained.
The only other thing I can think of is that I am trying to redirect from within a POST request, and I don't know if this is legit. It seems to work fine looking at the log (see last 2 lines where the GET request to / follows the POST request to /delete_task):
New task submitted: cake
New task created successfully: cake
POST /new_task 302 29.747 ms - 46
GET / 200 4.641 ms - 1701
GET /stylesheets/style.css 304 0.849 ms - -
GET /javascripts/delete_task.js 304 0.479 ms - -
Deleted task:
{
_id: new ObjectId("636a993ca0b8e1f2cc79232a"),
content: 'cake',
completed: false,
__v: 0
}
POST /delete_task 302 10.358 ms - 24
GET / 200 3.867 ms - 1348
This is where I've hit a brick wall and I can't see what might be causing the issue. Really appreciate any help or suggestions anyone might have - cheers.
I don't think this is an asynchronicity problem, because you properly wait before responding to the POST request.
But the res.redirect makes sense only if hitting the cross button navigates from the To-do list page to the /delete_task page and from there back, by virtue of the redirection. This would be possible only with an HTML <form> element that is submitted upon hitting the button.
Is that how you have implemented it? You say that you "send a POST request", but is this through a <form>, or rather through an axios.post or a similar Javascript method? In the latter case, the following would happen:
The Javascript client sends the POST request and the deletion is carried out on the database.
The Javascript client receives a redirection response and sends the GET request.
The Javascript client receives the HTML page for the to-do list as response, but does nothing with it.
In other words: the To-do list page would not be reloaded by this axios.post request. If you want this to happen, don't respond to the POST request with a redirection, but simply with 200 OK, and have the Javascript client execute location.reload() when it receives this response.
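For example, here is a minimal client-side sketch with fetch (axios.post would work the same way), where taskID is whatever id your button handler has available, and the server answers the POST with a plain 200 instead of a redirect:

// client-side: POST the task id, then reload the page once the server confirms the deletion
fetch('/delete_task', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ taskID: taskID })
}).then(function (response) {
    if (response.ok) {
        location.reload(); // re-fetches the to-do list from the server
    }
});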
I have an API POST route where I receive data from a client and upload the data to another service. This upload is done inside the POST request (async) and takes a while. The client wants to know that their POST request was received before the async createProject function has finished. How can I send them a message without ending the POST? (res.send stops the request, res.write doesn't send anything out)
I thought about making an HTTP request back to their server as soon as this POST route is hit...
app.post('/v0/projects', function postProjects(req, res, next) {
    console.log('POST notice to me');
    // *** HERE, I want to send client message

    // This is the async function
    createProject(req.body, function (projectResponse) {
        projectResponse.on('data', function (data) {
            parseString(data.toString('ascii'), function (err, result) {
                res.message = result;
            });
        });

        projectResponse.on('end', function () {
            if (res.message.error) {
                console.log('MY ERROR: ' + JSON.stringify(res.message.error));
                next(new Error(res));
            } else {
                // *** HERE is where they finally receive a message
                res.status(200).send(res.message);
            }
        });

        projectResponse.on('error', function (err) {
            res.status(500).send(err.message);
        });
    });
});
The internal system requires that this createProject function is called inside the POST request (the project needs to exist and have something uploaded, or else it doesn't exist); otherwise I'd call it later.
Thank you!
I don't think you can send a first response saying the POST request was received and then send another when the internal job (createProject) has finished, whether it succeeds or fails.
But possibly, you can try:
createProject(payload, callback); // I am async and will let you know when done! It will push payload.jobId into jobsDone
Possibility 1, if the actual job response is not required:
app.post('/v0/projects', function (req, res, next) {
    // call any async job(s) here
    createProject(req.body);
    res.send('Hey Client! I have received post request, stay tuned!');
    next();
});
Possibility 2, if the actual job response is required, try maintaining a queue:
var q = []; // pending jobs; try possibility 3 if this is not making sense
var jobsDone = []; // this will be updated by the `createProject` callback

app.post('/v0/projects', function (req, res, next) {
    // call the async job and push it to the queue
    let jobId = randomId(); // generates a random but unique id for each request received
    q.push({ jobId: jobId });
    req.body.jobId = jobId;
    createProject(req.body);
    res.send('Hey Client! I have received post request, stay tuned!');
    next();
});
// hit this api after some time to know whether the job is done or not
app.get('/v0/status/:jobId', function (req, res, next) {
    // check if the job is done
    // based on that, remove it from q, retry, or do whatever is needed
    let result = jobsDone.indexOf(req.params.jobId) > -1 ? 'Done' : 'Still Processing';
    res.send(result);
    next();
});
Possibility 3: Redis can be used instead of the in-memory queue in possibility 2.
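For example, a rough sketch with the node redis client (v4 API), storing one key per job instead of the jobsDone array; the job:<id> key naming is just illustrative:

const { createClient } = require('redis');
const redisClient = createClient();
redisClient.connect(); // returns a promise; connect once at startup

// inside createProject's completion callback, mark the job as done:
// redisClient.set('job:' + jobId, 'Done');

app.get('/v0/status/:jobId', async function (req, res, next) {
    const status = await redisClient.get('job:' + req.params.jobId);
    res.send(status || 'Still Processing');
    next();
});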
P.S. There are other options available as well to achieve the desired result, but the above are possible ones.
I am trying to build a Facebook Messenger bot using Node.js. I got the bot developed with the core features. While testing a negative scenario where the user sends a GIF or sticker, the bot has to respond "I couldn't get you. Sorry". It does send that message, but it hangs and keeps sending that message every few minutes thereafter. I noticed that the ngrok server threw a 500 HTTP Internal Server Error for the POST request. On further debugging, I was able to find out that res.send(200) is not getting executed properly.
The console.log statement that I have after res.send(200) never gets printed. Not sure what I may be missing. Any help is appreciated. I tried restarting the server and resubscribing the app with a new ngrok https link. The same message continues to get printed out :(.
Here is my code.
server.post('/', (req, res, next) => {
    // Extract the body of the POST request
    let data = req.body;
    let incomingMessage = '';

    if (data.object === 'page') {
        data.entry.forEach(pageObj => {
            // Iterate through the messaging Array
            pageObj.messaging.forEach(msgEvent => {
                incomingMessage = {
                    sender: msgEvent.sender.id,
                    timeOfMessage: msgEvent.timestamp,
                    message: msgEvent.message
                }
            });
        });
    }

    const {
        message,
        sender
    } = incomingMessage

    if (message.text) {
        f.txt(sender, 'Hi there!!!');
    } else {
        f.txt(sender, `I couldn't get you. Sorry :(`);
        //res.send(200);
    }

    res.send(200);
    console.log('We are at the end of post');
    return next();
});
Maybe this answer doesn't resolve your problem, but it can be helpful.
If you want to send a 200 HTTP code use this instead:
res.sendStatus(200); // equivalent to res.status(200).send('OK')
On the other hand, if this is not a middleware, you can remove the return next(); line.
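In context, the end of the handler could then look roughly like this (a sketch; whether you keep return next() depends on your middleware setup):

if (message.text) {
    f.txt(sender, 'Hi there!!!');
} else {
    f.txt(sender, `I couldn't get you. Sorry :(`);
}

res.sendStatus(200); // acknowledge the webhook; Messenger keeps retrying deliveries that don't get a 200
console.log('We are at the end of post');
return next();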
I'm trying to send a http delete request to my web server in Mocha's after hook. Here's the relevant code:
after(function() {
    console.log('here at after');
    request.del('http://localhost:3000/api/deleteAllTasks', function(req, res) {
        console.log(req);
    });
});
The problem is that the delete endpoint is never being hit. It console.logs "here at after" but never console.logs the request in the callback for request.del. I'm not sure what is causing this; I know the endpoint works as I've sent curl requests to it successfully. Anyone have any ideas? Ultimately, I want this endpoint to clear the DB after this particular test suite runs.
I'm betting that your script is ending before the asynchronous request is actually made. Try changing your after(function() {... to include the "done" parameter, like after(function(done) {..., and call done() within the inner callback.
Here is a new after block that will make your test work. leetibbett is completely correct.
after(function(done) {
    console.log('here at after');
    request.del('http://localhost:3000/api/deleteAllTasks', function(err, res) {
        console.log(res);
        done();
    });
});
I want to send data to the same client (not all clients) with this code:
app.post("/search", function(req, res) {
//some other codes
//if database saved {
io.sockets.emit('preview-post', {title: "blabla" });// io variable was declared as GLOBAL
// } database saved end.
res.send({bla:bla});// response end before database saving process
});
This sample is working! But it sends to all clients. How can I emit data to the same opened browser (same client)?
Second question: are there any alternative ways to handle this scenario?
My flow is: POST method fired > async call to an API > response ends and the page is loaded on the client > the async API call is still running > when the async call finishes > send an alert to the client. But how? I wanted to do it with socket.io; if I use your third part, it'll work. Can I do this scenario any other way?
This indeed sends to all sockets.
There are a couple of ways to achieve what you are looking to do. The first way is to do something like this on the server:
io.on('connection', function(socket) {
    socket.on('search', function(/* client supplied arguments */) {
        socket.emit('preview-post', { title: "blabla" });
    });
});
If you are insistent on using a POST request and then sending the event back to the client, there are two ways to achieve this.
The easiest way, if you only need to respond to this request, is to just send a standard response from Node and let the client handle it:
res.send({
    event: 'preview-post',
    payload: { title: "blabla" }
});
This removes socket.io's event system, so if you are insistent on using socket.io to send this event back to the same client, you are going to need to use cookies. Express and the module cookie-parser make this easy for you.
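A minimal setup sketch (the cookie name myUniqueCookie matches the example below; how you generate its value, e.g. a session or user id, is up to you):

var cookieParser = require('cookie-parser');
app.use(cookieParser()); // populates req.cookies

// when serving the page, give the browser an identifying cookie
app.get('/', function(req, res) {
    res.cookie('myUniqueCookie', someUniqueValue); // someUniqueValue is a placeholder
    res.sendFile(__dirname + '/index.html');
});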
Once you have this setup, inside your request you could do something like this:
app.post("/search", function(req, res) {
var socket = findSocketByCookie(req.cookies.myUniqueCookie);
socket.emit('preview-post', {title: "blabla" });
});
function findSocketByCookie(cookie) {
for(var i in io.sockets.connected) {
var socket = io.sockets.connected[i];
if(socket.handshake.headers.cookie.indexOf(cookie) !== -1){
return socket;
}
}
}
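On the client side, with the standard socket.io client, that same browser then receives the event like this (sketch):

var socket = io(); // the handshake carries the cookie set above
socket.on('preview-post', function (data) {
    // e.g. show an alert or update the preview in the page
    console.log(data.title); // "blabla"
});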