Handling multiple post requests with locking - node.js

So I have to write some NodeJS code that does the following: whenever a POST request is made, I attempt to execute some program; if the program is already executing (because of a previous request), I ignore the request, otherwise I execute the program. I'm using NodeJS child_process.exec to accomplish this; however, there's no way for me to know when exec(program) terminates. I thought of using execSync, but this simply blocks any requests until the program is done executing, instead of ignoring them completely. Here is the code I have right now:
const { execFile } = require('child_process');

function fun () {
    execFile('C:\\Windows\\System32\\notepad.exe', ['package.json']);
}

execFile returns a ChildProcess, which is an EventEmitter, so you can listen for events that occur while the child process runs, including the exit event, which tells you the process has completed.
let ignoreNextRequest = true;
execFile('C:\\Windows\\System32\\notepad.exe', ['package.json']).once('exit', (code, signal) => {
    // Your code to handle the end of the process here.
    ignoreNextRequest = false;
});
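Putting the pieces together, here is a minimal sketch of the full locking pattern with Express (the route path, flag name, and 409 status are illustrative assumptions, not from the original post):

const express = require('express');
const { execFile } = require('child_process');

const app = express();
let isRunning = false; // true while the child process is alive

app.post('/run', (req, res) => {
    if (isRunning) {
        // A previous request already started the program; ignore this one.
        return res.status(409).send('Already running');
    }
    isRunning = true;
    execFile('C:\\Windows\\System32\\notepad.exe', ['package.json'])
        .once('exit', () => { isRunning = false; });
    res.status(200).send('Started');
});

app.listen(3000);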

Related

event that the unzip process is finished

I use the following code, which works, but I want to know when the API is done extracting so I can finish the process. Is there any way to do that with this lib? It is based on yauzl:
https://www.npmjs.com/package/extract-zip
var extract = require('extract-zip')
extract(source, {dir: target}, function (err) {
    // extraction is complete. make sure to handle the err
})
I didn't find any event signaling that the process is done, but maybe I missed something?
This module doesn't fire events. However, the callback function is called once the unzip process has finished (or an error occurred); that is the only way you are informed the process has finished. You can put any logic that needs to run after the process has finished there.
var extract = require('extract-zip')
extract(source, {dir: target}, function (err) {
    // extraction is complete. make sure to handle the err
    // If you are here the process of unzipping is done (or an error occurred)
})
If you really do want events, you can take a look at the underlying yauzl package as it uses events and streams extensively.
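If you prefer promises over the callback style, you can wrap the call yourself; a minimal sketch, assuming the callback API shown above (note that newer versions of extract-zip return a Promise natively):

const util = require('util')
const extract = util.promisify(require('extract-zip'))

async function unzip(source, target) {
    // Resolves once extraction is done; rejects if it failed.
    await extract(source, { dir: target })
    console.log('extraction complete')
}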

Is there any risk to read/write the same file content from different 'sessions' in Node JS?

I'm new to Node JS and I wonder whether the snippets of code below have a multi-session problem.
Consider I have Node JS server (express) and I listen on some POST request:
var express = require('express');
var app = express();

app.post('/sync/:method', onPostRequest);

// A function declaration is hoisted, so it can safely be referenced above.
function onPostRequest(req, res){
    // parse request and fetch email list
    var emails = [....]; // pseudocode
    doJob(emails);
    res.status(200).end('OK');
}
var fs = require('fs');
var _ = require('lodash'); // or underscore; anything providing _.isString

// FILE_PATH is assumed to be defined elsewhere.
function doJob(_emails){
    try {
        var emailsFromFile = fs.readFileSync(FILE_PATH, "utf8") || {};
        if(_.isString(emailsFromFile)){
            emailsFromFile = JSON.parse(emailsFromFile);
        }
        _emails.forEach(function(_email){
            if( !emailsFromFile[_email] ){
                emailsFromFile[_email] = 0;
            }
            else{
                emailsFromFile[_email] += 1;
            }
        });
        // write object back
        fs.writeFileSync(FILE_PATH, JSON.stringify(emailsFromFile));
    } catch (e) {
        console.error(e);
    }
}
So the doJob method receives an _emails list and I update (counter +1) those emails in the emailsFromFile object loaded from the file.
Suppose I get 2 requests at the same time and they trigger doJob twice. I'm afraid that while one request has loaded emailsFromFile from the file, the second request might change the file content.
Can anybody shed some light on this issue?
Because the code in the doJob() function is all synchronous, there is no risk of multiple requests causing a concurrency problem.
If you were using async IO in that function, then there would be possible concurrency issues.
To explain, Javascript in node.js is single threaded. So, there is only one thread of Javascript execution running at a time and that thread of execution runs until it returns back to the event loop. So, any sequence of entirely synchronous code like you have in doJob() will run to completion without interruption.
If, on the other hand, you used any asynchronous operations such as fs.readFile() instead of fs.readFileSync(), then the thread of execution would return back to the event loop at the point you call fs.readFile(), and another request could run while the file is being read. If that were the case, you could end up with two requests conflicting over the same file, and you would have to implement some form of concurrency protection (some sort of flag or queue). This is the type of thing that databases offer lots of features for.
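To make the hazard concrete, here is a sketch of the same read-modify-write done with async calls; the comments mark where another request can interleave (illustrative, not the poster's code):

fs.readFile(FILE_PATH, 'utf8', function (err, data) {
    // Another request's readFile/writeFile can run while we're parked here...
    var counts = JSON.parse(data);
    counts[email] = (counts[email] || 0) + 1;
    fs.writeFile(FILE_PATH, JSON.stringify(counts), function (err) {
        // ...so this write can clobber an update made by the other request.
    });
});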
I have a node.js app running on a Raspberry Pi that uses lots of async file I/O, and I can get conflicts in that code from multiple requests. I solved it by setting a flag any time I'm writing to a specific file; any other request that wants to write to that file first checks that flag, and if it is set, the request goes into my own queue and is served when the prior request finishes its write operation. There are many other ways to solve this too. If it happens in a lot of places, it's probably worth just using a database that offers features for this type of write contention.
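A minimal sketch of that idea: a promise chain gives the same serialization as an explicit flag plus queue (the names here are illustrative):

const fs = require('fs').promises;

let queue = Promise.resolve(); // tail of the chain of pending writes

function safeWrite(path, data) {
    // Each write starts only after the previous one has finished.
    const task = queue.then(() => fs.writeFile(path, data));
    queue = task.catch(() => {}); // keep the chain alive if a write fails
    return task;
}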

Nodejs async.whilst() runs only one time

I'm writing a script to batch process some text documents and insert them into a MySQL database. I'm trying to use the async library because using a standard while loop blocks the event loop and prevents the insert queries from being run until all of them are generated. Since that may take 10 minutes or more, I get a timeout. So I am trying to use async to avoid blocking the main thread. However, it's not working as expected. When I run the simplest form of the code below with node test.js, it only executes once, instead of infinitely. It seems like the computer is terminating the node process early since it is non-blocking. This, of course, is not what I want. Why is this, and how can I get it to work correctly?
//this code should run forever, constantly printing "working". However it only runs once.
var async = require('async')
async.whilst(function(){return true},function(){console.log("working")})
The second parameter for whilst() is a function that takes in a callback that needs to be called when the current iteration is "done."
So if you modify the code this way, you'll get what you're expecting:
var async = require('async');
async.whilst(function() {
    return true
}, function(cb) {
    console.log("working");
    cb();
});
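For a batch job like the one described, the same shape extends with a completion callback; a sketch assuming the async v2-era API used above (documents and insertIntoDb are hypothetical placeholders):

var async = require('async');

var i = 0;
async.whilst(
    function test() { return i < documents.length; }, // keep looping while true
    function iteratee(cb) {
        insertIntoDb(documents[i++], function(err) {
            cb(err); // signal that this iteration is done
        });
    },
    function done(err) {
        // runs once the test returns false or an iteration reports an error
        if (err) console.error(err);
        else console.log('all documents processed');
    }
);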

When does the cycle's code execution stop in NodeJS Express?

After reading the Node.JS Express documentation, I'm not clear about when the client cycle is finished after a request.
Do res.json(), res.send() and res.render() finish the cycle entirely? By entirely I mean: all the code execution in the server for that request. Or does the client merely perceive the cycle as finished by receiving a response, while the server continues to execute code?
For example, if I have this:
router.get('/home/about', function(req, res) {
    // code block #1
    // ...
    if (condition) {
        res.render('blah.ejs');
    }
    // code block #2
    // ...
    res.render('about.ejs');
});
If condition is true, I have a few questions:
Is code block #2 executed?
Is res.render('about.ejs') executed?
If the answer to question (2) is yes, what happens with that response? Because I'm pretty sure that the user will receive the response of res.render('blah.ejs').
Also, what changes if I write return; below res.render('blah.ejs')?
The function doesn't exit when you call res.render or any of the other methods you showed, but the pipe will be closed shortly after you call it, so in most cases your example will result in an error saying you tried to write to a closed pipe (in Express this typically surfaces as "Can't set headers after they are sent").
The best way to do what you're showing is something like this:
if (condition) {
    return res.render('blah.ejs');
}
res.render('about.ejs');
This way when the code hits that 'return' statement, nothing else in the function scope will execute.
You could also (and many folks do) throw in a return statement before the second res.render call just to be explicit.

Node.js: Will node always wait for setTimeout() to complete before exiting?

Consider:
node -e "setTimeout(function() {console.log('abc'); }, 2000);"
This will actually wait for the timeout to fire before the program exits.
I am basically wondering if this means that node is intended to wait for all timeouts to complete before quitting.
Here is my situation. My client has a node.js server that he's going to run on Windows with a shortcut icon. If the node app encounters an exceptional condition, it will typically exit instantly, not leaving enough time to see in the console what the error was, and this is bad.
My approach is to wrap the entire program with a try catch, so now it looks like this: try { (function () { ... })(); } catch (e) { console.log("EXCEPTION CAUGHT:", e); }, but of course this will also cause the program to immediately exit.
So at this point I want to leave about 10 seconds for the user to take a peek or screenshot of the exception before it quits.
I figured I should just use a blocking sleep() via an npm module, but I discovered in testing that setting a timeout also seems to work (i.e. why bother with a module if something built-in works?). I guess the significance of this isn't big, but I'm just curious whether it is specified somewhere that node will actually wait for all timeouts to complete before quitting, so that I can feel safe doing this.
In general, node will wait for all timeouts to fire before quitting normally. Calling process.exit() will exit before the timeouts.
The details are part of libuv, but the documentation makes a vague comment about it:
http://nodejs.org/api/all.html#all_ref
you can call ref() to explicitly request the timer hold the program open
Putting all of the facts together: setTimeout by default is designed to hold the event loop open (so if that's the only thing pending, the program will wait). You can programmatically disable or re-enable that behavior with unref() and ref().
Late answer, but a definite yes - Nodejs will wait around for setTimeout to finish - see this documentation. Coincidentally, there is also a way to not wait around for setTimeout, and that is by calling unref on the object returned from setTimeout or setInterval.
To summarize: if you want Nodejs to wait until the timeout has been called, there's nothing you need to do. If you want Nodejs to not wait for a particular timeout, call unref on it.
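A small sketch of both behaviors (the delays are arbitrary):

// Default: this timer holds the process open until it fires.
setTimeout(() => console.log('waited for'), 1000);

// unref(): the event loop may exit even though this timer is pending.
const t = setTimeout(() => console.log('may never run'), 10000);
t.unref();

// Calling t.ref() would restore the default keep-alive behavior.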
If node didn't wait for all setTimeout or setInterval calls to complete, you wouldn't be able to use them in simple scripts.
Once you tell node to listen for an event, as with the setTimeout or some async I/O call, the event loop will loop until it is told to exit.
Rather than wrapping everything in a try/catch, you can bind an event listener to process, just as in the example in the docs:
process.on('uncaughtException', function(err) {
    console.log('Caught exception: ' + err);
});

setTimeout(function() {
    console.log('This will still run.');
}, 500);

// Intentionally cause an exception, but don't catch it.
nonexistentFunc();
console.log('This will not run.');
In the uncaughtException event, you can then add a setTimeout to exit after 10 seconds:
process.on('uncaughtException', function(err) {
    console.log('Caught exception: ' + err);
    setTimeout(function(){ process.exit(1); }, 10000);
});
If this exception is something you can recover from, you may want to look at domains: http://nodejs.org/api/domain.html
edit:
There may actually be another issue at hand: your client application doesn't do enough (or any?) logging. You can use log4js-node to write to a temp file or some application-specific location.
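A minimal sketch of that, assuming log4js v2-style configuration (the filename is illustrative):

const log4js = require('log4js');
log4js.configure({
    appenders: { app: { type: 'file', filename: 'app.log' } },
    categories: { default: { appenders: ['app'], level: 'debug' } }
});
const logger = log4js.getLogger();

process.on('uncaughtException', function(err) {
    logger.error('Caught exception:', err); // persisted even if the window closes
});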
Easy solution:
Make a batch (.bat) file that starts node.js, then make a shortcut out of it.
Why this is best: this way your client runs node.js inside a command line window, and even if the node.js program exits, the command line window stays open.
Making the bat file:
Make a text file
Put START cmd.exe /k "node abc.js" in it
Save it
Rename it to abc.bat
Make a shortcut to it
Opening it will open a command line window and run the node.js file.
Using setTimeout for this is a bad idea.
The odd ones out are when you call process.exit() or there's an uncaught exception, as pointed out by Jim Schubert. Other than that, node will wait for the timeout to complete.
Node does remember timers, but only if it can keep track of them; at least, that is my experience.
If you use setTimeout in an arrow / anonymous function, I would recommend keeping track of your timers in an array, like:
() => {
    timers.push(setTimeout(doThisLater, 2000));
}
and make sure let timers = []; is declared in a scope that won't vanish, i.e. globally.
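The practical payoff of tracking them is that you can cancel whatever is still pending, for example on shutdown (a small sketch, using the timers array from above):

// Cancel every timer that hasn't fired yet.
timers.forEach(clearTimeout);
timers.length = 0;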
