Passing multiple arguments when executing an exe file in Node.js

I am using the code below to pass two arguments along with an exe file execution, as follows, but it is not working. On the command line it works properly.
var osName = jobData[0].os;
exec('Shedule.exe', ['value=Start'], ['ID=osName'], function (err, data) {
  console.log(data);
});
In cmd:
C:\Users\Desktop\ver>Shedule.exe value=Start ID=WIN7-64

Try this. I think this will solve your problem.
var osName = jobData[0].os;
exec('Shedule.exe', ['value=Start', 'ID=' + osName], function (err, data) {
  console.log(data);
});
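One note, in case exec above refers to child_process.exec: in Node's child_process module, the form that takes an argument array is execFile, while exec takes a single command string. A minimal sketch of both forms (Shedule.exe and the argument names are taken from the question):

var execFile = require('child_process').execFile;
var exec = require('child_process').exec;
var osName = jobData[0].os;

// execFile passes the arguments as an array, without going through a shell
execFile('Shedule.exe', ['value=Start', 'ID=' + osName], function (err, stdout, stderr) {
  console.log(stdout);
});

// exec runs a single command string through a shell
exec('Shedule.exe value=Start ID=' + osName, function (err, stdout, stderr) {
  console.log(stdout);
});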

Related

Set variable equal to MongoDB key value

var userLat = db.collection('users', function (err, document) {
  document.findOne({_id: loggedUserID}, function (err, docs) {
    console.log(docs.currentUserLat);
  });
});
This is my code; I'm trying to get the value that's console logged into the variable. I just can't find the correct syntax to do this. The console log does return the correct value; I just need to get it into the variable. Grateful for some help.
What do you want to do with 'docs.currentUserLat'?
You can do what you need to do without saving docs.currentUserLat to a variable that has scope outside of your db.collection call. Some examples:
If you simply want to change the document in your database, take advantage of the many methods specified in the Collection API: http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html. For example, to update the document and simultaneously resave it in the database:
db.collection('users', function (err, document) {
  document.findOneAndUpdate({_id: loggedUserID},
    {currentUserLat: [updated value]},
    function (err, docs) {
      if (err) console.log(err);
    }
  );
});
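One caveat worth noting, assuming the 2.x native driver linked above: passing a plain document to findOneAndUpdate replaces the matched document, so if the intent is to change just the one field, an update operator such as $set is the usual choice. A minimal sketch (newLat is a hypothetical placeholder for the new value):

db.collection('users', function (err, document) {
  // $set changes only currentUserLat instead of replacing the whole document
  document.findOneAndUpdate(
    {_id: loggedUserID},
    {$set: {currentUserLat: newLat}}, // newLat is a hypothetical placeholder
    function (err, result) {
      if (err) console.log(err);
    }
  );
});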
If you just wanted to use docs.currentUserLat inside some node function, you'll need to properly nest the document.findOne function inside a callback (or vice versa). For example, to write currentUserLat to a file using the fs module:
var fs = require('fs');
db.collection('users', function (err, document) {
  document.findOne({_id: loggedUserID}, function (err, docs) {
    fs.writeFile("pathToYourFile", docs.currentUserLat, function (err) {
      if (err) { return console.log(err); }
    });
  });
});
Or, if you want to send it in response to a simple http request:
var http = require('http');
http.createServer(function (request, response) {
  db.collection('users', function (err, document) {
    document.findOne({_id: loggedUserID}, function (err, docs) {
      response.writeHead(200, {'Content-Type': 'text/html'});
      response.end(docs.currentUserLat);
    });
  });
}).listen(8080); // pick a port; without listen() the server never accepts requests
The key thing to remember is what JohnnyHK said in their comment: docs.currentUserLat is only available inside the anonymous function passed to findOne. So, whatever it is that you need to do, do it inside this function.
(Reading the link JohnnyHK provided is a great way to get started with understanding asynchronous functions in Node. Another is https://github.com/rvagg/learnyounode)
First of all you have to understand how JavaScript callbacks work. After that you will see that nothing assigns docs.currentUserLat to your userLat variable. The reason is that docs.currentUserLat is available only inside the callback. Think about it in the following way:
Your program starts executing and encounters the line var userLat = .... This line says: schedule a callback (which basically asks someone else to do the job). While that job is being carried out, the program continues: it assigns undefined to userLat and moves on. Then at some later point the callback finishes and console.logs your docs.currentUserLat.
One way to get the desired behavior is to make userLat global and, instead of console.log(docs.currentUserLat);, do userLat = docs.currentUserLat. The problem is that if you do this, userLat will eventually have the desired value (if the callback does not fail), but you cannot predict when. So if you do
var userLat = db.collection('users', function (err, document){ ... });
.. some other code
console.log(userLat);
you will not be sure that you will get the output. Another way is to put everything in another callback, as shown in the sketch below.
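For example, a minimal sketch of that last suggestion; getUserLat is a hypothetical helper name, and db and loggedUserID are assumed to exist as in the question:

// Hypothetical helper: hand the value to a callback instead of returning it
function getUserLat(loggedUserID, done) {
  db.collection('users', function (err, users) {
    if (err) return done(err);
    users.findOne({_id: loggedUserID}, function (err, docs) {
      if (err) return done(err);
      done(null, docs.currentUserLat); // the value only exists here, so hand it to the callback
    });
  });
}

getUserLat(loggedUserID, function (err, userLat) {
  if (err) return console.log(err);
  console.log(userLat); // safe to use userLat at this point
});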

Node.js spawn EMFILE

I am trying to run a command inside an async.forEach loop using ChildProcess.exec in my node job. Here is the code:
async.forEach(docPaths, function (docPath, callback) {
  var run = [];
  // some command using docPath variable here..
  run.push(command);
  debugger;
  exec(run.join(' '), function (error, stdout, stderr) {
    callback();
  });
}, callback);
Here is the error
"stack":"Error: spawn EMFILE\
at errnoException (child_process.js:478:11)\
at ChildProcess.spawn (child_process.js:445:11)\
at child_process.js:343:9\
at Object.execFile (child_process.js:253:15)\
at child_process.js:220:18\
A quick Google search shows I need to set the ulimit value to increase the number of file descriptors that can be open, something like "ulimit -n 10000" (from the link below).
https://groups.google.com/forum/#!topic/nodejs/jeec5pAqhps
Where can I increase this? Or is there any other solution to circumvent the issue?
Appreciate your help. Thanks much!
First of all, it's not advisable to mess with ulimit, as it may have system-wide impacts.
Instead, since you are already using async, it comes with a limit parameter which you can use to cap the number of parallel executions.
async.eachLimit(docPaths, 100, function (docPath, callback) {
  var run = [];
  // some command using docPath variable here..
  run.push(command);
  debugger;
  exec(run.join(' '), function (error, stdout, stderr) {
    callback();
  });
}, callback);
Please experiment and replace 100 with a suitable value.
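One small tweak worth considering, sketched under the assumption that you want failures surfaced: forwarding exec's error to the callback lets async.eachLimit stop and report it instead of silently swallowing it.

exec(run.join(' '), function (error, stdout, stderr) {
  // pass the error along so the final callback sees it
  callback(error);
});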

Run synchronous tasks using Node.js on Windows

I am using the Node FFI module and am trying to run sync tasks on Windows. I can successfully run a task using the following code.
var ffi = require('ffi');
var nativeC = new ffi.Library("Kernel32", {
  "WinExec": ["int32", ["string"]]
});
nativeC.WinExec('ls -lrt');
I presume this is the way to execute sync tasks, but this code always exits after the first 'ls -lrt' command; if I chain a few more commands, they won't work. So, is there a callback function here in the FFI module, or another way I can chain commands in Node.js on Windows so they run in sync, one after the other?
I'm not sure you need WinExec to run a Windows command. As Jonathan pointed out, ls isn't available.
However, if you want to chain commands you could use async.js and exec like this:
var async = require('async'),
    exec = require('child_process').exec,
    commands = ['dir /w', 'echo test'];

var executeCommand = function (command, callback) {
  exec(command, function (err, stdout, stderr) {
    if (err) return callback(err);
    console.log(stdout);
    callback();
  });
};

async.eachSeries(commands, executeCommand, function (err) {
  console.log('error: ' + err);
});
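If the commands truly have to run one after another and blocking is acceptable, newer Node versions (0.12 and later) also provide child_process.execSync; a minimal sketch:

var execSync = require('child_process').execSync;

// runs each command in turn, blocking until it finishes
['dir /w', 'echo test'].forEach(function (command) {
  console.log(execSync(command).toString());
});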

Async.js Parallel Callback not executing

I'm working with the parallel function in Async.js, and for some reason the final callback is not getting executed and I do not see an error happening anywhere.
I'm dynamically creating an array of functions that are passed to the parallel call as such:
// 'theFiles' is an array of files I'm working with in a code-generator style type of scenario
var callItems = [];
theFiles.forEach(function (currentFile) {
  var genFileFunc = generateFileFunc(destDir + "/" + currentFile, packageName, appName);
  callItems.push(genFileFunc(function (err, results) {
    if (err) {
      console.error("*** ERROR ***" + err);
    } else {
      console.log("Done: " + results);
    }
  }));
});
async.parallel(callItems, function (err, results) {
  console.log(err);
  console.log(results);
  if (err) {
    console.error("**** ERROR ****");
  } else {
    console.log("***** ALL ITEMS HAVE BEEN CALLED WITHOUT ERROR ****");
  }
});
Then in an outside function (outside of the function that is executing the forEach above) I have the generateFileFunc() function.
// Function that returns a function that works with a file (modifies it/etc).
function generateFileFunc(file, packageName, appName) {
  return function (callback) {
    generateFile(file, packageName, appName, callback);
  };
}
I've looked at this SO post and it helped me get to where I'm at. However, the final callback is not being executed, although all of the items in the parallel call are being executed. Inside of generateFile (the function), at the very bottom, I call the callback, so that's golden.
Anyone have any idea why this might not be executing properly?
The end result is to work with each function call in parallel and then be notified when I'm done so I can continue executing some other instructions.
Thanks!
Analyze what is happening line by line, starting with this:
var genFileFunc = generateFileFunc(...);
Since your function generateFileFunc returns a function, the variable genFileFunc is the following function:
genFileFunc === function(callback) {
  generateFile( ... );
};
Now it is clear that this function returns nothing (there is no return statement), and by nothing I mean JavaScript's built-in undefined. In particular you have
genFileFunc(function(err, results) { ... } ) === undefined
which is the result of calling it. Therefore you push undefined to callItems. No wonder it does not work.
It is hard to tell how to fix this without knowing what generateFile does exactly, but I'll try anyway. Try simply doing this:
callItems.push(genFileFunc);
because you have to push the function itself to callItems, not the result of calling it, which is undefined.
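Putting it together, a sketch of the corrected loop, reusing the variable names from the question:

var callItems = [];
theFiles.forEach(function (currentFile) {
  // push the task function itself; async.parallel supplies the callback
  callItems.push(generateFileFunc(destDir + "/" + currentFile, packageName, appName));
});

async.parallel(callItems, function (err, results) {
  if (err) return console.error("**** ERROR ****", err);
  console.log("***** ALL ITEMS HAVE BEEN CALLED WITHOUT ERROR ****", results);
});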
Curious.
Best guess so far: Inside generateFile, RETURN callback instead of calling it.
You can achieve the stated goal with
async.map(theFiles, function (file, done) {
  generateFile(destDir + "/" + file, packageName, appName, done);
}, function (err, res) {
  // do something with the error/results
});

nodejs express fs iterating files into array or object failing

So I'm trying to use the Node.js fs module in my Express app to iterate a directory, store each filename in an array which I can pass to my Express view and iterate through, but I'm struggling to do so. When I do a console.log within the files.forEach function loop, it prints the filename just fine, but as soon as I try to do anything such as:
var myfiles = [];
var fs = require('fs');
fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    myfiles.push(file);
  });
});
console.log(myfiles);
it fails and just logs an empty array. So I'm not sure exactly what is going on; I think it has to do with callback functions, but if someone could walk me through what I'm doing wrong, why it's not working, and how to make it work, it would be much appreciated.
The myfiles array is empty because the callback hasn't been called yet when you call console.log().
You'll need to do something like:
var fs = require('fs');
fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    // do something with each file HERE!
  });
});
// because trying to do something with files here won't work because
// the callback hasn't fired yet.
Remember, everything in node happens at the same time, in the sense that, unless you're doing your processing inside your callbacks, you cannot guarantee asynchronous functions have completed yet.
One way around this problem for you would be to use an EventEmitter:
var fs = require('fs'),
    EventEmitter = require('events').EventEmitter,
    filesEE = new EventEmitter(),
    myfiles = [];

// this event will be called when all files have been added to myfiles
filesEE.on('files_ready', function () {
  console.dir(myfiles);
});

// read all files from current directory
fs.readdir('.', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    myfiles.push(file);
  });
  filesEE.emit('files_ready'); // trigger files_ready event
});
As several have mentioned, you are using an async method, so you have a nondeterministic execution path.
However, there is an easy way around this. Simply use the Sync version of the method:
var myfiles = [];
var fs = require('fs');
var arrayOfFiles = fs.readdirSync('./myfiles/');
// Yes, the following is not super-smart, but you might want to process the files. This is how:
arrayOfFiles.forEach(function (file) {
  myfiles.push(file);
});
console.log(myfiles);
That should work as you want. However, using synchronous calls is generally not good, so you should not do it unless it is vitally important for the operation to be synchronous.
Read more here: fs.readdirSync
fs.readdir is asynchronous (as with many operations in node.js). This means that the console.log line is going to run before readdir has a chance to call the function passed to it.
You need to either:
Put the console.log line within the callback function given to readdir, i.e:
fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  files.forEach(function (file) {
    myfiles.push(file);
  });
  console.log(myfiles);
});
Or simply perform some action with each file inside the forEach.
I think it has to do with callback functions,
Exactly.
fs.readdir makes an asynchronous request to the file system for that information, and calls the callback at some later time with the results.
So function (err, files) { ... } doesn't run immediately, but console.log(myfiles) does.
At some later point in time, myfiles will contain the desired information.
You should note BTW that files is already an Array, so there is really no point in manually appending each element to some other blank array. If the idea is to put together the results from several calls, then use .concat; if you just want to get the data once, then you can just assign myfiles = files directly.
Overall, you really ought to read up on "Continuation-passing style".
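For instance, a minimal sketch of that direct assignment, with all use of the data kept inside the callback:

var fs = require('fs');
var myfiles = [];

fs.readdir('./myfiles/', function (err, files) {
  if (err) throw err;
  myfiles = files; // files is already an array, no need to copy element by element
  console.log(myfiles); // use the data here, inside the callback
});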
I faced the same problem, and based on the answers given in this post, I've solved it with Promises, which seem to be a perfect fit in this situation:
router.get('/', (req, res) => {
  var viewBag = {}; // It's just my little habit from .NET MVC ;)

  var readFiles = new Promise((resolve, reject) => {
    fs.readdir('./myfiles/', (err, files) => {
      if (err) {
        reject(err);
      } else {
        resolve(files);
      }
    });
  });

  // shown just in case you need to run more async operations before the route responds
  var anotherPromise = new Promise((resolve, reject) => {
    doAsyncStuff((err, anotherResult) => {
      if (err) {
        reject(err);
      } else {
        resolve(anotherResult);
      }
    });
  });

  Promise.all([readFiles, anotherPromise]).then((values) => {
    viewBag.files = values[0];
    viewBag.otherStuff = values[1];
    console.log(viewBag.files); // logs e.g. [ 'file.txt' ]
    res.render('your_view', viewBag);
  }).catch((errors) => {
    res.render('your_view', {errors: errors}); // use the 'errors' property to render errors in the view, or implement a different error handling scheme
  });
});
Note: you don't have to push the found files into a new array because you already get an array from fs.readdir()'s callback. According to the Node docs:
The callback gets two arguments (err, files) where files is an array
of the names of the files in the directory excluding '.' and '..'.
I believe this is a very elegant and handy solution, and most of all, it doesn't require you to bring in and handle new modules in your script.
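As a side note, sketched under the assumption of Node 8+ where util.promisify is available, the manual Promise wrapper can be shortened:

const fs = require('fs');
const util = require('util');
const readdir = util.promisify(fs.readdir); // wraps the callback API in a Promise

readdir('./myfiles/')
  .then((files) => console.log(files))
  .catch((err) => console.log(err));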
