Node.js synchronous shell exec - node.js

I am having a problem with async shell executes in node.js.
In my case, node.js is installed on a Linux operating system on a raspberry pi. I want to fill an array with values that are parsed from a shell script which is called on the pi. This works fine, however, the exec() function is called asynchronously.
I need the function to be absolutely synchronous to avoid messing up my whole system. Is there any way to achieve this? Currently I am trying a lib called .exe, but the code still seems to behave asynchronously.
Here's my code:
var exec = require('child_process').exec;

function execute(cmd, cb)
{
    var child = exec(cmd, function(error, stdout, stderr)
    {
        cb(stdout, stderr);
    });
}
function chooseGroup()
{
    var groups = [];
    execute("bash /home/pi/scripts/group_show.sh", function(stdout, stderr)
    {
        var groups_str = stdout;
        groups = groups_str.split("\n");
    });
    return groups;
}
//Test
console.log(chooseGroup());

If what you're using is child_process.exec, it is asynchronous already.
Your chooseGroup() function will not work properly because it is asynchronous. The groups variable will always be empty.
Your chooseGroup() function can work if you change it like this:
function chooseGroup() {
    execute("bash /home/pi/scripts/group_show.sh", function(stdout, stderr) {
        var groups = stdout.split("\n");
        // put the code here that uses groups
        console.log(groups);
    });
}
// you cannot use groups here because the result is obtained asynchronously
// and thus is not yet available here.
If, for some reason, you're looking for a synchronous version of .exec(), there is child_process.execSync() though it is rarely recommended in server-based code because it is blocking and thus blocks execution of other things in node.js while it is running.

Related

Binding custom node addon to lua with extensions

I have a large collection of asynchronous functions in my nodejs code that I would like to expose to lua. The basic idea is that I would like to execute lua scripts and allow those scripts to call back into some of my nodejs code, as well as asynchronously return a value from an executed lua script.
In this example myCustomNodejsAddon would be a custom addon that I write that knows how to bind lua and run lua scripts. One outstanding question is how do I asynchronously return a value from a lua script?
Has anyone done something like this before? I would be very interested in any pointers, thoughts, examples.
EDIT with better example:
-- user written lua script
getUser(1, function(err, user)
    if err then
        print('Error', err)
    else
        print('Found user with id', user.id)
        return ''
    end
end)
/* Create object with mapping of async functions */
var callbacks = {
    "getUser": function(userId, cb) {
        db.Users.fetchById(userId).then(function(user) {
            cb(null, user);
        }, function(err) {
            cb(err, null);
        });
    }
};
myCustomNodejsAddon.provideCallbacks(callbacks);
/* user written lua script has been stored into `scriptSrc` variable */
myCustomNodejsAddon.execute(scriptSrc, function(returnValueOfScript) {
    console.log('done running user script: ', returnValueOfScript);
});
More than one approach to this problem comes to mind.
The first would be to create a nodejs script that, once executed, reads the function to call from its command-line arguments or input stream, executes it, and streams the response back, in JSON format for example. This is the least invasive way of doing it. The script would be something like:
if (require.main === module) {
    // assume the first argument to be the source module for the function of interest
    var mod = require(process.argv[2]);
    var fnc = mod[process.argv[3]];
    var args = process.argv.slice(4);
    // by convention the last argument is a callback function
    args.push(function() {
        console.log(JSON.stringify(Array.prototype.slice.call(arguments)));
        process.exit();
    });
    fnc.apply(null, args);
}
An example usage will be:
$ node my-script.js fs readdir /some/path
This will respond with something like [null, ["file1", "file2"]], according to the files in /some/path. Then you can create a lua module that invokes node with this script and passes the parameters according to the function you want to call.

Node.js spawn() call not working on Windows

I need to print a file using the command line. When doing something like
rundll32 C:\WINDOWS\system32\shimgvw.dll,ImageView_PrintTo "d:\Temp\test.jpg" "Canon_CP1000"
in CMD manually, it works just fine and the image gets printed. However, as soon as I use Node's spawn() to achieve the same behaviour, it doesn't do anything.
var spawn = require('child_process').spawn;
var cliCmd = spawn('rundll32', [
    'C:\\WINDOWS\\system32\\shimgvw.dll,ImageView_PrintTo "d:\\Temp\\test.jpg" "Canon_CP1000"',
]);
cliCmd.stdout.on('data', function (data) {
    console.log('stdout', data);
});
cliCmd.stderr.on('data', function (data) {
    console.log('stderr', data);
});
There is also no output in the console at all. Other commands in spawn work (e.g. "ipconfig, ['/all']"). I have also tried to separate the space-separated arguments, placing each one in its own array slot. No effect.
Help is very much appreciated. Thanks!
Use an array of parameters when you spawn your child process.
var cliCmd = spawn('rundll32.exe', [
    'C:\\WINDOWS\\system32\\shimgvw.dll,ImageView_PrintTo', 'd:\\Temp\\test.jpg', 'Canon_CP1000'
]);
Also, make sure you have the full name of rundll32.exe. You might also have to specify its path.

Asynchronous Database Queries with PostgreSQL in Node not working

Using Node.js and the node-postgres module to communicate with a database, I'm attempting to write a function that accepts an array of queries and callbacks and executes them all asynchronously using the same database connection. The function accepts a two-dimensional array and calling it looks like this:
perform_queries_async([
    ['SELECT COUNT(id) as count FROM ideas', function(result) {
        console.log("FUNCTION 1");
    }],
    ["INSERT INTO ideas (name) VALUES ('test')", function(result) {
        console.log("FUNCTION 2");
    }]
]);
And the function iterates over the array, creating a query for each sub-array, like so:
function perform_queries_async(queries) {
    var client = new pg.Client(process.env.DATABASE_URL);

    for(var i=0; i<queries.length; i++) {
        var q = queries[i];
        client.query(q[0], function(err, result) {
            if(err) {
                console.log(err);
            } else {
                q[1](result);
            }
        });
    }

    client.on('drain', function() {
        console.log("drained");
        client.end();
    });

    client.connect();
}
When I ran the above code, I expected to see output like this:
FUNCTION 1
FUNCTION 2
drained
However, the output bizarrely appears like so:
FUNCTION 2
drained
FUNCTION 2
Not only is the second function getting called for both requests, it also seems as though the drain code is getting called before the client's queue of queries is finished running...yet the second query still runs perfectly fine even though the client.end() code ostensibly killed the client once the event is called.
I've been tearing my hair out about this for hours. I tried hardcoding in my sample array (thus removing the for loop), and my code worked as expected, which leads me to believe that there is some problem with my loop that I'm not seeing.
Any ideas on why this might be happening would be greatly appreciated.
The simplest way to properly capture the value of the q variable in a closure in modern JavaScript is to use forEach:
queries.forEach(function(q) {
    client.query(q[0], function(err, result) {
        if(err) {
            console.log(err);
        } else {
            q[1](result);
        }
    });
});
If you don't capture the value, your code sees the last value q had, because the callback executes later, in the context of the containing function.
forEach, by using a callback function, isolates and captures the value of q so it can be properly evaluated by the inner callback.
A victim of the famous JavaScript closure/loop gotcha. See my (and other) answers here:
I am trying to open 10 websocket connections with nodejs, but somehow my loop doesnt work
Basically, at the time your callback is executed, q is set to the last element of the input array. The way around it is to dynamically generate the closure.
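The gotcha and its fix can be reproduced in a few lines. In this sketch, setTimeout stands in for any asynchronous call such as client.query(); the immediately-invoked function expression (IIFE) is one way to "dynamically generate the closure":

```javascript
var results = [];
for (var i = 0; i < 3; i++) {
    // Without the wrapping function, every callback would see the final
    // value of i (3). The IIFE captures each value at iteration time.
    (function (captured) {
        setTimeout(function () {
            results.push(captured);
        }, 0);
    })(i);
}
// Once the timers fire, results is [0, 1, 2]; without the IIFE it would
// have been [3, 3, 3].
```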
It would be good to execute this using the async module. It will also help you reuse the code and make it more readable. I particularly like the auto function provided by the async module.
Ref: https://github.com/caolan/async

Is the following node.js code blocking or non-blocking?

I have the node.js code running on a server and would like to know if it is blocking or not. It is kind of similar to this:
function addUserIfNoneExists(name, callback) {
    userAccounts.findOne({name: name}, function(err, obj) {
        if (obj) {
            callback('user exists');
        } else {
            // Add the user 'name' to DB and run the callback when done.
            // This is non-blocking to here.
            user = addUser(name, callback)
            // Do something heavy, doesn't matter when this completes.
            // Is this part blocking?
            doSomeHeavyWork(user);
        }
    });
};
Once addUser completes the doSomeHeavyWork function is run and eventually places something back into the database. It does not matter how long this function takes, but it should not block other events on the server.
With that, is it possible to test if node.js code ends up blocking or not?
Generally, if it reaches out to another service, like a database or a webservice, then it is non-blocking and you'll need to have some sort of callback. However, any function will block until something (even if nothing) is returned...
If the doSomeHeavyWork function is non-blocking, then it's likely that whatever library you're using will allow for some sort of callback. So you could write the function to accept a callback like so:
var doSomeHeavyWork = function(user, callback) {
    callTheNonBlockingStuff(function(error, whatever) {
        // callTheNonBlockingStuff likely takes a callback which receives an
        // error (in case something bad happened) and possibly a "whatever",
        // which is the result you're hoping to get back.
        if (error) {
            console.log('There was an error!!!!');
            console.log(error);
            return callback(error, null); // call the callback with the error
        }
        callback(null, whatever); // call the callback with the object you want back
    });
    return; // This line will most likely run before the callback gets called,
            // which is what makes this a non-blocking (asynchronous) function
            // and why you need the callback.
};
In any part of your Node.js code, you should avoid synchronous blocks that don't perform system or I/O operations but whose computation takes a long time (in computer terms), e.g. iterating over big arrays. Instead, move this type of code to a separate worker or divide it into smaller synchronous pieces using process.nextTick(). You can find an explanation of process.nextTick() here, but read all the comments too.
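That chunking idea can be sketched as follows. processInChunks is an illustrative name, not a library API; setImmediate() is used here rather than recursive process.nextTick(), since in current Node versions nextTick recursion runs before pending I/O and can starve it:

```javascript
// Process a large array in small synchronous chunks, yielding to the
// event loop between chunks so other events can run.
function processInChunks(items, chunkSize, workFn, done) {
    var index = 0;

    function next() {
        var end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
            workFn(items[index]); // synchronous work on one item
        }
        if (index < items.length) {
            setImmediate(next); // yield, then continue with the next chunk
        } else {
            done();
        }
    }

    next();
}
```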

node.js scope problems?

I'm working on making a script with node.js for setting up my dzen2 in i3, and haven't really used node for anything like this before.
I need the geometry of the screen to start with, which I can get with something like this:
geometry = getGeo();

function getGeo() {
    var sh = require('child_process').exec("i3-msg -t get_outputs",
        function(error, stdout, stderr) {
            var out = JSON.parse(stdout);
            return out[0].rect; // this is the geometry, {"x":0, "y":0, "width":1280, "height":768}
        });
};
console.log(geometry);
console.log is logging undefined.
I'm not sure what the proper way to do this is, my brain is tired.
You can't return from the callback function, since it is async.
Instead, write another function and pass the value to it:
function getGeo() {
    var sh = require('child_process').exec("i3-msg -t get_outputs",
        function(error, stdout, stderr) {
            var out = JSON.parse(stdout);
            getRect(out[0].rect);
        });
};

function getRect(rect) {
    // Utilize rect here...
}
You never return a value from getGeo(); you're returning a value from the anonymous callback within it. But because of the async nature of the .exec() call, you can't return the value that way. You can put the console.log into the callback function, but that may not be where you want to use it in your real program.
