node, grunt custom task using stream - node.js

Hi, can anyone help me with a custom grunt task?
I'm basically using child_process to call
var cp = exec(cmd, {}, function (err, stdout, stderr) {});
This of course has the stdout etc. If I do
cp.stdout.pipe(process.stdout);
it writes a bunch of stuff to the console. I am making this call in a loop, so here's a skeleton:
this.files.forEach(function(f) {
    var src = f.src.map(function(filepath) {
        var cmd = util.format("%s %s", options.xUnit, filepath);
        var cp = exec(cmd, {}, function (err, stdout, stderr) {
            // ....
            cb();
        });
        if (options.stdout || grunt.option('verbose')) {
            cp.stdout.pipe(process.stdout);
        }
    });
});
This all works fine, but since I'm doing it several times I want, at the very least, to count how many times stderr has a value. I'm basically running collections of tests: if all pass, stderr is blank; if not, I don't want to stop the rest of the tests, I want them all to run, but I'd like to output at the end "you have 3 collections that have failing tests". I don't see why I can't seem to get it to work. If I declare
var output = 0;
then
if (options.stderr || grunt.option('verbose')) {
    cp.stdout.pipe(process.stdout);
    output++;
}
I then try to print to the console after they all run, but it's always 0.
I would really like to somehow get a copy of ALL the output and parse out the last line from each test session, which has the number of failed tests in it. But that just seems way out of the realm of possibility. I've tried all manner of things with the streams and got less than nothing.
Anyway, if anyone can help I would greatly appreciate it.

Related

How to send input to child process created with spawn? nodejs

I'm running Windows 10, and I have a program, let's call it program, that can be run from the command line. When run, it responds to commands the user enters. The user enters a command, presses the return key, and the program prints a response. I did not make this program and do not have the source, so I cannot modify it.
I want to run this program from within Node.js, and have my Node.js program act as the user, sending it commands and getting the responses. I spawn my program like this:
var spawn = require('child_process').spawn;
var child = spawn('program');
child.stdout.on('data', function(data) {
    console.log(`stdout: ${data}`);
});
Then I attempt to send it a command, for example, help.
child.stdin.write("help\n");
And nothing happens. If I manually run the program, type help, and press the return key, I get output. I want Node.js to run the program, send it input, and receive the output exactly as a human user would. I assumed that stdin.write() would send the program a command as if the user typed it in the console. However, as the program does not respond, I assume this is not the case. How can I send the program input?
I've seen many similar questions, but unfortunately the solutions their authors report as "working" did not work for me.
Sending input data to child process in node.js
I've seen this question and answer and tried everything in it with no success. I've tried ending the command with \r\n instead of \n. I've also tried adding the line child.stdin.end() after writing. Neither of these worked.
How to pass STDIN to node.js child process
This person, in their self-answer, says that they got theirs to work almost exactly as I'm doing it, but mine does not work.
Nodejs Child Process: write to stdin from an already initialised process
This person, in their self-answer, says they got it to work by writing their input to a file and then piping that file to stdin. This sounds overly complicated to send a simple string.
This worked for me, when running from Win10 CMD or Git Bash:
console.log('Running child process...');
const spawn = require('child_process').spawn;
const child = spawn('node');
// Also worked, from Git Bash:
//const child = spawn('cat');
child.stdout.on('data', (data) => {
    console.log(`stdout: "${data}"`);
});
child.stdin.write("console.log('Hello!');\n");
child.stdin.end(); // EOF
child.on('close', (code) => {
    console.log(`Child process exited with code ${code}.`);
});
Result:
D:\Martin\dev\node>node test11.js
Running child process...
stdout: "Hello!
"
Child process exited with code 0.
I also tried running aws configure like this. At first it didn't work because I sent only a single line, but when I sent four lines for the four expected input values, it worked.
Maybe your program expects special properties for stdin, like being a real terminal, and therefore doesn't take your input?
Or did you forget to send the EOF using child.stdin.end();? (If you remove that call from my example, the child waits for input forever.)
Here is what worked for me. I used child_process.exec to create a child process and wrapped it in a Promise that handles the I/O of the cmd given as a parameter. It's not perfect, but it's working.
Sample function call where you don't need any human input:
executeCLI("cat ~/index.html");
Sample function call where you interact with the aws cli:
executeCLI("aws configure --profile dev")
Code for the custom executeCLI function:
var { exec } = require('child_process');

function executeCLI(cmd) {
    console.log("About to execute this: ", cmd);
    var child = exec(cmd);
    return new Promise((resolve, reject) => {
        var output = '';
        child.stdout.on('data', (data) => {
            console.log(`${data}`);
            output += data;
        });
        // Forward our stdin to the child so interactive prompts
        // (e.g. from `aws configure`) can be answered by the user.
        process.stdin.pipe(child.stdin);
        // The 'close' event reports the exit code, not an error object.
        child.on('close', (code) => {
            if (code !== 0) {
                console.log("Error executing cmd, exit code: ", code);
                reject(code);
            } else {
                resolve(output);
            }
        });
    });
}
Extract the user's code from the browser and save it into a file on your system using the fs module; let that file be 'program.cpp', and save the user's input data in a text file.
Just as we can compile our C++ code in a terminal using g++, we can use child_process to access the system shell and run the user's code.
execFile can be used to execute our program:
var { execFile } = require('child_process');

execFile("g++", ['program.cpp'], (err, stdout, stderr) => {
    if (err) {
        console.log("compilation error: ", err);
    } else {
        // `shell: true` is needed so `<` redirects input.txt to stdin.
        execFile('./a.out', ['<', 'input.txt'], { shell: true }, (err, stdout, stderr) => {
            console.log("output: ", stdout);
        });
    }
});
In this code we simply require child_process and use its execFile function.
First we compile the code in program.cpp, which creates a.out as the default output file.
Then we run a.out with the input from input.txt.
You can then view the generated output in your terminal and pass it back to the user.
For more details you can check: Child Processes

get response from CasperJS in Node.JS

I have a script which is calling another CasperJS script.
var exec = require('child_process').exec;
exec("casperjs webCheck/thisWebCheck.js",puts);
I need a way to get a response from this script back to the calling script. It just has to be a Boolean value: yes, the web check worked, or no, the website is down. Then, depending on that response, I will execute one piece of code or another.
I have searched all through web posts, blog posts and stack exchange for an answer to this and have come up empty.
When I try to use the exit code, it never comes back with what I told it to exit with.
When I use STDOUT and try to validate against it, such as:
thisWebCheck.js
...
if (failed) {
    console.log("failed");
} else {
    console.log("success");
}
Main.js
var puts = function(error, stdout, stderr) {
    if (stdout == "failed") {
        doSomething();
    } else if (stdout == "success") {
        doSomethingElse();
    } else {
        console.log(stdout);
    }
};
main.js will console.log() the stdout but won't validate against the strings.
TL/DR:
I just need some way to communicate between the two scripts. Also, if there is a better way to call the casper file, let me know.

Node.js synchronous shell exec

I am having a problem with async shell executes in node.js.
In my case, node.js is installed on a Linux operating system on a Raspberry Pi. I want to fill an array with values that are parsed from a shell script which is called on the Pi. This works fine; however, the exec() function is called asynchronously.
I need the function to be fully synchronous to avoid messing up my whole system. Is there any way to achieve this? Currently I am trying a lib called .exe, but the code still seems to behave asynchronously.
Here's my code:
function execute(cmd, cb)
{
    child = exec(cmd, function(error, stdout, stderr)
    {
        cb(stdout, stderr);
    });
}

function chooseGroup()
{
    var groups = [];
    execute("bash /home/pi/scripts/group_show.sh", function(stdout, stderr)
    {
        groups_str = stdout;
        groups = groups_str.split("\n");
    });
    return groups;
}

// Test
console.log(chooseGroup());
If what you're using is child_process.exec, it is asynchronous already.
Your chooseGroup() function will not work properly because it is asynchronous. The groups variable will always be empty.
Your chooseGroup() function can work if you change it like this:
function chooseGroup() {
    execute("bash /home/pi/scripts/group_show.sh", function(stdout, stderr) {
        var groups = stdout.split("\n");
        // put the code here that uses groups
        console.log(groups);
    });
}
// you cannot use groups outside the callback because the result is obtained
// asynchronously and thus is not yet available there.
If, for some reason, you're looking for a synchronous version of .exec(), there is child_process.execSync() though it is rarely recommended in server-based code because it is blocking and thus blocks execution of other things in node.js while it is running.

Binding custom node addon to lua with extensions

I have a large collection of asynchronous functions in my node.js code that I would like to expose to lua. The basic idea is that I would like to execute lua scripts and allow those scripts to call back into some of my node.js code, as well as asynchronously return a value from an executed lua script.
In this example myCustomNodejsAddon would be a custom addon that I write that knows how to bind lua and run lua scripts. One outstanding question is: how do I asynchronously return a value from a lua script?
Has anyone done something like this before? I would be very interested in any pointers, thoughts, examples.
EDIT with better example:
-- user written lua script
getUser(1, function(err, user)
    if err then
        print('Error', err)
    else
        print('Found user with id', user.id)
        return ''
    end
end)
/* Create object with mapping of async functions */
var callbacks = {
    "getUser": function(userId, cb) {
        db.Users.fetchById(userId).then(function(user) {
            cb(null, user);
        }, function(err) {
            cb(err, null);
        });
    }
};

myCustomNodejsAddon.provideCallbacks(callbacks);

/* user written lua script has been stored into `scriptSrc` variable */
myCustomNodejsAddon.execute(scriptSrc, function(returnValueOfScript) {
    console.log('done running user script: ', returnValueOfScript);
});
More than one approach to this problem comes to mind.
The first would be to create a node.js script that, once executed, reads the command line arguments or the input stream, executes the code indicated by that channel, and streams the response back, in JSON format for example. This is the least invasive way of doing it. The script would be something like:
if (require.main === module) {
    // assume the first argument is the source module for the function of interest
    var mod = require(process.argv[2]);
    var fnc = mod[process.argv[3]];
    var args = process.argv.slice(4);
    // by convention the last argument is a callback function
    args.push(function() {
        console.log(JSON.stringify(arguments));
        process.exit();
    });
    fnc.apply(null, args);
}
An example usage will be:
$ node my-script.js fs readdir /some/path
This will respond with something like {"0":null,"1":["file1","file2"]} (JSON.stringify of the arguments object), according to the files in /some/path. Then you can create a lua module that invokes node with this script and passes the parameters according to the functions you want to call.

Node.js spawn EMFILE

I am trying to run a command inside an async.forEach loop using ChildProcess.exec in my node job. Here is the code:
async.forEach(docPaths, function(docPath, callback) {
    var run = [];
    // some command using docPath variable here..
    run.push(command);
    debugger;
    exec(run.join(' '), function(error, stdout, stderr) {
        callback();
    });
}, callback);
Here is the error
"stack":"Error: spawn EMFILE\
at errnoException (child_process.js:478:11)\
at ChildProcess.spawn (child_process.js:445:11)\
at child_process.js:343:9\
at Object.execFile (child_process.js:253:15)\
at child_process.js:220:18\
A quick google shows I need to raise the ulimit value to increase the number of file descriptors that can be open, something like "ulimit -n 10000" (from the link below):
https://groups.google.com/forum/#!topic/nodejs/jeec5pAqhps
Where can I increase this? Or is there any other solution to circumvent the issue?
Appreciate your help. Thanks much!
First of all, it's not advisable to mess with ulimit, as it may have system-wide impacts.
Instead, since you are already using async, use its limit parameter to cap the number of parallel executions:
async.eachLimit(docPaths, 100, function(docPath, callback) {
    var run = [];
    // some command using docPath variable here..
    run.push(command);
    debugger;
    exec(run.join(' '), function(error, stdout, stderr) {
        callback();
    });
}, callback);
Adjust by trial and error, replacing 100 with a value suitable for your system.
