How to handle multiple async requests within a Grunt custom task? - node.js

I am working on a Grunt task that installs the latest WordPress plugins from the WordPress SVN repo; this is done via a command-line task.
Ideally I would like this done synchronously so that I can see output as each plugin installs (via svn co), but it seems like exec simply fires off the command and doesn't wait. Using var done = this.async() and done(error) works well with a single async action, but not with multiple like in this case. What am I missing?
grunt.registerTask('install-plugin', 'Get latest versions of required plugins', function (p) {
  var exec = require('child_process').exec;
  var plugins = p ? [p] : grunt.config.get('wp_plugins');
  var pattern = 'svn co http://plugins.svn.wordpress.org/%s/tags/$(curl -s http://plugins.svn.wordpress.org/%s/trunk/readme.txt | grep "Stable tag:" | sed -E "s/[Ss]table tag: *//")/ plugins/%s';
  var done = this.async();
  plugins.forEach(function (plugin) {
    // using split/join instead of util.format('foo%s', 'bar')
    var cmd = pattern.split("%s").join(plugin);
    exec(cmd, function (error, stdout, stderr) {
      grunt.log.writeln('Installing WordPress Plugin: "' + plugin + '"');
      grunt.log.writeln(stdout);
      done(error);
    });
  });
});

Your problem here is that you are calling done (aka the callback returned by this.async()) on every pass through the loop, but Grunt doesn't know you have a loop, so the first call ends the task. You need your own bookkeeping that tracks when all of the async commands have finished, and only then call Grunt's done once; see the sketch below.
Alternatively, searching the web for "node exec synchronous" will turn up plenty of ways to run the commands synchronously, if that's what you actually want.
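A minimal sketch of that counting approach, assuming the same plugins array and svn co pattern as in the question:
grunt.registerTask('install-plugin', 'Get latest versions of required plugins', function (p) {
  var exec = require('child_process').exec;
  var plugins = p ? [p] : grunt.config.get('wp_plugins');
  var pattern = 'svn co ...'; // same svn co / curl pattern as in the question
  var done = this.async();
  var pending = plugins.length;
  var failed = false;
  plugins.forEach(function (plugin) {
    var cmd = pattern.split('%s').join(plugin);
    exec(cmd, function (error, stdout, stderr) {
      grunt.log.writeln('Installing WordPress Plugin: "' + plugin + '"');
      grunt.log.writeln(stdout);
      if (error) { failed = true; }
      pending -= 1;
      // signal grunt exactly once, after the last plugin has finished
      if (pending === 0) { done(!failed); }
    });
  });
});
On newer Node versions, child_process.execSync in a plain loop is another option if you really want the commands to run one at a time.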

You are also calling done before all of the output has been received. Listen for events on the child process that exec returns, so you are notified when each command has actually finished... something along these lines:
var child = exec(cmd);
child.stdout.on('data', function (data) {
  grunt.log.writeln('Installing WordPress Plugin: "' + plugin + '"');
  grunt.log.writeln(data);
});
child.stderr.on('data', function (data) {
  grunt.log.error(data);
});
child.on('close', function (code) {
  done(code === 0);
});

Related

Unable to trigger NodeJS based BATCH/Exe file using jenkins

I'm trying to execute my batch/exe file through a NodeJS script using:
var child_process = require('child_process');
i.e.
child_process.execFile for the exe
child_process.exec for the batch file
When I try to execute my scripts using the following TWO methods:
Triggering via CMD, it gets executed successfully.
Triggering via Jenkins, it does NOT get executed.
In both cases the working directory is the same.
I have been using the following function for this purpose:
exports.exec_exe_file = exec_exe_file = function(exe) {
  //child_process.execFile(exe, function(error, stdout, stderr) {
  child_process.exec(exe, function(error, stdout, stderr) {
    if (error) {
      console.error('stderr', stderr);
      throw error;
    }
    //console.log('stdout', stdout);
  });
};
Called as:
var autoit = __dirname + "\\autoit\\start_AutoitExe.bat";
//var autoit = __dirname + '\\autoit\\Script.exe';
exec_exe_file(autoit);

How to run shell script file using nodejs?

I need to run a shell script file using Node.js that executes a set of Cassandra DB commands. Can anybody please help me with this?
Inside the db.sh file:
create keyspace dummy with replication = {'class':'SimpleStrategy','replication_factor':3}
create table dummy (userhandle text, email text primary key , name text,profilepic)
You could use "child process" module of nodejs to execute any shell commands or scripts with in nodejs. Let me show you with an example, I am running a shell script(hi.sh) with in nodejs.
hi.sh
echo "Hi There!"
node_program.js
const { exec } = require('child_process');
var yourscript = exec('sh hi.sh',
  (error, stdout, stderr) => {
    console.log(stdout);
    console.log(stderr);
    if (error !== null) {
      console.log(`exec error: ${error}`);
    }
  });
Here, when I run the Node.js file, it executes the shell script, and the output is:
Run
node node_program.js
output
Hi There!
You can execute any script just by passing the shell command or the script's path to exec.
You can execute any shell command using the shelljs module
const shell = require('shelljs')
shell.exec('./path_to_your_file')
you can go:
var cp = require('child_process');
and then:
cp.exec('./myScript.sh', function(err, stdout, stderr) {
// handle err, stdout, stderr
});
to run a command in your $SHELL.
Or go
var child = cp.spawn('./myScript.sh', [args]);
to run a file WITHOUT a shell. Note that spawn doesn't take a completion callback; you read its output through event listeners (see the EDIT below).
Or go
cp.execFile('./myScript.sh', [args], function(err, stdout, stderr) {
  // handle err, stdout, stderr
});
which works like cp.exec() but doesn't spawn a shell.
You can also go
var child = cp.fork('myJS.js');
to run a javascript file with node.js, but in a child process (for big programs). fork doesn't take an output callback either; you talk to the child through child.send() and child.on('message', ...).
EDIT
You might also have to read the child's stdout and stderr with event listeners, e.g.:
var child = cp.spawn('./myScript.sh', [args]);
child.stdout.on('data', function(data) {
// handle stdout as `data`
});
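For completeness, a sketch along the same lines (assuming the same myScript.sh) that also captures stderr and the exit code:
var child = cp.spawn('./myScript.sh', [args]);
child.stdout.on('data', function(data) {
  // handle stdout as `data`
});
child.stderr.on('data', function(data) {
  // handle stderr as `data`
});
child.on('close', function(code) {
  // `code` is the script's exit code
});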
Also, you can use shelljs plugin.
It's easy and it's cross-platform.
Install command:
npm install [-g] shelljs
What is ShellJS
ShellJS is a portable (Windows/Linux/OS X) implementation of Unix
shell commands on top of the Node.js API. You can use it to eliminate
your shell script's dependency on Unix while still keeping its
familiar and powerful commands. You can also install it globally so
you can run it from outside Node projects - say goodbye to those
gnarly Bash scripts!
An example of how it works:
var shell = require('shelljs');
if (!shell.which('git')) {
shell.echo('Sorry, this script requires git');
shell.exit(1);
}
// Copy files to release dir
shell.rm('-rf', 'out/Release');
shell.cp('-R', 'stuff/', 'out/Release');
// Replace macros in each .js file
shell.cd('lib');
shell.ls('*.js').forEach(function (file) {
shell.sed('-i', 'BUILD_VERSION', 'v0.1.2', file);
shell.sed('-i', /^.*REMOVE_THIS_LINE.*$/, '', file);
shell.sed('-i', /.*REPLACE_LINE_WITH_MACRO.*\n/, shell.cat('macro.js'), file);
});
shell.cd('..');
// Run external tool synchronously
if (shell.exec('git commit -am "Auto-commit"').code !== 0) {
shell.echo('Error: Git commit failed');
shell.exit(1);
}
Also, you can use it from the command line via the shx wrapper:
$ shx mkdir -p foo
$ shx touch foo/bar.txt
$ shx rm -rf foo

Automatic writing text to the console using Node.js

I need to clone GitHub repository using SSH and Node.js script:
var exec = require('child_process').exec;
exec('git clone git@github.com:jquery/jquery.git',
function (error, stdout, stderr) {
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if (error !== null) {
console.log('exec error: ' + error);
}
}
);
If github.com is not in the known_hosts file, SSH forces you to enter "yes" in response to the question "Are you sure you want to continue connecting (yes/no)?".
How can I automate the input of this text?
P.S. I know about StrictHostKeyChecking=no, but I need to clone repository without changing SSH config.
Sure, that is entirely possible. When you call child_process.exec, it actually returns a ChildProcess object. That object has a .stdin property, which is a Writable Stream you can pipe to or write to. See the documentation on ChildProcess.stdin and on Writable Stream.
Here is some example code that relates to your question:
var exec = require('child_process').exec;
var cmd = exec('git clone git@github.com:jquery/jquery.git', function (error, stdout, stderr) {
// ...
});
cmd.stdin.write("yes");
You could pipe yes into your git clone command:
exec('yes | git clone git@github.com:jquery/jquery.git'...

Using grunt to run a node server and do cleanup after

So basically this is what I want to do: have a grunt script that compiles my coffee files to JS, then run the node server, and then, either after the server closes or while it's still running, delete the JS files that were the result of the compilation and keep only the .coffee ones.
I'm having a couple of issues getting it to work. Most importantly, the way I'm currently doing it is this:
grunt.loadNpmTasks("grunt-contrib-coffee");
grunt.registerTask("node", "Starting node server", function () {
  var done = this.async();
  console.log("test");
  var sp = grunt.util.spawn({
    cmd: "node",
    args: ["index"]
  }, function (err, res, code) {
    console.log(err, res, code);
    done();
  });
});
grunt.registerTask("default", ["coffee", "node"]);
The problem here is that the node server isn't run in the same process as grunt. This matters because I can't just hit CTRL-C once to terminate JUST the node server.
Ideally, I'd like to have it run in the same process and have the grunt script pause while it's waiting for me to CTRL-C the server. Then, after it's finished, I want grunt to remove the said files.
How can I achieve this?
Edit: Note that the snippet doesn't have the actual removal implemented since I can't get this to work.
If you keep the variable sp in a more global scope, you can define a task node:kill that simply checks whether sp === null (or similar) and, if not, calls sp.kill(). Then you can simply run the node:kill task after your testing task. You could additionally invoke a separate task that just deletes the generated JS files; a rough sketch is below.
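A rough sketch of that idea (node:kill is just a placeholder name):
var sp = null; // keep the spawn handle in a scope both tasks can see

grunt.registerTask("node", "Starting node server", function () {
  var done = this.async();
  sp = grunt.util.spawn({
    cmd: "node",
    args: ["index"]
  }, function (err, res, code) {
    sp = null;
    done();
  });
});

grunt.registerTask("node:kill", "Stopping node server", function () {
  if (sp !== null) {
    sp.kill();
  }
});
You would then run node:kill after your testing task and could follow it with another task that deletes the generated JS files.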
For something similar I used grunt-shell-spawn in conjunction with a shutdown listener.
In your grunt initConfig:
shell: {
  runSuperCoolJavaServer: {
    command: 'java -jar mysupercoolserver.jar',
    options: {
      async: true //spawn it instead!
    }
  }
},
Then outside of initConfig, you can set up a listener for when the user ctrl+c's out of your grunt task:
grunt.registerTask("superCoolServerShutdownListener",function(step){
var name = this.name;
if (step === 'exit') process.exit();
else {
process.on("SIGINT",function(){
grunt.log.writeln("").writeln("Shutting down super cool server...");
grunt.task.run(["shell:runSuperCoolJavaServer:kill"]); //the key!
grunt.task.current.async()();
});
}
});
Finally, register the tasks
grunt.registerTask('serverWithKill', [
  'shell:runSuperCoolJavaServer',
  'superCoolServerShutdownListener'
]);

Redirecting output to a log file using node.js

I have a child process that I am using as follows in node.js. Instead of printing the output to the console, I would like to write it to a log file located somewhere on the machine this is running on (and this should work for both Windows and Mac).
The code below is what I am using. What changes are needed to send the output to a log file? Thanks!
My Code:
var spawn = require('child_process').spawn,
  ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
});
ls.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});
ls.on('close', function (code) {
  console.log('child process exited with code ' + code);
});
Here's an example of logging to file using streams.
var fs = require('fs');
var logStream = fs.createWriteStream('./logFile.log', {flags: 'a'});
var spawn = require('child_process').spawn,
  ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.pipe(logStream);
ls.stderr.pipe(logStream);
ls.on('close', function (code) {
  console.log('child process exited with code ' + code);
});
There are two ways you can achieve this. One is to create two write streams:
let logConsoleStream = fs.createWriteStream('./logConsoleFile.log', {flags: 'a'});
let logErrorStream = fs.createWriteStream('./logErrorFile.log', {flags: 'a'});
and redirect the logs and errors like this:
ls.stdout.pipe(logConsoleStream); // redirect stdout/logs only
ls.stderr.pipe(logErrorStream); // redirect error logs only
By separating the log files you will have separate files for error logs and console logs.
This is essentially the same as what generalhenry shared above.
The second way is to use shell redirection on the command line. When you execute your node app from the command line:
node app/src/index.js
you can specify where you want the logs and errors from this application redirected. There are three redirection operators you can use:
`>` redirects only stdout (your logs) to the specified path
`2>` redirects only stderr (your error logs) to the specified path
`2>&1` sends stderr to wherever stdout is going, so both end up in a single file
For example:
node app/src/index.js > ./logsOnly.txt
node app/src/index.js 2> ./ErrorsOnly.txt
node app/src/index.js > ./consoleLogsAndErrors.txt 2>&1
I hope someone coming along later finds this helpful. If I have got anything wrong, please let me know; it will help me and others. Thanks.
If you run your JS script with forever, you have the option to define a log file as a parameter which will capture all your console.log messages, not to mention the benefit of keeping your Node.js app alive permanently.
For example:
sudo forever start -l /home/someuser/myapp/myapp.log -a myapp.js
Or use forever with the options below:
forever start -o out.log -e err.log server.js
The best answer was in the comments and is mentioned in a previous question here: stackoverflow.com/questions/2496710/nodejs-write-to-file
It is as follows:
var fs = require('fs');
fs.writeFile("/tmp/test", "Hey there!", function(err) {
if(err) {
console.log(err);
} else {
console.log("The file was saved!");
}
});
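Tying that back to the question, a sketch that appends each chunk of the child process's output to a file (the path here is just an example):
var fs = require('fs');
var spawn = require('child_process').spawn,
  ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.on('data', function (data) {
  // append each chunk so earlier writes aren't overwritten
  fs.appendFile('/tmp/child.log', data, function (err) {
    if (err) console.log(err);
  });
});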
