Node.js child_process not returning stdout

Not sure why this does not work. If I run a simple command such as cmd = 'ls -all', I get the output back, but when I use this to run a command which takes some time to complete, I don't get anything returned at all.
In this example I am using lftp to mirror some folders and want to capture the reply. If I run the command from the terminal I see the output as expected, but through child_process I get nothing:
var childProcess = require('child_process');
var cmd = 'lftp sftp://user:password@somehost -e "mirror -R --delete --parallel=5 /usr/share/scripts/ /volumes/folders/usr/share/;bye"';
childProcess.exec(cmd, function (error, stdout, stderr) {
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
    console.log('error: ' + error);
});
I also tried the spawn method, but nothing was returned from that either:
var spawn = require('child_process').spawn;
var lftp = spawn('lftp', ['sftp://user:password@somehost', '-e', 'mirror -R --delete --parallel=5 /usr/share/scripts/ /volumes/folders/usr/share/;bye']);
lftp.stdout.on('data', function (data) {
    console.log(data.toString());
});

I think the problem is that lftp buffers its output.
To get around that you'll need to use unbuffer (http://expect.sourceforge.net/example/unbuffer.man.html) to send the output directly to stdout.
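For example (a sketch, assuming unbuffer from the expect package is installed and on your PATH), you would simply prefix the command:
var childProcess = require('child_process');
// unbuffer makes lftp think it is writing to a terminal, so it does not buffer its output
var cmd = 'unbuffer lftp sftp://user:password@somehost -e "mirror -R --delete --parallel=5 /usr/share/scripts/ /volumes/folders/usr/share/;bye"';
childProcess.exec(cmd, function (error, stdout, stderr) {
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
});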

Related

How to run a shell script file using nodejs?

I need to run a shell script file using Node.js that executes a set of Cassandra DB commands. Can anybody please help me with this?
Inside the db.sh file:
create keyspace dummy with replication = {'class':'SimpleStrategy','replication_factor':3};
create table dummy (userhandle text, email text primary key, name text, profilepic text);
You can use the "child_process" module of Node.js to execute shell commands or scripts from within Node.js. Let me show you with an example: I am running a shell script (hi.sh) from within Node.js.
hi.sh
echo "Hi There!"
node_program.js
const { exec } = require('child_process');

var yourscript = exec('sh hi.sh', (error, stdout, stderr) => {
    console.log(stdout);
    console.log(stderr);
    if (error !== null) {
        console.log(`exec error: ${error}`);
    }
});
Here, when I run the Node.js file, it executes the shell script, and the output is:
Run:
node node_program.js
Output:
Hi There!
You can execute any shell command or shell script just by passing it to exec.
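For example, a minimal sketch (the command and path here are arbitrary) showing a plain shell command instead of a script file:
const { exec } = require('child_process');

// run a shell command directly; the callback receives the buffered output
exec('ls -la /tmp', (error, stdout, stderr) => {
    if (error) {
        console.error(`exec error: ${error}`);
        return;
    }
    console.log(stdout);
});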
You can execute any shell command using the shelljs module:
const shell = require('shelljs')
shell.exec('./path_to_your_file')
You can go:
var cp = require('child_process');
and then:
cp.exec('./myScript.sh', function(err, stdout, stderr) {
    // handle err, stdout, stderr
});
to run a command in a shell (/bin/sh by default).
Or go
var child = cp.spawn('./myScript.sh', [args]);
// spawn() has no callback; listen to the child's streams instead (see EDIT below)
to run a file WITHOUT a shell.
Or go
cp.execFile();
which is like cp.exec() but runs the file directly without spawning a shell, so shell syntax (pipes, globs, etc.) is not available.
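A minimal execFile sketch (the script path and arguments are illustrative); note that arguments go in an array because there is no shell to parse them:
var cp = require('child_process');

cp.execFile('./myScript.sh', ['arg1', 'arg2'], function (err, stdout, stderr) {
    if (err) {
        console.error(err);
        return;
    }
    console.log(stdout);
});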
You can also go
var child = cp.fork('myJS.js');
to run a JavaScript file with Node.js, but in a child process (for big programs). fork() doesn't take an (err, stdout, stderr) callback; it returns the child process with an IPC channel, so you communicate via child.send() and 'message' events.
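A minimal fork sketch (file names are illustrative), showing the IPC channel:
// parent.js
var cp = require('child_process');
var child = cp.fork('myJS.js');          // runs myJS.js in a separate Node.js process

child.on('message', function (msg) {     // messages sent by the child via process.send()
    console.log('from child:', msg);
});
child.send({ hello: 'child' });          // the child receives this via process.on('message', ...)

// myJS.js (the forked script) would contain something like:
// process.on('message', function (msg) { process.send({ got: msg }); });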
EDIT
You might also have to access stdout and stderr with event listeners, e.g.:
var child = cp.spawn('./myScript.sh', [args]);
child.stdout.on('data', function (data) {
    // handle stdout as `data`
});
Also, you can use the shelljs module.
It's easy to use and it's cross-platform.
Install command:
npm install [-g] shelljs
What is shellJS
ShellJS is a portable (Windows/Linux/OS X) implementation of Unix
shell commands on top of the Node.js API. You can use it to eliminate
your shell script's dependency on Unix while still keeping its
familiar and powerful commands. You can also install it globally so
you can run it from outside Node projects - say goodbye to those
gnarly Bash scripts!
An example of how it works:
var shell = require('shelljs');
if (!shell.which('git')) {
shell.echo('Sorry, this script requires git');
shell.exit(1);
}
// Copy files to release dir
shell.rm('-rf', 'out/Release');
shell.cp('-R', 'stuff/', 'out/Release');
// Replace macros in each .js file
shell.cd('lib');
shell.ls('*.js').forEach(function (file) {
shell.sed('-i', 'BUILD_VERSION', 'v0.1.2', file);
shell.sed('-i', /^.*REMOVE_THIS_LINE.*$/, '', file);
shell.sed('-i', /.*REPLACE_LINE_WITH_MACRO.*\n/, shell.cat('macro.js'), file);
});
shell.cd('..');
// Run external tool synchronously
if (shell.exec('git commit -am "Auto-commit"').code !== 0) {
shell.echo('Error: Git commit failed');
shell.exit(1);
}
You can also use it from the command line via the companion shx package:
$ shx mkdir -p foo
$ shx touch foo/bar.txt
$ shx rm -rf foo

Command not called, anything wrong with this spawn syntax?

When I run this pidof command by hand, it works. Then I put it into my server.js:
// send signal to start the install script
var spw = cp.spawn('/sbin/pidof', ['-x', 'wait4signal.py', '|', 'xargs', 'kill', '-USR1']);
spw.stderr.on('data', function (data) {
    res.write('----- Install Error !!! -----\n');
    res.write(data.toString());
    console.log(data.toString());
});
spw.stdout.on('data', function (data) {
    res.write('----- Install Data -----\n');
    res.write(data.toString());
    console.log(data.toString());
});
spw.on('close', function (data) {
    res.end('----- Install Finished, please go to status page !!! -----\n');
    console.log('88');
});
In the web page I only see "----- Install Finished, please go to status page !!!". My install script never seems to get the USR1 signal. Is anything wrong here?
The problem is that you have two separate commands. You are piping the output of your /sbin/pidof command to the input stream of your xargs command. If you are using spawn (rather than exec, which takes a string exactly as you would write it on the command line), you need to spawn one process per command, because spawn does not interpret shell operators like |.
Spawn your processes like this:
const { spawn } = require('child_process');
const pidof = spawn('/sbin/pidof', ['-x', 'wait4signal.py']);
const xargs = spawn('xargs', ['kill', '-USR1']);
Now pipe the output of the first process to the input of the second, like so:
pidof.stdout.pipe(xargs.stdin);
Now you can listen to events on your xargs process, like so:
xargs.stdout.on('data', data => {
    console.log(data.toString());
});
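Putting the pieces together (a sketch under the question's assumptions, i.e. wait4signal.py is running and handles USR1), with basic error reporting added:
const { spawn } = require('child_process');

const pidof = spawn('/sbin/pidof', ['-x', 'wait4signal.py']);
const xargs = spawn('xargs', ['kill', '-USR1']);

pidof.stdout.pipe(xargs.stdin);   // pidof's PID list becomes xargs' input

pidof.stderr.on('data', data => console.error('pidof: ' + data));
xargs.stderr.on('data', data => console.error('xargs: ' + data));

xargs.on('close', code => {
    console.log('kill -USR1 finished with exit code ' + code);
});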

How to output live console.log with exec in node

Is there a way to run a command line command from within a node app and get the output to be live?
eg:
var exec = require('child_process').exec;
var fs = require('fs');

exec('nightwatch --config nightwatch_dev.json', function (error, stdout, stderr) {
    console.log(stdout);
});
or:
var exec = require('child_process').exec;
var fs = require('fs');

exec('rsync -avz /some/folder/ john@8.8.8.8:/some/folder/', function (error, stdout, stderr) {
    console.log(stdout);
});
There are many, many instances where it would be nice and easy to script something up in node, but the output is only dumped to the terminal after the command has finished.
Cheers
J
If you want results as they occur, then you should use spawn() instead of exec(). exec() buffers the output and then gives it to you all at once when the process has finished. spawn() returns an event emitter and you get the output as it happens.
There are examples in the Node.js docs for child_process.spawn().
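For example, a sketch adapted to the rsync command from the question (host and paths are the question's placeholders); the output arrives as it is produced:
var spawn = require('child_process').spawn;

// equivalent of: rsync -avz /some/folder/ john@8.8.8.8:/some/folder/
var child = spawn('rsync', ['-avz', '/some/folder/', 'john@8.8.8.8:/some/folder/']);

child.stdout.on('data', function (data) {
    process.stdout.write(data);          // print output as it arrives
});
child.stderr.on('data', function (data) {
    process.stderr.write(data);
});
child.on('close', function (code) {
    console.log('rsync exited with code ' + code);
});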

Use child_process.execSync but keep output in console

I'd like to use the execSync method, which was added in Node.js 0.12, but still have the output in the console window from which I ran the Node script.
E.g. if I run a Node.js script which has the following line, I'd like to see the full output of the rsync command "live" inside the console:
require('child_process').execSync('rsync -avAXz --info=progress2 "/src" "/dest"');
I understand that execSync returns the output of the command and that I could print that to the console after execution, but this way I don't have "live" output...
You can pass the parent's stdio to the child process if that's what you want:
require('child_process').execSync(
    'rsync -avAXz --info=progress2 "/src" "/dest"',
    { stdio: 'inherit' }
);
You can simply use .toString().
var result = require('child_process').execSync('rsync -avAXz --info=progress2 "/src" "/dest"').toString();
console.log(result);
Edit: Looking back on this, I've realised that it doesn't actually answer the specific question because it doesn't show the output to you 'live' — only once the command has finished running.
However, I'm leaving this answer here because I know quite a few people come across this question just looking for how to print the result of the command after execution.
Unless you redirect stdout and stderr as the accepted answer suggests, this is not possible with execSync or spawnSync. Without redirecting stdout and stderr those commands only return stdout and stderr when the command is completed.
To do this without redirecting stdout and stderr, you are going to need to use spawn, but it's pretty straightforward:
var spawn = require('child_process').spawn;

// kick off process of listing files
var child = spawn('ls', ['-l', '/']);

// spit stdout to screen
child.stdout.on('data', function (data) { process.stdout.write(data.toString()); });

// spit stderr to screen
child.stderr.on('data', function (data) { process.stdout.write(data.toString()); });

child.on('close', function (code) {
    console.log("Finished with code " + code);
});
I used an ls command that lists files so that you can test it quickly. spawn takes as its first argument the executable name you are trying to run, and as its second argument an array of strings representing each parameter you want to pass to that executable.
However, if you are set on using execSync and can't redirect stdout or stderr for some reason, you can open up another terminal like xterm and pass it a command like so:
var execSync = require('child_process').execSync;
execSync("xterm -title RecursiveFileListing -e ls -latkR /");
This will allow you to see what your command is doing in the new terminal but still have the synchronous call.
Simply:
const { execSync } = require('child_process');

try {
    const cmd = 'git rev-parse --is-inside-work-tree';
    execSync(cmd).toString();
} catch (error) {
    console.log(`Status Code: ${error.status} with '${error.message}'`);
}
Ref: https://stackoverflow.com/a/43077917/104085
// Node.js
var execSync = require('child_process').execSync;
// or, with destructuring (also works in TypeScript)
const { execSync } = require("child_process");

try {
    const cmd = 'git rev-parse --is-inside-work-tree';
    execSync(cmd).toString();
} catch (error) {
    error.status;  // exit code of the command; 0 means success, so inside this catch it is greater than 0
    error.message; // Holds the message you typically want.
    error.stderr;  // Holds the stderr output. Use `.toString()`.
    error.stdout;  // Holds the stdout output. Use `.toString()`.
}
When the command runs successfully, add { encoding: "utf8" } to the options so execSync returns a string instead of a Buffer:
execSync(`pwd`, {
    encoding: "utf8"
})

Running a shell command from Node.js without buffering output

I'm trying to launch a shell command from Node.js, without redirecting that command's input and output -- just like shelling out to a command using a shell script, or using Ruby's system command. If the child process wants to write to STDOUT, I want that to go straight to the console (or get redirected, if my Node app's output was redirected).
Node doesn't seem to have any straightforward way to do this. It looks like the only way to run another process is with child_process, which always redirects the child process's input and output to pipes. I can write code to accept data from those pipes and write it to my process's STDOUT and STDERR, but if I do that, the APIs force me to sacrifice some flexibility.
I want two features:
Shell syntax. I want to be able to pipe output between commands, or run Windows batch files.
Unlimited output. If I'm shelling out to a compiler and it wants to generate megabytes of compiler warnings, I want them all to scroll across the screen (until the user gets sick of it and hits Ctrl+C).
It looks like Node wants to force me to choose between those two features.
If I want an unlimited amount of output, I can use child_process.spawn and then do child.stdout.on('data', function(data) { process.stdout.write(data); }); and the same thing for stderr, and it'll happily pipe data until the cows come home. Unfortunately, spawn doesn't support shell syntax.
If I want shell syntax, I can use child_process.exec. But exec insists on buffering the child process's STDOUT and STDERR for me and giving them to me all at the end, and it limits the size of those buffers (configurable, 200K by default). I can still hook the on('data') events, if I want to see the output as it's generated, but exec will still add the data to its buffers too. When the amount of data exceeds the predefined buffer size, exec will terminate the child process.
(There's also child_process.execFile, which is the worst of both worlds from a flexibility standpoint: no shell syntax, but you still have to cap the amount of output you expect.)
Am I missing something? Is there any way to just shell out to a child process in Node, and not redirect its input and output? Something that supports shell syntax and doesn't crap out after a predefined amount of output, just like is available in shell scripts, Ruby, etc.?
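(For reference, the buffer cap described above is exec's maxBuffer option; a minimal sketch of raising it follows, though that only postpones the problem rather than removing it. The command name is a placeholder.)
var exec = require('child_process').exec;

// raise the cap to 10 MB; output beyond this still causes exec to kill the child
exec('some-noisy-command', { maxBuffer: 10 * 1024 * 1024 }, function (error, stdout, stderr) {
    console.log(stdout);
});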
You can inherit the parent's stdin/stdout/stderr streams via spawn's stdio option, so you don't need to pipe them manually:
var spawn = require('child_process').spawn;
spawn('ls', [], { stdio: 'inherit' });
Use a shell for shell syntax; for sh/bash it's the -c parameter, which reads the script from a string:
var spawn = require('child_process').spawn;
var shellSyntaxCommand = 'ls -l | grep test | wc -c';
spawn('sh', ['-c', shellSyntaxCommand], { stdio: 'inherit' });
To summarise:
var spawn = require('child_process').spawn;

function shspawn(command) {
    spawn('sh', ['-c', command], { stdio: 'inherit' });
}

shspawn('ls -l | grep test | wc -c');
You can replace exec with spawn and still use shell syntax simply with:
const { spawn } = require('child_process');

const cmd = 'ls -l | grep test | wc -c';
const p = spawn(cmd, [], { shell: true });

p.stdout.on('data', (data) => {
    console.log(data.toString());
});
The magic is just {shell: true}.
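If you also want the output to go straight to your own terminal instead of handling 'data' events yourself, you can combine this with the stdio option from the answer above (a small sketch):
const { spawn } = require('child_process');

// shell syntax plus inherited stdio: output streams directly to this process's terminal
spawn('ls -l | grep test | wc -c', [], { shell: true, stdio: 'inherit' });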
I haven't used it, but I've seen this library: https://github.com/polotek/procstreams
With it you'd do this. The .out() automatically pipes to the process's stdin/out.
var $p = require('procstreams');
$p('cat lines.txt').pipe('wc -l').out();
It doesn't support shell syntax, but that's pretty trivial to add, I think:
var command_str = "cat lines.txt | wc -l";
var cmds = command_str.split(/\s?\|\s?/);
var cmd = $p(cmds.shift());
while(cmds.length) cmd = cmd.pipe(cmds.shift());
cmd
    .out()
    .on('exit', function () {
        // Do whatever
    });
There's an example in the node docs for the child_process module:
Example of detaching a long-running process and redirecting its output to a file:
var fs = require('fs'),
    spawn = require('child_process').spawn,
    out = fs.openSync('./out.log', 'a'),
    err = fs.openSync('./out.log', 'a');

var child = spawn('prg', [], {
    detached: true,
    stdio: ['ignore', out, err]
});

child.unref();
