fetching 'rsync' output with nodejs child_process.exec callback - node.js

Currently I'm failing to fetch the rsync output when calling nodejs child_process.exec with a callback function, like in this snippet:
var sys = require('sys'),
    exec = require('child_process').exec;
var cmd = 'rsync -rpz test/test-files/one.txt jloos#test.mygnia.de:~/remote-test/a/b/';
exec(cmd, function (error, stdio, stderr) {
  sys.print('s: ' + stdio + '\n');
  sys.print('e: ' + stderr + '\n');
});
I think this is caused by the specific behavior of rsync. rsync communicates with its counterpart via the terminal. So how can I fetch the messages from rsync, if that is even possible?
When I use cmd = 'ls -la' I get the expected output.
Thanks

Often stdout is buffered when the program isn't running in a virtual terminal.
Many languages have a pty module which will trick the program into behaving as though it is running in a terminal.
This library provides that functionality for Node.js:
https://github.com/chjj/pty.js
Keep in mind that rsync may be writing lots of special characters or using something like ncurses to provide the updating status messages, which may make it more difficult to work with the output.
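For illustration, here is a minimal sketch of that approach with pty.js, following the spawn/on('data') pattern shown in its README; the rsync arguments are simply the ones from the question, so treat this as an untested outline rather than a verified fix:

var pty = require('pty.js');

// Run rsync inside a pseudo-terminal so it believes it has a real TTY.
var term = pty.spawn('rsync', [
  '-rpz',
  'test/test-files/one.txt',
  'jloos#test.mygnia.de:~/remote-test/a/b/'
], {
  name: 'xterm-color',
  cols: 80,
  rows: 30
});

term.on('data', function (data) {
  // Progress output arrives here as it would on screen, possibly
  // interleaved with carriage returns and escape sequences.
  console.log('pty: ' + data);
});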

Related

How to pass or output data from a ruby script to a node.js script?

I am looking for a way to pass or output data from one script to another script so that the latter can execute itself with the output that came from the first one.
Basically, I have a ruby script with some instructions in it and I want to pass (or output...) the result of the ruby script to a node.js script.
I would like help (and examples...) on how to achieve this, and/or recommendations for techniques or technologies I might never have heard of that might do the trick.
Thank you.
You can use child_process's exec to execute a script and handle its output.
Ruby Script
# example.rb
puts "hello world"
Node Script
// example.js
const exec = require('child_process').exec
exec('ruby example.rb', function(err, stdout, stderr) {
  console.error(err)
  console.error('stderr: ' + stderr)
  console.log('stdout: ' + stdout) // logs "hello world"
});
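If you need to pass more structured data than a single line, one option (an addition here, not part of the original answer) is to have the Ruby script print JSON and parse it on the Node side. The sketch below assumes example.rb were changed to require 'json' and puts({ greeting: 'hello world' }.to_json):

// example-json.js -- hypothetical variant of the snippet above
const exec = require('child_process').exec

exec('ruby example.rb', function(err, stdout, stderr) {
  if (err) {
    console.error('stderr: ' + stderr)
    return
  }
  const data = JSON.parse(stdout) // turn the Ruby output into a JS object
  console.log(data.greeting)      // logs "hello world"
});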

shelljs performance is slow

I have been using shelljs.
On my super fast system I execute this:
var shell = require('shelljs')
const exec = require('child_process').exec

console.time('shell mktemp -d')
shell.exec('mktemp -d', {silent: true})
console.timeEnd('shell mktemp -d')

console.time('child exec mktemp -d')
exec('mktemp -d', function(error, stdout, stderr) {
  if (error) {
    console.error('stderr', stderr)
    throw error
  }
  console.log('exec stdout', stdout)
  console.timeEnd('child exec mktemp -d')
})
It's giving the following execution times:
shell mktemp -d: 208.126ms
exec stdout /tmp/tmp.w22tyS5Uyu
child exec mktemp -d: 48.812ms
Why is shelljs 4 times slower? Any thoughts?
Your code example compares async child_process.exec() with sync shell.exec(), which isn't entirely a fair comparison. I think you'll find shell.exec(..., { async: true }) performs a bit better. This is because sync shell.exec() does extra work to provide real-time stdio while still capturing stdout, stderr, and the return code as part of its return value; async shell.exec() can provide the same feature mostly for free.
Even with { silent: true }, the extra work is still necessary. shell.exec() is built on top of child_process.execSync(), which only returns stdout, so we need to perform that extra work in order to return the exit code and stderr as well.
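As a rough sketch of the async form the answer refers to (timings will of course vary; the options shown are shelljs's documented silent and async flags):

const shell = require('shelljs')

console.time('shell async mktemp -d')
shell.exec('mktemp -d', { silent: true, async: true }, function (code, stdout, stderr) {
  console.log('exit code', code)
  console.log('stdout', stdout.trim())
  console.timeEnd('shell async mktemp -d')
})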
Have a look at how shelljs is implemented:
It relies entirely on the Node.js fs library. That library is cross-platform and written in C++, but it is not as performant as plain C. More generally, you can't get C-level performance from JS...
Another thing, abstraction layers:
You're using exec(command), where the command is handed more or less straight to the system (a Linux shell here, I think). The machine forks a process and executes the command in it.
When using shelljs, there are many mechanisms in place to stay cross-platform, to keep your command abstracted as a function, and to keep the result in a variable. See the code of exec in shelljs:
https://github.com/shelljs/shelljs/blob/master/src/exec.js
It is not really doing the same thing as your line of code.
Hope that helps!

How to get mongo shell output (three dots) for an unterminated command

When you type an unterminated command in a mongo shell, it returns three dots indicating it needs more input to complete the command, like below:
> db.test.find(
... {
...
I am using nodejs child_process.spawn to create a mongo shell process and listen to its output. I can get the standard and error output from the mongo shell, but I can't get the ... output. Below is my nodejs code:
const shell = spawn('mongo', params);
shell
  .stdout
  .on('data', (data) => {
    winston.debug('get output ' + data);
  });
shell
  .stderr
  .on('data', (data) => {
    const output = data + '';
    winston.error('get error output ', output);
  });
I run the code below to send a command to the shell:
shell.stdin.write('db.test.find(');
I wonder why I can't get the ... output with the above method. Is it some kind of special output?
EDIT1
I tried to use node-pty and pty.js. They can get the ... output, but they mix the input and output data together, and it is not possible to separate them.
I also tried using stdbuf and unbuffer to disable buffering, but it still doesn't work.
It seems that nodejs child_process doesn't work well with interactive commands.
Your code doesn't include anything that writes to the stdin of your child process, so I would be surprised if you got the ellipsis that indicates an incomplete command when in fact you don't send any command at all - incomplete or otherwise.
That having been said, many command line utilities behave differently when they discover a real terminal connected to their stdin/stdout. E.g. git log will page the results when you run it directly but not when you pipe the results to some other command like git log | cat, so this may also be the case here.
This can also have to do with buffering - if your stream is line-buffered, then you won't see any line that doesn't end with a newline right away.
The real question is: do you see the > prompt? Do you send any command to the mongo shell?
Scripting interactive CLI tools can be tricky. E.g. see what I had to do to test a very simple interactive program here:
https://github.com/rsp/rsp-pjc-c01/blob/master/test-z05.sh#L8-L16
I had to create two named pipes, make sure that stdin, stderr and stdout are not buffered, and then use some other tricks to make it work. It is a shell script but it's just to show you an example.
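As a Node-side illustration of the same idea (and only a sketch: as the question's edit notes, a pseudo-terminal echoes your input back, so the ... prompt arrives mixed with the echoed command and has to be filtered out), node-pty can be used roughly like this:

const pty = require('node-pty');

// Spawn mongo inside a pseudo-terminal so it behaves as it would interactively.
const shell = pty.spawn('mongo', [], { name: 'xterm-color', cols: 80, rows: 30 });

shell.onData((data) => {
  // This receives everything a terminal would display: the > prompt,
  // the echoed input and the ... continuation prompt.
  console.log('pty output: ' + JSON.stringify(data));
});

shell.write('db.test.find(\r');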

Cannot Launch `gksudo` using NodeJs `exec`

Goal
I want to show a graphical password prompt in nodejs to elevate privileges and thus gain the power to copy the content of one file into another, where the latter is owned by root.
In the implementation, I try to execute dd along with its arguments through gksudo, using the exec() function.
var exec = require('child_process').exec;

var printall = function (error, stdout, stderr) {
  console.log('stdout: ' + stdout);
  console.log('stderr: ' + stderr);
  if (error) {
    console.log('exec err: ' + error);
  }
};

exec("gksudo dd if=/home/user/minor.txt of=/home/user/major.txt", printall);
Error
But it always fails, with no good reason given.
It said,
stdout:
stderr:
exec err: Error: Command failed: /bin/sh -c gksudo dd if=/home/user/minor.txt of=/home/user/major.txt
If I reproduce the command in a terminal, it is missing the double quotes and ends up running gksudo only. Well, in nodejs, it simply fails.
Notes
I'm originally developing an Atom package. It's my first time, so I just found out about the different version of Node (or io.js?) it uses. I execute the whole code inside Atom.
Question
If you expect a clear question, it is possibly this:
How do I execute gksudo within node.js so that it runs another command along with its arguments?
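No accepted answer is recorded here, but given the observation above that the reproduced command is missing its double quotes, one sketch worth trying (an assumption, not a verified fix) is to quote the elevated command so gksudo receives it as a single argument:

var exec = require('child_process').exec;

// Hypothetical variant: pass the whole dd invocation as one quoted argument.
var cmd = 'gksudo "dd if=/home/user/minor.txt of=/home/user/major.txt"';

exec(cmd, function (error, stdout, stderr) {
  console.log('stdout: ' + stdout);
  console.log('stderr: ' + stderr);
  if (error) {
    console.log('exec err: ' + error);
  }
});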

Making lftp write to stdout without having to close the process first

I'm trying to wrap the lftp program in a node.js application, using child_process. The problem is that lftp doesn't write its output to stdout, so I cannot catch its output in node.js. Sample code:
var proc = require('child_process').spawn('lftp', ['-p', port, '-u', username + ',' + password, host]);

proc.stdout.on('data', function (data) {
  console.log('stdout:', data.toString('utf-8'));
});

proc.on('exit', function (code) {
  console.log('process exited with code ' + code);
});

proc.stdin.write('ls');
// proc.stdin.end();
If I uncomment the line that calls stdin.end() on the lftp child process, the output from the ls command appears in my terminal as it should. If I don't, the process simply hangs and nothing gets output.
I've also tried using unbuffer, but it doesn't seem to allow me to write to lftp's stdin anymore. It outputs the usual "[Resolving host address...]" stuff, but not the output from the ls command.
My question is: what do I have to do to be able to interact with lftp using node.js' child_process?
Well, this was dumb. I forgot to write a newline after the ls command to stdin. It seems to work without the need for unbuffer.
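In other words, the only change needed to the snippet above is:

proc.stdin.write('ls\n'); // terminate the command with a newline so lftp executes it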
