Piping Stdout of One NodeJS Process to Another

I have this code
var exec = require('child_process').exec;

(function(){
  exec('node ../../service2.js', function(err, stdout, stderr){
    stdout.pipe(process.stdin);
    process.stdout.write(process.stdin);
  });
})();
I'm trying to set things up so that only one service in the cluster does the logging. For some reason this won't write the other processes' logs. Is there something I'm missing, such as a read from stdin? I'd like to pipe as much as I can together.
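For what it's worth, the stdout argument in exec's callback is a buffered string, not a stream, so it has no pipe method. A minimal sketch of one way to stream a child's output into the parent, using spawn instead (the script path is kept from the question as a placeholder):

var spawn = require('child_process').spawn;

// Spawn the service and stream its output into this process as it
// arrives, rather than waiting for exec's buffered callback.
var child = spawn('node', ['../../service2.js']);

child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);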

Related

How to pass or output data from a ruby script to a node.js script?

I am looking for a way to pass or output data from one script to another script, so that the latter script can execute with the output that came from the first one.
Basically, I have a Ruby script with some instructions in it and I want to pass (or output...) the result of the Ruby script to a Node.js script.
I would like help (and examples...) on how to do this, and/or recommendations for techniques or technologies I might never have heard of that might do the trick.
Thank you.
You can use child_process's exec to execute a script and handle its output.
Ruby Script
# example.rb
puts "hello world"
Node Script
// example.js
const exec = require('child_process').exec

exec('ruby example.rb', function(err, stdout, stderr) {
  console.error(err)
  console.error('stderr: ' + stderr)
  console.log('stdout: ' + stdout) // logs "hello world"
});
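If the two scripts need to exchange structured data rather than a plain string, one common pattern (a sketch, assuming the Ruby script is changed to print JSON, e.g. puts({result: 42}.to_json)) is to parse stdout on the Node side:

const exec = require('child_process').exec

exec('ruby example.rb', function(err, stdout, stderr) {
  if (err) return console.error(err)
  // Assumes the Ruby script printed a single JSON document to stdout.
  const data = JSON.parse(stdout)
  console.log(data.result) // logs 42
})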

NodeJs spawn giving ENOENT error (Raspbian)

I'm having an error when spawning a Node.js script:
exec('node ./modules/buttons', function(error, stdout, stderr) {
  if (error) console.log(error);
  console.log(stdout);
  if (stderr) console.log(stderr);
});
exec works perfectly fine. However, spawn
var buttons = spawn('node ./modules/buttons.js', []);

buttons.stdout.on('data', function(data){
  console.log(data);
});
Gives me the following error:
spawn node ./modules/buttons.js ENOENT
Defining the absolute path to the script results in the same error. I would appreciate it if someone could help me resolve this; I have absolutely no clue what the cause could be, and Google isn't helping me either.
exec accepts the command to be executed along with all of its command line parameters; spawn, on the other hand, accepts the program to invoke and the command line arguments as an array.
In your case, Node.js is trying to execute a program called node ./modules/buttons.js, not node with ./modules/buttons.js as command line argument. That is why it is failing.
Quoting the example from the spawn docs,
const spawn = require('child_process').spawn;
const ls = spawn('ls', ['-lh', '/usr']);
The difference between exec and spawn is that exec will by default launch the command in a shell, whereas spawn simply invokes the program.
Note: as you are simply invoking a JavaScript file, you are better off using execFile.
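Applied to the question's code, a sketch of both fixes (the corrected spawn call, and the execFile alternative) might look like this:

var spawn = require('child_process').spawn;
var execFile = require('child_process').execFile;

// Corrected spawn: the program and its arguments are passed separately.
var buttons = spawn('node', ['./modules/buttons.js']);
buttons.stdout.on('data', function(data) {
  console.log(data.toString()); // data arrives as a Buffer
});

// Or with execFile, which runs the file directly without a shell
// and buffers the output for the callback.
execFile('node', ['./modules/buttons.js'], function(err, stdout, stderr) {
  if (err) return console.error(err);
  console.log(stdout);
});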

fetching 'rsync' output with nodejs child_process.exec callback

Currently I'm failing to fetch the rsync output when calling Node.js child_process.exec with a callback function, as in this snippet:
var sys = require('sys'),
    exec = require('child_process').exec;

cmd = 'rsync -rpz test/test-files/one.txt jloos@test.mygnia.de:~/remote-test/a/b/';

exec(cmd, function(error, stdio, stderr) {
  sys.print('s: ' + stdio + '\n');
  sys.print('e: ' + stderr + '\n');
});
I think this is caused by the specific behavior of rsync. rsync communicates with its counterpart via a terminal. So how can I fetch the messages from rsync, if that is even possible?
When I use cmd = 'ls -la' I get the expected output.
Thanks
Often stdout is buffered when the program isn't running in a virtual terminal.
Many languages have a pty module which will trick the program into behaving as though it is running in a terminal.
This library provides that functionality for Node.js: https://github.com/chjj/pty.js
Keep in mind that rsync may be writing lots of special characters or using something like ncurses to provide the updating status messages, which may make it more difficult to work with the output.
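A minimal sketch of that approach, assuming pty.js's spawn API as documented in its README (the rsync arguments are taken from the question, and user@host is a placeholder):

var pty = require('pty.js');

// Run rsync inside a pseudo-terminal so it believes it is attached to
// a TTY and writes its progress messages as it normally would.
var term = pty.spawn('rsync', ['-rpz', 'test/test-files/one.txt', 'user@host:~/remote-test/a/b/'], {
  name: 'xterm-color',
  cols: 80,
  rows: 30
});

term.on('data', function(data) {
  process.stdout.write(data);
});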

Making lftp write to stdout without having to close the process first

I'm trying to wrap the lftp program in a node.js application, using child_process. The problem is that lftp doesn't write its output to stdout, so I cannot catch its output in node.js. Sample code:
var proc = require('child_process').spawn('lftp', ['-p', port, '-u', username + ',' + password, host]);

proc.stdout.on('data', function (data) {
  console.log('stdout:', data.toString('utf-8'));
});

proc.on('exit', function (code) {
  console.log('process exited with code ' + code);
});

proc.stdin.write('ls');
// proc.stdin.end();
If I uncomment the line that calls stdin.end() on the lftp child process, the output from the ls command appears in my terminal as it should. If I don't, the process simply hangs and nothing gets output.
I've also tried using unbuffer, but it doesn't seem to allow me to write to lftp's stdin anymore. It outputs the usual "[Resolving host address...]" stuff, but not the output from the ls command.
My question is: what do I have to do to be able to interact with lftp using node.js' child_process?
Well, this was dumb. I forgot to write a newline after the ls command to stdin. It seems to work without the need for unbuffer.
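In code, the fix is just the newline:

// Terminate the command with a newline so lftp actually executes it,
// without having to close stdin first.
proc.stdin.write('ls\n');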

How can I use only core Node.js to check the filesystem status (like the 'df' command)?

I want to implement a Node.js program to check the status of filesystems (such as ext3). However, the fs module only provides operations on files. Must I use some third-party module?
One option would be to capture the output of the 'df' command and parse it.
You can run commands using child processes.
http://nodejs.org/docs/latest/api/child_processes.html#child_process.exec
var child_process = require('child_process');

child_process.exec('df', function(err, stdout, stderr) {
  // 'stdout' here is a string containing the things printed by 'df'
  console.log(stdout);
});
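A sketch of parsing that output into objects; df's column layout varies across platforms, so the column indices below assume a typical Linux 'df -k' layout and are illustrative only:

var child_process = require('child_process');

child_process.exec('df -k', function(err, stdout, stderr) {
  if (err) return console.error(err);

  // Drop the header row, then split each data line on whitespace.
  var lines = stdout.trim().split('\n').slice(1);
  var mounts = lines.map(function(line) {
    var cols = line.split(/\s+/);
    return {
      filesystem: cols[0],
      usedPercent: cols[4], // e.g. '42%'
      mountedOn: cols[5]
    };
  });

  console.log(mounts);
});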
