Using the node-ssh JS library, one can execute remote commands. Is it possible to get streamed output from a long-running command as it is generated, without waiting for the command to end?
The example on the project page suggests you can only get stdout and stderr once the command is done:
ssh.execCommand('hh_client --json', { cwd: '/var/www' })
  .then(function (result) {
    console.log('STDOUT: ' + result.stdout)
    console.log('STDERR: ' + result.stderr)
  })
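For reference, a minimal sketch of how streamed output could look, assuming the installed node-ssh version supports the onStdout/onStderr chunk callbacks shown in its README:
ssh.execCommand('hh_client --json', {
  cwd: '/var/www',
  // each chunk arrives as the command produces it, not at the end
  onStdout(chunk) {
    console.log('stdout chunk: ' + chunk.toString('utf8'))
  },
  onStderr(chunk) {
    console.log('stderr chunk: ' + chunk.toString('utf8'))
  }
}).then(function (result) {
  // result.stdout / result.stderr still hold the full output once the command ends
  console.log('command finished')
})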
Related
I am trying to execute a command to fork a chain using ganache. I have struggled to find a way to do the fork programmatically, so I want to try to execute it through a child process.
The code I am using is:
const { exec } = require("child_process");
const ganache = require("ganache");

const ls = exec('ganache-cli --fork https://speedy-nodes-nyc.moralis.io/KEY_HERE/bsc/mainnet/archive', function (error, stdout, stderr) {
  if (error) {
    console.log(error.stack);
    console.log('Error code: ' + error.code);
    console.log('Signal received: ' + error.signal);
  }
  console.log('Child Process STDOUT: ' + stdout);
  console.log('Child Process STDERR: ' + stderr);
});

ls.on('exit', function (code) {
  console.log('Child process exited with exit code ' + code);
});
But when I run this, no output is given at all. There are no errors or anything; it just carries on with the program as if this code did not exist. When I replace the command with 'dir', it works perfectly and lists the files in the directory. Even if I remove the node address and just use "ganache-cli --fork" as the command, nothing happens.
Why does nothing happen when I add the ganache commands?
If you take a look at the child process docs, they say exec:
spawns a shell and runs a command within that shell, passing the stdout and stderr to a callback function when complete.
However, Ganache is a process that continues running and doesn't "complete" until you kill it. This allows you to send multiple requests to Ganache without it shutting down on you.
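A hedged sketch of that approach: spawn ganache-cli as a long-running child process and read its output as it streams in, rather than waiting for a completion callback that only fires when the process ends (the fork URL below is a placeholder):
const { spawn } = require("child_process");

// shell: true resolves ganache-cli the same way your terminal would
const ganacheProc = spawn("ganache-cli", ["--fork", "https://YOUR_ARCHIVE_NODE_URL"], { shell: true });

ganacheProc.stdout.on("data", (data) => {
  // streamed as Ganache produces it, while the process keeps running
  console.log("ganache stdout: " + data);
});

ganacheProc.stderr.on("data", (data) => {
  console.log("ganache stderr: " + data);
});

ganacheProc.on("exit", (code) => {
  console.log("ganache exited with code " + code);
});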
So far I have gotten my script to execute a Windows .bat file with child_process. My issue is that it opens in the background with no way to "connect" to it to see what happens and debug. Is there a way to "listen" for a certain output? For example, if the .bat outputs "Done!" in the shell at one point, is there a way to make my node.js script detect that keyword and run further commands when it does?
Thanks!
Some clarification: the .bat outputs "Done!" and stays running; it doesn't stop. All I want to do is detect that "Done!" so that I can send a message to the user that the server has successfully started.
My current code:
exec('D:\\servers\\game_server_1\\start.bat', {shell: true, cwd: 'D:\\servers\\game_server_1'});
Well, if you're trying to do a one-and-done type of NodeJS script, you can just spawn a process that launches with the given command and exits when all commands have completed. This gives you a one-and-done streaming interface that you can monitor. The stdout event hands you a data buffer with the output of the command you ran, unless it's something like START to launch a program -- then it returns null. You could just issue a KILL command after the START -- your_program.exe:
const spawn = require('child_process').spawn;
const child = spawn('cmd.exe', ['/c', 'commands.bat']);

let DONE = 0;
const done = () => {
  console.log("log it");
  DONE++;
};

child.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
  // it's important to add some type of counter to
  // prevent any logic from running twice, since
  // this will run twice for any given command
  if (data.toString().includes("DONE") && DONE === 0) {
    done();
  }
});

child.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});

child.on('exit', function (code) {
  console.log('child process exited with code ' + code);
});
Keep in mind, when you run a command to launch a program and the program launches, the data buffer will be null in stdout event listener. The error event will only fire if there was an issue with launching the program.
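If launching the .bat itself can fail, it's also worth listening for the ChildProcess 'error' event, which fires when the process could not be spawned at all (as opposed to exiting with a non-zero code); a small sketch building on the example above:
// fires only when spawning fails, e.g. cmd.exe or the .bat path is wrong
child.on('error', function (err) {
  console.log('failed to start child process: ' + err.message);
});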
YOUR .BAT:
ECHO starting batch script
REM example: launching a program
START "" https://localhost:3000
REM issue a command after your program launch
ECHO DONE
EXIT
You could also issue an ECHO DONE command right after the command where you launched the program, listen for that, and try to parse that command out of stdout.
You could use a regular expression.
const { spawn } = require('child_process');
const child = spawn(...);

child.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
  // Now use a regular expression to detect a done event
  // For example
  if (data.toString().match(/Done!/)) {
    // run further commands here
  }
});

// Error handling etc. here
I have a command
omxplayer /home/pi/videos/9886a3n2545r7i505rzz.mp4 -o alsa:sysdefault
It runs fine from the command line, but if I translate that command to a spawn command:
let omxProcess = spawn('omxplayer', ['/home/pi/videos/9886a3n2545r7i505rzz.mp4', '-o', 'alsa:sysdefault'])
The command fails (without any error).
But if I run the following, removing the :sysdefault, it runs (but without the :sysdefault the command is not the same, and I need to run it with :sysdefault):
let omxProcess = spawn('omxplayer', ['/home/pi/videos/9886a3n2545r7i505rzz.mp4', '-o', 'alsa'])
I'm thinking it has to do with having an ":" in the arg.
Any thoughts?
Since you're not using the shell: true flag, it's almost certainly not caused by the : in the command. You can always verify this, just to be on the safe side.
An easy way to check if the environment is messing with your arguments is calling another binary, for example echo, instead of omxplayer. Does it echo back your arguments? Is the colon still there?
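A minimal sketch of that check, substituting echo for omxplayer while keeping the exact same argument array:
const { spawn } = require('child_process');

// same args as before, but echoed back so you can see what the child actually receives
const check = spawn('echo', ['/home/pi/videos/9886a3n2545r7i505rzz.mp4', '-o', 'alsa:sysdefault']);

check.stdout.on('data', (data) => {
  console.log('echoed args: ' + data); // the colon should still be there
});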
The binary is probably exiting with some error code (and possibly an error message). To capture them, be sure to register handlers on the output streams, as well as an exit handler that should tell you the exit code. This is outlined in the child_process docs, right below spawn(). Adapted for your case:
omxProcess.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
omxProcess.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
omxProcess.on('close', (code) => {
console.log(`child process exited with code ${code}`);
});
Based on the output and the exit code, you should be able to debug the issue.
When trying to use child_process.spawn in node.js to run a tar command, the runtime fails with an ENOENT error.
If I run it as child_process.exec, the command runs and returns one line of output through exec.stdout.on('data') event handler. When running the command in a terminal it has more output than what exec is returning.
The key problem is that with the -cvMf flags, tar requires the user to press enter to continue its process, which I was hoping to do via exec.send.
Here's an excerpt of the code I have:
try {
  let subprocess = exec('tar -cvMf /dev/nst0 /mnt/Data/2GBdata1 /mnt/Data/1GBdata');
  subprocess.stdout.on('data', function (data) {
    console.log('stdout on data: \n', data.toString());
  });
  subprocess.on('data', function (data) {
    console.log('on data: \n', data);
  })
  subprocess.stdout.on('message', function (message) {
    console.log('sub messgae: ', message);
  });
  subprocess.stderr.on('error', function (err) {
    console.log('sub error thrown', err);
    throw err;
  })
  subprocess.on('exit', function (data) {
    subprocess = null;
    console.log('sub exited: \n', data);
  });
}
catch (err) {
  console.error(`exec error: ${err}`);
  output = `${err}`;
}
The output is simply:
stdout on data:
/mnt/Data/2GBdata1
stdout on data:
/mnt/Data/1GBdata
Then the process simply hangs.
The expected output (when running the command in terminal) is the following:
tar -cvMf /dev/nst0 /mnt/Data/2GBdata /mnt/Data/1GBdata
tar: Removing leading `/' from member names
/mnt/Data/2GBdata
/mnt/Data/1GBdata
Prepare volume #2 for ‘/dev/nst0’ and hit return:
My last option will be to write a shell script and use execFile, but I'd like to avoid over-complicating things if possible. Any and all help is appreciated, thanks!
I figured it out. For whatever reason the line Prepare volume #2 for ‘/dev/nst0’ and hit return: doesn't register as a stdout event (it is most likely written to stderr or directly to the terminal, neither of which the code above reads as data). However, upon further research I found that tar has a -F flag, which allows you to define a script to be run when it reaches the end of the tape.
I wrote a simple shell script that echoes when the end of a tape is reached, which then registers as a stdout event.
With this, I was able to continue my project without any hassle.
The command now looks like this:
tar -F './tartest.sh' -cvMf /dev/nst0 '(data files here)'
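On the Node side, a hedged sketch of watching for whatever marker the script prints (the tartest.sh contents and the 'END OF TAPE' marker below are examples, not the exact script used above):
const { exec } = require('child_process');

// tartest.sh is assumed to be something like:
//   #!/bin/sh
//   echo "END OF TAPE"
let subprocess = exec("tar -F './tartest.sh' -cvMf /dev/nst0 /mnt/Data/2GBdata1 /mnt/Data/1GBdata");

subprocess.stdout.on('data', function (data) {
  console.log('stdout on data: \n', data.toString());
  if (data.toString().includes('END OF TAPE')) {
    // react here: swap volumes, notify the user, etc.
    console.log('end of tape reached');
  }
});

subprocess.on('exit', function (code) {
  console.log('sub exited with code ' + code);
});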
Consider the following C program (test.c):
#include <stdio.h>
int main() {
printf("string out 1\n");
fprintf(stderr, "string err 1\n");
getchar();
printf("string out 2\n");
fprintf(stderr, "string err 2\n");
fclose(stdout);
}
Which should print a line to stdout, a line to stderr, then wait for user input, then another line to stdout and another line to stderr. Very basic!
When compiled and run on the command line the output of the program when complete (user input is received for getchar()):
$ ./test
string out 1
string err 1
string out 2
string err 2
When trying to spawn this program as a child process using nodejs with the following code:
var TEST_EXEC = './test';
var spawn = require('child_process').spawn;
var test = spawn(TEST_EXEC);
test.stdout.on('data', function (data) {
console.log('stdout: ' + data);
});
test.stderr.on('data', function (data) {
console.log('stderr: ' + data);
});
// Simulate entering data for getchar() after 1 second
setTimeout(function() {
test.stdin.write('\n');
}, 1000);
The output appears like this:
$ nodejs test.js
stderr: string err 1
stdout: string out 1
string out 2
stderr: string err 2
Very different from the output seen when running ./test in the terminal. This is because the ./test program isn't attached to a terminal when spawned by nodejs. When stdout is a terminal, the C library line-buffers it, so the buffer is flushed as soon as a \n is reached; when it is a pipe (as with spawn), it is fully buffered and nothing is flushed until the buffer fills or the program exits. This could be resolved by either flushing stdout after every print, or changing the stdout stream to be unbuffered so it flushes everything immediately.
Assuming that test.c source isn't available or modifiable, neither of the two flushing options mentioned can be implemented.
I then started looking at emulating an interactive shell; there's pty.js (pseudo terminal), which does a good job. For example:
var spawn = require('pty.js').spawn;
var test = spawn(TEST_EXEC);
test.on('data', function (data) {
console.log('data: ' + data);
});
// Simulate entering data for getchar() after 1 second
setTimeout(function() {
test.write('\n');
}, 1000);
Which outputs:
$ nodejs test.js
data: string out 1
string err 1
data:
data: string out 2
string err 2
However both stdout and stderr are merged together (as you would see when running the program in a terminal) and I can't think of a way to separate the data from the streams.
So the question..
Is there any way using nodejs to achieve the output as seen when running ./test without modifying the test.c code? Either by terminal emulation or process spawning or any other method?
Cheers!
I tried the answer by user568109 but this does not work, which makes sense since the pipe only copies the data between streams. Hence, it only gets to process.stdout when the buffer is flushed...
The following appears to work:
var TEST_EXEC = './test';
var spawn = require('child_process').spawn;
var test = spawn(TEST_EXEC, [], { stdio: 'inherit' });
//the following is unfortunately not working
//test.stdout.on('data', function (data) {
// console.log('stdout: ' + data);
//});
Note that this effectively shares stdio with the node process; in this mode test.stdout is null, which is why the commented handlers above can't work. Not sure if you can live with that.
I was just revisiting this since node's spawn now has a 'shell' option, available since version 5.7.0. Unfortunately there doesn't seem to be an option to spawn an interactive shell (I also tried with shell: '/bin/sh -i' but no joy).
However, I just found this, which suggests using 'stdbuf' to change the buffering options of the program that you want to run. Setting them all to 0 produces unbuffered output for every stream, and the streams are still kept separate.
Here's the updated javascript:
var TEST_EXEC = './test';
var spawn = require('child_process').spawn;
var test = spawn('stdbuf', ['-i0', '-o0', '-e0', TEST_EXEC]);
test.stdout.on('data', function (data) {
console.log('stdout: ' + data);
});
test.stderr.on('data', function (data) {
console.log('stderr: ' + data);
});
// Simulate entering data for getchar() after 1 second
setTimeout(function() {
test.stdin.write('\n');
}, 1000);
Looks like this isn't pre-installed on OSX and of course isn't available for Windows; there may be similar alternatives though.
You can do this:
var TEST_EXEC = 'test';
var spawn = require('child_process').spawn;
var test = spawn(TEST_EXEC);

process.stdin.pipe(test.stdin);
test.stdout.pipe(process.stdout);
test.stderr.pipe(process.stderr);
When you use events on stdout and stderr to print the output with console.log, you can get jumbled output because the callbacks run asynchronously. Each stream's output stays ordered on its own, but output can still get interleaved among stdin, stdout and stderr.