Send the stdout of spawnSync to another spawnSync stdin - node.js

How do I emulate Linux's | (pipe) in a Node.js app, so that the stdout of one command is piped to the stdin of the next command? Both commands are spawned with spawnSync.
This (pseudo code) works as expected on the command line:
$ command1 -arg1 file | command2 arg2
> someoutput
But this does not:
const spawnSync = require('child_process').spawnSync;
const c1Spawn = spawnSync('command1', ['arg1', 'file']);
const c2Spawn = spawnSync('command2', ['arg2'], { input: c1Spawn.output });
const someoutput = c2Spawn.output;

I believe I found the answer: use input: c1Spawn.stdout instead of c1Spawn.output as the input for the second command.
const spawnSync = require('child_process').spawnSync;
const c1Spawn = spawnSync('command1', ['arg1', 'file']);
const c2Spawn = spawnSync('command2', ['arg2'], { input: c1Spawn.stdout });
const someoutput = c2Spawn.output;
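This works because spawnSync buffers the child's stdout and returns it directly, and the input option accepts a string or Buffer (output, by contrast, is an array holding the results of all stdio streams). For a concrete illustration, here is a minimal sketch assuming a POSIX system with ls and grep available:
const { spawnSync } = require('child_process');

// Equivalent of: ls -1 | grep js
const ls = spawnSync('ls', ['-1']);
const grep = spawnSync('grep', ['js'], { input: ls.stdout });

// stdout is a Buffer, so convert it to text before printing
console.log(grep.stdout.toString());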

Spawn command with redirection

Let's say I have this command:
somecli -s < "/path/to/file.txt"
How can I convert the above command to a Node.js spawn command? I did something like this, but it seems it didn't pass the input:
spawn('somecli', ['-s', '<', '"/path/to/file.txt"'], { stdio: 'inherit' }).on('error', function (error) {
  // something
});
I can use the exec command below and it works, but I'd prefer to be able to see the live output.
exec('somecli -s < "/path/to/file.txt"', (e, stdout, stderr) => {
  // something
});
Something like this should help: you can pipe the command's stdout to a write stream:
const { spawn } = require('child_process');
const fs = require('fs');

const writeStream = fs.createWriteStream("/path/to/file.txt");
const shell = spawn('somecli', ['-s']);
shell.stdout.pipe(writeStream);
To pass file input to a command (STDIN redirection):
$ somecli -s < /path/to/file.txt
We can do it like this:
spawn('somecli', ['-s'], { stdio: [fs.openSync('/path/to/file.txt', 'r'), process.stdout, process.stderr] });
To pass command output to a file (STDOUT redirection):
$ somecli -s > /path/to/file.txt
You may follow Ashish's answer:
let s = spawn('somecli', ['-s']);
s.stdout.pipe(fs.createWriteStream('/path/to/file.txt'));
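Putting the two cases together, a minimal sketch (the somecli name and the input/output paths are placeholders) that redirects both stdin and stdout through file descriptors could look like this:
const fs = require('fs');
const { spawn } = require('child_process');

// Equivalent of: somecli -s < /path/to/input.txt > /path/to/output.txt
const input = fs.openSync('/path/to/input.txt', 'r');
const output = fs.openSync('/path/to/output.txt', 'w');

const child = spawn('somecli', ['-s'], {
  stdio: [input, output, process.stderr]
});

child.on('close', (code) => {
  fs.closeSync(input);
  fs.closeSync(output);
  console.log('somecli exited with code', code);
});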

How to pipe a stream to a file descriptor in node?

I'm writing a CLI in Node, and I want to open the user's $EDITOR to edit data that is read from a stream (an HTTP response IncomingMessage).
How can I send the data to a file descriptor?
In bash I could write this:
$EDITOR <(curl $url)
or
$DIFF <(curl $url_1) <(curl $url_2)
<(curl $url) expands to something like /proc/self/fd/11
echo <(curl $url)
/proc/self/fd/11
But how would I write it in javascript?
import cp from 'child_process'
const first = request(...);
const second = request(...);
const first_fd = ???;
const second_fd = ???;
const proc = cp.spawn(process.env.DIFF, [first_fd, second_fd], { stdio: 'inherit' });
Okay, if the stream is backed by a socket or fd you can pass it to options.stdio, but what if it isn't? What if it's a transform stream?
options.stdio
object - Share a readable or writable stream that refers to a tty, file, socket, or a pipe with the child process. The stream's underlying file descriptor is duplicated in the child process to the fd that corresponds to the index in the stdio array. The stream must have an underlying descriptor (file streams do not until the 'open' event has occurred).
Yes I could create a temp file but can I do it without one?
You can stream downloaded content into the vim text editor in your terminal with the following Node.js code:
const { spawn } = require('child_process');
const request = require('request');

request({
  url: 'https://google.com'
}, function (err, res, body) {
  const vi = spawn('vi', ['-'], { stdio: ['pipe', process.stdout, process.stderr] });
  vi.stdin.write(body);
  vi.stdin.end();
});
If you execute this code from your terminal, it will download Google's HTML and let you edit it and save it to a file; in vim you can use :w myfile.txt to save it.
Further reading on this matter: https://2ality.com/2018/05/child-process-streams.html
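If you would rather not buffer the whole body in memory, a variation (a sketch, assuming the same request and vi setup as above) is to pipe the response stream straight into the editor's stdin:
const { spawn } = require('child_process');
const request = require('request');

const vi = spawn('vi', ['-'], { stdio: ['pipe', process.stdout, process.stderr] });

// request() returns a readable stream, so it can be piped directly into the child
request('https://google.com').pipe(vi.stdin);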
Another option is to reproduce bash's process substitution by running diff through /bin/bash via the shell option:
const { spawnSync } = require('child_process');

const string_1 = 'foo';
const string_2 = 'foobar';

const command = 'diff';
const args = [
  '--unified',
  `<(echo "${string_1}")`,
  `<(echo "${string_2}")`,
];
const options = {
  'shell': '/bin/bash',
};

const diff = spawnSync(command, args, options);
console.log(diff.stdout.toString());
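Running this prints a unified diff of the two strings; note that the file names in the diff header will be the /dev/fd/NN paths that bash creates for the process substitutions.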

Node child_process.spawn multiple commands

I want to automate the creation and extraction of a keystore.
The problem I'm facing is how to join the commands using the '|' symbol, or to find a similar solution.
// Original command
var command = 'keytool -exportcert -storepass mypass -keypass mypass -alias myalias -keystore mykey.keystore | openssl sha1 -binary | openssl base64';
// Arguments for the spawn
var keyArgs = [
  '-exportcert',
  '-storepass', 'mypass',
  '-keypass', 'mypass',
  '-alias', 'myalias',
  '-keystore', 'mykey.keystore',
  'openssl', 'sha1',
  '-binary',
  'openssl', 'base64',
];
exec('keytool', keyArgs, { cwd: appCreateFolder + "/" + opt.id + "/Certificates" }, function (e) {
  console.log(chalk.cyan('Key created'));
});
From Node.js v6 you can specify a shell option in the spawn method, which runs the command through a shell and thus makes it possible to chain commands with spawn.
For example this:
var spawn = require('child_process').spawn;
var child = spawn('ls && ls && ls', {
  shell: true
});
child.stderr.on('data', function (data) {
  console.error("STDERR:", data.toString());
});
child.stdout.on('data', function (data) {
  console.log("STDOUT:", data.toString());
});
child.on('exit', function (exitCode) {
  console.log("Child exited with code: " + exitCode);
});
Will trigger an error on Node.js versions older than 6:
Error: spawn ls && ls && ls ENOENT
But on version 6 and higher it will return the expected result:
node app.js
STDOUT: app.js
STDOUT: app.js
app.js
Child exited with code: 0
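Applied to the keytool pipeline from the question, a sketch (assuming Node.js >= 6, and reusing the appCreateFolder and opt variables from the question) could pass the whole piped command string with shell: true:
var spawn = require('child_process').spawn;

var command = 'keytool -exportcert -storepass mypass -keypass mypass ' +
  '-alias myalias -keystore mykey.keystore | openssl sha1 -binary | openssl base64';

var child = spawn(command, {
  shell: true,
  cwd: appCreateFolder + '/' + opt.id + '/Certificates'
});

child.stdout.on('data', function (data) {
  // base64-encoded SHA1 digest of the exported certificate
  console.log(data.toString());
});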
The | symbol on the command line is called "piping" because it's like piping streams of data together. What you want is to get hold of the stdin (standard input) and stdout (standard output) streams of the commands you're executing.
For example, this is how you would spawn the echo command and pipe its output to grep:
var spawn = require('child_process').spawn;
var echo = spawn('echo', ['The quick brown fox\njumped over the lazy dog.']);
var grep = spawn('grep', ['brown']);
echo.stdout.pipe(grep.stdin);
grep.stdout.pipe(process.stdout);
The above example spawns both the echo and grep commands. It pipes any output from the echo process's stdout stream to the grep process's stdin stream. Finally, we pipe the grep process's stdout stream to the parent process's (your Node process's) stdout stream so you can see the output in your terminal.
The output would be "The quick brown fox", because the string contains a newline character in the middle and grep only matched the first line, which contains "brown".
You could use the exec function to achieve the same result. It might be harder to maintain in the future, but if all you need is to quickly run a set of piped commands, you can pass the full command-line string (including pipe symbols) to exec.
var exec = require('child_process').exec;
var cmdString = 'echo "The quick brown fox\njumped over the lazy dog." | grep "brown"';
exec(cmdString, (err, stdout, stderr) => {
  console.log(stdout);
});
Or, instead of passing in the callback function, you could just pipe the output to process.stdout if all you care about is seeing the command output.
exec(cmdString).stdout.pipe(process.stdout);
Here's a quick example of what I believe your code should look like using spawn. It may require tweaks, since it's specific to what you're doing.
var keyArgs = [
  '-exportcert',
  '-storepass', 'mypass',
  '-keypass', 'mypass',
  '-alias', 'myalias',
  '-keystore', 'mykey.keystore',
];
var keyOpts = {
  cwd: `${appCreateFolder}/${opt.id}/Certificates`
};
var spawn = require('child_process').spawn;
var keytool = spawn('keytool', keyArgs, keyOpts);
var opensslBinary = spawn('openssl', ['sha1', '-binary']);
var opensslBase64 = spawn('openssl', ['base64']);
keytool.stdout.pipe(opensslBinary.stdin);
opensslBinary.stdout.pipe(opensslBase64.stdin);
opensslBase64.stdout.pipe(process.stdout);
opensslBase64.on('close', () => {
  console.log(chalk.cyan('Key created.'));
});
Or using exec:
var exec = require('child_process').exec;
var cmdString = 'keytool -exportcert -storepass mypass -keypass mypass -alias myalias -keystore mykey.keystore | openssl sha1 -binary | openssl base64';
var cmdOpts = {
  cwd: `${appCreateFolder}/${opt.id}/Certificates`
};
exec(cmdString, cmdOpts, () => {
  console.log(chalk.cyan('Key created.'));
});

how do I make node child_process exec continuously

How to exec continuously? e.g. ls after cd?
I tried
var exec = require('child_process').exec;
exec('cd ~/', function () {
  exec('ls', function (err, stdout, stderr) {
    console.log(stdout); // this logs the current dir, but not ~/'s contents
  });
});
exec('cd ~/').exec('ls', function (err, stdout, stderr) {
  console.log(stdout);
}); // this also fails, because the first exec returns a ChildProcess object, not itself
It is not possible to do this, because exec and spawn create a new process each time. But there is a way to simulate it: you can start one process with exec and execute multiple commands in that same shell.
On the command line, if you want to execute 3 commands on the same line you would write (using Windows cmd syntax; on a POSIX shell you would use && or ;):
cmd1 & cmd2 & cmd3
So all 3 commands run in the same process and have access to the context modified by the previously executed commands.
Let's take your example: you want to execute cd ../ and after that execute dir to view the previous directory's listing.
In cmd you would write:
cd ../ & dir
From Node.js you can start a process with exec and tell it to start another Node instance that will evaluate an inline script:
var exec = require('child_process').exec;
var script = "var exec = require('child_process').exec; exec('dir', function (e, d, er) { console.log(d); });";
script = '"' + script + '"'; // enclose the inline script in quotes because it contains spaces
var cmd2 = 'node -e ' + script;
var cd = exec('cd ../ & ' + cmd2, function (err, stdout, stderr) {
  console.log(stdout); // this works
});
If you just want to change the current directory you should check the documentation about it http://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
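A simpler sketch of the same chaining idea is to put both commands in one string, since exec always runs its command through a shell (use && on a POSIX shell, & on cmd.exe):
var exec = require('child_process').exec;

// the shell runs both commands in the same process, so the cd affects the ls
exec('cd ../ && ls', function (err, stdout, stderr) {
  console.log(stdout); // listing of the parent directory
});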
You can use Node.js util.promisify and async/await:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);

module.exports = async function () {
  // each promisified exec call resolves to an object with stdout and stderr properties
  const cpu = await exec('top -bn1');
  const disk = await exec('df -h');
  const memory = await exec('free -m');
  const payload = {
    cpu,
    disk,
    memory,
  };
  return payload;
};
If you want to use cd first, it's better to call process.chdir(). Note that Node does not expand '~', so pass an absolute path (for example process.env.HOME). Then a single exec() will do the job.
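A minimal sketch of that approach (assuming a POSIX system where $HOME is set):
const { exec } = require('child_process');

process.chdir(process.env.HOME); // equivalent of `cd ~/`
exec('ls', (err, stdout) => {
  if (err) throw err;
  console.log(stdout); // lists the home directory
});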
You can call exec with cwd param like so:
exec('ls -a', {
  cwd: '/Users/user'
}, (err, stdout) => {
  if (err) {
    console.log(err);
  } else {
    console.log(stdout);
  }
});
But beware, cwd doesn't understand '~'. You can use process.env.HOME instead.
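For example (a sketch assuming $HOME is set):
exec('ls -a', { cwd: process.env.HOME }, (err, stdout) => {
  if (err) throw err;
  console.log(stdout); // equivalent of running `ls -a ~/`
});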

how to tail multiple files in node.js?

When I use the following code to tail a file, it successfully outputs data:
var spawn = require('child_process').spawn;
var filename = '/logs/error.log';
var tail = spawn("tail", ["-f", filename]);
tail.stdout.on("data", function (data) {
  console.log(data);
});
But when I change filename to "/logs/*.log", nothing is output. Can anyone tell me why? Thanks!
When typing tail -f /logs/*.log on the console, the expansion of /logs/*.log is handled by the shell; by the time the tail program gets the arguments, they've already been expanded to tail -f /logs/error.log /logs/other.log. You need to do the expansion yourself for Node:
var fs = require('fs');
var spawn = require('child_process').spawn;

// build the list of matching files ourselves, since there is no shell to expand the glob
var filenames = fs.readdirSync('/logs')
  .filter(function (file) { return /\.log$/.test(file); })
  .map(function (file) { return '/logs/' + file; });

var tail = spawn("tail", ["-f"].concat(filenames));
tail.stdout.on("data", function (data) {
  console.log(data.toString());
});
Because neither tail nor spawn knows how to expand file names containing wildcards into the set of matching file names. That expansion is normally performed by the shell, so in this case you'll need to do it yourself in code.
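Alternatively, a sketch (assuming Node.js >= 6 and a POSIX shell) that lets a shell perform the expansion by passing the whole command with the shell option:
var spawn = require('child_process').spawn;

// the shell expands /logs/*.log before tail runs
var tail = spawn('tail -f /logs/*.log', { shell: true });

tail.stdout.on('data', function (data) {
  console.log(data.toString());
});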
