Node.js: pass text as stdin of `spawnSync`

I'd think this would be simple, but the following does not work as expected.
I want to pipe data from Node to a process, say wc (just an arbitrary command for illustration).
The docs and other SO questions seem to indicate that passing a Stream should work:
const {spawnSync} = require('child_process')
const {Readable} = require('stream')
const textStream = new Readable()
textStream.push("one two three")
textStream.push(null)
const stdio = [textStream, process.stdout, process.stderr]
spawnSync('wc', ["-c"], { stdio })
Unfortunately this throws an error:
The value "Readable { ... } is invalid for option "stdio"
The relevant bit of code from internal/child_process.js does not immediately reveal what the anticipated valid options are.

To present particular data on the child process's stdin, use the input option:
spawnSync('wc', ['-c'], { input : 'one two three' })
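For completeness, here is a minimal sketch showing the input option end to end and capturing the child's output; the encoding option makes stdout come back as a string rather than a Buffer:
const { spawnSync } = require('child_process')

// `input` is written to the child's stdin before it runs
const result = spawnSync('wc', ['-c'], {
  input: 'one two three',
  encoding: 'utf8',
})

console.log(result.stdout.trim()) // "13"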

Related

How to capture JSON from stdin when parent doesn’t close pipe?

I developed a package called transpile-md-to-json that transpiles multiple markdown files to a single JSON file.
Running transpile-md-to-json --src examples/content --watch transpiles markdown files to JSON and outputs the result to stdout as markdown files are created, edited and deleted.
I tried using get-stdin to capture the JSON and process it some more using another node script.
transpile-md-to-json --src src/privacy-guides --blogify --watch | node test.js
The problem is that stdin.on('end') never fires, because the pipe isn't closed by transpile-md-to-json when watch mode is enabled (--watch).
See https://github.com/sindresorhus/get-stdin/blob/master/index.js#L23-L25
How can I work around this?
As pointed out by Mike in the comments, there appears to be no built-in way of achieving this: the pipe remains open until the parent exits, so stdin.on('end') is never fired.
The closest we can get is to use some kind of end-of-payload indicator to close each "cycle". Such an indicator isn't always present, but in the context of JSON we're in luck: each pretty-printed JSON payload ends with a closing } at the start of a line.
const readline = require("readline")
const fs = require("fs")

process.stdin.setEncoding("utf-8")

const rl = readline.createInterface({
  input: process.stdin,
})

let json = ""
rl.on("line", (line) => {
  json += `${line}\n`
  // A pretty-printed payload ends with a bare closing brace at column 0
  if (line === "}") {
    console.log("done", json)
    fs.writeFileSync("test.json", json)
    json = ""
  }
})
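A slightly more robust variant (my sketch, not from the original answer) tracks brace depth instead of assuming the top-level closing brace sits alone at column 0. It still assumes pretty-printed JSON with no braces inside string values:
const readline = require("readline")

const rl = readline.createInterface({ input: process.stdin })

let depth = 0
let json = ""
rl.on("line", (line) => {
  json += `${line}\n`
  // Naive brace counting; breaks if string values contain { or }
  for (const ch of line) {
    if (ch === "{") depth++
    else if (ch === "}") depth--
  }
  if (depth === 0 && json.trim() !== "") {
    console.log("payload complete:\n" + json)
    json = ""
  }
})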

Is there a way to run a self-terminating js script that can pass variables to the next?

I'd really like to have some of my secrets/keys be iterable, since I have a growing list of external API keys that would be easier to use if I could match them to the route being used, without having to statically map them when my application starts.
The only way I can think of to organize them better, without writing massive one-line JSON strings in a batch/bash file, is to define them all as JS object literals and have a script stringify them and load them into environment variables to pass to the application that's about to start.
NPM pre-start script:
const env = {
  secret: 'supersecret',
  key: `key
that requires
line breaks`,
  apiKeys: {
    'api-1': 'a;sodhgfasdgflksdaj;lg',
    'api-2': 'ajl;sdfj;adjsfkljasd;f'
  }
}

for (const x in env) {
  if (typeof env[x] === 'string') {
    process.env[x] = env[x];
  } else {
    process.env[x] = JSON.stringify(env[x])
  }
  console.log(x)
}

process.exit(22);
NPM start script:
const key = process.env.key
const apiKeys = JSON.parse(process.env.apiKeys)
Unfortunately, the environment variables don't persist between processes, so this approach is useless.
Would it also be secure to use STDIN and STDOUT to pass the data between the two scripts?
My solution ended up being to convert the data to JSON, stream it to STDOUT, and receive it on STDIN in the second script. Doing this makes it platform-agnostic, and I can add any sort of active secret management in the source script (e.g. accepting secrets from other secret management systems/vaults, or generating new secrets at every launch).
Send to STDOUT:
const env = {
  someSecret: 'supersecret',
  superSecretObject: {
    moreProperties: 'data'
  }
};

/* If you have an array of properties or a very large set of secrets,
   you should create a readable stream from it and send that to stdout,
   but this is much simpler */
process.stdout.write(JSON.stringify(env));
Accept on STDIN:
const fs = require('fs')

const env = (function () {
  /* Reading fd 0 with fs errors out when there is no input; use
     process.stdin instead if you don't want to block the whole
     application while waiting for input */
  let envTmp = fs.readFileSync(0).toString();
  envTmp = JSON.parse(envTmp);
  return envTmp;
})();
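To wire the two scripts together you can either use a shell pipe (node secrets.js | node app.js) or have the first script spawn the second directly. A minimal sketch of the latter, with hypothetical file names:
const { spawn } = require('child_process')

const env = { someSecret: 'supersecret' }

// Spawn the main app (hypothetical app.js) and hand it the JSON over stdin
const child = spawn('node', ['app.js'], {
  stdio: ['pipe', 'inherit', 'inherit']
})
child.stdin.write(JSON.stringify(env))
child.stdin.end() // close the pipe so the child's fs.readFileSync(0) completes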

Use child_process#spawn with a generic string

I have a script in the form of a string that I would like to execute in a Node.js child process.
The data looks like this:
const script = {
  str: 'cd bar && fee fi fo fum',
  interpreter: 'zsh'
};
Normally, I could use
const exec = [script.str, '|', script.interpreter].join(' ');
const cp = require('child_process');
cp.exec(exec, function (err, stdout, stderr) {});
however, cp.exec buffers the stdout/stderr, and I would like to be able to stream stdout/stderr wherever I like.
Does anyone know if there is a way to use cp.spawn with a generic string, in the same way you can use cp.exec? I would like to avoid writing the string to a temporary file and then executing the file with cp.spawn.
cp.spawn will work with a string but only if it has a predictable format - this is for a library so it needs to be extremely generic.
...I just thought of something, I am guessing the best way to do this is:
const n = cp.spawn(script.interpreter);
n.stdin.write(script.str); // <<< key part
n.stdout.setEncoding('utf8');
n.stdout.pipe(fs.createWriteStream('./wherever'));
I will try that out, but maybe someone has a better idea.
OK, I figured this out.
I used the answer from this question:
Nodejs Child Process: write to stdin from an already initialised process
The following allows you to feed a generic string to a child process with different shell interpreters; this example uses zsh, but you could use bash, sh, or really any shell executable.
const cp = require('child_process');

const n = cp.spawn('zsh');
n.stdin.setEncoding('utf8');
n.stdin.write('echo "bar"\n'); // <<< key part: you must include the newline char
n.stdout.setEncoding('utf8');
n.stdout.on('data', function (d) {
  console.log('data => ', d);
});
Using node as the interpreter, it's about the same, but it seems I need one extra call, n.stdin.end(), like so:
const cp = require('child_process');

const n = cp.spawn('node').on('error', function (e) {
  console.error(e.stack || e);
});
n.stdin.setEncoding('utf-8');
n.stdin.write("\n console.log(require('util').inspect({zim:'zam'}));\n\n"); // <<< key part
n.stdin.end(); // seems necessary to call .end()
n.stdout.setEncoding('utf8');
n.stdout.on('data', function (d) {
  console.log('data => ', d);
});
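Putting the pieces together for the original use case, here is a sketch (the command string is an arbitrary stand-in) that feeds a generic string to the interpreter over stdin and streams stdout to a file instead of buffering it:
const cp = require('child_process');
const fs = require('fs');

const script = {
  str: 'cd /tmp && ls | wc -l', // arbitrary stand-in command
  interpreter: 'zsh'
};

const n = cp.spawn(script.interpreter);
n.stdin.setEncoding('utf8');
n.stdin.write(script.str + '\n'); // trailing newline so the shell executes the line
n.stdin.end();
n.stdout.pipe(fs.createWriteStream('./wherever'));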

How to read output in nodejs from spawned child when output is a file?

I'm trying to spawn a process with Node and read its output.
I would like the output to go to a file so I can read it.
This is the code I have so far, but it throws an error:
const fs = require('fs')
const childProcess = require('child_process')

const outFile = fs.openSync('out.log', 'a')
const errFile = fs.openSync('err.log', 'a')
const child = childProcess.spawn('node', [pathToJsFile], {
  stdio: ['ignore', outFile, errFile],
  detached: true
})
child.unref()
console.log(child.stdio)
console.log('waiting for output')
child.stdio[1].on('data', (data) => { // ==> throws, since stdio[1] is null
As mentioned in the comment, when I look in child.stdio I see [null, null, null]
However, when I look at the file, I can see the output is written.
I am using node 4.2.1
What am I doing wrong and how can I make this work?
You are wiring the child process's output up to the file out.log, so it goes there and is therefore not also available on the child's stdout stream. You'll need to read the output file from disk using the fs core module:
const outBuffer = fs.readFileSync('out.log')
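If you actually need the data events as they arrive, a different wiring (a sketch, not from the original answer) is to keep stdio as 'pipe' and tee the stream to the log files yourself; note this keeps the parent attached to the child, so it trades away the detached/unref behaviour. pathToJsFile is the variable from the question above:
const fs = require('fs')
const childProcess = require('child_process')

const child = childProcess.spawn('node', [pathToJsFile], {
  stdio: ['ignore', 'pipe', 'pipe']
})

// Tee: persist to disk while also observing chunks in-process
child.stdout.pipe(fs.createWriteStream('out.log'))
child.stderr.pipe(fs.createWriteStream('err.log'))
child.stdout.on('data', (data) => {
  console.log('chunk =>', data.toString())
})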

How can I parse a string into appropriate arguments for child_process.spawn?

I want to be able to take a command string, for example:
some/script --option="Quoted Option" -d --another-option 'Quoted Argument'
And parse it into something that I can send to child_process.spawn:
spawn("some/script", ["--option=\"Quoted Option\"", "-d", "--another-option", "Quoted Argument"])
All of the parsing libraries I've found (e.g. minimist) do too much here by parsing it into some kind of options object. I basically want the equivalent of whatever Node does to create process.argv in the first place.
This seems like a frustrating hole in the native APIs since exec takes a string, but doesn't execute as safely as spawn. Right now I'm hacking around this by using:
spawn("/bin/sh", ["-c", commandString])
However, I don't want this to be tied to UNIX so strongly (ideally it'd work on Windows too). Halp?
Standard Method (no library)
You don't have to parse the command string into arguments; there's an option on child_process.spawn named shell.
options.shell
If true, runs command inside of a shell.
Uses /bin/sh on UNIX, and cmd.exe on Windows.
Example:
const child_process = require('child_process')

let command = `some_script --option="Quoted Option" -d --another-option 'Quoted Argument'`
let child = child_process.spawn(command, [], { shell: true }) // use the `shell` option

child.stdout.on('data', (data) => {
  console.log(data.toString())
})
child.stderr.on('data', (data) => {
  console.log(data.toString())
})
child.on('close', (code) => {
  console.log(code)
})
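One caveat worth noting: with shell: true the whole string is interpreted by the shell, exactly as with exec, so the same injection concerns apply; don't build the command string from untrusted input.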
The minimist-string package might be just what you're looking for.
Here's some sample code that parses your sample string -
const ms = require('minimist-string')
const sampleString = 'some/script --option="Quoted Option" -d --another-option \'Quoted Argument\'';
const args = ms(sampleString);
console.dir(args)
This piece of code outputs:
{
  _: [ 'some/script' ],
  option: 'Quoted Option',
  d: true,
  'another-option': 'Quoted Argument'
}
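If you really do want an argv-style array for spawn rather than an options object, the shell-quote package's parse function is closer to what the question asks for; a minimal sketch (my suggestion, not from the answers above):
const { parse } = require('shell-quote')
const { spawn } = require('child_process')

// parse() tokenizes a string into argv-style words, honoring quotes
const argv = parse(`some/script --option="Quoted Option" -d --another-option 'Quoted Argument'`)
// => [ 'some/script', '--option=Quoted Option', '-d', '--another-option', 'Quoted Argument' ]

const child = spawn(argv[0], argv.slice(1))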
