Use child_process#spawn with a generic string - node.js

I have a script in the form of a string that I would like to execute in a Node.js child process.
The data looks like this:
const script = {
  str: 'cd bar && fee fi fo fum',
  interpreter: 'zsh'
};
Normally, I could use
const exec = [script.str, '|', script.interpreter].join(' ');
const cp = require('child_process');
cp.exec(exec, function (err, stdout, stderr) {});
However, cp.exec buffers stdout/stderr, and I would like to be able to stream stdout/stderr wherever I choose.
Does anyone know if there is a way to use cp.spawn with a generic string, the same way you can with cp.exec? I would like to avoid writing the string to a temporary file and then executing that file with cp.spawn.
cp.spawn will work with a string, but only if it has a predictable format; this is for a library, so it needs to be extremely generic.
...I just thought of something. I am guessing the best way to do this is:
const n = cp.spawn(script.interpreter);
n.stdin.write(script.str); // <<< key part
n.stdout.setEncoding('utf8');
n.stdout.pipe(fs.createWriteStream('./wherever'));
I will try that out, but maybe someone has a better idea.

Ok figured this out.
I used the answer from this question:
Nodejs Child Process: write to stdin from an already initialised process
The following lets you feed a generic string to a child process with different shell interpreters. This example uses zsh, but you could use bash, sh, or really any executable.
const cp = require('child_process');
const n = cp.spawn('zsh');
n.stdin.setEncoding('utf8');
n.stdin.write('echo "bar"\n'); // <<< key part: you must include the newline character
n.stdout.setEncoding('utf8');
n.stdout.on('data', function (d) {
  console.log('data => ', d);
});
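To tie this back to the original goal of streaming the output wherever you like, the same pattern pipes straight into any writable stream; a minimal sketch, with './wherever' kept as the placeholder path from the question:
const cp = require('child_process');
const fs = require('fs');
const n = cp.spawn('zsh');
n.stdin.end('echo "bar"\n'); // write(...) followed by end(), so zsh can exit when done
n.stdout.pipe(fs.createWriteStream('./wherever')); // stream the output, don't buffer it
n.stderr.pipe(process.stderr);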
Using node itself as the interpreter, it's about the same, but it seems I need one extra call, namely n.stdin.end(), like so:
const cp = require('child_process');
const n = cp.spawn('node').on('error', function (e) {
  console.error(e.stack || e);
});
n.stdin.setEncoding('utf8');
n.stdin.write("\n console.log(require('util').inspect({zim:'zam'}));\n\n"); // <<< key part
n.stdin.end(); // seems necessary to call .end()
n.stdout.setEncoding('utf8');
n.stdout.on('data', function (d) {
  console.log('data => ', d);
});
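If you also need to know when the child has finished (for example, to check its exit code), the ChildProcess close event fires once the stdio streams have closed; a small addition to the snippet above:
n.on('close', function (code, signal) {
  console.log('child exited with code', code);
});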

Related

return a string from an async nodejs function called in bash

I'm calling an async Node.js function that uses prompts (https://www.npmjs.com/package/prompts).
Basically, the user is presented with options, and after they select one, I want the selection output to a variable in bash. I cannot get this to work: it either hangs, or it outputs everything, since prompts is a user interface that writes to stdout.
// nodefunc.js
async function run() {
  await blahhhh;
  return result; // text string
}
console.log(run());
# bash
x=$(node nodefunc.js)
echo $x
Unless you can ensure nothing else in the node script will print to stdout, you will need a different approach.
I'd suggest having the node script write to a temporary file, and have the bash script read the output from there.
Something like this perhaps:
// nodefunc.js
const fs = require('fs');
const outputString = 'I am output';
fs.writeFileSync('/tmp/node_output.txt', outputString);
# bash
node nodefunc.js
# Assuming the node script ran successfully, read the output file
x=$(</tmp/node_output.txt)
echo "$x"
# Optionally, clean up the tmp file
rm /tmp/node_output.txt

Is there a way to run a self-terminating js script that can pass variables to the next?

I'd really like to have some of my secrets/keys be iterable, since I have a growing list of external API keys that would be easier to use if I could match them to the route being used, without having to statically map them at the start of my application.
The only way I can think of to better organize them, without writing massive one-line JSON strings in a batch/bash file, is to define it all in JS object literals and have a JS script stringify it and load it into ENV variables to be passed to the application that's about to start.
NPM pre-start script:
const env = {
  secret: 'supersecret',
  key: `key
that requires
line breaks`,
  apiKeys: {
    'api-1': 'a;sodhgfasdgflksdaj;lg',
    'api-2': 'ajl;sdfj;adjsfkljasd;f'
  }
};

for (let x in env) {
  if (typeof env[x] == 'string') {
    process.env[x] = env[x];
  } else {
    process.env[x] = JSON.stringify(env[x]);
  }
  console.log(x);
}

process.exit(22);
NPM start script:
const key = process.env.key
const apiKeys = JSON.parse(process.env.apiKeys)
Unfortunately, the ENV variables don't remain between instances, so this is useless.
Would it also be secure to use STDIN and STDOUT to pass the data between the two scripts?
My solution ended up being to pipe the data between the two scripts: convert it to JSON, stream it to STDOUT, and receive it on STDIN in the second script. Doing this makes it platform-agnostic, and I can add any sort of active secret management in the source script (e.g. accepting secrets from various other secret-management systems/vaults, or generating new secrets at every launch).
Send to STDOUT:
const env = {
  someSecret: 'supersecret',
  superSecretObject: {
    moreProperties: 'data'
  }
};

/* If you have an array of properties or have a very large set of secrets,
   you should create a readable stream from it and send that to stdout,
   but this is much simpler */
process.stdout.write(JSON.stringify(env));
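As for the streaming variant mentioned in the comment above, a minimal sketch (Readable.from requires Node 12+; with a truly large secret set you would generate the chunks incrementally rather than building one big string):
const { Readable } = require('stream');
// Feed the serialized secrets to stdout as a stream rather than a single write
Readable.from([JSON.stringify(env)]).pipe(process.stdout);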
Accept on STDIN:
const fs = require('fs');
const env = (function () {
  /* Using fs will error out on no input, but you can use process.stdin
     if you don't need to suspend the whole application waiting for the input */
  let envTmp = fs.readFileSync(0).toString();
  envTmp = JSON.parse(envTmp);
  return envTmp;
})();
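Assuming the two snippets are saved as (hypothetical) send-env.js and start-app.js, wiring them together is a single pipe, and the secrets never touch the disk or the process environment:
node send-env.js | node start-app.js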

Node.js pass text as stdin of `spawnSync`

I'd think this would be simple, but the following does not work as expected.
I want to pipe data to a process, say (just an arbitrary command for illustration) wc, from Node.
The docs and other SO questions seem to indicate that passing a Stream should work:
const {spawnSync} = require('child_process')
const {Readable} = require('stream')
const textStream = new Readable()
textStream.push("one two three")
textStream.push(null)
const stdio = [textStream, process.stdout, process.stderr]
spawnSync('wc', ["-c"], { stdio })
Unfortunately this throws an error:
The value "Readable { ... }" is invalid for option "stdio"
The relevant bit of code from internal/child_process.js does not immediately reveal what the anticipated valid options are.
To present particular data as stdin data for the child process, you can use the input option:
spawnSync('wc', ['-c'], { input : 'one two three' })
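For completeness, a minimal sketch of the round trip; by default spawnSync returns stdout/stderr as Buffers on the result object:
const { spawnSync } = require('child_process');
const result = spawnSync('wc', ['-c'], { input: 'one two three' });
console.log(result.stdout.toString().trim()); // "13" (the byte count)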

node.js: Trouble using a systemcall to write a file to the /tmp directory

As an exercise, I'm trying to use a system call from Node.js to write a small text file to the /tmp directory. Here is my code:
#!/bin/node
var child_process = require("child_process");
var send = "Hello, world!";
child_process.exec('cat - > /tmp/test1', { input: send });
The file actually gets created, but no content is placed in it, and things just hang. Can someone please tell me what I'm missing?
Also, I'd really like to know how to do this synchronously.
Thanks for any input.
... doug
Hm, unless I forgot to RTM too, this code will just never work: there is no such input option for cp.exec.
But there is a stdio option, which lets us open the expected stdio streams on the child.
child_process.exec('cat - > /tmp/test1', { stdio: 'pipe' });
see https://nodejs.org/api/child_process.html#child_process_options_stdio
The stdio handles are not strings; they are streams, which we can end / write / pipe / close / push, etc.
see https://nodejs.org/api/stream.html
Note that stdin is a writable stream, while stdout and stderr are readable.
To write to cat's stdin, you now consume the cp.stdin object and call its end() method.
child_process.exec('cat - > /tmp/test1', { stdio: 'pipe' }).stdin.end('hello world');
Note that the end() method is a write followed by termination of the stream, which is required to tell cat to quit.
To check that this works, let's refactor it: instead of redirecting stdout to a file, pipe the child's stdout to the process's stdout.
var child_process = require('child_process');
var cp = child_process.exec('cat -', { stdio: 'pipe' });
cp.stdin.end('hello world');
cp.stdout.pipe(process.stdout);
Note that process is a global.
I finally got my original approach to work. The big stumbling block was knowing that the synchronous methods are only available in Node.js 0.12 and later. Here is the code that I finally got to work:
#!/usr/local/n/versions/node/0.12.14/bin/node
var child_process = require('child_process');
var send = "Hello, world!"
child_process.execSync('cat - > /tmp/test1', { input : send }).toString();
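For reference, execSync also returns the child's stdout (a Buffer unless an encoding is given), so the same data can be captured directly rather than redirected to a file; a minimal sketch reusing the variables above:
var out = child_process.execSync('cat -', { input: send, encoding: 'utf8' });
console.log(out); // "Hello, world!"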
Thanks to all for the help.
... doug

Move files with node.js

Let's say I have a file "/tmp/sample.txt" and I want to move it to "/var/www/mysite/sample.txt" which is in a different volume.
How can I move the file in Node.js?
I read that fs.rename only works inside the same volume, and util.pump is already deprecated.
What is the proper way to do it? I read about stream.pipe, but I couldn't get it to work. A simple code sample would be very helpful.
Use the mv module:
var mv = require('mv');
mv('source', 'dest', function (err) {
  // handle the error
});
If you're on Windows and don't have the mv module, you can stream the file across volumes and remove the original once the copy has finished:
var fs = require("fs"),
    source = fs.createReadStream("c:/sample.txt"),
    destination = fs.createWriteStream("d:/sample.txt");
source.pipe(destination);
destination.on("close", function () {
  fs.unlinkSync("c:/sample.txt");
});
The mv module, as jbowes stated, is probably the right way to go, but as an alternative you can use the child process API and the built-in OS tools. If you're on Linux, use the "mv" command; if you're on Windows, use the "move" command.
var exec = require('child_process').exec;
exec('mv /tmp/sample.txt /var/www/mysite/sample.txt',
  function (err, stdout, stderr) {
    // stdout is a string containing the output of the command
  });
You can also use spawn if exec doesn't work properly.
var spawn = require("child_process").spawn;
var child = spawn("mv", ["data.csv", "./done/"]);
child.stdout.on("end", function () {
  return next(null, "finished"); // next is assumed to be a callback from the surrounding code
});
Hope this helps you out.
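If you'd rather stay in pure Node without shelling out, the usual pattern (and roughly what the mv module does internally) is to try fs.rename first and fall back to copy-and-delete when it fails with EXDEV; a minimal sketch in the same callback style as the answers above:
var fs = require('fs');

function move(src, dest, cb) {
  fs.rename(src, dest, function (err) {
    if (!err) return cb(); // same volume: rename succeeded
    if (err.code !== 'EXDEV') return cb(err); // some other failure
    // cross-volume: copy the file, then remove the original
    var read = fs.createReadStream(src);
    var write = fs.createWriteStream(dest);
    read.on('error', cb);
    write.on('error', cb);
    write.on('close', function () {
      fs.unlink(src, cb);
    });
    read.pipe(write);
  });
}

move('/tmp/sample.txt', '/var/www/mysite/sample.txt', function (err) {
  if (err) console.error(err);
});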
