I am using this pattern to execute bash scripts:
const util = require('util');
const exec = util.promisify(require('child_process').exec);
async function myBash() {
try {
const { stdout, stderr } = await exec("echo hi");
console.log(stdout);
} catch (err) {
console.error(err);
}
}
How can I pass the variable greeting to the exec command? This attempt does not work:
const util = require('util');
const exec = util.promisify(require('child_process').exec);
const greeting = "hello";
async function myBash(greeting) {
try {
const { stdout, stderr } = await exec("echo", greeting);
console.log(stdout);
} catch (err) {
console.error(err);
}
}
const util = require('util');
const exec = util.promisify(require('child_process').exec); // promisify so the result can be awaited
const greeting = "hello";
async function myBash(greeting) {
try {
const {
stdout,
stderr
} = await exec(`echo ${greeting}`);
console.log(stdout);
} catch (err) {
console.error(err);
}
}
myBash(greeting);
exec will execute a command line (a string) by feeding it to a shell. You have two options:
Not great in this use-case: use exec with a string that incorporates arguments:
exec(`echo '${greeting}'`);
or equivalently
exec("echo '" + greeting + "'");
Note that this breaks down if the argument contains single quotes, so they need to be sanitised or properly escaped if you do not trust the arguments.
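If you do take this route, a common approach is the POSIX trick of closing the quote, emitting an escaped quote, and reopening. A minimal sketch (the shellQuote helper here is hypothetical, and not bulletproof):

// Hypothetical helper: each embedded ' becomes '\'' before wrapping in quotes.
const shellQuote = (s) => `'${s.replace(/'/g, `'\\''`)}'`;
exec(`echo ${shellQuote(greeting)}`);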
Much better in this case: use a function that is designed to pass arguments directly to an executable - execFile:
execFile("echo", [greeting]);
Note that this does not invoke a shell; it actually executes /bin/echo, not the bash builtin echo. It also does not do any shell parsing of the arguments, so wildcards, variables, etc. will not be substituted.
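For completeness, execFile can be promisified just like exec. A minimal sketch, reusing the greeting variable from the question:

const util = require('util');
const execFile = util.promisify(require('child_process').execFile);
async function myBash(greeting) {
  // Arguments are passed as an array, so no shell quoting is needed.
  const { stdout } = await execFile('echo', [greeting]);
  console.log(stdout);
}
myBash('hello');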
Related
I have a Node JS CLI script that automates some migrations to a third-party service. I've largely avoided asynchronous methods (e.g. prefixing with async) because 1) I don't fully understand them in this context and 2) they haven't been necessary for the script thus far.
Where I'm having trouble, is that I'm looping through a set of files and attempting to call a method on each entry, but the method doesn't execute before the script exits.
Here's the primary method:
const migrateAll = (app, env, source) => {
const self = this;
fs.promises
.readFile(config, "utf8")
.then((contents) => {
self.config = JSON.parse(contents);
})
.then(() => {
const spaceId = self.config.applications[app].space_id;
fs.readdir(source, "utf8", (err, files) => {
if (err) throw err;
files.forEach((file) => {
console.log(chalk.yellow(`Migrating "${file}" to the "${env}" environment for "${app}"`))
migrate(file, env, app);
});
process.exit();
});
});
};
The call to migrate(file, env, app) doesn't appear to run at all. The contents of that function are:
const migrate = (space, env, migration) => {
  exec(
    `migrate ${migration} "${space}" ${env}`,
    (error, stdout, stderr) => {
      if (error) {
        // A `switch` to handle errors.
        process.exit();
      }
      if (stderr) {
        console.log(`stderr: ${stderr}`);
        process.exit();
      }
      success(stdout);
    }
  );
};
The rest of the script, in context, looks like this:
const parseFlags = () => {
process.argv.splice(2).forEach((arg) => {
let parts = arg.split("=");
switch (parts[0]) {
// parse flags into variables
}
});
if (all) {
migrateAll(app, env, source);
}
return { app, env, source };
};
const run = () => {
try {
intro();
checkSettings();
const { app, env, source } = parseFlags();
// continue on here if migrateAll doesn't get called
} catch (err) {
complain(err);
}
};
run();
So, with the right flags, we call migrateAll(), which in turn calls the migrate function for each file we find that needs to migrate. Some things I've noticed/tried:
The console.log inside of the forEach in migrateAll runs as expected
I've tried various combinations of await and async, .then(), promisify, etc, but it feels like I'm throwing things at the wall just to see what sticks to no avail.
A few things:
You're calling async functions (fs.promises.readFile, readdir and exec) from within synchronous contexts. So in your script execution you have basically this:
migrate() ---------------+
| |
execution complete |
| readFile()
exit |
parse()
|
readdir()
|
exec()
Your synchronous execution completes before the async work finishes.
You seem to be spawning off a bunch of child processes to run these modules; you should instead require them and just run them in-process.
exec is not a safe way to spawn a child process, as you're passing the string directly into a shell. If I as a user can control any of those three arguments, I can pop a remote shell with netcat.
Using the shorthand migrate is not a safe way to call a child process, as it resolves from the PATH environment variable. If I have access to the runtime environment, I can make migrate point to whatever I want.
Don't call process.exit() without an exit code. The exit code you pass lets the caller or operating system know whether something went wrong: on success call process.exit(0); on error use any integer greater than 0 and less than 256. You can assign a unique exit code to each error situation if you wish.
Try this
// migrate.js
const {spawn} = require('child_process');
module.exports = async (space,env,migration) => new Promise((resolve,reject)=>{
let stdout = '';
let stderr = '';
let args = [migration,space,env];
let cp = spawn('/absolute/path/to/migrate', args);
cp.on('error',reject);
cp.stdout.setEncoding('utf8').on('data',(d)=>stdout+=d);
cp.stderr.setEncoding('utf8').on('data',(d)=>stderr+=d);
cp.on('exit',(code,signal) => {
if(signal)
code = signal;
if(code != 0) {
console.error(stderr);
return reject(new Error(`migrate returned exit code ${code}`));
}
resolve(stdout);
});
}).then(success); // not sure what success does, but this is equivalent to what you had
// migrate-all.js
const fsp = require('fs').promises;
const migrate = require('./migrate');
module.exports = async (app,env,source) => {
let contents = await fsp.readFile(config, 'utf8');
const parsedConfig = JSON.parse(contents); // use a local; `self` was never defined in the original
const spaceId = parsedConfig.applications[app].space_id;
let files = await fsp.readdir(source);
for (const file of files) { // avoid async calls in .forEach loops; for...of awaits in order
console.log(chalk.yellow(`Migrating "${file}" to the "${env}" environment for "${app}"`))
await migrate(file, env, app);
}
}
// index.js
const migrateAll = require('./migrate-all');
const parseFlags = async () => {
process.argv.splice(2).forEach((arg) => {
let parts = arg.split("=");
switch (parts[0]) {
// parse flags into variables
}
});
if (all) {
await migrateAll(app, env, source);
}
return { app, env, source };
};
const run = async () => {
try {
intro();
checkSettings();
const { app, env, source } = await parseFlags();
// continue on here if migrateAll doesn't get called
} catch (err) {
complain(err);
}
};
run();
I've written a node script to manage deployment of a git repository to an AWS autoscaling group.
The script uses child_process.spawn() to automate git: to clone repositories, check out tags, etc.
It works fine if git can find appropriate credentials. However, if credentials aren't automatically found, the spawned process will attempt to prompt for credentials, and at that point it will hang. Even Ctrl-C cannot exit it; the whole shell session must be ended.
The spawn() call is wrapped in a function to return a Promise. My function looks like so...
const cp = require('child_process');
let spawn_promise = (command, args, options, stream_output) => {
return new Promise((resolve, reject) => {
console.log(chalk.cyan(`${command} [${args}]`));
let childProcess = cp.spawn(command, args, options);
let std_out = '';
let std_err = '';
childProcess.stdout.on('data', function (data) {
std_out += data.toString();
if (stream_output)
console.log(chalk.green(data.toString()));
});
childProcess.stderr.on('data', function (data) {
std_err += data.toString();
if (stream_output)
console.log(chalk.red(data.toString()));
});
childProcess.on('close', (code) => {
if (code === 0) {
console.log(chalk.blue(`exit_code = ${code}`));
return resolve(std_out);
}
else {
console.log(chalk.yellow(`exit_code = ${code}`));
return reject(std_err);
}
});
childProcess.on('error', (error) => {
std_err += error.toString();
if (stream_output)
console.log(chalk.red(error.toString()));
});
});
}
I call it like so...
return spawn_promise('git', ['fetch', '--all'], {env: process.env})
.then(() => {
...
It mostly works very well, and allows easy manipulation of output, errors, etc.
I'm having trouble figuring out a nice way to handle input, though, if a spawned process needs it.
A temporary workaround for the problem is to add an environment variable that prevents git from prompting for credentials and makes it throw an error instead if it can't find credentials in the user's environment. However, this isn't ideal. Ideally I would like to be able to gracefully handle standard input, and still be able to capture and process the output and errors as I'm currently doing.
I can fix the problem with input by doing this...
let childProcess = cp.spawn(command, args, { stdio: [process.stdin, process.stdout, process.stderr] });
This allows git to prompt for credentials correctly. However I then lose the ability to capture the command output.
What is the correct way to be able to handle this?
I should also mention that the function also automates some relatively long-running processes, to build AMIs etc. This is what the stream_output parameter is for: I want to be able to view the output from the command in real time, rather than waiting until the process completes.
The child process has a stdin stream for handling input, and you can write to it while the child process is running.
See below an example:
test.sh:
#!/bin/sh
echo "Please enter something:"
read ch
echo "Thanks"
When I run this in a terminal:
shell-input $ ./test.sh
Please enter something:
something
Thanks
shell-input $
When I use your code to run this:
test.js:
const cp = require('child_process');
const chalk = require('chalk');
let spawn_promise = (command, args, options, stream_output) => {
return new Promise((resolve, reject) => {
console.log(chalk.cyan(`${command} [${args}]`));
let childProcess = cp.spawn(command, args, options);
let std_out = '';
let std_err = '';
childProcess.stdout.on('data', function (data) {
std_out += data.toString();
if (stream_output)
console.log(chalk.green(data.toString()));
});
childProcess.stderr.on('data', function (data) {
std_err += data.toString();
if (stream_output)
console.log(chalk.red(data.toString()));
});
childProcess.on('close', (code) => {
if (code === 0) {
console.log(chalk.blue(`exit_code = ${code}`));
return resolve(std_out);
}
else {
console.log(chalk.yellow(`exit_code = ${code}`));
return reject(std_err);
}
});
childProcess.on('error', (error) => {
std_err += error.toString();
if (stream_output)
console.log(chalk.red(error.toString()));
});
});
}
spawn_promise('./test.sh', { env: process.env})
.then(() => {
});
Output:
$ node test.js
./test.sh [[object Object]]
<stuck here>
I modify your code to include the following:
...
childProcess.stdout.on('data', function (data) {
if (data == "Please enter something:\n")
{
childProcess.stdin.write("something\n");
//childProcess.stdin.end(); // Call this to end the session
}
std_out += data.toString();
if (stream_output)
console.log(chalk.green(data.toString()));
});
...
Then I run again:
$ node test.js
./test.sh [[object Object]]
exit_code = 0
It works. Basically, you need to find out when stdin is waiting for input; you can use the data event on stdout for that, and then write to stdin. If you don't have credentials to write, you can end the session by calling childProcess.stdin.end();
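If the prompt only needs to reach the user's terminal, another option worth considering (a sketch, not part of the answer above) is to inherit stdin while keeping stdout and stderr piped, so the child can read what the user types but output capture still works:

const cp = require('child_process');
// stdin comes from the terminal so the child can read interactive input;
// stdout/stderr stay piped so they can still be captured and streamed.
const child = cp.spawn('git', ['fetch', '--all'], {
  stdio: ['inherit', 'pipe', 'pipe'],
});
let std_out = '';
child.stdout.on('data', (d) => { std_out += d.toString(); });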
I'm trying to execute long-running processes sequentially with node.js (docker exec commands).
I do:
const childProcess = require('child_process');
const execWithPromise = async command => {
return new Promise(async resolve => {
const process = childProcess.exec(command);
process.on('exit', err => resolve(err));
process.on('close', err => resolve(err));
});
};
const run = async () => {
await execWithPromise('/usr/local/bin/docker exec -i -t cucumber node long-running-script.js');
await execWithPromise('/usr/local/bin/docker exec -i -t cucumber node long-running-script.js');
};
run();
But the promise is resolved immediately with a result of 1, in both cases, even though the command runs on the command line just fine.
Why is it returning immediately?
child_process.exec expects a callback as the second or third argument; it doesn't return a promise. You have a few choices, depending on your use case and version of node. The following work with node 16.x.
Use a callback and return the resolve.
const execWithPromise = command =>
  new Promise((resolve, reject) => {
    childProcess.exec(command, (err, stdout, stderr) => {
      if (err) {
        reject(stderr);
      } else {
        resolve(stdout);
      }
    });
  });
Use spawn instead (keeping most of your code)
const execWithPromise = command =>
new Promise((resolve, reject) => {
const process = childProcess.spawn(command, { shell: true }); // shell: true lets spawn accept a full command string
let data = '';
let error = '';
process.stdout.on('data', stdout => {
data += stdout.toString();
});
process.stderr.on('data', stderr => {
error += stderr.toString();
});
process.on('error', err => {
reject(err);
})
process.on('close', code => {
if (code !== 0) {
reject(error)
} else {
resolve(data)
}
process.stdin.end();
});
});
Use execSync
const execWithPromise = command => childProcess.execSync(command).toString(); // note: synchronous; returns the output string directly, not a promise
I know this is an old question, but here is a useful tool I discovered with node a while back. So, say you have a node file app.ts, in TypeScript that is...
app.ts
import utils from 'util'; // The useful thing: util has a bunch of handy functions, including promisify
import { exec } from 'child_process'; // The exec import
export const execute = utils.promisify(exec);
const run = async () => {
await execute('/usr/local/bin/docker exec -i -t cucumber node long-running-script.js');
await execute('/usr/local/bin/docker exec -i -t cucumber node long-running-script.js');
};
run();
In js, though, it would probably be something like this
app.js
const utils = require('util');
const exec = require('child_process').exec;
const execute = utils.promisify(exec);
const run = async () => {
await execute('/usr/local/bin/docker exec -i -t cucumber node long-running-script.js');
await execute('/usr/local/bin/docker exec -i -t cucumber node long-running-script.js');
};
run();
I have a list of shell commands I want to execute with nodejs:
// index.js
var commands = ["npm install", "echo 'hello'"];
var exec = require('child_process').exec;
for (var i = 0; i < commands.length; i++) {
exec(commands[i], function(err, stdout) {
console.log(stdout);
});
}
When I run this, the commands are executed in the reverse order. Why is this happening? How do I execute the commands in sequence?
Better yet, is there a way to execute shell commands without using nodejs? I find its async handling of the shell a little cumbersome.
NOTE:
I know that libraries like shelljs exist. I'm trying to do this with base nodejs only.
Your for loop is executing all your asynchronous operations in parallel at once, because exec() is non-blocking. The order in which they complete depends upon their execution time and is not determinate. If you truly want them to be sequenced, then you have to execute one, wait for it to call its completion callback, and then execute the next one.
You can't use a traditional for loop to "wait" on an asynchronous operation to complete in Javascript in order to execute them sequentially. Instead, you have to make the iteration manually where you kick off the next iteration in the completion callback of the previous one. My usual way of doing that is with a counter and a local function called next() like this:
Manual Async Iteration
var commands = ["npm install", "echo 'hello'"];
var exec = require('child_process').exec;
function runCommands(array, callback) {
var index = 0;
var results = [];
function next() {
if (index < array.length) {
exec(array[index++], function(err, stdout) {
if (err) return callback(err);
// do the next iteration
results.push(stdout);
next();
});
} else {
// all done here
callback(null, results);
}
}
// start the first iteration
next();
}
runCommands(commands, function(err, results) {
// error or results here
});
ES6 Promises
Since promises have been standardized in ES6 and are built into node.js now, I like to use Promises for my async operations:
var exec = require('child_process').exec;
function execPromise(cmd) {
return new Promise(function(resolve, reject) {
exec(cmd, function(err, stdout) {
if (err) return reject(err);
resolve(stdout);
});
});
}
var commands = ["npm install", "echo 'hello'"];
commands.reduce(function(p, cmd) {
return p.then(function(results) {
return execPromise(cmd).then(function(stdout) {
results.push(stdout);
return results;
});
});
}, Promise.resolve([])).then(function(results) {
// all done here, all results in the results array
}, function(err) {
// error here
});
Bluebird Promises
Using the Bluebird promise library, this would be even simpler:
var Promise = require('bluebird');
var execP = Promise.promisify(require('child_process').exec);
var commands = ["npm install", "echo 'hello'"];
Promise.mapSeries(commands, execP).then(function(results) {
// all results here
}, function(err) {
// error here
});
Opt.1: Use the '...Sync' version of the function if it exists
In this case there is already an execSync function:
child_process.execSync(command[, options])
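Applied to the command list from the question, a minimal sketch:

var execSync = require('child_process').execSync;
var commands = ["npm install", "echo 'hello'"];
// execSync blocks until each child exits, so the commands run strictly in order.
commands.forEach(function (cmd) {
  console.log(execSync(cmd).toString());
});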
Opt.2: Generators magic!
For more general purposes, nowadays you could use this 'generator' pattern to 'deasync' any async function, which is very useful for any sequential OS script.
Here an example of how to use readline async function in a sync fashion in node.js v6+ (I think also v4+)
var main = (function* () {
var rl = require('readline')
.createInterface({input: process.stdin, output: process.stdout });
// the callback uses the iterator '.next()' to resume the 'yield'
var a = yield rl.question('do you want this? ', r=>main.next(r))
var b = yield rl.question('are you sure? ', r=>main.next(r))
rl.close()
console.log(a,b)
})() // <- generator executed, iterator 'main' created
main.next() // <- start iterator, run till the first 'yield'
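On newer node (v8 and later), async/await achieves the same sequential feel without the generator plumbing. A sketch of the same prompt flow, wrapping rl.question in a promise (the ask helper is an assumption, not a readline API):

const rl = require('readline')
  .createInterface({ input: process.stdin, output: process.stdout });
// Wrap the callback-style rl.question in a promise so it can be awaited.
const ask = (q) => new Promise((resolve) => rl.question(q, resolve));
(async () => {
  const a = await ask('do you want this? ');
  const b = await ask('are you sure? ');
  rl.close();
  console.log(a, b);
})();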
In node.js, I'd like to find a way to obtain the output of a Unix terminal command. Is there any way to do this?
function getCommandOutput(commandString){
// now how can I implement this function?
// getCommandOutput("ls") should print the terminal output of the shell command "ls"
}
This is the method I'm using in a project I am currently working on.
var exec = require('child_process').exec;
function execute(command, callback){
exec(command, function(error, stdout, stderr){ callback(stdout); });
};
Example of retrieving a git user:
module.exports.getGitUser = function(callback){
execute("git config --global user.name", function(name){
execute("git config --global user.email", function(email){
callback({ name: name.replace("\n", ""), email: email.replace("\n", "") });
});
});
};
If you're using node later than 7.6 and you don't like the callback style, you can also use node-util's promisify function with async / await to get shell commands which read cleanly. Here's an example of the accepted answer, using this technique:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec)
module.exports.getGitUser = async function getGitUser () {
// Exec output contains both stderr and stdout outputs
const nameOutput = await exec('git config --global user.name')
const emailOutput = await exec('git config --global user.email')
return {
name: nameOutput.stdout.trim(),
email: emailOutput.stdout.trim()
}
};
This also has the added benefit of returning a rejected promise on failed commands, which can be handled with try / catch inside the async code.
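For instance, a minimal sketch of that failure handling, using the same promisified exec as above (the getGitUserSafe name and the null fallback are illustrative, not from the answer):

module.exports.getGitUserSafe = async function getGitUserSafe () {
  try {
    const { stdout } = await exec('git config --global user.name');
    return stdout.trim();
  } catch (err) {
    // The rejection from promisified exec carries the exit code and stderr.
    console.error(`command failed (code ${err.code}): ${err.stderr}`);
    return null;
  }
};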
You're looking for child_process
var exec = require('child_process').exec;
var child;
child = exec(command,
function (error, stdout, stderr) {
console.log('stdout: ' + stdout);
console.log('stderr: ' + stderr);
if (error !== null) {
console.log('exec error: ' + error);
}
});
As pointed out by Renato, there are some synchronous exec packages out there now too; see sync-exec, which might be more what you're looking for. Keep in mind, though, that node.js is designed to be a single-threaded, high-performance network server, so if that's what you're looking to use it for, stay away from sync-exec kinds of things unless you're only using them during startup or something.
Requirements
This will require Node.js 7 or later, with support for Promises and async/await.
Solution
Create a wrapper function that leverages promises to control the behavior of the child_process.exec command.
Explanation
Using promises and an asynchronous function, you can mimic the behavior of a shell returning the output, without falling into callback hell and with a pretty neat API. Using the await keyword, you can create a script that reads easily, while still being able to get the work of child_process.exec done.
Code sample
const childProcess = require("child_process");
/**
* @param {string} command A shell command to execute
* @return {Promise<string>} A promise that resolves to the output of the shell command, or an error
* @example const output = await execute("ls -alh");
*/
function execute(command) {
/**
* @param {Function} resolve A function that resolves the promise
* @param {Function} reject A function that fails the promise
* @see https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise
*/
return new Promise(function(resolve, reject) {
/**
* @param {Error} error An error triggered during the execution of the childProcess.exec command
* @param {string|Buffer} standardOutput The result of the shell command execution
* @param {string|Buffer} standardError The error resulting from the shell command execution
* @see https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
*/
childProcess.exec(command, function(error, standardOutput, standardError) {
if (error) {
reject(error);
return;
}
if (standardError) {
reject(standardError);
return;
}
resolve(standardOutput);
});
});
}
Usage
async function main() {
try {
const passwdContent = await execute("cat /etc/passwd");
console.log(passwdContent);
} catch (error) {
console.error(error.toString());
}
try {
const shadowContent = await execute("cat /etc/shadow");
console.log(shadowContent);
} catch (error) {
console.error(error.toString());
}
}
main();
Sample Output
root:x:0:0::/root:/bin/bash
[output trimmed, bottom line it succeeded]
Error: Command failed: cat /etc/shadow
cat: /etc/shadow: Permission denied
Try it online on Repl.it.
External resources
Promises.
child_process.exec.
Node.js support table.
Thanks to Renato's answer, I have created a really basic example:
const exec = require('child_process').exec
exec('git config --global user.name', (err, stdout, stderr) => console.log(stdout))
It will just print your global git username :)
You can use the util library that comes with nodejs to get a promise from the exec command, and use that output as you need. Use destructuring to store the stdout and stderr in variables.
const util = require('util');
const exec = util.promisify(require('child_process').exec);
async function lsExample() {
const {
stdout,
stderr
} = await exec('ls');
console.log('stdout:', stdout);
console.error('stderr:', stderr);
}
lsExample();
You can use the ShellJS package.
ShellJS is a portable (Windows/Linux/OS X) implementation of Unix shell commands on top of the Node.js API.
see: https://www.npmjs.com/package/shelljs#execcommand--options--callback
import * as shell from "shelljs";
//usage:
//exec(command [, options] [, callback])
//example:
const version = shell.exec("node --version", {async: false}).stdout;
console.log("nodejs version", version);
Here's an async await TypeScript implementation of the accepted answer:
const execute = async (command: string): Promise<any> => {
return new Promise((resolve, reject) => {
const exec = require("child_process").exec;
exec(
command,
function (
error: Error,
stdout: string | Buffer,
stderr: string | Buffer
) {
if (error) {
reject(error);
return;
}
if (stderr) {
reject(stderr);
return;
} else {
resolve(stdout);
}
}
);
});
};
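Usage would then look something like this (the command is just illustrative):

(async () => {
  const output = await execute("echo hi");
  console.log(output);
})();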