I would like to run something like this:
C:\>ACommandThatGetsData > save.txt
But instead of running it in the console, I would like to execute that command from Node.js, parse the data, and save it.
How to execute a shell command with Node.JS?
Use process.execPath():
process.execPath('/path/to/executable');
Update
I should have read the documentation better.
There is a Child Process module which allows you to execute a child process. You will need either child_process.exec, child_process.execFile or child_process.spawn. They are all similar in use, but each has its own advantages; which one to use depends on your needs.
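For the original question (capturing the command's output and saving it yourself rather than relying on shell redirection), a minimal sketch using child_process.exec might look like this, with ACommandThatGetsData standing in for whatever command you actually run:
const { exec } = require('child_process');
const fs = require('fs');

exec('ACommandThatGetsData', (error, stdout, stderr) => {
  if (error) {
    console.error('exec error:', error);
    return;
  }
  // parse/transform stdout here as needed, then save it
  fs.writeFileSync('save.txt', stdout);
});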
You could also try the node-cmd package:
const nodeCmd = require('node-cmd');
nodeCmd.get('dir', (err, data, stderr) => console.log(data));
On newer versions of the package, the syntax changed a little:
const nodeCmd = require('node-cmd');
nodeCmd.run('dir', (err, data, stderr) => console.log(data));
I know this question is old, but it helped me get to my solution using promises.
Also see: this question & answer
const util = require('util');
const exec = util.promisify(require('child_process').exec);
async function runCommand(command) {
  // Note: the promisified exec rejects on a non-zero exit code, so a failing
  // command surfaces as a thrown error rather than an `error` property here.
  const { stdout, stderr } = await exec(command);
  if (stderr) { console.error('stderr:', stderr); }
  return stdout;
}
async function myFunction () {
  // your code here building the command you wish to execute ...
  const command = 'dir';
  const result = await runCommand(command);
  console.log("_result", result);
  // your code here processing the result ...
}
// just calling myFunction() here so it runs when the file is loaded
myFunction();
Related
I have this command I need to execute in the macOS shell: solana --help
It runs fine as expected in the macOS terminal, even if I cd into my project directory.
The application is on my global PATH too.
I have the following code opened in my vscode project:
const util = require('util');
const exec = util.promisify(require('child_process').exec);
(async () => {
const { stdout, stderr } = await exec('solana --help');
console.log('stdout:', stdout);
console.log('stderr:', stderr);
})();
Instead of showing the expected output, the program errors out with the following error:
https://i.stack.imgur.com/fseXV.png
You need to run whereis solana on the command line to find the full path to that executable, then use that full path in your code, i.e.:
const { stdout, stderr } = await exec('full_path_to_solana --help');
Then show us the result.
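If the root cause is that a process launched from VS Code doesn't inherit your interactive shell's PATH, another sketch is to extend PATH via the env option of exec; the directory below is a hypothetical placeholder, so substitute whatever whereis/which reports:
const util = require('util');
const exec = util.promisify(require('child_process').exec);

// hypothetical install directory -- replace with the directory whereis/which reports
const solanaBinDir = '/Users/you/.local/share/solana/install/active_release/bin';

(async () => {
  const { stdout, stderr } = await exec('solana --help', {
    // pass an extended PATH so the bare command name resolves in the child shell
    env: { ...process.env, PATH: `${process.env.PATH}:${solanaBinDir}` },
  });
  console.log('stdout:', stdout);
  console.log('stderr:', stderr);
})();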
When I try to execute an executable that takes parameters, it freezes the Node.js input and output until the executable is closed. Executables that don't need params just run, and the Node.js console does not freeze or lock input/output.
Example with a param: test.exe -thisisaparam. Example without params: test.exe.
Here is my code below. (It's a CLI.)
const cp = require('child_process');
let start = async function (start) {
  let command = `start "" ${start}`;
  cp.execSync(command);
  console.log("Returning to menu in 10 seconds...")
  setTimeout(() => {
    run()
  }, 10000);
};
Here is how I call the function.
async function startTest() {
await start("C:\users\user\downloads\test_param.exe")
}
Thanks, Kiefer.
Any command you run with execSync is synchronous, meaning it will wait for the command to exit and then return its output.
If you don't need the output of the command and just want to start it and detach, you should use spawn with unref().
example:
// escaped backslashes; spawn the exe directly (detached) rather than the cmd built-in `start`
const scriptPath = "C:\\users\\user\\downloads\\test_param.exe"
cp.spawn(scriptPath, ['-thisisaparam'], { detached: true, stdio: 'ignore' }).unref()
How do I set what would otherwise be command-line arguments to node for a Node.js process run from a launcher script? (The sh/CMD scripts npm places into node_modules/.bin.)
Plenty of Node.js libraries/frameworks come with their own runner script, e.g. zeit/micro or moleculer, that is usually executed from an npm script. This presents a problem in development, since in my case I want to do the equivalent of:
node --inspect -r ts-node/register -r dotenv-safe/config src/index.ts
(Except, of course, that does nothing since index.ts just exports something for the runner to pick up.)
Is there some "clean", preferably generic (i.e. not specific to a given framework's runner exposing those command-line params) way to do this that I'm missing, ideally one that works as an npm script? The only thing that seems like it would work would be, for e.g. micro:
node-dev -r ts-node/register ./node_modules/micro-dev/bin/micro-dev.js ./src/index.ts
which is kind of a mouthful from the Redundant Department of Redundancy Department and seems to obviate the point of having those launcher scripts. (It also won't work if the runner spawns other Node processes, but that's not a problem I'm actually having.) I'd like to not have to duplicate what the launcher scripts are already doing. I'm also aware of npx having --node-arg but npx is a whole another can of worms. (On Windows it's five seconds of startup time and one spurious error message just to run a script I already have installed; it also won't find an already installed package if it can't find its .cmd launcher script, e.g. when using Docker to run the dev environment. In short I'd rather not use npx for this.)
To clear up the confusion that seems to crop up in the comments: I want to override the command line parameters that affect the behaviour of the NodeJS runtime itself executing the runner script, not pass parameters to the script itself or to my code. That is, the options listed here: https://nodejs.org/api/cli.html
One option is to write a little wrapper script that uses the current process execPath to run child_process.execFile.
So the sample here is to be able to do
node --expose-http2 --zero-fill-buffers -r ./some-module.js ./test.js
but not actually write that out, instead have wrap.js inject the args:
node ./wrap.js ./test.js
I tested running this via npm in a package.json, and it works fine. I tested that it was working by having some-module.js stick a value on the global object, and then logging it in test.js.
Files involved:
wrap.js
const child_process = require('child_process');

const nodeArgs = ['--expose-http2', '--zero-fill-buffers', '-r', './some-module.js'];
const runTarget = process.argv[2];

console.log('going to wrap', runTarget, 'with', nodeArgs);

// process.argv.slice(2) already starts with runTarget (plus any extra args),
// so only the node flags need to be prepended.
const finalArgs = nodeArgs.concat(process.argv.slice(2));

const child = child_process.execFile(
  process.execPath,
  finalArgs,
  {
    env: process.env,
    cwd: process.cwd()
  }, (e, stdout, stderr) => {
    console.log('process completed');
    if (e) {
      process.emit('uncaughtException', e);
    }
  });
// execFile buffers output, but the streams are still available to pipe through live:
child.stdout.pipe(process.stdout);
child.stderr.pipe(process.stderr);
and
some-module.js
global.testval = 2;
and
test.js
console.log('hi guys, did the wrap work?', global.testval)
EDIT: Upon further thought, this solution only wraps the initial runner. Most tools, such as mocha, re-spawn a subprocess, which would then lose this effect. To really get the job done, you can proxy each of the child_process calls and more or less enforce that calls to spawn and friends also include your args.
I rewrote the code to reflect this. Here's a new setup:
package.json
{
  "scripts": {
    "test": "node -r ./ensure-wrapped.js node_modules/mocha/$(npm view mocha bin.mocha) ./test.js"
  },
  "dependencies": {
    "mocha": "^5.1.0"
  }
}
ensure-wrapped.js
const child_process = require('child_process');
// up here we can require code or do whatever we want;
global.testvalue = 'hi there'
const customParams = ['--zero-fill-buffers'];
// the code below injects itself into any child process's spawn/fork/exec calls
// so that it propagates to node child processes as well
const matchNodeRe = /((?:\s|^|\/)node(?:(?:\.exe)|(?:\.js)|(?:\s+)|$))/;
const ensureWrappedLocation = __filename;
const injectArgsAndAddToParamsIfPathMatchesNode = (cmd, args, params) => {
  if (!Array.isArray(args)) { // all child_process functions take an optional args array first
    // what we got was really options/callback, so keep it in params and start a fresh args array
    if (args !== undefined) { params.unshift(args); }
    args = [];
  }
  if (matchNodeRe.test(cmd)) {
    // prepend "-r ensure-wrapped.js" plus the custom node flags so children are wrapped too
    args.unshift(...customParams);
    args.unshift('-r', ensureWrappedLocation);
  }
  params.unshift(args);
  return params;
}
child_process._exec = child_process.exec;
child_process.exec = (cmd, ...params) => {
// replace node.js node.exe or /path/to/node to inject -r ensure-wrapped.js ...args..
// leaves alone exec if it isn't calling node
cmd = cmd.replace(matchNodeRe, '$1 -r ' + ensureWrappedLocation + ' ');
return child_process._exec(cmd, ...params)
}
child_process._execFile = child_process.execFile;
child_process.execFile = (path, args, ...params) => {
params = injectArgsAndAddToParamsIfPathMatchesNode(path, args, params);
return child_process._execFile(path, ...params)
}
child_process._execFileSync = child_process.execFileSync;
child_process.execFileSync = (path, args, ...params) => {
params = injectArgsAndAddToParamsIfPathMatchesNode(path, args, params);
return child_process._execFileSync(path, ...params);
}
child_process._execSync = child_process.execSync;
child_process.execSync = (cmd, ...params) => {
  cmd = cmd.replace(matchNodeRe, '$1 -r ' + ensureWrappedLocation + ' ');
  return child_process._execSync(cmd, ...params)
}
child_process._fork = child_process.fork;
child_process.fork = (module, args, ...params) => {
params = injectArgsAndAddToParamsIfPathMatchesNode(process.execPath, args, params);
return child_process._fork(module, ...params);
}
child_process._spawn = child_process.spawn;
child_process.spawn = (cmd, args, ...params) => {
params = injectArgsAndAddToParamsIfPathMatchesNode(cmd, args, params);
return child_process._spawn(cmd, ...params)
}
child_process._spawnSync = child_process.spawnSync;
child_process.spawnSync = (cmd, args, ...params) => {
params = injectArgsAndAddToParamsIfPathMatchesNode(cmd, args, params);
return child_process._spawnSync(cmd, ...params);
}
test.js
describe('test', () => {
  it('should have the global value pulled in by ensure-wrapped.js', (done) => {
    if (global.testvalue !== 'hi there') {
      return done(new Error('test value was not globally set'))
    }
    return done();
  })
})
Please never put code like this into a node module that's published; monkey-patching global library functions like this is pretty bad.
Everything passed on the command line is parsed into an array called process.argv; the first two entries are the node binary and your script's path, and your own arguments come after them. So...
node myapp.js foo bar hello 5000
In your nodejs code...
const args = process.argv.slice(2); // skip the node binary and the script path
console.log(args[0]);
console.log(args[1]);
console.log(args[2]);
console.log(args[3]);
would yield...
foo
bar
hello
5000
I didn't quite follow your exact scenario, but as the question title says, you can execute any shell command from Node.js using npm libraries such as node-cmd:
import Promise from 'bluebird'
import cmd from 'node-cmd'
const getAsync = Promise.promisify(cmd.get, { multiArgs: true, context: cmd })
getAsync('node -v').then(data => {
console.log('cmd data', data)
}).catch(err => {
console.log('cmd err', err)
})
How do I exec commands one after another, e.g. ls after cd?
I tried
exec = require('child_process').exec;

exec('cd ~/', function () {
  exec('ls', function (err, stdout, stderr) {
    console.log(stdout); // this logs the current dir, but not ~/'s
  });
});
exec('cd ~/').exec('ls', function (err, stdout, stderr) {
  console.log(stdout);
}) // this also fails because the first exec returns a ChildProcess object, not itself
It is not possible to do this, because exec and spawn each create a new process. But there is a way to simulate it: you can start one process with exec and execute multiple commands in it at the same time.
In the command line, if you want to execute 3 commands on the same line you would write:
cmd1 & cmd2 & cmd3
So all 3 commands run in the same process and have access to the context modified by the previously executed commands.
Let's take your example: you want to execute cd ../ and after that dir, to view the parent directory's listing.
In cmd you should write:
cd ../ & dir
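Translated directly to Node.js, a minimal sketch of the same idea is to chain the commands in a single exec() call so they share one shell (&& on Unix shells, & in Windows cmd):
const { exec } = require('child_process');

// both commands run in the same shell, so the cd affects the ls
exec('cd .. && ls', (err, stdout, stderr) => {
  if (err) return console.error(err);
  console.log(stdout); // listing of the parent directory
});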
Alternatively, from Node.js you can start a process with exec and tell it to start another node instance that evaluates an inline script:
var exec = require('child_process').exec;

var script = "var exec = require('child_process').exec;exec('dir',function(e,d,er){console.log(d);});";
script = '"' + script + '"'; // enclose the inline script in "" because it contains spaces
var cmd2 = 'node -e ' + script;

var cd = exec('cd ../ & ' + cmd2, function (err, stdout, stderr) {
  console.log(stdout); // this would work
})
If you just want to change the current directory, check the options (such as cwd) documented at http://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback
You can use Node's util.promisify and async/await:
const { promisify } = require('util');
const exec = promisify(require('child_process').exec);
export default async function () {
  // each await resolves to an object of the form { stdout, stderr }
  const cpu = await exec('top -bn1');
  const disk = await exec('df -h');
  const memory = await exec('free -m');
  const payload = {
    cpu,
    disk,
    memory,
  };
  return payload
}
If you want to use cd first, it's better to use process.chdir() (note that it doesn't expand '~' either, so pass a real path such as process.env.HOME). Then a single exec() will do the job.
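A minimal sketch of that suggestion:
const { exec } = require('child_process');

// change the working directory of the Node process itself...
process.chdir(process.env.HOME);

// ...then a single exec() runs relative to it
exec('ls', (err, stdout) => {
  if (err) return console.error(err);
  console.log(stdout);
});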
You can call exec with the cwd option, like so:
exec('ls -a', {
  cwd: '/Users/user'
}, (err, stdout) => {
  if (err) {
    console.log(err);
  } else {
    console.log(stdout);
  }
})
But beware, cwd doesn't understand '~'. You can use process.env.HOME instead.
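For example, a small variation of the snippet above that avoids hard-coding the path (os.homedir() is a cross-platform alternative to process.env.HOME):
const { exec } = require('child_process');
const os = require('os');

// run the listing in the user's home directory
exec('ls -a', { cwd: os.homedir() }, (err, stdout) => {
  if (err) return console.error(err);
  console.log(stdout);
});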
Let's say I have a file "/tmp/sample.txt" and I want to move it to "/var/www/mysite/sample.txt" which is in a different volume.
How can I move the file in Node.js?
I read that fs.rename only works inside the same volume and util.pump is already deprecated.
What is the proper way to do it? I read about stream.pipe, but I couldn't get it to work. A simple sample code would be very helpful.
Use the mv module:
var mv = require('mv');
mv('source', 'dest', function(err) {
// handle the error
});
If you're on Windows and don't have the 'mv' module, can we do something like this?
var fs = require("fs"),
    source = fs.createReadStream("c:/sample.txt"),
    destination = fs.createWriteStream("d:/sample.txt");

source.pipe(destination);

destination.on("close", function () {
    // remove the original only after the copy has been fully written
    fs.unlinkSync("c:/sample.txt");
});
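On newer Node versions (8.5+), a shorter sketch of the same copy-then-delete idea uses fs.copyFile followed by fs.unlink (the paths are the ones from the question):
const fs = require('fs');

// copy across volumes, then remove the original
fs.copyFile('/tmp/sample.txt', '/var/www/mysite/sample.txt', (err) => {
  if (err) return console.error(err);
  fs.unlink('/tmp/sample.txt', (unlinkErr) => {
    if (unlinkErr) console.error(unlinkErr);
  });
});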
The mv module, as jbowes stated, is probably the right way to go, but you can use the child process API and the built-in OS tools as an alternative. If you're on Linux, use the "mv" command; if you're on Windows, use the "move" command.
var exec = require('child_process').exec;
exec('mv /tmp/sample.txt /var/www/mysite/sample.txt',
  function (err, stdout, stderr) {
    // stdout is a string containing the output of the command.
  });
You can also use spawn if exec doesn't work properly.
var spawn = require("child_process").spawn;
var child = spawn("mv", ["data.csv","./done/"]);
child.stdout.on("end", function () {
return next(null,"finished")
});
Hope this helps you out.