Execute shell command in foreground from node.js

I'm working on a Node.js CLI script that, as part of its duties, will sometimes need to take a large block of input text from the user. Right now, I'm just using the very basic readline.prompt(). I'd like some better editing capabilities. Rather than reinvent the wheel, I figure I could just do as crontab -e or visudo do and have the script launch a text editor that writes data to a temporary file, and read from that file once it exits. I've tried some things from the child_process library, but they all launch applications in the background, and don't give them control of stdin or the cursor. In my case, I need an application like vim or nano to take up the entire console while running. Is there a way to do this, or am I out of luck?
Note: This is for an internal project that will run on a single machine and whose source will likely never see the light of day. Hackish workarounds are welcome, assuming there's not an existing package to do what I need.

Have you set the stdio option of child_process.spawn to inherit?
This way, the child process will use the same stdin and stdout as the parent node process.
This code works for me (node v4.3.2):
'use strict';
var fs = require('fs');
var child_process = require('child_process');

var file = '/tmp/node-editor-test';

// Seed the temp file, hand it to vim in the foreground, then read the result back.
fs.writeFile(file, "Node Editor", function (err) {
  var child = child_process.spawn('vim', [file], {stdio: 'inherit'});
  child.on('close', function () {
    fs.readFile(file, 'utf8', function (err, text) {
      console.log('File content is now', text);
    });
  });
});
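If you want to honor the user's preferred editor the way crontab -e does, one small variation (an assumption on my part, not part of the answer above) is to fall back on the EDITOR environment variable:
var editor = process.env.EDITOR || 'vim'; // EDITOR lookup is an assumption; defaults to vim
var child = child_process.spawn(editor, [file], {stdio: 'inherit'});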

Related

Using the exec method in Node.js

I have been following a series of tutorials on Node.js. To demonstrate child processes using exec, I was given the code below in a file named exec.js. When I go to the command line and type
node exec.js
nothing happens. Why would this be?
var exec = require("child_process").exec;
exec("open http://www.linkedin.com");
Your code works for me.
To diagnose and fix this, I would try:
In your terminal (not using Node), verify that open http://www.linkedin.com works at all. You should see your default web browser open to the appropriate URL. If that doesn't happen, then Node can't magically fix it. Most likely, you'll need to set the default browser.
Wrap the URL in single quotes just for good measure. This protects you in some cases where the URL contains certain reserved characters.
exec("open 'http://www.linkedin.com'");
Add a callback so you can see the command's output and verify it completed successfully.
exec("open 'http://www.linkedin.com'", function (err, stdout, stderr) {
console.log('err:', err);
console.log('stdout:', stdout);
console.log('stderr:', stderr);
});
The ideal solution is to use opn, a high-quality module that already exists for this exact purpose.
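For example, a minimal sketch (assuming the opn package is installed):
var opn = require('opn');
opn('http://www.linkedin.com'); // opens the URL in the platform's default browser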
For Windows, use the start command instead of open (which is for Mac) as below:
exec("start http://www.linkedin.com");

Use Node.js as Shell

How might I set up node.js as a shell replacement for bash? For example, I should be able to run vi('file') to open a file and cd('location') to change between directories.
Is this even possible?
Sure you can! It will become much less straightforward to use your computer, though.
First off, you will need to know how to set this up. While you could likely set your user shell in Linux to /usr/bin/node, this would leave you with only a Node.js REPL and no additional programs set up. What you're going to want to do is write a setup script that does all of the setup/convenience steps below for you, essentially something that ends with repl.start() to produce a REPL after setting everything up. Of course, since UNIX shell settings can't specify arguments, you will need to write a small C program that executes your shell with those arguments (essentially, exec("/usr/bin/node", "path/to/setup/script.js");) and set that as your UNIX shell.
The main idea here is that any commands that you use beyond the very basics must be require()d into your shell - e.g. to do anything with your filesystem, execute
var fs = require("fs")
and do all of your filesystem calls from the fs object. This is analogous to adding things to your PATH. You can get basic shell commands by using shelljs or similar, and to get at actual executable programs, use Node's built-in child_process.spawnSync for a foreground task or child_process.spawn for a background task.
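For instance, here is a quick sketch of what the shelljs route looks like (assuming the third-party shelljs package is installed):
var shell = require('shelljs');
shell.cd('/tmp');         // change directory
shell.ls('-la');          // list files, like ls -la
shell.exec('git status'); // run an arbitrary command synchronously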
Since part of your requirement is that you want to call each of your programs like a function, you will need to produce these functions yourself, getting something like:
function ls(path) {
  child_process.spawnSync('/bin/ls', [path], { stdio: 'inherit' });
}
for everything you want to run. You can probably do this programmatically by iterating through all the entries in your PATH and using something involving eval() or new Function() to generate execute functions for each, assigning them to the global object so that you don't have to enter any prefixes.
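A rough sketch of that idea, with the caveat that it is untested and skips eval() entirely in favor of plain assignment (it also assumes every file on your PATH is executable and has a name that is a valid identifier):
var fs = require('fs');
var path = require('path');
var child_process = require('child_process');

process.env.PATH.split(path.delimiter).forEach(function (dir) {
  var names;
  try { names = fs.readdirSync(dir); } catch (e) { return; }
  names.forEach(function (name) {
    if (global[name]) return; // don't shadow existing globals
    global[name] = function () {
      var args = Array.prototype.slice.call(arguments);
      return child_process.spawnSync(path.join(dir, name), args, { stdio: 'inherit' });
    };
  });
});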
Again, it will become much less straightforward to use your computer, despite having these named functions. Lots of programs that cheat and use bash commands in the background will likely no longer work. But I can certainly see the appeal of being able to leverage JavaScript in your command-line environment.
ADDENDUM: For writing this setup script, the REPLServer object returned by repl.start() has a context object that is the same as the global object accessible to the REPL session it creates. When you write your setup script, you will need to assign everything to the context object:
const context = repl.start('> ').context;
context.ls = function ls(path) { /* . . . */ }
context.cd = function cd(path) { /* . . . */ }
I think it would be an interesting proposition. Create a test account and tell it to use node as its shell. See 'man useradd' for all the options.
$ useradd -s /usr/bin/node test
$ su - test
This works on Mac and Linux:
require('child_process').spawnSync('vi', ['file.txt'], { stdio: 'inherit' })
You could bootstrap a REPL session with your own commands, then run the script:
#!/bin/bash
node --experimental-repl-await -i -e "$(< scripts/noderc.js)"
This allows for things like:
> ls()
> run('vi','file.txt')
> await myAsyncFunc()
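A hypothetical scripts/noderc.js for that setup might look something like this (the ls, run, and cd helpers here are just illustrations, not the author's actual script):
const { spawnSync } = require('child_process');

global.run = (cmd, ...args) => spawnSync(cmd, args, { stdio: 'inherit' });
global.ls = (dir = '.') => run('ls', '-la', dir);
global.cd = (dir) => process.chdir(dir);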
I think you're looking for something like this: https://youtu.be/rcwcigtOwQ0
If so... yes, you can!
If you like, I can share my code, but I need to fix some bugs first. Let me know if you're interested.
my .sh function:
const hey = Object.create(null),
      sh = Object.create(null);

hey.shell = Object.create(null);
hey.shell.run = require('child_process').exec;

sh.help = 'Execute an OS command';
sh.action = (...args) => {
  // repl_ is the replServer
  // the runningExternalProgram property is one way to know if I should
  // render the prompt and is not needed. I will create a better
  // way to do this (action without if/decision)!
  repl_.runningExternalProgram = true;
  hey.shell.run(args.join(' '), (...args) => {
    ['error', 'log'].forEach((command, idx) => {
      if (args[idx]) {
        console[command](args[idx]);
      }
    });
    repl_.runningExternalProgram = false;
  });
};
PS: to 'cd' into some directory you just need to call process.chdir() to change the current working directory (process.cwd() only reports it).
PS2: to avoid having to write an .sh-style action for every OS program/command, you can use a Proxy on the global object.

What is the best way to get a script to execute on the server and display the output of the script on the client using node.js?

I am developing a small web app where I want to be able to give my user the ability to press a button and be able to execute a script remotely on the server. I want the output of the script to be displayed on the client.
I am just trying to understand the best way to implement functionality like that. Are there any libraries/modules that let you do something like that directly?
Possible ways
child_process
Take a look at Node's child_process library.
I've used the .exec function before, as it runs a script to completion and captures the output. That sounds like what you are looking for; just replace the example command with yours.
var exec = require('child_process').exec,
    child;

child = exec('cat *.js bad_file | wc -l',
  function (error, stdout, stderr) {
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
    if (error !== null) {
      console.log('exec error: ' + error);
    }
  });
Wrap this call in a callback, and you'll get a response to the callback, and then to the client, when the script finishes. Be aware of the scripts you're running via an HTTP request: open-ended strings like "run any command" would be obvious security problems, and long-running scripts would be a problem for HTTP timeouts. You could implement a WebSockets-based solution and pipe results as they come (take a look at .spawn if you want to kick off a process, keep it running, and maybe interact with it later).
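As a minimal sketch of wiring that into a request handler (not from the original answer; the script path and port are placeholders):
var http = require('http');
var exec = require('child_process').exec;

http.createServer(function (req, res) {
  if (req.url === '/run') {
    // Run a fixed, known script -- never a command taken from the request.
    exec('sh /path/to/your-script.sh', function (error, stdout, stderr) {
      res.writeHead(error ? 500 : 200, { 'Content-Type': 'text/plain' });
      res.end(error ? String(stderr || error) : stdout);
    });
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);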

Node spawn stdout.on data delay

I am checking for USB drive removal on Linux. I am simply monitoring the output of a command line process with child_process.spawn. But for some reason the child's stdout data event doesn't emit until about 20 lines have been printed, which makes it unable to detect a removed drive promptly. After removing the drive many times, the event does finally fire, but obviously that won't do.
Original:
var udevmonitor = require("child_process").spawn("udevadm", ["monitor", "--udev"]);
udevmonitor.stdout.on("data", function(data) {
  return console.log(data.toString());
});
Pretty simple. So I figure it's an issue with the piping node is using internally. So instead of using the pipe, I figure I'll just use a simple passthrough stream. That could solve the problem and give me real-time output. That code is:
var stdout = new require('stream').PassThrough();
require("child_process").spawn("udevadm", ["monitor", "--udev"], { stdio: ['pipe', stdout, 'pipe'] });
stdout.on("data", function(data) {
  console.log(data.toString());
});
But that gives me an error:
child_process.js:922 throw new TypeError('Incorrect value for stdio stream: ' + stdio);
The documentation says you can pass a stream in. I don't see what I'm doing wrong and stepping through the child_process source didn't help.
Can someone help? You can run this yourself, provided you're on Linux. Run the code and insert a USB drive. You can also run the command 'udevadm monitor --udev' in another terminal to see what should happen. Remove and reinsert the drive a few times and eventually node will print the output.
mscdex, I love you. Changing the spawn command to
spawn("stdbuf", ["-oL", "-eL", "udevadm", "monitor", "--udev"]);
did the trick: stdbuf forces udevadm's stdout and stderr to be line-buffered instead of block-buffered when they aren't attached to a terminal, so each line is flushed as soon as it is printed. I really appreciate your help!
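For completeness, the working version of the original snippet with that change folded in:
var udevmonitor = require("child_process")
  .spawn("stdbuf", ["-oL", "-eL", "udevadm", "monitor", "--udev"]);
udevmonitor.stdout.on("data", function(data) {
  console.log(data.toString());
});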

Getting data into a node script - more complex than command line arguments

I split my tasks into multiple node scripts ("node programs" that are run individually with node program1.js, node program2.js, etc.) which I invoke from other Node apps using the child_process exec function.
This way things stay split up and I can reuse one program in multiple other apps.
It's easy to get data out of a node program like that: you just write whatever data you want to stdout and have exec capture it on the other end.
But how do I put data into a node program? This is easy if the data is only simple command line arguments, but how would I pass arbitrary data (binary, JSON, whatever) into it, at or close to the point where I call exec? Maybe some piping? Example code would be appreciated.
Use the env option to pass environment variables to a spawned child process. You can do this with exec(), but for spawning other Node programs it's better to use fork(), since it is designed specifically for starting new Node (V8) instances, which is what you're doing.
This is how you'd pass an environment variable:
var exec = require('child_process').exec;

var child = exec(command, {
  // Note: env values are coerced to strings, and this object replaces the
  // child's entire environment rather than extending process.env.
  env: {
    buffer: Buffer.alloc(8).toString('base64'),
    json: JSON.stringify(json),
    string: 'a simple string'
  }
}, function(error, stdout, stderr) {
  // execution callback
});
And this is how you'd read the variables back in your child process (everything arrives as a string, so decode the buffer and parse the JSON):
Buffer.from(process.env.buffer, 'base64')
JSON.parse(process.env.json)
process.env.string
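Since the answer mentions fork(), here is a small additional sketch (not from the original answer; the file names are made up) that passes structured data over the IPC channel fork() sets up, avoiding the string-only limitation of environment variables:
// parent.js
var fork = require('child_process').fork;
var child = fork('./program1.js');
child.send({ json: { hello: 'world' }, string: 'a simple string' });
child.on('message', function (result) {
  console.log('child replied:', result);
});

// program1.js
process.on('message', function (data) {
  // ...do the actual work with data here...
  process.send({ ok: true });
  process.disconnect(); // close the IPC channel so the child can exit
});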
