Using the exec method in Node.js

I have been following a series of tutorials on Node.js. To demonstrate child processes using exec, I was given the code below in a file named exec.js. When I go to the command line for node, I type in
node exec.js
then nothing happens. Why would this be?
var exec = require("child_process").exec;
exec("open http://www.linkedin.com");

Your code works for me.
To diagnose and fix this, I would try:
In your terminal (not using Node), verify that open http://www.linkedin.com works at all. You should see your default web browser open to the appropriate URL. If that doesn't happen, then Node can't magically fix it. Most likely, you'll need to set the default browser.
Wrap the URL in single quotes just for good measure. This protects you in some cases where the URL contains certain reserved characters.
exec("open 'http://www.linkedin.com'");
Add a callback so you can see the command's output and verify it completed successfully.
exec("open 'http://www.linkedin.com'", function (err, stdout, stderr) {
console.log('err:', err);
console.log('stdout:', stdout);
console.log('stderr:', stderr);
});
The ideal solution is to use opn, a high-quality module that already exists for this exact purpose.

For Windows, use the start command instead of open (which is for Mac) as below:
exec("start http://www.linkedin.com");

Related

Best way to copy a directory from an external drive to a local folder with electronjs?

Just wondering if anyone has ever attempted to copy a directory from an external drive (connected via USB) to a local folder.
I am using ElectronJS so I can use my JavaScript and HTML/CSS skills to create a desktop application without using a C-family language (i.e. C# or C++). With ElectronJS there's a lot less to worry about.
Here is the list of things I've tried so far:
basic fs.copyFile (using copyFile initially; I will then loop round the directory to copy all files)
var fs = require('fs');
window.test = () => {
  fs.moveSync("targetFile", "destDir", function (err) {
    if (err) {
      console.log(err);
    } else {
      console.log("copy complete");
    }
  });
};
This fails with fs.moveSync is not a function, even though Visual Studio Code suggested moveSync when I typed fs. (Ctrl + Space)
using child_process functions to copy files using the command line.
Code is:
var process = require('child_process');
window.test = function () {
  process.exec('ipconfig', function (err, stdout, stderr) {
    if (err) {
      console.log(err);
    } else {
      console.log(stdout);
    }
  });
};
Then bundled with browserify. Bundle.js is then imported into the html file and the test function is called on the click of a button. I'm aware the command is ipconfig for now; this was merely used to see if a command could be executed. It appears it couldn't, because I was getting process.exec is not defined.
use the node-hid node module to read and transfer data from the external drive.
The exposed functions within this module were also reported as being undefined. As I thought about the use case longer, I decided a simple copy process would suffice, because an external drive can be accessed like any other folder in the file explorer.
Unfortunately, all of the above have failed and I've spent the most part of the day looking for alternative modules and/or solutions.
Thanks in advance because any help to achieve this would be much appreciated.
Thanks
Patrick
The npm package fs-extra should solve your problem.
It has the move function, which
Moves a file or directory, even across devices
Ended up adding this to my preload.js:
window.require = require;
It will work for now but is due to be deprecated. I'll use this for now and make other updates when I have to.

What is the best way to get a script to execute on the server and display the output of the script on the client using node.js?

I am developing a small web app where I want to be able to give my user the ability to press a button and be able to execute a script remotely on the server. I want the output of the script to be displayed on the client.
I am just trying to understand the best way to implement functionality like that. Are there any libraries/modules that let you do something like that directly?
Possible ways
child_process.exec()
Take a look at Node's child_process library.
I've used the .exec function before, as it runs a script to completion and captures the output. That sounds like what you are looking for; just replace their example command with yours.
var exec = require('child_process').exec;

var child = exec('cat *.js bad_file | wc -l',
  function (error, stdout, stderr) {
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
    if (error !== null) {
      console.log('exec error: ' + error);
    }
  });
Wrap this call in a callback, and you'll get a response to the callback, and then to the client, when the script finishes. Be careful about the scripts you run via an HTTP request. Open-ended strings like "run any command" would be obvious security problems, and long-running scripts would be a problem for HTTP timeouts. You could implement a WebSockets-based solution and pipe results as they come (take a look at .spawn if you want to kick off a process, keep it running, and maybe interact with it later).

Node spawn stdout.on data delay

I am checking for USB drive removal on linux. I am simply monitoring the output of a command line process with child_process.spawn. But for some reason the child's stdout data event doesn't emit until like 20 lines have been printed, which makes it unable to detect a removed drive. After removing the drive many times, it does finally go. But obviously that won't do.
Original:
var udevmonitor = require("child_process").spawn("udevadm", ["monitor", "--udev"]);
udevmonitor.stdout.on("data", function (data) {
  return console.log(data.toString());
});
Pretty simple. So I figure it's an issue with the piping node is using internally. So instead of using the pipe, I figure I'll just use a simple passthrough stream. That could solve the problem and give me real-time output. That code is:
var stdout = new require('stream').PassThrough();
require("child_process").spawn("udevadm", ["monitor", "--udev"], { stdio: ['pipe', stdout, 'pipe'] });
stdout.on("data", function (data) {
  console.log(data.toString());
});
But that gives me an error:
child_process.js:922 throw new TypeError('Incorrect value for stdio stream: ' + stdio);
The documentation says you can pass a stream in. I don't see what I'm doing wrong and stepping through the child_process source didn't help.
Can someone help? You can run this yourself, provided you're on Linux. Run the code and insert a USB drive. Perhaps you can run the command 'udevadm monitor --udev' in another terminal to see what happens. Remove and reinsert a few times and eventually node will print out.
mscdex, I love you. Changing the spawn command to
spawn("stdbuf", ["-oL", "-eL", "udevadm", "monitor", "--udev"]);
Did the trick. I really appreciate your help!

Calling Shellscript in NodeJs

I have a Node Js program that has three parts to it.
The first part executes and returns the IP and User
The second part is calling the following shellscript
#!/bin/bash
IP=$1
User=$2
ssh -i /Users/cer.pem ubuntu@$IP "cd /home/$ && ls -lth" >> /Users/outcome.txt
Upon successful creation of the outcome.txt, I have to continue doing other stuff in the third part.
I did find two items relevant to this, Run shell script with node.js (childProcess) and Node.js Shell Script And Arguments, but these aren't exactly solving my problem, as they don't talk about handling the synchronous nature of the shell script.
More info about how to really work with child_process would really help. 1) How do I pass the IP and User to the 2nd node module embedding the shell script? 2) How do I extract the dir listing from the outcome of the shell script?
Can anyone please help me here?
So if your goal is to list the contents of a folder on a remote system via ssh, be aware that this can be done several ways. Using child_process and a shell script is OK, but you could also use node-control or mscdex/ssh2 (and probably numerous others).
But in any case, when the remote work is being done, your node code will continue to execute asynchronously. Even if your script is synchronous, you have to write your control flow logic in your node.js code asynchronously.
Start with some basic nested callbacks.
function getIpAndUser(callback) {
  //get them then do
  callback(null, ip, user);
}
function listDirectory(ip, user, callback) {
  //do your child_process.exec here
  //eventually call
  callback(null, output);
}
function thirdPart() {
}
//combine them together correctly:
getIpAndUser(function (error, ip, user) {
  if (error) {
    console.error(error);
    return;
  }
  listDirectory(ip, user, function (error, output) {
    if (error) {
      console.error(error);
      return;
    }
    thirdPart();
  });
});
Once you grok that you can rewrite the control flow using something like async.js or a promises library if you so choose.
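For reference, the same flow with promises might look like this (a sketch; the stand-in bodies just echo fixed values where the real IP lookup and child_process.exec call would go):

```javascript
// Promise version of the nested-callback flow above.
function getIpAndUser() {
  // stand-in: resolve with hard-coded values where the real lookup would go
  return Promise.resolve({ ip: '1.2.3.4', user: 'ubuntu' });
}

function listDirectory(ip, user) {
  return new Promise(function (resolve, reject) {
    // the child_process.exec call would go here; resolve(stdout) in its callback
    resolve('total 0\n');
  });
}

getIpAndUser()
  .then(function (res) { return listDirectory(res.ip, res.user); })
  .then(function (output) {
    console.log(output); // thirdPart() would run here
  })
  .catch(function (err) { console.error(err); });
```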
To address your further questions in the comments:
1) child_process.exec('list_dir.sh ' + ip + ' ' + user, callback)
Note you should eventually escape those arguments properly and probably switch to child_process.execFile, but start with that.
2) The way it's done in the example that is right there in the documentation.

Execute shell command in foreground from node.js

I'm working on a Node.js CLI script that, as part of its duties, will sometimes need to take a large block of input text from the user. Right now, I'm just using the very basic readline.prompt(). I'd like some better editing capabilities. Rather than reinvent the wheel, I figure I could just do as crontab -e or visudo do and have the script launch a text editor that writes data to a temporary file, and read from that file once it exits. I've tried some things from the child_process library, but they all launch applications in the background, and don't give them control of stdin or the cursor. In my case, I need an application like vim or nano to take up the entire console while running. Is there a way to do this, or am I out of luck?
Note: This is for an internal project that will run on a single machine and whose source will likely never see the light of day. Hackish workarounds are welcome, assuming there's not an existing package to do what I need.
Have you set the stdio option of child_process.spawn to inherit?
This way, the child process will use the same stdin and stdout as the top node process.
This code works for me (node v4.3.2):
'use strict';
var fs = require('fs');
var child_process = require('child_process');

var file = '/tmp/node-editor-test';

fs.writeFile(file, "Node Editor", function () {
  var child = child_process.spawn('vim', [file], {stdio: 'inherit'});
  child.on('close', function () {
    fs.readFile(file, 'utf8', function (err, text) {
      console.log('File content is now', text);
    });
  });
});