Elevating NodeJS ChildProcess created with child_process.fork() - node.js

I'm developing an electron application which downloads software. For users who target "Program Files" however, the installation needs to run with administrator permissions.
I'm creating a child process in which the installer runs using child_process.fork(), and am depending on the IPC connection for the ability to send and receive messages.
Unfortunately however, I can't find any way to elevate this process. Some libraries (such as node-windows) use child_process.exec() under the hood, but this doesn't create the IPC connection.
What is the best way to go about this?

The simplest option is to run the whole app as administrator.
You can force (or, to be politically correct, remind) the user to run as admin.
E.g. in electron-builder with "requestedExecutionLevel": "requireAdministrator"
If you want to elevate only child process, you can either make this child process smart enough to ask for elevation, or use an 'elevator', extra program which will ask for the elevation.
Node-windows does that with a VBS script.
electron-react-boilerplate does that with the pre-compiled program elevate.
Also node-powershell supports executing commands, if necessary, with elevation (basic powershell).
As for IPC, what are you after? child_process.exec buffers the output, while child_process.spawn gives it to you in a stream-like manner (see child process).
You just need to provide a callback with the correct arguments.
Example from child process:
const { exec, spawn } = require('child_process');

exec('my.bat', (err, stdout, stderr) => {
  if (err) {
    console.error(err);
    return;
  }
  console.log(stdout);
});

Related

Using gcloud commands in nodejs application

Some gcloud commands don't have API or client library support (for example - this one).
In these cases, is there a simple way to run gcloud commands from a nodejs application?
The gcloud endpoints service commands for IAM policy are difficult for me to check quickly but, IIRC (and if this is similar to the gcloud projects commands for IAM policy), it's not that there's no API, but that there's no single API call.
What you can always do with gcloud is append --log-http to see what happens beneath the covers. With IAM policy mutations (off the top of my head), you get the policy, mutate it, and then apply the changes back using the etag the GET gave you. The backend checks the policy's state (the etag is like a hash of the policy) and, if it's unchanged, applies your change.
If this is what's happening here, you should be able to repro the functionality in NodeJS using the existing (!) APIs and, if you're using API Client Libraries (rather than Cloud Client libraries), the functionality will be available.
Apart from the complexity involved in shelling out to gcloud, you'll need to also authenticate it and then you'll need to (un)marshal data to the shell and manage errors. Ergo, it's messy and generally discouraged.
In Node.js we have the child_process module. As the name suggests, child_process provides functions like spawn and exec that create a new child process to execute a shell command as an independent process. spawn takes the main command as its first argument and the remaining command-line options as an array in its second argument.
So, with respect to the link that you shared, you might end up writing something like this:
const { spawn } = require("child_process");

const listening = spawn('gcloud', ['endpoints', 'services', 'blah', '--option', 'someValue']);

listening.stdout.on("data", data => {
  console.log(`stdout: ${data}`);
});

listening.stderr.on("data", data => {
  console.log(`stderr: ${data}`);
});

listening.on('error', (error) => {
  console.log(`error: ${error.message}`);
});
References:
https://nodejs.org/api/child_process.html#child_process_child_process_spawn_command_args_options
I'm not sure this directly answers your question, but there is an npm package that can help you run Unix commands from within the app.
Check out shell.js

How to automigrate when needed in loopback 3?

I created an automigrate script under /bin in my loopback app and added its path in the package.json file so that I can run this script to automigrate whenever I want from the terminal.
I also have a boot script "createUsers.js" which creates some default users in a model. The problem is, whenever I run this script it calls the boot script and it tries to create the users while automigration is still not finished, resulting in a failed automigration. I don't understand why the boot scripts are called when I only run automigrate script specifically. I could call automigrate in the boot scripts and wrap the createUsers.js code in its callback (as shown here), but that would automigrate every time the app is started which is undesirable since the data is lost on automigration. Where should I call automigrate() so that it can be called whenever required? Any help is greatly appreciated.
What I normally do is create a script called util.js inside boot:
util.js
class util {
  static _automigrate() {
    // Whatever you need to do
  }
}

module.exports = function (server) {
  global.util = util;
};
This way your script is available across the entire application. And you can call it whenever you need to.
You could call it with
util._automigrate();
I normally use this pattern to store all my input validations etc since I might need those across different models.

Initialize Git Repository From Web Interface

I'm in the process of building a web application that (without getting into the exact application of these technologies) will allow users to create repositories and share them with one another.
I am in the initial design phase, and wanted to know what the best way to go about executing terminal commands from the interface would be. Ideally, users would be able to click a button and I would initialize a new git repository for them.
Note: During the design process, I will be hosting the site on my Amazon EC2 Instance that has git installed.
In summary, you need to run git from a Node.js application. "Running" git is actually spawning the git process, and this is something you can do natively.
// Spawn a git process.
const spawn = require('child_process').spawn;
const git = spawn('git', ['init']);

// Hook into the close event. See the manual for other events.
git.on('close', (code) => {
  // You can check the return code here to see if an error occurred.
  console.log('git init finished with return code ' + code);
});

Electron Node.js node localstorage osx mkdir permission denied

I am working with Electron and Node.js. We have developed an application that works fine on Windows, and as a requirement we had to package it for macOS. I packaged the application using electron-packager; the packaging process completes and the package is generated. Double-clicking it throws a "permission denied" error for mkdir, as I am using node-localstorage to maintain some settings on the user's local machine. Somehow macOS doesn't allow local storage to create a folder in the root of the application. Any help in this matter will be great. Thanks
First off, is the code in question in the main process or in a renderer process? If it is the latter, you don't need to use 'node-localstorage', because you can use the renderer's native LocalStorage. If you are in the main process, then you need to provide your own storage strategy so using 'node-localstorage' is a viable option.
In any case, you need to carefully consider where to store the data; for starters, let's look at where Electron's renderer processes would store its LocalStorage data: this differs based on the OS, but you can get and set the paths using the app module -- the path in question is userData, which on OS X would default to ~/Library/Application Support/<App Name>. Electron uses that folder to persist cookies, caches, LocalStorage etc. so I would suggest using that folder as well. (Otherwise, refer to XDG defaults for good defaults)
What your example above was trying to do is store your 'errorLogDb' in the current working directory, which might depend on your OS, where your App is installed, how you executed it, etc.
Finally, it's a good idea to differentiate between your 'production' app and your app during development and testing, because you might not want to use the same storage folders for every environment. In any case, just writing to './errorLogDb' is likely to cause lots of headaches so I'd be thankful for the permission denied error.
This strategy worked for me:

const { LocalStorage } = require('node-localstorage');

let ls;
mb.on('ready', () => {
  let prefsPath = mb.app.getPath('userData') + '/prefs';
  ls = new LocalStorage(prefsPath);
  loadPrefs();
});

mb.on('after-create-window', () => { /* ls... */ });

exports.togglePref = () => { /* ls... */ };

nodejs ssh2 handling data responses in persistent shell

I want to open a persistent connection over SSH, type commands, and handle their responses. Commands will likely hook on to each other, such as changing directories then running another command, so exec does not seem to be an option from what I understand. With PHP and phpseclib it was simple; I could just do:
$ssh->sftp('cd /some/dir');
$response = $ssh->sftp('ls');
However with ssh2 and nodejs there appears to be only one handler for all incoming data, so no matter what I write, it will all come back to the same function, which makes it hard to determine what is what. Especially since I can not control what comes back. If I did an 'ls' I would get a list of files and folders, but if I did a grep or tail I would get a different type of list, but my handler would not know which is which to handle/parse them properly.
How can I solve this issue?
Perhaps I am looking at this the wrong way and just need someone to take the PHP glasses off. My goal is to build a small local app that will connect to my servers through ssh and do complex tasks like grabbing my access logs and parsing all the data into a more readable format for me, or maybe creating a new sites-available config file and then a2ensite'ing it, or vardumping my databases and downloading the files to back them up locally, etc.
ssh.connection.shell is used to get access to the remote shell in an interactive manner. For request/response-style work, where each command should come back with its own result, shell is not the right option.
Read this:
https://github.com/mscdex/ssh2/issues/210
You can do the same with the ssh2 npm package:
conn.sftp(function(err, sftp) {
  if (err) throw err;
  sftp.readdir('foo', function(err, list) {
    if (err) throw err;
    console.dir(list);
    conn.end();
  });
});