Forked electron process requires "empty" electron - node.js

I'm building a command-line interface in Node which connects to a background command-line daemon. If no daemon is running, the first time the CLI is called it forks off the daemon using child_process.fork.
The daemon needs to launch Electron BrowserWindow instances, but requiring electron shows unusual behavior.
When running the daemon on its own in the foreground, everything works smoothly; however, when it is forked into the background I get an empty module when requiring electron.
Printing Object.keys(require('electron')) to the console shows the number sequence 0..84, and printing the result of require('electron') shows the string /path/to/electron/dist/electron.
Printing out process.argv shows that the forked script is definitely being executed with Electron.
I'm stumped. Any direction would be greatly appreciated.
Example:
launcher
#!/usr/local/bin/electron
const cp = require('child_process');

console.log();

const cld = cp.fork(__dirname + '/daemon', {
  stdio: ['inherit', 'inherit', 'inherit', 'ipc']
});

cld.on('message', (code) => {
  code = parseInt(code);
  cld.disconnect();
  process.exit(code);
});
daemon
#!/usr/local/bin/electron
const fs = require('fs');
const log = (x) => fs.appendFileSync('log', x + '\n\n');

log('');

if (!process.send) process.send = console.log;

log(process.argv);

const e = require('electron');
log(e);
log(Object.keys(e));
log(e.app);

process.send(0);
Resulting log file
*removed*/lib/thirdparty/node_modules/electron/dist/electron,*removed*/tmp/daemon
*removed*/lib/thirdparty/node_modules/electron/dist/electron
0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84
undefined
Log file from running just the daemon
*removed*/lib/thirdparty/node_modules/electron/dist/electron,./daemon
[object Object]
clipboard,nativeImage,shell,app,autoUpdater,BrowserView,BrowserWindow,contentTracing,crashReporter,dialog,globalShortcut,ipcMain,inAppPurchase,Menu,MenuItem,net,netLog,Notification,powerMonitor,powerSaveBlocker,protocol,screen,session,systemPreferences,TopLevelWindow,TouchBar,Tray,View,webContents,WebContentsView
[object App]

A forked process has ELECTRON_RUN_AS_NODE=1 set by default, so it runs as plain Node and will not expose any Electron-specific modules.
As https://github.com/electron/electron/issues/6656 explains, you may need to work around this by explicitly launching the child process yourself instead of relying on fork.
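A minimal sketch of that workaround, assuming the launcher itself runs under Electron (so process.execPath points at the Electron binary), that spawning the binary directly avoids the ELECTRON_RUN_AS_NODE flag, and that an 'ipc' stdio entry still provides the message channel used above:

const cp = require('child_process');

// Copy the environment and make sure ELECTRON_RUN_AS_NODE is not set,
// otherwise the child runs as plain Node and require('electron') only
// yields the path to the binary.
const env = Object.assign({}, process.env);
delete env.ELECTRON_RUN_AS_NODE;

// Spawn the Electron binary directly instead of fork(); the 'ipc' entry
// keeps cld.send()/'message' available for the exit-code handshake.
const cld = cp.spawn(process.execPath, [__dirname + '/daemon'], {
  env: env,
  stdio: ['inherit', 'inherit', 'inherit', 'ipc']
});

cld.on('message', (code) => {
  cld.disconnect();
  process.exit(parseInt(code, 10));
});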

Related

Unable to read a continuous data stream from a node child process in Node.js

First things first. The goal I want to achieve:
I have two processes:
the first one is webpack that just watches for file changes and pushes the bundled files into the dist/ directory
the second process (Shopify CLI) watches for any file changes in the dist/ directory and pushes them to a remote destination
My goal is to have only one command (like npm run start) which simultaneously runs both processes without printing anything to the terminal so I can print custom messages. And that's where the problem starts:
How can I continuously read child process terminal output?
Printing custom messages for webpack events at the right time is pretty easy, since webpack has a Node API for that. But the Shopify CLI only gives me the ability to capture their output and process it.
Normally, the Shopify CLI prints something like "Finished Upload" as soon as the changed file has been pushed. It works perfectly fine for the first time but after that, nothing is printed to the terminal anymore.
Here is a minimal representation of what my current setup looks like:
const { spawn } = require('child_process');

const childProcess = spawn('shopify', ['theme', 'serve'], {
  stdio: 'pipe',
});

childProcess.stdout.on('data', (data) => {
  console.log(data.toString());
});

childProcess.stderr.on('data', (data) => {
  // Just to make sure there are no errors
  console.log(data.toString());
});
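Since the goal is to react to specific lines such as "Finished Upload", one way to process the captured output line by line is Node's readline module. A minimal sketch, reusing the childProcess from above; it only changes how the output is consumed and does not by itself explain why the CLI stops printing:

const readline = require('readline');

// Consume the child's stdout line by line instead of raw Buffer chunks.
const rl = readline.createInterface({ input: childProcess.stdout });

rl.on('line', (line) => {
  // Hypothetical check for the message the Shopify CLI normally prints.
  if (line.includes('Finished Upload')) {
    console.log('Custom message: upload finished');
  }
});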

Executing Command in Nodejs and Attaching the output to the console

I'm working on a CLI tool to add an extra layer of automation and utilities to our workflow, and I'm wrapping the webpack development command with an alternative in my CLI (here's a demonstration):
function runDev() {
  this.doSomePreAutomationAndPreperations();
  this.runWebpackDevCommand();
}
I'm using Node.js child_process.exec and I'm trying to figure out a way to execute the webpack dev command and attach it to the terminal (like -it in Docker, if you're familiar with it), or to transfer control to the child process (so output is emitted directly to the console).
Is there a way to do that?
It turns out that I can achieve this by just making the child process inherit the stdio. For example:
const { spawn } = require('child_process');

const shell = spawn('sh', [], { stdio: 'inherit' });

shell.on('close', (code) => {
  console.log('[shell] terminated:', code);
});
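Applied to the original goal of wrapping the webpack dev command, a sketch along the same lines; the npx webpack serve invocation and the wrapper name are assumptions for illustration:

const { spawn } = require('child_process');

function runWebpackDevCommand() {
  // 'inherit' hands the terminal straight to the child, so webpack's
  // output (colors, progress, etc.) is written directly to the console.
  const child = spawn('npx', ['webpack', 'serve'], { stdio: 'inherit' });

  child.on('close', (code) => {
    console.log('[webpack] exited with code', code);
  });
}

runWebpackDevCommand();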

Retain Node.js debug mode when starting another process

To debug a Node.js process, I use:
node --inspect-brk foo_bar.js
But what if that instance starts another, separate instance? How do I run that second instance in debug mode as well?
The problem is, I am using the commander.js library for Node.js, like so:
var program = require('commander')
...
program.parse(process.argv)
This creates another Node.js process, and hence I lose the debug functionality (I am debugging via the Chromium browser). How can I overcome this?
Note: Commander only creates another process if you implement the subcommand using a stand-alone executable.
Commander detects debugger options being passed to the node invocation, and passes them to the subprocess with the debugging port incremented by 1.
e.g.
// index.js
const { program } = require('commander');

program
  .command('sub', 'stand-alone');

console.log('1');
console.log('2');

program.parse();

// index-sub.js
console.log('3');
console.log('4');
% node --inspect-brk index.js sub
Debugger listening on ws://127.0.0.1:9229/25005682-6bf4-44fb-bdb2-2cc7749bb328
For help, see: https://nodejs.org/en/docs/inspector
Debugger attached.
1
2
Debugger listening on ws://127.0.0.1:9230/00d1924a-3122-4bef-86f2-65d562fbf3ed
For help, see: https://nodejs.org/en/docs/inspector
Debugger attached.
3
4

NodeJS child process stdout not returning completly

I'm running the following code using Node, and whenever the file is bigger than about 12 KB I only get roughly 80% of what stdout is supposed to return. If the file is 240 KB it will still output only 80% of it. If it's under 12 KB the output is complete.
When I open a cmd and run the command manually I always get the full output.
I tried exec and execFile, and I tried increasing the max buffer and changing the encoding, but that's not the issue. I tried adding the options {shell: true, detached: true}, but in vain; in fact, when I run it detached it seems to run completely fine, as it opens an actual console, but then I'm not able to retrieve the stdout once the process has completed.
const spawn = require('child_process').spawn;

// Named `child` instead of `process` so the Node global is not shadowed.
const child = spawn(
  'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\unrtf.exe',
  ['--html', 'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\tempaf45.rtf'],
);

let chunks = [];

child.stdout.on('data', function (msg) {
  // msg is a Buffer; spreading it appends its bytes to the array.
  chunks = [...chunks, ...msg];
});

child.on('exit', function () {
  const buffer = Buffer.from(chunks);
  console.log(buffer.toString());
});
Any clues? It seems to be Node, since when I run the command manually everything works fine.
According to the Node.js documentation, all of the child_process commands are asynchronous, so when you try to access your chunks variable there is no guarantee that your command has finished; it may still be running. It is therefore recommended to wrap the entire child_process call in something you can await.
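A minimal sketch of that suggestion: wrap the spawn call in a Promise that collects the stdout chunks and resolves only on the 'close' event, which fires after the stdio streams have ended (the unrtf paths below are shortened, hypothetical versions of the ones in the question):

const { spawn } = require('child_process');

function run(cmd, args) {
  return new Promise((resolve, reject) => {
    const child = spawn(cmd, args);
    const chunks = [];

    child.stdout.on('data', (data) => chunks.push(data));
    child.on('error', reject);

    // 'close' fires once the stdio streams have ended, so all output
    // has been received by the time we resolve.
    child.on('close', () => resolve(Buffer.concat(chunks).toString()));
  });
}

// Usage:
run('unrtf.exe', ['--html', 'tempaf45.rtf'])
  .then((output) => console.log(output))
  .catch((err) => console.error(err));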

NodeJS: exit parent, leave child alive

I am writing a utility. One command of this utility runs an external application.
var child_process = require('child_process');
var fs = require('fs');
var out = fs.openSync('.../../log/out.log', 'a');
var err = fs.openSync('.../../log/err.log', 'a');
exports.Unref = function(app, argv) {
  var child = child_process.spawn(app, argv, {
    detached: true,
    stdio: ['ignore', out, err]
  });
  child.unref();
  //process.exit(0);
};
Currently:
$ utility run app --some-args // run external app
// can't enter the next command while the app is running
My problem is that if I run this command, the terminal is locked while the "external" application is running.
But the terminal window shouldn't be locked by the child process.
I want to run:
$ utility run app --some-args
$ next-command
$ next-command
The external application (a desktop application) will close by itself.
Like this:
$ subl server.js // this runs Sublime Text and passes a file to the editor
$ npm start // The terminal is not locked - I can execute the next command while Sublime is still running
You know what I mean? ^^
Appending ['>>../../log/out.log', '2>>../../log/err.log'] to the end of argv instead of leaving two files open should work since it's the open file handles that are keeping the process alive.
Passing open file descriptors in stdio in addition to detached: true will not work the way you expect, because there is no way to unref() the file descriptors in the parent process and still have them work for the child process. Even if there were a way, I believe that when the parent process exited, the OS would clean up (close) the file descriptors it had open, which would cause problems for the detached child process.
The only possible way that this might have been able to work would have been by passing file descriptors to child processes, but that functionality was dropped several stable branches ago because the same functionality did not exist on some other platforms (read: Windows).
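A sketch of the redirection idea from the first paragraph of this answer; it assumes the command is run through a shell (shell: true), since the >> redirections are interpreted by the shell rather than by the application itself:

var child_process = require('child_process');

exports.Unref = function(app, argv) {
  // Let the shell handle the log redirection so no file descriptors
  // stay open in the parent and keep it tied to the child.
  var child = child_process.spawn(
    app,
    argv.concat(['>>../../log/out.log', '2>>../../log/err.log']),
    {
      detached: true,
      stdio: 'ignore',
      shell: true
    }
  );
  child.unref();
};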
