Electron - How to get process argv from all electron pids - node.js

I have an Electron app with multiple BrowserWindows.
To help me tell them apart, I spawn them with additional arguments (for example: '--renderer-mode="second-window"').
Now I want to collect metric data for my current Electron processes.
I already have an IPC interface in my main process that I call from one of my renderers.
ipcMain.handle('app-metrics', (event, message) => {
  return new Promise((resolve) => {
    const appMetrics = app.getAppMetrics()
    resolve(appMetrics)
  })
})
Here I want to add the argv of each of my app's processes.
I don't know how I could get that info inside this function. I only know about process.argv, but how could I collect this info from all the sub-processes and bundle it with my appMetrics array?
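A minimal sketch of one way this could work (Linux only, since it reads /proc/<pid>/cmdline, and not what I ended up doing):
const fs = require('fs')
const { app } = require('electron')

// on Linux, a process's argv is exposed as a NUL-separated string in /proc/<pid>/cmdline
function getArgvForPid(pid) {
  try {
    return fs.readFileSync(`/proc/${pid}/cmdline`, 'utf8').split('\0').filter(Boolean)
  } catch (err) {
    return [] // process already exited, or we are not on Linux
  }
}

const metricsWithArgv = app.getAppMetrics().map((metric) => ({
  ...metric,
  argv: getArgvForPid(metric.pid)
}))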

I solved my problem in another way. My goal was to display a "process" type of my own (not the Chromium types that already exist in the metric data).
I collect the PIDs I already know and hard-code a specific type for each of them. The next step was to add this info to the metric objects. Here is my result:
ipcMain.handle('app-metrics', (event, message) => {
  return new Promise((resolve) => {
    const pids = [
      {
        name: 'main-process',
        pid: process.pid
      },
      {
        name: 'app-gui',
        pid: this.win.webContents.getOSProcessId()
      },
      {
        name: 'popup-gui',
        pid: this.winPopup.webContents.getOSProcessId()
      }
    ]
    const appMetrics = app.getAppMetrics().map((metric) => {
      const pidType = pids.filter((e) => e.pid === metric.pid)
      if (pidType.length > 0) {
        return {
          ...metric,
          appType: pidType[0].name
        }
      }
      return {
        ...metric,
        appType: ''
      }
    })
    resolve(appMetrics)
  })
})
If there is a simpler and smarter way, I'm happy to hear it. :)
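For what it's worth, the same mapping can be written a bit more compactly with Array.prototype.find; a sketch reusing the pids array from above:
const appMetrics = app.getAppMetrics().map((metric) => ({
  ...metric,
  // look up the hard-coded type for this pid, falling back to an empty string
  appType: (pids.find((e) => e.pid === metric.pid) || { name: '' }).name
}))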

Related

kill a looped task nicely from a jest test

I have a worker method 'doSomeWork' that is called in a loop, based on a flag that will be changed if a signal to terminate is received.
let RUNNING = true;
let pid;

export function getPid() {
  return pid;
}

export async function doSomeWork() {
  console.log("Doing some work!");
}

export const run = async () => {
  console.log("starting run process with PID %s", process.pid);
  pid = process.pid;
  while (RUNNING) {
    await doSomeWork();
  }
  console.log("done");
};

run()
  .then(() => {
    console.log("finished");
  })
  .catch((e) => console.error("failed", e));

process.on("SIGTERM", () => {
  RUNNING = false;
});
I am happy with this and now need to write a test. I want to:
- trigger the loop
- inject a SIGTERM into the src process
- give the loop a chance to finish nicely
- see 'finished' in the logs to know that the run method has been stopped.
Here is my attempt (not working). The test code all executes, but the src loop isn't killed.
import * as main from "../src/program";

describe("main", () => {
  it("a test", () => {
    main.run();
    setTimeout(function () {
      console.log("5 seconds have passed - killing now!");
      const mainProcessPid = main.getPid();
      process.kill(mainProcessPid, "SIGTERM");
    }, 5000);
    setTimeout(function () {
      console.log("5 secs of tidy up time has passed");
    }, 5000);
  });
});
I think the setTimeout isn't blocking the test thread, but I am not sure how to achieve this in Node/TS.
sandbox at https://codesandbox.io/s/exciting-voice-goncm
updated sandbox with the correct environment: https://codesandbox.io/s/keen-bartik-ltjtx
Any help appreciated :-)
--- update ---
I now see that process.kill isn't doing what I thought it was, even when I pass in the PID. I will try creating a process as a child of the test process, so I can send a signal to it. https://medium.com/@NorbertdeLangen/communicating-between-nodejs-processes-4e68be42b917
You are getting this issue because the environment in your codesandbox is create-react-app, i.e. it's a client-side script and not a server-side instance of Node.
Recreate your project but select "Node HTTP Server" as your environment. This will give you a Node environment where the Node process functions (e.g. process.kill) will work, because that environment runs in a server-side Docker container. See here for more info on CodeSandbox's environments.
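Once the program runs as its own OS process, the test can spawn it and signal it directly. A rough sketch (the compiled entry point build/program.js and the timings are assumptions):
import { spawn } from "child_process";

describe("main", () => {
  it("stops the loop on SIGTERM", async () => {
    // run the looping program as a separate node process and capture its stdout
    const child = spawn("node", ["build/program.js"], { stdio: ["ignore", "pipe", "inherit"] });
    let output = "";
    child.stdout.on("data", (chunk) => { output += chunk; });

    // let the loop run for a moment, then ask it to stop
    await new Promise((resolve) => setTimeout(resolve, 2000));
    child.kill("SIGTERM");

    // wait for the child to exit, then check it finished cleanly
    await new Promise((resolve) => child.on("exit", resolve));
    expect(output).toContain("finished");
  }, 10000);
});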

How can you record a `.har` of an Electron `webContents` session?

I have a Javascript application that spawns Electron and does a bunch of stuff in it.
I'm trying to debug a strange network issue I'm having, and to do this, I'd like to use a HAR file to store a log of all the HTTP requests being made by Electron.
Is this possible?
Yes, it can be done using chrome-har-capturer - you can pass a bunch of events from the webContents.debugger and then chrome-har-capturer will transform them into a HAR for you.
Example code:
const chromeHarCapturer = require('chrome-har-capturer')
const fs = require('fs-extra') // provides writeJson

let log = []
const webContents = browserWindow.webContents

webContents.debugger.on("message", function (event, method, params) {
  // https://github.com/cyrus-and/chrome-har-capturer#fromlogurl-log-options
  if (!["Page.domContentEventFired", "Page.loadEventFired", "Network.requestWillBeSent", "Network.dataReceived",
        "Network.responseReceived", "Network.resourceChangedPriority", "Network.loadingFinished",
        "Network.loadingFailed"].includes(method)) {
    // not relevant to us
    return
  }
  log.push({ method, params })
  if (method === 'Network.responseReceived') {
    // the chrome events don't include the body, attach it manually if we want it in the HAR
    webContents.debugger.sendCommand('Network.getResponseBody', {
      requestId: params.requestId
    }, function (err, result) {
      result.requestId = params.requestId
      log.push({
        method: 'Network.getResponseBody',
        params: result
      })
    })
  }
})

webContents.debugger.once("detach", function () {
  // on detach, convert the collected events and write out the HAR
  return chromeHarCapturer.fromLog("http://dummy-url-for-whole-session", log).then(function (har) {
    const path = `/tmp/${Number(new Date())}-har.json`
    fs.writeJson(path, har)
    log = []
  })
})

// attach and subscribe to the required events
webContents.debugger.attach()
webContents.debugger.sendCommand('Network.enable')
webContents.debugger.sendCommand('Page.enable')
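When the session you want to capture is over, detaching the debugger fires the "detach" handler above, which writes the HAR; for example (you decide when the capture is done):
// once done capturing, detach to flush the HAR to disk
webContents.debugger.detach()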

child_process.fork() in Electron

Is it possible to fork a child_process from an Electron renderer process? I found some posts across the net, but nothing that helped me get my code working.
I created a module that forks child processes. This code works when I run it from cmd under Node. But when I try to integrate it into my Electron app, I cannot communicate with the child via the child.send() method.
// create fork
const fork = require('child_process').fork;
const fs = require('fs');

const img_path = [
  'path/to/an/image1.jpg',
  'path/to/an/image2.jpg',
  'path/to/an/image3.jpg'
];
const cp = [];
const temp_path = img_path.map((item) => item);

createAndResize(2);

function createAndResize(num) {
  return childResize(createChildProcess(num));
}

function createChildProcess(num) {
  if (num <= 0) {
    return cp;
  } else {
    let cf = fork('./child.js');
    cp.push(cf);
    num -= 1;
    return createChildProcess(num);
  }
}

function childResize(list) {
  if (list.length <= 0) {
    return true;
  } else {
    // child_process is created
    let child = list.shift();
    child.on('message', function (data) {
      if (!temp_path.length) {
        process.kill(data);
      } else {
        child.send(temp_path.shift());
      }
    });
    child.send(temp_path.shift());
    setTimeout(function () {
      childResize(list);
    }, 1000);
  }
}

// child.js
process.on('message', function (msg) {
  console.log(msg); // this is never reached
});
EDIT: Based on the comment below, I now fork the child processes in the main process. The communication seems to work, with a few exceptions. But first, my new code:
// myView.js
const { remote } = require('electron');
const mainProcess = remote.require('./main.js');
const { forkChildfromMain } = mainProcess;

forkChildfromMain();

// main.js
const fork = require('child_process').fork;
let cp = [];

function forkChildfromMain() {
  createAndResize(4);
}

function createAndResize(num) {
  return childResize(createChildProcess(num));
}

function createChildProcess(num) {
  if (num <= 0) {
    return cp;
  } else {
    let cf = fork('./resize.js');
    cp.push(cf);
    num -= 1;
    return createChildProcess(num);
  }
}

function childResize(list) {
  if (list.length <= 0) {
    return true;
  } else {
    let child = list.shift();
    child.on('message', function (msg) {
      // logs 'Hello World' to the cmd console
      console.log(msg);
    });
    child.send('Hello World');
    setTimeout(function () {
      childResize(list);
    }, 1000);
  }
}

exports.forkChildfromMain = forkChildfromMain;

// child.js
process.on('message', function (msg) {
  // this console statement never gets logged
  // I think I must integrate an ipcModule
  console.log(msg);
  // process sends msg back to main.js
  process.send(msg);
})
OUTDATED: The main problem now is that I think Electron 'spawns' new child processes and does not fork them.
When I look at my task manager I see only one Electron instance, whereas when I run the code in a Node environment I see multiple forked Node instances.
The reason I prefer to fork my child processes into multiple Node instances is that I want to do a lot of image manipulation. When I fork children, every child has its own Node instance with its own memory and so on. I think that would be more performant than a single instance sharing memory and resources across all of the children.
The second unexpected behavior is that the console.log statement in the child is not printed to my cmd console. But that is the smaller issue :)
EDIT: After analysing my task manager a little more in depth, I saw that Electron does spawn multiple child processes like it should.
Electron's renderer process is not the right place for forking child processes; you should think about moving this to the main process.
Nonetheless, it should work the way you describe. If you make a minimal example available somewhere, I could take a closer look.
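A rough sketch of that structure (the channel name 'fork-workers' and the file paths are assumptions, not taken from the question): keep the fork calls in the main process and trigger them from the renderer over IPC instead of remote:
// main.js
const { ipcMain } = require('electron')
const { fork } = require('child_process')

ipcMain.handle('fork-workers', (event, count) => {
  const pids = []
  for (let i = 0; i < count; i += 1) {
    const child = fork('./resize.js') // each worker gets its own Node instance
    child.on('message', (msg) => console.log('worker said:', msg))
    child.send('Hello World')
    pids.push(child.pid)
  }
  return pids // only serializable data can be returned over IPC
})

// myView.js (renderer, assuming nodeIntegration or a preload script exposes ipcRenderer)
const { ipcRenderer } = require('electron')
ipcRenderer.invoke('fork-workers', 4).then((pids) => console.log('worker pids:', pids))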

What is a sensible way to structure my control flow (promises and looping)?

I'm not sure how to adequately achieve my desired control flow using promises/Bluebird.
Essentially I have a database with X 'tasks' stored, and each needs to be loaded and executed sequentially. I don't want to run more than one task concurrently, and the entire code must continue executing indefinitely.
I have achieved this with the following code so far:
export default function syncLoop() {
  getNextTaskRunner().then((taskRunner) => {
    if (taskRunner) {
      taskRunner.startTask()
        .then(syncLoop)
        .catch((error) => {
          throw new Error(error);
        });
    } else {
      syncLoop();
    }
  });
}
getNextTaskRunner() simply loads and resolves with the next task from the database (calculated based on timestamps), or it resolves with null (no task available).
taskRunner.startTask() resolves with null when the full task has completed.
I've been advised that the way it is structured (recursion with promises) could lead to stack issues after it has been running for some time.
What I've thought about doing is restructuring it to something like:
let running = false;

setInterval(() => {
  if (!running) {
    running = true;
    getNextTaskRunner().then((taskRunner) => {
      if (taskRunner) {
        taskRunner.startTask()
          .then(() => {
            running = false;
          })
          .catch((error) => {
            log.error(error);
          });
      } else {
        running = false;
      }
    });
  }
}, 5000);
Or, as yet another possibility, using event emitters in some form?
task.on('complete', nextTask);
Thoughts and advice will be greatly appreciated!
What stack issues? The way you've written your code is perfectly fine as long as getNextTaskRunner is truly async (i.e. it gives control back to the event loop at some point, e.g. because it does async I/O). There is no recursion in your code in that case. Whoever told you that is mistaken.
Though you might want to add a setTimeout somewhere so you won't flood your db with requests. Plus it will help you if getNextTaskRunner ever stops being truly async (due to, for example, in-memory caching):
export default function syncLoop() {
  setTimeout(() => {
    getNextTaskRunner().then((taskRunner) => {
      if (taskRunner) {
        taskRunner.startTask()
          .then(syncLoop)
          .catch((error) => {
            throw new Error(error);
          });
      } else {
        syncLoop();
      }
    });
  }, 2000);
}
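For comparison, here is a sketch of the same loop with async/await (same assumptions about getNextTaskRunner() and startTask()); awaiting inside a plain while loop never grows the stack, so the question of recursion disappears entirely:
export default async function syncLoop() {
  while (true) {
    const taskRunner = await getNextTaskRunner();
    if (taskRunner) {
      await taskRunner.startTask();
    } else {
      // nothing to do right now; wait a bit so the db isn't polled in a tight loop
      await new Promise((resolve) => setTimeout(resolve, 2000));
    }
  }
}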

websocket interrupted while angular2 project is loading on firefox

I've just started Angular 2. I've built an Angular 2 sample as given in https://angular.io/guide/quickstart
When I run the project in Firefox using the
npm start
command in a terminal, the connection gets disconnected after the output shows once. The error reads:
The connection to ws://localhost:3000/browser-sync/socket.io/?EIO=3&transport=websocket&sid=6YFGHWy7oD7T7qioAAAA was interrupted while the page was loading
Any idea how to fix this issue?
I don't know how you manage your WebSocket, but you could consider using the following code. The idea is to wrap the WebSocket in an observable.
For this you could use a service like the one below. The initializeWebSocket method creates a shared (hot) observable wrapping a WebSocket object.
export class WebSocketService {
  initializeWebSocket(url) {
    this.wsObservable = Observable.create((observer) => {
      this.ws = new WebSocket(url);

      this.ws.onopen = (e) => {
        (...)
      };

      this.ws.onclose = (e) => {
        if (e.wasClean) {
          observer.complete();
        } else {
          observer.error(e);
        }
      };

      this.ws.onerror = (e) => {
        observer.error(e);
      };

      this.ws.onmessage = (e) => {
        observer.next(JSON.parse(e.data));
      };

      return () => {
        this.ws.close();
      };
    }).share();
  }
}
You could add a sendData method to send data over the WebSocket:
export class WebSocketService {
  (...)

  sendData(message) {
    this.ws.send(JSON.stringify(message));
  }
}
The last point is to make things a bit more robust, i.e. filter received messages based on a criterion and implement retries when there is a disconnection. For this, you need to wrap the initial WebSocket observable in another one. This way we can support retries when the connection is lost and integrate filtering on criteria like the client identifier (in the sample, the received data is JSON and contains a sender attribute).
export class WebSocketService {
  (...)

  createClientObservable(clientId) {
    return Observable.create((observer) => {
      let subscription = this.wsObservable
        .filter((data) => data.sender !== clientId)
        .subscribe(observer);

      return () => {
        subscription.unsubscribe();
      };
    }).retryWhen((errors) => {
      return Observable.timer(3000);
    });
  }
}
You can see that disconnections are handled in this code using the retryWhen operator of observables.
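A hypothetical usage sketch from a component (the URL, the client id, and the message shape are assumptions, not part of the answer):
// somewhere in a component, after injecting WebSocketService as this.webSocketService
this.webSocketService.initializeWebSocket('ws://localhost:3000/socket');

this.webSocketService.createClientObservable('client-42').subscribe(
  (data) => console.log('received', data),
  (err) => console.error('socket error', err)
);

this.webSocketService.sendData({ sender: 'client-42', text: 'hello' });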
