I have a simple Node child process that invokes a script, and that script takes time to output some information (much like how ping works).
let command = spawn(
  execPath,
  [...args],
  { cwd: null, detached: false }
);
Then I do a standard console.log for the stdout:
command.stdout.on("data", (stdout) => {
  console.log("Realtime Output: ", stdout.toString());
});
The issue is, I want to send this realtime output back to the renderer process and show it on the frontend. I tried adding an ipcRenderer.send() inside the command.stdout.on() handler, but it doesn't work, and the frontend's console.log shows undefined.
Is there any way to achieve this?
You have to send the stdout from the main process to the renderer through mainWindow.webContents:
mainWindow.webContents.send('output', stdout.toString())
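For context, here is a minimal sketch of how the two sides fit together, assuming mainWindow is the BrowserWindow that should receive the output and keeping the 'output' channel name from above (this assumes nodeIntegration is enabled in the renderer; with contextIsolation you would expose the listener through a preload script instead):

// main process: forward each chunk of the child's stdout to the renderer
command.stdout.on("data", (stdout) => {
  console.log("Realtime Output: ", stdout.toString());
  mainWindow.webContents.send('output', stdout.toString());
});

// renderer process: listen on the same channel and update the UI
const { ipcRenderer } = require('electron');

ipcRenderer.on('output', (event, text) => {
  console.log('Realtime Output: ', text);
  // e.g. append text to a DOM element here
});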
I am executing the code below in Node.js.
As seen below, the parent process spawns a child process and sets an env variable.
This variable is used to decide whether the process is the parent or the child when the file is executed.
const { IS_CHILD } = process.env;

if (IS_CHILD) {
  console.log('CHILD');
  console.log('child pid = ', process.pid);
  console.log('child env values = ', process.env);
} else {
  const { parse } = require('path');
  const { root } = parse(process.cwd());
  console.log('PARENT');
  console.log('parent pid = ', process.pid);
  const { spawn } = require('child_process');
  const sp = spawn(process.execPath, [__filename], {
    cwd: root,
    env: { IS_CHILD: true }
  });
  sp.stdout.pipe(process.stdout); // if this is commented
}
The issue I am facing is: if I comment out sp.stdout.pipe(process.stdout) inside the parent process, the child process output is not shown on the console (the three lines inside the IS_CHILD if block).
If the sp.stdout.pipe(process.stdout) line is commented out, does that mean that process.env for the child process is also not set?
Can anybody please help here?
Am I missing anything?
I was assuming that even if the sp.stdout.pipe(process.stdout) line is commented out, the child process should still have the env variable set, since we have executed the spawn command.
If the sp.stdout.pipe(process.stdout) line is commented out, does that mean that process.env for the child process is also not set?
No. The environment variable is still set.
Explanation
The output you get is expected.
When you run your script, it starts the parent process, so what you see in the console is the output of the parent process. Then, via spawn, you create a child process. The child process is a completely separate process. In order to see its output in the parent process, you can use piping, as you do.
Another option is to use the child process's events. You can attach a handler to the 'data' event of the child process's stdout, e.g.:
sp.stdout.on('data', (data) => {
  console.log(`stdout: ${data}`);
});
For more info about child processes, see the child_process reference.
If you omit piping, the child process writes its output to its own stdout, which is different from the parent process's stdout. Therefore, you don't see it.
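To make this concrete, here is a minimal sketch of the parent branch from the question with the pipe replaced by a 'data' handler (same single-file layout as above). The child receives IS_CHILD in its environment whether or not the parent consumes its stdout; the handler only affects whether the parent displays that output:

const { parse } = require('path');
const { spawn } = require('child_process');

const { root } = parse(process.cwd());
console.log('PARENT');
console.log('parent pid = ', process.pid);

const sp = spawn(process.execPath, [__filename], {
  cwd: root,
  env: { IS_CHILD: true } // set regardless of what happens to the child's stdout
});

// Read the child's output through the 'data' event instead of piping it
sp.stdout.on('data', (data) => {
  console.log(`child stdout: ${data}`);
});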
I have a C program (which I didn't write) that prints some data in the terminal. I launch the program as a child process in Node with the spawn function.
const child_process = spawn('./myProgram', ['--arg']);
After that, I code the event to get the printed data:
child_process.stdout.on('data', function(data) {
  console.log(data);
});
When I run the program I can't see the output from my C program in my Node.js terminal. If I initialize the child process with stdio set to 'inherit', it works.
const child_process = spawn('./myProgram', ['--arg'], {stdio :'inherit'});
The key point here is that I need to process that data in my Node.js app. I suppose the way the C program prints the data is not the standard one, so my Node.js program does not get it.
The program was writing to stderr instead of stdout. It was fixed by adding a handler for the stderr 'data' event:
child_process.stderr.on('data', function(data) {
  console.log(data);
});
#tadman got the answer.
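When it is unclear where a child process is writing, a simple diagnostic is to listen on both streams and convert the Buffer chunks to strings. A minimal sketch, reusing ./myProgram and --arg from the question:

const { spawn } = require('child_process');

const child = spawn('./myProgram', ['--arg']);

// Chunks arrive as Buffers; toString() makes them readable in the console
child.stdout.on('data', (data) => {
  console.log('stdout:', data.toString());
});

child.stderr.on('data', (data) => {
  console.log('stderr:', data.toString());
});

child.on('close', (code) => {
  console.log('child exited with code', code);
});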
I start a child process with spawn this way:
let process = spawn(apiPath, {
  detached: true
})
process.unref()
process.stdout.on('data', data => { /* do something */ })
When I start the process I need to keep it attached because I want to read its output. But just before closing my Node process (the parent) I want to detach all unfinished child processes so they keep running in the background. However, as the documentation says:
When using the detached option to start a long-running process, the process will not stay running in the background after the parent exits unless it is provided with a stdio configuration that is not connected to the parent.
But with the option stdio: 'ignore' I can't read the stdout, which is a problem.
I tried to manually close the pipes before closing the parent process, but it was unsuccessful:
// Triggered just before the main process ends
process.stdin.end()
process.stderr.unpipe()
process.stdout.unpipe()
After many tests I found at least one way to solve this problem: destroying all the pipes before leaving the main process.
One tricky point is that the child process has to handle the destroyed pipes correctly; if it doesn't, it can get an error and close anyway. In this example the Node child process seems to have no problem with this, but it could be different in other scenarios.
main.js
const { spawn } = require('child_process')

console.log('Start Main')

let child = spawn('node', ['child.js'], { detached: true })
child.unref() // With this, the main process can end once the child is fully disconnected
child.stdout.on('data', data => {
  console.log(`Got data : ${data}`)
})

// In a real case this should be triggered just before the end of the main process
setTimeout(() => {
  console.log('Disconnect the child')
  child.stderr.unpipe()
  child.stderr.destroy()
  child.stdout.unpipe()
  child.stdout.destroy()
  child.stdin.end()
  child.stdin.destroy()
}, 5000)
child.js
console.log('Start Child')

setInterval(function() {
  process.stdout.write('hello from child')
}, 1000)
output
Start Main
Got data : Start Child
Got data : hello from child
Got data : hello from child
Got data : hello from child
Got data : hello from child
Disconnect the child
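An alternative worth mentioning, and the one the Node.js documentation suggests for long-running detached processes: instead of attaching pipes and destroying them later, give the child file descriptors for its stdio so it is never connected to the parent at all, and read its output from a log file. A minimal sketch, assuming a ./child.log path:

const fs = require('fs')
const { spawn } = require('child_process')

// Open a log file and hand its descriptor to the child as stdout/stderr
const out = fs.openSync('./child.log', 'a')

const child = spawn('node', ['child.js'], {
  detached: true,
  stdio: ['ignore', out, out] // no pipes back to the parent
})
child.unref()

// The parent can exit at any time; the child keeps running and keeps
// appending to ./child.log, which can be read later (or tailed while it runs).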
I'm having trouble stopping a Node.js process that spawns a child process. If I run the .js script from the terminal, I can stop it using Ctrl+C. But if I spawn it from a Node.js app, I cannot kill it using kill("SIGINT"); it just keeps going and continues to report stdout.
Here's the setup. I have a script, let's call it docker.js, and it does this:
// docker.js
child_process.spawn("docker-compose", ["up", "-d", ...args], { stdio: 'inherit' });
The docker-compose up command does a lot of things and runs for a while, sometimes for several minutes.
If I run ./docker.js from the terminal, I can consistently break out at any point by pressing Ctrl+C.
If I spawn docker.js from inside a different Node.js app (in my case an Electron app), using spawn() or fork():
// DockerApp.js
const dir = `path/to/dockerjs/file/`;
// Tried this with and without `detached: true`
const child = spawn("node", [`./docker.js`, ...args], { cwd: dir, env, detached: true });
// Also tried this, which uses Electron's packaged node process
const child = fork(`./docker.js`, args, { cwd: dir, env, silent: true });
And I listen for stdout, stderr, and close:
child.stdout.on("data", data => {
  console.log(`stdout: ${data}`);
});
child.stderr.on("data", data => {
  console.log(`stderr: ${data}`);
});
child.on("close", code => {
  console.log(`child process exited with code ${code}`);
});
Everything works fine (I see expected output and eventually "close" after completion), but if I try to stop the process before completion like this:
child.kill("SIGINT"); // Equivalent to Ctrl+C in terminal
The child process just keeps running, and I keep getting docker-compose output through stdout.
I've tried for a while but I cannot figure out how to stop docker.js when it is spawned as a child process from a Node.js/Electron app. I thought Ctrl+C from the terminal and child.kill("SIGINT") would have the same behavior, but they don't.
Can anyone explain what is going on here? And how can I reliably kill this docker.js child process from my Node.js app?
Try something like this in the child process:
process.on('SIGINT', () => {
  console.log('Received SIGINT');
  process.exit(0);
});
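The underlying difference is that Ctrl+C in a terminal sends SIGINT to the whole foreground process group (docker.js and the docker-compose it spawned), whereas child.kill("SIGINT") signals only the direct child, so docker-compose never sees it. One approach that works on POSIX systems, since detached: true makes the child the leader of a new process group, is to signal that whole group by negating the pid. A sketch reusing dir, env, and args from the question:

const { spawn } = require('child_process');

// detached: true makes the child a process group leader,
// so -child.pid addresses the whole group (docker.js and its children)
const child = spawn('node', ['./docker.js', ...args], {
  cwd: dir,
  env,
  detached: true
});

// Later, to stop everything the child started:
process.kill(-child.pid, 'SIGINT');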
My objective is to have some code execute after a detached, unreferenced child process is spawned from a Node.js app. Here is the code that I have:
var child_options = {
    cwd : prj
  , env : {
      PATH: cmd_directory
    }
  , detached : true
  , stdio : 'ignore'
};

// Spawn a child process running myapp with the options and command line params
child = spawn('myapp', params_array, child_options, function(err, stdout, stderr) {
  if (err) {
    console.log("\t\tProblem executing myapp =>\n\t\t" + err);
  } else {
    console.log("\t\tLaunched myapp successfully!");
  }
});
// Handle the child process exiting. Maybe send an email?
child.on('exit', function(data) {
  fs.writeFile(path.resolve("/Users/me/Desktop/myapp-child.log"), "Finished with child process!", function(err) {
    if (err) console.log(err);
  });
});

// Let the child process run in its own session without the parent
child.unref();
So the function inside the exit handler does not seem to get executed when the child process finishes. Is there any way at all to have code execute after the child process exits even when it's detached and when calling the .unref() method?
Note that if I change the 'stdio' key value in the child_options object from 'ignore' to 'inherit' then the exit handler does execute.
Any ideas?
UPDATE PART 1
So, I still cannot figure this one out. I went back to the Node.js docs on spawn and noticed the example about spawning long-running processes. In one example, they redirect the child process's output to files instead of just using 'ignore' for the 'stdio' option. So I changed the 'stdio' key within the child_options object as in the following, but alas I am still not able to execute the code within the 'close' or 'exit' event:
var out_log = fs.openSync(path.resolve(os.tmpdir(), "stdout.log"), 'a'),
    err_log = fs.openSync(path.resolve(os.tmpdir(), "stderr.log"), 'a');

var child_options = {
    cwd : prj
  , env : {
      PATH: cmd_directory
    }
  , detached : true
  , stdio : ['ignore', out_log, err_log]
};
So, the stdout.log file does get the stdout from the child process, so I know the redirection works. However, the code in the 'close' or 'exit' event still does not execute. Then I thought I would be able to detect when the writing to the out_log file was finished, in which case I could execute code at that point. However, I cannot figure out how to do that. Any suggestions?
You can add a listener for the 'close' event, i.e. replace 'exit' with 'close'. It worked on my side even with 'ignore' stdio. Also, the input parameter of the callback is the exit code (a number) or null.
According to the Node.js documentation, the difference between the 'exit' and 'close' events is:
The 'close' event is emitted when the stdio streams of a child process
have been closed. This is distinct from the 'exit' event, since
multiple processes might share the same stdio streams.
Hope it helps.
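A minimal sketch of the suggested change, reusing the question's own placeholders (prj, cmd_directory, params_array, and the Desktop log path); note that the handler can only run if the parent is still alive when the child finishes:

const fs = require('fs');
const os = require('os');
const path = require('path');
const { spawn } = require('child_process');

var out_log = fs.openSync(path.resolve(os.tmpdir(), "stdout.log"), 'a'),
    err_log = fs.openSync(path.resolve(os.tmpdir(), "stderr.log"), 'a');

var child = spawn('myapp', params_array, {
  cwd: prj,
  env: { PATH: cmd_directory },
  detached: true,
  stdio: ['ignore', out_log, err_log]
});

// 'close' fires once the child's stdio streams are closed;
// the first argument is the exit code, or null if the child was killed by a signal
child.on('close', function(code) {
  fs.writeFile(path.resolve("/Users/me/Desktop/myapp-child.log"),
    "Finished with child process! Exit code: " + code,
    function(err) { if (err) console.log(err); });
});

child.unref();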