Node.js: Python child process won't print in realtime

I'm trying to have a Python program's stdout trigger an event in my Node.js app so it can forward that output to a client's browser.
The problem is that the program won't deliver its stdout until it terminates... and closes the pipe, I guess.
I need Node to receive the stdout as each line prints, right after each print statement.
Here's an example Python program, run with python program.py:
from __future__ import print_function
import time
import os
import sys

for num in range(0, 8):
    print('my []-- ', num)
    time.sleep(0.2)
    sys.stdout.flush()
The Node code spawns a child process with:
proc = spawn('python3 program.py -u')
var pid = proc.pid
echoIO( pid )
toggleState()
processDriver(proc)
The function below listens for the stdout and sends it to a browser with Socket.IO:
function processDriver(proc) {
    proc.stdout.setEncoding('utf-8');
    proc.stdout.on('data', function (data) {
        echoIO(data)
    })
    proc.stderr.on('data', function (data) {
        console.log(data)
        var res = data.toString()
        echoIO(res)
    })
    proc.on('exit', function (code) {
        console.log(code)
        stateOff()
        echoIO("exit " + code)
        sendProcState(JSON.parse('{"status":"not_running"}'))
    })
}
I've also tested this on Raspbian, Ubuntu, and CrunchBang, and the issue persists; none of the other solutions I've found on Stack Overflow have worked yet. Sorry if this is about the 20th post on this.
EDIT:
I've also run into this now
stderr - execvp(): No such file or directory
events.js:72
throw er; // Unhandled 'error' event
^
Error: spawn ENOENT
...

proc = spawn('python', ['program.py', '-u'])
It turns out my method of spawning the child was the error. It's a miracle that it ever spawned a child at all; playing with the code generated the error above, which led me to the post below.
But once I spawned it with the line above, instead of with the single string 'python program.py -u', it worked.
Node.js - spawned process is generating error "execvp(): No such file or directory"
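For completeness, here is a minimal sketch of a working version of the original setup. Placing -u before the script name (so it reaches the Python interpreter itself and unbuffers stdout) is my reading of the intent, not the poster's exact invocation, and it assumes python3 is on the PATH:

var spawn = require('child_process').spawn;

// Command and arguments are passed separately; '-u' goes to the Python
// interpreter itself so its stdout is unbuffered.
var proc = spawn('python3', ['-u', 'program.py']);

proc.stdout.setEncoding('utf-8');
proc.stdout.on('data', function (data) {
    console.log(data);  // arrives as each line is printed
});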

Related

Forked child process keeps terminating with code 1

I wrapped a module using Electron Packager. Because it does heavy computation, I put it in a subprocess that is forked from renderer.js when the user clicks a button on index.html.
Pseudo-code from renderer.js:
let cp = require('child_process');
let subprocess;

function log(msg) {
    // A function to log messages sent from the subprocess
}

document.querySelector('#create').addEventListener('click', ev => {
    subprocess = cp.fork('./subprocess.js');
    log('A subprocess has been created with pid: ' + subprocess.pid + ' with execPath = ' + process.execPath);
    subprocess.on('exit', (code, signal) => log(`child process terminated: signal = ${signal} ; code = ${code}`));
    subprocess.on('error', log);
    subprocess.on('message', log);
});
The real problem: the subprocess runs smoothly when I call electron ./ from the console in the working directory, but it doesn't in the build generated by Electron Packager.
The subprocess does not show up in Task Manager, or rather, it is terminated as soon as it appears. The log says child process terminated: signal = null ; code = 1.
Although I guarded the beginning of subprocess.js with this to catch uncaughtException:
process.on('uncaughtException', (err) => {
    process.send(`Caught exception: ${err}`);
});
Nothing is recorded in the log. What should I do to overcome this?
System specs:
Windows 10
Node 8.6
Electron 1.7.12
Electron Packager 10.1.2
I have experienced this too. One reason I came up with is that the child process becomes a child process of Electron itself.
In my case, it would not recognize the node modules I defined.
I suggest using spawn with node.exe as the spawned executable, but that will not be practical once you build your app.
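As a rough sketch of one workaround for a packaged app (assumptions: your Electron version honours the ELECTRON_RUN_AS_NODE environment variable and the script path resolves correctly inside the package), you can fork through Electron's own binary but have it start up as plain Node:

// renderer.js (sketch)
const cp = require('child_process');

// process.execPath is the packaged Electron binary; with ELECTRON_RUN_AS_NODE=1
// it boots as a plain Node runtime instead of launching another Electron app.
const subprocess = cp.fork('./subprocess.js', [], {
    execPath: process.execPath,
    env: Object.assign({}, process.env, { ELECTRON_RUN_AS_NODE: '1' })
});

subprocess.on('message', msg => console.log('from child:', msg));
subprocess.on('exit', (code, signal) => console.log('child exited:', code, signal));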

Command not called, anything wrong with this spawn syntax?

When I run this pidof command by hand, it works. Then I put it into my server.js:
// send signal to start the install script
var spw = cp.spawn('/sbin/pidof', ['-x', 'wait4signal.py', '|', 'xargs', 'kill', '-USR1']);
spw.stderr.on('data', function(data) {
    res.write('----- Install Error !!! -----\n');
    res.write(data.toString());
    console.log(data.toString());
});
spw.stdout.on('data', function(data) {
    res.write('----- Install Data -----\n');
    res.write(data.toString());
    console.log(data.toString());
});
spw.on('close', function(data) {
    res.end('----- Install Finished, please go to status page !!! -----\n');
    console.log('88');
});
In the web page I only see "----- Install Finished, please go to status page !!!". My install script never seems to get the USR1 signal. Is anything wrong here?
The problem is that you have two separate commands. You are piping the output of your /sbin/pidof command to the input stream of your xargs command. If you are using spawn (rather than exec, which takes a string exactly as you would write it on the command line), you need to spawn one process per command.
Spawn your processes like this:
const pidof = spawn('/sbin/pidof', ['-x', 'wait4signal.py']);
const xargs = spawn('xargs', ['kill', '-USR1']);
Now pipe the output of the first process to the input of the second, like so:
pidof.stdout.pipe(xargs.stdin);
Now you can listen to events on your xargs process, like so:
xargs.stdout.on('data', data => {
    console.log(data.toString());
});
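If you would rather keep the whole pipeline in one string, exec runs its command through a shell, so the | works as written. A minimal sketch, assuming cp is require('child_process') as in the question:

cp.exec('/sbin/pidof -x wait4signal.py | xargs kill -USR1', function (error, stdout, stderr) {
    if (error) {
        console.log('exec error: ' + error);
        return;
    }
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
});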

Use child_process.execSync but keep output in console

I'd like to use the execSync method, which was added in Node.js 0.12, but still have the output in the console window from which I ran the Node script.
E.g. if I run a NodeJS script which has the following line I'd like to see the full output of the rsync command "live" inside the console:
require('child_process').execSync('rsync -avAXz --info=progress2 "/src" "/dest"');
I understand that execSync returns the output of the command and that I could print that to the console after execution, but this way I don't have "live" output...
You can pass the parent's stdio to the child process if that's what you want:
require('child_process').execSync(
    'rsync -avAXz --info=progress2 "/src" "/dest"',
    {stdio: 'inherit'}
);
You can simply use .toString().
var result = require('child_process').execSync('rsync -avAXz --info=progress2 "/src" "/dest"').toString();
console.log(result);
Edit: Looking back on this, I've realised that it doesn't actually answer the specific question because it doesn't show the output to you 'live' — only once the command has finished running.
However, I'm leaving this answer here because I know quite a few people come across this question just looking for how to print the result of the command after execution.
Unless you redirect stdout and stderr as the accepted answer suggests, this is not possible with execSync or spawnSync: without redirection, those calls only return stdout and stderr once the command has completed.
To get live output without redirecting stdout and stderr, you are going to need to use spawn, but it's pretty straightforward:
var spawn = require('child_process').spawn;

// kick off process of listing files
var child = spawn('ls', ['-l', '/']);

// spit stdout to screen
child.stdout.on('data', function (data) { process.stdout.write(data.toString()); });

// spit stderr to screen
child.stderr.on('data', function (data) { process.stdout.write(data.toString()); });

child.on('close', function (code) {
    console.log("Finished with code " + code);
});
I used an ls command that lists the files in / so that you can test it quickly. spawn takes the name of the executable you want to run as its first argument, and as its second argument it takes an array of strings representing each parameter you want to pass to that executable.
However, if you are set on using execSync and can't redirect stdout or stderr for some reason, you can open up another terminal like xterm and pass it a command like so:
var execSync = require('child_process').execSync;
execSync("xterm -title RecursiveFileListing -e ls -latkR /");
This will allow you to see what your command is doing in the new terminal but still have the synchronous call.
Simply:
try {
    const cmd = 'git rev-parse --is-inside-work-tree';
    execSync(cmd).toString();
} catch (error) {
    console.log(`Status Code: ${error.status} with '${error.message}'`);
}
Ref: https://stackoverflow.com/a/43077917/104085
// plain Node.js
var execSync = require('child_process').execSync;
// or, with destructuring (works in TypeScript too)
const { execSync } = require("child_process");

try {
    const cmd = 'git rev-parse --is-inside-work-tree';
    execSync(cmd).toString();
} catch (error) {
    error.status;  // exit code; non-zero here, since a successful exit (0) would not throw
    error.message; // Holds the message you typically want.
    error.stderr;  // Holds the stderr output. Use `.toString()`.
    error.stdout;  // Holds the stdout output. Use `.toString()`.
}
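For example, a small sketch of how those properties can be used to surface the failing command's own error output (same git command as above):

try {
    execSync('git rev-parse --is-inside-work-tree');
} catch (error) {
    console.error('exit code:', error.status);
    console.error(error.stderr.toString()); // what the command printed to stderr
}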
When the command runs successfully:
Add { encoding: "utf8" } to the options to get back a string instead of a Buffer.
execSync(`pwd`, {
    encoding: "utf8"
})

Node.js child processes and pipes - OSX vs Ubuntu

I am trying to get two long running node.js processes to communicate - a parent and a child - using pipes and Node's child-process module. I want the child to be able to send data back to the parent asynchronously, and I was hoping to use a pipe to do so.
Here's a simplified version of my code:
Parent:
cp = require('child_process')
es = require('event-stream')

child = cp.spawn('coffee', ['child.coffee'], {stdio: [null, null, null, 'pipe']})

so = child.stdout.pipe(es.split())
p3 = child.stdio[3].pipe(es.split())

so.on 'data', (data) ->
  console.log('stdout: ' + data)

child.stderr.on 'data', (data) ->
  console.log('stderr: ' + data)

p3.on 'data', (data) ->
  console.log('stdio3: ' + data)

child.on 'close', (code) ->
  console.log('child process exited with code ' + code)

child.stdin.write "a message from your parent", "utf8"
Child:
fs = require('fs')

p3 = fs.createWriteStream('/dev/fd/3', {encoding: 'utf8'})

process.stdin.on 'data', (data) ->
  p3.write "hello #{process.pid} - #{data}\n", 'utf8'
  process.stdout.write "world #{process.pid} - #{data}\n", 'utf8'
  p3.end()
  process.exit(0)

process.stdin.on 'end', (data) ->
  console.log "end of stdin"
  p3.end()
  process.exit(0)

process.stdin.setEncoding('utf8')
process.stdin.resume()
The code works on OSX 10.9, but fails to run on an Ubuntu box. I have tried running it under both Ubuntu 12.04 and 14.04. I am running Node 0.10.2x.
/dev/fd/ under Ubuntu is symbolically linked to /proc/self/fd/ so I believe my child process is opening the right file.
The output from running the parent on Ubuntu is as follows:
$ coffee parent.coffee
stderr:
stderr: events.js:72
stderr: throw er; // Unhandled 'error' event
stderr:
stderr:
stderr:
stderr:
stderr: ^
stderr: Error: UNKNOWN, open '/dev/fd/3'
events.js:72
throw er; // Unhandled 'error' event
^
Error: read ECONNRESET
at errnoException (net.js:901:11)
at Pipe.onread (net.js:556:19)
I would expect to see (and do on a OSX box):
$ coffee parent.coffee
stdio3: hello 21101 - a message from your parent
stdout: world 21101 - a message from your parent
stdio3:
stdout:
child process exited with code 0
It is possible to communicate with the child from the command line on Ubuntu as well, so the problem is likely in how the parent spawns the child process:
$ echo foo | coffee child.coffee 3>&1
hello 3077 - foo
world 3077 - foo
I have tried to investigate the kernel calls that node makes using strace, but couldn't make much sense of the output.
I figured it out myself. The error was in the child. Ubuntu Linux is stricter when it comes to opening files that are already open; the line:
p3 = fs.createWriteStream('/dev/fd/3', {encoding: 'utf8'})
was throwing an error. File descriptor 3 is already open when the child runs, so the code should look as follows:
Child:
fs = require('fs')

# the parent opens file descriptor 3 when spawning the child (and closes it when the child returns)
fd3write = (s) ->
  b = new Buffer(s)
  fs.writeSync(3, b, 0, b.length)

process.stdin.on 'data', (data) ->
  fd3write "p3 #{process.pid} - #{data}\n"
  process.stdout.write "so #{process.pid} - #{data}\n", 'utf8'
  process.exit(0)

process.stdin.on 'end', (data) ->
  console.log "end of stdin"
  process.exit(0)

process.stdin.setEncoding('utf8')
process.stdin.resume()
I hope this will be of help to someone else.
To use a pipe instead of stdin to send messages from the parent to the child, this link might be of use: child-process-multiple-file-descriptors.
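If all you need is to send messages from the parent to the child, Node's built-in IPC channel (a different mechanism from the raw fd 3 pipe above) is the least fiddly option. A minimal JavaScript sketch:

// parent.js
var cp = require('child_process');
var child = cp.fork('./child.js');  // fork sets up an IPC channel automatically
child.send('a message from your parent');
child.on('message', function (msg) {
    console.log('from child: ' + msg);
});

// child.js
process.on('message', function (msg) {
    process.send('hello ' + process.pid + ' - ' + msg);
});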

Logging script data with Node child_process

I am working with a Python script that runs in a loop, outputs a new value every second, and can only be stopped by pressing Enter on the keyboard. For various reasons, the Python code should not be altered.
The ask: how do I capture the first ten values of the looping script and then kill it from Node?
I wrote the below Node script that will kick off an external program and log the output; however, this only works for scripts that aren't running in loops.
var exec = require('child_process').exec;
var scriptCommand = "sudo python script.py"

exec(scriptCommand, function cb(error, stdout, stderr){
    console.log('stdout: ' + stdout);
    console.log('stderr: ' + stderr);
    if (error !== null){
        console.log('exec error: ' + error);
    }
});
You're going to want to use spawn and capture the standard output of the python child process. Once you have reached ten values, you can kill the python process.
Unfortunately you're going to have to modify the python program to flush stdout. There is no way around this. If you do not manually flush stdout, Python will, but only after its internal buffer fills up (which in my sample code will take a while).
Here is a fully working example (captures first three values and then kills the python process):
pyscript.py
#!/usr/bin/env python
# python 2.7.4
import time
import sys

i = 0
while True:
    time.sleep(1)
    print("hello " + str(i))
    # Important! This will flush the stdout buffer so node can use it
    # immediately. If you do not use this, node will see the data only after
    # python decides to flush it on its own.
    sys.stdout.flush()
    i += 1
script.js
#!/usr/bin/env node
"use strict";
// node version 0.10.26
var spawn = require('child_process').spawn
  , path = require('path')
  , split = require('split');

// start the pyscript program
var pyscript = spawn('python', [ path.join(__dirname, 'pyscript.py') ]);
var pythonData = [];

// Will get called every time the python program outputs a new line.
// I'm using the split module (npm) to get back the results
// on a line-by-line basis.
pyscript.stdout.pipe(split()).on('data', function(lineChunk) {
    // Kill the python process after we have three results (in our case, lines)
    if (pythonData.length >= 3) {
        return pyscript.kill();
    }
    console.log('python data:', lineChunk.toString());
    pythonData.push(lineChunk.toString());
});

// Will be called when the python process ends, or is killed
pyscript.on('close', function(code) {
    console.log(pythonData);
});
Put them both in the same directory, and make sure to install the split module (npm install split) for the demo to work.
