I'm running the following code using Node, and I always get only about 80% of what stdout is supposed to return once the file is larger than 12 KB or so. If the file is 240 KB it will still output only 80% of it. If it's under 12 KB, the output is complete.
When I open a cmd and run the command manually I always get the full output.
I tried exec and execFile, and I tried increasing maxBuffer and changing the encoding, but that's not the issue. I also tried adding the options { shell: true, detached: true }, but in vain; in fact, when I run it detached it seems to run completely fine, since it opens an actual console, but then I'm not able to retrieve the stdout once the process has completed.
const spawn = require('child_process').spawn;

const child = spawn(
  'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\unrtf.exe',
  ['--html', 'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\tempaf45.rtf'],
);

let chunks = [];
child.stdout.on('data', function (msg) {
  chunks = [...chunks, ...msg];
});
child.on('exit', function (msg) {
  const buffer = Buffer.from(chunks);
  console.log(buffer.toString());
});
Any clues? It seems to be a Node issue, since when I run the command manually everything works fine.
According to the Node.js documentation, all of the child_process functions are asynchronous, so when you access your chunks variable there is no guarantee that your command has finished; it may still be running. It is therefore recommended to wait for the child process explicitly, for example by wrapping the entire child_process call in a Promise that you can await.
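A minimal sketch of that approach, using the paths from the question. Note that it resolves on the 'close' event rather than 'exit': 'close' fires only after the child's stdio streams have been flushed, so waiting for it guarantees that all of stdout has been received.

const spawn = require('child_process').spawn;

function run(cmd, args) {
  return new Promise((resolve, reject) => {
    const child = spawn(cmd, args);
    const chunks = [];
    child.stdout.on('data', (chunk) => chunks.push(chunk));
    child.on('error', reject);
    // 'close' fires after stdout/stderr have ended, unlike 'exit',
    // so no output can be lost at this point.
    child.on('close', () => resolve(Buffer.concat(chunks).toString()));
  });
}

(async () => {
  const output = await run(
    'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\unrtf.exe',
    ['--html', 'C:\\Users\\jeanphilipped\\Desktop\\unrtf\\test\\tempaf45.rtf']
  );
  console.log(output);
})();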
First things first. The goal I want to achieve:
I have two processes:
the first one is webpack that just watches for file changes and pushes the bundled files into the dist/ directory
the second process (Shopify CLI) watches for any file changes in the dist/ directory and pushes them to a remote destination
My goal is to have only one command (like npm run start) which simultaneously runs both processes without printing anything to the terminal so I can print custom messages. And that's where the problem starts:
How can I continuously read child process terminal output?
Printing custom messages for webpack events at the right time is pretty easy, since webpack has a Node API for that. But with the Shopify CLI, all I can do is capture its output and process it.
Normally, the Shopify CLI prints something like "Finished Upload" as soon as the changed file has been pushed. It works perfectly fine for the first time but after that, nothing is printed to the terminal anymore.
Here is a minimal representation of what my current setup looks like:
const { spawn } = require('child_process');

const childProcess = spawn('shopify', ['theme', 'serve'], {
  stdio: 'pipe',
});

childProcess.stdout.on('data', (data) => {
  console.log(data.toString());
});

childProcess.stderr.on('data', (data) => {
  // Just to make sure there are no errors
  console.log(data.toString());
});
So, what am I trying to achieve:
I am running a master process which forks two other Node processes. I am using the debug library from npm, and I need the output of its debug function in the child processes to be piped to the parent's stdout and stderr.
Currently, console.log("...") output gets piped successfully, but this does not:
var log = require('debug')('service');
log.color = 3;
log("...");
I am forking child processes using this code:
var fork = require('child_process').fork;
var child_options = {
  cwd: __dirname,
  env: process.env,
  stdio: ['ignore', process.stdout, process.stderr, 'ipc'],
  detached: false,
  shell: true
};
var job_node = fork('job_node', [], child_options);
Could anyone help me find out what the problem might be? Thank you :)
EDIT:
This problem is probably not caused by some kind of error in stream piping;
instead, it has something to do with the existing/non-existing console window and the attached terminal.
When I tried to run the server from within PhpStorm, using its built-in tools, it showed ONLY the console.log outputs, even from the master process.
Solved:
Just for the sake of trying, I used the debug.enable function in the master process and it worked.
I called debug.enable('master,job,service'); to enable the namespaces used in the other processes, and it worked. I do not know why, but the namespaces created in the child processes were disabled by default.
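For reference, a minimal sketch of that fix. The debug.enable call mirrors what the answer describes; passing a DEBUG variable in the child's environment is the more conventional way to enable namespaces in a forked process (the 'job_node' module and the namespaces come from the question and answer above):

var fork = require('child_process').fork;
var debug = require('debug');

// Enable the namespaces in the master process, as described above.
debug.enable('master,job,service');

// Alternatively (and more conventionally), hand the namespaces to the
// child through its environment, which is where debug looks them up.
var child_env = Object.assign({}, process.env, { DEBUG: 'master,job,service' });
var job_node = fork('job_node', [], {
  cwd: __dirname,
  env: child_env,
  stdio: ['ignore', process.stdout, process.stderr, 'ipc']
});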
I am writing a CLI tool for a node.js app. Some of the commands have to run npm and show the results. This is what I have so far:
import {spawn} from 'child_process';

let projectRoot = '...';
let npm = (process.platform === "win32" ? "npm.cmd" : "npm"),
    childProcess = spawn(npm, ["install"], { cwd: projectRoot });

childProcess.stdout.pipe(process.stdout);
childProcess.stderr.pipe(process.stderr);

childProcess.on('close', (code) => {
  // Continue the remaining operations
});
The command does run fine and outputs the results (or errors). However, it doesn't give me a live feed with the progress bar, etc. It waits until the entire operation is over and then dumps the output into the console.
I've tried different variations of the spawn configuration but I can't get it to show me the live feed.
I am on Windows 10 and use node.js 4 and npm 3.
As discussed in the comments: Run the spawn with { stdio: 'inherit' }.
However, a good question is why the 'manual piping' does not do the same. I think that's because npm uses a fancy progress bar. It probably deals with stdout in some special way that does not play well with piping to process.stdout. If you try some other long-running command (such as find ./), your way of piping works fine.
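Applied to the snippet from the question, that looks like the sketch below. With stdio: 'inherit' the child writes straight to the parent's terminal, so npm can detect a real TTY and render its progress bar; note that childProcess.stdout and childProcess.stderr are null in this mode, so the manual pipe calls are dropped.

import {spawn} from 'child_process';

let projectRoot = '...';
let npm = (process.platform === "win32" ? "npm.cmd" : "npm"),
    childProcess = spawn(npm, ["install"], { cwd: projectRoot, stdio: 'inherit' });

childProcess.on('close', (code) => {
  // Continue the remaining operations
});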
I am writing a utility. One command of this utility runs an external application.
var child_process = require('child_process');
var fs = require('fs');
var out = fs.openSync('.../../log/out.log', 'a');
var err = fs.openSync('.../../log/err.log', 'a');
exports.Unref = function(app, argv) {
  var child = child_process.spawn(app, argv, {
    detached: true,
    stdio: ['ignore', out, err]
  });
  child.unref();
  //process.exit(0);
};
Currently:
$ utility run app --some-args // run external app
// can't enter next command while app is running
My problem is that if I run this command, the terminal is locked while the external application is running.
But the terminal window shouldn't be locked by the child_process.
I want to run:
$ utility run app --some-args
$ next-command
$ next-command
The external application (a desktop application) will close by itself.
Like this:
$ subl server.js // this runs Sublime Text and passes a file to the editor
$ npm start // the terminal is not locked; I can execute the next command while Sublime is still running
You know what I mean? ^^
Appending ['>>../../log/out.log', '2>>../../log/err.log'] to the end of argv instead of leaving two files open should work since it's the open file handles that are keeping the process alive.
Passing opened file descriptors in stdio in addition to detached: true will not work the way you expect because there is no way to unref() the file descriptors in the parent process and have it still work for the child process. Even if there was a way, I believe that when the parent process exited, the OS would clean up (close) the file descriptors it had open, which would cause problems for the detached child process.
The only possible way that this might have been able to work would have been by passing file descriptors to child processes, but that functionality was dropped several stable branches ago because the same functionality did not exist on some other platforms (read: Windows).
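A minimal sketch of that suggestion, adapted from the question's code. The >> redirections are shell syntax, so the child has to be spawned through a shell for them to be interpreted (hence shell: true):

var child_process = require('child_process');

exports.Unref = function(app, argv) {
  var child = child_process.spawn(
    app,
    argv.concat(['>>../../log/out.log', '2>>../../log/err.log']),
    {
      detached: true,
      stdio: 'ignore', // no open handles keeping the child tied to the parent
      shell: true      // let the shell interpret the >> redirections
    }
  );
  child.unref();
};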
I was recently going to test out running PhantomJS from Python as a command-line call; I haven't got round to it yet, but I have seen examples. Because PhantomJS is run from the command line, this seems to be possible: the result that PhantomJS would spit out would go straight into a variable.
Before I go down that path, making this work in Node.js would actually be more useful for me, and it got me thinking: can I just use Node to run PhantomJS the way a program gets run from the command line, and store the data that PhantomJS would normally spit out in a variable?
I would rather not use phantomjs-node because it seems to be using too many tricks.
The reason for all of this is to be able to run PhantomJS at the same time as another action the program takes, and to use the data it has recorded for some other stuff afterwards.
Simply put: you can run command-line programs from Python; can I do the same in Node.js?
Cheers :)
Edit: I understand that Node and PhantomJS use different JS environments, and that's cool, because I just want to run PhantomJS as its own process and catch all of its output data in a Node.js variable (the data will be an array of pairs of a string and a floating-point number). I don't want to 'drive' with PhantomJS; I will craft the loaded JavaScript files to do what I want. All I want is the PhantomJS output. :)
From NPM: https://npmjs.org/package/phantomjs
var path = require('path')
var childProcess = require('child_process')
var phantomjs = require('phantomjs')
var binPath = phantomjs.path

var childArgs = [
  path.join(__dirname, 'phantomjs-script.js'),
  'some other argument (passed to phantomjs script)'
]

childProcess.execFile(binPath, childArgs, function(err, stdout, stderr) {
  // handle results
})
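To land the output in a variable, as the question asks, the Node side can parse what the PhantomJS script printed. A minimal sketch, reusing binPath and childArgs from the snippet above and assuming a hypothetical phantomjs-script.js that prints its string/float pairs as one JSON array before calling phantom.exit():

childProcess.execFile(binPath, childArgs, function(err, stdout, stderr) {
  if (err) throw err
  // stdout holds everything the PhantomJS script printed, e.g. the
  // result of console.log(JSON.stringify(pairs)) on the PhantomJS side
  var pairs = JSON.parse(stdout) // -> [["label", 1.23], ...]
  console.log(pairs)
})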
I suppose you can make a simple script for Node.js to run; in that script, the PhantomJS script would be run as a child process. You can see a working example (and links to some documentation) in this answer. I suppose this discussion might be helpful for you as well.
As an alternative to Donald Derek's answer, you can use the spawn function. It will allow you to read the child process's output as soon as it's produced rather than the output being buffered and returned to you all at once.
You can read more about it here.
An example from the documentation.
var spawn = require('child_process').spawn,
    ls = spawn('ls', ['-lh', '/usr']);

ls.stdout.on('data', function (data) {
  console.log('stdout: ' + data);
});

ls.stderr.on('data', function (data) {
  console.log('stderr: ' + data);
});

ls.on('close', function (code) {
  console.log('child process exited with code ' + code);
});