Format output of spawned gulp process like the parent process - node.js

Use case: On large projects it can be nice to separate small sub-projects into folders of their own, each with its own build process.
The following setup basically works on both Windows and Mac - I get the output of the child gulp process logged in the console. The only problem is that it is not colored like the output of the parent process.
var gulp = require('gulp');
var spawnCmd = require('spawn-cmd');

gulp.task('default', function () {
    // Run all the parent project's tasks first
    // ....
    // ....
    // ....
    // When done, cd to the child directory
    process.chdir('./some-dir-that-has-a-gulpfile');
    // Run `gulp` in the child directory
    var child = spawnCmd.spawn('gulp', ['default']);
    // And pipe the output to the current process
    child.stdout.pipe(process.stdout);
});
My question is how to display the output of the child gulp process in exactly the same way as the normal gulp process.
Edit: Duplicate of Is it possible for child processes in Node.js to preserve colored output?

You should inherit the stdio of the parent process. This sends the child's output straight to the parent's own streams, colors and all.
Because you are using gulp, you should also add the --color always flag so that gulp properly detects that you want colored output.
var gulp = require('gulp');
var spawnCmd = require('spawn-cmd');

gulp.task('default', function () {
    // When done, cd to the child directory
    process.chdir('./some-dir-that-has-a-gulpfile');
    // Run `gulp` in the child directory, inheriting this process's stdio
    var child = spawnCmd.spawn('gulp', ['default', '--color', 'always'], {stdio: 'inherit'});
});
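For reference, the same approach works with the built-in child_process module as well (a sketch, not from the answer; it assumes gulp is resolvable as an executable on the PATH - on Windows that may require shell: true or a helper like spawn-cmd):

var gulp = require('gulp');
var spawn = require('child_process').spawn;

gulp.task('default', function () {
    process.chdir('./some-dir-that-has-a-gulpfile');
    // 'inherit' hands the parent's stdin/stdout/stderr to the child,
    // so the nested gulp writes to the same terminal and keeps its colors.
    spawn('gulp', ['default', '--color', 'always'], { stdio: 'inherit' });
});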

Related

Unable to read a continuous data stream from a node child process in Node.js

First things first. The goal I want to achieve:
I have two processes:
the first one is webpack, which just watches for file changes and pushes the bundled files into the dist/ directory
the second process (Shopify CLI) watches for any file changes in the dist/ directory and pushes them to a remote destination
My goal is to have only one command (like npm run start) which runs both processes simultaneously without printing anything to the terminal, so I can print custom messages instead. And that's where the problem starts:
How can I continuously read child process terminal output?
Printing custom messages for webpack events at the right time is pretty easy, since webpack has a Node API for that. But the Shopify CLI only gives me the ability to capture their output and process it.
Normally, the Shopify CLI prints something like "Finished Upload" as soon as a changed file has been pushed. It works perfectly fine the first time, but after that, nothing is printed to the terminal anymore.
Here is a minimal representation of what my current setup looks like:
const { spawn } = require('child_process');

const childProcess = spawn('shopify', ['theme', 'serve'], {
    stdio: 'pipe',
});

childProcess.stdout.on('data', (data) => {
    console.log(data.toString());
});

childProcess.stderr.on('data', (data) => {
    // Just to make sure there are no errors
    console.log(data.toString());
});
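A sketch of one way to keep reading that output continuously, assuming the CLI keeps writing line-buffered text to its piped stdout (some CLIs change or buffer their output when they detect a non-TTY, which this sketch does not address): decode the stream and react per line with Node's readline module.

const { spawn } = require('child_process');
const readline = require('readline');

const childProcess = spawn('shopify', ['theme', 'serve'], { stdio: 'pipe' });

// readline splits the decoded stream into lines as chunks arrive,
// so every line the CLI prints triggers the handler, not only the first chunk.
const rl = readline.createInterface({ input: childProcess.stdout });

rl.on('line', (line) => {
    if (line.includes('Finished Upload')) {
        console.log('Custom message: upload finished.');
    }
});

childProcess.stderr.on('data', (data) => {
    console.error(data.toString());
});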

Display gulp --tasks from within task

I want to dump out a message within the default gulp task telling the user to select a task, and then list the available tasks below this message, the same way gulp --tasks does.
I can't seem to find anything on Google that will do this without additional plugins, which I want to avoid. Is there a way?
exports.default = (cb) => {
log(chalk.bgRed('Please run a task, a list has been provided below.'));
// dump out tasks here
cb();
};
If you just want to log the list of gulp tasks, you can use child_process to execute gulp --tasks:
const exec = require('child_process').execSync
exec('gulp --tasks', { stdio: 'inherit' })
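Putting that together with the default task from the question (a sketch; it assumes gulp is on the PATH and that `log`/`chalk` in the question are the fancy-log and chalk modules):

const { execSync } = require('child_process');
const log = require('fancy-log');   // assumption: `log` in the question is fancy-log
const chalk = require('chalk');

exports.default = (cb) => {
    log(chalk.bgRed('Please run a task, a list has been provided below.'));
    // Re-run gulp with --tasks and inherit stdio so the task tree
    // is printed directly to the user's terminal.
    execSync('gulp --tasks', { stdio: 'inherit' });
    cb();
};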

NodeJS: exit parent, leave child alive

I am writing a utility. One command of this utility runs an external application.
var child_process = require('child_process');
var fs = require('fs');

var out = fs.openSync('.../../log/out.log', 'a');
var err = fs.openSync('.../../log/err.log', 'a');

exports.Unref = function(app, argv) {
    var child = child_process.spawn(app, argv, {
        detached: true,
        stdio: [ 'ignore', out, err ]
    });
    child.unref();
    //process.exit(0);
};
Currently:
$ utility run app --some-args // run external app
// can't enter the next command while app is running
My problem is that when I run this command, the terminal is locked while the external application is running.
But the terminal window shouldn't be locked by the child_process.
I want to run:
$ utility run app --some-args
$ next-command
$ next-command
The external application (a desktop application) closes by itself.
Like this:
$ subl server.js // this runs Sublime Text and passes a file to the editor
$ npm start // the terminal is not locked - I can execute the next command while Sublime is still running
You know what I mean?
Appending ['>>../../log/out.log', '2>>../../log/err.log'] to the end of argv instead of leaving two files open should work since it's the open file handles that are keeping the process alive.
Passing opened file descriptors in stdio in addition to detached: true will not work the way you expect because there is no way to unref() the file descriptors in the parent process and have it still work for the child process. Even if there was a way, I believe that when the parent process exited, the OS would clean up (close) the file descriptors it had open, which would cause problems for the detached child process.
The only possible way that this might have been able to work would have been by passing file descriptors to child processes, but that functionality was dropped several stable branches ago because the same functionality did not exist on some other platforms (read: Windows).
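A minimal sketch of that suggestion, assuming the command is run through a shell so the >> redirections are actually interpreted (shell: true is an assumption here, not part of the answer above):

var child_process = require('child_process');

exports.Unref = function (app, argv) {
    // Let the shell append to the log files via >> redirection instead of
    // keeping file descriptors open in the parent process.
    var child = child_process.spawn(
        app,
        argv.concat(['>>../../log/out.log', '2>>../../log/err.log']),
        { detached: true, stdio: 'ignore', shell: true }
    );
    child.unref();
};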

How to start a process in node.js that thinks it's being run from the command line

I'm running require('child_process').exec('npm install') as a child process in a node.js script, but I want it to retain console colors. I'm running on Windows, but want this script to be portable (e.g. to Linux). How do I start a process that thinks it's being run from the console?
Note: I'd rather not have npm-specific answers, but an answer that allows me to trick any command.
You can do this by letting the child process inherit the parent process's stdio streams. This means you need to use spawn rather than exec, and this is what you'd do:
var spawn = require('child_process').spawn;

var child = spawn('npm', ['install'], {
    stdio: 'inherit'
});
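One portability caveat, not part of the answer: on Windows, npm is a batch file (npm.cmd), so spawning 'npm' directly can fail with ENOENT; a common workaround is to let a shell resolve the command:

var spawn = require('child_process').spawn;

// shell: true lets the OS shell resolve `npm` (npm.cmd on Windows, npm elsewhere),
// while stdio: 'inherit' keeps the child attached to the real terminal, colors included.
var child = spawn('npm', ['install'], {
    stdio: 'inherit',
    shell: true
});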

Running Node app through Grunt

I am trying to run my Node application as a Grunt task. I need to spawn it as a child process, however, so that I can run the watch task in parallel.
This works:
grunt.registerTask('start', function () {
    grunt.util.spawn({
        cmd: 'node',
        args: ['app.js']
    })
    grunt.task.run('watch:app')
})
However, when changes are detected by the watch task, this will trigger the start task again. Before I spawn another child process of my Node app, I need to kill the previous one.
I can't figure out how to kill the process, however. Something like this does not work:
var child

grunt.registerTask('start', function () {
    if (child) child.kill()
    child = grunt.util.spawn({
        cmd: 'node',
        args: ['app.js']
    })
    grunt.task.run('watch:app')
})
It appears that:
Even though I store the spawned process in a variable outside of the function context, it does not persist, so the next time the start task is run, child is undefined.
child has no kill function…
Take a look at grunt-nodemon which handles a lot of the headaches related to spawning a child process.
This is because grunt-contrib-watch currently spawns all task runs as child processes, so the variable child is not within the same process context. Fairly soon, grunt-contrib-watch 0.3.0 will be released with a nospawn option. This will let you configure the watch to run triggered tasks within the same context, which would make your example above work.
Take a look at this issue for a little more information:
https://github.com/gruntjs/grunt-contrib-watch/issues/45
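For reference, the option would be configured roughly like this (a sketch based on the answer; the watch target and file patterns are illustrative, and later versions of grunt-contrib-watch renamed the option to spawn: false):

grunt.initConfig({
    watch: {
        app: {
            files: ['app.js', 'lib/**/*.js'],
            tasks: ['start'],
            // run triggered tasks in the same process, so the `child`
            // variable from the start task stays in scope between runs
            options: { nospawn: true }
        }
    }
});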
