heroku -- npm postinstall script to run grunt task depending on environment - node.js

I've got two Heroku node.js apps, one for prod and one for dev, and I also have a Gruntfile with dev- and prod-specific tasks. I know you can set up package.json to run grunt as a postinstall hook for npm, but can you somehow specify different tasks to be run depending on what environment you're in?
Here's what the relevant section of my package.json looks like so far:
"scripts": {
"postinstall": "./node_modules/grunt/bin/grunt default"
},
Rather than run grunt default every time, I'd love to run "grunt production" if NODE_ENV is production, etc.
Is this possible?

Sadly, npm offers no distinction like postinstall and postinstallDev. You can make an intermediate script to handle the difference, though. For example, if you have the following:
"scripts": { "postinstall": "node postInstall.js" },
Then in this script you could check the environment variable and execute the correct Grunt task from there:
// postInstall.js
// A runnable sketch: spawn the locally installed Grunt with the task for this
// environment (alternatively, require the Gruntfile directly instead of spawning).
var spawn = require('child_process').spawn;

function runGrunt(task) {
  spawn('./node_modules/.bin/grunt', [task], { stdio: 'inherit' })
    .on('exit', function (code) { process.exit(code); });
}

var env = process.env.NODE_ENV;

if (env === 'development') {
  // The default task for development.
  runGrunt('default');
} else if (env === 'production') {
  // The prod task.
  runGrunt('production');
} else {
  console.error('No task for environment:', env);
  process.exit(1);
}
A couple of peripherally related points...
Try not to have Grunt and co. as regular dependencies. Keep them in devDependencies to avoid having to install all that stuff in production. Having an intermediary script in vanilla Node like the above will allow you to do this. I like to use a postinstall script like this to install git hook scripts too (but, again, only in development environments).
You don't have to use ./node_modules/grunt/bin/grunt default. If grunt-cli is a dependency or devDependency, npm knows where to look and grunt default will work fine.
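For example, a minimal setup might look like this (the version ranges are illustrative, not prescriptive):
"scripts": {
  "postinstall": "grunt default"
},
"devDependencies": {
  "grunt": "~0.4.5",
  "grunt-cli": "~0.1.13"
}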

For some reason, my dev environment was never hitting my "development" if branch. I sent a ticket to Heroku support, and this was their answer: "By default, your environment is not available during slug compilation. If you would like to make this available, you can enable an experimental feature called "user-env-compile". Please see the following article for details:
http://devcenter.heroku.com/articles/labs-user-env-compile". Good to know. So I went another route, using the heroku-buildpack-nodejs-grunt buildpack and creating a heroku:development grunt task.
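For reference, that buildpack invokes environment-specific Grunt tasks by name, so the Gruntfile needs matching registrations. A minimal sketch (the task lists here are assumptions; check the buildpack's README for the exact task names it expects):
// Gruntfile.js
module.exports = function (grunt) {
  // ...loadNpmTasks and config omitted...
  grunt.registerTask('heroku:development', ['default']);
  grunt.registerTask('heroku:production', ['production']);
};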

Related

How Do I Build For A UAT Environment Using React?

According to the React docs you can have development, test and production envs.
The value of NODE_ENV is set automatically to development (when using npm start), test (when using npm test) or production (when using npm run build). Thus, from the point of view of create-react-app, there are only three environments.
I need to change root rest api urls based on how I am deployed.
e.g.
development: baseURL = 'http://localhost:3004';
test: baseURL = 'http://localhost:8080';
uat: baseURL = 'http://uat.api.azure.com:8080';
production: baseURL = 'http://my.cool.api.com';
How do I configure a UAT environment for react if it only caters for dev, test and prod?
What would my javascript, package.json and build commands look like to switch these values automatically?
Like John Ruddell wrote in the comments, we should still use NODE_ENV=production in a staging environment to keep it as close to prod as possible. But that doesn't help with our problem here.
The reason why NODE_ENV can't be used reliably is that most Node modules use NODE_ENV to adjust and optimize with sane defaults, like Express, React, Next, etc. Next even completely changes its features depending on the commonly used values development, test and production.
So the solution is to create our own variable, and how to do that depends on the project we're working on.
Additional environments with Create React App (CRA)
The documentation says:
Note: You must create custom environment variables beginning with REACT_APP_. Any other variables except NODE_ENV will be ignored to avoid accidentally exposing a private key on the machine that could have the same name.
It was discussed in an issue where Ian Schmitz says:
Instead you can create your own variable like REACT_APP_SERVER_URL which can have default values in dev and prod through the .env file if you'd like, then simply set that environment variable when building your app for staging like REACT_APP_SERVER_URL=... npm run build.
A common package that I use is cross-env so that anyone can run our npm scripts on any platform.
"scripts": {
"build:uat": "cross-env REACT_APP_SERVER_URL='http://uat.api.azure.com:8080' npm run build"
Any other JS project
If we're not bound to CRA, or have ejected, we can easily configure any number of environment configurations we'd like in a similar fashion.
Personally, I like dotenv-extended which offers validation for required variables and default values.
Similarly, in the package.json file:
"scripts": {
"build:uat": "cross-env APP_ENV=UAT npm run build"
Then, in an entry-point node script (one of the first scripts loaded, e.g. required in a babel config):
const dotEnv = require('dotenv-extended');

// Import environment values from a .env.* file
const envFile = dotEnv.load({
  path: `.env.${process.env.APP_ENV || 'local'}`,
  defaults: 'build/env/.env.defaults',
  schema: 'build/env/.env.schema',
  errorOnMissing: true,
  silent: false,
});
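For illustration, the defaults and schema files referenced above could look like this (the BASE_URL key and its value are assumptions):
# build/env/.env.defaults -- fallback values for variables missing from .env.*
BASE_URL=http://localhost:3004

# build/env/.env.schema -- lists the required variables; values here are ignored
BASE_URL=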
Then, as an example, a babel configuration file could use these like this:
const env = require('./build/env');

module.exports = {
  plugins: [
    ['transform-define', env],
  ],
};
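With that in place, babel-plugin-transform-define swaps matching references for literals at compile time, which lets the minifier drop unreachable branches. A sketch (BASE_URL is a hypothetical key defined in the .env files):
// Anywhere in application code:
const api = BASE_URL; // replaced with the literal from .env.UAT at build time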
Runtime configuration
John Ruddell also mentioned that one can detect at runtime the domain the app is running off of.
function getApiUrl() {
  const { href } = window.location;
  // UAT
  if (href.indexOf('https://my-uat-env.example.com') !== -1) {
    return 'http://uat.api.azure.com:8080';
  }
  // PROD
  if (href.indexOf('https://example.com') !== -1) {
    return 'http://my.cool.api.com';
  }
  // Defaults to local
  return 'http://localhost:3004';
}
This is quick and simple, and works without changing the build/CI/CD pipeline at all. It does have some downsides, though:
All the configuration is "leaked" in the final build.
It won't benefit from dead-code removal at minification time when using something like babel-plugin-transform-define or Webpack's DefinePlugin, resulting in a slightly bigger file size.
It won't be available at compile time.
It's trickier if using Server-Side Rendering (though not impossible).
To have multiple environments in a React.js application, you can use the env-cmd package from NPM.
After that, create the three .env files as per your needs.
For example, if you want to set up dev, staging and prod environments, you can write your commands like this:
"start:dev": "env-cmd -f dev.env npm start", // dev env
"build:beta": "env-cmd -f stag.env npm run build", // beta env
"build": "react-scripts build", // prod env using .env file

Setting up Angular Universal App for development

I have created a project with Angular-CLI. (using command: ng new my-angular-universal).
Then I carefully followed all the instructions from https://github.com/angular/angular-cli/wiki/stories-universal-rendering
It builds for --prod and works fine, but there are no instructions on how to set up a --dev build and have it served with the --watch flag.
I tried removing the --prod flags from the npm "scripts", and it doesn't even run in dev mode. It builds fine, but when I open it in the browser this is what I see (printed directly to the response):
TypeError: Cannot read property 'moduleType' of undefined
at C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:7069:134
at ZoneDelegate.invoke (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:105076:26)
at Object.onInvoke (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:6328:33)
at ZoneDelegate.invoke (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:105075:32)
at Zone.run (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:104826:43)
at NgZone.run (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:6145:69)
at PlatformRef.bootstrapModuleFactory (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:7068:23)
at Object.renderModuleFactory (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:52132:39)
at View.engine (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:104656:23)
at View.render (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:130741:8)
the versions of the npm packages that I use are currently the latest:
@angular/* @5.2.*
@angular/cli @1.7.3
except for ts-loader, which I had to downgrade because it wasn't working:
ts-loader @3.5.0
So if anyone has any info on how to make this work, it would be very appreciated! Or maybe you know some project templates with Angular Universal App configured for both --dev and --prod builds and ability to --watch?
For development, run npm run start, which triggers ng serve. The current setup has hot module reloading, so it will watch for your changes and update your dev view. I used the same instructions and got it working here: https://github.com/ariellephan/angular5-universal-template
In short, for development, run npm run start and look at http://localhost:4200.
For production, run npm run build:ssr and npm run serve:ssr, and look at http://localhost:4000.
As contributors have pointed out, it might not be the most efficient and fastest way to develop, but nevertheless I did not want to accept workarounds. Besides, hosting front and back on separate servers brings up CORS issues, and I never planned my app to run on separate hosts, I wanted it all on the same host together with API methods.
The problem with --dev build was this:
when building with the following command:
ng build --app 1 --output-hashing=false (note that there is no --prod flag)
AppServerModuleNgFactory turned out to be missing from ./dist-server/main.bundle.
I imagine this relates to ahead-of-time (--aot) compilation, which is the default behavior when building for --prod. The instructions from https://github.com/angular/angular-cli/wiki/stories-universal-rendering only cover configuring the express server for a production build. And since there is no need for the server to be able to dynamically render HTML templates, the working --dev build command would be:
ng build --app 1 --output-hashing=false --aot
and this gets rid of the TypeError: Cannot read property 'moduleType' of undefined
Now to watch this whole mess:
run these in separate command windows:
ng build --watch
ng build --app 1 --output-hashing=false --aot --watch
webpack --config webpack.server.config.js --progress --colors --watch
And for the server to restart on change, you have to install the nodemon package and run it like this:
nodemon --inspect dist/server (--inspect if you wish to debug server with chrome)
Some other important stuff:
The Angular CLI has a command to generate the necessary scaffolding for a universal app:
ng generate universal
and it generates a fixed version of main.ts that avoids a client-side Angular bootstrap issue:
document.addEventListener('DOMContentLoaded', () => {
  platformBrowserDynamic().bootstrapModule(AppModule)
    .catch(err => console.log(err));
});
(a problem that I stumbled upon once I implemented TransferState)
There are basically two parts: the server and the UI. While developing the UI, I simply use ng serve. That means when I make changes to my code in the IDE, the browser refreshes automatically. The server part is not used here.
I do a prod build and run the server only for final testing, to see if everything works as expected (no errors due to a 3PP library's DOM manipulation, AOT-related issues, etc.).
Here, I have created a skeleton structure of an Angular Universal project. As I extensively use Vagrant and Docker in my projects, I run the server in a Docker container within the Vagrant guest system. For development of the UI, I don't run the server; simply, ng serve is used.
If you look into my structure in the above GitHub link, you'll find the details of how to run it for development and production in the Readme file.
The web server handler server.ts uses the server bundle
const { AppServerModuleNgFactory, LAZY_MODULE_MAP } = require('./dist/server/main.bundle');
That's why the server bundle needs to be compiled before you can compile the server.ts file.
So having a watch system would mean
watching/recompiling the client bundle
watching/recompiling the server bundle
recompiling the server.ts once the server bundle is created
All of them take some time (especially if you do it with AOT).
I'd recommend, like Saptarshi Basu mentioned, developing as much as you can with ng serve and checking with Angular Universal every so often.
Otherwise, it should be possible to achieve what you want with some kind of task runner (grunt/gulp/...) which sequentially triggers ng build ... and recompilation of the server.ts file.
It is a bit messy, no doubt, as we would prefer one command to rule them all.
I came up with a somewhat OK solution where my output will be:
dist/browser
dist/ng-server
Using the npm-run-all package (I find it works a lot better on Windows machines than concurrently does) I run the three watch tasks: browser, ng-server and Node.js. The Node watch has a pre-task defined that simply runs a small utility/helper file, which watches for the existence of a dist/ng-server folder and terminates itself once it is found.
For all of this to work (based on the universal-starter repo as of November 2018) there are a couple of modifications to package.json required. Primarily, to support the --watch flag on ng run commands we need to update the compiler-cli (if memory serves); ng update --all should take care of that, giving you the latest angular/cli version in the process (assuming you have a recent CLI version installed globally).
package.json
ng update --all
angular 6+
angular/cli 7+
yarn add/npm install the following
chokidar
npm-run-all
(runs our tasks in parallel with the -p flag; -r kills all remaining processes once one exits, and -l gives each running task a specific color and name in the console)
ts-node (runs Node.js directly against the TypeScript source)
nodemon // for restarting ts-node
add something similar to my util/await-file.js below (after some consideration I added my own file-watcher code, even though it wasn't exactly written with the intention of being put on display...)
modify your package.json scripts like below
modify your angular.json to match your folder names, following my examples; mainly, the "server" target's outputPath should be changed from dist/server to dist/ng-server, as in the sketch below.
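For reference, the relevant part of angular.json might then look like this (a sketch based on the universal-starter layout; the builder and file paths are assumptions):
"server": {
  "builder": "@angular-devkit/build-angular:server",
  "options": {
    "outputPath": "dist/ng-server",
    "main": "src/main.server.ts",
    "tsConfig": "src/tsconfig.server.json"
  }
}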
package.json scripts
"dev": "npm-run-all -p -r -l watch:ng-server watch:browser watch:node",
"watch:browser": "ng build --prod --progress --watch --delete-output-path",
"watch:ng-server": "ng run ng-universal-demo:server --watch --delete-output-path",
"watch:node": "yarn run watch:file-exist && yarn run ts-node",
"ts-node": "nodemon --exec ts-node server.ts -e ts,js",
"watch:file-exist": "node utils/await-file.js",
util/await-file.js
// util/await-file.js
const chokidar = require('chokidar');
const fs = require('fs');
const path = require('path');

const DIR_NAME = 'ng-server';
const DIST_PATH = './dist';

// Create the dist folder if it doesn't exist, prior to watching it.
if (!fs.existsSync(DIST_PATH)) {
  fs.mkdirSync(DIST_PATH);
}

// Watch the dist folder, waiting for writes to settle before reporting files.
const watcher = chokidar.watch(path.join(process.cwd(), 'dist'), {
  ignored: '*.map',
  persistent: true,
  awaitWriteFinish: {
    stabilityThreshold: 5000,
    pollInterval: 100
  }
});

console.log(`file-watcher running, waiting for ${DIST_PATH}/${DIR_NAME}`);

function fileFound() {
  console.log(`${DIR_NAME} folder found - closing`);
  watcher.close();
  process.exit();
}

watcher
  .on('add', function (filePath) {
    // Terminate once a .js file shows up inside dist/ng-server.
    const matchWith = path.join('dist', DIR_NAME);
    const paths = filePath.split(path.sep);
    const fileName = paths[paths.length - 1];
    if (filePath.indexOf(matchWith) >= 0
        && fileName.indexOf('.js') > fileName.length - 4) {
      fileFound();
    }
  })
  .on('error', error => console.log(`Watcher error: ${error}`));
"npm run start" and using "http://localhost:4200" works for me. Even with Angular 10

How can I set properties conditionally in Node.js via Atom?

I am developing a Node.js app with Electron via Atom.
I want to set some properties conditionally (or automatically); for instance, the URL should be http://some.url at the production level.
Currently, I do it like this:
// win.loadURL('http://app.url/webchat'); //uncomment when production
win.loadURL('http://test.app.url/webchat'); // uncomment when development
This is very annoying, and it can be a problem when I forget to switch the comments.
How can I change my properties conditionally based on the production/development level?
I have a config directory with different config files for different environments: dev, test, prod. Then in my package.json I have added environment specific build commands. e.g. For prod:
"build-prod-config": "config/buildProdConfig.sh",
"build-renderer-prod": "webpack --config=webpack.config.renderer.prod.js",
"build-main-prod": "webpack --config=webpack.config.main.prod.js",
"build-prod": "npm run build-prod-config && npm run build-main-prod & npm run build-renderer-prod",
buildProdConfig.sh
#!/usr/bin/env bash
cp config/app.config.prod.js config/app.config.js
echo "Copied ProdConfig to Config"
// This is what a config file (config/app.config.js) looks like
const Config = {
  suppDataDirectoryPath: '/suppData/',
  builtFor: 'prod',
};
module.exports = Config;
I then require Config wherever I need it in my application and use the values. It's a simple thing for now, but it works.
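For instance, consumed from the Electron main process (the require path follows the layout above; the URLs are from the question):
// main process (sketch)
const config = require('./config/app.config');

const url = config.builtFor === 'prod'
  ? 'http://app.url/webchat'
  : 'http://test.app.url/webchat';
win.loadURL(url);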
if (process.env.DEV === "PROD") {
  win.loadURL('http://app.url/webchat');
} else {
  win.loadURL('http://test.app.url/webchat');
}
Then when launching your app, just do:
DEV="PROD" node app.js
or whatever fits your setup.

Running Meteor Build under Node with settings argument

Typically when developing I would use meteor run --settings settings.json. This works fine, and I can view the settings in the browser with Meteor.settings in the console.
I am now building for production using meteor build. I've looked at the documentation, and there is nowhere to add settings during the build process.
So the build runs and I have my .tar.gz file; it's loaded to production, and then I untar/decompress the folder and run the start script.
It enters the program with npm start, and the package.json section looks like this (ignore the stop script):
{
  "name": "myapp",
  "scripts": {
    "start": "node main.js --settings settings.json",
    "stop": "killall node"
  }
}
When I look at my app, it is not collecting these settings. It is as if, when bundled, it doesn't expect the arguments. I also tried using forever beforehand, but I had no joy with that either.
Any help would be appreciated; I'm starting to wish I never bothered with Meteor :)
You can refer to Meteor Guide > Production > Deployment and Monitoring > Environment variables and Settings
Settings. These are in a JSON object set via either the --settings Meteor command-line flag or stringified into the METEOR_SETTINGS environment variable.
As for setting environment variables, if you use a 3rd party host, you may have a GUI or CLI to define them.
Otherwise, you should have plenty of resources, including on SO:
Node.js: Setting Environment Variables
How can I set an environmental variable in node.js?
https://themeteorchef.com/snippets/making-use-of-settings-json/
In short, it should look like:
METEOR_SETTINGS='{"key":"value"}' node main.js
You can also use the bash cat command to extract the content of a file: $(cat settings.json)
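Applied to the package.json above, the start script would then become something like this (a sketch; it assumes settings.json sits next to main.js and a bash-compatible shell):
"scripts": {
  "start": "METEOR_SETTINGS=\"$(cat settings.json)\" node main.js",
  "stop": "killall node"
}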

Gulp + Webpack or JUST Webpack?

I see people using gulp with webpack. But then I read webpack can replace gulp? I'm completely confused here...can someone explain?
UPDATE
in the end I started with gulp. I was new to modern front-end and just wanted to get up and running quick. Now that I've got my feet quite wet after more than a year, I'm ready to move to webpack. I suggest the same route for people who start off in the same shoes. Not saying you can't try webpack but just sayin if it seems complicated start with gulp first...nothing wrong with that.
If you don't want gulp, yes, there's grunt, but you could also just specify commands in your package.json and call them from the command line without a task runner, just to get up and running initially. For example:
"scripts": {
"babel": "babel src -d build",
"browserify": "browserify build/client/app.js -o dist/client/scripts/app.bundle.js",
"build": "npm run clean && npm run babel && npm run prepare && npm run browserify",
"clean": "rm -rf build && rm -rf dist",
"copy:server": "cp build/server.js dist/server.js",
"copy:index": "cp src/client/index.html dist/client/index.html",
"copy": "npm run copy:server && npm run copy:index",
"prepare": "mkdir -p dist/client/scripts/ && npm run copy",
"start": "node dist/server"
},
This answer might help. Task Runners (Gulp, Grunt, etc) and Bundlers (Webpack, Browserify). Why use together?
...and here's an example of using webpack from within a gulp task. This goes a step further and assumes that your webpack config is written in es6.
var path = require('path');
var gulp = require('gulp');
var webpack = require('webpack');
var gutil = require('gulp-util');
// Register Babel so the es6 webpack config can be required below.
var babel = require('babel/register');
var config = require(path.join('../..', 'webpack.config.es6.js'));

gulp.task('webpack-es6-test', function (done) {
  webpack(config).run(onBuild(done));
});

function onBuild(done) {
  return function (err, stats) {
    if (err) {
      gutil.log('Error', err);
      if (done) {
        done();
      }
    } else {
      Object.keys(stats.compilation.assets).forEach(function (key) {
        gutil.log('Webpack: output ', gutil.colors.green(key));
      });
      gutil.log('Webpack: ', gutil.colors.blue('finished ', stats.compilation.name));
      if (done) {
        done();
      }
    }
  };
}
I think you'll find that as your app gets more complicated, you might want to use gulp with a webpack task, as per the example above. This allows you to do a few more interesting things in your build that webpack loaders and plugins really don't do, e.g. creating output directories, starting servers, etc. Well, to be succinct, webpack actually can do those things, but you might find them limited for your long-term needs. One of the biggest advantages you get from gulp -> webpack is that you can customize your webpack config for different environments and have gulp do the right task at the right time. It's really up to you, but there's nothing wrong with running webpack from gulp; in fact, there are some pretty interesting examples of how to do it. The example above is basically from jlongster.
NPM scripts can do the same as gulp, but in about 50x less code. In fact, with no code at all, only command line arguments.
For example, the use case you described where you want to have different code for different environments.
With Webpack + NPM Scripts, it's this easy:
"prebuild:dev": "npm run clean:wwwroot",
"build:dev": "cross-env NODE_ENV=development webpack --config config/webpack.development.js --hot --profile --progress --colors --display-cached",
"postbuild:dev": "npm run copy:index.html && npm run rename:index.html",
"prebuild:production": "npm run clean:wwwroot",
"build:production": "cross-env NODE_ENV=production webpack --config config/webpack.production.js --profile --progress --colors --display-cached --bail",
"postbuild:production": "npm run copy:index.html && npm run rename:index.html",
"clean:wwwroot": "rimraf -- wwwroot/*",
"copy:index.html": "ncp wwwroot/index.html Views/Shared",
"rename:index.html": "cd ../PowerShell && elevate.exe -c renamer --find \"index.html\" --replace \"_Layout.cshtml\" \"../MyProject/Views/Shared/*\"",
Now you simply maintain two webpack config scripts: one for development mode, webpack.development.js, and one for production mode, webpack.production.js. I also utilize a webpack.common.js, which houses webpack config shared across all environments, and use webpack-merge to merge them, as sketched below.
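A minimal sketch of that merge setup (the file names follow the ones mentioned above; the specific options are assumptions):
// config/webpack.development.js (sketch)
const merge = require('webpack-merge');
const common = require('./webpack.common.js');

// Merge the shared config with development-only options.
module.exports = merge(common, {
  devtool: 'eval-source-map'
});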
Because of the coolness of NPM scripts, they allow for easy chaining, similar to how gulp does streams/pipes.
In the example above, to build for development, you simply go to your command line and execute npm run build:dev.
NPM would first run prebuild:dev,
Then build:dev,
And finally postbuild:dev.
The pre and post prefixes tell NPM which order to execute in.
If you notice, with Webpack + NPM scripts you can run native programs, such as rimraf, instead of a gulp wrapper for a native program, such as gulp-rimraf. You can also run native Windows .exe files, as I did here with elevate.exe, or native *nix files on Linux or Mac.
Try doing the same thing with gulp. You'll have to wait for someone to come along and write a gulp wrapper for the native program you want to use. In addition, you'll likely need to write convoluted code like this (taken straight from the angular2-seed repo):
Gulp Development code
import * as gulp from 'gulp';
import * as gulpLoadPlugins from 'gulp-load-plugins';
import * as merge from 'merge-stream';
import * as util from 'gulp-util';
import { join/*, sep, relative*/ } from 'path';

import { APP_DEST, APP_SRC, /*PROJECT_ROOT, */TOOLS_DIR, TYPED_COMPILE_INTERVAL } from '../../config';
import { makeTsProject, templateLocals } from '../../utils';

const plugins = <any>gulpLoadPlugins();

let typedBuildCounter = TYPED_COMPILE_INTERVAL; // Always start with the typed build.

/**
 * Executes the build process, transpiling the TypeScript files (except the spec and e2e-spec files) for the
 * development environment.
 */
export = () => {
  let tsProject: any;
  let typings = gulp.src([
    'typings/index.d.ts',
    TOOLS_DIR + '/manual_typings/**/*.d.ts'
  ]);
  let src = [
    join(APP_SRC, '**/*.ts'),
    '!' + join(APP_SRC, '**/*.spec.ts'),
    '!' + join(APP_SRC, '**/*.e2e-spec.ts')
  ];
  let projectFiles = gulp.src(src);
  let result: any;
  let isFullCompile = true;

  // Only do a typed build every X builds, otherwise do a typeless build to speed things up
  if (typedBuildCounter < TYPED_COMPILE_INTERVAL) {
    isFullCompile = false;
    tsProject = makeTsProject({ isolatedModules: true });
    projectFiles = projectFiles.pipe(plugins.cached());
    util.log('Performing typeless TypeScript compile.');
  } else {
    tsProject = makeTsProject();
    projectFiles = merge(typings, projectFiles);
  }

  result = projectFiles
    .pipe(plugins.plumber())
    .pipe(plugins.sourcemaps.init())
    .pipe(plugins.typescript(tsProject))
    .on('error', () => {
      typedBuildCounter = TYPED_COMPILE_INTERVAL;
    });

  if (isFullCompile) {
    typedBuildCounter = 0;
  } else {
    typedBuildCounter++;
  }

  return result.js
    .pipe(plugins.sourcemaps.write())
    // Use for debugging with Webstorm/IntelliJ
    // https://github.com/mgechev/angular2-seed/issues/1220
    // .pipe(plugins.sourcemaps.write('.', {
    //   includeContent: false,
    //   sourceRoot: (file: any) =>
    //     relative(file.path, PROJECT_ROOT + '/' + APP_SRC).replace(sep, '/') + '/' + APP_SRC
    // }))
    .pipe(plugins.template(templateLocals()))
    .pipe(gulp.dest(APP_DEST));
};
Gulp Production code
import * as gulp from 'gulp';
import * as gulpLoadPlugins from 'gulp-load-plugins';
import { join } from 'path';

import { TMP_DIR, TOOLS_DIR } from '../../config';
import { makeTsProject, templateLocals } from '../../utils';

const plugins = <any>gulpLoadPlugins();

const INLINE_OPTIONS = {
  base: TMP_DIR,
  useRelativePaths: true,
  removeLineBreaks: true
};

/**
 * Executes the build process, transpiling the TypeScript files for the production environment.
 */
export = () => {
  let tsProject = makeTsProject();
  let src = [
    'typings/index.d.ts',
    TOOLS_DIR + '/manual_typings/**/*.d.ts',
    join(TMP_DIR, '**/*.ts')
  ];
  let result = gulp.src(src)
    .pipe(plugins.plumber())
    .pipe(plugins.inlineNg2Template(INLINE_OPTIONS))
    .pipe(plugins.typescript(tsProject))
    .once('error', function () {
      this.once('finish', () => process.exit(1));
    });

  return result.js
    .pipe(plugins.template(templateLocals()))
    .pipe(gulp.dest(TMP_DIR));
};
The actual gulp code is much more complicated than this, as these are only 2 of the several dozen gulp files in the repo.
So, which one is easier to you?
In my opinion, NPM scripts far surpasses gulp and grunt, in both effectiveness and ease of use, and all front-end developers should consider using it in their workflow because it is a major time saver.
UPDATE
There is one scenario I've encountered where I wanted to use Gulp in combination with NPM scripts and Webpack.
When I need to do remote debugging on an iPad or Android device, for example, I need to start up extra servers. In the past I ran all the servers as separate processes from within IntelliJ IDEA (or WebStorm), which is easy with the "Compound" run configuration. But if I needed to stop and restart them, it was tedious to close 5 different server tabs, plus the output was spread across the different windows.
One of the benefits of gulp is that it can chain all the output from separate independent processes into one console window, which becomes the parent of all the child servers.
So I created a very simple gulp task that just runs my NPM scripts or the commands directly, so all the output appears in one window, and I can easily end all 5 servers at once by closing the gulp task window.
Gulp.js
/**
 * Gulp / Node utilities
 */
var gulp = require('gulp-help')(require('gulp'));
var utils = require('gulp-util');
var log = utils.log;
var con = utils.colors;

/**
 * Basic workflow plugins
 */
var shell = require('gulp-shell'); // run command line from shell
var browserSync = require('browser-sync');

/**
 * Performance testing plugins
 */
var ngrok = require('ngrok');

// Variables
var serverToProxy1 = "localhost:5000";
var finalPort1 = 8000;
var site;

// When the user enters "gulp" on the command line, the default task will automatically be called.
// This default task below will run all other tasks automatically.
gulp.task('default', function (cb) {
  console.log('Starting dev servers!...');
  gulp.start(
    'devserver:jit',
    'nodemon',
    'browsersync',
    'ios_webkit_debug_proxy',
    'ngrok-url'
    // 'vorlon',
    // 'remotedebug_ios_webkit_adapter'
  );
});

gulp.task('nodemon', shell.task('cd ../backend-nodejs && npm run nodemon'));
gulp.task('devserver:jit', shell.task('npm run devserver:jit'));
gulp.task('ios_webkit_debug_proxy', shell.task('npm run ios-webkit-debug-proxy'));
gulp.task('browsersync', shell.task(`browser-sync start --proxy ${serverToProxy1} --port ${finalPort1} --no-open`));
gulp.task('ngrok-url', function (cb) {
  return ngrok.connect(finalPort1, function (err, url) {
    site = url;
    log(con.cyan('ngrok'), '- serving your site from', con.yellow(site));
    cb();
  });
});
// gulp.task('vorlon', shell.task('vorlon'));
// gulp.task('remotedebug_ios_webkit_adapter', shell.task('remotedebug_ios_webkit_adapter'));
Still quite a bit of code just to run 5 tasks, in my opinion, but it works for the purpose. One caveat is that gulp-shell doesn't seem to run some commands correctly, such as ios-webkit-debug-proxy. So I had to create an NPM script that just executes the same command, and then it works.
So I primarily use NPM Scripts for all my tasks, but occasionally when I need to run a bunch of servers at once, I'll fire up my Gulp task to help out. Pick the right tool for the right job.
UPDATE 2
I now use a script called concurrently, which does the same thing as the gulp task above. It runs multiple CLI scripts in parallel and pipes them all to the same console window, and it's very simple to use. Once again, no code is required (well, the code is inside the node_module for concurrently, but you don't have to concern yourself with that).
// NOTE: If you need to run a command with spaces in it, you need to use
// double quotes, and they must be escaped (at least on windows).
// It doesn't seem to work with single quotes.
"run:all": "concurrently \"npm run devserver\" nodemon browsersync ios_webkit_debug_proxy ngrok-url"
This runs all 5 scripts in parallel, piped out to one terminal. Awesome! At this point I rarely use gulp, since there are so many CLI scripts that do the same tasks with no code.
I suggest you read these articles which compare them in depth.
How to Use NPM as a Build Tool
Why we should stop using Grunt & Gulp
Why I Left Gulp and Grunt for NPM Scripts
I used both options in my different projects.
Here is one boilerplate that I put together using gulp with webpack - https://github.com/iroy2000/react-reflux-boilerplate-with-webpack.
I have some other projects that use only webpack with npm tasks.
And they both work totally fine. I think it boils down to how complicated your task is, and how much control you want to have over your configuration.
For example, if your tasks are simple, let's say dev, build, test ... etc. (which is very standard), you are totally fine with just simple webpack with npm tasks.
But if you have a very complicated workflow and you want to have more control over your configuration (because it is coding), you could go the gulp route.
But from my experience, the webpack ecosystem provides more than enough plugins and loaders for what I need, so I love using the bare-minimum approach unless there is something you can only do in gulp. Also, it will make your configuration easier if you have one less thing in your system.
And a lot of the time nowadays, I see people actually replacing gulp and Browserify altogether with webpack alone.
The concepts of Gulp and Webpack are quite different. You tell Gulp how to put front-end code together step-by-step, but you tell Webpack what you want through a config file.
Here is a short article (5 min read) I wrote explaining my understanding of the differences: https://medium.com/@Maokai/compile-the-front-end-from-gulp-to-webpack-c45671ad87fe
Our company moved from Gulp to Webpack in the past year. Although it took some time, we figured out how to move all we did in Gulp to Webpack. So to us, everything we did in Gulp we can also do through Webpack, but not the other way around.
As of today, I'd suggest just use Webpack and avoid the mixture of Gulp and Webpack so you and your team do not need to learn and maintain both, especially because they are requiring very different mindsets.
Honestly, I think the best approach is to use both:
Webpack for everything JavaScript-related.
Gulp for everything CSS-related.
I still have to find a decent solution for packaging CSS with webpack, and so far I am happy using gulp for CSS and webpack for JavaScript.
I also use npm scripts as @Tetradev described, especially since I am using Visual Studio, and while the NPM Task Runner is pretty reliable, the Webpack Task Runner is pretty buggy.
