electron-windows-installer slow execution - node.js

I'm doing some deployment tests on Windows, and I'm using the "electron-windows-installer" package to create a Windows installer from my Electron app.
I set it up as a gulp task.
'use strict';
var gulp = require('gulp');
var winInstaller = require('electron-windows-installer');

gulp.task('create-windows-installer', function(done) {
  winInstaller({
    appDirectory: 'build/myApp',
    outputDirectory: 'build/release',
    iconUrl: 'URIToIcon',
    exe: 'myApp.exe',
    title: 'myApp',
    setupExe: 'myApp.exe',
    setupMsi: 'myApp.msi',
    setupIcon: 'pathToIcon',
    loadingGif: 'pathToGif',
    arch: 'ia32'
  }).then(done).catch(done);
});
And my package.json has the following command to run it from npm:
"installer": "gulp create-windows-installer"
When I run npm run installer everything works, but creating the installer takes about 1 hour and 10 minutes. I have 52 dependencies in my project and my final executable is about 200 MB. I'm wondering whether it's normal for this process to take so long, or whether something is wrong in my code.
Thank you very much.

The process takes so long because of cached folders/files from previous builds.
Just clean the outputDirectory (and the appDirectory if necessary) and then build again, and you'll be good to go.
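For example, a minimal sketch of a clean task that could be run before create-windows-installer (this assumes the del package, which is not in the original gulpfile):

var gulp = require('gulp');
var del = require('del'); // assumed extra dev dependency, used only for cleanup

// remove the previous installer output so stale files are not picked up again
gulp.task('clean-release', function() {
  return del(['build/release']);
});

You could then run gulp clean-release before the installer task, or wire it in as a prerequisite of create-windows-installer.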


Gulp clean is triggered when command-line arguments are passed into a gulp task

I have the following gulp task:
build.task('upload', {
  execute: (config) => {
    /*
    THIS WORKS, BUT ONLY if I do "gulp upload".
    "gulp upload -u < commandline options >" fails.
    const uname = "johndoe@asdf.com";
    const pwd = "supersecret";
    const siteUrl = "https://<mytenant>.sharepoint.com/sites";
    const siteCatalogUrl = "https://<mytenant>.sharepoint.com/sites/CatalogSiteName";
    const catalogName = "AppCatalog";
    console.log(uname);
    console.log(siteUrl);
    console.log(siteCatalogUrl);
    console.log(catalogName);
    */
    const uname = config.args['u'];
    const pwd = config.args['p'];
    const siteUrl = config.args['sU'];
    const siteCatalogUrl = config.args['cU'];
    const catalogName = config.args['c'];
    console.log(uname);
    console.log(siteUrl);
    console.log(siteCatalogUrl);
    console.log(catalogName);
    return new Promise((resolve, reject) => {
      const pkgFile = require('./config/package-solution.json');
      const folderLocation = `./sharepoint/${pkgFile.paths.zippedPackage}`;
      return gulp.src(folderLocation)
        .pipe(spsync({
          "username": uname,
          "password": pwd,
          "site": siteCatalogUrl,
          "libraryPath": catalogName,
          "publish": true
        }))
        .on('finish', resolve);
    });
  }
});
When I'm testing on the command line, I run it like this:
gulp upload
The first thing it does is call gulp clean ... which blows away the .sppkg, so the upload task fails.
The other artifact I've noticed, which I can't explain, is that when I run the gulp task, I see this:
Build target: SHIP
instead of the usual
Build target: DEBUG
Output
lab3:search-parts admin$ gulp upload
Build target: SHIP
[14:16:31] Using gulpfile /src/search/gulpfile.js
[14:16:31] Starting 'upload'...
[14:16:31] Starting gulp
[14:16:31] Starting subtask 'clean'...
[14:16:31] Finished subtask 'clean' after 119 ms
[14:16:31] The following tasks did not complete: upload
[14:16:31] Did you forget to signal async completion?
About to exit with code: 0
Process terminated before summary could be written, possible error in async code not continuing!
Trying to exit with exit code 1
Dunno if it's related, but sharing in case...
Thanks.
EDIT 1
I can consistently recreate the problem, and I found a fix too, albeit not one that I can use in production. But I don't know the root cause of the issue yet.
To create the problem I simply need to pass in the arguments via the config object. In other words, I trigger this gulp task via the command line like this:
gulp package-solution --ship
gulp upload --u "johndoe@asdf.com" --p "supersecret" --sU "https://<mytenant>.sharepoint.com/sites/CatalogSiteName" --cU "https://<mytenant>.sharepoint.com/sites/CatalogSiteName" --c "AppCatalog"
When I run the script, the first thing it does is a gulp clean.
If I comment out all the logic that grabs the variables from config.args[] and just use the hardcoded values... it works. But I have to make sure that I don't supply the arguments via the command line. So, in other words, this works:
lab3:spparts admin$ gulp upload
Build target: DEBUG
[14:27:27] Using gulpfile /src/sp/spparts/gulpfile.js
[14:27:27] Starting 'upload'...
[14:27:27] Starting gulp
johndoe@asdf.com
https://<mytenant>.sharepoint.com/sites/CatalogSiteName
https://<mytenant>.sharepoint.com/sites/CatalogSiteName
AppCatalog
[14:27:27] Uploading spparts.sppkg
[14:27:32] Upload successful 5289ms
[14:27:35] Published file 2408ms
[14:27:35] Finished 'upload' after 7.73 s
[14:27:35] ==================[ Finished ]==================
[14:27:36] Project spparts version:4.3.0
[14:27:36] Build tools version:3.17.11
[14:27:36] Node version:v10.24.1
[14:27:36] Total duration:13 s
But this doesn't (even though the JS code is still using hardcoded values):
lab3:spparts admin$ gulp upload --u "johndoe@asdf.com" --p "supersecret" --sU "https://<mytenant>.sharepoint.com/sites/CatalogSiteName" --cU "https://<mytenant>.sharepoint.com/sites/CatalogSiteName" --c "AppCatalog"
Build target: SHIP
[14:27:27] Using gulpfile /src/sp/spparts/gulpfile.js
[14:27:27] Starting 'upload'...
[14:27:27] Starting gulp
[14:30:36] Starting subtask 'clean'...
[14:30:36] Finished subtask 'clean' after 68 ms
johndoe@asdf.com
https://<mytenant>.sharepoint.com/sites/CatalogSiteName
https://<mytenant>.sharepoint.com/sites/CatalogSiteName
AppCatalog
[14:30:36] 'upload' errored after 85 ms
[14:30:36] Error: File not found with singular glob: /src/spparts/sharepoint/solution/spparts.sppkg (if this was purposeful, use `allowEmpty` option)
So, in summary: when I pass in command-line arguments, the script calls gulp clean. But I don't know why.
In case it helps, here's my version information:
lab3:spparts admin$ gulp -v
CLI version: 2.3.0
Local version: 4.0.2
version of sp-build-web is 1.12.1
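One workaround worth trying (a sketch only, not something from the post): since the breakage only shows up when extra flags are put on the gulp command line, the values could be passed as environment variables instead, so the command stays a plain gulp upload and nothing extra reaches the build tooling's own argument handling. The variable names below are made up for illustration:

// inside the task, instead of config.args[...]
const uname = process.env.SPPKG_USER;            // hypothetical variable names
const pwd = process.env.SPPKG_PASSWORD;
const siteCatalogUrl = process.env.SPPKG_CATALOG_URL;
const catalogName = process.env.SPPKG_CATALOG;

// invoked with no extra gulp flags, for example:
// SPPKG_USER="johndoe@asdf.com" SPPKG_PASSWORD="supersecret" SPPKG_CATALOG_URL="https://<mytenant>.sharepoint.com/sites/CatalogSiteName" SPPKG_CATALOG="AppCatalog" gulp upload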
I take it the upload task uploads the package to the [tenant|site collection] app catalog? This doesn't solve the problem in your post, but have you considered using the CLI for Microsoft 365? It has a command that does this for you.
Best of all, there's no code to add to your project's gulpfile.js and no code to maintain. It works great... using three commands you (1) log in, (2) upload, and (3) deploy (i.e. trust) the package.
# Sign into Microsoft 365
m365 login ${{ m365_target_site_url }} --authType password --userName ${{ m365_user_login }} --password ${{ m365_user_password }}
# Upload SharePoint package to Site Collection App Catalog
m365 spo app add --filePath ${{ sppkg_filepath }} --appCatalogUrl ${{ m365_target_site_url }}/AppCatalog --scope sitecollection --overwrite
# Deploy SharePoint package
m365 spo app deploy --name ${{ spPkgFileName}} --appCatalogUrl ${{ m365_target_site_url }} --scope sitecollection --skipFeatureDeployment

Setting up Angular Universal App for development

I have created a project with Angular CLI (using the command ng new my-angular-universal).
Then I carefully followed all the instructions from https://github.com/angular/angular-cli/wiki/stories-universal-rendering
It builds for --prod and works fine. But there are no instructions on how I can set up a --dev build and have it served with the --watch flag.
I tried removing the --prod flags from the npm "scripts", and it doesn't even run in dev mode. It builds fine, but when I open it in the browser this is what I see (printed directly to the response):
TypeError: Cannot read property 'moduleType' of undefined
at C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:7069:134
at ZoneDelegate.invoke (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:105076:26)
at Object.onInvoke (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:6328:33)
at ZoneDelegate.invoke (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:105075:32)
at Zone.run (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:104826:43)
at NgZone.run (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:6145:69)
at PlatformRef.bootstrapModuleFactory (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:7068:23)
at Object.renderModuleFactory (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:52132:39)
at View.engine (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:104656:23)
at View.render (C:\Users\Mikser\documents\git\my-angular-universal\dist\server.js:130741:8)
The versions of the npm packages I use are currently the latest:
@angular/* - 5.2.*
@angular/cli - 1.7.3
except for ts-loader, which I had to downgrade because it wasn't working:
ts-loader - 3.5.0
So if anyone has any info on how to make this work, it would be much appreciated! Or maybe you know of some project templates with an Angular Universal app configured for both --dev and --prod builds and the ability to --watch?
For development, run npm run start, which triggers ng serve. The current setup has hot module reloading, so it will watch for your changes and update your dev view. I used the same instructions and got it working here: https://github.com/ariellephan/angular5-universal-template
In short, for development, run npm run start and look at http://localhost:4200.
For production, run npm run build:ssr and npm run serve:ssr, and look at http://localhost:4000.
As contributors have pointed out, it might not be the most efficient or fastest way to develop, but I did not want to accept workarounds. Besides, hosting the front end and back end on separate servers brings up CORS issues, and I never planned for my app to run on separate hosts; I wanted it all on the same host together with the API methods.
The problem with the --dev build was this:
when building with the following command:
ng build --app 1 --output-hashing=false (note that there is no --prod flag)
AppServerModuleNgFactory turned out to be missing from ./dist-server/main.bundle.
I imagine this relates to ahead-of-time (--aot) compilation, which is the default behavior when building for --prod. The instructions from https://github.com/angular/angular-cli/wiki/stories-universal-rendering only cover configuring the Express server for a production build. And since there is no need for the server to compile templates dynamically, the working --dev build command is:
ng build --app 1 --output-hashing=false --aot
and this gets rid of the TypeError: Cannot read property 'moduleType' of undefined
Now to watch this whole mess:
run these in separate command windows:
ng build --watch
ng build --app 1 --output-hashing=false --aot --watch
webpack --config webpack.server.config.js --progress --colors --watch
And for the server to restart on change, you have to install the nodemon package and run it like this:
nodemon --inspect dist/server (use --inspect if you wish to debug the server with Chrome)
Some other important stuff:
Angular CLI has a command to generate the necessary scaffolding for a universal app:
ng generate universal
and it generates a fixed version of main.ts that avoids a client-side Angular bootstrap issue:
document.addEventListener('DOMContentLoaded', () => {
  platformBrowserDynamic().bootstrapModule(AppModule)
    .catch(err => console.log(err));
});
(This addresses a problem that I stumbled upon once I implemented TransferState.)
There are basically two parts - the server and the UI. While developing the UI, I simply use ng serve. That means when I make changes to my code in the IDE, the browser refreshes automatically. Here, the server part is not used.
I do a prod build and run the server only for final testing, to see if everything works as expected (no errors due to third-party library DOM manipulation, AOT-related issues, etc.).
Here, I have created a skeleton structure of an Angular Universal project. As I extensively use Vagrant and Docker in my projects, I run the server in a Docker container within the Vagrant guest system. For development of the UI, I don't run the server; ng serve is simply used.
If you look into my structure in the above Github link, you'll find the details as to how to run it for development and production in the Readme file.
The web server handler server.ts uses the server bundle
const { AppServerModuleNgFactory, LAZY_MODULE_MAP } = require('./dist/server/main.bundle');
That's why the server bundle needs to be compiled before you can compile the server.ts file.
So having a watch system would mean
watching/recompiling the client bundle
watching/recompiling the server bundle
recompiling the server.ts once the server bundle is created
All of these take some time (especially if you do them with AOT).
I'd recommend, as Saptarshi Basu mentioned, developing as much as you can with ng serve and checking with Angular Universal every so often.
Otherwise, it should be possible to achieve what you want with some kind of task runner setup (Grunt/gulp/...) which sequentially triggers ng build ... and the recompilation of the server.ts file.
It is a bit messy, no doubt, as we would prefer one command to rule them all.
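As an illustration of that last point, here is a hedged sketch of such a task (assuming gulp 4's series API and reusing the same commands shown earlier in this thread; adjust them to your own project):

const { series } = require('gulp');
const { execSync } = require('child_process');

// each step shells out to one of the commands used above
function buildClient(cb) {
  execSync('ng build', { stdio: 'inherit' });
  cb();
}

function buildServerBundle(cb) {
  execSync('ng build --app 1 --output-hashing=false --aot', { stdio: 'inherit' });
  cb();
}

function compileServerTs(cb) {
  execSync('webpack --config webpack.server.config.js', { stdio: 'inherit' });
  cb();
}

exports.buildSsr = series(buildClient, buildServerBundle, compileServerTs);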
I came up with a somewhat OK solution where my output will be:
dist/browser
dist/ng-server
Using the npm-run-all package (I find it works a lot better on Windows machines than concurrently does), I run the three watch tasks: browser, ng-server, and Node.js. The node watch task has a pre-task defined that simply runs a small utility/helper file which watches for the existence of a dist/ng-server folder and terminates itself once the folder is found.
For all of this to work (based on the universal-starter repo as of November 2018), a couple of modifications to package.json are required. Primarily, to support the --watch flag on ng run commands we need to update the compiler-cli (if memory serves); ng update --all should take care of that, giving you the latest Angular CLI version in the process (assuming you have a recent CLI version installed globally).
package.json
ng update --all
angular 6+
angular/cli 7+
yarn add/npm install the following
chokidar
npm-run-all
(npm-run-all runs our tasks in parallel with the -p flag; -r stops the remaining tasks when one of them exits, and -l gives each running task a specific color and name in the console)
ts-node (runs Node.js directly on the TypeScript sources)
nodemon // for restarting ts-node
add something similar to my util/await-file.js (after some consideration I added my own file-watcher code below, even though it wasn't exactly written with the intention of being put on display...)
modify your package.json scripts as shown below
modify your angular.json to match your folder names, following my examples; mainly, the "server" target's outputPath should be changed from dist/server to dist/ng-server
package.json scripts
"dev": "npm-run-all -p -r -l watch:ng-server watch:browser watch:node",
"watch:browser": "ng build --prod --progress --watch --delete-output-path",
"watch:ng-server": "ng run ng-universal-demo:server --watch --delete-output-path",
"watch:node": "yarn run watch:file-exist && yarn run ts-node",
"ts-node": "nodemon --exec ts-node server.ts -e ts,js",
"watch:file-exist": "node utils/await-file.js",
util/await-file.js
const chokidar = require('chokidar');
const fs = require('fs');
const path = require('path');

const DIR_NAME = 'ng-server';
const DIST_PATH = './dist';

// creates dist folder if it doesn't exist - prior to adding it to the watcher.
if (!fs.existsSync(DIST_PATH)) {
  fs.mkdirSync(DIST_PATH);
}

const watcher = chokidar.watch('file, dir', {
  ignored: '*.map',
  persistent: true,
  awaitWriteFinish: {
    stabilityThreshold: 5000,
    pollInterval: 100
  }
});

const FOLDER_PATH = path.join(process.cwd(), 'dist');
watcher.add(FOLDER_PATH);

console.log(`file-watcher running, waiting for ${DIST_PATH}/${DIR_NAME}`);

function fileFound() {
  console.log(`${DIR_NAME} folder found - closing`);
  watcher.close();
  process.exit();
}

watcher
  .on('add', function (filePath) {
    const matchWith = path.join('dist', DIR_NAME);
    const paths = filePath.split(path.sep);
    const fileName = paths[paths.length - 1];
    if ((filePath.indexOf(matchWith) >= 0)
        && fileName.indexOf('.js') > fileName.length - 4) {
      fileFound();
    }
  })
  .on('error', error => console.log(`Watcher error: ${error}`));
"npm run start" and using "http://localhost:4200" works for me. Even with Angular 10

How to debug app startup with Gulp

I have run into a roadblock with a new team I am working with that supports a Node app. The app is launched via Gulp, and the setup is such that there is a "core" NPM module that defines a bunch of gulp tasks and a "server"; our app simply installs this package, and our code is copied in as a "plugin" to the server.
In our gulpfile.js, we have something like:
var gulp = require('gulp');
var workflow = require('base-workflow');
workflow.use({ gulp: gulp });
gulp.task('default', ['base:default']);
...more stuff
Where base:default is pulled in and a couple of Hapi servers are ultimately started (one as a "web" app, one as a "rest" proxy app to real Java-based REST services). What I would like to do is set up node-inspector so that I can troubleshoot the startup of the app, because I have found that the latest versions of their base packages are not Mac-compatible.
What I have tried is to install gulp-node-inspector with the following changes:
var gulp = require('gulp');
var nodeInspector = require('gulp-node-inspector');
var workflow = require('base-workflow');
workflow.use({ gulp: gulp });
gulp.task('default', ['base:default']);
gulp.task('debug', ['default'], function() { gulp.src([]).pipe(nodeInspector({debugBrk: true})); });
...more stuff
and also:
var gulp = require('gulp');
var nodeInspector = require('gulp-node-inspector');
var workflow = require('base-workflow');
workflow.use({ gulp: gulp });
gulp.task('default', ['base:default']);
gulp.task('debug', function() { gulp.src(['default']).pipe(nodeInspector({debugBrk: true})); });
...more stuff
but neither of those works. Part of this is most likely my lack of understanding of Gulp. Does anyone know how I can debug this app?
I spent a fair bit of time googling and trying the various solutions out there; in the end the one that worked for me was the accepted answer found on this page:
How to debug gulpfile.js
This was the only one that allowed me to actually hit my "debugger" command in my gulp task.
I should also note that I had to completely uninstall and reinstall node-inspector; there was a version problem, and when I was on the verge of solving it I was getting a "cannot find module" error because the version of node-inspector was causing it to point to the wrong folder. Once I uninstalled and reinstalled (via npm), it worked. In my case I'm on a Windows machine, and the command that worked looked like the following:
node-debug C:\myPathWhereGulpfileDotJsExists\node_modules\gulp\bin\gulp.js --gulpfile C:\myPathWhereGulpfileDotJsExists\gulpfile.js myTestTaskContainingDebuggerCommand
Maybe this solution will help you:
node --inspect --debug-brk ./node_modules/gulp/bin/gulp.js fonts
The best way to do this now is to add a debugger; statement at the place in the file where you would like a breakpoint, or set one manually once the debugger has started with setBreakpoint('gulpFile.js', 1).
Then simply
node inspect $(which gulp) taskName
c
More information about debugging with node here
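To make that concrete, a minimal sketch (the task name is borrowed from the earlier answer; the paths assume a locally installed gulp):

// gulpfile.js
var gulp = require('gulp');

gulp.task('myTestTaskContainingDebuggerCommand', function(done) {
  debugger; // execution pauses here once a debugger is attached
  done();
});

// then run one of:
//   node --inspect-brk ./node_modules/gulp/bin/gulp.js myTestTaskContainingDebuggerCommand   (attach Chrome via chrome://inspect)
//   node inspect ./node_modules/gulp/bin/gulp.js myTestTaskContainingDebuggerCommand          (built-in CLI debugger; type c to continue)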

Gulp Watching Creates Infinite Loop Without Changing Files

Similar to other questions: in this very watered-down snippet, running the default gulp task (via npm start, which runs gulp) creates an infinite loop that runs the scripts task over and over. Here is the gulpfile.js (literally the whole thing at the moment):
'use strict';
const gulp = require('gulp');

// COPY SCRIPTS TO BUILD FOLDER
gulp.task('scripts', function() {
  return gulp.src('./scripts/**/*.js')
    .pipe(gulp.dest('build/scripts'));
});

// WATCH FILES
gulp.task('watch', function() {
  gulp.watch('./scripts/**/*.js', ['scripts']);
});

// DEFAULT
gulp.task('default', ['watch']);
The extra odd thing is that whether or not the build folder already exists, the scripts task is executed immediately after calling npm start! And the loop begins.
In case you're curious, here is the (only) scripts object pasted from my package.json:
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1",
"start": "gulp"
},
The only other thing in my directory is a scripts folder with an app.js and a home.js file in it. Obviously, once this task is run, the build folder is created (if it wasn't already there) and the two aforementioned files are copied into it.
You can see I'm only looking for scripts in the root directory's first-level folder called scripts, so I shouldn't have an infinite loop from referencing changes on the same set of scripts. Also, even if I'm explicit and point to exactly one particular file with a relative path such as ./scripts/home.js, this still happens.
I'm anticipating being embarrassed, but I'm utterly confused.
A few things I've picked up on could be causing some errors.
EDIT -
Try the gulp-watch plugin: npm install --save-dev gulp-watch
// Try and declare your plugins like this for now.
var gulp = require('gulp'),
    watch = require('gulp-watch');

// Provide a callback, cb
gulp.task('scripts', function(cb) {
  // Don't use ./ on your src, as watch can have a problem with this
  return gulp.src('scripts/**/*.js')
    .pipe(gulp.dest('./build/scripts'), cb); // call cb, and don't forget the ;
});

// remove ./ on watch
gulp.task('watch', function() {
  gulp.watch('scripts/**/*.js', ['scripts']);
});

gulp.task('default', ['watch']);
So that is pretty weird behaviour, but this should do the trick.
The only time I use ./ within gulp is on my dest.
Also, just remember that the gulpfile is just JS, so remember your semicolons, etc.
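For reference, here is a hedged sketch of the same copy-and-watch setup in gulp 4 syntax (an assumption; the snippets above use the gulp 3 task/array API). In gulp 4, watch takes a function rather than a task-name array, which makes the wiring a bit harder to get wrong:

const { src, dest, watch, series } = require('gulp');

// copy scripts to the build folder
function scripts() {
  return src('scripts/**/*.js')
    .pipe(dest('build/scripts'));
}

// re-run the copy whenever a script changes
function watchScripts() {
  watch('scripts/**/*.js', scripts);
}

exports.scripts = scripts;
exports.default = series(scripts, watchScripts);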
I cannot guarantee the resolution here, but two variables changed when this was resolved:
I upgraded my Parallels VM application (on an Apple PowerBook) from version 10 to 11.
I reinstalled Windows 10 using another license for a current version (the previous one was a licensed dev or early-release version).
My code, Node version, devDependencies, and their versions were identical.

Gulp + Webpack or JUST Webpack?

I see people using gulp with webpack. But then I read webpack can replace gulp? I'm completely confused here...can someone explain?
UPDATE
In the end I started with gulp. I was new to modern front-end development and just wanted to get up and running quickly. Now that I've got my feet quite wet after more than a year, I'm ready to move to webpack. I suggest the same route for people who start off in the same shoes. I'm not saying you can't try webpack, but if it seems complicated, start with gulp first... nothing wrong with that.
If you don't want gulp, yes, there's Grunt, but you could also just specify commands in your package.json and call them from the command line without a task runner, just to get up and running initially. For example:
"scripts": {
"babel": "babel src -d build",
"browserify": "browserify build/client/app.js -o dist/client/scripts/app.bundle.js",
"build": "npm run clean && npm run babel && npm run prepare && npm run browserify",
"clean": "rm -rf build && rm -rf dist",
"copy:server": "cp build/server.js dist/server.js",
"copy:index": "cp src/client/index.html dist/client/index.html",
"copy": "npm run copy:server && npm run copy:index",
"prepare": "mkdir -p dist/client/scripts/ && npm run copy",
"start": "node dist/server"
},
This answer might help. Task Runners (Gulp, Grunt, etc) and Bundlers (Webpack, Browserify). Why use together?
...and here's an example of using webpack from within a gulp task. This goes a step further and assumes that your webpack config is written in ES6.
var gulp = require('gulp');
var path = require('path');
var webpack = require('webpack');
var gutil = require('gulp-util');
var babel = require('babel/register');
var config = require(path.join('../..', 'webpack.config.es6.js'));

gulp.task('webpack-es6-test', function(done) {
  webpack(config).run(onBuild(done));
});

function onBuild(done) {
  return function(err, stats) {
    if (err) {
      gutil.log('Error', err);
      if (done) {
        done();
      }
    } else {
      Object.keys(stats.compilation.assets).forEach(function(key) {
        gutil.log('Webpack: output ', gutil.colors.green(key));
      });
      gutil.log('Webpack: ', gutil.colors.blue('finished ', stats.compilation.name));
      if (done) {
        done();
      }
    }
  };
}
I think you'll find that as your app gets more complicated, you might want to use gulp with a webpack task as per the example above. This allows you to do a few more interesting things in your build that webpack loaders and plugins really don't do, i.e. creating output directories, starting servers, etc. Well, to be succinct, webpack actually can do those things, but you might find them limited for your long-term needs. One of the biggest advantages you get from gulp -> webpack is that you can customize your webpack config for different environments and have gulp run the right task at the right time. It's really up to you, but there's nothing wrong with running webpack from gulp; in fact there are some pretty interesting examples of how to do it. The example above is basically from jlongster.
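To sketch that idea (the config file names here are assumptions, not part of the original example), the same onBuild helper from the snippet above can be reused with a different webpack config per environment:

var gulp = require('gulp');
var webpack = require('webpack');

// hypothetical per-environment config files
var devConfig = require('./webpack.dev.config.js');
var prodConfig = require('./webpack.prod.config.js');

gulp.task('webpack-dev', function(done) {
  webpack(devConfig).run(onBuild(done));
});

gulp.task('webpack-prod', function(done) {
  webpack(prodConfig).run(onBuild(done));
});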
NPM scripts can do the same as gulp, but with about 50x less code. In fact, with no code at all, only command-line arguments.
For example, take the use case you described where you want different configuration for different environments.
With Webpack + NPM scripts, it's this easy:
"prebuild:dev": "npm run clean:wwwroot",
"build:dev": "cross-env NODE_ENV=development webpack --config config/webpack.development.js --hot --profile --progress --colors --display-cached",
"postbuild:dev": "npm run copy:index.html && npm run rename:index.html",
"prebuild:production": "npm run clean:wwwroot",
"build:production": "cross-env NODE_ENV=production webpack --config config/webpack.production.js --profile --progress --colors --display-cached --bail",
"postbuild:production": "npm run copy:index.html && npm run rename:index.html",
"clean:wwwroot": "rimraf -- wwwroot/*",
"copy:index.html": "ncp wwwroot/index.html Views/Shared",
"rename:index.html": "cd ../PowerShell && elevate.exe -c renamer --find \"index.html\" --replace \"_Layout.cshtml\" \"../MyProject/Views/Shared/*\"",
Now you simply maintain two webpack config scripts, one for development mode, webpack.development.js, and one for production mode, webpack.production.js. I also utilize a webpack.common.js which houses the webpack config shared across all environments, and use webpack-merge to merge them.
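For illustration, a minimal sketch of what one of those files might look like with webpack-merge (assuming a webpack-merge version that exports the merge function directly; the path and the option shown are simplified examples):

// config/webpack.development.js (sketch)
const merge = require('webpack-merge');
const common = require('./webpack.common.js');

module.exports = merge(common, {
  devtool: 'inline-source-map' // example of a development-only setting
});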
Because of the coolness of NPM scripts, they allow for easy chaining, similar to how gulp does streams/pipes.
In the example above, to build for development, you simply go to your command line and execute npm run build:dev.
NPM would first run prebuild:dev,
then build:dev,
and finally postbuild:dev.
The pre and post prefixes tell NPM which order to execute the scripts in.
If you notice, with Webpack + NPM scripts you can run native programs, such as rimraf, instead of a gulp wrapper for a native program such as gulp-rimraf. You can also run native Windows .exe files, as I did here with elevate.exe, or native *nix files on Linux or Mac.
Try doing the same thing with gulp. You'll have to wait for someone to come along and write a gulp wrapper for the native program you want to use. In addition, you'll likely need to write convoluted code like this (taken straight from the angular2-seed repo):
Gulp Development code
import * as gulp from 'gulp';
import * as gulpLoadPlugins from 'gulp-load-plugins';
import * as merge from 'merge-stream';
import * as util from 'gulp-util';
import { join/*, sep, relative*/ } from 'path';
import { APP_DEST, APP_SRC, /*PROJECT_ROOT, */TOOLS_DIR, TYPED_COMPILE_INTERVAL } from '../../config';
import { makeTsProject, templateLocals } from '../../utils';

const plugins = <any>gulpLoadPlugins();

let typedBuildCounter = TYPED_COMPILE_INTERVAL; // Always start with the typed build.

/**
 * Executes the build process, transpiling the TypeScript files (except the spec and e2e-spec files) for the development
 * environment.
 */
export = () => {
  let tsProject: any;
  let typings = gulp.src([
    'typings/index.d.ts',
    TOOLS_DIR + '/manual_typings/**/*.d.ts'
  ]);
  let src = [
    join(APP_SRC, '**/*.ts'),
    '!' + join(APP_SRC, '**/*.spec.ts'),
    '!' + join(APP_SRC, '**/*.e2e-spec.ts')
  ];
  let projectFiles = gulp.src(src);
  let result: any;
  let isFullCompile = true;

  // Only do a typed build every X builds, otherwise do a typeless build to speed things up
  if (typedBuildCounter < TYPED_COMPILE_INTERVAL) {
    isFullCompile = false;
    tsProject = makeTsProject({isolatedModules: true});
    projectFiles = projectFiles.pipe(plugins.cached());
    util.log('Performing typeless TypeScript compile.');
  } else {
    tsProject = makeTsProject();
    projectFiles = merge(typings, projectFiles);
  }

  result = projectFiles
    .pipe(plugins.plumber())
    .pipe(plugins.sourcemaps.init())
    .pipe(plugins.typescript(tsProject))
    .on('error', () => {
      typedBuildCounter = TYPED_COMPILE_INTERVAL;
    });

  if (isFullCompile) {
    typedBuildCounter = 0;
  } else {
    typedBuildCounter++;
  }

  return result.js
    .pipe(plugins.sourcemaps.write())
    // Use for debugging with Webstorm/IntelliJ
    // https://github.com/mgechev/angular2-seed/issues/1220
    // .pipe(plugins.sourcemaps.write('.', {
    //   includeContent: false,
    //   sourceRoot: (file: any) =>
    //     relative(file.path, PROJECT_ROOT + '/' + APP_SRC).replace(sep, '/') + '/' + APP_SRC
    // }))
    .pipe(plugins.template(templateLocals()))
    .pipe(gulp.dest(APP_DEST));
};
Gulp Production code
import * as gulp from 'gulp';
import * as gulpLoadPlugins from 'gulp-load-plugins';
import { join } from 'path';
import { TMP_DIR, TOOLS_DIR } from '../../config';
import { makeTsProject, templateLocals } from '../../utils';

const plugins = <any>gulpLoadPlugins();

const INLINE_OPTIONS = {
  base: TMP_DIR,
  useRelativePaths: true,
  removeLineBreaks: true
};

/**
 * Executes the build process, transpiling the TypeScript files for the production environment.
 */
export = () => {
  let tsProject = makeTsProject();
  let src = [
    'typings/index.d.ts',
    TOOLS_DIR + '/manual_typings/**/*.d.ts',
    join(TMP_DIR, '**/*.ts')
  ];
  let result = gulp.src(src)
    .pipe(plugins.plumber())
    .pipe(plugins.inlineNg2Template(INLINE_OPTIONS))
    .pipe(plugins.typescript(tsProject))
    .once('error', function () {
      this.once('finish', () => process.exit(1));
    });

  return result.js
    .pipe(plugins.template(templateLocals()))
    .pipe(gulp.dest(TMP_DIR));
};
The actual gulp code is much more complicated than this, as these are only 2 of the several dozen gulp files in the repo.
So, which one is easier for you?
In my opinion, NPM scripts far surpass gulp and grunt in both effectiveness and ease of use, and all front-end developers should consider using them in their workflow, because they are a major time saver.
UPDATE
There is one scenario I've encountered where I wanted to use Gulp in combination with NPM scripts and Webpack.
When I need to do remote debugging on an iPad or Android device, for example, I need to start up extra servers. In the past I ran all the servers as separate processes from within IntelliJ IDEA (or WebStorm), which is easy with the "Compound" run configuration. But if I needed to stop and restart them, it was tedious to have to close 5 different server tabs, plus the output was spread across the different windows.
One of the benefits of gulp is that it can chain all the output from separate independent processes into one console window, which becomes the parent of all the child servers.
So I created a very simple gulp task that just runs my NPM scripts or the commands directly, so all the output appears in one window, and I can easily end all 5 servers at once by closing the gulp task window.
Gulp.js
/**
 * Gulp / Node utilities
 */
var gulp = require('gulp-help')(require('gulp'));
var utils = require('gulp-util');
var log = utils.log;
var con = utils.colors;

/**
 * Basic workflow plugins
 */
var shell = require('gulp-shell'); // run command line from shell
var browserSync = require('browser-sync');

/**
 * Performance testing plugins
 */
var ngrok = require('ngrok');

// Variables
var serverToProxy1 = "localhost:5000";
var finalPort1 = 8000;

// When the user enters "gulp" on the command line, the default task will automatically be called.
// This default task below will run all other tasks automatically.

// Default task
gulp.task('default', function (cb) {
  console.log('Starting dev servers!...');
  gulp.start(
    'devserver:jit',
    'nodemon',
    'browsersync',
    'ios_webkit_debug_proxy',
    'ngrok-url'
    // 'vorlon',
    // 'remotedebug_ios_webkit_adapter'
  );
});

gulp.task('nodemon', shell.task('cd ../backend-nodejs && npm run nodemon'));
gulp.task('devserver:jit', shell.task('npm run devserver:jit'));
gulp.task('ios_webkit_debug_proxy', shell.task('npm run ios-webkit-debug-proxy'));
gulp.task('browsersync', shell.task(`browser-sync start --proxy ${serverToProxy1} --port ${finalPort1} --no-open`));
gulp.task('ngrok-url', function (cb) {
  return ngrok.connect(finalPort1, function (err, url) {
    var site = url;
    log(con.cyan('ngrok'), '- serving your site from', con.yellow(site));
    cb();
  });
});
// gulp.task('vorlon', shell.task('vorlon'));
// gulp.task('remotedebug_ios_webkit_adapter', shell.task('remotedebug_ios_webkit_adapter'));
Still quite a bit of code just to run 5 tasks, in my opinion, but it works for the purpose. One caveat is that gulp-shell doesn't seem to run some commands correctly, such as ios-webkit-debug-proxy. So I had to create an NPM script that just executes the same command, and then it works.
So I primarily use NPM Scripts for all my tasks, but occasionally when I need to run a bunch of servers at once, I'll fire up my Gulp task to help out. Pick the right tool for the right job.
UPDATE 2
I now use a script called concurrently which does the same thing as the gulp task above. It runs multiple CLI scripts in parallel and pipes them all to the same console window, and it's very simple to use. Once again, no code is required (well, the code is inside the node_modules folder for concurrently, but you don't have to concern yourself with that).
// NOTE: If you need to run a command with spaces in it, you need to use
// double quotes, and they must be escaped (at least on windows).
// It doesn't seem to work with single quotes.
"run:all": "concurrently \"npm run devserver\" nodemon browsersync ios_webkit_debug_proxy ngrok-url"
This runs all 5 scripts in parallel, piped out to one terminal. Awesome! At this point, I rarely use gulp, since there are so many CLI scripts that do the same tasks with no code.
I suggest you read these articles which compare them in depth.
How to Use NPM as a Build Tool
Why we should stop using Grunt & Gulp
Why I Left Gulp and Grunt for NPM Scripts
I have used both options in different projects.
Here is one boilerplate that I put together using gulp with webpack - https://github.com/iroy2000/react-reflux-boilerplate-with-webpack.
I have some other projects that use only webpack with npm tasks.
They both work totally fine, and I think it boils down to how complicated your tasks are and how much control you want to have over your configuration.
For example, if your tasks are simple, let's say dev, build, test, etc. (which is very standard), you are totally fine with just webpack and npm tasks.
But if you have a very complicated workflow and you want more control over your configuration (because it is code), you could go the gulp route.
From my experience, though, the webpack ecosystem provides more than enough plugins and loaders for what I need, so I love using the bare-minimum approach unless there is something you can only do in gulp. Also, it will make your configuration easier if you have one less thing in your system.
And a lot of the time, nowadays, I see people actually replacing gulp and Browserify altogether with webpack alone.
The concepts of Gulp and Webpack are quite different. You tell Gulp how to put front-end code together step-by-step, but you tell Webpack what you want through a config file.
Here is a short article (5 min read) I wrote explaining my understanding of the differences: https://medium.com/@Maokai/compile-the-front-end-from-gulp-to-webpack-c45671ad87fe
Our company moved from Gulp to Webpack in the past year. Although it took some time, we figured out how to move everything we did in Gulp to Webpack. So for us, everything we did in Gulp we can also do through Webpack, but not the other way around.
As of today, I'd suggest just using Webpack and avoiding a mixture of Gulp and Webpack, so you and your team do not need to learn and maintain both, especially because they require very different mindsets.
Honestly, I think the best approach is to use both:
Webpack for everything JavaScript-related.
Gulp for everything CSS-related.
I still have to find a decent solution for packaging CSS with webpack, and so far I am happy using gulp for CSS and webpack for JavaScript.
I also use npm scripts, as @Tetradev described, especially since I am using Visual Studio. While NPM Task Runner is pretty reliable, Webpack Task Runner is pretty buggy.
