Gulp - Improve performance of gulp.watch on bundled js files - node.js

I am having issues with the performance of browser livereloading whenever I make a change to a js file. In my gulp setup, I have the following watches:
gulp.task('watch', function() {
  gulp.watch(config.paths.html, ['html']);
  gulp.watch(config.paths.js, ['js', 'lint']);
  gulp.watch(config.paths.css, ['css']);
});
Thus, whenever there's a change to a js file, the js and lint tasks are triggered. They are as follows:
gulp.task('js', function() {
  return browserify(config.paths.mainJs)
    .transform(reactify)
    .bundle()
    .on('error', console.error.bind(console))
    .pipe(source('bundle.js'))
    .pipe(gulp.dest(config.paths.dest + '/scripts'))
    .pipe(connect.reload());
});
gulp.task('lint', function() {
  return gulp.src(config.paths.js)
    .pipe(lint({config: 'eslint.config.json'}))
    .pipe(lint.format());
});
With this setup, live reloading on js file changes doesn't scale: the bigger the project gets, the longer each reload takes. Even a small project with about a dozen JS files already takes 3.8 seconds per reload.
I know the problem: on every js file change, the task reactifies and re-bundles every js file in the project, which is an expensive operation and completely redundant for every file except the one that changed. What's a better way to handle the live reloading? I know webpack has hot module replacement; is there a gulp equivalent?
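There is no built-in gulp equivalent of webpack's hot module replacement, but watchify gives you incremental browserify rebuilds, which removes most of the per-change cost. Below is a sketch only, assuming the `config` object and gulp-connect setup from the question; I have not run it against this project:

```javascript
// Sketch: incremental rebuilds with watchify. `config` and `connect`
// are the objects from the question's gulpfile; the cache/packageCache
// options and the 'update' event come from watchify's README.
var gulp = require('gulp');
var browserify = require('browserify');
var watchify = require('watchify');
var reactify = require('reactify');
var source = require('vinyl-source-stream');
var connect = require('gulp-connect');

// watchify wraps the browserify instance and caches module state,
// so only changed files are re-parsed on each rebuild.
var bundler = watchify(browserify(config.paths.mainJs, {
  cache: {},         // required by watchify
  packageCache: {}   // required by watchify
}));
bundler.transform(reactify);

function rebundle() {
  return bundler.bundle()
    .on('error', console.error.bind(console))
    .pipe(source('bundle.js'))
    .pipe(gulp.dest(config.paths.dest + '/scripts'))
    .pipe(connect.reload());
}

bundler.on('update', rebundle); // fired whenever a watched file changes
gulp.task('js', rebundle);
```

With this shape, the first build still parses everything, but subsequent rebuilds only re-parse the files that changed, so reload time stays roughly constant as the project grows.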

Related

Is there no way to control the concat order of Gulp?

I've tried gulp-order, I've tried renaming files, and most recently I've tried just explicitly listing the scripts in the src like this:
gulp.task('scripts', function() {
  return gulp.src([
      './js/bootstrap.min.js',
      './js/moment-with-locales.min.js',
      './js/jquery.fancybox.min.js'
    ])
    .pipe(concat('bundle.js'))
    .pipe(uglify())
    .pipe(gulp.dest('./dist/'));
});
No matter what I do, the output bundle.js does not follow the order I need. For what it's worth, the order does not change randomly: with concat it is the same every time, but it's not alphabetical either.
During my research it appeared the issue might be that my gulpfile.js changes were not being applied and that I might need to "restart" gulp. That said, I cannot find any information on how to "restart" gulp.
Turns out it wasn't gulp-concat that was changing the order; the order was being changed by gulp-uglify.
I was able to solve the problem by running uglify before concat, like so:
gulp.task('scripts', function() {
  return gulp.src([
      './js/bootstrap.min.js',
      './js/moment-with-locales.min.js',
      './js/jquery.fancybox.min.js'
    ])
    .pipe(uglify())
    .pipe(concat('bundle.js'))
    .pipe(gulp.dest('./dist/'));
});

How to compile ReactJS for use on server with command line arguments?

I've decided to try out ReactJS. Along with that, I've decided to use Gulp for compiling .jsx to .js, also for the first time.
I can compile it no problem for client use with browserify. Here's my gulp task:
browserify("./scripts/main.jsx")
  .transform(babelify.configure({
    presets: ["react"]
  }))
  .bundle()
  .pipe(source('bundle.js'))
  .pipe(gulp.dest('./scripts/'));
But since I use PHP to generate the data, I need to get that data to Node. If I use browserify, it prevents me from using process.argv in Node. I can save the data to a file and read that file in Node, so I wouldn't need to pass the whole state, but I still need to pass the identifying arguments so that Node knows which file to load.
What should I use instead of browserify?
If you need to compile a React module to ES5 for use on the server, use Babel itself.
For reading and writing files, Node's built-in fs module may help: https://nodejs.org/api/fs.html
Have you also considered posting to and reading from a database?
Here's how I solved it:
I learnt that you can create standalone bundles with browserify, so I compiled all the server code I need (components + rendering) as a standalone bundle. Then I created a small Node script that is responsible only for reading arguments, loading the data, and passing it to the rendering code.
I'm not sure this is the proper way to do it, but it works.
Here's code for the "setup" script:
var fs = require('fs');
var Server = require('./server.js');

if (process.argv[2]) {
  // Sanitize the region argument before using it in a file path.
  var region = process.argv[2].toLowerCase().replace(/[^a-z0-9]/g, '');
  if (region != '') {
    var data = JSON.parse(fs.readFileSync(__dirname + '/../tmp/' + region + '.json', 'utf8'));
    console.log(Server.render(data.deal, data.region));
  }
}
This way I only need to deploy two files, and I can still easily compile jsx to js.
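For reference, the standalone bundle described above can be produced with browserify's `standalone` option, which wraps the output in a UMD header so plain Node can require() it. This is a sketch; the entry file name and the exported name `Server` are assumptions based on the setup script:

```javascript
// Sketch: build ./server.js as a standalone (UMD) bundle so the setup
// script's require('./server.js') works in plain Node.
// './scripts/server-entry.jsx' is a hypothetical entry point that
// exports the render() function used above.
var fs = require('fs');
var browserify = require('browserify');
var babelify = require('babelify');

browserify('./scripts/server-entry.jsx', { standalone: 'Server' })
  .transform(babelify.configure({ presets: ['react'] }))
  .bundle()
  .pipe(fs.createWriteStream('./server.js'));
```

Whatever the entry module exports then becomes `Server` when the bundle is required from Node.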

Write qUnit output to file via Grunt

I need to be able to write QUnit test results to a file so my build server can parse them.
I'm using QUnit (grunt-contrib-qunit) through Grunt, along with the JUnit reporter found here.
I can get the report to write to the log just as it states, but I'm having trouble getting it into a file. I've tried QUnit callbacks in my gruntfile, but none of them seem to receive the XML. I also tried simply redirecting stdout, but that (of course) printed all of the non-XML command-line output along with the XML.
In short, I've got the XML echoing properly in the console.log statement. I just need to get it into a file somehow, whether through Grunt, phantomjs, or any other means.
Well, if you're running QUnit tests from Grunt, then you have the full power of Node at your disposal. I've never used that JUnit plugin, but if it just gives you a callback in your QUnit HTML file, then you need a browser-side solution (even if that browser is phantomjs).
Phantom uses QtWebKit, which implements the File API, so you could write the report from JUnit's callback using that; of course, it would fail if you run the tests in certain other browsers (namely IE9 or under). Here's how that might look (no guarantees on this being exact; I have not run it):
QUnit.jUnitReport = function(report) {
  function onInitFs(fs) {
    fs.root.getFile('qunit_report.xml', {create: true}, function(fileEntry) {
      fileEntry.createWriter(function(fileWriter) {
        fileWriter.onwriteend = function(e) { /* if you need it */ };
        fileWriter.onerror = function(e) { /* if you need it */ };
        var blob = new Blob([report.xml], {type: 'application/xml'});
        fileWriter.write(blob);
      }, someErrorHandlerFunction);
    }, someErrorHandlerFunction);
  }
  window.requestFileSystem(window.TEMPORARY, 1024*1024, onInitFs, someErrorHandlerFunction);
};
And again, if you need to do something to write the file in IE9 or under (or some mobile browsers) you'll need another solution, like kicking off an ajax request to upload the data to a server that stores the file. You could even run that "server" from within Grunt and have Node write the file.

Reduce HTTP requests bundling bower_components together?

I'm developing a web application and I'm using about 20 bower_components (bootstrap, jquery, some jquery plugins like cookie and serializejson, mousewheel, mustache, underscore, etc.).
All these plugins are quite small (except for bootstrap and jquery), but each is loaded with a dedicated <script> tag.
This makes my page request 20 micro JavaScript files, which makes it load slowly.
I'd like to bundle all these scripts together and load the bundle instead.
I've tried gulp-bower-src, but it only helps me uglify them and move them into a "lib" folder; I can't find a way to compress them all together.
This is my gulp task at the moment:
var filter = gulpFilter("**/*.js", "!**/*.min.js");

gulp.task("default", function () {
  bowerSrc()
    .pipe(filter)
    .pipe(uglify())
    .pipe(filter.restore())
    .pipe(gulp.dest(__dirname + "/public/lib"));
});
How can I do this?
You can use gulp-concat to bundle all these micro files into one asset.
var concat = require('gulp-concat');
var filter = gulpFilter("**/*.js", "!**/*.min.js");

gulp.task("default", function () {
  bowerSrc()
    .pipe(filter)
    .pipe(uglify())
    .pipe(concat('bundle.js'))
    .pipe(filter.restore())
    .pipe(gulp.dest(__dirname + "/public/lib"));
});
This will concat all filtered and uglified files into bundle.js in the public/lib directory. Keep in mind that when concatenating these files, the order matters. I suspect gulp-bower-src does not order scripts according to a dependency graph (I found nothing about it in the documentation), so you may need to select the bower files by hand in the right order.
Manual ordering/selection of the bower components can be done by replacing bowerSrc() with a line somewhat like this:
gulp.src(['./bower_components/jquery/jquery.js', './bower_components/jquery-ui/jquery-ui.js', './bower_components/jquery-swift-color-picker/color.js']);
This may seem a little clumsy, and it is, but order matters.

Grunt-Karma: Use Node.js fs-framework in Jasmine Testfile

I'm writing unit-tests with the Jasmine-framework.
I use Grunt and Karma for running the Jasmine testfiles.
I simply want to load the content of a file on my local file-system (e.g. example.xml).
I thought I can do this:
var fs = require('fs');
var fileContent = fs.readFileSync("test/resources/example.xml").toString();
console.log(fileContent);
This works well in my Gruntfile.js and even in my karma.conf.js file, but not in my Jasmine test file. My test file looks like this:
describe('Some tests', function() {
  it('load xml file', function() {
    var fs = require("fs");
    var fileContent = fs.readFileSync("test/resources/example.xml").toString();
    console.log(fileContent);
  });
});
The first error I get is:
ReferenceError: require is not defined
I don't know why I can't use require here, since I can use it in Gruntfile.js and even in karma.conf.js.
When I manually add require.js to the files property in my karma.conf.js, I get the following message instead:
Module name "fs" has not been loaded yet for context: _. Use require([])
With the array syntax of requirejs, nothing happens.
I guess it is not possible to access Node.js functionality in Jasmine when running the test files with Karma. But since Karma itself runs on Node.js, why can't I access Node's fs module?
Any comment/advice is welcome. Thanks.
Your test does not work because Karma is a test runner for client-side JavaScript (code that runs in the browser), but you are trying to test Node.js code (which runs on the server side) with it. Karma simply can't run server-side tests. You need a different test runner; for example, take a look at jasmine-node.
Since this question comes up first in a Google search: I received a similar error but wasn't using any Node.js-style code in my project. It turned out one of my bower components had a full copy of Jasmine in it, including its Node.js-style code, and I had
{ pattern: 'src/**/*.js', included: false },
in my karma.conf.js.
Unfortunately, Karma doesn't provide the best debugging for this sort of thing; it dumps you out without telling you which file caused the issue. I had to narrow that pattern down to individual directories to find the offender.
Anyway, be wary of bower installs; they bring a lot of code into your project directory that you might not really want.
I think you're missing the point of unit testing here, because it seems to me that you're copying application logic into your test suite. This defeats the purpose of a unit test, which is to run your existing functions through a test suite, not to verify that fs can load an XML file. In your scenario, if your XML handling code changed (and introduced a bug) in the source file, the test would still pass.
Think of unit testing as a way to run your functions against lots of sample data to make sure they don't break. Set up your file reader to accept input, and then in the Jasmine test simply:
describe('My XML reader', function() {
  beforeEach(function() {
    this.xmlreader = new XMLReader();
  });

  it('can load some xml', function() {
    var xmldump = this.xmlreader.loadXML('inputFile.xml');
    expect(xmldump).toBeTruthy();
  });
});
Test the methods that are exposed on the object you are testing. Don't make more work for yourself. :-)
