change `dest` option on the fly - node.js

I have some grunt tasks to compile files and would like to "recycle" them inside different tasks.
I am trying to modify the destination directory without success... My idea is something like:
grunt.registerTask('bower', ['compile:index', 'compile:core'], function(){
  this.options({dest: 'dist/*.js'});
});
The compile:index task runs fine by itself (i.e. when called alone) and has dest: 'index.js'; the other tasks have other filenames. I would like to change these inside the bower task, adding a new directory but keeping the filename defined in the original task.
Is this possible?

You can create a dynamic alias task that configures and then runs tasks, like so:
grunt.registerTask('bower', function(target) {
  target = target || 'index';
  if (target === 'core') {
    grunt.config('compile.core.dest', 'dist/core.js');
  } else {
    grunt.config('compile.index.dest', 'dist/index.js');
    // Will call itself after compile:index has run, to configure for compile:core
    grunt.task.run(['compile:index', 'bower:core', 'compile:core']);
  }
});
Then entering grunt bower or grunt bower:index will dynamically configure/run the compile:index task, then configure/run the compile:core task.
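For reference, this pattern assumes a compile multitask whose targets each carry their own dest, roughly like the following (paths hypothetical, not from the question):

compile: {
  index: { src: 'src/index.js', dest: 'index.js' },
  core: { src: 'src/core.js', dest: 'core.js' }
}

The grunt.config('compile.core.dest', ...) call then overrides just that one property before the target runs, leaving the rest of its configuration intact.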

Related

How can the base directory be replaced with gulp?

I need to alter a stream of files to contain a different base folder name. I thought the gulp-rename plugin would allow for this, but it only seems to replace the glob portion.
Example:
gulp.task("test", function() {
gulp.src("bower_components/**/*", { base: "bower_components", read:false })
.pipe($.rename(function (p) { p.dirname = "X/" + p.dirname; }))
.pipe($.print());
});
outputs:
[gulp] bower_components\X\jquery\test\data\offset\scroll.html
[gulp] bower_components\X\jquery\test\data\offset\static.html
[gulp] bower_components\X\jquery\test\data\offset\table.html
...
I want
[gulp] X\jquery\test\data\offset\scroll.html
[gulp] X\jquery\test\data\offset\static.html
[gulp] X\jquery\test\data\offset\table.html
...
Is there a way to do this with gulp-replace, or some other plugin?
I believe you could do this with gulp-tap to get hold of the file instances and alter properties on them before they get printed, or use it to print them.
Out of curiosity, what are you aiming to do?
Hope that helps!
EDIT-1::
The following is a slightly modified version of the example in the gulp-tap documentation which may work for your use case.
gulp.src("src/**/*.{coffee,js}")
.pipe(tap(function(file, t) {
file.path = 'X/' + file.path;
}))
.pipe($.print())
.pipe(gulp.dest('build'));
EDIT-2::
This is a common task I have set up in my projects for handling external scripts (note: I am using gulp-load-plugins, hence invoking my plugins with plugins.<NAME>):
gulp.task('vendor:scripts:publish', function() {
  return gulp.src(sources.vendor.js)
    .pipe(plugins.plumber())
    .pipe(plugins.concat('vendor.js'))
    .pipe(gulp.dest(destinations.js))
    .pipe(plugins.uglify())
    .pipe(plugins.rename(pluginOpts.rename))
    .pipe(gulp.dest(destinations.js));
});
destinations and sources are two variables that I have defined in a config file for my gulpfile.
But for clarity, sources.vendor.js points at an array much like the following:
js: [
  'src/vendor/jquery/dist/jquery.js',
  'src/vendor/lodash/lodash.js',
  'src/vendor/backbone/backbone.js'
],
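For illustration, destinations.js here would just be an output path string, and pluginOpts.rename a standard gulp-rename option object; a plausible (purely illustrative) shape:

var destinations = { js: 'dist/js' };             // where both bundles land
var pluginOpts = { rename: { suffix: '.min' } };  // vendor.js -> vendor.min.js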
The reason my folder is named vendor and not bower_components is because I've made use of a .bowerrc file to point my bower installation at a different folder.
In addition, if you have discrete scripts that you may not want to include all of the time, you can make use of gulp-util and gulp-filter to filter out certain scripts depending on whether an option is passed when gulp is invoked on the CLI.
For example: having gulp vendor:scripts:publish include all scripts, but gulp vendor:scripts:publish --release omit the discrete ones.
This then requires modifying your task to declare a filter that is piped in based on an option flag being picked up by gulp-util.
var isRelease = (plugins.util.env.release) ? true : false;

gulp.task('vendor:scripts:publish', function() {
  var discreteFilter = plugins.filter([
    '**/*.js',
    '!**/discrete.min.js'
  ]);
  return gulp.src(sources.vendor.js)
    .pipe(plugins.plumber())
    .pipe(isRelease ? discreteFilter : plugins.util.noop())
    .pipe(plugins.concat('vendor.js'))
    .pipe(gulp.dest(destinations.js))
    .pipe(plugins.uglify())
    .pipe(plugins.rename(pluginOpts.rename))
    .pipe(gulp.dest(destinations.js));
});
Hope that helps you out!

Using grunt to run a node server and do cleanup after

So basically this is what I want to do. Have a grunt script that compiles my coffee files to JS. Then run the node server and then, either after the server closes or while it's still running, delete the JS files that were the result of the compilation and only keep the .coffee ones.
I'm having a couple of issues getting it to work. Most importantly, the way I'm currently doing it is this:
grunt.loadNpmTasks("grunt-contrib-coffee");
grunt.registerTask("node", "Starting node server", function () {
var done = this.async();
console.log("test");
var sp = grunt.util.spawn({
cmd: "node",
args: ["index"]
}, function (err, res, code) {
console.log(err, res, code);
done();
});
});
grunt.registerTask("default", ["coffee", "node"]);
The problem here is that the node server isn't run in the same process as grunt. This matters because I can't just CTRL-C once to terminate JUST the node server.
Ideally, I'd like to have it run in the same process and have the grunt script pause while it's waiting for me to CTRL-C the server. Then, after it's finished, I want grunt to remove the said files.
How can I achieve this?
Edit: Note that the snippet doesn't have the actual removal implemented since I can't get this to work.
If you keep the variable sp in a more global scope, you can define a task node:kill that simply checks whether sp === null (or similar), and if not, does sp.kill(). Then you can simply run the node:kill task after your testing task. You could additionally invoke a separate task that just deletes the generated JS files.
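A minimal sketch of that suggestion (the kill task is registered as node-kill here, because a colon in a task name would be parsed as an argument; all names are illustrative):

var sp = null; // module scope, so other tasks can reach the spawned server

grunt.registerTask("node", "Starting node server", function () {
  var done = this.async();
  sp = grunt.util.spawn({
    cmd: "node",
    args: ["index"]
  }, function (err, res, code) {
    sp = null; // server has exited on its own
    done();
  });
});

grunt.registerTask("node-kill", "Kill the node server", function () {
  if (sp !== null) {
    sp.kill();
  }
});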
For something similar I used grunt-shell-spawn in conjunction with a shutdown listener.
In your grunt initConfig:
shell: {
  runSuperCoolJavaServer: {
    command: 'java -jar mysupercoolserver.jar',
    options: {
      async: true // spawn it instead!
    }
  }
},
Then outside of initConfig, you can set up a listener for when the user ctrl+c's out of your grunt task:
grunt.registerTask("superCoolServerShutdownListener",function(step){
var name = this.name;
if (step === 'exit') process.exit();
else {
process.on("SIGINT",function(){
grunt.log.writeln("").writeln("Shutting down super cool server...");
grunt.task.run(["shell:runSuperCoolJavaServer:kill"]); //the key!
grunt.task.current.async()();
});
}
});
Finally, register the tasks
grunt.registerTask('serverWithKill', [
  'shell:runSuperCoolJavaServer',
  'superCoolServerShutdownListener'
]);

How to use jasmine with gulp.watch

I'm trying to make my tests run each time I'm saving some files. Here is the gulp watch:
gulp.task('jasmine', function() {
  gulp.src('spec/nodejs/*Spec.js')
    .pipe(jasmine({ verbose: true, includeStackTrace: true }));
});

gulp.task('watch', function () {
  gulp.watch(['app/*.js', 'app/!(embed)**/*.js', 'spec/nodejs/*.js'], ['jasmine']);
});
To test, for example, app/maps.js, I'm creating a spec/nodejs/mapsSpec.js file like this:
'use strict';
var maps = require('../../app/maps');

describe('/maps related routes', function(){
  it('should ...', function(){...}
  ...
If I change a spec file, everything works well; if I modify the app/maps.js file, the change triggers the tests. But if I modify it again, the tests are triggered yet the modifications don't take effect. For example, if I add a console.log('foo') the second time, I will not see it until I relaunch gulp watch and save again. So only one run of jasmine works when using it with gulp.watch.
I guess it's because require is cached by nodejs in the gulp process. So what should I do?
I took a look at the code of gulp-jasmine. The problem is that the only file removed from the cache is the Spec.js file itself. The cache entries for its children (the required files under test) aren't cleared.
Within the index.js of gulp-jasmine there is a line which deletes the cache:
delete require.cache[require.resolve(path.resolve(file.path))];
If you put the next block of code before that delete, you will clear all the children's cache entries as well, and it will run correctly every time you save your file.
var files = require.cache[require.resolve(path.resolve(file.path))];
if (typeof files !== 'undefined') {
  for (var i in files.children) {
    delete require.cache[files.children[i].id];
  }
}
You can make this change directly in node_modules.
I will open a pull request, so maybe in the near future this will be solved permanently.
Also wrote a post about it on: http://navelpluisje.nl/entry/fix-cache-problem-jasmine-tests-with-gulp
I haven't found a fix for this issue, but you can work around it via the gulp-shell task.
npm install gulp-shell --save-dev
then
var shell = require('gulp-shell');
...
gulp.task('jasmine', function() {
  gulp.src('spec/nodejs/*Spec.js')
    .pipe(shell('minijasminenode spec/*Spec.js'));
});
You'll also need jasmine installed as a direct dependency (gulp-jasmine uses minijasminenode).
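If the minijasminenode CLI isn't already available, installing it as a direct dev dependency should cover this (assuming the package exposes a binary of the same name):

npm install minijasminenode --save-dev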

How to modify grunt watch tasks based on the file changed?

I'm writing a node.js program that will watch a directory filled with a large number (300-ish) of scss projects. grunt-watch (run either through the node module or on its own, whatever works) will be configured so that whenever a scss file is changed, it will be compiled with compass and the output file moved to a separate directory, for example:
./1234/style.scss was changed >> grunt-watch runs grunt-compass >> /foo/bar/baz/1234/style.css updated
The project directory that the file was in is obviously very important (if grunt-compass sent all the compiled files to the same directory, they would be jumbled and unusable, and the grunt automation would be purposeless). In order to make sure all files are routed to the correct place, I am dynamically changing the grunt-compass settings every time a scss file is updated.
Sample gruntfile:
module.exports = function(grunt) {
  grunt.initConfig({
    pkg: grunt.file.readJSON('package.json'),
    watch: {
      files: './*/*.scss',
      tasks: ['compass']
    },
    compass: {
      origin: {
        options: {
          // temporary settings to be changed later
          sassDir: './',
          cssDir: './bar',
          specify: './foo.scss'
        }
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.loadNpmTasks('grunt-contrib-compass');

  grunt.event.on('watch', function(action, filepath, target) {
    var path = require('path');
    grunt.log.writeln(target + ': ' + filepath + ' might have ' + action);
    var siteDirectory = path.dirname(filepath);

    // changes sass directory to that of the changed file
    var option = 'compass.origin.options.sassDir';
    var result = __dirname + '/' + siteDirectory;
    grunt.log.writeln(option + ' changed to ' + result);
    grunt.config(option, result);

    // customizes css output directory so that file goes to correct place
    option = 'compass.origin.options.cssDir';
    result = path.resolve(__dirname, '../', siteDirectory);
    grunt.log.writeln(option + ' changed to ' + result);
    grunt.config(option, result);

    //grunt.task.run(['compass']);
  });
};
However this doesn't work. If you run 'grunt watch' in verbose mode, you will see that grunt runs both the grunt.event.on function and the watch task in separate processes. The second parsing of the gruntfile reverts all my event.on config changes to the defaults above, and compass fails to run.
As seen in the event.on comments, I attempted to add a grunt.task.run() to make sure that compass was run in the same process as the event.on function, which would preserve my config changes. However, the task refused to run, likely because I'm doing it wrong.
Unfortunately, the grunt.event.on variables are not sent to the defined grunt-watch task, otherwise I could write a custom function that would change the compass settings and then run compass in the same process.
I've tried implementing this without grunt, using the watch function built into compass; however, compass can only store one static output path per project and can only watch one project at once.
I have currently gotten around this issue by adding a node program that takes the site name as a parameter, rewrites the gruntfile.js using fs, and then runs 'grunt watch' via an exec function. This however has its own drawbacks (I can't view the grunt.log data) and is horribly convoluted, so I'd like to change it.
Thank you so much for any insight.
You need to specify
options : { nospawn : true }
in your watch task config to have the watch run in the same context:
watch: {
  files: './*/*.scss',
  tasks: ['compass'],
  options: { nospawn: true }
}
See this section of documentation for more info on this.

How to get nodeunit to detect and run tests included in subfolders?

I have the following folder structure to my nodeunit tests for a particular project:
/tests
/tests/basic-test.js
/tests/models/
/tests/models/model1-tests.js
/tests/models/model2-tests.js
My question is - how do I get nodeunit to automatically execute ALL of the tests in the tests folder, including the sub-directories contained within?
If I execute nodeunit tests it only executes basic-test.js and skips everything in the sub-folders by default.
Use make-based magic (or shell-based magic).
test:
	nodeunit $(shell find ./tests -name \*.js)
Here you're passing the result of running find ./tests -name \*.js to nodeunit, which should run all JavaScript tests recursively.
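The pure shell equivalent, without a Makefile (the glob is quoted so the shell passes it through to find):

nodeunit $(find ./tests -name '*.js')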
Nodeunit allows you to pass in a list of directories from which to run tests. I used a package called diveSync which synchronously and recursively loops over files and directories. I store all the directories in an array and pass it to nodeunit:
var diveSync = require("diveSync"),
fs = require("fs"),
nodeUnit = require('nodeunit'),
directoriesToTest = ['test'];
diveSync(directoriesToTest[0], {directories:true}, function(err, file) {
if (fs.lstatSync(file).isDirectory()) {
directoriesToTest.push(file);
}
})
nodeUnit.reporters.default.run(directoriesToTest);
While this is not an automatic solution as described above, I have created a collector file like this:
allTests.js:
exports.registryTests = require("./registryTests.js");
exports.messageTests = require("./messageTests.js");
When I run nodeunit allTests.js, it does run all the tests, and indicates the hierarchical arrangement as well:
✔ registryTests - [Test 1]
✔ registryTests - [Test 2]
✔ messageTests - [Test 1]
etc...
While the creation of a new unit test file will require including it in the collector, that is an easy, one-time task, and I can still run each file individually. For a very large project, this would also allow collectors that run more than one, but not all tests.
I was looking for solutions for the same question. None of the presented answers fully suited my situation where:
I didn't want to have any additional dependencies.
I already had nodeunit installed globally.
I didn't want to maintain the test file.
So the final solution for me was to combine Ian's and mbmcavoy's ideas:
// nodeunit tests.js
const path = require('path');
const fs = require('fs');

// Add folders you don't want to process here.
const ignores = [path.basename(__filename), 'node_modules', '.git'];
const testPaths = [];

// Reads a dir, finding all the tests inside it.
const readDir = (dir) => {
  fs.readdirSync(dir).forEach((item) => {
    const thisPath = `${dir}/${item}`;
    if (
      ignores.indexOf(item) === -1 &&
      fs.lstatSync(thisPath).isDirectory()
    ) {
      if (item === 'tests') {
        // Tests dir found.
        fs.readdirSync(thisPath).forEach((test) => {
          testPaths.push(`${thisPath}/${test}`);
        });
      } else {
        // Sub dir found.
        readDir(thisPath);
      }
    }
  });
};

readDir('.');

// Feed the tests to nodeunit.
testPaths.forEach((testPath) => {
  exports[testPath] = require(testPath);
});
Now I can run all my tests, new and old, with a mere nodeunit tests.js command.
As you can see from the code, the test files should be inside tests folders and the folders should not have any other files.
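For example, a layout the script would pick up (module names hypothetical):

/module-a/tests/model1-tests.js
/module-b/tests/model2-tests.js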
