Node's del command - callback not firing - node.js

I'm working through a Pluralsight course on gulp. John Papa is demonstrating how to inject a function that deletes existing CSS files into the routine that compiles the new ones.
The callback on the del function is not firing. The del function is running, files are deleted, and I see no error messages. If I call the callback manually it executes, so the function itself looks intact. So I am wondering what would cause del not to execute the callback.
delete routine:
function clean(path, done) {
    log('cleaning ' + path);
    del(path, done); // problem call
}
The 'done' function is not firing, but it does if I change the code to this:
function clean(path, done) {
    log('cleaning ' + path);
    del(path);
    done();
}
Which, of course, defeats the intended purpose of waiting until del is done before continuing on.
Any ideas as to what's going on would be appreciated.
For reference (in case it's relevant):
compile css function:
gulp.task('styles', ['clean-styles'], function(){
    log('compiling less');
    return gulp
        .src(config.less)
        .pipe($.less())
        .pipe($.autoprefixer({browsers: ['last 2 versions', '> 5%']}))
        .pipe(gulp.dest(config.temp));
});
injected clean function:
gulp.task('clean-styles', function(done){
    var files = config.temp + '/**/*.css';
    clean(files, done);
});
UPDATE
If anyone else runs into this: I re-watched the training video and it was using v1.1 of del. I checked and I was using 2.x. After installing v1.1 everything works.

del isn't a Node command; it's probably this npm package. If that's the case, it doesn't accept a callback as its second parameter. Instead, it returns a promise, and you should call .then(done) to have your callback invoked after del finishes.
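For example, keeping the question's callback-style clean() on top of del 2.x might look like this (a minimal sketch; error handling is omitted):
function clean(path, done) {
    log('cleaning ' + path);
    del(path).then(function () {
        done(); // fire gulp's callback only after the deletion has finished
    });
}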
Update
A better solution is to embrace gulp's support for promises:
Change your clean function to:
function clean(path) {
    return del(path); // returns a promise
}
And your clean-styles task to:
gulp.task('clean-styles', function(){
    var files = config.temp + '/**/*.css';
    return clean(files);
});

As of version 2.0, del's API changed to use promises.
Thus, to specify a callback you should use .then():
del('unicorn.png').then(callback);
If you need to call it from a gulp task, just return the promise from the task:
gulp.task('clean', function () {
    return del('unicorn.png');
});
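As a side note, del 2.x resolves its promise with the deleted paths, so if you want to act on the result inside the task you can keep chaining before returning (an illustrative sketch):
gulp.task('clean', function () {
    return del('unicorn.png').then(function (paths) {
        console.log('Deleted:', paths.join(', ')); // del 2.x resolves with the deleted paths
    });
});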

Checking the docs for the del package, it looks like you're getting mixed up between Node's standard callback mechanism and del's, which uses a promise.
You'll want to use the promise API, calling .then(done), in order to execute the callback parameter.
Node, and JavaScript in general, is currently in a bit of a state of flux over design patterns for handling async code: most of the browser community and standards folks are leaning towards promises, whereas the Node community tends towards the callback style and libraries such as async.
With ES6 standardizing promises, I suspect we're going to see more of these kinds of incompatibilities in Node as the folks who are passionate about that API incorporate it into Node code more and more.
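As a hedged illustration of bridging the two styles, a small adapter can wrap a promise-returning function so callback-style code can consume it (the callbackify helper below is made up for this example, not part of any library):
// Wrap a promise-returning function so it accepts a Node-style callback instead.
function callbackify(promiseFn) {
    return function () {
        var args = Array.prototype.slice.call(arguments);
        var done = args.pop(); // last argument is the (err, result) callback
        promiseFn.apply(null, args)
            .then(function (result) { done(null, result); },
                  function (err) { done(err); });
    };
}

// Usage sketch: del 2.x returns a promise, but the wrapper exposes a callback interface.
var delWithCallback = callbackify(del);
delWithCallback('unicorn.png', function (err, deletedPaths) {
    if (err) { return console.error(err); }
    console.log('deleted:', deletedPaths);
});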

Related

Will a Mongoose query's `then` call occur after any passed-in callback completes?

I realize that the standard practice for promises in Mongoose is to use exec(), but the following works (or at least appears to) and I want to understand the flow. I'm not against using exec, I'm just exploring this a bit to learn.
In the following Mongoose operation:
let id: string;
SomeDocument.remove({}, (err) => { // clears collection
    someDoc = new SomeDocument(data); // creates a new doc for the collection. Id is created here from what I understand.
    someDoc.save((err, result) => { doSomething(); }); // doSomething gets called sometime in the future.
    id = someDoc._id.toString();
}).then((result) => { doSomethingElse(id); }); // This works - but will doSomethingElse always be called after the first anonymous callback completes?
I get that doSomething() will just get called at some future point - no problem. The question is, will the first callback to the remove call complete prior to doSomethingElse being called in the then call? It seems to, in that the id is correctly populated in doSomethingElse, but I want to make sure that isn't just a fluke of timing - i.e. I want to know if I can rely on that callback completing prior to the then. I'm using standard ES6 promises (NodeJS.Global.Promise).
The alternative is that maybe then is called after the remove operation completes, but prior to the callback completing (doesn't seem to - but I want to confirm).
Set me straight if I'm explaining this incorrectly.
Yes, as @JaromandaX explained in the comments, the order of callbacks is deterministic.
However, it's still bad code, and you should not rely on this behaviour. If you are using promises, don't pass a callback to remove at all; only pass callbacks to then!
SomeDocument.remove({})
    .then(() => {
        const someDoc = new SomeDocument(data);
        someDoc.save().then(doSomething); // doSomething will get called in the future.
        return someDoc._id.toString();
        //^^^^^^
    })
    .then(doSomethingElse); // doSomethingElse will get passed the id
doSomethingElse will be called with the result of the previous callback, which is guaranteed to have completed by then.
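Note that in the snippet above doSomethingElse does not wait for the save. If you need that as well, one hedged variant is to return the save chain from the callback (a sketch, assuming save() returns a promise when called without a callback):
SomeDocument.remove({})
    .then(() => {
        const someDoc = new SomeDocument(data);
        // Returning this chain makes the outer .then wait for save and doSomething too.
        return someDoc.save()
            .then(doSomething)
            .then(() => someDoc._id.toString());
    })
    .then(doSomethingElse); // now runs only after the save has completed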

nodejs async.forEach callback was already called

I'm using the async library to help me with my control flow. I have a collection over which I want to iterate, for each element execute one asynchronous task, and when all are done, call the callback.
I've decided to use an async.forEach loop. On each iteration I call my asynchronous task, but I get an error: callback was already called. Shouldn't the final callback be called only when all of the per-element callbacks have been called? I'd also like to understand how to handle errors properly: it is highly probable that some tasks will fail and others will succeed. I don't need to know which elements fail, but I would like the loop to carry on regardless - how can I do this?
This is my code:
async.forEach(fonts, function(font, callback) {
    ftpm_module.installOsFont(font, callback);
}, function() {
    console.log("finished");
});
EDIT: the error occurs only if I pass 2 or more fonts.
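For what it's worth, that error message usually means the per-item task invokes its callback more than once. A hedged sketch that guards against a double invocation and swallows individual failures (so the loop always reaches the final callback) might look like this; the guard is illustrative, not a fix to ftpm_module itself:
async.forEach(fonts, function (font, callback) {
    var called = false;
    ftpm_module.installOsFont(font, function (err) {
        if (called) { return; } // guard: ignore any second invocation from the task
        called = true;
        if (err) { console.log('failed to install ' + font); }
        callback(); // never pass the error on, so one failure doesn't abort the loop
    });
}, function () {
    console.log("finished");
});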

Code coverage for node.js project using Q promises

I'm currently working on a node.js project. We use the Q library for promises (https://github.com/kriskowal/q).
We are using mocha for tests and code coverage provided via grunt tasks (https://github.com/pghalliday/grunt-mocha-test), which uses blanket.js for coverage and the travis-cov reporter for asserting the coverage threshold.
Unfortunately the solution does not provide branch coverage for promises.
I have tried Intern (http://theintern.io/); however, a basic example I wrote does not show correct branch coverage either.
Can you recommend a solution that would provide correct coverage for Q promises and works with grunt and node seamlessly?
Well, checking promises for coverage should not be too hard, because JavaScript has absolutely sick aspect-oriented programming abilities. While automated tools are off-topic here, let's go through what branches a promise has:
First, the resolver function for the promise (either a promise constructor or a deferred) has its own branches.
Each .then clause has three branches (one for success, one for failure, one for progress). If you want loop coverage too, you'd want each progress handler, where attached, to fire zero times, once, and multiple times - although let's avoid progress for this question, and in general, since it's being deprecated and replaced in Q.
.done, .catch, etc. are all special cases of .then. Aggregation methods like .all are special cases of the promise constructor, since they create a new aggregate promise, and their .then needs to be checked just as well.
So, let's see how we can do this. First, note that Q uses Promise.prototype.then internally for its aggregation methods, so we can override it:
Promise.prototype._then = Promise.prototype.then;
var branches = 0;
Promise.prototype.then = function (fulfilled, rejected, progressed) {
    // count one branch for every handler that was actually attached
    branches += (typeof fulfilled === 'function') +
                (typeof rejected === 'function') +
                (typeof progressed === 'function');
    var nFulfilled = function () { branches--; return fulfilled.apply(this, arguments); };
    var nRejected = function () { branches--; return rejected.apply(this, arguments); };
    // progression may happen more than once, so only count the first call
    var nProgressed = function () {
        if (!nProgressed.happened) {
            branches--;
            nProgressed.happened = true;
        }
        return progressed.apply(this, arguments);
    };
    nProgressed.happened = false;
    return this._then((typeof fulfilled === 'function') ? nFulfilled : undefined,
                      (typeof rejected === 'function') ? nRejected : undefined,
                      (typeof progressed === 'function') ? nProgressed : undefined);
};
What we're doing here is hooking into every .then and keeping track of the handlers attached and the handlers actually called.
The branches variable will contain the number of code paths created with promises but not actually executed. You can use more sophisticated logic here (for example, count rather than subtract), and I'd like to see someone pick up the gauntlet and make a git repo from this :)
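A hedged usage sketch, assuming the hook above has been installed on the promise prototype your code actually uses (the reporting helper below is made up):
// After the test run, report how many attached handlers never executed.
function reportPromiseBranches() {
    console.log(branches + ' promise handler(s) were attached but never ran');
}

// For example, with mocha:
after(reportPromiseBranches);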

NodeJS Filesystem sync and performance

I've run into an issue with NodeJS where, due to some middleware, I need to directly return a value which requires knowing the last modified time of a file. Obviously the correct way would be to do
getFilename: function(filename, next) {
    fs.stat(filename, function(err, stats) {
        // Do error checking, etc...
        next('', filename + '?' + new Date(stats.mtime).getTime());
    });
}
however, due to the middleware I am using, getFilename must return a value, so I am doing:
getFilename: function(filename) {
    var stats = fs.statSync(filename);
    return filename + '?' + new Date(stats.mtime).getTime();
}
I don't completely understand the nature of the NodeJS event loop, so what I was wondering is: does statSync have any special sauce in it that somehow pumps the event loop (or whatever it is called in Node - the stack of instructions waiting to be performed) while the file information is loading? Or is it really blocking, meaning this code is going to cause performance nightmares down the road and I should rewrite the middleware I am using to use a callback? If it does have special sauce that allows the event loop to continue while it is waiting on the disk, is that available anywhere else (through some promise library or something)?
Nope, there is no magic here. If you block in the middle of the function, everything is blocked.
If performance becomes an issue, I think your only option is to rewrite that part of the middleware, or get creative with how it is used.
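One hedged way to "get creative" is to pay the blocking cost only once per file and cache the result (a sketch shown as a standalone function; cache invalidation on file changes is ignored here):
var fs = require('fs');
var mtimeCache = {};

function getFilename(filename) {
    if (!(filename in mtimeCache)) {
        var stats = fs.statSync(filename); // blocks, but only on the first access per file
        mtimeCache[filename] = new Date(stats.mtime).getTime();
    }
    return filename + '?' + mtimeCache[filename];
}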

Does node.js preserve asynchronous execution order?

I am wondering if node.js makes any guarantee on the order async calls start/complete.
I do not think it does, but I have read a number of code samples on the Internet that I thought would be buggy because the async calls may not complete in the order expected; yet the examples are often presented in the context of how great Node is because of its single-threaded asynchronous model. However, I cannot find a direct answer to this general question.
Is it the case that different Node modules make different guarantees? For example, at https://stackoverflow.com/a/8018371/1072626 the answer clearly states that the asynchronous calls involving Redis preserve order.
The crux of this problem boils down to: is the following execution (or similar) strictly safe in Node?
var fs = require("fs");
fs.unlink("/tmp/test.png");
fs.rename("/tmp/image1.png", "/tmp/test.png");
According to the author, the call to unlink is needed because rename will fail on Windows if there is a pre-existing file. However, both calls are asynchronous, so my initial thought was that the call to rename should be in the callback of unlink, to ensure the asynchronous I/O completes before the asynchronous rename operation starts; otherwise rename may execute first, causing an error.
Async operations do not have any determined time to execute.
When you call unlink, it asks the OS to remove the file, but it is not defined when the OS will actually remove it; it might be a millisecond or a year later.
The whole point of async operations is that they don't depend on each other unless explicitly stated so.
In order for rename to occur after unlink, you have to modify your code like this:
fs.unlink("/tmp/test.png", function (err) {
if (err) {
console.log("An error occured");
} else {
fs.rename("/tmp/image1.png", "/tmp/test.png", function (err) {
if (err) {
console.log("An error occured");
} else {
console.log("Done renaming");
}
});
}
});
or, alternatively, to use the synchronous versions of the fs functions (note that these will block the executing thread):
fs.unlinkSync("/tmp/test.png");
fs.renameSync("/tmp/image1.png", "/tmp/test.png");
There are also libraries such as async that make async code look better:
async.waterfall([
    fs.unlink.bind(null, "/tmp/test.png"),
    fs.rename.bind(null, "/tmp/image1.png", "/tmp/test.png")
], function (err) {
    if (err) {
        console.log("An error occurred");
    } else {
        console.log("done renaming");
    }
});
Note that in all examples error handling is extremely simplified to represent the idea.
If you look at the Node.js documentation, you'll find that the function fs.unlink takes a callback as an argument:
fs.unlink(path, [callback]);
An action that you intend to take when the operation completes should be passed to the function as the callback argument. So typically, in your case, the code will take the following form:
var fs = require("fs");
fs.unlink("/tmp/test.png", function(){
fs.rename("/tmp/image1.png", "/tmp/test.png");
});
In the specific case of unlink and rename there are also synchronous functions in Node.js, which can be used as fs.unlinkSync(path) and fs.renameSync(oldPath, newPath). This will ensure that the code runs synchronously.
Moreover, if you wish to use an asynchronous implementation but retain better readability, you could consider a library like async. It also has options for different control-flow patterns like parallel, series, waterfall, etc., as sketched below.
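For instance, a hedged sketch of the same unlink-then-rename flow with async.series (each step starts only after the previous one has called its callback):
var async = require('async');
var fs = require('fs');

async.series([
    function (cb) { fs.unlink('/tmp/test.png', cb); },
    function (cb) { fs.rename('/tmp/image1.png', '/tmp/test.png', cb); }
], function (err) {
    if (err) {
        console.log('An error occurred');
    } else {
        console.log('Done renaming');
    }
});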
Hope this helps.
