How to add two functions as callbacks in Node? - node.js

How can I pass two functions as a callback in Node?
For example:
request('http://...',[func1,func2])
Is there any module for that?

Is there some reason you can't put a wrapper around the two functions, then use the wrapper as a callback? For example:
function handleCallback(data) {
  func1(data);
  func2(data);
}
request('http://...',handleCallback);
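If you want something closer to the request('http://...', [func1, func2]) shape from the question, a tiny helper can turn an array of functions into one callback. A minimal sketch (the combine name is just for illustration):
// turn an array of functions into a single callback that fans out to all of them
function combine(callbacks) {
  return function () {
    var args = arguments;
    callbacks.forEach(function (cb) {
      cb.apply(null, args); // every function receives the same arguments
    });
  };
}
request('http://...', combine([func1, func2]));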

As noted in other answers, it's trivial to wrap multiple functions in a single callback. However, the event emitter pattern is sometimes nicer for handling such cases. An example might be:
var http = require('http');
var req = http.request({ method: 'GET', host:... });
req.on('error', function (err) {
  // Make sure you handle 'error' events on event emitters. If you don't,
  // the error will be thrown as an uncaught exception!
});
req.once('response', func1);
req.once('response', func2);
You can add as many listeners as you like. Of course, this assumes that what you want your callbacks to be registered on is an event emitter. Much of the time this is true.
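If the thing you are calling is not itself an event emitter, you can create your own emitter and have a single callback re-emit the data to any number of listeners. A rough sketch using Node's built-in events module (request here stands in for whatever async call you are making):
var EventEmitter = require('events').EventEmitter;
var results = new EventEmitter();
results.on('data', func1);
results.on('data', func2);
// the single callback just re-emits whatever it receives
request('http://...', function (data) {
  results.emit('data', data);
});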

You can create a little module for that:
ff.js file:
var async = require('async');

module.exports.series = function (tasks) {
  return function (arg) {
    async.eachSeries(tasks, function (task, next) {
      task(arg); // call each task with the same argument
      next();
    });
  };
};
example of use:
var ff = require('./ff.js');

function t(callback) {
  callback('callback');
}
function a(b) {
  console.log('a', b);
}
function b(b) {
  console.log('b', b);
}
t(ff.series([a, b]));

Related

Best Way to Chain Callbacks?

I have a question about the best way to chain callbacks together while passing data between them. I have an example below which works, but has the flaw that the functions have to be aware of the chain and know their position / whether to pass on a callback.
function first(data, cb1, cb2, cb3) {
  console.log("first()", data);
  cb1(data, cb2, cb3);
}
function second(data, cb1, cb2) {
  console.log("second()", data);
  cb1(data, cb2);
}
function third(data, cb) {
  console.log("third()", data);
  cb(data);
}
function last(data) {
  console.log("last() ", data);
}
first("THEDATA", second, third, last); // This works OK
first("THEDATA2", second, last); // This works OK
first("THEDATA3", second); // This doesn't work, as second tries to pass on a callback that doesn't exist.
I can think of a number of ways around this, but they all involve making the called functions aware of what is going on. But my problem is that the real functions I’m dealing with already exist, and I don’t want to modify them if I can avoid it. So I wondered if I was missing a trick in terms of how these could be called?
Thanks for the answers, and I agree that promises are probably the most appropriate solution, as they meet my requirements and provide a number of other advantages.
However I have also figured out how to do what I want without involving any extra modules.
To recap, the specific ask was to:
chain together a number of functions via callbacks (so that the first function can make a non-blocking I/O call that the other functions depend on),
while passing arguments (data) between them, and
without the existing functions having to be modified to be aware of their position in the callback chain.
The 'trick' I was missing was to introduce some extra anonymous callback functions to act as links between the existing functions.
// The series of functions now follow the same pattern they did before:
// the first parameter is a data object and the second is a callback, except for last().
function first(data, cb){
console.log("first()", data);
cb(data);
}
function second(data, cb) {
console.log("second()",data);
cb(data);
}
function third(data, cb) {
console.log("third()",data);
cb(data);
}
function last(data) {
console.log("last() ", data);
}
// And the named functions can be called pretty much in any order without
// the called functions knowing or caring.
// first() + last()
first("THEDATA", function (data) { // The anonymous function is called-back, receives the data,
last(data); // and calls the next function.
});
// first() + second() + last()
first("THEDATA2", function (data) {
second(data, function (data){
last(data);
}); // end second();
}); // end first();
// first() + third()! + second()! + last()
first("THEDATA3", function (data) {
third(data, function (data){
second(data, function (data){
last(data);
}); // end second();
}); // end third();
}); // end first();
Promises are the way to go as they provide the following benefits:
Sequential callback chaining
Parallel callback chaining (kind of)
Exception handling
Easily passing around the objects
In general, a Promise performs some operation and then changes its own state to either rejected, meaning it failed, or resolved, meaning it completed and we have the result.
Now, each promise has a method then(function(result), function(error)). This then() method runs once the Promise is either resolved or rejected. If it is resolved, the first argument of then() is called with the result; if the Promise gets rejected, the second argument is called.
A typical callback chain looks like:
example 1:
someMethodThatReturnsPromise()
.then(function (result) {
// executed after the someMethodThatReturnsPromise() resolves
return somePromise;
}).then(function (result) {
// executed after somePromise resolved
}).then(null, function (e) {
// executed if any of the promise is rejected in the above chain
});
How do you create a Promise in the first place? You do it like this:
new Promise(function (resolve, reject) {
  // do some operation and when it completes
  resolve(result /* result you want to pass to the "then" method */);
  // if something goes wrong, call reject(error /* error object */)
});
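For example, the (data, cb) style functions from earlier can be lifted into a promise chain with a small wrapper; a sketch (the promisify helper name is just for illustration, and it assumes each function simply passes data to its callback):
// wrap a (data, cb) style function so it returns a Promise
function promisify(fn) {
  return function (data) {
    return new Promise(function (resolve) {
      fn(data, resolve);
    });
  };
}
promisify(first)("THEDATA")
  .then(promisify(second))
  .then(promisify(third))
  .then(last);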
Best practice would be to check whether the callback you are calling is provided in the args.
function first(data, cb1, cb2, cb3){
console.log("first()", data); // return something
if(cb1){
cb1(data,cb2, cb3);
}
}
function second(data, cb1, cb2) {
console.log("second()",data); // return something
if(cb1){
cb1(data, cb2);
}
}
function third(data, cb) {
console.log("third()",data); // return something
if(cb){
cb(data);
}
}
function last(data) {
console.log("last() ", data); // return something
}
first("THEDATA", second, third, last); // This work OK
first("THEDATA2", second, last); // This works OK
first("THEDATA3", second);
This will work fine.
Alternatively, there are many more options like Promises or the Async library, etc.
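For example, with the async library the same chain could be written roughly like this, reusing the two-argument (data, cb) versions of the functions shown earlier (a sketch, not the only way to arrange it):
var async = require('async');
async.waterfall([
  function (cb) { first("THEDATA", function (data) { cb(null, data); }); },
  function (data, cb) { second(data, function (data) { cb(null, data); }); },
  function (data, cb) { third(data, function (data) { cb(null, data); }); }
], function (err, data) {
  if (err) return console.error(err);
  last(data);
});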
Maybe you want to try it this way. Yes, using Promises gives you more features, but if you want something simpler, we can do it like this. This version has no error checking, but of course you can add try/catch.
function Bersambung(cb){
this.mainfun = cb
this.Lanjut = function(cb){
var thisss = this;
if(cb){
return new Bersambung(function(lanjut){
thisss.mainfun(function(){
cb(lanjut);
})
})
} else {
this.mainfun(function(){});
}
}
}
//You can use it like this :
var data1 = "", data2 = "" , data3 = ""
new Bersambung(function(next){
console.log("sebelum async")
setTimeout(function(){
data1 = "MENYIMPAN DATA : save data"
next();
},1000);
}).Lanjut(function(next){
if(data1 == ""){
return; // break the chain
}
console.log("sebelum async 2")
setTimeout(function(){
console.log("after async")
console.log(data1);
next();
},1000)
}).Lanjut();
Okay this might not be ideal but it works.
function foo(cb = [], args = {}) // input: a chain of callback functions and arguments
{
  args.sometext += "f";
  console.log(args.sometext);
  if (cb.length == 0)
    return; // stop if there are no more callback functions in the chain
  else
    cb[0](cb.slice(1), args); // run the next callback and remove it from the chain
}
function boo(cb = [], args = {})
{
  var newArgs = {}; // if you want separate arguments
  newArgs.sometext = args.sometext.substring(1);
  newArgs.sometext += "b";
  console.log(newArgs.sometext);
  if (cb.length == 0)
    return;
  else
    cb[0](cb.slice(1), newArgs);
}
// execute a chain of callback functions
foo([foo, foo, boo, boo], { sometext: "" });

Node.js promises with mongoskin

I'm trying to avoid using callbacks when making mongodb queries. I'm using mongoskin to make calls like so:
req.db.collection('users').find().toArray(function (err, doc) {
res.json(doc);
});
In many cases I need to make multiple queries, so I want to use a Node.js promise library, but I'm not sure how to wrap these functions as promises. Most of the examples I see are trivial for things like readFile; I'm guessing in this case I would need to wrap toArray somehow? Can this be done, or would it have to be something implemented by mongoskin?
An example could be any set of callbacks, find/insert, find/find/insert, find/update:
req.db.collection('users').find().toArray(function (err, doc) {
if (doc) {
req.db.collection('users').find().toArray(function (err, doc) {
// etc...
});
}
else {
// err
}
});
You can promisify the entire module like so with bluebird:
var Promise = require("bluebird");
var mongoskin = require("mongoskin");
Object.keys(mongoskin).forEach(function(key) {
var value = mongoskin[key];
if (typeof value === "function") {
Promise.promisifyAll(value);
Promise.promisifyAll(value.prototype);
}
});
Promise.promisifyAll(mongoskin);
This only needs to be done once, in one place in your application, not throughout your application code.
After that you just use methods normally except with the Async suffix and don't pass callbacks:
req.db.collection('users').find().toArrayAsync()
.then(function(doc) {
if (doc) {
return req.db.collection('users').find().toArrayAsync();
}
})
.then(function(doc) {
if (doc) {
return req.db.collection('users').find().toArrayAsync();
}
})
.then(function(doc) {
if (doc) {
return req.db.collection('users').find().toArrayAsync();
}
});
So again, if you call a function like
foo(a, b, c, function(err, result) {
if (err) return console.log(err);
//Code
});
The promise-returning version is called like:
fooAsync(a, b, c).then(...)
(Uncaught errors are automatically logged, so you don't need to check for them if you are only going to log them.)
Just stumbled here with the same question and didn't love "promisifying" mongoskin, so did a bit more digging and found monk. It's built on top of mongoskin, tidies up the API, and returns promises for all async calls. Probably worth a peek for anyone else who lands here.
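A minimal sketch of what the original query might look like with monk, based on its documented promise-returning API (the connection string is a placeholder, and details may vary between monk versions):
var monk = require('monk');
var db = monk('localhost/mydb'); // placeholder connection string
var users = db.get('users');
users.find({})
  .then(function (docs) {
    res.json(docs);
  })
  .catch(function (err) {
    // handle the error
  });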
Esailija's answer may work, but it's not super efficient since you have to run db.collection on every single db call. I don't know exactly how expensive that is, but looking at the code in mongoskin, it's non-trivial. Not only that, but it globally modifies prototypes, which isn't very safe.
The way I do this with fibers/futures is:
wrap the collection methods for each collection
on receiving the result, for methods that return a Cursor, wrap the toArray method, call it, and return the resulting future (for methods that don't return a cursor, you don't need to do anything else)
use the future as normal
like this:
var Future = require("fibers/future")
// note: when i originally wrote this answer fibers/futures didn't have a good/intuitive wrapping function; but as of 2014-08-18, it does have one
function futureWrap() {
// function
if(arguments.length === 1) {
var fn = arguments[0]
var object = undefined
// object, methodName
} else {
var object = arguments[0]
var fn = object[arguments[1]]
}
return function() {
var args = Array.prototype.slice.call(arguments)
var future = new Future
args.push(future.resolver())
var me = this
if(object) me = object
fn.apply(me, args)
return future
}
}
var methodsYouWantToHave = ['findOne', 'find', 'update', 'insert', 'remove', 'findAndModify']
var methods = {}
methodsYouWantToHave.forEach(function(method) {
  methods[method] = futureWrap(this.collection, method)
}.bind(this))
// use them
var document = methods.findOne({_id: 'a3jf938fj98j'}, {}).wait()
var documents = futureWrap(methods.find({x: 'whatever'}, {}).wait(), 'toArray')().wait()
If you don't want to use fibers, I'd recommend using the async-future module, which has a good wrap function built in too.

Manually promisifying pg.connect with Bluebird

I want to promisify node-postgres' pg.connect method along with the inner connection.query method provided in the callback.
I can .promisify the latter, but I need to implement the first one manually (if I'm missing something here, please explain).
The thing is, I'm not sure if this code is correct or should be improved. The code is working; I just want to know if I'm using Bluebird as intended.
// aliases
var asPromise = Promise.promisify;
// save reference to original method
var connect = pg.connect.bind(pg);
// promisify method
pg.connect = function (data) {
var deferred = Promise.defer();
connect(data, function promisify(err, connection, release) {
if (err) return deferred.reject(err);
// promisify query factory
connection.query = asPromise(connection.query, connection);
// resolve promised connection
deferred.resolve([connection,release]);
});
return deferred.promise;
};
Throw all that horrible callback code away, then do this somewhere in your application initialization:
var pg = require("pg");
var Promise = require("bluebird");
Object.keys(pg).forEach(function(key) {
var Class = pg[key];
if (typeof Class === "function") {
Promise.promisifyAll(Class.prototype);
Promise.promisifyAll(Class);
}
})
Promise.promisifyAll(pg);
Later, anywhere in your application, you can use the pg module as if it was designed to use promises to begin with:
// Later
// Don't even need to require bluebird here
var pg = require("pg");
// Note how it's the pg API but with *Async suffix
pg.connectAsync(...).spread(function(connection, release) {
return connection.queryAsync("...")
.then(function(result) {
console.log("rows", result.rows);
})
.finally(function() {
// Creating a superfluous anonymous function cos I am
// unsure of your JS skill level
release();
});
});
By now there are a number of libraries which do this for you (see the pg-promise sketch after this list):
pg-promise - generic Promises/A+ for PG
postgres-bluebird
dbh-ph
pg-bluebird
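For instance, a minimal pg-promise sketch (the connection string and query are placeholders):
var pgp = require('pg-promise')();
var db = pgp('postgres://user:password@localhost:5432/mydb'); // placeholder connection string
db.any('SELECT * FROM users') // placeholder query
  .then(function (rows) {
    console.log('rows', rows);
  })
  .catch(function (err) {
    console.error(err);
  });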
Update for bluebird 3:
The pg.connectAsync(...).spread(function(connection, release) { ... }) call will not work anymore, because the API of bluebird has changed: http://bluebirdjs.com/docs/new-in-bluebird-3.html#promisification-api-changes .
The problem is that promisifyAll in bluebird 3 does not handle multiple arguments by default. This results in the .spread() call reporting a TypeError like the following:
TypeError: expecting an array or an iterable object but got [object Null]
To solve this, you can explicitly enable multiple arguments for connect / connectAsync. Do the following after all the promisifying stuff mentioned above:
...
pg.connectAsync = Promise.promisify(pg.connect, { multiArgs: true });
I suggest modifying Petka Antonov's solution a bit:
var Promise = require('bluebird');
var pg = require('pg');
Object.keys(pg).forEach(function (key) {
var Cls = null;
try {
Cls = pg[key];
if (typeof Cls === 'function') {
Promise.promisifyAll(Cls.prototype);
Promise.promisifyAll(Cls);
}
} catch (e) {
console.log(e);
}
});
Promise.promisifyAll(pg);
Here pg[key] is wrapped in a try/catch block because accessing pg['native'] can throw an error.

Conditionally call async function in Node

I have the following example code: the first part can result in an async call or not, but either way it should continue. I cannot put the rest of the code within the async callback since it needs to run when the condition is false. So how do I do this?
if (condition) {
  someAsyncRequest(function (error, result) {
    // do something then continue
  });
}
// do this next whether condition is true or not
I assume that putting the code that comes after into a function might be the way to go, and then calling that function either inside the async callback or in an else branch if the condition is false - but is there an alternative way that doesn't require breaking it up into functions?
A library I've used in Node quite often is Async (https://github.com/caolan/async). Last I checked this also has support for the browser so you should be able to npm / concat / minify this in your distribution. If you're using this on server-side only you should consider https://github.com/continuationlabs/insync, which is a slightly improved version of Async, with some of the browser support removed.
One of the common patterns I use when using conditional async calls is to populate an array with the functions I want to use in order and pass that to async.waterfall.
I've included an example below.
var tasks = [];
if (conditionOne) {
  tasks.push(functionOne);
}
if (conditionTwo) {
  tasks.push(functionTwo);
}
if (conditionThree) {
  tasks.push(functionThree);
}

async.waterfall(tasks, function (err, result) {
  // do something with the result.
  // if any function in the tasks passes an error to its callback, this
  // function is immediately called with err == <that error>
});

// function declarations are hoisted, so they already exist when pushed above
function functionOne(callback) {
  // do something
  // callback(null, some_result);
}
function functionTwo(previousResult, callback) {
  // do something with the previous result if needed
  // callback(null, previousResult, some_result);
}
function functionThree(previousResult, callback) {
  // do something with the previous result if needed
  // callback(null, some_result);
}
Of course you could use promises instead. In either case I like to avoid nesting callbacks by using async or promises.
Some of the things you can avoid by NOT using nested callbacks are variable collisions, hoisting bugs, "marching" to the right > > > >, hard-to-read code, etc.
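With promises the conditional branch can also be kept flat; a sketch, assuming someAsyncRequest either returns a promise or has been wrapped to return one:
Promise.resolve()
  .then(function () {
    if (condition) {
      return someAsyncRequest(); // only waited on when the condition holds
    }
  })
  .then(function (result) {
    // do this next whether condition is true or not
    // (result is undefined when the condition was false)
  });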
Just declare another function to run whenever you need it:
var otherFunc = function () {
  // do this next whether condition is true or not
};
if (condition) {
  someAsyncRequest(function (error, result) {
    // do something then continue
    otherFunc();
  });
} else {
  otherFunc();
}
Just another way to do it; this is how I abstracted the pattern. There might be some libraries (promises?) that handle the same thing.
function conditional(condition, conditional_fun, callback) {
  if (condition)
    return conditional_fun(callback);
  return callback();
}
And then in your code you can write
conditional(something === undefined,
function(callback) {
fetch_that_something_async(function() {
callback();
});
},
function() {
/// ... This is where your code would continue
});
I would recommend using ClojureScript, which has an awesome core.async library that makes life super easy when dealing with async calls.
In your case, you would write something like this:
(go
(when condition
(<! (someAsyncRequest)))
(otherCodeToHappenWhetherConditionIsTrueOrNot))
Note the go macro, which causes the body to run asynchronously, and the <! function, which blocks until the async function returns. Because the <! call is inside the when condition, it only blocks if the condition is true.

How to wait for all async calls to finish

I'm using Mongoose with Node.js and have the following code that will call the callback after all the save() calls have finished. However, I feel that this is a very dirty way of doing it and would like to see the proper way to get this done.
function setup(callback) {
// Clear the DB and load fixtures
Account.remove({}, addFixtureData);
function addFixtureData() {
// Load the fixtures
fs.readFile('./fixtures/account.json', 'utf8', function(err, data) {
if (err) { throw err; }
var jsonData = JSON.parse(data);
var count = 0;
jsonData.forEach(function(json) {
count++;
var account = new Account(json);
account.save(function(err) {
if (err) { throw err; }
if (--count == 0 && callback) callback();
});
});
});
}
}
You can clean up the code a bit by using a library like async or Step.
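For example, with async the manual counter in setup() can be replaced by async.each, which calls its final callback once every save has finished. A sketch of the same fixture-loading code, assuming the same Account model and fixture file:
var async = require('async');
var fs = require('fs');
function setup(callback) {
  Account.remove({}, function (err) {
    if (err) return callback(err);
    fs.readFile('./fixtures/account.json', 'utf8', function (err, data) {
      if (err) return callback(err);
      var jsonData = JSON.parse(data);
      async.each(jsonData, function (json, done) {
        new Account(json).save(done); // done(err) is called per document
      }, callback); // called once, after all saves complete (or on the first error)
    });
  });
}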
Also, I've written a small module that handles loading fixtures for you, so you just do:
var fixtures = require('./mongoose-fixtures');
fixtures.load('./fixtures/account.json', function(err) {
//Fixtures loaded, you're ready to go
});
Github:
https://github.com/powmedia/mongoose-fixtures
It will also load a directory of fixture files, or objects.
I did a talk about common asynchronous patterns (serial and parallel) and ways to solve them:
https://github.com/masylum/i-love-async
I hope it's useful.
I've recently created a simpler abstraction called wait.for to call async functions in sync mode (based on Fibers). It's at an early stage but works. It is at:
https://github.com/luciotato/waitfor
Using wait.for, you can call any standard nodejs async function, as if it were a sync function, without blocking node's event loop. You can code sequentially when you need it.
Using wait.for, your code will be:
//in a fiber
function setup(callback) {
  // Clear the DB and load fixtures
  wait.for(Account.remove, {});
  // Load the fixtures
  var data = wait.for(fs.readFile, './fixtures/account.json', 'utf8');
  var jsonData = JSON.parse(data);
  jsonData.forEach(function (json) {
    var account = new Account(json);
    wait.forMethod(account, 'save');
  });
  callback();
}
That's actually the proper way of doing it, more or less. What you're doing there is a parallel loop. You can abstract it into its own "async parallel forEach" function if you want (and many do), but that's really the only way of doing a parallel loop.
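Such an "async parallel forEach" helper is only a few lines; a minimal sketch:
// run iterator(item, done) for every item in parallel and call callback
// once all have finished, or as soon as one reports an error
function parallelForEach(items, iterator, callback) {
  var remaining = items.length;
  var failed = false;
  if (!remaining) return callback();
  items.forEach(function (item) {
    iterator(item, function (err) {
      if (failed) return;
      if (err) { failed = true; return callback(err); }
      if (--remaining === 0) callback();
    });
  });
}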
Depending on what you intended, one thing that could be done differently is the error handling. Because you're throwing, if there's a single error, that callback will never get executed (count won't be decremented). So it might be better to do:
account.save(function(err) {
if (err) return callback(err);
if (!--count) callback();
});
And handle the error in the callback. It's better node-convention-wise.
I would also change another thing to save you the trouble of incrementing count on every iteration:
var jsonData = JSON.parse(data)
, count = jsonData.length;
jsonData.forEach(function(json) {
var account = new Account(json);
account.save(function(err) {
if (err) return callback(err);
if (!--count) callback();
});
});
If you are already using underscore.js anywhere in your project, you can leverage the after method. You need to know in advance how many async calls will be made, but aside from that it's a pretty elegant solution.
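A sketch of how _.after would replace the manual counter from the question (same Account model and data as above):
var _ = require('underscore');
var jsonData = JSON.parse(data);
var done = _.after(jsonData.length, callback); // only fires on the final call
jsonData.forEach(function (json) {
  new Account(json).save(function (err) {
    if (err) return callback(err);
    done();
  });
});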
