Hi, I'm a newbie in Node.js. As far as I understand, Node.js is event-driven, which is one of its most powerful features.
I have been learning Node.js for the last few days and trying to build RESTful APIs with it and MongoDB, but I'm not able to use its event-driven architecture in my APIs. Below is my pseudocode:
//routes
app.get('/someUrl', SomeClass.executeSomeController);
//controller
class SomeClass {
async executeSomeController(req, res){
let response = await SomeHelper.executeQueryAndBusinessLogic(req.body);
res.send(response)
}
}
As per my understanding, I have written normal code, the way I used to write it in RoR or PHP. The only difference I found is that the controller runs asynchronously, which does not happen in RoR or PHP.
How can I use event-driven architecture to build RESTful APIs?
I hope I can cover your question. The term 'event-driven architecture' can be understood in a couple of ways. In one sense it refers to the core asynchronous flow of Node.js, which underlies all async functions. In another sense, the question may be about events proper: the event emitter and friends.
Either way, the main idea is that you have to wait for asynchronous actions to complete. To avoid blocking its single thread, Node.js moves on and handles the rest of your code without waiting for heavy requests, so we have to know how to handle this async functionality.
Basic Async Flow
As I understand it, your questions relate to async operations in Node.js. That's the root of the technology: all heavy operations are handled asynchronously. It's all about V8 and the event loop.
So, in order to work with asynchronous operations, you may use callback functions, promises, or async-await syntax.
Callback Functions
function asyncFunction(params, callback) {
//do async stuff
callback(err, result);
}
function callbackFunction(err, result) {
//handle err and result
}
asyncFunction(params, callbackFunction);
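Concretely, the error-first convention looks like this with a simulated async operation (asyncDouble is a made-up stand-in for something like a database query):

```javascript
// A sketch of the error-first callback convention; setTimeout simulates
// an async operation such as a database query.
function asyncDouble(n, callback) {
  setTimeout(() => {
    if (typeof n !== 'number') {
      callback(new Error('not a number')); // error-first: pass the error
      return;
    }
    callback(null, n * 2); // success: the error slot is null
  }, 10);
}

asyncDouble(21, (err, result) => {
  if (err) throw err;
  console.log(result); // 42
});
```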
Promises
promiseFunction()
.then(anotherPromiseFunction)
.then((result) => {
//handle result
})
.catch((err) => {
//handle error
});
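A runnable sketch of the same chain, with delayedSquare standing in for any promise-returning operation:

```javascript
// Each `then` receives the previous step's resolved value; a single
// `catch` at the end handles a rejection from anywhere in the chain.
function delayedSquare(n) {
  return new Promise((resolve, reject) => {
    if (typeof n !== 'number') return reject(new Error('not a number'));
    setTimeout(() => resolve(n * n), 10);
  });
}

delayedSquare(3)
  .then((nine) => delayedSquare(nine)) // chain another async step
  .then((result) => {
    console.log(result); // 81
  })
  .catch((err) => {
    console.error('failed:', err.message);
  });
```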
async-await
function anotherAsyncFunction() {
//do async stuff
}
const asycnFunction = async (params) => {
const result = await anotherAsyncFunction();
return result;
};
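The same promise-returning helper consumed with async-await (delayedSquare is again a made-up stand-in); `await` pauses the function, not the process, and rejections surface as throwable errors:

```javascript
// async-await over a promise-returning function; try/catch or a final
// .catch() replaces the promise chain's error handler.
function delayedSquare(n) {
  return new Promise((resolve) => setTimeout(() => resolve(n * n), 10));
}

const main = async () => {
  const nine = await delayedSquare(3);
  const result = await delayedSquare(nine);
  console.log(result); // 81
  return result;
};

main().catch((err) => console.error(err));
```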
Events/Event Emitter
const fs = require('fs');
const filePath = './path/to/your/file';
const stream = fs.createReadStream(filePath);
stream.on('data', (data) => {
//do something
});
stream.on('end', () => {
//do something;
});
stream.on('error', (err) => {
//do something;
});
You may use any of these methods depending on the situation and your needs. I recommend skipping callback functions, as we have more modern ways to work with async flow (promises and async-await). By the way, async functions return promises as well.
Here is an example of a simple Express.js server (pretty old syntax, but still valid). Please feel free to check it out and ask questions:
https://github.com/roman-sachenko/express-entity-based
Here is a list of articles I'd recommend you:
https://blog.risingstack.com/node-js-at-scale-understanding-node-js-event-loop/
https://blog.risingstack.com/mastering-async-await-in-nodejs/
Related
I am new to Node.js/Express.js and MongoDB. I am trying to create an API that exposes data to a mobile app that I am building with the Ionic framework.
I have a route set up like this:
router.get('/api/jobs', (req, res) => {
JobModel.getAllJobsAsync().then((jobs) => res.json(jobs)); //IS THIS THe CORRECT WAY?
});
I have a function in my model that reads data from MongoDB. I am using the Bluebird promise library to convert my model functions to return promises.
const JobModel = Promise.promisifyAll(require('../models/Job'));
My function in the model
static getAllJobs(cb) {
MongoClient.connectAsync(utils.getConnectionString()).then((db) => {
const jobs = db.collection('jobs');
jobs.find().toArray((err, jobs) => {
if(err) {
return cb(err);
}
return cb(null, jobs);
});
});
}
The promisifyAll(myModule) call converts this function to return a promise.
What I am not sure about is:
Is this the correct approach for returning data to the route's callback function from my model?
Is this efficient?
Is using promisifyAll slow? It loops through all the functions in the module and creates a copy of each one with an Async suffix that now returns a promise. When does it actually run? This is a more generic question related to Node require statements; see the next point.
When do all the require statements run? When I start the Node.js server, or when I make a call to the API?
Your basic structure is more-or-less correct, although your use of Promise.promisifyAll seems awkward to me. The basic issue for me (and it's not really a problem - your code looks like it will work) is that you're mixing and matching promise-based and callback-based asynchronous code. Which, as I said, should still work, but I would prefer to stick to one as much as possible.
If your model class is your code (and not some library written by someone else), you could easily rewrite it to use promises directly, instead of writing it for callbacks and then using Promise.promisifyAll to wrap it.
Here's how I would approach the getAllJobs method:
static getAllJobs() {
// connect to the Mongo server
return MongoClient.connectAsync(utils.getConnectionString())
// ...then do something with the collection
.then((db) => {
// get the collection of jobs
const jobs = db.collection('jobs');
// I'm not that familiar with Mongo - I'm going to assume that
// the call to `jobs.find().toArray()` is asynchronous and only
// available in the "callback flavored" form.
// returning a new Promise here (in the `then` block) allows you
// to add the results of the asynchronous call to the chain of
// `then` handlers. The promise will be resolved (or rejected)
// when the results of the `job().find().toArray()` method are
// known
return new Promise((resolve, reject) => {
jobs.find().toArray((err, jobs) => {
if(err) {
return reject(err); // don't fall through to resolve after an error
}
resolve(jobs);
});
});
});
}
This version of getAllJobs returns a promise to which you can chain then and catch handlers. For example:
JobModel.getAllJobs()
.then((jobs) => {
// this is the object passed into the `resolve` call in the callback
// above. Do something interesting with it, like
res.json(jobs);
})
.catch((err) => {
// this is the error passed into the call to `reject` above
});
Admittedly, this is very similar to the code you have above. The only difference is that I dispensed with the use of Promise.promisifyAll - if you're writing the code yourself & you want to use promises, then do it yourself.
One important note: it's a good idea to include a catch handler. If you don't, your error will be swallowed up and disappear, and you'll be left wondering why your code isn't working. Even if you don't think you'll need it, just write a catch handler that dumps the error to console.log. You'll be glad you did!
How do I make a chained function wait for the function before it, to execute properly?
I have this excerpt from my module:
var ParentFunction = function(){
this.userAgent = "SomeAgent";
return this;
}
ParentFunction.prototype.login = function(){
var _this = this;
request.post(
url, {
"headers": {
"User-Agent": _this.userAgent
}
}, function(err, response, body){
return _this;
})
}
ParentFunction.prototype.user = function(username){
this.username = username;
return this;
}
ParentFunction.prototype.exec = function(callback){
var _this = this;
request.post(
anotherURL, {
"headers": {
"User-Agent": _this.userAgent
}
}, function(err, response, body){
callback(body);
})
}
module.exports = ParentFunction;
And this is from within my server:
var pF = require("./myModule.js"),
parentFunction = new pF();
parentFunction.login().user("Mobilpadde").exec(function(data){
res.json(data);
});
The problem is, that the user-function won't wait for login to finish (Meaning, it executes before the login returns _this). So how do I make it wait?
You can't make JavaScript "wait" before executing the next call in the chain. The whole chain executes immediately, in sequential order, without waiting for any async operations to complete.
The only way I can think of to make this structure work is to create a queue of things waiting to execute and then somehow monitor the things in that queue that are async so you know when to execute the next thing in the queue. This requires making each method follow some sort of standard convention for knowing both whether the method is async and if it is async when the async operation is done and what to do if there's an error in the chain.
jQuery does something like this for jQuery animations (it implements a queue for all chained animations and calls each animation in turn when the previous animation is done). But, its implementation is simpler than what you have here because it works only with jQuery animations (or manually queued functions that follow the proper convention), not with other jQuery methods. You are proposing a mix of three different kinds of methods, two of which are not even async.
So, to make a long story short, what you are asking for could likely be done if you make all methods follow a set of conventions, but it's not easy. Since you appear to have only one async operation here, I'm wondering if you could do something like this instead. You don't show what the .exec() method does, but if all it does is call some other function at the end of the chain, then you'd only have one async method in the chain, so you could just let it take a callback and do this:
parentFunction.user("Mobilpadde").login(function(data) {
res.json(data);
});
I was working on a queued means of doing this, but it is not something I can write and test in less than an hour and you'd have to offer some ideas for what you want to do with errors that occur anywhere in the chain. Non-async errors could just throw an exception, but async errors or even non-async errors that occur after an async operation completes can't just throw because there's no good way to catch them. So, error handling becomes complex and you really shouldn't embark on a design path that doesn't anticipate appropriate error handling.
The more I think about this, the more I think you either want to get a library designed to handle the sequencing of async operations (such as the async module) and use that for your queueing of operations or you want to just give up on the chaining and use promises to support sequencing of your operations. As I was thinking about how to do error handling in the task queue with async operations, it occurred to me that all these problems have already been dealt with in promises (propagation of errors through reject and catching of exceptions in async handlers that are turned into promise rejections, etc...). Doing error handling right with async operations is difficult and is one huge reason to build off of promises for sequencing async operations.
So, now thinking about solving this using promises, what you could do is make each async method return a promise (you can promisify the entire request module with one call in many promise libraries, such as Bluebird). Then, once .login() and .exec() return promises, you can do this:
var Promise = require('bluebird');
var request = Promise.promisifyAll(require('request'));
ParentFunction.prototype.login = function(){
return request.postAsync(
url, {
"headers": {
"User-Agent": this.userAgent
}
});
}
ParentFunction.prototype.exec = function(){
return request.postAsync(
anotherURL, {
"headers": {
"User-Agent": this.userAgent
}
}).spread(function(response, body) {
return body;
})
}
parentFunction.login().then(function() {
parentFunction.user("Mobilpadde");
return parentFunction.exec().then(function(data) {
res.json(data);
});
}).catch(function(err) {
// handle errors here
});
This isn't chaining, but it gets you going in minutes, rather than with something that would probably take quite a while to write (with robust error handling).
Try this and see if it works:
ParentFunction.prototype.login = function(callback){
var _this = this;
request.post(
url, {
"headers": {
"User-Agent": _this.userAgent
}
}, function(err, response, body){
return callback(_this);
})
}
On the server side:
parentFunction.login(function(loggedin){
loggedin.user("Mobilpadde").exec(function(data){
res.json(data);
});
});
Folks,
I have the following function, and am wondering what's the correct way to call callback() only when the database operation completes on all items:
function mapSomething (callback) {
_.each(someArray, function (item) {
dao.dosomething(item.foo, function (err, account) {
item.email = account.email;
});
});
callback();
},
What I need is to iterate over someArray and make a database call for each element. Only after all the items in the array have been updated should the callback be called. Of course, the callback is in the incorrect place right now.
Thanks!
The way you currently have it, callback is executed before any of the (async) tasks finish.
The async module has an each() that allows for a final callback:
var async = require('async');
// ...
function mapSomething (callback) {
async.each(someArray, function(item, cb) {
dao.dosomething(item.foo, function(err, account) {
if (err)
return cb(err);
item.email = account.email;
cb();
});
}, callback);
}
The code in your question will not wait for all the database calls to be done before calling callback(). It launches all the database calls at once in parallel (I'm assuming that's what dao.dosomething() is) and then immediately calls callback(), before any of the database calls have finished.
You have several choices to solve the problem.
You can use promises (by promisifying the database call) and then use Promise.all() to wait for all the database calls to be done.
You can use the async library to manage the coordination for you.
You can keep track of when each one is done yourself and when the last one is done, call your callback.
I would recommend option 1 or 2. Personally, I prefer to use promises, and since you're interfacing with a database, this is probably not the only place you're making database calls, so I'd promisify the interface (Bluebird will do that for you in one function call) and then use promises.
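For completeness, option 3 (manual bookkeeping) can be sketched like this; the names follow the question, with the array and the database call passed in as parameters to keep the sketch self-contained:

```javascript
// Option 3: count completions by hand and fire the callback only after
// the last async task reports back. `dosomething(foo, cb)` stands in for
// the error-first database call from the question.
function mapSomething(someArray, dosomething, callback) {
  let remaining = someArray.length;
  let failed = false;
  if (remaining === 0) return callback(null); // nothing to do
  someArray.forEach((item) => {
    dosomething(item.foo, (err, account) => {
      if (failed) return;                     // an error was already reported
      if (err) { failed = true; return callback(err); }
      item.email = account.email;
      if (--remaining === 0) callback(null);  // last one finished
    });
  });
}
```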
Here's what a promise solution could look like:
var Promise = require('bluebird');
// make promise version of your DB function
// ideally, you'd promisify the whole DB API with .promisifyAll()
var dosomething = Promise.promisify(dao.dosomething, dao);
function mapSomething (callback, errCallback) {
Promise.all(_.map(someArray, function(item) {
return dosomething(item.foo).then(function (account) {
item.email = account.email;
});
})).then(callback, errCallback);
}
This assumes you want to run all the DB calls in parallel and then call the callback when they are all done.
FYI, here's a link to how Bluebird promisifies existing APIs. I use this mechanism for all async file I/O in Node and it saves a ton of coding time and makes error handling much saner. Async callbacks are a nightmare for proper error handling, especially if exceptions can be thrown from async callbacks.
P.S. You may actually want your mapSomething() function to just return a promise itself so the caller is then responsible for specifying their own .then() handler and it allows the caller to use the returned promise for their own synchronization with other things (e.g. it's just more flexible that way).
function mapSomething() {
return Promise.all(_.map(someArray, function(item) {
return dosomething(item.foo).then(function (account) {
item.email = account.email;
});
}));
}
mapSomething().then(mapSuccessHandler, mapErrorHandler);
I haven't tried Bluebird's .map() myself, but once you've promisified the database call, I think it would simplify it a bit more like this:
function mapSomething() {
return Promise.map(someArray, function(item) {
return dosomething(item.foo).then(function (account) {
item.email = account.email;
});
})
}
mapSomething().then(mapSuccessHandler, mapErrorHandler);
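Bluebird's Promise.map also accepts a {concurrency} option to cap how many database calls run at once. With native promises, the same cap can be hand-rolled, roughly like this sketch (mapLimit is a made-up helper name):

```javascript
// A minimal concurrency-limited map using native promises; Bluebird's
// Promise.map(arr, fn, {concurrency: n}) provides the same idea built in.
function mapLimit(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const i = next++;               // claim the next unprocessed index
      results[i] = await fn(items[i]);
    }
  }
  // start at most `limit` workers; each pulls items until none remain
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  return Promise.all(workers).then(() => results);
}
```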
I am pretty new to Mongoose, so please bear with me.
Is there a way to perform two queries in "parallel", or at least query two documents and return their results together? The callback notation is tripping me up a little with the synchronization.
In pseudocode, this is what I am looking for:
function someWork(callback) {
var task1 = service.doQueryAndReturnTask();
var task2 = service.doQueryAndReturnTask();
waitAll(task1, task2);
callback(task1, task2);
}
I know this is not the solution, due to the need for a callback on doQueryAndReturnTask, but I need a pattern that works and preferably doesn't chain the callbacks.
It's not about Mongoose. Node.js is asynchronous by nature, so it allows you to execute any number of async tasks (e.g. database queries) at the same time.
What you need is some lib to handle asynchronous control flow, like async.js or when.js:
var when = require('when');
var someWork = function(callback) {
when.all([
collection1.find(query1).exec(),
collection2.find(query2).exec()
]).spread(callback)
.otherwise(function(err) {
// something went wrong
});
};
when.js is a module to handle promises. So, if you don't need promises, you may use async.js instead:
var async = require('async');
var someWork = function(callback) {
async.parallel([
function(cb) { collection1.find(query1, cb) },
function(cb) { collection2.find(query2, cb) }
], function(err, res) {
if (!err) return callback.apply(null, res);
// something went wrong
});
};
Update: promises are an alternative way to handle asynchronous control flow.
Usually, to get the results of some asynchronous function, you pass it a callback which will be executed somewhere in the future.
When you're using promises, instead of passing a callback you immediately get a promise of the results of the execution, which will be resolved somewhere in the future.
So, promises allow you to work with asynchronous functions in a synchronous style, using promises in place of the real data. Promises also allow you to wait for the results at any point of the execution.
In my example, I'm executing two queries and getting two promises for their results. Then I'm telling Node to wait until both promises are fulfilled, passing their results to the callback function afterwards.
You can read the Promises/A+ specification here. You may also look at the when.js API docs.
Nowadays, this could be achieved using Promise.all:
Promise.all([
collection1.find({foo: 'bar'}),
collection2.find({fooey: 'bazzy'})
]).then(([fooResults, fooeyResults]) => {
console.log('results: ', fooResults, fooeyResults);
}).catch((err) => {
console.log('Error: ', err);
});
I'm designing and implementing an API for Node.js to access an IBM mainframe from Ubuntu via the IBM 3270 protocol, using the x3270 tool. The Node.js process spawns an s3270 process and uses its stdin, stdout and stderr to communicate with the mainframe.
I've implemented the following interface:
var hs = require('./hs');
var session = hs.createSession(opts);
session.on('error', function(err) {
console.log('ERROR: %s', err.message);
});
session.on('connect', function() {
console.log('** connected');
session.send('TRANS');
});
session.on('response', function(res) {
console.log(res);
session.disconnect();
});
session.on('disconnect', function() {
console.log('** disconnected');
session.close();
});
session.on('close', function() {
console.log('** closed');
});
session.connect();
Everything is working very well.
The problem is the following. I would like to use the Q promise library to make the client code that uses my API more organized, and also to have a Node.js-style API in the form session.send(trans, cb(err, res) {}). I don't see how I should implement the send function so that it accepts a callback.
Generalizing my question: when designing a Node.js-style API, what should I implement first:
a simple send(trans) function that emits events, and on top of this implement send('trans', cb(err, res) {}), OR
implement send('trans', cb(err, res) {}) first (I don't know how) and then implement events, OR
what is the correct way to implement a Node.js-style API?
What I'm looking for are the general workflow and design principles for designing a Node.js-style API that can also be consumed by the Q promise library.
As I realized, there are two approaches to designing an async API for Node.js:
callback-based, which can be implemented with an EventEmitter
promise-based, which can be implemented with var d = Q.defer();, return d.promise; and d.resolve(); from the Q library
I implemented my API with the promise-based approach using the Q library, purely to keep my code more organized. Furthermore, the Q library provides functions such as Q.nfcall(), Q.nfapply() and Q.nfbind() to convert a callback-based Node.js API to its promise-based equivalent.
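The event-to-promise bridge itself can be sketched like this; session with its send() method and 'response'/'error' events is the hypothetical API from the question, and native promises stand in for Q's deferreds (with Q you would create a deferred and resolve/reject it in the same handlers):

```javascript
// Wrap the event-based session in a promise-returning send. `session` is
// assumed to be an EventEmitter with a send() method and 'response' and
// 'error' events, as in the question. A production version would also
// remove the unused listener once the promise settles.
function sendAsync(session, trans) {
  return new Promise((resolve, reject) => {
    session.once('response', resolve); // fulfil with the next response
    session.once('error', reject);     // reject if an error fires first
    session.send(trans);
  });
}
```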