Node.js-like API design principles and workflow

I'm designing and implementing a Node.js API to access an IBM mainframe from Ubuntu via the IBM 3270 protocol, using the x3270 tool. The Node.js process spawns an s3270 process and uses its stdin, stdout and stderr to communicate with the mainframe.
I've implemented the following interface:
var hs = require('./hs');

var session = hs.createSession(opts);

session.on('error', function (err) {
    console.log('ERROR: %s', err.message);
});

session.on('connect', function () {
    console.log('** connected');
    session.send('TRANS');
});

session.on('response', function (res) {
    console.log(res);
    session.disconnect();
});

session.on('disconnect', function () {
    console.log('** disconnected');
    session.close();
});

session.on('close', function () {
    console.log('** closed');
});

session.connect();
Everything is working very well.
The problem is the following. I would like to use the Q promise library to make the client code that uses my API more organized, and also to offer a Node.js-like API of the form session.send(trans, function (err, res) {…}). I don't see how to implement the send function so that it accepts a callback.
Generalizing my question: when designing a Node.js-like API, what should I implement first?
a simple send(trans) function that emits events, and then implement send(trans, function (err, res) {…}) on top of it, OR
send(trans, function (err, res) {…}) first (I don't know how), and then implement the events on top of that, OR
what is the correct way to implement a Node.js-like API?
What I'm looking for are the general workflow and design principles for designing a Node.js-like API that can also be consumed by the Q promise library.

As I realized, there are two approaches to designing an async API for Node.js:
callback-based, which can be implemented with EventEmitter
promise-based, which can be implemented with var d = Q.defer();, return d.promise; and d.resolve(); from the Q library
I implemented my API with the promise-based approach using the Q library, purely to keep my code organized. Furthermore, Q has functions such as Q.nfcall(), Q.nfapply() and Q.nfbind() to convert a callback-based Node.js API into its promise-based equivalent.
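For example, a Node-style callback form of send can be layered on top of the event-emitting core, and Q can then lift that into a promise. A minimal sketch, assuming only one transaction is in flight per session at a time (sendCb is a hypothetical wrapper, not part of the API above):

function sendCb(session, trans, cb) {
    function onResponse(res) {
        session.removeListener('error', onError);
        cb(null, res);
    }
    function onError(err) {
        session.removeListener('response', onResponse);
        cb(err);
    }
    // one-shot listeners pair this send with the next response or error
    session.once('response', onResponse);
    session.once('error', onError);
    session.send(trans);
}

// with the callback form in place, Q converts it to a promise:
var Q = require('q');
Q.nfcall(sendCb, session, 'TRANS')
    .then(function (res) {
        console.log(res);
    })
    .fail(function (err) {
        console.log('ERROR: %s', err.message);
    });

This way the event-based core, the callback API and the promise API all share the same implementation.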

Related

Create RESTful APIs with event-driven architecture in Node.js?

Hi, I'm a newbie in Node.js. As far as I'm concerned, Node.js is event-driven, which is one of its most powerful features.
I have been learning Node.js for the last few days and am trying to build RESTful APIs with MongoDB, but I'm not able to use its event-driven architecture in my APIs. Below is my pseudo code:
//routes
app.get('/someUrl', SomeClass.executeSomeController);
//controller
class SomeClass {
async executeSomeController(req, res){
let response = awaitSomeHelper.executeQueryAndBusinessLogic(req.body);
res.send(response)
}
}
As per my understanding, I have written normal code, the same as I would write with RoR or PHP. The only difference I found is that the controller runs asynchronously, which does not happen in RoR or PHP.
How can I use event-driven architecture to build RESTful APIs?
Hope I can cover your question. The term 'event-driven architecture' can mean different things in different contexts. In one case it refers to the basic core Node.js flow that underlies all async functions; in another, the question is really about events, the event emitter, etc.
But the main idea is that you have to wait for the results of asynchronous actions: to avoid blocking the thread, Node.js keeps running the rest of your code without waiting for heavy requests, so we have to know how to handle this async functionality.
Basic Async Flow
As I understand it, your questions relate to async operations in Node.js. That's the root of the technology - all heavy operations are handled asynchronously. It's all about V8 and the Event Loop.
So, in order to work with asynchronous operations, you may use callback functions, promises, or async-await syntax.
Callback Functions
function asyncFunction(params, callback) {
    // do async stuff, then report the outcome through the callback
    callback(err, result);
}

function callbackFunction(err, result) {
    // handle the error and/or result here
}

asyncFunction(params, callbackFunction);
Promises
promiseFunction()
    .then(anotherPromiseFunction)
    .then((result) => {
        // handle result
    })
    .catch((err) => {
        // handle error
    });
async-await
async function anotherAsyncFunction() {
    // do async stuff; an async function always returns a promise
}

const asyncFunction = async (params) => {
    const result = await anotherAsyncFunction();
    return result;
};
Events/Event Emitter
const fs = require('fs');

const filePath = './path/to/your/file';
const stream = fs.createReadStream(filePath);

stream.on('data', (data) => {
    // do something with the chunk
});

stream.on('end', () => {
    // do something when the stream finishes
});

stream.on('error', (err) => {
    // do something with the error
});
You may use these methods depending on the situation and your needs. I recommend skipping callback functions, as we have more modern ways to work with async flow (promises and async-await). By the way, async-await returns promises as well.
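If by 'event-driven architecture' you mean decoupling controllers from business logic with your own events, a minimal sketch using the core events module could look like this (jobBus and the 'job:requested' event are made-up names; SomeHelper is the helper from your pseudo code):

const EventEmitter = require('events');
const express = require('express');

const app = express();
const jobBus = new EventEmitter();

// business logic subscribes to events instead of being called directly
jobBus.on('job:requested', async (payload, done) => {
    try {
        const result = await SomeHelper.executeQueryAndBusinessLogic(payload);
        done(null, result);
    } catch (err) {
        done(err);
    }
});

// the controller only emits the event and replies when the work is done
app.get('/someUrl', (req, res) => {
    jobBus.emit('job:requested', req.body, (err, result) => {
        if (err) return res.status(500).send(err.message);
        res.send(result);
    });
});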
Here is an example of a simple Express JS server (pretty old syntax, but still valid). Please feel free to check it and ask questions:
https://github.com/roman-sachenko/express-entity-based
Here is a list of articles I'd recommend you:
https://blog.risingstack.com/node-js-at-scale-understanding-node-js-event-loop/
https://blog.risingstack.com/mastering-async-await-in-nodejs/

How to return promise to the router callback in NodeJS/ExpressJS

I am new to Node.js/Express.js and MongoDB. I am trying to create an API that exposes data to my mobile app, which I am building with the Ionic framework.
I have a route setup like this
router.get('/api/jobs', (req, res) => {
    JobModel.getAllJobsAsync().then((jobs) => res.json(jobs)); // IS THIS THE CORRECT WAY?
});
I have a function in my model that reads data from MongoDB. I am using the Bluebird promise library to convert my model functions into ones that return promises.
const JobModel = Promise.promisifyAll(require('../models/Job'));
My function in the model
static getAllJobs(cb) {
    MongoClient.connectAsync(utils.getConnectionString()).then((db) => {
        const jobs = db.collection('jobs');
        jobs.find().toArray((err, jobs) => {
            if (err) {
                return cb(err);
            }
            return cb(null, jobs);
        });
    });
}
Promise.promisifyAll(myModule) creates an Async-suffixed copy of this function that returns a promise.
What I am not sure about is:
Is this the correct approach for returning data from my model to the route callback function?
Is it efficient?
Is promisifyAll slow? It loops through all the functions in the module and creates a copy of each with an Async suffix that returns a promise. When does it actually run? This is a more generic question related to Node's require statements; see the next point.
When do all the require statements run? When I start the Node.js server, or when I make a call to the API?
Your basic structure is more-or-less correct, although your use of Promise.promisifyAll seems awkward to me. The basic issue for me (and it's not really a problem - your code looks like it will work) is that you're mixing and matching promise-based and callback-based asynchronous code. That should still work, but I would prefer to stick to one style as much as possible.
If your model class is your code (and not some library written by someone else), you could easily rewrite it to use promises directly, instead of writing it for callbacks and then using Promise.promisifyAll to wrap it.
Here's how I would approach the getAllJobs method:
static getAllJobs() {
    // connect to the Mongo server
    return MongoClient.connectAsync(utils.getConnectionString())
        // ...then do something with the collection
        .then((db) => {
            // get the collection of jobs
            const jobs = db.collection('jobs');

            // I'm not that familiar with Mongo - I'm going to assume that
            // the call to `jobs.find().toArray()` is asynchronous and only
            // available in the "callback flavored" form.

            // returning a new Promise here (in the `then` block) allows you
            // to add the results of the asynchronous call to the chain of
            // `then` handlers. The promise will be resolved (or rejected)
            // when the results of the `jobs.find().toArray()` method are
            // known
            return new Promise((resolve, reject) => {
                jobs.find().toArray((err, jobs) => {
                    if (err) {
                        return reject(err);
                    }
                    resolve(jobs);
                });
            });
        });
}
This version of getAllJobs returns a promise which you can chain then and catch handlers to. For example:
JobModel.getAllJobs()
    .then((jobs) => {
        // this is the object passed into the `resolve` call in the callback
        // above. Do something interesting with it, like
        res.json(jobs);
    })
    .catch((err) => {
        // this is the error passed into the call to `reject` above
    });
Admittedly, this is very similar to the code you have above. The only difference is that I dispensed with Promise.promisifyAll - if you're writing the code yourself and you want to use promises, then write it with promises directly.
One important note: it's a good idea to include a catch handler. If you don't, your error will be swallowed up and disappear, and you'll be left wondering why your code isn't working. Even if you don't think you'll need it, just write a catch handler that dumps the error to console.log. You'll be glad you did!
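Putting the pieces together, the route from the question could consume the promise-returning model like so (responding with a 500 here is just one reasonable choice for the error case):

router.get('/api/jobs', (req, res) => {
    JobModel.getAllJobs()
        .then((jobs) => res.json(jobs))
        .catch((err) => {
            console.log(err); // at the very least, make the failure visible
            res.status(500).json({ error: 'could not fetch jobs' });
        });
});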

Best practices of db connection pool handling in a node js app?

I'm referring to the node-postgres package below, but I guess this question is rather generic.
There is this trivial example where you 1) acquire (connect) a connection (client) from the pool in the top-level HTTP request handler, 2) do all the business inside that handler and 3) release it back to the pool when you're done.
I guess it works fine for that example, but as soon as your app grows somewhat bigger, this quickly becomes painful.
I'm thinking of these two options, but I'm not quite sure...
do the "get client + work + release client" approach everywhere I need to talk to db.
This seems like a good choice, but will it not lead to eating up more than one connection/client per the top http request (there are parallel async db calls in many places in my project)?
try to assign a globaly shared reference to one client/connection accessible via require()
Is this a good idea and actually reasonably doable? Is it possible to nicely handle the "back to the pool release" in all ugly cases (errors in parallel async stuff for example)?
Thank you.
Well, I lost some time trying to figure that out. In the end, after some consideration and influenced by John Papa's code, I decided to use a database module like this:
var Q = require('q');
var MongoClient = require('mongodb').MongoClient;

module.exports.getDb = getDb;

var db = null;

function getDb() {
    return Q.promise(theDb);

    function theDb(resolve, reject, notify) {
        if (db) {
            // reuse the cached connection
            resolve(db);
        } else {
            // mongourl and mongoOptions are defined elsewhere
            MongoClient.connect(mongourl, mongoOptions, function (err, theDb) {
                if (err) {
                    return reject(err);
                }
                db = theDb; // cache it for subsequent calls
                resolve(db);
            });
        }
    }
}
So, when I need to perform a query:
getDb().then(function (db) {
    // perform query here
});
At least for MongoDB this is good practice, as seen here.
The best advice depends on the type of database and the underlying framework/driver for it.
In the case of Postgres, the basic framework/driver is node-postgres, which has embedded support for connection pooling. That support is, however, low-level.
For high-level access see pg-promise, which provides automatic connection management, support for tasks, transactions and much more.
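As a rough sketch of the pg-promise style (the connection string and the query are placeholders):

const pgp = require('pg-promise')();
const db = pgp('postgres://localhost/postgres');

// pg-promise acquires a pooled connection for the query and releases it
// automatically when the query settles
db.any('SELECT version();')
    .then((rows) => {
        // do something with rows
    })
    .catch((err) => {
        // handle the error
    });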
Here is what has worked well for me.
var pg = require('pg');

var config = { pg: 'postgres://localhost/postgres' };

pg.connect(config.pg, function (err, client, done) {
    client.query('SELECT version();', function (err, results) {
        done(); // return the client to the pool
        // do something with results.rows
    });
});
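Alternatively, with newer versions of node-postgres the pool itself can run single queries and handle the acquire/release for you, which keeps that boilerplate out of the handlers. A sketch, with db.js as a hypothetical shared module:

// db.js - one shared pool for the whole app, required wherever needed.
// pool.query checks a client out, runs the query and releases the client
// back to the pool for you.
const { Pool } = require('pg');
const pool = new Pool({ connectionString: 'postgres://localhost/postgres' });
module.exports.query = (text, params) => pool.query(text, params);

// elsewhere:
const db = require('./db');
db.query('SELECT version();')
    .then((results) => {
        // do something with results.rows
    });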

Abstracting the superagent

Our application consists of Node.js, Express, React, and newforms.
To make REST calls we are using:
var RestClient = require('superagent-ls')
And we make REST calls like this:
cleanBirthDate(callback) {
    var {birthDate} = this.cleanedData
    var formattedDob = moment(birthDate).format('DDMMYYYY')
    RestClient.get(Global.getBirthDateServiceUrl() + '/' + formattedDob)
        .end((err, res) => {
            if (err) {
                callback(err)
            } else if (res.clientError) {
                var message = errorsMappingSwitch(res.body.error)
                callback(null, forms.ValidationError(message))
            } else {
                callback(null)
            }
        })
},
We want to move the RestClient-related code to our own file, say RestClient.js, and then require it and use it across the application. By doing so we can apply some generalized code (like error handling, logging, and redirecting to specific error pages depending on the error code) in one place.
Appreciate any help in this direction.
I did the exact same thing you're asking about (even using superagent). I created modules with the API code in a /utils folder and required them where applicable. For even more abstraction we're using CoffeeScript to create classes that inherit from a BaseAPIObject and are invoked using something like API.Posts.getAll().end(), etc.
This article was very helpful in understanding how to write your own modules: Export This: Interface Design Patterns for Node.js Modules.
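As a concrete starting point, RestClient.js could wrap superagent-ls once and centralize the logging and error handling; callers would keep only app-specific logic such as the errorsMappingSwitch mapping. A minimal sketch (the get helper is a made-up name):

// RestClient.js - one place for logging and generalized error handling
var request = require('superagent-ls');

module.exports.get = function (url, callback) {
    request.get(url).end(function (err, res) {
        if (err) {
            console.error('GET %s failed', url); // centralized logging
            return callback(err);
        }
        callback(null, res);
    });
};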
You can always require it like this:
RestClient.js
export function callApi(callback) {
    // your REST code goes here;
    // invoke `callback` inside the callback of your REST call
}
app.js
import { callApi } from './RestClient';

callApi((err, result) => {
    if (err) console.log(err);
});

Node.js event driven paradigm = messy code?

I am coming from a PHP background and am now trying to get used to the event-driven paradigm of Node.js. However, my code quickly gets messy. Below I compare procedural code with actual Node.js Redis code. Am I doing this right?
PROCEDURAL (pseudo code)
if (!client.get("user:name:koen")) {
    client.set("user:name:koen", "user:id:" + client.incr("count:users"));
}
EVENT DRIVEN (actual code)
client.get("user:name:koen", function(err, res) {
if(!res){
client.incr("count:users", function(err, count){
client.set("user:name:koen", "user:id:" + count, function (err, res) {
callback(err, res);
});
});
}
});
Callback hell, mentioned in the question, is greatly explained here, as well as how to write code that avoids it:
http://callbackhell.com/
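Applying that article's main advice (give your functions names and keep the nesting shallow) to the Redis snippet above gives something like the following sketch; createUser is a hypothetical wrapper around the same logic, with the error handling the original omitted:

function createUser(client, name, callback) {
    client.get(name, checkExisting);

    function checkExisting(err, res) {
        if (err) return callback(err);
        if (res) return callback(null, res); // user already exists
        client.incr('count:users', assignId);
    }

    function assignId(err, count) {
        if (err) return callback(err);
        client.set(name, 'user:id:' + count, callback);
    }
}

createUser(client, 'user:name:koen', function (err, res) {
    // one flat place to handle errors and the result
});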
