Node Custom error handler middleware optimization?

Hi,
I am working on an enabler within a Node.js platform that exposes a REST API. I need to implement some custom error handlers that will apply some business logic.
Basically, I have to handle a POST request whose content is encoded as application/json. I expect to receive an array of objects, each representing a service. Each item of the array is an object with the following structure:
{
  "code": "sampleCode",
  "id": "someId",
  "status": "someStatus"
}
In terms of business logic, each service belongs to a family, and each family has some configuration stored in a properties file.
Once I receive the request, I need to check the body content and apply some custom rules, which will throw custom errors.
The rules are:
"code" value can't be empty.
"code" value must be in a list defined in the properties file.
If the "code" value belongs to family foo or bar, "id" can't be empty.
I implemented some small unit functions using lodash to run these checks against the collection.
My main concern is: do I need to go async for this processing? For now I am just calling the functions as they are. As I am quite new to Node.js, and especially to best practices in a corporate environment, I would like to have opinions from more experienced users.
Should custom error handlers implementing business logic be async, or is there no benefit?

Node.js is fundamentally asynchronous, so in general you don't do sync stuff.
If you use Express or similar, you should create a kind of "filter" that runs before the request handler, something like:
function validator(req, res, next) {
  // my validation code
  next();
}

app.post('/my/endpoint', validator, function (req, res) {
  res.send();
});
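For the rules in the question, the checks are purely in-memory (no I/O), so there is no benefit in making them async. A minimal sketch of what the validator could look like, assuming hypothetical config values (ALLOWED_CODES, codeToFamily) loaded from the properties file:

const _ = require('lodash');

// Hypothetical values loaded from the properties file.
const ALLOWED_CODES = ['svcA', 'svcB', 'svcC'];
const FAMILIES_REQUIRING_ID = ['foo', 'bar'];
const codeToFamily = { svcA: 'foo', svcB: 'bar', svcC: 'baz' };

function validateServices(req, res, next) {
  const services = req.body;
  if (!_.isArray(services)) {
    return next({ status: 400, message: 'body must be an array of services' });
  }
  for (const service of services) {
    if (_.isEmpty(service.code)) {
      return next({ status: 400, message: 'code cannot be empty' });
    }
    if (!_.includes(ALLOWED_CODES, service.code)) {
      return next({ status: 400, message: 'unknown code: ' + service.code });
    }
    const family = codeToFamily[service.code];
    if (_.includes(FAMILIES_REQUIRING_ID, family) && _.isEmpty(service.id)) {
      return next({ status: 400, message: 'id is required for family ' + family });
    }
  }
  next();
}

app.post('/services', validateServices, function (req, res) {
  res.sendStatus(204);
});

Passing an object to next() hands the error off to Express's error-handling middleware, the same pattern mentioned in the middleware answer further down.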

Related

Express route with multiple middlewares and separated layers

I'm reading the GitHub repo https://github.com/goldbergyoni/nodebestpractices and trying to apply the tips to my project. Currently I'm working on the "1.2 Layer your components, keep Express within its boundaries" tip, but I have a question.
I'm using routes/controllers, and following this tip (1.2), a route with multiple middlewares looks like this:
router.post("/do-multiple-stuff",
(req, res, next) => {
stuffController.getStuffDone(req.body.stuff);
next();
},
(req, res, next) => {
stuffController.getOtherStuffDone(req.body.otherStuff);
return res.send("stuff done");
});
Is this correct? Or is there a better way to do this?
Thanks! <3
The point of that 1.2 section is to create your business logic as a separate, testable component that is passed data only, not passed req and res. This allows it to be independently and separately tested without the Express environment around it.
Your calls to:
stuffController.getStuffDone(req.body.stuff);
and
stuffController.getOtherStuffDone(req.body.otherStuff);
are indeed making that proper separation between the web layer and the business logic, because you aren't passing req or res to your controller. That looks like it meets the point of the 1.2 training step.
The one thing I see missing here is that there isn't any output from either of these function calls. They don't return anything and since you don't pass req or res to them, they can't be modifying the req object (like some middleware does) and can't be sending a response or error by themselves. So, it appears that these need a mechanism for communicating some type of result back, either a direct return value (if the functions are synchronous) or returning a promise (if the functions are asynchronous). Then, the calling code could get their result and do something with that result.
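As a rough sketch of that missing piece, assuming the two controller methods are rewritten to return promises (the controller names come from the question; the response shape is an assumption):

router.post("/do-multiple-stuff", async (req, res, next) => {
  try {
    // Controllers receive plain data and return results; they never see req/res.
    const stuffResult = await stuffController.getStuffDone(req.body.stuff);
    const otherResult = await stuffController.getOtherStuffDone(req.body.otherStuff);
    res.json({ stuffResult, otherResult });
  } catch (err) {
    next(err); // let the Express error-handling middleware build the response
  }
});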

Koa-router getting parsed params before hitting route

I'm using koa2 and koa-router together with sequelize on top. I want to be able to control user access based on their roles in the database, and it's been working somewhat so far. I made my own RBAC implementation, but I'm having some trouble.
I need to quit execution BEFORE any endpoint is hit if the user doesn't have access, considering endpoints can perform any action (like inserting a new item, etc.). This makes perfect sense; I realize I could potentially use transactions with Sequelize, but I find that would add more overhead, and the deadline is closing in.
My implementation so far looks somewhat like the following:
// initialize.js
initializeRoutes()
initializeServerMiddleware()
Server middleware is registered after routes.
// function initializeRoutes
app.router = require('koa-router')
app.router.use('*', access_control(app))
require('./routes_init')
routes_init just runs a function which recursively parses a folder and imports all middleware definitions.
// function initializeServerMiddleware
// blah blah bunch of middleware
app.server.use(app.router.routes()).use(app.router.allowedMethods())
This is just regular koa-router.
However, the issue arises in access_control.
I have one file (access_control_definitions.js) where I specify named routes, their respective sequelize model name, and what rules exists for the route. (e.g. what role, if the owner is able to access their own resource...) I calculate whether the requester owns a resource by a route param (e.g. resource ID is ctx.params.id). However, in this implementation, params don't seem to be parsed. I don't think it's right that I have to manually parse the params before koa-router does it. Is anyone able to identify a better way based on this that would solve ctx.params not being filled with the actual named parameter?
edit: I also created a GitHub issue for this, considering it seems to me like there's some funny business going on.
So if you look at router.js
layerChain = matchedLayers.reduce(function(memo, layer) {
  memo.push(function(ctx, next) {
    ctx.captures = layer.captures(path, ctx.captures);
    ctx.params = layer.params(path, ctx.captures, ctx.params);
    ctx.routerName = layer.name;
    return next();
  });
  return memo.concat(layer.stack);
}, []);

return compose(layerChain)(ctx, next);
What it does is that, for every matched route layer, it adds its own capturing step to generate the params.
This actually makes sense, because you can have two middleware for the same URL with different parameter names:
router.use('/abc/:did', (ctx, next) => {
  // ctx.router available
  console.log('my request came here too', ctx.params.did)
  if (next)
    next();
});

router.get('/abc/:id', (ctx, next) => {
  console.log('my request came here', ctx.params.id)
});
Now, for the first handler a parameter named id makes no sense, and for the second one did makes no sense. These parameters are specific to a handler and only have meaning inside that handler. That is why the params you expect are not there before the handler runs; I don't think it is a bug.
And since you already found the workaround
const fromRouteId = pathToRegexp(ctx._matchedRoute).exec(ctx.captures[0])
You could use the same, or a better one might be:
var lastMatch = ctx.matched[ctx.matched.length-1];
params = lastMatch.params(ctx.originalUrl, lastMatch.captures(ctx.originalUrl), {})
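To show how that workaround could sit inside the access-control middleware from the question, here is a hedged sketch; checkAccess and the use of ctx.state.user are hypothetical placeholders for the rules in access_control_definitions.js:

function access_control(app) {
  return async (ctx, next) => {
    // Resolve the named params from the last matched layer before any route handler runs.
    const lastMatch = ctx.matched && ctx.matched[ctx.matched.length - 1];
    const params = lastMatch
      ? lastMatch.params(ctx.originalUrl, lastMatch.captures(ctx.originalUrl), {})
      : {};

    // e.g. ownership check based on the resource id in the URL (checkAccess is hypothetical).
    const allowed = await checkAccess(ctx.state.user, ctx.path, params.id);
    if (!allowed) {
      ctx.throw(403, 'Forbidden');
    }
    await next();
  };
}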

Would the values inside request be mixed up in callback?

I am new to Node.js, and I have been reading questions and answers related to this issue, but I'm still not very sure that I fully understand the concept in my case.
Suggested Code
router.post('/test123', function(req, res) {
  someAsyncFunction1(parameter1, function(result1) {
    someAsyncFunction2(parameter2, function(result2) {
      someAsyncFunction3(parameter3, function(result3) {
        var theVariable1 = req.body.something1;
        var theVariable2 = req.body.something2;
      });
    });
  });
});
Question
I assume there will be multiple (10+, 100+, or whatever) requests to one endpoint (for example, AJAX requests to /test123, as shown above) at the same time, each with its own variables (something1 and something2). According to this, it should be impossible for one user's theVariable1 and theVariable2 to be mixed up with (i.e., overwritten by) another user's req.body.something1 and req.body.something2. I am wondering if this remains true when there are multiple nested callbacks (three like the above, or ten, just in case).
I am also considering using res.locals to save some data from the callbacks (instead of using theVariable1 and theVariable2). Is that a good idea, given that the data must not be overwritten by multiple simultaneous requests from clients?
Each request a Node.js/Express server receives generates a new req object.
So in the line router.post('/test123', function(req, res), the req object that's being passed in as an argument is unique to that HTTP connection.
You don't have to worry about multiple functions or callbacks. In a traditional application, if I have two objects cat and dog that I can pass to a single listen function, I get back meow and bark even though there's only one listen function. That's roughly how you can view an Express app: even though you have all these get and post handlers, every user's request is passed to them as a unique entity.
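A small sketch of why that is, using a hypothetical async helper: every call to the route handler creates a fresh closure over its own req, so the values read inside the nested callbacks always belong to that particular request:

// Hypothetical async helper standing in for someAsyncFunction1/2/3.
function someAsyncFunction(parameter, callback) {
  setTimeout(function() {
    callback('result for ' + parameter);
  }, Math.random() * 100);
}

router.post('/test123', function(req, res) {
  // req is a brand new object for every incoming request.
  someAsyncFunction(req.body.something1, function(result1) {
    // This closure captured *this* request's req; a concurrent request runs
    // the same code with its own req and its own result1.
    var theVariable1 = req.body.something1;
    var theVariable2 = req.body.something2;
    res.json({ result1: result1, theVariable1: theVariable1, theVariable2: theVariable2 });
  });
});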

node.js middleware and js encapsulation

I'm new to JavaScript, and jumped right into Node.js. I've read a lot of theory and began well with the practical side (I'm writing an API for a mobile app), but I have one basic problem, which has led me to middleware. I've successfully implemented a middleware function, but I would like to know whether the way I'm using the idea of middleware is OK, and also to resolve the original problem which brought me to middleware. My question is two-fold, as follows:
1) From what I could gather, the idea of middleware is to run a repeated process before actually handling the request. I've used it for token verification, as follows (only one of my URLs doesn't receive a token parameter):
app.js
app.get('/settings', auth.validateToken, auth.settings);
auth.js
function validateToken(req, res, next) { /* code */ }
In validateToken, my code checks the token, then calls next() if everything is OK, or modifies res to return a specific JSON error code.
My questions regarding this are: a) Is this a correct use of middleware? b) Is there a [correct] way of passing a value on to the next function? Instead of calling next only if everything is OK, is there a [correct] way of calling next either way, and knowing from inside the next function (whichever it is) whether the middleware was successful or not? If there is, would that be a proper use of middleware? This precise point brings me to my original problem, and part two of this question, which is encapsulating functions:
THIS PART WAS FIXED, SEE MY SECOND COMMENT.
2) I discovered middleware while trying to simply encapsulate validateToken so that I could call it from inside the functions that the get handlers point to, for example auth.settings.
I'm used to plain sequential programming, and not in JavaScript, and haven't for the life of me been able to figure out how to do this, given the event-based nature of Node.js.
What I want to do right now is write a function which simply verifies the user and password. I have it perfectly written inside a particular handler, but was about to copy-paste it to another one, so I stopped. I want to do things the right way from scratch and understand Node.js. One of the specific problems I've been having is that the error code I have to return when the user and password don't match is different depending on the parent function, so I would need this function to be able to tell the callback function "hey, the password and user don't match", so that from the parent function I can respond with the correct message.
I think what I actually want is to write an asynchronous function I can call from inside another one.
I hope I've been clear. I've been trying to solve this on my own, but I can't quite finish wrapping my head around what my actual problem is; I'm guessing it's due to my recent introduction to Node.js and JS.
Thanks in advance! Jennifer.
1) There is the res.locals object (http://expressjs.com/api.html#res.locals), designed to store data local to the request and to pass it from one middleware to another. After the request is processed, this object is disposed of. If you want to store data within the session, you can use req.session.
2) If I understand your question, you want a function that asynchronously passes its result back to the caller. You can do it the same way most Node functions are designed.
You define a function in this way:
function doSomething(parameters, callback) {
  var err, result;
  // ... do something
  // if (errorCondition()) err = errorCode();
  if (callback) callback(err, result);
}
And the caller, instead of using the return value of the function, passes a callback to this function:
function caller(req, res, next) {
  //...
  doSomething(params, function(err, result) {
    if (!err && result) {
      // do something with the result
      next();
    } else {
      // do something else
      next();
      // or even res.redirect('/error');
    }
  });
}
If you find yourself writing similar callback functions over and over, you should define them as a named function and just pass that function as a parameter:
//...
doSomething(param, processIt);

function processIt(err, result) {
  // ...
}
What probably keeps you confused is that you don't treat functions as values yet, which is something quite specific to JavaScript (leaving aside languages that are little used).
In validateToken, my code checks the token, then calls next() if everything is OK, or modifies res as json to return a specific error code.
a) Is this a correct use of middleware?
b) is there a [correct] way of passing a value onto the next function?
Yes, that is a correct use of middleware, although depending on the response message type and specifications you could use the built-in error handling of Connect. In this example, that means generating a 401 status code by calling next({status: 401, stack: 'Unauthorized'});
The middleware system is designed to handle the request by going through a series of functions until one function replies to the request. This is why the next function only takes one argument, which is an error:
-> If an error object is passed to the next function, then it will be used to create a response and no further middleware will be processed. The manner in which the error response is created is as follows:
// default to 500
if (res.statusCode < 400) res.statusCode = 500;
debug('default %s', res.statusCode);
// respect err.status
if (err.status) res.statusCode = err.status;
// production gets a basic error message
var msg = 'production' == env
? http.STATUS_CODES[res.statusCode]
: err.stack || err.toString();
-> To pass values down the middleware stack, modifying the request object is the best method. This ensures that all processing is bound to that specific request, and since the request object goes through every middleware function, it is a good way to pass information down the stack.
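As a minimal sketch of both points, assuming a hypothetical verifyToken helper and header name: validateToken either forwards an error object to the built-in error handling or attaches its result to req for the next handler to read:

function validateToken(req, res, next) {
  var token = req.headers['x-access-token'];      // header name is an assumption
  verifyToken(token, function(err, user) {        // verifyToken is a hypothetical async helper
    if (err || !user) {
      // Hand off to the error-handling middleware, which produces the 401 response.
      return next({ status: 401, stack: 'Unauthorized' });
    }
    // Pass the result down the middleware stack on the request object.
    req.user = user;
    next();
  });
}

app.get('/settings', validateToken, function settings(req, res) {
  // The next handler reads whatever the middleware attached to req.
  res.json({ settings: loadSettingsFor(req.user) }); // loadSettingsFor is hypothetical
});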

Node.js / Express - modifying response template context through request/response objects

I am using Express to serve web pages in a Node.js application.
Let's say I want to have a variable foo available in all views rendered by the render method of the response object. I know that I can define dynamic helpers for this task; however, I found them unsuitable when you need to set the helper variable asynchronously, like this (Mongoose example):
Thing.count(filter, function(error, thingCount) {
  foo = thingCount;
});
I've tried the connect middleware approach, which suits me perfectly; however, the question here is how to affect the response context. By looking into the render method definition in express/lib/view.js, I found that it can be manipulated by writing into the app._locals object:
function putFooIntoContext(req, res, next) {
  Thing.count(filter, function(error, thingCount) {
    res.app._locals.foo = thingCount;
    next();
  });
}
It works as intended; however, I am a bit afraid that such a straightforward approach is not the best solution. Can someone give me any ideas on how to affect the response context by interacting only with the request/response objects, in the proper way intended by the Express developers?
Express 3.x allows asynchronous helpers to be set up as middleware registered with app.use. So for a simple global foo variable, your code would be as follows:
app.use(function(req, res, next) {
  Thing.count(filter, function(error, thingCount) {
    res.locals.foo = thingCount;
    next();
  });
});
Of course the per-route middleware option is also valid; this is just another viewpoint, and it saves inserting the middleware into each app.get(...) route.
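A short usage note, under the assumption that the app.use call above is registered before the routes: res.render merges res.locals into the template context, so any view rendered afterwards can reference foo directly (route and view names here are hypothetical):

app.get('/things', function(req, res) {
  // res.locals.foo was already set by the middleware for this request;
  // res.render merges res.locals into the template context, so the view
  // can use `foo` without it being passed explicitly.
  res.render('things');
});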
