Check resource ownership on a Node.js REST API

I'm trying to develop a Node.js backend using express and mongoose.
On the web there are plenty of examples of how to implement a proper authentication layer, but I couldn't find any example of how to correctly implement an authorization layer.
In my specific case, I'm creating the backend of a multi-user application and I want each user to only be able to see the data he/she inserted.
I have three models:
User
Category
Document
A User owns one or more Categories, and a Category contains zero or more Documents.
The CRUD operations are implemented on the following endpoints:
/user/:userid
/user/:userid/category
/user/:userid/category/:categoryid
/user/:userid/category/:categoryid/document
/user/:userid/category/:categoryid/document/:documentid
In the authentication part I attach the current logged-in user's id to each request, so I can easily check that
jsonwebtoken.userId == req.params.userid
and return a 403 error otherwise.
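For example, a minimal sketch of that check as an Express middleware (req.userId here is whatever the auth layer sets from the decoded token, as in the middleware further down; the name checkUserIdParam is just illustrative):
// illustrative only: reject any request whose :userid does not match the authenticated user
var checkUserIdParam = function (req, res, next) {
    if (req.userId != req.params.userid) {
        return res.status(403).send({ error: 403, message: 'You are not authorized to perform this operation.' });
    }
    next();
};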
Checking the ownership of categories is quite easy, because each category contains a reference to the user who created it.
var CategorySchema = mongoose.Schema({
    name: {
        type: String,
        required: true,
        trim: true
    },
    user_id: {
        type: mongoose.Schema.Types.ObjectId,
        ref: 'User',
        index: true
    }
});
In the Document model, however, I only have a reference to the category it belongs to, but I didn't add a reference to the user.
I'm therefore wondering how to proceed with "nested" relationships. Do I need to add a user_id reference to all of them at any depth level? Is there any best practice?
Moreover, is this the right way to do what I need or is there any official/mature library that already does the same?

Well, a NoSQL database gives you the power of embedding your sub-documents (the equivalent of tables in a relational DB) into a single document. So you may consider redesigning your schema to something like:
{
    "userId": "",
    "categories": [
        {
            "categoryId": "",
            "name": "",
            "documents": [
                {
                    "documentId": "",
                },
                {
                    "documentId": "",
                },
            ]
        },
        {
            "categoryId": "",
            "name": "",
            "documents": [
                {
                    "documentId": "",
                },
                {
                    "documentId": "",
                },
            ]
        }
    ]
}
This may help you optimize the number of DB queries, but the important thing to note here is that if the number of categories per user and documents per category could grow very large, then this approach would not be good.
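For reference, a rough Mongoose sketch of that embedded design might look like this (untested, field names are illustrative):
// everything nested under the user: documents inside categories, categories inside the user
var DocumentSchema = mongoose.Schema({
    name: { type: String, trim: true }
    // ...other document fields
});

var CategorySchema = mongoose.Schema({
    name: { type: String, required: true, trim: true },
    documents: [DocumentSchema]
});

var UserSchema = mongoose.Schema({
    // ...existing user fields
    categories: [CategorySchema]
});
With this layout the ownership check collapses to a single query on the user's own document, since everything lives under the user.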
Always remember these six important rules of thumb for MongoDB schema design:
Favor embedding unless there is a compelling reason not to
Needing to access an object on its own is a compelling reason not to embed it
Arrays should not grow without bound. If there are more than a couple of hundred documents on the “many” side, don’t embed them; if there are more than a few thousand documents on the “many” side, don’t use an array of ObjectID references. High-cardinality arrays are a compelling reason not to embed.
Don’t be afraid of application-level joins
Consider the write/read ratio when denormalizing. A field that will mostly be read and only seldom updated is a good candidate for denormalization.
You want to structure your data to match the ways that your application queries and updates it.
Taken from here
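If you keep separate collections instead (as in your current design), the usual pattern is the application-level join mentioned above: give each Document a category_id reference (which the middleware in the next answer relies on), and optionally denormalize a user_id onto Documents as well if you want to check ownership in a single query. A hypothetical sketch:
var DocumentSchema = mongoose.Schema({
    name: { type: String, required: true, trim: true },
    category_id: {
        type: mongoose.Schema.Types.ObjectId,
        ref: 'Category',
        index: true
    },
    // optional denormalized owner, so ownership can be checked without going through Category
    user_id: {
        type: mongoose.Schema.Types.ObjectId,
        ref: 'User',
        index: true
    }
});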

After some tinkering, I ended up with the following middleware.
It basically checks for route parameters in the expected order and checks for coherent memberships.
Not sure if it's the best way of achieving this, but it works:
var Category = require('../category/Category'),
    Document = require('../document/Document'),
    unauthorizedMessage = 'You are not authorized to perform this operation.',
    errorAuthorizationMessage = 'Something went wrong while validating authorizations.',
    notFoundMessage = ' not found.';

var isValidMongoId = function (id) {
    return /^[0-9a-fA-F]{24}$/.test(id);
};

var verifyPermissions = function (req, res, next) {
    if (!req.userId) {
        return res.status(403).send({error: 403, message: unauthorizedMessage});
    }
    if (req.params.userid && isValidMongoId(req.params.userid)) {
        // the authenticated user may only act on his/her own resources
        if (req.userId != req.params.userid) {
            return res.status(403).send({error: 403, message: unauthorizedMessage});
        }
        if (req.params.categoryid && isValidMongoId(req.params.categoryid)) {
            // the category must belong to the user in the URL
            return Category.findOne({_id: req.params.categoryid, user_id: req.params.userid}, function (err, category) {
                if (err) {
                    return res.status(500).send({error: 500, message: errorAuthorizationMessage});
                }
                if (!category) {
                    return res.status(404).send({error: 404, message: 'Category' + notFoundMessage});
                }
                if (req.params.documentid && isValidMongoId(req.params.documentid)) {
                    // the document must belong to the category in the URL
                    return Document.findOne({_id: req.params.documentid, category_id: req.params.categoryid}, function (err, document) {
                        if (err) {
                            return res.status(500).send({error: 500, message: errorAuthorizationMessage});
                        }
                        if (!document) {
                            return res.status(404).send({error: 404, message: 'Document' + notFoundMessage});
                        }
                        // only let the request through once ownership has been verified
                        next();
                    });
                }
                next();
            });
        }
    }
    next();
};
module.exports = verifyPermissions;
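Usage-wise, one way to wire it up (verifyToken here stands for whatever auth middleware sets req.userId from the JWT, so it's just illustrative, as is the require path):
var express = require('express');
var router = express.Router();
var verifyPermissions = require('./verifyPermissions'); // hypothetical path

// the ownership check runs after authentication and before the actual handler
router.get(
    '/user/:userid/category/:categoryid/document/:documentid',
    verifyToken,        // assumed: verifies the JWT and sets req.userId
    verifyPermissions,
    function (req, res) {
        // safe to serve the document here: ownership has already been verified
    }
);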

Related

How to model a collection in nodejs+mongodb

Hello, I am new to Node.js and MongoDB.
I have 3 models:
"User" with fields "name, phone"
"Shop" with fields "name, address"
"Member" with fields "shop, user, status" (shop and user hold the "id" of the respective collections).
Now, when I create the "shops" API to fetch all shops, I need to add an extra field "isShopJoined" which is not part of the model. This extra field will be true if the user who sees that shop has joined it, otherwise it will be false.
The problem is that when I share my model with frontend developers (Android/iOS and others), they will not be aware of that extra field until they see the API response.
So is it OK if I add an extra field in the shops listing which is not part of the model? Or do I need to add that extra field to the model?
Important note
All the code below has NOT been tested (yet, I'll do it when I can setup a minimal environment) and should be adapted to your project. Keep in mind that I'm no expert when it comes to aggregation with MongoDB, let alone with Mongoose, the code is only here to grasp the general idea and algorithm.
If I understood correctly, you don't have to change anything, since the info is already stored in the Member collection. But it forces the front-end to make an extra request (or many extra requests) to get both the list of Shops and to check (one by one) whether the current logged-in user is a Member of each shop.
Keep in mind that the front-end in general is driven by the data (and so, by the API/back-end), not the other way around. The front-end will have to adapt to what you give it.
If you're happy with what you have, you can just keep it that way and it will work, but that might not be very efficient.
Assuming this:
import mongoose from "mongoose";

// ObjectId is not a global: grab it from mongoose.Schema.Types
const { ObjectId } = mongoose.Schema.Types;

const MemberSchema = new mongoose.Schema({
    shopId: {
        type: ObjectId,
        ref: 'ShopSchema',
        required: true
    },
    userId: {
        type: ObjectId,
        ref: 'UserSchema',
        required: true
    },
    status: {
        type: String,
        required: true
    }
});

const ShopSchema = new mongoose.Schema({
    name: {
        type: String,
        required: true
    },
    address: {
        // your address model
    }
});

const UserSchema = new mongoose.Schema({
    name: {
        type: String,
        required: true
    },
    phone: {
        type: String,
        required: true,
    },
    // Add something like this
    shopsJoined: {
        type: Array,
        default: [],
        required: true
    }
});
You could tackle this problem in two ways:
MongoDB Aggregates
When retrieving (back-end side) the list of shops, if you know the user that made the request, instead of simply returning the list of Shops, you could return an aggregate of Shops and Members, resulting in a hybrid document containing both the Shop and Member info. That way, the front-end has all the info it needs with one back-end request.
Important note
The following code might not work as-is and you'll have to adapt it, I currently have nothing to test it against. Keep in mind I'm not very familiar with aggregates, let alone with Mongoose, but you'll get the general idea by looking the code and comments.
const aggregateShops = async (req, res, next) => {
    try {
        // $lookup will merge the "Member" and "Shop" documents into one
        // $match will return only the results matching the condition
        // note: aggregate() takes an array of pipeline stages
        const aggreg = await Shop.aggregate([
            {
                $lookup: {
                    from: 'members',        // the name of the mongodb collection
                    localField: '_id',      // the "Shop" field to match with foreign collection
                    foreignField: 'shopId', // the "Member" field to match with local collection
                    as: 'memberInfo'        // the field name in which to store the "Member" fields
                }
            },
            {
                // memberInfo is an array after $lookup, so match on its fields with dot notation
                $match: { 'memberInfo.userId': myUserId }
            }
        ]);
        // the result should be an array of objects looking like this:
        /*{
            _id: SHOP_OBJECT_ID,
            name: SHOP_NAME,
            address: SHOP_ADDRESS,
            memberInfo: [{
                shopId: SHOP_OBJECT_ID,
                userId: USER_OBJECT_ID,
                status: STATUS_JOINED_OR_NOT
            }]
        }*/
        // send back the aggregated result to front-end
    } catch (e) {
        return next(e);
    }
}
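Note that $lookup produces memberInfo as an array; if you prefer a single embedded object per shop, you could add an $unwind stage inside the same try block, e.g. (again untested):
// variant of the pipeline above: $unwind flattens memberInfo into a single embedded object
const aggreg = await Shop.aggregate([
    { $lookup: { from: 'members', localField: '_id', foreignField: 'shopId', as: 'memberInfo' } },
    { $unwind: '$memberInfo' },
    { $match: { 'memberInfo.userId': myUserId } }
]);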
Drop the Members collection and store the info elsewhere
Instinctively, I would've gone this way. The idea is to either store an array field shopsJoined in the User model, or a membersJoined array field in the Shops model. That way, the info is retrieved no matter what, since you still have to retrieve the Shops and you already have your User.
// Your PATCH route should look like this
const patchUser = async (req, res, next) => {
    try {
        // How you choose to proceed here is up to you.
        // I tend to facilitate front-end work, so get them to send you (via req.body) the shopId to join OR "un-join".
        // They should already know which shops are joined or not, as they have the User.
        // For example, req.body.shopId = "+ID" for a join, or req.body.shopId = "-ID" for an un-join.
        const shopId = req.body.shopId.slice(1); // strip the leading "+" or "-" before storing
        if (req.body.shopId.startsWith("+")) {
            await User.findOneAndUpdate(
                { _id: my_user_id },
                { $push: { shopsJoined: shopId } }
            );
        } else if (req.body.shopId.startsWith("-")) {
            await User.findOneAndUpdate(
                { _id: my_user_id },
                { $pull: { shopsJoined: shopId } }
            );
        } else {
            // not formatted correctly, return error
        }
        // return OK here depending on the framework you use
    } catch (e) {
        return next(e);
    }
};
Of course, the above code is for the User model, but you can do the same thing for the Shop model.
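For illustration, a hypothetical Shop-side equivalent (assuming a membersJoined array field on the Shop schema, placed inside a similar async route handler) could be:
// same pattern, with membership tracked on the Shop document;
// $addToSet avoids duplicate user ids in membersJoined
await Shop.findOneAndUpdate(
    { _id: shopId },                               // shopId parsed from req.body as above
    { $addToSet: { membersJoined: my_user_id } }
);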
Useful links:
MongoDB aggregation pipelines
Mongoose aggregates
MongoDB $push operator
MongoDB $pull operator
Yes, you have to add the field to the model, because adding it only to the response would be a temporary display of the key. What if you need it in the future or in some list filters? So it's good to add it to the model.
If you are thinking that the front-end will have to be informed, just go for it, and you can also set a default value for the "isShopJoined" key; let it be false for the time being.

add item to array if it doesn't exist mongodb

I'm running a simple application with MongoDB + Node.js, and I'm trying to achieve the following:
A unit belongs to a company, a classroom belongs to a unit and a user belongs to a classroom.
At a certain moment, I want to add the user to another unit and/or classroom, so he'll belong to 2 or more units/classrooms.
My form will send only one unit/classroom at a time; in this case, I want to add it to the user model's unit: [string] and classroom: [string] arrays only if he doesn't already belong to it. So I need to check if the arrays already contain the sent data and, if not, add to them.
Mongo has the $addToSet operator, and $ne to do it, but I can't seem to make it work.
Here's my code:
User.findById(req.body._id)
    .select("-__v")
    .exec((err: Error, user: any) => {
        if (err) {
            // display error
        }
        if (!user) {
            // display error
        }
        user.update({
            unit: {
                $ne: user.unit
            },
            classroom: {
                $ne: user.classroom
            }
        }, {
            $addToSet: {
                unit: req.body.unit,
                classroom: req.body.classroom
            }
        }).exec((err: Error) => {
            if (err) {
                // Display error
            }
            res.status(200).json({
                status: "OK",
                response: response,
            })
            return
        })
    })
The user belongs to "Academy One" and holds the classroom id reference. I will add him to another unit like "Academy 2" with another classroom reference, but if I add him to another classroom of "Academy One", I don't want a duplicate item in his unit array.
When I post the following through Postman, it gives me the error:
{
    "_id": "5d8ba151248ecb4df8803657", // user id
    "unit": "Test", // another unit
    "classroom": "5d8a709f44f55e4a785e2c50" // another classroom
}
Response:
{
    "status": "NOK",
    "response": "Cast to [string] failed for value \"[{\"$ne\":[\"Academy One\"]}]\" at path \"unit\""
}
What am I missing?
Actually, I didn't need the $ne operator, I just needed to use $addToSet directly:
user.updateOne({
    $addToSet: { unit: req.body.unit, classroom: req.body.classroom }
}).exec((err: Error) => {
    // handle the error / send the response here
});
Thanks!
You need to use $nin instead of $ne, https://docs.mongodb.com/manual/reference/operator/query/nin/
unit: {$nin: [user.unit]}
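Applied to the code in the question, a guard like that has to live in the filter of a model-level update (a document's own .updateOne() uses the document's _id as its filter and only takes the update object). A rough, untested sketch of that idea:
// nothing is modified when the submitted unit is already stored, thanks to the $nin filter
User.updateOne(
    { _id: req.body._id, unit: { $nin: [req.body.unit] } },
    { $addToSet: { unit: req.body.unit, classroom: req.body.classroom } }
).exec((err: Error) => {
    // handle the error / send the response
});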

Mongoose subquery and append results to main query

I have been struggling with this question for months now, still no solution.
Basically I have two MongoDB collections.
One is called Users and another is called Items.
One user can have multiple Items.
User structure is simple =
Users = [{
    _id: 1,
    name: "Sam",
    email: "sam#gmail.com",
    group: "Rangers"
},
{
    _id: 2,
    name: "Michael",
    email: "michael#gmail.com",
    group: "Muse"
},
{
    _id: 3,
    name: "John",
    email: "john#gmail.com",
    group: "Merchant"
},
.....
]
The Items structures are as follows and each item is assigned to a user.
Items = [
    {
        _id: 1,
        user_id: 1,
        item_name: "Flying Sword",
        timestamp: ...
    },
    {
        _id: 3,
        user_id: 1,
        item_name: "Invisible Cloak",
        timestamp: ...
    },
    {
        _id: 4,
        user_id: 2,
        item_name: "Iron Shield"
    },
    {
        _id: 5,
        user_id: 7,
        item_name: "Splashing Gun",
        timestamp: ...
    },
    ...
]
I want to run a Mongoose query that queries the users as the primary objects.
Then, upon returning the user results, I want to query all the Items belonging to those filtered users and append them as subdocuments to each user object previously queried.
For example, I want to query:
Users.find({group: "Muse"}, function(err, users){
    // I DON'T KNOW WHAT TO WRITE INSIDE
})
Basically the results should be:
[
    {
        _id: 4,
        name: "Jack",
        email: "jack#gmail.com",
        group: "Muse",
        items: [
            {
                _id: 8,
                name: "Magic Wand",
                user_id: 4,
                timestamp: ...
            },
            {
                _id: 12,
                name: "Blue Potion",
                user_id: 4,
                timestamp: ...
            },
            {
                _id: 18,
                name: "Teleportation Scroll",
                user_id: 4,
                timestamp: ...
            }
        ]
    }
    .....
    More USERS of similar structure
]
Each user should return a maximum of three items, sorted by timestamp.
Thanks in advance, I have tried so many times and failed.
This is a multiple-step question, so let's list out the steps:
Get a list of user documents that match a particular group.
Get a list of item documents that are assigned to each matched user from step 1.
Assign the appropriate item documents to a new property on the corresponding user document.
This can be tackled a few ways. A first pass might be to retrieve all the user documents and then iterate over them in memory, retrieving the list of item documents for each user and appending that list to the user document. If your lists are smallish this shouldn't be too much of an issue, but as scale comes into play and this becomes a larger list, it could become a memory hog.
NOTE: all of the following code is untested so it might have typos or the like.
Users.find({group: "Muse"}, function (err, users) {
    var userIDs;
    if (err) {
        // do error handling
        return;
    }
    userIDs = users.map(function (user) { return user._id; });
    Items.find({user_id: {$in: userIDs}}, function (err, items) {
        if (err) {
            // do error handling
            return;
        }
        users.forEach(function (user) {
            user.items = items.filter(function (item) {
                // ObjectIds are objects, so compare them with .equals() rather than ===
                return item.user_id.equals(user._id);
            });
        });
        // do something with modified users object
    });
});
While this will solve the problem there are plenty of improvements that can be made to make it a bit more performant as well as "clean".
For instance, let's use promises, since this involves async operations anyway, assuming Mongoose is configured to use the native Promise object or a then/catch-compliant library:
Users.find({group: "Muse"}).exec().then(function (users) {
    var userIDs = users.map(function (user) {
        return user._id;
    });
    // returns a promise
    return Promise.all([
        // include users for the next `then`
        // avoids having to store it outside the scope of the handlers
        users,
        Items.find({
            user_id: {
                $in: userIDs
            }
        }).exec()
    ]);
}).then(function (results) {
    var users = results[0];
    var items = results[1];
    users.forEach(function (user) {
        user.items = items.filter(function (item) {
            // again, compare ObjectIds with .equals() rather than ===
            return item.user_id.equals(user._id);
        });
    });
    return users;
}).catch(function (err) {
    // do something with errors from either find
});
This makes it subjectively a bit more readable but doesn't really help, since we are doing a lot of manipulation in memory. Again, this might not be a concern if the document collections are smallish. However, if it is, there is a tradeoff that can be made by breaking up the request for items into one per user, thus only working on chunks of the item list at a time.
We will also use Bluebird's map to limit the number of concurrent requests for items.
var bluebird = require('bluebird');

Users.find({group: "Muse"}).exec().then(function (users) {
    // map over users, fetching each user's items, with at most 5 requests in flight
    return bluebird.map(users, function (user) {
        return Items.find({user_id: user._id}).exec().then(function (items) {
            user.items = items;
            return user;
        });
    }, {concurrency: 5});
}).then(function (users) {
    // do something with users
}).catch(function (err) {
    // do something with errors from either find
});
This limits the amount of in memory manipulation for items but still leaves us iterating over users in memory. That can be tackled as well by using mongoose streams but I will leave that up to you to explore on your own (there are also other questions already on SO on how to use streams).
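One thing the snippets above don't address is the question's "maximum of three items sorted by timestamp" requirement; that can be layered onto the per-user query with sort and limit. A rough, untested variant of the callback in the last example:
// inside bluebird.map: fetch only each user's three most recent items
return Items.find({user_id: user._id})
    .sort({timestamp: -1})   // newest first, using the timestamp field shown in the question
    .limit(3)
    .exec()
    .then(function (items) {
        user.items = items;
        return user;
    });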

Node.js, Express, Mongoose - input validation - within route or model?

I have a rest api resource that accepts a JSON post. Example:
{
    "location": {
        "coordinates": [
            -122.41941550000001,
            37.7749295
        ]
    }
}
The coordinates are then collected from the request by Express:
module.exports.create = function(req, res, next) {
var coordinates = req.body.location.coordinates;
....
These are then submitted to a Mongoose model. I am writing tests against this where the location.coordinates is missing e.g.
{
    "foo": {
        "bar": [
            -122.41941550000001,
            37.7749295
        ]
    }
}
This then fails within the validation section of the Model with :
locationSchema.path('location.coordinates').validate(function(coordinates){
^
TypeError: Cannot call method 'validate' of undefined
So my question is how would I validate that the input is correct? Should this be done in the route before getting to the model, or should it be done in the model? Any examples of how would also be appreciated.
For reference the Mongoose model looks something like:
var locationSchema = new Schema({
    userid: { type: Number, required: true },
    location: {
        type: [{
            type: "String",
            required: true,
            enum: ['Point', 'LineString', 'Polygon'],
            default: 'Point'
        }], required: true,
        coordinates: { type: [Number], required: true }
    },
    create_date: { type: Date, default: Date.now }
});
locationSchema.path('location.coordinates').validate(function(coordinates){
...
}, 'Invalid latitude or longitude.');
My typical approach is to introduce a service layer in between the routes and the model, and that's where the validation happens. Don't think "service" in the "web service" sense; it simply provides an abstraction level around a given domain. This has the following benefits:
It gives you a common abstraction for dealing with persisted and/or external data. That is, whether you're interacting with data from Mongoose or an external web service, all of your route logic can simply interact with a consistent interface.
It provides sound encapsulation around persistence details, allowing you to swap out the implementation without effecting all of your routes.
It allows you to re-use code with non-route consumers (such as an integration test suite).
It provides a good layer for mocking (for use with unit tests, for example).
It provides a very clear "validation and business logic happens here" layer, even when your data is spread across several different databases and/or backend systems.
Here's a simplified example of what that might look like:
location-service.js
var locationService = module.exports = {};

locationService.saveCoordinates = function saveCoordinates(coords, cb) {
    if (!isValidCoordinates(coords)) {
        // your failed validation response can be whatever you want, but I
        // like to reserve actual `Error` responses for true runtime errors.
        // the result here should be something your client-side logic can
        // easily consume and display to the user.
        return cb(null, {
            success: false,
            reason: 'validation',
            validationError: { /* something useful to the end user here */ }
        });
    }
    yourLocationModel.save(coords, function(err) {
        if (err) return cb(err);
        cb(null, { success: true });
    });
};
some-route-file.js
app.post('/coordinates', function(req, res, next) {
    var coordinates = req.body.location.coordinates;
    locationService.saveCoordinates(coordinates, function(err, result) {
        if (err) return next(err);
        if (!result.success) {
            // check result.reason, handle validation logic, etc.
        } else {
            // woohoo, send a 201 or whatever you need to do
        }
    });
});
I've applied this structure to 3 or 4 different web apps and APIs at this point, and have grown quite fond of it.
In my opinion the validation should occur at the very beginning: on the client first, then in the route.
There's not much point in passing around invalid data and using resources for nothing, so the sooner you flag it as invalid, the sooner you free the resources.
To check the existence of your coordinates, you can use:
if (req.body.location.coordinates) {
    // do your thing
}
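If you want to go a bit further than a pure existence check, here is a small route-level sketch (the exact checks and error shape are just illustrative):
// shape check before touching the model: coordinates should be a pair of numbers
var coords = req.body.location && req.body.location.coordinates;
if (!Array.isArray(coords) || coords.length !== 2 ||
    coords.some(function (c) { return typeof c !== 'number'; })) {
    return res.status(400).json({ error: 'location.coordinates must be an array of two numbers' });
}
// do your thing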

How to prevent pushing in the document with same attribute in Mongodb

I have the following structure. I would like to prevent pushing a document with the same attribute into the array.
Basically, I find the user object first. If I push another vid (which is already inside), it should not get pushed in. I tried using $addToSet, but failed.
I am using Mongoose.
This is my Model Structure:
var User = mongoose.model('User', {
    oauthID: Number,
    name: String,
    username: String,
    email: String,
    location: String,
    birthday: String,
    joindate: Date,
    pvideos: Array
});
This is my code for pushing into Mongo
exports.pinkvideo = function(req, res) {
    var vid = req.body.vid;
    var oauthid = req.body.oauthid;
    var User = require('../models/user.js');
    var user = User.findOne({
        oauthID: oauthid
    }, function(err, obj) {
        if (!err && obj != null) {
            obj.pvideos.push({
                vid: vid
            });
            obj.save(function(err) {
                res.json({
                    status: 'success'
                });
            });
        }
    });
};
You want the .update() method rather than retrieving the document and using .save() after making your changes.
This not only gives you access to the $addToSet operator that was mentioned (whose intent is to avoid duplicates in arrays), it is also a lot more efficient, as you are only sending your changes to the database rather than the whole document back and forth:
User.update(
    { oauthID: oauthid },
    { "$addToSet": { "pvideos": vid } },   // note: the schema field is "pvideos", not "pVideos"
    function( err, numAffected ) {
        // check error
        res.json({ status: "success" })
    }
)
The only possible problem is that it depends on what you are actually pushing onto the array and expecting to be unique. So if your array already looked like this:
[ { "name": "A", "value": 1 } ]
And you sent an update with an array element like this:
{ "name": "A", "value": 2 }
Then that element would not be considered a duplicate purely on the value of "A" in "name", so an additional element would be added rather than the existing one being replaced.
So you need to be careful about what your intent is, and if this is the sort of logic you are looking for, then you would need to find the document and test the existing array entries for the conditions that you want.
But for basic scenarios where you simply don't want to add a clear duplicate then $addToSet as shown is what you want.
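If the latter is your case, e.g. because the array holds objects like { vid: ... } as in the question, one common pattern is to put the uniqueness condition in the query itself so the push becomes a no-op for duplicates. A rough sketch using the question's field names (untested):
// only matches the user if no pvideos element already has this vid
User.update(
    { oauthID: oauthid, "pvideos.vid": { $ne: vid } },
    { $push: { pvideos: { vid: vid } } },
    function (err, numAffected) {
        // check error, then respond
        res.json({ status: "success" });
    }
);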
