I have a REST API resource that accepts a JSON POST. Example:
{
  "location": {
    "coordinates": [
      -122.41941550000001,
      37.7749295
    ]
  }
}
The coordinates are then collected from the request by Express:
module.exports.create = function(req, res, next) {
var coordinates = req.body.location.coordinates;
....
These are then submitted to a Mongoose model. I am writing tests against this where location.coordinates is missing, e.g.:
{
  "foo": {
    "bar": [
      -122.41941550000001,
      37.7749295
    ]
  }
}
This then fails within the validation section of the Model with:
locationSchema.path('location.coordinates').validate(function(coordinates){
^
TypeError: Cannot call method 'validate' of undefined
So my question is how would I validate that the input is correct? Should this be done in the route before getting to the model, or should it be done in the model? Any examples of how would also be appreciated.
For reference the Mongoose model looks something like:
var locationSchema = new Schema({
userid: { type: Number, required: true },
location: {
type: [{
type: "String",
required: true,
enum: ['Point', 'LineString', 'Polygon'],
default: 'Point'
}], required: true,
coordinates: { type: [Number], required:true }
},
create_date: { type: Date, default: Date.now }
});
locationSchema.path('location.coordinates').validate(function(coordinates){
...
}, 'Invalid latitude or longitude.');
My typical approach is to introduce a service layer in between the routes and the model, and that's where the validation happens. Don't think "service" in the "web service" sense; it simply provides an abstraction level around a given domain. This has the following benefits:
It gives you a common abstraction for dealing with persisted and/or external data. That is, whether you're interacting with data from Mongoose or an external web service, all of your route logic can simply interact with a consistent interface.
It provides sound encapsulation around persistence details, allowing you to swap out the implementation without affecting all of your routes.
It allows you to re-use code with non-route consumers (such as an integration test suite).
It provides a good layer for mocking (for use with unit tests, for example).
It provides a very clear "validation and business logic happens here" layer, even when your data is spread across several different databases and/or backend systems.
Here's a simplified example of what that might look like:
location-service.js
var locationService = module.exports = {};
locationService.saveCoordinates = function saveCoordinates(coords, cb) {
if (!isValidCoordinates(coords)) {
// your failed validation response can be whatever you want, but I
// like to reserve actual `Error` responses for true runtime errors.
// the result here should be something your client-side logic can
// easily consume and display to the user.
return cb(null, {
success: false,
reason: 'validation',
validationError: { /* something useful to the end user here */ }
});
}
yourLocationModel.save(coords, function(err) {
if (err) return cb(err);
cb(null, { success: true });
});
};
some-route-file.js
app.post('/coordinates', function(req, res, next) {
var coordinates = req.body.location && req.body.location.coordinates;
locationService.saveCoordinates(coordinates, function(err, result) {
if (err) return next(err);
if (!result.success) {
// check result.reason, handle validation logic, etc.
} else {
// woohoo, send a 201 or whatever you need to do
}
});
});
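The `isValidCoordinates` helper is referenced in the service but never shown; a minimal sketch of my own (assuming GeoJSON-style `[longitude, latitude]` pairs) might look like:

```javascript
// Hypothetical validator for a GeoJSON-style [longitude, latitude] pair.
function isValidCoordinates(coords) {
  return Array.isArray(coords) &&
    coords.length === 2 &&
    coords.every(function (n) { return typeof n === 'number' && isFinite(n); }) &&
    coords[0] >= -180 && coords[0] <= 180 && // longitude range
    coords[1] >= -90 && coords[1] <= 90;     // latitude range
}
```

Keeping it a pure function makes it trivial to unit test without touching Express or Mongoose.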
I've applied this structure to 3 or 4 different web apps and APIs at this point, and have grown quite fond of it.
In my opinion the validation should occur at the very beginning: on the client first, then in the route.
There's little point in passing invalid data around and wasting resources on it, so the sooner you flag it as invalid, the sooner you free those resources.
To check the existence of your coordinates without throwing when location itself is missing, guard each level:
if (req.body.location && Array.isArray(req.body.location.coordinates)) {
//do your thing
}
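More generally, a tiny helper of my own (illustrative, not from any library) can read a nested path without throwing when an intermediate key is missing:

```javascript
// Safely read a nested property path; returns undefined instead of throwing.
function getIn(obj, path) {
  return path.reduce(function (acc, key) {
    return acc == null ? undefined : acc[key];
  }, obj);
}

var coords = getIn({ foo: { bar: [1, 2] } }, ['location', 'coordinates']);
// coords is undefined, with no TypeError thrown
```

This keeps route code free of long `a && a.b && a.b.c` chains.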
Hello, I am new to Node.js and MongoDB.
I have 3 models:
"User" with fields "name, phone"
"Shop" with fields "name, address"
"Member" with fields "shop, user, status" (shop and user hold the "id" of the respective collections).
Now when I create the "shops" API to fetch all shops, I need to add an extra field "isShopJoined" which is not part of the model. This extra field will be true if the user who sees that shop has joined it; otherwise it will be false.
The problem happens when I share my model with frontend developers (Android/iOS and others): they will not be aware of that extra field until they see the API response.
So is it OK if I add an extra field in the shops listing which is not part of the model? Or do I need to add that extra field to the model?
Important note
All the code below has NOT been tested (yet; I'll test it when I can set up a minimal environment) and should be adapted to your project. Keep in mind that I'm no expert when it comes to aggregation with MongoDB, let alone with Mongoose; the code is only here to convey the general idea and algorithm.
If I understood correctly, you don't have to do anything, since the info is stored in the Member collection. But that forces the front-end to make an extra request (or many extra requests) to get both the list of Shops and to check (one by one) whether the currently logged-in user is a Member of each shop.
Keep in mind that the front-end in general is driven by the data (and so by the API/back-end), not the other way around. The front-end will have to adapt to what you give it.
If you're happy with what you have, you can just keep it that way and it will work, but that might not be very efficient.
Assuming this:
import mongoose from "mongoose";
const MemberSchema = new mongoose.Schema({
shopId: {
type: mongoose.Schema.Types.ObjectId,
ref: 'ShopSchema',
required: true
},
userId: {
type: mongoose.Schema.Types.ObjectId,
ref: 'UserSchema',
required: true
},
status: {
type: String,
required: true
}
});
const ShopSchema = new mongoose.Schema({
name: {
type: String,
required: true
},
address: {
//your address model
}
});
const UserSchema = new mongoose.Schema({
name: {
type: String,
required: true
},
phone: {
type: String,
required: true,
},
// Add something like this
shopsJoined: {
type: Array,
default: [],
required: true
}
});
You could tackle this problem via 2 ways:
MongoDB Aggregates
When retrieving (back-end side) the list of shops, if you know the user that made the request, instead of simply returning the list of Shops you could return an aggregate of Shops and Members, resulting in a hybrid document containing both the Shop and the Member info. That way, the front-end has all the info it needs with one back-end request.
Important note
The following code might not work as-is and you'll have to adapt it; I currently have nothing to test it against. Keep in mind I'm not very familiar with aggregates, let alone with Mongoose, but you'll get the general idea by looking at the code and comments.
const aggregateShops = async (req, res, next) => {
  try {
    // $lookup will merge the "Member" and "Shop" documents into one
    // $match will return only the results matching the condition
    const aggreg = await Shop.aggregate([{
      $lookup: {
        from: 'members', // the name of the mongodb collection
        localField: '_id', // the "Shop" field to match with the foreign collection
        foreignField: 'shopId', // the "Member" field to match with the local collection
        as: 'memberInfo' // the field name in which to store the "Member" fields
      }
    }, {
      // memberInfo is an array, so match on its nested field
      $match: { 'memberInfo.userId': myUserId }
    }]);
    // the result should be an array of objects looking like this:
    /*{
      _id: SHOP_OBJECT_ID,
      name: SHOP_NAME,
      address: SHOP_ADDRESS,
      memberInfo: [{
        shopId: SHOP_OBJECT_ID,
        userId: USER_OBJECT_ID,
        status: STATUS_JOINED_OR_NOT
      }]
    }*/
// send back the aggregated result to front-end
} catch (e) {
return next(e);
}
}
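Whichever shape the aggregate returns, the front-end-friendly list with the isShopJoined flag can be produced by a small reshaping step (a sketch of my own; since $lookup stores matches in an array, a non-empty memberInfo means the user joined):

```javascript
// Flatten aggregate output into shop objects carrying an isShopJoined flag.
function toShopList(aggregated) {
  return aggregated.map(function (doc) {
    return {
      _id: doc._id,
      name: doc.name,
      address: doc.address,
      isShopJoined: Array.isArray(doc.memberInfo) && doc.memberInfo.length > 0
    };
  });
}
```

This keeps the API response shape explicit and independent of the raw aggregation output.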
Drop the Members collection and store the info elsewhere
Instinctively, I would've gone this way. The idea is to either store an array field shopsJoined in the User model, or a membersJoined array field in the Shops model. That way, the info is retrieved no matter what, since you still have to retrieve the Shops and you already have your User.
// Your PATCH route should look like this
const patchUser = async (req, res, next) => {
try {
// How you chose to proceed here is up to you
// I tend to facilitate front-end work, so get them to send you (via req.body) the shopId to join OR "un-join"
// They should already know what shops are joined or not as they have the User
// For example, req.body.shopId = "+ID" if it's a join, or req.body.shopId = "-ID" if it's an un-join
if (req.body.shopId.startsWith("+")) {
  await User.findOneAndUpdate(
    { _id: my_user_id },
    // strip the leading "+" so only the raw id is stored
    { $push: { shopsJoined: req.body.shopId.slice(1) } }
  );
} else if (req.body.shopId.startsWith("-")) {
  await User.findOneAndUpdate(
    { _id: my_user_id },
    // strip the leading "-" so $pull matches the stored id
    { $pull: { shopsJoined: req.body.shopId.slice(1) } }
  );
} else {
  // not formatted correctly, return error
}
// return OK here depending on the framework you use
} catch (e) {
return next(e);
}
};
Of course, the above code is for the User model, but you can do the same thing for the Shop model.
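With a shopsJoined array on the User, deriving the isShopJoined flag for the shops listing becomes a pure mapping over plain objects (a sketch of my own; names follow the schemas above):

```javascript
// Decorate each shop with isShopJoined based on the user's shopsJoined ids.
function withJoinedFlag(shops, shopsJoined) {
  var joined = new Set((shopsJoined || []).map(String));
  return shops.map(function (shop) {
    return Object.assign({}, shop, {
      isShopJoined: joined.has(String(shop._id))
    });
  });
}
```

Stringifying the ids sidesteps ObjectId-vs-string comparison issues.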
Useful links:
MongoDB aggregation pipelines
Mongoose aggregates
MongoDB $push operator
MongoDB $pull operator
Yes, you have to add the field to the model, because adding it only to the response would be a temporary display of the key; what if you need it in the future or in some list filters? So it's good to add it to the model.
If you are thinking that the front-end will have to be informed, just go with it, and you can also set a default value for the "isShopJoined" key; let it be false for the time being.
I have been working on a nodejs API that works with postgres and I'm having an issue testing with expect. Sequelize is returning all fields as a string value no matter what the type is.
When using .toMatchObject, an error is thrown because in the original object and in the database id is a numeric value, but it is returned as a string. The same happens with decimals.
Is there a way to get data how it's stored, not as string?
Is there different way to check if objects match without having to parse each attribute to be same type as original ?
I tried using raw: true as I read online, but it didn't work.
Node version is 9.9, expect version is 22.4.3, sequelize version is 4.37
Testing code that fails is as follows:
it('should create new zgrada', (done) => {
var newZgrada = {
sifra_zgrade: 1006,
ulica: "Derpa derpsona",
kucni_broj: "77",
dodatak_na_kucni_broj: "A"
};
request(app)
.post('/zgrade')
.send(newZgrada)
.expect(200)
.expect((res) => {
expect(res.body.zgrada).toMatchObject(newZgrada);
})
.end((err, res) => {
if (err) {
return done(err);
}
db.zgrada.findAll({
where: {
sifra_zgrade: newZgrada.sifra_zgrade
}
}).then((zgrada) => {
expect(zgrada.length).toBe(1);
expect(zgrada).toMatchObject(newZgrada);
done();
}).catch((e) => done(e));
});
});
Error received is:
1) POST /zgrada
should create new zgrada:
Error: expect(received).toMatchObject(expected)
Object {
"dodatak_na_kucni_broj": "A",
"kucni_broj": "77",
- "sifra_zgrade": 1006,
+ "sifra_zgrade": "1006",
"ulica": "Derpa derpsona",
}
When I swap toMatchObject() with .toBe(), it works, but then I have to go through each attribute separately and cast the numeric ones to strings first, which is not practical for a bigger project with different models and many attributes.
edit: model definition (some attributes are left out but not relevant for error):
module.exports = (sequelize, DataTypes) => {
var Zgrada = sequelize.define('zgrada', {
sifra_zgrade: {
type: DataTypes.NUMERIC(0, 4),
unique: true,
primaryKey: true,
allowNull: false
},
ulica: {
type: DataTypes.STRING,
allowNull: false
},
postanski_broj: {
type: DataTypes.STRING
},
kucni_broj: {
type: DataTypes.STRING
},
dodatak_na_kucni_broj: {
type: DataTypes.STRING
}
  });
  return Zgrada;
};
Post method:
router
.post("/", async (req, res) => {
try {
var body = req.body;
var zgrada = await db.zgrada.create({
sifra_zgrade: body.sifra_zgrade,
ulica: body.ulica,
kucni_broj: body.kucni_broj,
dodatak_na_kucni_broj: body.dodatak_na_kucni_broj,
});
res.send({
zgrada
});
} catch (e) {
res.status(400).send(e);
}
});
Thank you for your help.
Sequelize is returning all fields as a string value no matter what the type is.
Sequelize uses the pg module as a bridge to Postgres. In pg, the Decimal (alias: Numeric) type is returned as a String value by design, for reasons concerning the safety and precision of parsing large numbers in JS. For more information see this issue on the node-postgres GitHub.
Is there a way to get data how it's stored, not as string?
As of Sequelize version 4.32.5, an underlying dependency regarding type parsing was changed and there is an open issue with the same concern as this question linked here.
In short, there isn't much you can do with your current setup.
Other Solutions:
1) Use a different type such as DataTypes.UUID or DataTypes.INTEGER if possible. Especially given that sifra_zgrade seems to be acting as a primary key/id column, it seems likely that the type could be changed.
2) Roll back to Sequelize version 4.23.3 and manually change how pg-types parses decimals, but only if you're operating within a safe decimal precision range:
var pg = require('pg');
pg.types.setTypeParser(1700, 'text', parseFloat);
Is there different way to check if objects match without having to parse each attribute to be same type as original ?
As far as I understand, solutions would have to use underlying type-converting comparisons (==) instead of strict equality comparisons.
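If neither option fits, a pragmatic test-side workaround is to normalize the known numeric fields back to Numbers before the deep comparison (a sketch of my own; the field list is illustrative):

```javascript
// Coerce the named fields from String back to Number before comparing.
function normalizeNumerics(row, numericFields) {
  var out = Object.assign({}, row);
  numericFields.forEach(function (field) {
    if (typeof out[field] === 'string' && out[field].trim() !== '' && !isNaN(out[field])) {
      out[field] = parseFloat(out[field]);
    }
  });
  return out;
}
```

In the test above that would look something like `expect(normalizeNumerics(res.body.zgrada, ['sifra_zgrade'])).toMatchObject(newZgrada);`.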
I'm quite new to Sails.js, but not new to MVC. I feel like I'm really close, but just missing something somewhere.
I generated a controller with a single action "overview", and if I keep it "hello world"- easy, it works. The below code runs just fine:
// My controller
module.exports = {
overview : function(req, res) {
return res.send("<h1>Tossel<h1>");
},
}
However, when I try to get some data from the model and pass it to the controller, I have issues. I've tried it in two ways. I first defined a custom method on the Model and called it from my controller, like below:
// My model, e.g. Orders
module.exports = {
tableName: 'Orders',
attributes: {
id : {
type: 'integer',
primaryKey: true,
unique: true
},
client_id : {
type: 'integer'
}
},
getAllOrders: function(opts, cb) {
Orders.find().exec(function(err, data) {
cb(err, data);
});
}
};
// My controller
module.exports = {
overview : function(req, res) {
Orders.getAllOrders(null, function(err, data) {
return res.send(data);
});
},
}
From what I can see in the docs, that's pretty much how I should do it. I get the below error though. I get this when executing the command in the console, and by calling it from the controller:
TypeError: query.exec is not a function
I then tried doing everything in the controller:
// My controller
module.exports = {
overview : function(req, res) {
Orders.find().exec(function(err, data) {
return res.send(data);
});
},
}
This method didn't actually throw any errors, but it returned nothing. I'm pretty sure there is data in the database, though. Since I wasn't 100% sure, I created an "add" action on the controller, as below, to add data to the table. This results in no errors, but the page just hangs.
module.exports = {
overview : function(req, res) {
Orders.find().exec(function (err, data) {
return res.json(data);
});
},
add : function(req, res) {
var data = {
id: 1,
client_id: 1,
};
Orders.create(data).exec(function (err, data) {
return res.send(data);
});
}
};
Can someone please point me in the right direction here? I find the docs too simplistic: they're good at showing you the easy stuff, but that's all they show.
Also, as a bonus, can someone please tell me what I need to do to be able to make changes and refresh the page to see them? At the moment I have to kill the server and bring it back up again to see the changes. This doesn't seem right; I must be missing something somewhere.
Edit
As per the below comment:
Node v8.10.0
Sails v0.12.14
I specify the table name because it is indeed different from the model name. I changed the actual name of the table and model in the code I posted.
Currently I'm working on a REST API in Node JS & Express. I have to create a link for every resource in the API like the example below.
{
"id": "20281",
"title": "test",
"body": "test",
"date": "2017-11-14 09:01:35",
"_links": {
"self": {
"href": "href"
},
"collection": {
"href": "href"
}
}
},
I know I can handle this in my controller because the req object is available. But it would be better to create a virtual field in my model to create the link on the fly and not save it in the DB.
What would be the best way to do this?
DB: MongoDB
ODM: Mongoose
There are a lot of ways to do this, and you haven't described your DB or ORM or the like, but implied that you have something along those lines with your terminology.
The simplest thing to do would be to just assume that this is not a concern of the storage, but instead it's something that you'd apply globally to the api, so that all routes have it applied. This could be in the form of middleware:
router.use((req, res, next) => {
  if (res.data) { // if you do it this way, none of your routes will actually do a res.send, but instead will put their data into res.data and rely on another middleware to render or send it
    res.data._links = {
      self: calculateSelf(req, res.data),
      collection: calculateCollection(req, res.data)
    };
  }
  next(); // hand off to the middleware that actually sends res.data
});
Given that, those two link calculators could use some standard patterns or regex to figure out what the link should look like generically.
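For instance, the calculators could be as simple as joining a base URL, a collection path, and the resource id (a sketch of my own; the names and URL shape are assumptions):

```javascript
// Build HATEOAS-style self/collection links from a base URL and resource id.
function buildLinks(baseUrl, collectionPath, id) {
  var collection = baseUrl.replace(/\/$/, '') + collectionPath;
  return {
    self: { href: collection + '/' + id },
    collection: { href: collection }
  };
}
```

Deriving the base URL from the request (or configuration) keeps the link shape out of the models entirely.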
Alternatively, if you're using Mongoose, you could override toJSON to populate those links in any models you expect to send down the wire, but that implies that your models should be aware of what the root application URL is, which I don't recommend.
Virtual field implementation:
// use a regular function, not an arrow function, so `this` is the document
YourSchema.virtual('links').get(function () {
  return {
    self: 'http://example.com/api/v1.0/schema/' + this._id,
    collection: 'http://example.com/api/v1.0/schema/'
  };
});
For that to work, you have to pass { virtuals: true } to toObject() or toJSON() or set it as a global mongoose configuration to always show virtuals. As I said before, though, I really wouldn't recommend that as it requires the schemas to have access to and knowledge of the base URL, which can change between environments. If (as your schema implies) the model is representing a web page, then the URL template could be something that is actually relevant to the model, but in most domains that wouldn't be the case and so I'd recommend the middleware approach.
If you are using mongoose you can use virtual properties to create a non persisting fields, Here is an example:
var MySchema = new Schema({
id: Number,
title: String,
body: String,
date: Date
});
MySchema.virtual('links').get(function () {
return {
self: {
href: 'https://docent.cmi.hro.nl/bootb/demo/notes/' + this.id
},
collection: {
href: 'https://docent.cmi.hro.nl/bootb/demo/notes'
}
}
});
var My = mongoose.model('My', MySchema);
Or you can use after find middleware (post hooks), example:
var MySchema = new Schema({
id: Number,
title: String,
body: String,
date: Date
});
// the post 'find' hook receives an array of documents
MySchema.post('find', function(docs) {
  docs.forEach(function(doc) {
    doc.links = {
      self: {
        'href': 'https://docent.cmi.hro.nl/bootb/demo/notes/' + doc.id
      },
      collection: {
        'href': 'https://docent.cmi.hro.nl/bootb/demo/notes/'
      }
    };
  });
});
var My = mongoose.model('My', MySchema);
I'm trying to develop a Node.js backend using express and mongoose.
Over the network there's plenty of examples of how to implement a proper authentication layer, but I couldn't find any example of how to correctly implement an authorization layer.
In my specific case, I'm creating the backend of a multi-user application and I want that every user can only see the data inserted by himself/herself.
I have three models:
User
Category
Document
User owns one or many Categories, Categories contain zero or more Documents.
The CRUD operations are implemented on the following endpoints:
/user/:userid
/user/:userid/category
/user/:userid/category/:categoryid
/user/:userid/category/:categoryid/document
/user/:userid/category/:categoryid/document/:documentid
In the authentication part I set to each request the current logged in user id, so I can check easily that
jsonwebtoken.userId == req.params.userid
And return a 403 error otherwise.
Checking the ownership of categories is quite easy, because each category contains a reference to the user who created them.
var CategorySchema = mongoose.Schema({
name: {
type: String,
required: true,
trim: true
},
user_id: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User',
index: true
}
});
In the Document model, however, I only have a reference to the category it belongs, but I didn't add a reference to the user.
I'm wondering therefore how to proceed with "nested" relationships. Do I need to add a user_id reference to all of them at any depth level? Is there any best practice?
Moreover, is this the right way to do what I need or is there any official/mature library that already does the same?
A NoSQL database gives you the power of embedding your sub-documents (the equivalent of a table in a relational DB) into a single document. So you may consider redesigning your schema to something like:
{
"userId": "",
"categories": [
{
"categoryId": "",
"name": "",
"documents": [
{
"documentId": "",
},
{
"documentId": "",
},
]
},
{
"categoryId": "",
"name": "",
"documents": [
{
"documentId": "",
},
{
"documentId": "",
},
]
}
]
}
This may help you optimize the number of DB queries, but the important thing to note here is that if the number of categories per user and documents per category could grow very large, then this approach would not be good.
Always remember these 6 important rules of thumb for MongoDB schema design:
Favor embedding unless there is a compelling reason not to
Needing to access an object on its own is a compelling reason not to embed it
Arrays should not grow without bound. If there are more than a couple of hundred documents on the “many” side, don’t embed them; if there are more than a few thousand documents on the “many” side, don’t use an array of ObjectID references. High-cardinality arrays are a compelling reason not to embed.
Don’t be afraid of application-level joins
Consider the write/read ratio when denormalizing. A field that will mostly be read and only seldom updated is a good candidate for denormalization.
You want to structure your data to match the ways that your application queries and updates it.
Taken from here
After some tinkering, I ended up with the following middleware.
It basically checks for route parameters in the expected order and checks for coherent memberships.
Not sure if it's the best way of achieving this, but it works:
var Category = require('../category/Category'),
Document = require('../document/Document'),
unauthorizedMessage = 'You are not authorized to perform this operation.',
errorAuthorizationMessage = 'Something went wrong while validating authorizations.',
notFoundMessage = ' not found.';
var isValidMongoId = function (id) {
    return /^[0-9a-fA-F]{24}$/.test(id);
};
var verifyPermissions = function (req, res, next) {
    if (!req.userId) {
        return res.status(403).send({error: 403, message: unauthorizedMessage});
    }
    if (!req.params.userid || !isValidMongoId(req.params.userid)) {
        return next();
    }
    if (req.userId != req.params.userid) {
        return res.status(403).send({error: 403, message: unauthorizedMessage});
    }
    if (!req.params.categoryid || !isValidMongoId(req.params.categoryid)) {
        return next();
    }
    Category.findOne({_id: req.params.categoryid, user_id: req.params.userid}, function (err, category) {
        if (err) {
            return res.status(500).send({error: 500, message: errorAuthorizationMessage});
        }
        if (!category) {
            return res.status(404).send({error: 404, message: 'Category' + notFoundMessage});
        }
        if (!req.params.documentid || !isValidMongoId(req.params.documentid)) {
            return next();
        }
        Document.findOne({_id: req.params.documentid, category_id: req.params.categoryid}, function (err, document) {
            if (err) {
                return res.status(500).send({error: 500, message: errorAuthorizationMessage});
            }
            if (!document) {
                return res.status(404).send({error: 404, message: 'Document' + notFoundMessage});
            }
            // only proceed once all ownership checks have passed
            return next();
        });
    });
};
module.exports = verifyPermissions;