I'm using Joi for validation in my Node.js REST API. I have a function that validates a specific document, and I'd like to know if it's possible to plug that function into Joi's validation.
Example:
function validateDocument (document) {
  return someCalculation;
}

const schema = Joi.object({
  document: Joi.string().required().valid(validateDocument) // I want something like that
});
const { error, value } = schema.validate(something);
Yes. You can use a Joi extension to call an external function during validation. An example of the format is given below:
const customSchema = Joi.extend(joi => ({
  base: joi.string(),
  name: 'customValidation',
  language: {
    document: 'failed the custom document check',
  },
  rules: [
    {
      // Named "document" rather than "validate" to avoid clashing with
      // the built-in schema.validate() method.
      name: 'document',
      validate(params, value, state, options) {
        // Call the custom function; report an error if it fails,
        // otherwise return the value unchanged.
        if (!validateDocument(value)) {
          return this.createError('customValidation.document', { v: value }, state, options);
        }
        return value;
      },
    },
  ],
}));

// Usage: customSchema.customValidation().document().required()
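If you are on a newer Joi version (v16 or later; an assumption about your setup), the built-in custom() rule is a lighter way to call your own function without writing an extension. A minimal sketch reusing the question's validateDocument:

const Joi = require('joi');

const schema = Joi.object({
  document: Joi.string().required().custom((value, helpers) => {
    // validateDocument is the question's own function.
    if (!validateDocument(value)) {
      return helpers.error('any.invalid'); // flag the value as invalid
    }
    return value; // keep the value unchanged when the check passes
  }, 'custom document validation'),
});

const { error, value } = schema.validate({ document: 'some-document' });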
I have a Joi schema object:
const commonFields = {
  id: joi.string().required(),
  name: joi.string().required().min(5).max(50),
  nif: joi.string().length(9).required(),
  diet: joi.bool().required().strict(),
  vegetarian: joi.bool().required().strict(),
};

const aluno = joi.object({
  ...commonFields,
  num: joi
    .string()
    .required()
    .regex(/^\d{1,4}\/\d{2}$/),
  regime: joi.string().required().valid("externo", "interno"),
});
(...)
When I make a PUT request to update a user I want to ignore the field "id", but when I make a POST request I want to make it required.
I tried the following:
In the schema I added alter() to the "id" field:
id: joi.string().alter({
  post: (schema) => schema.required(),
  put: (schema) => schema.forbidden(),
}),
And in my functions I did this:
async function updateUser(req, res, type, db) {
  try {
    const { error } = requestValidation[type].validate(req.body, {
      context: { method: req.method }
    });
    if (error) {
      return res.status(400).send({ error: error.message });
    }
    const id = req.params.id;
    const { name, nif, vegetarian, diet } = req.body;
(..)
But when I call this function in my PUT endpoint to update a user and add the field id to the request body, it doesn't throw an error like it should. The response should look like this when I add the id to the body:
{
  "error": "\"id\" is not allowed"
}
I want to ignore the id because I want to receive it via req.params.id.
I may not be doing this the best way, but I'm open to suggestions!
The documentation has an example of using tailor() together with alter(): you have to call tailor() with the alteration name before validate(), otherwise the alter() definitions are never applied.
const { error } = requestValidation[type]
  .tailor(req.method.toLowerCase())
  .validate(req.body, {
    context: { method: req.method }
  });
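A minimal, self-contained sketch of how alter() and tailor() fit together, using a trimmed-down version of the aluno schema from the question (the field values are made up):

const joi = require('joi');

const aluno = joi.object({
  id: joi.string().alter({
    post: (schema) => schema.required(),
    put: (schema) => schema.forbidden(),
  }),
  name: joi.string().required().min(5).max(50),
});

// tailor() picks which alteration to apply before validating.
const { error } = aluno.tailor('put').validate({ id: '123', name: 'Maria Silva' });
console.log(error && error.message); // "id" is not allowed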
Fastify has some really awesome JSON schema support. (Link)
However, I now want to use the schemas which I added with fastify.addSchema(..) inside my business logic as well. For example (pseudo code):
schema = fastify.getSchema("schema1")
if (schema.validate(data)) {
  console.log("ok");
} else {
  console.log("not ok");
}
How can I achieve that?
Right now, in Fastify, each route has its own set of validation functions.
These functions exist only because you set them in the { schema: {} } route configuration.
So, in the first place, if you don't set those schemas on a route, you will not be able to access them.
The getSchema function retrieves the schema object, not the compiled validation function.
The relation is not 1:1, because one validation function may use several schemas via the $ref keyword.
The only ways to achieve what you need are to monkey-patch Fastify's internals (highly discouraged) or to open a feature request on the project.
Here is an example; as you can see, you can only get at a route's validation functions inside that route's context, so it is far from being a flexible approach.
const fastify = require('fastify')({ logger: true })
const {
  kSchemaBody: bodySchema
} = require('fastify/lib/symbols')

fastify.post('/', {
  schema: {
    body: {
      $id: '#schema1',
      type: 'object',
      properties: {
        bar: { type: 'number' }
      }
    }
  }
}, async (request, reply) => {
  const schemaValidator = request.context[bodySchema]
  const result = schemaValidator({ bar: 'not a number' })
  if (result) {
    return true
  }
  return schemaValidator.errors
})

fastify.inject({
  method: 'POST',
  url: '/',
  payload: {
    bar: 33
  }
}, (err, res) => {
  console.log(res.json())
})
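The example above is about Fastify's own compiled validators. If all you need in business logic is to check data against a schema you registered with addSchema(), one workaround (not part of the answer above, and assuming Ajv, the validator Fastify itself uses by default) is to compile the schema yourself:

const Ajv = require('ajv')
const ajv = new Ajv()

// getSchema() returns the raw schema object you registered with addSchema().
const schema = fastify.getSchema('schema1')
const validate = ajv.compile(schema)

if (validate(data)) {
  console.log('ok')
} else {
  console.log('not ok', validate.errors)
}

If schema1 pulls in other shared schemas through $ref, those have to be registered on the Ajv instance with ajv.addSchema() as well, which is exactly the 1:N relation mentioned above.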
I have a Node.js project using Mongoose 5.x.
My model has a toJSON method which removes the _id and __v fields perfectly:
mySchema.method("toJSON", function toJSON() {
  const {__v, _id, ...object} = this.toObject();
  return {
    id: _id,
    ...object
  };
});
So when fetching data from the DB:
const data = await myModel.findOne({_id: id});
I get an object that, when serialized to the user with:
res.json(data);
doesn't contain the _id and __v fields, as required.
The problem is when I use lean():
const data = await myModel.findOne({_id: id}).lean();
The data object still contains those fields.
I can remove them manually when using lean(),
but I would prefer to find a way to sanitize the data object in both cases with the same mechanism.
Any suggestions?
Thanks in advance.
Not sure if this is what you want but maybe:
const data = myModel.findOne({_id: id}).lean().then(res => {
  delete res._id
  return res
})
When JSON.stringify() is called on an object, a check is done to see whether it has a property called toJSON. This isn't specific to Mongoose; it works on plain objects too:
const myObj = {
  _id: 123,
  foo: 'bar',
  // `toJSON() {...}` is short for `toJSON: function toJSON() {...}`
  toJSON() {
    // Because "this" is used, you can't use an arrow function.
    delete this._id;
    return this;
  }
};
console.log(JSON.stringify(myObj));
// {"foo":"bar"}
Mongoose doesn't have an option to automatically inject a toJSON function into objects returned by lean(). But that's something you can add.
First, create a function that:
- takes an object with properties,
- listens for when Mongoose runs a find query,
- tells Mongoose that, after the query, it should change the result.
The change: merge the result with the object from step 1.
function mergeWithLeanObjects(props) {
  // Return a function that takes your schema.
  return function(schema) {
    // Before these methods are run on the schema, execute a function.
    schema.pre(['find', 'findOne'], function() {
      // Check for {lean:true}
      if (this._mongooseOptions.lean) {
        // Changes the document(s) that will be returned by the query.
        this.transform(function(res) {
          // [].concat(res) makes sure it's an array.
          [].concat(res).forEach(obj => Object.assign(obj, props));
          return res;
        });
      }
    });
  };
}
Now add it to your schema:
const mySchema = new mongoose.Schema({
  foo: String
});

mySchema.plugin(mergeWithLeanObjects({
  toJSON() {
    delete this._id;
    delete this.__v;
    return this;
  }
}));
Then test it:
const full = await myModel.findOne();
const lean = await myModel.findOne().lean();

console.log(full);
// Logs a Mongoose document, all properties:
// { _id: new ObjectId("62a8b39466768658e7333154"), foo: 'bar', __v: 1 }

console.log(JSON.stringify(full));
// Logs a JSON string, all properties:
// {"_id":"62a8b39466768658e7333154","foo":"bar","__v":1}

console.log(lean);
// Logs a plain object, all properties:
// { _id: new ObjectId("62a8b39466768658e7333154"), foo: 'bar', __v: 1, toJSON: [Function: toJSON] }

console.log(JSON.stringify(lean));
// Logs a JSON string, filtered properties:
// {"foo":"bar"}
If you want to re-use the plugin with the same settings on multiple schemas, just save the function that mergeWithLeanObjects returns somewhere.
// file: clean.js
module.exports = mergeWithLeanObjects({
  toJSON() {
    delete this._id;
    delete this.__v;
    return this;
  }
});

// file: some-schema.js
schema1.plugin(require('./clean.js'));

// file: some-other-schema.js
schema2.plugin(require('./clean.js'));
There's also mongoose.plugin() to add the function to all schemas.
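For instance, applying it globally could look like this (a sketch; global plugins need to be registered before the schemas and models that should use them are created):

const mongoose = require('mongoose');

// Register the plugin for every schema created from here on.
mongoose.plugin(require('./clean.js'));

const mySchema = new mongoose.Schema({ foo: String });
const MyModel = mongoose.model('MyModel', mySchema);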
Try this to retrieve the _id from a document
myModel.findOne({_id: id}, function(err, doc) {
  if (err)
    return 'do something with this err'
  console.log(doc._id)
})
I'm writing a GraphQL project with Express. I have defined the User object type like this:
const User = new GraphQLObjectType({
  name: 'User',
  fields: () => ({
    name: {
      type: new GraphQLNonNull(GraphQLString),
    },
    friends: {
      type: HistoricalPerformanceEvaluation,
      async resolve(user) {
        return db.users.findFriendsFor(user.id);
      },
    },
  }),
});
However, I'd like to use the GraphQL schema language to create the schema and define it like this:
type User {
  name: String
  friends: [User]!
}
Where should I write the resolver now? I'm not using Apollo.
You can use the addResolveFunctionsToSchema function from 'graphql-tools' to attach resolver functions to a schema:
import { addResolveFunctionsToSchema } from 'graphql-tools';
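A rough sketch of how that could look for the User type above, assuming graphql-tools v4 or later (where addResolveFunctionsToSchema takes an options object; older releases took positional arguments). db.users.findFriendsFor is the question's own data layer, while the db import path and the Query.user lookup are made up for illustration:

import { buildSchema } from 'graphql';
import { addResolveFunctionsToSchema } from 'graphql-tools';
import db from './db'; // the question's data-access module (path assumed)

const schema = buildSchema(`
  type User {
    name: String
    friends: [User]!
  }

  type Query {
    user(id: ID!): User
  }
`);

const resolvers = {
  User: {
    friends: (user) => db.users.findFriendsFor(user.id),
  },
  Query: {
    user: (root, { id }) => db.users.findById(id), // hypothetical lookup
  },
};

addResolveFunctionsToSchema({ schema, resolvers });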
Using the reference graphql-js library, you can pass a root object to the GraphQL execution. Any top-level queries and mutations are looked up on this object, and further nested query fields are looked up on the objects returned from that, and so on.
The example on the graphql-js front page includes this object:
// The root provides a resolver function for each API endpoint
var root = {
  hello: () => {
    return 'Hello world!';
  },
};
That object is then passed as a parameter to the graphql entry point.
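For completeness, a small sketch of that wiring with graphql-js (shown with the options-object form of graphql(); older releases also accept positional arguments):

const { graphql, buildSchema } = require('graphql');

const schema = buildSchema(`
  type Query {
    hello: String
  }
`);

// The root provides a resolver function for each top-level field.
const root = {
  hello: () => 'Hello world!',
};

graphql({ schema, source: '{ hello }', rootValue: root })
  .then((result) => console.log(result.data)); // { hello: 'Hello world!' }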
I'm testing a Mongoose model's validation. While trying to mock a validator, the model still holds a reference to the original function, so the validation keeps calling the original one.
I want to test that the validator function gets called; however, since the validator hits the database, I need to mock it.
This is my model:
const mongoose = require('mongoose')
const { Schema } = mongoose
const { hasRequiredCustoms } = require('../utils/validators')

const ProductSchema = new Schema({
  customs: {
    type: [String],
    validate: hasRequiredCustoms // <- This is the validator
  }
})

const Product = mongoose.model('Product', ProductSchema)
module.exports = Product
The original validators:
module.exports = {
  hasRequiredCustoms(val) {
    console.log('Original')
    // validation logic
    return result
  },
  // etc...
}
This is the mock for validators:
const validators = jest.genMockFromModule('../validators')

function hasRequiredCustoms (val) {
  console.log('Mock')
  return true
}

validators.hasRequiredCustoms = hasRequiredCustoms
module.exports = validators
And the test:
test('Should be invalid if required customs missing: price', done => {
  jest.mock('../../utils/validators')

  function callback(err) {
    if (!err) done()
  }

  const m = new Product( validProduct )
  m.validate(callback)
})
Every time I run the tests, the console logs "Original". Why does the reference still go back to the original module? It seems like I'm missing some essential concept of how require works, or of the way Mongoose stores references to validators.
Thanks for the help.