What is the right way to validate incoming data on the server side?
I'm using lodash for simple checks like isObject or isArray, and validator for cases where I need to check, say, whether a string isEmail. But all of this looks awkward, and I'm not sure how much it will hurt performance.
There should be a more elegant way to validate incoming data.
One way to do it would be to use schema-inspector.
It's a module meant to sanitize and validate JSON objects against a schema description.
Here is an example from the GitHub README:
var inspector = require('schema-inspector');
// Data that we want to sanitize and validate
var data = {
firstname: 'sterling ',
lastname: ' archer',
jobs: 'Special agent, cocaine Dealer',
email: 'NEVER!',
};
// Sanitization Schema
var sanitization = {
type: 'object',
properties: {
firstname: { type: 'string', rules: ['trim', 'title'] },
lastname: { type: 'string', rules: ['trim', 'title'] },
jobs: {
type: 'array',
splitWith: ',',
items: { type: 'string', rules: ['trim', 'title'] }
},
email: { type: 'string', rules: ['trim', 'lower'] }
}
};
// Let's update the data
inspector.sanitize(sanitization, data);
/*
data is now:
{
firstname: 'Sterling',
lastname: 'Archer',
jobs: ['Special Agent', 'Cocaine Dealer'],
email: 'never!'
}
*/
// Validation schema
var validation = {
type: 'object',
properties: {
firstname: { type: 'string', minLength: 1 },
lastname: { type: 'string', minLength: 1 },
jobs: {
type: 'array',
items: { type: 'string', minLength: 1 }
},
email: { type: 'string', pattern: 'email' }
}
};
var result = inspector.validate(validation, data);
if (!result.valid)
console.log(result.format());
/*
Property #.email: must match [email], but is equal to "never!"
*/
The sanitization schema is meant to "clean" your JSON before validating it (setting optional values, trying to convert numbers to strings, etc.).
The validation schema describes the properties your JSON should respect.
You then call inspector.validate to check that everything is fine.
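For incoming HTTP requests specifically, the same two calls can be wired into a route handler. Below is a minimal sketch (not from the README), assuming an Express app and the sanitization and validation schemas defined above; the route path is arbitrary.
var express = require('express');
var inspector = require('schema-inspector');

var app = express();
app.use(express.json());

app.post('/users', function (req, res) {
  // Clean the payload in place, then validate it
  inspector.sanitize(sanitization, req.body);
  var result = inspector.validate(validation, req.body);

  if (!result.valid) {
    // Reject the request with the human-readable error report
    return res.status(400).send(result.format());
  }

  // req.body is now sanitized and valid
  res.status(201).send(req.body);
});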
Related
I am getting an error when the allRequired keyword is set; if it's not given there is no error. The documentation says this keyword is to be given to object types. When it's not given, the validation passes even for wrong input (tested via Postman).
Here is the schema; it's exported from another file:
const ticketSchema = {
type: 'object',
properties: {
firstName: {
type: 'string',
minLength: 1,
},
lastName: {
type: 'string',
minLength: 1,
},
ticketId: {
type: 'string',
minLength: 6,
maxLength: 6,
},
},
allRequired: true, // <---- error occurs here
};
export default ticketSchema;
Error message:
Error: strict mode: unknown keyword: "allRequired"
Validation
const ajv = new Ajv();
const validateTicket = ajv.compile(ticketSchema);
const ticket = {
'first-name': firstName,
'last-name': lastName,
'route-id': routeID,
};
const valid = validateTicket(ticket);
if (!valid) {
res.status(422).send(validateTicket.errors);
return;
}
There is no allRequired keyword in JSON Schema. I looked it up in the specification, and it is not there. So, which documentation are you referring to?
There is no allRequired property as far as I know.
But if you want to make all the properties required, you need to specify a keyword called required. It is an array whose elements are the names of the required fields. So, in your case, it would be:
const ticketSchema = {
type: 'object',
properties: {
firstName: {
type: 'string',
minLength: 1,
},
lastName: {
type: 'string',
minLength: 1,
},
ticketId: {
type: 'string',
minLength: 6,
maxLength: 6,
},
},
required: ['firstName', 'lastName', 'ticketId']
};
Update:
allRequired is not part of the JSON Schema specification. It comes from the ajv-keywords module. You need to register it as follows:
const Ajv = require('ajv');
const ajv = new Ajv();
require('ajv-keywords')(ajv);
const validateTicket = ajv.compile(ticketSchema);
const ticket = {
'first-name': firstName,
'last-name': lastName,
'route-id': routeID,
};
const valid = validateTicket(ticket);
if (!valid) {
res.status(422).send(validateTicket.errors);
return;
}
Update
There were other mistakes too. The key names in the data must match the ones given in the schema, otherwise those properties are simply ignored and the validation won't do what you expect (a corrected version of the question's handler follows the example).
Example:
const scheme = {
type: 'object',
properties: {
firstName: {
type: 'string',
},
lastName: {
type: 'string',
},
},
};
// correct - key names are the same
data = {
firstName: 'Alex',
lastName: 'Smith',
};
// incorrect - key names aren't the same
data = {
'first-name': 'Alex',
'last-name': 'Smith',
};
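Applied to the question's handler, the ticket object has to be built with the schema's property names before it is passed to the compiled validator. A minimal sketch, assuming firstName, lastName and routeID hold the incoming values and that routeID is what should end up in ticketId:
const ticket = {
  firstName,          // was 'first-name'
  lastName,           // was 'last-name'
  ticketId: routeID,  // was 'route-id'; the schema property is ticketId
};

const valid = validateTicket(ticket);
if (!valid) {
  res.status(422).send(validateTicket.errors);
  return;
}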
I have created a Mongoose schema as below:
const quesSchema = new mongoose.Schema({
acRate: {
type: Number,
},
difficulty: {
type: String,
enum: {
values: ['EASY', 'MEDIUM', 'HARD'],
message: 'Difficulty should be either EASY, MEDIUM or HARD',
},
},
title: {
type: String,
},
titleSlug: {
type: String,
},
topicTags: [
{
name: {
type: String,
},
},
],
});
In the data, the difficulty may arrive in any case. I could not figure out an easy way to make the enum case-insensitive. I could write a custom validator that converts the input value to lowercase/uppercase, but is there any other solution?
Yes, you can handle this with the built-in case setters: add lowercase: true (or, since your enum values are uppercase, uppercase: true) to the field in your schema.
Reference
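A minimal sketch against the question's schema: Mongoose's lowercase/uppercase options are setters that run before validation, so the value is normalized before the enum check. Because the enum values here are uppercase, uppercase: true is the matching choice; lowercase: true works the same way if the enum values are written in lowercase.
difficulty: {
  type: String,
  uppercase: true, // the setter runs first, so 'easy', 'Easy' and 'EASY' all become 'EASY'
  enum: {
    values: ['EASY', 'MEDIUM', 'HARD'],
    message: 'Difficulty should be either EASY, MEDIUM or HARD',
  },
},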
I don't know how to handle this in a GraphQL service.
I am migrating a service from a REST (Express) API to an Apollo API.
My current Mongoose schema is like this:
const PostSchema = new mongoose.Schema({
title: { type: String, required: true },
body: { type: String },
uid: { type: mongoose.Schema.Types.ObjectId, ref: 'User', required: true },
privacy: { type: String, enum: ['everyone','followers','onlyme'], default: 'everyone' },
});
const UserSchema = new mongoose.Schema({
name: { type: String, required: true },
followers: [{ type: mongoose.Schema.Types.ObjectId, ref: 'User' }],
});
In the Express environment I used to fetch the post with the user populated and then apply a privacy-control function based on the user's followers.
But in Apollo I can't access the post after it is populated. Where can I apply a directive or something similar after the post's fields are resolved?
GraphQL schema:
type User {
_id: ID!
name: String
followers: [ID]
}
type Post {
_id: ID!
title: String!
body:String
uid: User #privacy
}
extend type Query{
post(_id: String):Post
posts:[Post]
}
post resolver:
{
Query: {
post: (p, a, ctx, info) => {
return ctx.modules.post.getPost(a._id)
}
},
Post: {
uid: (p, a, ctx, info) => {
return ctx.modules.user.getUser(p.uid);
}
},
}
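One possible place to apply the check, offered only as an assumption rather than something stated in the thread, is the Query.post resolver itself: fetch the post, fetch its author, and run the existing privacy-control function before returning the post. applyPrivacy and ctx.user are hypothetical names here.
const resolvers = {
  Query: {
    post: async (p, a, ctx) => {
      const post = await ctx.modules.post.getPost(a._id);
      const author = await ctx.modules.user.getUser(post.uid);
      // applyPrivacy stands in for the privacy-control function from the REST version
      return applyPrivacy(post, author, ctx.user) ? post : null;
    }
  }
};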
I am new to Mongoose and would like to know if it is possible to add validators on the fly to some parameters depending on the query. For example, I have a schema like below:
var user = new Schema({
name: { type: String, required: true },
email: { type: String, required: true },
password: { type: String, required: true },
city: { type: String },
country: { type: String }
});
For a simple registration I require users to give the name, the email and the password, and the schema above is fine for that. Later on, however, I would like to require users to give the city and the country as well. Is it possible, for example, to update a user's document with city and country marked as required? I would like to avoid duplicating the user schema like below:
var userUpdate = new Schema({
name: { type: String },
email: { type: String },
password: { type: String },
city: { type: String, required: true },
country: { type: String, required: true }
});
What you would need to do in this case is keep one schema and turn required into a function that allows null as well as a String:
var user = new Schema({
name: {
type: String,
required: true
},
email: {
type: String,
required: true
},
password: {
type: String,
required: true
},
city: {
type: String,
required: function() {
return typeof this.city === 'undefined' || (this.city != null && typeof this.city != 'string')
}
}
});
You can extract this into a standalone function which you can then reuse for country etc.
What this does is make the field required while still allowing it to be set to null. That way it can be null in the beginning and be set later on.
Here is the doc on required.
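A short usage sketch (assuming mongoose and the user schema above are in scope; the model and the field values are made up) showing the effect: a document where city is explicitly null validates, while one where city is missing entirely fails the required check.
const User = mongoose.model('User', user);

const withNull = new User({ name: 'Ann', email: 'a@b.c', password: 'secret', city: null });
console.log(withNull.validateSync()); // undefined: city may explicitly be null

const missing = new User({ name: 'Ann', email: 'a@b.c', password: 'secret' });
console.log(missing.validateSync().errors.city.message); // fails: city is undefined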
As far as I know, no, it is not possible.
Mongoose schemas are set per collection, not per document.
You could have two Mongoose models pointing to the same collection with different schemas, but that would effectively mean having a duplicated schema.
Personally, in your situation, I would create a single home-made schema-like data structure and a function which, when fed that data structure, creates the two versions of the schema.
For example (a short usage sketch follows the code):
const schemaStruct = {
base : {
name: { type: String, required: true },
email: { type: String, required: true },
password: { type: String, required: true },
city: { type: String },
country: { type: String }
},
addRequired: ["city", "country"]
}
function SchemaCreator(schemaStruct) {
const user = new Schema(schemaStruct.base);
// copy each field definition so the base structure itself is not mutated
const schemaCopy = {};
Object.keys(schemaStruct.base).forEach(key => {
schemaCopy[key] = Object.assign({}, schemaStruct.base[key]);
});
schemaStruct.addRequired.forEach(key => {
schemaCopy[key].required = true;
});
const updateUser = new Schema(schemaCopy);
return [user, updateUser];
}
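A possible way to use the helper (the model names and the shared collection name are assumptions), following the idea of two models pointing at the same collection:
const [userSchema, updateUserSchema] = SchemaCreator(schemaStruct);

// Both models read and write the same 'users' collection.
const User = mongoose.model('User', userSchema, 'users');
const UserUpdate = mongoose.model('UserUpdate', updateUserSchema, 'users');

// Registration model: city and country are optional.
new User({ name: 'Ann', email: 'a@b.c', password: 'secret' }).validateSync(); // undefined

// Update model: city and country are now required.
new UserUpdate({ name: 'Ann', email: 'a@b.c', password: 'secret' }).validateSync(); // ValidationError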
Mongoose schema:
const postSchema = new mongoose.Schema({
title: { type: String },
content: { type: String },
comments: {
count: { type: Number },
data: [{
author: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User'
},
content: { type: String },
created: { type: Date },
replies: [{
author: {
type: mongoose.Schema.Types.ObjectId,
ref: 'User'
},
content: { type: String },
created: { type: Date },
}]
}]
}
})
I don't want to normalize comments and replies because:
I will always need post comments together (read in a single go)
I will never use comments to query or sort anything
It's so much easier to create, update and delete post with comments (document locking)
If I normalized it, I would have to:
Get post from database
Get comments from database
Get replies for each comment
This is one of the biggest strengths of MongoDB: grouping your data specifically to suit your application's data needs. But GraphQL and Relay.js don't seem to support that?
Question:
Is there a way to get comments nested inside of a single post object (or at least get replies nested inside of a single comment) in order to get the whole thing in a single read?
GraphQL schema:
const postType = new GraphQLObjectType({
name: 'Post',
fields: () => ({
id: globalIdField('Post'),
title: { type: GraphQLString },
content: { type: GraphQLString },
comments: {
// ?????
}
}),
interfaces: [nodeInterface],
})
Update:
const postType = new GraphQLObjectType({
name: 'Post',
fields: () => ({
id: globalIdField('Post'),
title: { type: GraphQLString },
content: { type: GraphQLString },
commentCount: { type: GraphQLInt },
comments: { type: new GraphQLList(commentType) }
}),
interfaces: [nodeInterface]
})
const commentType = new GraphQLObjectType({
name: 'Comment',
fields: () => ({
content: { type: GraphQLString },
created: { type: GraphQLString },
author: {
type: userType,
resolve: async (comment) => {
return await getUserById(comment.author)
}
},
replies: {
type: new GraphQLList(commentType),
resolve: (comment) => {
// Log is below
console.log(comment)
// Error only occurs if I return it!
return comment.replies
}
}
})
})
Log (comment):
{
author: 'user-1',
created: '05:35',
content: 'Wow, this is an awesome comment!',
replies:
[{
author: 'user-2',
created: '11:01',
content: 'Not really..',
},
{
author: 'user-1',
created: '11:03',
content: 'Why are you so salty?',
}]
}
This is one of the biggest strengths of MongoDB: grouping your data specifically to suit your application's data needs. But GraphQL and Relay.js don't seem to support that?
GraphQL does support what you're trying to do. Before going into the details, we need to keep in mind that GraphQL is about exposing data, NOT about how we store it. We can store the data however we like, normalized or not. We just need to tell GraphQL how to fetch the data for the types we define in the GraphQL schema.
Is there a way to get comments nested inside of a single post object (or at least get replies nested inside of a single comment) in order to get the whole thing in a single read?
Yes, it's possible. You just define the comments as a list. However, GraphQL does not allow arbitrarily shaped fields in an object type, so we need to define separate GraphQL types for the nested parts. In your Mongoose schema, comments is an object holding a count of comments and the comments data, and the data property is a list of yet another kind of object. So we need to define two GraphQL object types, Comment and CommentList. The code looks like this:
const commentType = new GraphQLObjectType({
name: 'Comment',
fields: () => ({
author: { type: GraphQLString },
content: { type: GraphQLString },
created: { type: GraphQLString },
replies: { type: new GraphQLList(commentType) },
}),
});
const commentListType = new GraphQLObjectType({
name: 'CommentList',
fields: () => ({
count: {
type: GraphQLInt,
// post.comments is { count, data } in the Mongoose schema above
resolve: (comments) => comments.count,
},
data: {
type: new GraphQLList(commentType),
resolve: (comments) => comments.data,
},
}),
});
const postType = new GraphQLObjectType({
name: 'Post',
fields: () => ({
id: globalIdField('Post'),
title: { type: GraphQLString },
content: { type: GraphQLString },
comments: {
type: commentListType,
resolve: (post) => post.comments,
}
}),
interfaces: [nodeInterface],
});
I don't want to normalize comments and replies because:
I will always need post comments together (read in a single go)
I will never use comments to query or sort anything
It's so much easier to create, update and delete post with comments (document locking)
Defining the post and comment GraphQL object types in the above way lets you:
read all post comments together in a single query (see the sketch after this list)
avoid using comments for querying, since the Comment type does not even have an id field
implement comment manipulation however you like with your preferred database; GraphQL has nothing to do with it
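To illustrate the single-read point, here is a hypothetical query execution; schema is assumed to be a GraphQLSchema whose query root exposes a post field returning postType (that part is not shown in the thread), and the id value is a placeholder.
const { graphql } = require('graphql');

const source = `
  {
    post(id: "some-post-id") {
      title
      comments {
        count
        data {
          content
          replies { content }
        }
      }
    }
  }
`;

// One round trip returns the post together with its nested comments and replies.
graphql({ schema, source }).then((result) => {
  console.log(JSON.stringify(result.data, null, 2));
});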