I'm running a Node.js application that connects to a MongoDB database. I did the initial development with the MongoClient library. I'm now finding, however, that without an ODM like Mongoose it is increasingly hard to ensure the shape of the data is correct. The MongoDB documentation shows that I can enforce a schema with a collection validator; one example looks like this:
db.createCollection("students", {
  validator: {
    $jsonSchema: {
      bsonType: "object",
      required: [ "name", "year", "major", "address" ],
      properties: {
        name: {
          bsonType: "string",
          description: "must be a string and is required"
        },
        year: {
          bsonType: "int",
          minimum: 2017,
          maximum: 3017,
          description: "must be an integer in [ 2017, 3017 ] and is required"
        },
        major: {
          enum: [ "Math", "English", "Computer Science", "History", null ],
          description: "can only be one of the enum values and is required"
        },
        gpa: {
          bsonType: [ "double" ],
          description: "must be a double if the field exists"
        },
        address: {
          bsonType: "object",
          required: [ "city" ],
          properties: {
            street: {
              bsonType: "string",
              description: "must be a string if the field exists"
            },
            city: {
              bsonType: "string",
              description: "must be a string and is required"
            }
          }
        }
      }
    }
  }
})
and it solved my problem, except that the validator doesn't carry over if I spin up a new database instance (for example in a Docker container). Does anyone have suggestions for copying validation rules across database instances, or for spinning up a new instance with them already in place? I will rewrite the application at some point to use Mongoose (or another ODM), but in the meantime I'm looking for a solution based on the collection validator.
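One low-tech approach: read each collection's validator out of the existing database and replay it against the new instance at startup. A sketch with the official Node driver (the function name and the two `Db` handles are mine, not from any library):

```javascript
// Sketch: copy every collection's validator from a source Db handle to a
// freshly created target Db handle. `sourceDb` and `targetDb` are Db
// instances obtained from MongoClient.db(...).
async function copyValidators(sourceDb, targetDb) {
  // listCollections() returns each collection's creation options,
  // including any `validator` set via createCollection or collMod.
  const infos = await sourceDb.listCollections().toArray();
  for (const info of infos) {
    const validator = info.options && info.options.validator;
    if (validator) {
      await targetDb.createCollection(info.name, { validator });
    }
  }
}
```

Alternatively, keep the `db.createCollection(...)` calls in a version-controlled init script; the official `mongo` Docker image runs any `.js` files mounted into `/docker-entrypoint-initdb.d/` on first startup, so every new container starts with the validators in place.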
How can I apply schema changes with default values to all the old data in MongoDB?
import mongoose from "mongoose";

interface ITodo {
  title: string;
  description: string;
  by: string;
}

interface todoModelInterface extends mongoose.Model<TodoDoc> {
  build(attr: ITodo): TodoDoc;
}

interface TodoDoc extends mongoose.Document {
  title: string;
  description: string;
  by: string;
}

const todoSchema = new mongoose.Schema({
  title: {
    type: String,
    required: true,
  },
  description: {
    type: String,
    required: true,
  },
  by: {
    type: String,
    required: true,
  },
});

todoSchema.statics.build = (attr: ITodo) => {
  return new Todo(attr);
};

const Todo = mongoose.model<TodoDoc, todoModelInterface>("Todo", todoSchema);

Todo.build({
  title: "some title",
  description: "some description",
  by: "special",
});

Todo.collection.dropIndexes(function () {
  Todo.collection.reIndex(function (finished) {
    console.log("finished re indexing");
  });
});

Todo.collection
  .getIndexes()
  .then((indexes: any) => {
    console.log("indexes:", indexes);
  })
  .catch(console.error);

export { Todo };
Db:
[
  {
    "_id": { "$oid": "62cee1eea60e181e412cb0a2" },
    "title": "one",
    "description": "one desc"
  },
  {
    "_id": { "$oid": "62cee2bd44026b1f85464d41" },
    "title": "one",
    "description": "one desc",
    "by": "alphs"
  },
  {
    "_id": { "$oid": "62cee3c8cf1592205dacda3e" },
    "title": "one",
    "description": "one desc",
    "by": "alphs"
  }
]
Here the old data is still missing the "by" key; similarly, a nested schema change may affect old users. How can we give old data a default value in MongoDB at runtime, without running an update-query migration?
Have you tried setting a default value for "by"? With a default, if old data is missing the value, the default kicks in and is returned instead. Read about Mongoose defaults: Here. I don't know if this is best practice, but we also use this method when the schema changes and we don't want to run an update query.
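To illustrate (the 'unknown' default value is my own placeholder, not from the question): add a default to the schema field, and Mongoose fills it in whenever it casts a stored document that lacks the field. Conceptually:

```javascript
// In the Todo schema from the question, "by" would gain a default:
//   by: { type: String, required: true, default: 'unknown' },
//
// What Mongoose does with that default, reduced to plain JavaScript:
const todoDefaults = { by: 'unknown' };

function applyDefaults(doc, defaults) {
  // Values stored in the document win; missing fields fall back to defaults.
  return { ...defaults, ...doc };
}

const oldDoc = { title: 'one', description: 'one desc' }; // saved before "by" existed
const cast = applyDefaults(oldDoc, todoDefaults);
console.log(cast.by); // 'unknown'
```

Note that Mongoose applies defaults when it instantiates a document, not in the database itself, so raw queries that bypass Mongoose will still see the missing field.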
Main question: How can I delete subdocuments when a document expires? (with mongoose index() method)
Details:
When a user registers, the backend creates workspaces and projects for them.
{
  *user fields*
  "workspaces": [
    {
      "owner": "5f0dc0a6fefaaf1040796f21",
      "projects": [
        {
          "owner": "5f0dc0a6fefaaf1040796f21",
          "workspace": "5f0dc0a6fefaaf1040796f22",
          "title": "EXAMPLE PROJECT",
          "id": "5f0dc0a6fefaaf1040796f24"
        }
      ],
      "title": "Personal",
      "id": "5f0dc0a6fefaaf1040796f22"
    },
    {
      "owner": "5f0dc0a6fefaaf1040796f21",
      "projects": [],
      "title": "Shared with me",
      "id": "5f0dc0a6fefaaf1040796f23"
    }
  ],
  "id": "5f0dc0a6fefaaf1040796f21"
}
Here is one part of the userschema:
const UserSchema = new Schema({
  workspaces: [{ type: Schema.Types.ObjectId, ref: 'Workspace' }],
  confirmed: {
    type: Boolean,
    default: false
  },
  confirmToken: {
    type: String,
    default: ''
  },
  confirmTokenExpires: {
    type: Date,
    default: () => new Date(+new Date() + 60 * 60 * 1000) // 60 minutes
  },
  // *more fields
});
The user has 1 hour to confirm their email address; after that, the user should be deleted together with the subdocuments. The user is deleted now, but the subdocuments aren't.
UserSchema.index(
  { confirmTokenExpires: 1 },
  {
    expireAfterSeconds: 0,
    partialFilterExpression: { confirmed: false }
  }
)
I tried to find a solution, but here I am, hoping :)
Thanks in advance!
MongoDB does not support foreign keys and has no equivalent of cascading deletes. You have at least three options:
1. Create workspaces only after the user is confirmed.
2. Embed workspaces directly within users. Hard to tell whether this is viable without more details about your project.
3. Delete "expired" users manually in a repeating background job, without the expiring index.
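A sketch of the third option (model names follow the question; the queries are standard Mongoose calls, but treat this as a starting point, not a drop-in):

```javascript
// Repeating cleanup job: delete unconfirmed users whose token has expired,
// along with the workspaces and projects that reference them.
async function purgeExpiredUsers(User, Workspace, Project) {
  const expired = await User.find({
    confirmed: false,
    confirmTokenExpires: { $lt: new Date() },
  });
  for (const user of expired) {
    // Delete children first so nothing is orphaned if the job is interrupted.
    await Project.deleteMany({ owner: user._id });
    await Workspace.deleteMany({ owner: user._id });
    await User.deleteOne({ _id: user._id });
  }
  return expired.length;
}

// Schedule from the app's entry point, e.g. every 10 minutes:
// setInterval(() => purgeExpiredUsers(User, Workspace, Project), 10 * 60 * 1000);
```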
I use Express.js and the express-graphql module to create my endpoint and web service.
I am looking for a way to create a custom response in GraphQL. My endpoint simply selects books from the database, and my response is:
{
  "data": {
    "books": [
      {
        "id": "5b5c02beab8dc1182b2e0a03",
        "name": "dasta"
      },
      {
        "id": "5b5c02c0ab8dc1182b2e0a04",
        "name": "dasta"
      }
    ]
  }
}
but I need something like this:
{
  "result": "success",
  "msg": "list ...",
  "data": [
    {
      "id": "5b5c02beab8dc1182b2e0a03",
      "name": "dasta"
    },
    {
      "id": "5b5c02c0ab8dc1182b2e0a04",
      "name": "dasta"
    }
  ]
}
Here is my BookType:
const BookType = new GraphQLObjectType({
  name: 'Book',
  fields: () => ({
    id: { type: GraphQLID },
    name: { type: GraphQLString },
    genre: { type: GraphQLString },
    author_id: { type: GraphQLString },
    author: {
      type: AuthorType,
      resolve(parent, args) {
        return Author.findById(parent.author_id);
      }
    }
  })
});
That's not a legal GraphQL response. Per section 7.1 of the spec, after describing the data, errors, and extensions top-level keys:
... the top level response map must not contain any entries other than the three described above.
You might put this data into extensions; make it an explicit part of your GraphQL API; or simply let "success" be implied by the presence of a result and the absence of errors.
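If you do decide to make the envelope an explicit, application-level part of your API rather than fight the spec, you can wrap the execution result before sending it. A sketch (the envelope fields mirror the question; the function name is mine):

```javascript
// Wrap a GraphQL execution result ({ data, errors }) in the envelope the
// question asks for. This is an application convention, not GraphQL itself.
function envelope(executionResult, msg) {
  if (executionResult.errors && executionResult.errors.length) {
    return { result: 'failure', msg: executionResult.errors[0].message, data: null };
  }
  return { result: 'success', msg, data: executionResult.data };
}
```

Bear in mind that clients and tooling expecting a spec-compliant response won't understand this shape, which is why extensions is usually the safer home for such metadata.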
I might be missing something obvious here, but I've spent hours on this and have been unable to find a solution.
Suppose I have an employee model:
"properties": {
  "name": {
    "type": "string",
    "required": true
  },
  "age": {
    "type": "number",
    "required": true
  }
}
and its child model
"properties": {
  "Code": {
    "type": "string",
    "required": true
  },
  "Desc": {
    "type": "string",
    "required": true
  }
}
How do I create a many to many relationship between them?
You can add accepted properties by passing accepts fields to the remoteMethod options.
Read this page in the documentation.
const fs = require('fs');

module.exports = function(Task) {
  fs.readdir(PATH_PROCESS + PATH_API, (err, o) => {
    for (var c in o) {
      var i = o[c];
      Task[i] = require(PATH_PROCESS + i + ".js").lpb;
      Task.remoteMethod(i, {
        http: { path: '/' + i, verb: 'post' },
        returns: { arg: 'result', type: 'object' },
        accepts: [
          {
            arg: 'params',
            type: 'object',
            required: true,
            http: { source: 'body' },
            description: i + ' params.'
          }
        ]
      });
    }
  });
};
From the docs:
A hasManyThrough relation sets up a many-to-many connection with another model. This relation indicates that the declaring model can be matched with zero or more instances of another model by proceeding through a third model.
Use the slc loopback:relation command to create relations between models. Don't forget to add a through model when prompted (also explained in the docs).
After you create the relation between your models, you will have to synchronize your changes with the database using automigrate() or autoupdate().
Be careful with automigrate, because it will create or re-create your database, which means you can potentially lose your data.
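For reference, a hasManyThrough relation in a model's JSON definition looks roughly like this (the model and key names here are hypothetical; adjust them to your employee and child models):

```json
{
  "name": "Employee",
  "relations": {
    "projects": {
      "type": "hasMany",
      "model": "Project",
      "through": "EmployeeProject",
      "foreignKey": "employeeId"
    }
  }
}
```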
Say I have a JavaScript object (the data) and I want to check whether it conforms to a given schema that I've defined.
Is there a way to do this without turning the schema into a model, creating an instance of that model populated with the data, and running mymodel.validate()?
I'd love to have a Schema(definition).validate(data, callback), but the validate function is defined on the Document class, from what I can tell.
2021 update
Mongoose added that functionality as Model.validate(...) back in 2019 (v5.8.0):
You can:
try {
  await Model.validate({ name: 'Hafez', age: 26 });
} catch (err) {
  err instanceof mongoose.Error.ValidationError; // true
}
One way is to use custom validators: when validation fails, the document is not saved to the database.
Another way is the validate() functionality provided by MongoDB itself, with the same schema you defined.
You can also validate your schema on the MongoDB side (link). For example:
db.createCollection("students", {
  validator: {
    $jsonSchema: {
      bsonType: "object",
      required: [ "name" ],
      properties: {
        name: {
          bsonType: "string",
          description: "must be a string and is required"
        },
        year: {
          bsonType: "int",
          minimum: 2017,
          maximum: 3017,
          description: "must be an integer in [ 2017, 3017 ] and is required"
        }
      }
    }
  }
})
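For a collection that already exists, you can attach or change the validator with the collMod command instead of recreating it. Building the command document in plain JavaScript (the "moderate" validation level, which leaves pre-existing invalid documents alone, is my choice here, not from the question):

```javascript
// Command document for db.runCommand(...) in mongosh (or db.command(...)
// with the Node driver) that adds a $jsonSchema validator to an existing
// "students" collection.
const studentValidator = {
  $jsonSchema: {
    bsonType: 'object',
    required: ['name'],
    properties: {
      name: { bsonType: 'string', description: 'must be a string and is required' },
    },
  },
};

const collModCmd = {
  collMod: 'students',
  validator: studentValidator,
  // 'moderate' applies the rules to inserts and to updates of already-valid
  // documents, without rejecting operations on old invalid documents.
  validationLevel: 'moderate',
};

// In mongosh: db.runCommand(collModCmd)
```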