I am starting to learn Sails.js and I want to know if there is a simpler way to set the default value in a model from a session variable? I am using Waterlock to do authentication and setting the user_id in a session variable like req.session.user_id. I have a message model, I want to default the 'from' field to this session variable. Is there a way to do this in Sails.js?
If you are using Sails' default ORM, Waterline, then model attributes have a defaultsTo option, and the supplied value may be a function. You can look at the Waterline documentation.
Sample Model
module.exports = {
  attributes: {
    description: {
      type: 'string',
      defaultsTo: 'No description.'
    },
    email: {
      type: 'email',
      required: true,
      unique: true
    },
    alias: {
      type: 'string',
      defaultsTo: function () {
        return this.email;
      }
    }
  }
};
If the supplied value is a function, then the call to that function is bound to the values passed to create.
For Model.create(values)..., if alias is null/undefined, then alias = defaultsTo.call(values) is used to get the value for alias.
So to use req.session.user_id you may have to include it during create.
Model.create({session: req.session, ...}). I am not really sure, since I do not use Waterlock, and I don't think this is the best way to go about it.
From my experience these extra values are ignored when using adapters such as sails-mysql, but if you are using sails-disk for development, then you might see that these extra values are persisted. You may want to delete them before you persist.
beforeCreate: function (values, cb) {
  delete values['session'];
  cb();
}
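Putting the pieces above together, a minimal sketch might look like the following. The Message model, its field names, and the controller call are assumptions on my part, and it relies on defaults being applied (via defaultsTo.call(values)) before beforeCreate runs:

// api/models/Message.js -- a minimal sketch; field names are made up
module.exports = {
  attributes: {
    body: { type: 'string' },
    from: {
      type: 'string',
      // `this` is bound to the values passed to create(),
      // so the session has to be included in those values
      defaultsTo: function () {
        return this.session && this.session.user_id;
      }
    }
  },

  // strip the temporary session value so it is not persisted
  beforeCreate: function (values, cb) {
    delete values.session;
    cb();
  }
};

// In a controller action (hypothetical):
// Message.create({ body: req.param('body'), session: req.session })
//   .exec(function (err, message) { /* ... */ });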
Related
Suppose the following User Schema in MongoDB (using Mongoose/Nodejs):
var UserSchema = new Schema({
  email: {
    type: String,
    unique: true,
    required: 'User email is required.'
  },
  password: {
    type: String,
    required: 'User password is required.'
  },
  token: {
    type: String,
    unique: true,
    default: hat
  },
  created_at: {
    type: Date,
    default: Date.now
  }
});

// mongoose-encrypt package
UserSchema.plugin(encrypt, {
  secret: 'my secret',
  encryptedFields: ['email', 'password', 'token', 'created_at']
});
Now assume I want to return the user object from an API endpoint. In fact, suppose I want to return user objects from multiple API endpoints. Possibly as a standalone object, possibly as a related model.
Obviously, I don't want password to be present in the returned structure - and in many cases I wouldn't want token to be returned either. I could do this manually on every endpoint, but I'd prefer a no-thought solution - being able to simply retrieve the user, end of story, and not worry about unsetting certain values after the fact.
I mainly come from the world of Laravel, where things like API Resources (https://laravel.com/docs/5.6/eloquent-resources) exist. I already tried implementing the mongoose-hidden package (https://www.npmjs.com/package/mongoose-hidden) to hide the password and token, but unfortunately it seems as though that breaks the encryption package I'm using.
I'm new to Nodejs and MongoDB in general - is there a good way to implement this?
How to protect the password field in Mongoose/MongoDB so it won't return in a query when I populate collections?
You can use this: Users.find().select("-password"),
but this only matters when you send the queried item back to the user (res.json()...), so you can do your manipulations with this field included and then remove it from the user before you send it back (this is using the promise approach, which is the best practice).
And if you want this behaviour to be the default, you can add "select: false" to the password field in the schema object.
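For example, a minimal sketch of both pieces (the model and field names here are just placeholders):

var UserSchema = new Schema({
  email: { type: String, required: true },
  // excluded from query results unless explicitly selected
  password: { type: String, required: true, select: false }
});

// password is omitted here
Users.findById(id).then(function (user) {
  res.json(user);
});

// explicitly opt back in when you actually need it (e.g. for a login check)
Users.findById(id).select('+password').then(function (user) {
  // ...
});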
Hope this helps :)
I was wondering if there is a way to force a unique collection entry, but only if the entry is not null.
Sample schema:
var UsersSchema = new Schema({
  name: {type: String, trim: true, index: true, required: true},
  email: {type: String, trim: true, index: true, unique: true}
});
'email' in this case is not required but if 'email' is saved I want to make sure that this entry is unique (on a database level).
Empty entries seem to get the value 'null', so every entry with no email crashes with the 'unique' option (if there is a different user with no email).
Right now I'm solving it on an application level, but would love to save that db query.
thx
As of MongoDB v1.8+ you can get the desired behavior of ensuring unique values but allowing multiple docs without the field by setting the sparse option to true when defining the index. As in:
email : {type: String, trim: true, index: true, unique: true, sparse: true}
Or in the shell:
db.users.ensureIndex({email: 1}, {unique: true, sparse: true});
Note that a unique, sparse index still does not allow multiple docs with an email field with a value of null, only multiple docs without an email field.
See http://docs.mongodb.org/manual/core/index-sparse/
tl;dr
Yes, it is possible to have multiple documents with a field set to null or not defined, while enforcing unique "actual" values.
requirements:
MongoDB v3.2+.
Knowing your concrete value type(s) in advance (e.g., always a string or object when not null).
If you're not interested in the details, feel free to skip to the implementation section.
longer version
To supplement #Nolan's answer, starting with MongoDB v3.2 you can use a partial unique index with a filter expression.
The partial filter expression has limitations. It can only include the following:
equality expressions (i.e. field: value or using the $eq operator),
$exists: true expression,
$gt, $gte, $lt, $lte expressions,
$type expressions,
$and operator at the top-level only
This means that the trivial expression {"yourField": {$ne: null}} cannot be used.
However, assuming that your field always uses the same type, you can use a $type expression.
{ field: { $type: <BSON type number> | <String alias> } }
MongoDB v3.6 added support for specifying multiple possible types, which can be passed as an array:
{ field: { $type: [ <BSON type1> , <BSON type2>, ... ] } }
which means that the value is allowed to be any one of the listed types when not null.
Therefore, if we want to allow the email field in the example below to accept either string or, say, binary data values, an appropriate $type expression would be:
{email: {$type: ["string", "binData"]}}
implementation
mongoose
You can specify it in a mongoose schema:
const UsersSchema = new Schema({
  name: {type: String, trim: true, index: true, required: true},
  email: {
    type: String,
    trim: true,
    index: {
      unique: true,
      partialFilterExpression: {email: {$type: "string"}}
    }
  }
});
or directly add it to the collection (which uses the native node.js driver):
User.collection.createIndex("email", {
  unique: true,
  partialFilterExpression: {
    "email": {
      $type: "string"
    }
  }
});
native mongodb driver
using collection.createIndex
db.collection('users').createIndex({
  "email": 1
}, {
  unique: true,
  partialFilterExpression: {
    "email": {
      $type: "string"
    }
  }
},
function (err, results) {
  // ...
});
mongodb shell
using db.collection.createIndex:
db.users.createIndex({
  "email": 1
}, {
  unique: true,
  partialFilterExpression: {
    "email": {$type: "string"}
  }
})
This will allow inserting multiple records with a null email, or without an email field at all, but not with the same email string.
Just a quick update to those researching this topic.
The selected answer will work, but you might want to consider using partial indexes instead.
Changed in version 3.2: Starting in MongoDB 3.2, MongoDB provides the option to create partial indexes. Partial indexes offer a superset of the functionality of sparse indexes. If you are using MongoDB 3.2 or later, partial indexes should be preferred over sparse indexes.
More doco on partial indexes: https://docs.mongodb.com/manual/core/index-partial/
Actually, only the first document where the "email" field does not exist will get saved successfully. Subsequent saves where "email" is not present will fail with an error (see the code snippet below). For the reason, look at the official MongoDB documentation on Unique Indexes and Missing Keys at http://www.mongodb.org/display/DOCS/Indexes#Indexes-UniqueIndexes.
// NOTE: Code to executed in mongo console.
db.things.ensureIndex({firstname: 1}, {unique: true});
db.things.save({lastname: "Smith"});
// Next operation will fail because of the unique index on firstname.
db.things.save({lastname: "Jones"});
By definition, a unique index allows any given value to be stored only once. If you consider null as one such value, it can only be inserted once! You are correct in your approach of ensuring and validating it at the application level; that is how it can be done.
You may also like to read this http://www.mongodb.org/display/DOCS/Querying+and+nulls
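If you do keep the application-level check, a rough sketch of it might look like the following (this assumes a Users model compiled from the schema above; note that this check is not race-safe, so the sparse or partial unique index from the other answers is the more robust option):

// Only enforce uniqueness when an email is actually provided.
function createUser(data, done) {
  if (!data.email) {
    return Users.create(data, done);
  }
  Users.findOne({ email: data.email }, function (err, existing) {
    if (err) return done(err);
    if (existing) return done(new Error('Email already in use'));
    Users.create(data, done);
  });
}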
Is it possible to limit available displayed options in a relationship type of KeystoneJS by specifying a value condition?
Basically, a model has two sets of array fields. Instead of letting the admin user select any item from the field, I would like to restrict the options to only the items that are part of a specific collection _id.
Not sure if this is exactly the feature you're looking for, but you can specify a filter option on the Relationship field as an object and it will filter results so only those that match are displayed.
Each property in the filter object should either be a value to match in the related schema, or it can be a dynamic value matching the value of another path in the schema (you prefix the path with a :).
For example:
User Schema
User.add({
  state: { type: Types.Select, options: 'enabled, disabled' }
});
Post Schema
// Only allow enabled users to be selected as the author
Post.add({
  author: { type: Types.Relationship, ref: 'User', filter: { state: 'enabled' } }
});
Or for a dynamic example, imagine you have a role setting for both Posts and Users. You only want to match authors who have the same role as the post.
User Schema
User.add({
  userRole: { type: Types.Select, options: 'frontEnd, backEnd' }
});
Post Schema
Post.add({
  postRole: { type: Types.Select, options: 'frontEnd, backEnd' },
  // only allow users with the same role value as the post to be selected
  author: { type: Types.Relationship, ref: 'User', filter: { userRole: ':postRole' } }
});
Note that this isn't actually implemented as back-end validation; it is only implemented in the Admin UI. So it's more of a usability enhancement than a restriction.
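If you also need this enforced on the back end, one possible approach (not something KeystoneJS provides out of the box, just a sketch using a Mongoose pre-save hook on the underlying schema, and reusing the state field from the first example) could be:

var keystone = require('keystone');

// Hypothetical back-end check mirroring the Admin UI filter above.
// Post.schema is the underlying Mongoose schema of the Keystone list.
Post.schema.pre('save', function (next) {
  var post = this;
  if (!post.author) return next();

  keystone.list('User').model.findById(post.author, function (err, user) {
    if (err) return next(err);
    if (!user || user.state !== 'enabled') {
      return next(new Error('Author must be an enabled user'));
    }
    next();
  });
});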
To expand on Jed's answer, I think the correct property (at least in the latest version of KeystoneJS 0.2.22) is 'filters' instead of 'filter'. 'filter' doesn't work for me.
I'm using Sails.js (0.9.8) and MongoDB (via the sails-mongo adaptor) to create a collection of pages that can be positioned in a tree-view. I would like to store the path of a page in an array of UUIDs.
My model:
module.exports = {
  schema: true,
  attributes: {
    uuid: {
      type: 'string',
      unique: true,
      required: true,
      uuidv4: true
    },
    name: {
      type: 'string',
      required: true,
      empty: false
    },
    path: {
      type: 'array',
      required: true,
      array: true
    }
  }
};
It works well when I save a 'root' page (the 'path' property has just one item because it's a root page). Here is what was saved in MongoDB:
{
  _id: ObjectId("52f853e9609fb6c0341bdfcc"),
  createdAt: ISODate("2014-02-10T04:22:01.828Z"),
  name: "Home Page",
  path: [
    "a2b23e1f-954b-49a3-91f1-4d62d209a093"
  ],
  updatedAt: ISODate("2014-02-10T04:22:01.833Z"),
  uuid: "a2b23e1f-954b-49a3-91f1-4d62d209a093"
}
But when I want to create a 'subpage' below my previous created page (Home Page/Products), I get this error:
MongoError: E11000 duplicate key error index: cms-project.item.$path_1
dup key: { : "a2b23e1f-954b-49a3-91f1-4d62d209a093" }
Here is the data I sent:
{
  name: 'Products',
  uuid: 'a004ee54-7e42-49bf-976c-9bb93c118038',
  path: [
    'a2b23e1f-954b-49a3-91f1-4d62d209a093',
    'a004ee54-7e42-49bf-976c-9bb93c118038'
  ]
}
I probably missed something but I don't know what.
If I store the path in a string instead of an array, it works well, but I find it much less elegant and handy.
I'm not sure of all the Sails / Waterline parts myself, as I've never played with it, but judging by the error, the problem is that there is a unique index on your array field.
When you are inserting your second document, you already have one of the values (the parent) in the path field of another document. The unique constraint is not going to allow this. For what you are modelling, you almost certainly do not want this, so the index cannot be unique.
I hope that you set this up yourself under the assumption that it meant unique within the array contained in the document. If you did then you know where to look and what to change now. If this is being automatically deployed somehow, then I'm not the one to help.
Change the index to not be unique. You can confirm this through the mongo shell:
use cms-project
db.item.getIndices()
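If it is indeed unique, something along these lines in the shell should remove the constraint (the index name path_1 is inferred from the error message, so check getIndices() first):

use cms-project
// the error message suggests the offending index is named path_1
db.item.dropIndex("path_1")
// recreate it without the unique constraint if you still want the field indexed
db.item.ensureIndex({ path: 1 })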
Good luck
We're rapidly developing an application that uses Mongoose, and our schemas are changing often. I can't seem to figure out the proper way to update a schema for existing documents, without blowing them away and completely re-creating them from scratch.
I came across http://mongoosejs.com/docs/api.html#schema_Schema-add, which looks to be right. There's little to no documentation on how to actually implement this, making it very hard for someone who is new to MongoDB.
I simply want to add a new field called enabled. My schema definition is:
var sweepstakesSchema = new Schema({
  client_id: {
    type: Schema.Types.ObjectId,
    ref: 'Client',
    index: true
  },
  name: {
    type: String,
    default: 'Sweepstakes'
  },
  design: {
    images: {
      type: [],
      default: []
    },
    elements: {
      type: [],
      default: []
    }
  },
  enabled: {
    type: Boolean,
    default: false
  },
  schedule: {
    start: {
      type: Date,
      default: Date.now
    },
    end: {
      type: Date,
      default: Date.now
    }
  },
  submissions: {
    type: Number,
    default: 0
  }
});
Assuming your Mongoose model is named sweepstakesModel, this code would add the enabled field with the boolean value false to all the pre-existing documents in your collection:
db.sweepstakesModel.find( { enabled : { $exists : false } } ).forEach(
  function (doc) {
    doc.enabled = false;
    db.sweepstakesModel.save(doc);
  }
)
There's nothing built into Mongoose regarding migrating existing documents to comply with a schema change. You need to do that in your own code, as needed. In a case like the new enabled field, it's probably cleanest to write your code so that it treats a missing enabled field as if it was set to false so you don't have to touch the existing docs.
As far as the schema change itself, you just update your Schema definition as you've shown, but changes like new fields with default values will only affect new documents going forward.
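For example, a minimal sketch of treating a missing enabled field as false when reading (the model name and error handler here are assumptions):

Sweepstakes.findById(id, function (err, doc) {
  if (err || !doc) return handleNotFound(err); // hypothetical error handler
  // documents created before the field existed simply won't have it;
  // treat a missing value the same as false
  var enabled = doc.enabled === true;
  // ...
});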
I was also searching for something like migrations, but didn't find it. As an alternative you could use defaults. If a key has a default and the key doesn't exist, it will use the default.
Mongoose Defaults
Default values are applied when the document skeleton is constructed. This means that if you create a new document (new MyModel) or if you find an existing document (MyModel.findById), both will have defaults provided that a certain key is missing.
I had the exact same issue, and found that using findOneAndUpdate() rather than calling save allowed us to update the schema file, without having to delete all the old documents first.
I can post a code snippet if requested.
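For illustration, a rough sketch of that approach might look like this (model and field names are assumed from the question; this is not the original poster's code):

// update the document in place instead of loading it and calling save(),
// so documents that predate the schema change pick up the new field as well
Sweepstakes.findOneAndUpdate(
  { _id: id },
  { $set: { enabled: false } },
  { new: true },
  function (err, doc) {
    // doc is the updated document (or null if no match was found)
  }
);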
You might use the mongo shell to update the existing documents in a specific collection:
db.SweeptakesModel.update({}, {$set: {"enabled": false}}, {upsert:false, multi:true})
I had a similar requirement to add to an existing schema when building an app with Node, and this (long-ago-posted) question was the only thing I found to help.
I added the new field to the original schema definition and then ran something similar to the following line, just once, to update the existing records:
myModelObject.updateMany( { enabled : { $exists : false } }, { enabled : false } )
'updateMany' being the function I wanted to mention here.
Just an addition to what Vickar was suggesting, here's a Mongoose example written in JavaScript (Node.js):
const mongoose = require('mongoose');
const SweeptakesModel = mongoose.model(Constants.SWEEPTAKES, sweepstakesSchema);

// find() resolves with an array of documents, so iterate over the results
SweeptakesModel.find({ enabled: { $exists: false } }).then(
  function (docs) {
    docs.forEach(function (doc) {
      doc.enabled = false;
      doc.save();
    });
  }
);