What is an efficient approach to maintaining data history for models in Sails? For instance, if a user updates a record, how do we keep previous versions so the data can be reverted later?
I've seen numerous examples of using a "changes" tag that keeps the date as a sub-tag. Is this the most efficient?
On model update we can copy previous information. Are there any better suggestions or links with examples?
{
  text: "This is the latest paragraph",
  changes: {
    text: {
      "1470685677694": "This was the paragraph",
      "1470685577694": "This was the original paragraph"
    }
  }
}
I am hoping to find a good way to find/search and optimize this data so the user can revert if necessary.
You can define your model like the following, which preserves the history so you can restore any previous values.
api/models/Test.js
module.exports = {
  connection: "someMongodbServer",
  tableName: "test",
  autoCreatedAt: false,
  autoUpdatedAt: false,
  autoPK: false,
  attributes: {
    id: {
      type: "integer",
      primaryKey: true,
      autoIncrement: true
    },
    name: "string",
    history: {
      type: 'json',
      defaultsTo: []
    }
  },
  // Assign an incrementing id manually, since autoPK is disabled.
  beforeCreate: function (values, cb) {
    Test.count().exec(function (err, cnt) {
      if (err)
        return cb(err);
      values.id = cnt + 1;
      cb();
    });
  },
  // Before every update, push the current document (minus its own history)
  // onto the history array so earlier versions can be restored later.
  beforeUpdate: function (values, cb) {
    Test.native(function (err, collection) {
      if (err)
        return cb(err);
      collection.findOne({ _id: values.id }, { history: 0, _id: 0 }, function (err, data) {
        if (err)
          return cb(err);
        collection.updateOne({ _id: values.id }, { $push: { history: data } }, function (err, res) {
          if (err)
            return cb(err);
          console.log(res);
          cb();
        });
      });
    });
  }
};
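With that in place, reverting is just a matter of copying values out of history back onto the record. Here is a minimal sketch, assuming the Test model above and a hypothetical recordId; note that the restore update itself goes through beforeUpdate, so the current values are archived before being overwritten.

// Sketch only: revert a record to its most recent history entry.
Test.findOne({ id: recordId }).exec(function (err, record) {
  if (err || !record || !record.history.length) return;

  var previous = record.history[record.history.length - 1];
  Test.update({ id: recordId }, { name: previous.name }).exec(function (err) {
    if (err) console.error(err);
  });
});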
I have something weird: I am just trying to sort on _id with some paging. Below is the query I execute:
var condition = { isArchived: false };
if (lastId) {
  condition["_id"] = { $lt: lastId };
}

PostModel
  .find(condition)
  .sort({ _id: -1 })
  .limit(10)
  .exec(function (err, posts) {
    if (err)
      return callback(new customError.Database(err.toString()), null);
    callback(null, posts);
  });
What I see is that about 80% of the time the result is consistent, but sometimes it is not the same (it does not vary much, but some objects come back in a different order).
I use this technique successfully on other models; only with this query/collection do I get this problem (I am not sure whether the problem is in the query or in the collection...).
What can be the reason?
I insert an Entity:
datastore.save({
  key: datastore.key(['Users', 'bob']),
  method: 'insert',
  data: [
    {
      name: 'email',
      value: 'bob@gmail.com',
      excludeFromIndexes: false
    }
  ]
}, function(err) {
  if (!err) {
    console.log('insert was a success');
  } else {
    console.log(err);
  }
});
Then I want to query the user by email:
var query = datastore.createQuery('Users').filter('email', 'bob@gmail.com');

datastore.runInTransaction(function(transaction, done) {
  transaction.runQuery(query, function(err, entities) {
    if (!err) {
      // insert another thing into the datastore here ...
    } else {
      console.log('err = ' + err);
      transaction.rollback(done);
      return;
    }
  });
});
But I get the error:
global queries do not support strong consistency
I saw in the docs that I can't modify the consistency from Node, so how do I query?
When you query against Datastore, these operations are eventually consistent by default. This means results that you have recently written may not show up in your query.
You can make Datastore queries strongly consistent by adding an ancestor filter, which restricts the query to a single entity group (the unit of consistency in Datastore).
When you are running in a transaction, you can only perform strongly consistent queries. Since you are running in a transaction but not specifying an ancestor filter, you get this error.
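For illustration, here is a rough sketch of an ancestor-filtered query using the same gcloud-node client as in the question. The 'UserGroup'/'main' parent key is an assumption: the Users entities would have to be created under that ancestor for the filter to match anything.

// Sketch only: entities must have been saved with this ancestor in their key,
// e.g. datastore.key(['UserGroup', 'main', 'Users', 'bob']).
var ancestorKey = datastore.key(['UserGroup', 'main']);

var query = datastore.createQuery('Users')
  .hasAncestor(ancestorKey)
  .filter('email', 'bob@gmail.com');

datastore.runInTransaction(function(transaction, done) {
  transaction.runQuery(query, function(err, entities) {
    if (err) {
      transaction.rollback(done);
      return;
    }
    // ... perform other writes in the same transaction here ...
    done();
  });
});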
How do I write a query in Mongoose that checks whether a user already has 50 documents; if so, it should remove the oldest one and add the new one, otherwise it should just add the new one?
This is my attempt:
Notifications.findOne({_id: userId}, function(err, results) {
  if (err) throw err;
  if (results.length < 50) {
    saveNotification();
  } else {
    Notifications.findByIdAndUpdate(userId, {pull: //WHAT GOES HERE),
      function(err, newNotify) {
        if (err) throw err;
        saveNotification();
      });
  }
});
function saveNotification() {
  var new_notification = new Notification({
    notifyToUserId: creatorId,
    notifyFromUserId: null,
    notifyMsg: newmsg,
    dateNotified: dateLastPosted
  });
  new_notification.save(function(err, results) {
    if (!err) {
      console.log('notification has been added to the db');
      cb(null, resultObject);
    } else {
      console.log("Error creating notification " + err);
    }
  });
}
As @Pio mentioned, I don't think you can do it in one query with your current schema. But if you have the chance to change the schema, you can use the fixed-size array pattern described in the article Limit Number of Elements in an Array after an Update.
Basically, you keep each user's notifications in a single document: the key of the document is the userId, and the notifications are stored in an array. Then the following query would achieve your goal.
Notifications.update(
  { _id: userId },
  {
    $push: {
      notifications: {
        $each: [ notificationObject ], // insert your new notification
        $sort: { dateNotified: 1 },    // sort by insertion date
        $slice: -50                    // keep only the last 50 notifications
      }
    }
  }
)
I am not sure you can do it in one query, but you can .count({user: yourUser}) and then, depending on the count, either .insert(newDocument) or update the oldest one, so you don't have to remove + insert.
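In code, that count-then-decide approach might look roughly like the sketch below (the field names are assumed from the question; treat it as an outline rather than a drop-in solution):

// Sketch: count first, then either save a new notification or overwrite the oldest one.
Notifications.count({ notifyToUserId: userId }, function (err, count) {
  if (err) throw err;
  if (count < 50) {
    saveNotification();
  } else {
    Notifications.findOneAndUpdate(
      { notifyToUserId: userId },
      { notifyMsg: newmsg, dateNotified: dateLastPosted },
      { sort: { dateNotified: 1 } },  // oldest first
      function (err) {
        if (err) throw err;
      }
    );
  }
});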
Capped collections do what you want by nature. If you define a capped collection with max: 50 it will keep at most 50 documents and will overwrite the oldest ones when you insert more.
check
http://mongoosejs.com/docs/guide.html#capped
http://docs.mongodb.org/manual/core/capped-collections/
new Schema({..}, { capped: { size: 50, max: 50, autoIndexId: true } }); // size is in bytes, max is the document count
Remember that when working with a capped collection you can only make in-place updates; an update that grows a document beyond its original size will fail.
I ended up using cubbuk's answer and expanding it with an upsert, so that a notification document is created when there is no array to start with...
Notification.findOneAndUpdate({notifyToUserId: to}, {
  $push: {
    notifyArray: {
      $each: [newNotificationObject], // insert your new notification
      $sort: { dateNotified: 1 },     // sort by insertion date
      $slice: -50                     // keep only the last 50 notifications
    }
  }
}, {upsert: true}, function(err, resultOfFound) {
  if (err) throw err;
  if (resultOfFound == null) {
    var new_notification = new Notification({
      notifyToUserId: to,
      notifyArray: [newNotificationObject]
    });
    new_notification.save(function(err, results) {
      if (!err) {
        console.log('notification has been added to the db');
        cb(null, resultObject);
      } else {
        console.log("Error creating notification " + err);
      }
    });
  } else {
    cb(null, resultObject);
  }
});
I have been following the docs on how to update a trip using Node.js, Mongoose, and Express, as well as a couple of Stack Overflow questions here and here.
Inside of my Controller I originally had a save, which was working, but I needed to switch it to an update. When I run the code, nothing breaks and the place is properly logged to the console as I would expect, which further indicates that the code is being run, but for some reason it isn't overwriting the item that has the same placeIdentifier.
What am I doing wrong that prevents this from updating?
My controller:
var place = {
  identifier: placeIdentifier,
  name: placeName,
  location: Location,
  travelStart: startDate,
  travelEnd: endDate
}

Place.findOneAndUpdate(
  {identifier: placeIdentifier},
  { $set: place },
  {upsert: true},
  function() {
    console.log(place)
    console.log("place saved...")
  }
);
Oh, I got it. Or rather, this person got it: Mongoose: Find, modify, save
User.findOne({username: oldUsername}, function (err, user) {
  user.username = newUser.username;
  user.password = newUser.password;
  user.rights = newUser.rights;

  user.save(function (err) {
    if (err) {
      console.error('ERROR!');
    }
  });
});
As a preface, I'm using Node.js with Mongo-db-native.
I'm also using GridFS to store images and each image has meta data, one of which is a Product Id.
I want to query all the fs.files and return the images that are associated with a specific product.
Here's how I am currently doing this:
this.collection.ensureIndex({
  product_id: 1
}, function (err, edIndex) {
  self.collection.group(['group'], {
    "product_id": ObjectID(product_id)
  }, {
    docs: []
  }, function (doc, prev) {
    prev.docs.push({
      width: doc.width,
      height: doc.height,
      _id: doc._id
    });
  }, true, function (err, results) {
    if (err) {
      callback(err)
    } else {
      callback(null, results)
    }
  });
});
I'm finding that this is extremely slow. Does anyone have suggestions for an alternative approach, or for improving its performance?
Thank you!
Here is a simple query in Mongo shell syntax which will find all GridFS files with ProductId = 42 in their metadata.
db.fs.files.find({"metadata.ProductId": 42});
The returned documents will contain a filename which can be used with mongo-native's GridStore.
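In mongo-native that might look roughly like the following sketch (assuming an open db handle and the driver's GridStore API; the productId and callback names are carried over from the question):

var mongodb = require('mongodb');

// Sketch: find the fs.files documents for a product, then open one by _id to read it.
db.collection('fs.files')
  .find({ 'metadata.ProductId': productId })
  .toArray(function (err, files) {
    if (err) return callback(err);

    // Open the first matching file and read its contents as a Buffer.
    var gridStore = new mongodb.GridStore(db, files[0]._id, 'r');
    gridStore.open(function (err, gs) {
      if (err) return callback(err);
      gs.read(function (err, data) {
        callback(err, data);
      });
    });
  });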