MongoError: E11000 duplicate key error collection: ... during array update operation - node.js

I was trying to delete a specific value (a game) from my array in my schema, and this is the code:
User.update({ userName: user }, { $pull: { games: { _id: deleteItem } } }, function (err, val) {
  console.log(err);
});
the schema:
const userSchema = new mongoose.Schema({
  userName: { type: String, index: true, unique: true },
  userPassword: String,
  games: [gameSchema]
});
the error:
MongoError: E11000 duplicate key error collection: mountain.users index: games.password_1
errmsg: 'E11000 duplicate key error collection: mountain.users index: games.password_1 dup key: { games.password: null }',
[Symbol(mongoErrorContextSymbol)]: {}
}
Why does this error appear, and how can I solve it? Alternatively, is there another way to delete a value from an array inside an object?
Thanks for your help!

You have a unique index built on the games.password array field.
I assume the game you're trying to pull is the last one in the array, and that you already have another document with an empty games array.
Hence both documents have the same index value (null), since the field doesn't exist in either of them.
Sparse indexes exist for this very reason: they let you keep the unique behaviour while only taking into account documents where the indexed field actually exists.
So basically you have to rebuild your index as a unique + sparse one.
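For example, a minimal mongo shell sketch of the rebuild (an assumption: the index name games.password_1 and the users collection are taken from the error message and may differ in your deployment):

```javascript
// Sketch: drop the old unique index and recreate it as unique + sparse,
// so documents missing games.password are simply skipped by the index.
db.users.dropIndex('games.password_1');
db.users.createIndex({ 'games.password': 1 }, { unique: true, sparse: true });
```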

Related

Mongoose duplicate key error with upsert

I have a problem with a duplicate key.
I haven't been able to find an answer for a long time. Please help me solve this problem, or explain why I get a duplicate key error.
Trace: { [MongoError: E11000 duplicate key error collection: project.monitor index: _id_ dup key: { : 24392490 }]
name: 'MongoError',
message: 'E11000 duplicate key error collection: project.monitor index: _id_ dup key: { : 24392490 }',
driver: true,
index: 0,
code: 11000,
errmsg: 'E11000 duplicate key error collection: project.monitor index: _id_ dup key: { : 24392490 }' }
at /home/project/app/lib/monitor.js:67:12
at callback (/home/project/app/node_modules/mongoose/lib/query.js:2029:9)
at Immediate.<anonymous> (/home/project/app/node_modules/kareem/index.js:160:11)
at Immediate._onImmediate (/home/project/app/node_modules/mquery/lib/utils.js:137:16)
at processImmediate [as _immediateCallback] (timers.js:368:17)
But in monitor I use upsert, so why do I get a duplicate key error?
monitor.js:62-70
monitor schema
var monitorSchema = db.Schema({
  _id: { type: Number, default: utils.minute },
  maxTicks: { type: Number, default: 0 },
  ticks: { type: Number, default: 0 },
  memory: { type: Number, default: 0 },
  cpu: { type: Number, default: 0 },
  reboot: { type: Number, default: 0 },
  streams: db.Schema.Types.Mixed
}, {
  collection: 'monitor',
  strict: false
});
index
monitorSchema.index({_id: -1});
Monitor = db.model('Monitor', monitorSchema);
and increase by property
exports.increase = function (property, incr) {
  var update = {};
  update[property] = utils.parseRound(incr) || 1;
  Monitor.update({ _id: utils.minute() }, { $inc: update }, { upsert: true }, function (err) {
    if (err) {
      console.trace(err);
    }
  });
};
utils.js
exports.minute = function () {
  return Math.round(Date.now() / 60000);
};
exports.parseRound = function (num, round) {
  if (isNaN(num)) return 0;
  return Number(parseFloat(Number(num)).toFixed(round));
};
An upsert that results in a document insert is not a fully atomic operation. Think of the upsert as performing the following discrete steps:
Query for the identified document to upsert.
If the document exists, atomically update the existing document.
Else (the document doesn't exist), atomically insert a new document that incorporates the query fields and the update.
So steps 2 and 3 are each atomic, but another upsert could occur after step 1 so your code needs to check for the duplicate key error and then retry the upsert if that occurs. At that point you know the document with that _id exists so it will always succeed.
For example:
var minute = utils.minute();
Monitor.update({ _id: minute }, { $inc: update }, { upsert: true }, function (err) {
  if (err) {
    if (err.code === 11000) {
      // Another upsert occurred during the upsert, try again. You could omit the
      // upsert option here if you don't ever delete docs while this is running.
      Monitor.update({ _id: minute }, { $inc: update }, { upsert: true },
        function (err) {
          if (err) {
            console.trace(err);
          }
        });
    } else {
      console.trace(err);
    }
  }
});
See here for the related documentation.
You may still be wondering why this can happen if the insert is atomic. What that atomicity means is that no updates will occur on the inserted document until it is completely written; it does not mean that no other insert of a doc with the same _id can occur in the meantime.
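The retry pattern above can be abstracted into a small helper. The following is a sketch with a simulated update function (upsertWithRetry, fakeUpdate, and lastErr are hypothetical names, not part of the original code); it only retries once, which is enough because after a duplicate-key failure the document is known to exist:

```javascript
// Sketch: retry an upsert once when it loses an insert race (error code 11000).
// updateFn(cb) is assumed to perform the actual upsert and call cb(err).
function upsertWithRetry(updateFn, done) {
  updateFn(function (err) {
    if (err && err.code === 11000) {
      // Another client inserted the document first; it now exists, so the
      // second attempt takes the atomic update path and succeeds.
      return updateFn(done);
    }
    done(err);
  });
}

// Simulated usage: the first call fails with a duplicate key error, the retry succeeds.
var calls = 0;
var lastErr;
function fakeUpdate(cb) {
  calls += 1;
  cb(calls === 1 ? { code: 11000 } : null);
}
upsertWithRetry(fakeUpdate, function (err) {
  lastErr = err; // null after the retry
});
```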
Also, you don't need to manually create an index on _id as all MongoDB collections have a unique index on _id regardless. So you can remove this line:
monitorSchema.index({_id: -1}); // Not needed

How to ignore duplicate properties of other fields in the object?

I wrote a for loop that takes a table of data and creates 3,500 objects using this schema:
var mongoose = require('mongoose');
var Schema = mongoose.Schema;
var LocationSchema = new Schema({
  twp: { type: String, index: true, unique: true, dropDups: true },
  rge: { type: String, unique: true }
});
module.exports = mongoose.model('Location', LocationSchema);
Let's say this is the first object created:
{
  "_id" : ObjectId("56f5eb02683e79de61278449"),
  "rge" : "008W",
  "twp" : "004S",
  "__v" : 0
}
If any other Objects are instantiated with the value twp: 004S, I would like the object to still be created, but have no value for twp. It would look like so:
{
  "_id" : ObjectId("56f5eb02683e79de61274949"),
  "rge" : "009E",
  "__v" : 0
}
As you can see, I experimented with adding unique and dropDups, but this still created multiple objects with the same twp value. I also experimented with adding the code below (per someone's recommendation), but got the same results.
var twpSet = new Set();
LocationSchema.methods.twp1 = function () {
  var curTwp = this.twp;
  if (twpSet.has(curTwp)) {
    this.twp = undefined; // remove the twp field once duplicated
  } else {
    twpSet.add(curTwp); // save the existing twp value
  }
};
LocationSchema.queue('twp1');
The problem is that if you don't include a specific field in a document, but that field has a "unique" index on it, then for indexing purposes the field is still considered to have a value of null.
Citation:
If a document does not have a value for a field, the index entry for that item will be null in any index that includes it. Thus, in many situations you will want to combine the unique constraint with the sparse option. Sparse indexes skip over any document that is missing the indexed field, rather than storing null for the index entry. Since unique indexes cannot have duplicate values for a field, without the sparse option, MongoDB will reject the second document and all subsequent documents without the indexed field.
To combat this, use the "sparse" property on the created index. Here's a demonstration showing both cases:
var async = require('async'),
    mongoose = require('mongoose'),
    Schema = mongoose.Schema;

mongoose.set("debug", true);
mongoose.connect('mongodb://localhost/test');

var sparseSchema = new Schema({
  "a": { "type": String, "unique": true, "sparse": true },
  "b": { "type": String, "unique": true }
});

var Sparse = mongoose.model('Sparsetest', sparseSchema);

async.eachSeries(
  [{ "a": 1, "b": 2 }, { "a": 2 }, { "b": 3 }, { "b": 4 }, { "a": 3 }],
  function (item, callback) {
    Sparse.create(item, function (err, doc) {
      console.log(doc);
      callback(err);
    });
  },
  function (err) {
    if (err) throw err;
    mongoose.disconnect();
  }
);
The code is destined to error, but the output will show you what happens:
Mongoose: sparsetests.ensureIndex({ a: 1 }) { unique: true, sparse: true, background: true }
Mongoose: sparsetests.insert({ a: '1', b: '2', _id: ObjectId("56f706820c2db3902e557cff"), __v: 0 })
Mongoose: sparsetests.ensureIndex({ b: 1 }) { unique: true, background: true }
{ _id: 56f706820c2db3902e557cff, b: '2', a: '1', __v: 0 }
Mongoose: sparsetests.insert({ a: '2', _id: ObjectId("56f706820c2db3902e557d00"), __v: 0 })
{ _id: 56f706820c2db3902e557d00, a: '2', __v: 0 }
Mongoose: sparsetests.insert({ b: '3', _id: ObjectId("56f706820c2db3902e557d01"), __v: 0 })
{ _id: 56f706820c2db3902e557d01, b: '3', __v: 0 }
Mongoose: sparsetests.insert({ b: '4', _id: ObjectId("56f706820c2db3902e557d02"), __v: 0 })
{ _id: 56f706820c2db3902e557d02, b: '4', __v: 0 }
Mongoose: sparsetests.insert({ a: '3', _id: ObjectId("56f706820c2db3902e557d03"), __v: 0 })
undefined
^
MongoError: E11000 duplicate key error collection: test.sparsetests index: b_1 dup key: { : null }
So everything is fine for the first four documents created. The first document has unique values for both properties, the second does not duplicate "a", and the third and fourth do not duplicate "b". Notably, the third and fourth documents do not contain an "a" property, but since that property has a "sparse" index this is not an issue.
The same is not true of the "b" property: when we try to insert another document where "b" is missing (and therefore null for the index), the duplicate key error is thrown, because an earlier document already indexed null for "b". The "a" value is still unique, but "b" is not.
So where you want a "unique" index but you are not going to include the property in every document, you must turn on "sparse".
N.B. The "dropDups" option is deprecated and a no-op in any release past MongoDB 3.x. It was never a good option for removing existing duplicates from a collection anyway; if you have existing duplicate data, you need a manual process to remove it.
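On MongoDB 3.2 and later, a partial index is a more flexible alternative to "sparse". A mongo shell sketch (an assumption: the collection and index names match the demonstration above, where Mongoose built the non-sparse index as b_1):

```javascript
// Sketch: a unique partial index only indexes documents where "b" exists,
// which covers the same case as unique + sparse.
db.sparsetests.dropIndex('b_1');
db.sparsetests.createIndex(
  { b: 1 },
  { unique: true, partialFilterExpression: { b: { $exists: true } } }
);
```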

How to insert data into subdocument using $push in query, instead of retrieving doc and saving it back

Edit: this was actually working
As the Mongoose - Subdocs: "Adding subdocs" documentation says, we can add a subdoc using the push method (i.e. parent.children.push({ name: 'Liesl' });)
But I want to go further, and would like to use the $push operator to insert subdocuments.
I have two Schemas: the ThingSchema:
var ThingSchema = mongoose.Schema({
  name: {
    type: String,
    required: true
  },
  description: {
    type: String
  }
});
and the BoxSchema, the main document that has an array of subdocuments (things) of ThingSchema:
var BoxSchema = new mongoose.Schema({
  name: {
    type: String,
    required: true
  },
  description: {
    type: String
  },
  things: {
    type: [ThingSchema]
  }
});

var BoxModel = mongoose.model('Box', BoxSchema);
I need every subdocument in things to have unique names - that is, that it would be impossible to insert a new document into this array that has a name value that already exists in the subdocs.
I'm trying to do something like:
var thingObj = ... // the 'thing' object to be inserted
BoxModel.update({
  _id: some_box_id, // a valid 'box' ObjectId
  "things.name": { "$ne": thingObj.name }
},
{
  $push: { things: thingObj }
},
function (err) {
  if (err) // handle err
  ...
});
but I'm not getting the desired results.
What would be the correct way to add a ThingSchema subdocument into BoxSchema's thing array using the $push operator to do so in the query (must not add the subdoc if there's another subdoc named the same), instead of the Mongoose Docs way?
Edit: this is actually the issue
I made a mistake; the code above works as expected. The problem I have now is that when thingObj does not match the ThingSchema, an empty object is inserted into the things array:
// now thingObj is trash
var thingObj = { some: "trash", more: "trash" };
When executing the query given the above trash object, the following empty object is inserted into the subdocs array:
{ _id: ObjectId("an_obj_id") }
What I want in this case, when thingObj doesn't match the ThingSchema, is for nothing to be added.
$addToSet only adds a value to the array if it isn't already present (it checks for duplicates). But for subdocuments it compares the entire document field-for-field, so it can't enforce uniqueness on a single field like name.
What you should do is put things into their own collection and build a unique index on name. Then, make this change:
things: {
  type: [{ type: mongoose.Schema.Types.ObjectId, ref: 'thingscollection' }]
}
this way you can do
BoxModel.update({
  _id: some_box_id, // a valid 'box' ObjectId
  "things": { "$ne": thingObj._id }
},
{
  $addToSet: { things: thingObj._id }
},
function (err) {
  if (err) // handle err
  ...
});
And when you fetch use .populate on things to get the full documents in there.
It's not exactly how you want it, but that's a design that might achieve what you're aiming for.
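For the empty-subdocument edit above, another option is to reject objects that share no fields with ThingSchema before issuing the update at all. A minimal sketch (hasSchemaField and THING_FIELDS are hypothetical helpers, not part of the original code; the field list mirrors ThingSchema):

```javascript
// Sketch: pre-check that an object carries at least one schema-known field,
// so a "trash" object is skipped instead of being cast to an empty subdocument.
var THING_FIELDS = ['name', 'description'];

function hasSchemaField(obj, fields) {
  return fields.some(function (f) {
    return Object.prototype.hasOwnProperty.call(obj, f);
  });
}

var trash = { some: 'trash', more: 'trash' };
var thing = { name: 'chess', description: 'a board game' };

console.log(hasSchemaField(trash, THING_FIELDS)); // false — skip the $push
console.log(hasSchemaField(thing, THING_FIELDS)); // true — safe to $push
```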

Mongoose error findByIdAndUpdate fails in cast

Trying to update a document using findByIdAndUpdate, I get an error that I don't understand.
console.log(req.body);
var data = req.body;
data._id = undefined;
Package.findByIdAndUpdate(req.params.id, data, function (err, pkg) {
  if (err) {
    console.log(err.stack);
    return next(restify.InternalServerError(err));
  }
  res.json(pkg);
  next();
});
I get the following error:
TypeError: Cannot read property '_id' of undefined
at ObjectId.cast (/home/ubuntu/workspace/server/node_modules/mongoose/lib/schema/objectid.js:109:12)
at ObjectId.castForQuery (/home/ubuntu/workspace/server/node_modules/mongoose/lib/schema/objectid.js:165:17)
at Query._castUpdateVal (/home/ubuntu/workspace/server/node_modules/mongoose/lib/query.js:2009:17)
at Query._walkUpdatePath (/home/ubuntu/workspace/server/node_modules/mongoose/lib/query.js:1969:25)
at Query._castUpdate (/home/ubuntu/workspace/server/node_modules/mongoose/lib/query.js:1865:23)
at castDoc (/home/ubuntu/workspace/server/node_modules/mongoose/lib/query.js:2032:18)
at Query._findAndModify (/home/ubuntu/workspace/server/node_modules/mongoose/lib/query.js:1509:17)
at Query.findOneAndUpdate (/home/ubuntu/workspace/server/node_modules/mongoose/node_modules/mquery/lib/mquery.js:2056:15)
at Function.Model.findOneAndUpdate (/home/ubuntu/workspace/server/node_modules/mongoose/lib/model.js:1250:13)
at Function.Model.findByIdAndUpdate (/home/ubuntu/workspace/server/node_modules/mongoose/lib/model.js:1344:32)
I have verified that the id is valid, and that data is a valid object as well.
My model:
mongoose.model('Package', {
  name: {
    required: true,
    type: String
  },
  servers: [mongoose.Schema.Types.ObjectId],
  packageType: {
    type: String,
    enum: ['package', 'subscription']
  },
  subscriptionPeriodInDays: Number,
  pointsIncluded: Number,
  price: Number,
  rank: String,
  data: mongoose.Schema.Types.Mixed // for custom solutions
});
The log also prints a valid data object
{
  name: 'Your Package',
  packageType: 'subscription',
  subscriptionPeriodInDays: 30,
  pointsIncluded: 10000,
  price: 10,
  rank: 'Donator',
  _id: undefined,
  __v: 0,
  servers: [],
  description: '<p>test</p>\n'
}
I have tried to step through with the debugger, but I couldn't find the reason for this.
As Raleigh said, you need to remove the _id field. You can do it with delete data._id; instead of data._id = undefined;.
I believe Mongoose is trying to set the value of _id to undefined since the _id value is still getting passed in via the data object.
Try removing the line data._id = undefined; before you update the model or completely remove the _id field from the data object.
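The difference matters because assigning undefined keeps the key on the object, so Mongoose still tries to cast it. A quick sketch of the plain-object behaviour (the sample values are illustrative only):

```javascript
// Sketch: assigning undefined leaves the key in place; delete removes it.
var a = { _id: 'abc', name: 'Your Package' };
a._id = undefined;
console.log('_id' in a); // true — the key survives, so Mongoose will try to cast its value

var b = { _id: 'abc', name: 'Your Package' };
delete b._id;
console.log('_id' in b); // false — the key is gone
```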

Using sparse: true still getting MongoError: E11000 duplicate key error

Schema (../models/add.js)
var addSchema = new Schema({
  name: { type: String, unique: true, sparse: true },
  phone: Number,
  email: String,
  country: Number
});
module.exports = mongoose.model('Contact', addSchema);
add-manager.js
var Add = require('../models/add.js');
var AM = {};
var mongoose = require('mongoose');
module.exports = AM;

AM.notOwned = function (country, callback) {
  Add.update({ country: country }, { country: country }, { upsert: true }, function (err, res) {
    if (err) callback(err);
    else callback(null, res);
  });
};
news.js
// if country # is not in the database
AM.notOwned(country, function (error, resp) {
  if (error) console.log("error: " + error);
  else {
    // do stuff
  }
});
error:
MongoError: E11000 duplicate key error index: bot.contacts.$name_1 dup key: { : null }
After seeing the error message, I googled around and learned that when the document is created without name set, the index treats it as null (see this Mongoose Google Group thread). The first time AM.notOwned is called it works, as there aren't yet any documents in the collection without a name key. AM.notOwned then inserts a document with only an _id field and a country field.
Subsequent AM.notOwned calls fail because there is already a document with no name field, which the index treats as name: null; the new document also has no name field, so it is treated as null as well, and is thus not unique.
So, following the advice of the Mongoose thread and reading the Mongo docs, I looked at using sparse: true. However, it still throws the same error. Looking further into it, I thought it might be the same issue as this one, but setting the schema to name: {type: String, index: {unique: true, sparse: true}} doesn't fix it either.
This S.O. question/answer leads me to believe it could be caused by the index not being correct, but I'm not quite sure how to read the db.collection.getIndexes() output from the Mongo console.
db.contacts.getIndexes()
[
  {
    "v" : 1,
    "key" : {
      "_id" : 1
    },
    "ns" : "bot.contacts",
    "name" : "_id_"
  },
  {
    "v" : 1,
    "key" : {
      "name" : 1
    },
    "unique" : true,
    "ns" : "bot.contacts",
    "name" : "name_1",
    "background" : true,
    "safe" : null
  }
]
What can I do to resolve this error?
You need to drop the old, non-sparse index in the shell so that Mongoose can recreate it with sparse: true the next time your app runs.
> db.contacts.dropIndex('name_1')
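If you'd rather not wait for Mongoose to rebuild the index on the next app start, you can also recreate it by hand in the same shell session (a sketch; the collection and field names come from the schema and getIndexes() output above):

```javascript
// Sketch: recreate the name index as unique + sparse right after dropping it.
db.contacts.createIndex({ name: 1 }, { unique: true, sparse: true });
```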
