Mongoose unique if not null and if state - node.js

I have a unique index like this
code: {
  type: String,
  index: {
    unique: true,
    partialFilterExpression: {
      code: { $type: 'string' }
    }
  },
  default: null
},
state: { type: Number, default: 0 },
But when the state is 2 (archived) I want to keep the code, yet it should be possible to reuse that code, so it cannot be unique when state is 2.
Is there any way I could accomplish this?

This is possible, though it requires a workaround documented here: https://jira.mongodb.org/browse/SERVER-25023.
In MongoDB 4.7 you will be able to apply different index options to the same field, but for now you can add a non-existent field to the second index's key pattern so the two index definitions stay distinct.
Here's an example using the workaround.
const mongoose = require('mongoose');

(async () => {
  const ItemSchema = mongoose.Schema({
    code: {
      type: String,
      default: null
    },
    state: {
      type: Number,
      default: 0,
    },
  });

  // Define a unique index for active items
  ItemSchema.index({code: 1}, {
    name: 'code_1_unique',
    partialFilterExpression: {
      $and: [
        {code: {$type: 'string'}},
        {state: {$eq: 0}}
      ]
    },
    unique: true
  })

  // Define a non-unique index for non-active items
  ItemSchema.index({code: 1, nonExistantField: 1}, {
    name: 'code_1_nonunique',
    partialFilterExpression: {
      $and: [
        {code: {$type: 'string'}},
        {state: {$eq: 2}}
      ]
    },
  })

  const Item = mongoose.model('Item', ItemSchema)

  await mongoose.connect('mongodb://localhost:27017/so-unique-compound-indexes')

  // Clear the collection so the test runs correctly
  await Item.deleteMany({})

  // Successfully create an item
  console.log('\nCreating a unique item')
  const itemA = await Item.create({code: 'abc'});

  // Throws an error when trying to create an item with the same code
  await Item.create({code: 'abc'})
    .catch(err => {console.log('\nThrowing a duplicate error when creating with the same code')})

  // Archive the active item
  console.log('\nChanging item state to 2')
  itemA.state = 2;
  await itemA.save();

  // Successfully creates a new doc with the same code
  await Item.create({code: 'abc'})
    .then(() => console.log('\nSuccessfully created a new doc with the same code'))
    .catch(() => console.log('\nThrowing a duplicate error'));

  // Throws an error when trying to create with the same code again
  await Item.create({code: 'abc'})
    .catch(err => {console.log('\nThrowing a duplicate error when creating with the same code again')})
})();
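To confirm that both partial indexes were actually created, you can list them from the underlying driver collection, for example by adding something like this inside the async function above, after the connect call. This is a minimal sketch, assuming Mongoose 5+ where Model.init() resolves once the schema's index builds have finished, and the driver's collection.indexes() helper:
  // Wait for Mongoose to finish building the indexes declared on the schema,
  // then list the indexes that exist on the collection.
  await Item.init();
  const indexes = await Item.collection.indexes();
  console.log(indexes.map(ix => ix.name));
  // Should include 'code_1_unique' and 'code_1_nonunique'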

This is not possible using indexes alone. Even if you use a compound unique index on code and state, there will still be a case like this:
new document
{
  code: 'abc',
  state: 0
}
archived document
{
  code: 'abc',
  state: 2
}
Now, although the two documents share the same code, you will not be able to archive the new document or unarchive the archived document, because either change would produce two documents with the same (code, state) pair.
You can do something like this:
const checkCode = await this.Model.findOne({ code: 'abc', state: 0 })
if (checkCode) {
  throw new Error('Code has to be unique')
} else {
  // ...do something
}
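Applied to the schema from the question, that manual check might look like the sketch below. This is only an illustration: createItem is a hypothetical helper, and because the check and the insert are separate operations, two concurrent requests could still both pass the check.
async function createItem(code) {
  // Refuse a code that is already used by an active (state 0) item.
  const existing = await Item.findOne({ code, state: 0 });
  if (existing) {
    throw new Error('Code has to be unique among active items');
  }
  return Item.create({ code, state: 0 });
}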

Related

Update multiple objects in nested array

Question: Is it possible to update multiple objects in a nested array based on another field in the objects, using a single Mongoose method?
More specifically, I'm trying to update subscribed in each object of the Contact.groups array where the object's name value is included in groupNames. Solution 1 works, but it seems messy and inefficient to use both findOne() and save(). Solution 2 is close to working with just findOneAndUpdate(), but only the first eligible object in Contact.groups is updated. Am I able to update all the eligible objects using just findOneAndUpdate()?
Contact schema (trimmed down to relevant info):
{
  phone: { type: String, unique: true },
  groups: [
    {
      name: { type: String },
      subscribed: { type: Boolean }
    }
  ]
}
Variables I have at this point:
const phoneToUpdate = '1234567890' // Contact.phone to find
const groupNames = [ 'A', 'B', 'C' ] // Contacts.groups <obj>.name must be one of these
const subStatus = false // Contacts.groups <obj>.subscribed new value
Solution 1 (seems inefficient and messy):
Contact
  .findOne({ phone: phoneToUpdate })
  .then(contact => {
    contact.groups
      .filter(g => groupNames.includes(g.name))
      .forEach(g => g.subscribed = subStatus)
    contact
      .save()
      .then(c => console.log(c))
      .catch(e => console.log(e))
  })
  .catch(e => console.log(e))
Solution 2 (only updates the first matching object):
Contact
  .findOneAndUpdate(
    { phone: phoneToUpdate, 'groups.name': { $in: groupNames } },
    { $set: { 'groups.$.subscribed': subStatus } },
    { new: true }
  )
  .then(c => console.log(c))
  .catch(error => console.log(error))

// Example Contact after the findOneAndUpdate
{
  phone: '1234567890',
  groups: [
    { name: 'A', subscribed: false },
    { name: 'B', subscribed: true } // Should also be false
  ]
}
You cannot use the positional $ operator here, since it acts as a placeholder for the first match only:
The positional $ operator acts as a placeholder for the first element that matches the query document.
What you can use instead is the arrayFilters option together with the filtered positional operator $[elem]. You can modify your query like this:
Contact.findOneAndUpdate({
  "phone": phoneToUpdate
},
{
  "$set": {
    "groups.$[elem].subscribed": subStatus
  }
},
{
  "arrayFilters": [
    {
      "elem.name": {
        "$in": groupNames
      }
    }
  ]
})
Here is a working example: https://mongoplayground.net/p/sBT-aC4zW93
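With arrayFilters every matching array element is updated, so the example contact from the question would come back looking like this:
{
  phone: '1234567890',
  groups: [
    { name: 'A', subscribed: false },
    { name: 'B', subscribed: false }
  ]
}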

Concurrency problems updating another's collection stats

I'm trying to build a rating ("note") system for movies.
A user can rate a Movie in their List.
Whenever the user clicks on the frontend, the listId, movieId and note are sent to the server to update the note. The note can be set to null, but that does not remove the entry from the list.
But if the user clicks too many times, the movie's totalNotes and nbNotes end up completely broken. It feels like there is some sort of concurrency problem?
Is this the correct approach to the problem, or am I updating in the wrong way?
The related Mongoose schemas:
// Movie Schema
const movieSchema = new Schema({
  // ...
  note: { type: Number, default: 0 },
  totalNotes: { type: Number, default: 0 },
  nbNotes: { type: Number, default: 0 },
})

movieSchema.statics.updateTotalNote = function (movieId, oldNote, newNote) {
  if (!oldNote && !newNote) return
  const nbNotes = !newNote ? -1 : (!oldNote ? 1 : 0) // +1 if oldNote is null, -1 if newNote is null
  return Movie.findOneAndUpdate(
    { _id: movieId },
    { $inc: { nbNotes: nbNotes, totalNotes: (newNote - oldNote) } },
    { new: true }
  ).catch(err => console.error("Couldn't update note from movie", err))
}

// List Schema
const movieEntry = new Schema({
  _id: false, // movie is already a unique attribute, which is populated on GET
  movie: { type: Schema.Types.ObjectId, ref: 'Movies', required: true },
  note: { type: Number, default: null, max: 21 },
})

const listSchema = new Schema({
  user: { type: Schema.Types.ObjectId, ref: 'Users', required: true },
  movies: [movieEntry]
})
The server update API (adding/removing a movieEntry is similar, with $push and $pull instead of $set):
exports.updateEntry = (req, res) => {
  const { listId, movieId } = req.params
  const movieEntry = { movieId: movieId, note: req.body.note }
  List.findOneAndUpdate(
    { _id: listId, 'movies.movie': movieId },
    { $set: { 'movies.$[elem]': movieEntry } },
    { arrayFilters: [{ 'elem.movie': movieId }] }
  )
    .exec()
    .then(list => {
      if (!list) return res.sendStatus(404)
      const oldNote = list.getMovieEntryById(movieId).note // getMovieEntryById(movieId) = return this.movies.find(movieEntry => movieEntry.movie == movieId)
      Movie.updateTotalNote(movieId, oldNote, movieEntry.note)
      // Because I needed the oldNote and findOneAndUpdate returns the list prior to modification,
      // I update the returned entry here before sending it back
      let newList = list.movies.find(movieEntry => movieEntry.movie == movieId)
      newList.note = movieEntry.note
      newList.status = movieEntry.status
      newList.completedDate = movieEntry.completedDate
      return res.status(200).json(list)
    })
    .catch(err => {
      console.error(err)
      return res.sendStatus(400)
    })
}
The entries I needed to update were arrays that could grow indefinitely, so I first had to change my models: I moved the list entries into their own model and exposed them on the list through virtuals (see the sketch below).
Doing so made the work easier, and I was able to create, update and delete the entries more easily and without any concurrency problems.
This might also not have been a concurrency problem in the first place, but a transaction problem.
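A minimal sketch of that kind of restructuring, assuming a separate entry model keyed by list and movie. The names here (listEntrySchema, ListEntry) are illustrative, not the project's actual code, and it builds on the schemas shown above:
// Each list entry lives in its own small document, so a note update touches
// one document instead of a growing embedded array.
const listEntrySchema = new Schema({
  list: { type: Schema.Types.ObjectId, ref: 'Lists', required: true },
  movie: { type: Schema.Types.ObjectId, ref: 'Movies', required: true },
  note: { type: Number, default: null, max: 21 },
})
listEntrySchema.index({ list: 1, movie: 1 }, { unique: true })
const ListEntry = mongoose.model('ListEntries', listEntrySchema)

// The list can still expose its entries through a virtual populate.
listSchema.virtual('movies', {
  ref: 'ListEntries',
  localField: '_id',
  foreignField: 'list',
})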

How to grab field value during a MongooseModel.bulkWrite operation?

Context:
I am trying to bulk upsert an array of data, with an additional computed field: 'status'.
Status should be either:
- 'New' for newly inserted docs;
- 'Removed' for docs present in the DB but absent from the incoming dataset;
- a percentage describing the price evolution, comparing the value in the DB to the one in the incoming dataset.
Implementations:
data.model.ts
import { Document, model, Model, models, Schema } from 'mongoose';
import { IPertinentData } from './site.model';
const dataSchema: Schema = new Schema({
  sourceId: { type: String, required: true },
  name: { type: String, required: true },
  price: { type: Number, required: true },
  reference: { type: String, required: true },
  lastModified: { type: Date, required: true },
  status: { type: Schema.Types.Mixed, required: true }
});
export interface IData extends IPertinentData, Document {}
export const Data: Model<IData> = models.Data || model<IData>('Data', dataSchema);
data.service.ts
import { Data, IPertinentData } from '../models';
export class DataService {
  static async test() {
    // await Data.deleteMany({});
    const data = [
      {
        sourceId: 'Y',
        reference: `y0`,
        name: 'y0',
        price: 30
      },
      {
        sourceId: 'Y',
        reference: 'y1',
        name: 'y1',
        price: 30
      }
    ];
    return Data.bulkWrite(
      data.map(function(d) {
        let status = '';
        // @ts-ignore
        console.log('price', this);
        // @ts-ignore
        if (!this.price) status = 'New';
        // @ts-ignore
        else if (this.price !== d.price) {
          // @ts-ignore
          status = (d.price - this.price) / this.price;
        }
        return {
          updateOne: {
            filter: { sourceId: d.sourceId, reference: d.reference },
            update: {
              $set: {
                // Set percentage value when current price is greater/lower than new price
                // Set status to nothing when new and current prices match
                status,
                name: d.name,
                price: d.price
              },
              $currentDate: {
                lastModified: true
              }
            },
            upsert: true
          }
        };
      })
    );
  }
}
...then in my backend controller, I just call it from some route:
try {
  const results = await DataService.test();
  return new HttpResponseOK(results);
} catch (error) {
  return new HttpResponseInternalServerError(error);
}
Problem:
I've tried a lot of implementation syntaxes, but all failed, either because of type casting, unsupported syntax like the $ symbol, or restrictions due to the aggregation...
I feel like the above solution might be the closest to a working scenario, but I'm missing a way to grab the value of the price field BEFORE the computation of status and its replacement with the updated value.
Here the value of this is undefined, while it is supposed to point to the current document.
Questions:
Am I using the correct Mongoose way for a bulk update?
If yes, how do I get the field value?
Environment:
NodeJS 13.x
Mongoose 5.8.1
MongoDB 4.2.1
EUREKA!
Finally found a working syntax, pfeeeew...
...
return Data.bulkWrite(
  data.map(d => ({
    updateOne: {
      filter: { sourceId: d.sourceId, reference: d.reference },
      update: [
        {
          $set: {
            lastModified: Date.now(),
            name: d.name,
            status: {
              $switch: {
                branches: [
                  // Set status to 'New' for newly inserted docs
                  {
                    case: { $eq: [{ $type: '$price' }, 'missing'] },
                    then: 'New'
                  },
                  // Set percentage value when current price is greater/lower than new price
                  {
                    case: { $ne: ['$price', d.price] },
                    then: {
                      $divide: [{ $subtract: [d.price, '$price'] }, '$price']
                    }
                  }
                ],
                // Set status to nothing when new and current prices match
                default: ''
              }
            }
          }
        },
        {
          $set: { price: d.price }
        }
      ],
      upsert: true
    }
  }))
);
...
Explanations:
Several problems were blocking me:
- the '$field_value_to_check' syntax (here '$price') has to be used instead of this.price, whose this was undefined...
- the syntax with the $ symbol only seems to work within an aggregation-pipeline update, i.e. update: [], even if there is only a single $set stage inside... (pipeline-style updates require MongoDB 4.2+, which matches the environment above)
- the first condition, used for the docs inserted by the upsert, needs to check for the existence of the price field. Only the syntax with the BSON $type worked...
As a quick sanity check of the $switch logic: a doc already stored with price 30 that comes in again with price 33 gets status (33 - 30) / 30 = 0.1, a brand-new doc gets 'New', and an unchanged price yields an empty string.
Hope it helps other devs in the same scenario.

Mongoose — findOneAndUpdate() causing document duplication

I'm using findOneAndUpdate() with upsert: true so that a document is updated if it exists and created otherwise. The tracks variable contains an array of Track instances. tracks does contain a few duplicates, and that's where the problem begins: it causes the Observation.findOneAndUpdate(...) call below to create a (low) number of duplicates, i.e. multiple documents that have the same (user, track) pair. Note that those duplicates are inserted randomly: running this piece of code twice produces different duplicated documents. My guess is that it has something to do with how MongoDB locks data and that I'm doing too many operations at the same time. Any idea on how I could overcome this problem?
const promises = [];

tracks.forEach((track) => {
  const query = { user, track };
  const options = { new: true, upsert: true };
  const newOb = { user, track, type: 'recent' };
  promises.push(Observation.findOneAndUpdate(query, newOb, options));
});

return Promise.all(promises);
I'm using mongoose 5.5.8 and node 11.10.0.
Here's the Observation model:
const { Schema } = mongoose;
const ObservationSchema = new Schema({
  track: { type: Schema.Types.ObjectId, ref: 'Track' },
  user: { type: Schema.Types.ObjectId, ref: 'User' },
  type: String
});
ObservationSchema.index({ track: 1, user: 1 }, { unique: true });
const Observation = mongoose.model('Observation', ObservationSchema);
And this is a sample of what the tracks array contains:
[
  { artists: [ 5da304b140185c5cb82d7eee ],
    _id: 5da304b240185c5cb82d7f48,
    spotifyId: '4QrEErhD78BjNFXpXDaTjH',
    __v: 0,
    isrc: 'DEF058230916',
    name: 'Hungarian Dance No.17 In F Sharp Minor',
    popularity: 25 },
  { artists: [ 5da304b140185c5cb82d7eee ],
    _id: 5da304b240185c5cb82d7f5d,
    spotifyId: '06dn1SnXsax9kJwMEpgBhD',
    __v: 0,
    isrc: 'DEF058230912',
    name: 'Hungarian Dance No.13 In D',
    popularity: 25 }
]
Thanks :)
I think this is due to your Promise.all call.
You should await every single query inside the loop instead of awaiting everything at the same time at the end. Here is an example with find. Note that the loop has to live in an async function, and a for...of loop is used because await does not work inside a plain forEach callback:
async function retrieveApples() {
  const apples = [];
  for (const apple of arr) {
    const foundApple = await AppleModel.findOne({ apple });
    apples.push(foundApple);
  }
  return apples;
}
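Applied to the upserts in the question, the same idea looks roughly like the sketch below. upsertObservations is a hypothetical wrapper, shown only to illustrate running the findOneAndUpdate calls one after another instead of all concurrently:
async function upsertObservations(user, tracks) {
  const observations = [];
  for (const track of tracks) {
    const query = { user, track };
    const options = { new: true, upsert: true };
    const newOb = { user, track, type: 'recent' };
    // Awaiting each call means a duplicate track finds the already-upserted
    // document instead of racing another in-flight upsert for the same pair.
    observations.push(await Observation.findOneAndUpdate(query, newOb, options));
  }
  return observations;
}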

Query complains about missing 2dsphere-index, but it's there

When I execute the following code (a larger example, boiled down to the essentials)
var mongoose = require("mongoose");

var LocationSchema = new mongoose.Schema({
  userName: String,
  loc: {
    'type': { type: String, enum: "Point", default: "Point" },
    coordinates: { type: [Number] }
  }
})

LocationSchema.index({ category: 1, loc: "2dsphere" });

var Location = mongoose.model("location", LocationSchema);

var mongoDB = 'mongodb://user1:test@ds042417.mlab.com:42417/locationdemo';
mongoose.Promise = global.Promise;
mongoose.connect(mongoDB, { useMongoClient: true });

var testUser = Location({
  userName: "Tester",
  loc: { coordinates: [12.44, 55.69] }
});

testUser.save(function (err) {
  if (err) {
    return console.log("UPPPPs: " + err);
  }
  console.log("User Saved, Try to find him:");
  let query = Location.find({
    loc: {
      $near: {
        $geometry: {
          type: "Point",
          coordinates: [12.50, 55.71]
        },
        $maxDistance: 600000
      }
    }
  })
  query.exec(function (err, docs) {
    if (err) {
      return console.log("Err: " + err);
    }
    console.log("Found: " + JSON.stringify(docs))
  })
});
I get this error:
Err: MongoError: error processing query: ns=locationdemo.locationsTree: GEONEAR field=loc maxdist=600000 isNearSphere=0
Sort: {}
Proj: {}
planner returned error: unable to find index for $geoNear query
But the index is there (see the LocationSchema.index(...) call above), and the mLab screenshot confirmed it. What am I doing wrong?
You are breaking a general rule of how indexes can be used. Whilst it is true that there is no restriction that a "2dsphere" index be the "first" property in a compound index, it is however very important that your queries actually address that first property in order for the index to be selected.
This is covered in Prefixes in the manual on compound indexes. In excerpt:
{ "item": 1, "location": 1, "stock": 1 }
The index has the following index prefixes:
{ item: 1 }
{ item: 1, location: 1 }
For a compound index, MongoDB can use the index to support queries on the index prefixes. As such, MongoDB can use the index for queries on the following fields:
the item field,
the item field and the location field,
the item field and the location field and the stock field.
However, MongoDB cannot use the index to support queries that include the following fields since without the item field, none of the listed fields correspond to a prefix index:
the location field,
the stock field, or
the location and stock fields.
Because your query references only "loc" and does not include "category", the index does not get selected and MongoDB returns the error.
So in order to use the index you have defined, you need to actually query "category" as well. Amending your listing:
var mongoose = require("mongoose");
mongoose.set('debug', true);

var LocationSchema = new mongoose.Schema({
  userName: String,
  category: Number,
  loc: {
    'type': { type: String, enum: "Point", default: "Point" },
    coordinates: { type: [Number] }
  }
})

//LocationSchema.index({ loc: "2dsphere", category: 1 },{ "background": false });
LocationSchema.index({ category: 1, loc: "2dsphere" });

var Location = mongoose.model("location", LocationSchema);

var mongoDB = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.connect(mongoDB, { useMongoClient: true });

var testUser = Location({
  userName: "Tester",
  category: 1,
  loc: { coordinates: [12.44, 55.69] }
});

testUser.save(function (err) {
  if (err) {
    return console.log("UPPPPs: " + err);
  }
  console.log("User Saved, Try to find him:");
  let query = Location.find({
    category: 1,
    loc: {
      $near: {
        $geometry: {
          type: "Point",
          coordinates: [12.50, 55.71]
        },
        $maxDistance: 600000
      }
    }
  })
  query.exec(function (err, docs) {
    if (err) {
      return console.log("Err: " + err);
    }
    console.log("Found: " + JSON.stringify(docs))
  })
});
As long as we include "category" everything is fine:
User Saved, Try to find him:
Mongoose: locations.find({ loc: { '$near': { '$geometry': { type: 'Point', coordinates: [ 12.5, 55.71 ] }, '$maxDistance': 600000 } }, category: 1 }, { fields: {} })
Found: [{"_id":"59f8f87554900a4e555d4e22","userName":"Tester","category":1,"__v":0,"loc":{"coordinates":[12.44,55.69],"type":"Point"}},{"_id":"59f8fabf50fcf54fc3dd01f6","userName":"Tester","category":1,"__v":0,"loc":{"coordinates":[12.44,55.69],"type":"Point"}}]
The alternate case is to simply "prefix" the index with the location, making sure to drop the previous indexes or the collection first:
LocationSchema.index({ loc: "2dsphere", category: 1 },{ "background": false });
You probably should also be in the habit of setting "background": false, or else you start running into race conditions in unit tests, where the index has not finished being created before the test code attempts to use it.
My first solution to this problem was to create the index via the mLab web interface, which worked like a charm.
I have tried the solution suggested by Neil, but that still fails. The detailed instructions about indexes given by Neil did, however, point me toward the solution to the problem.
It was a timing problem (which you don't always see if you run the database locally), related to what my test code did: it created the index, created a Location document (which, the first time, also creates the collection), and then, in the callback provided by save, tried to find the user. It seems the index was not yet built at that point, which is what gave the error.
If I delay the find method by a second using setTimeout, it works fine (see the sketch below).
But still, thanks to Neil for valuable information about the right way of using indexes (background) :-)
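For reference, the delay workaround described above looks roughly like the sketch below. It reuses testUser and Location from the first listing, and the one-second value is only a guess at how long the index build takes. A less arbitrary alternative is Mongoose's 'index' event, which a model emits once the index builds declared on its schema have finished:
testUser.save(function (err) {
  if (err) {
    return console.log("UPPPPs: " + err);
  }
  console.log("User Saved, give the 2dsphere index a moment, then try to find him:");
  // Crude workaround: wait a second so the index build can finish.
  setTimeout(function () {
    Location.find({
      loc: {
        $near: {
          $geometry: { type: "Point", coordinates: [12.50, 55.71] },
          $maxDistance: 600000
        }
      }
    }).exec(function (err, docs) {
      if (err) return console.log("Err: " + err);
      console.log("Found: " + JSON.stringify(docs))
    });
  }, 1000);
});

// Alternative: only run geo queries once Mongoose reports the index build is done.
Location.on('index', function (err) {
  if (err) return console.log("Index build failed: " + err);
  // Safe to run $near queries from here on.
});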
