I have a document like this:
{
_id: ...
deletedAt: null
history: [
{
_id: ...
name: ...
deletedAt: null
},
{
_id: ...
name: ...
deletedAt: null
},
]
}
After saving a document I want to return the saved document without the properties
deletedAt and history.$.deletedAt.
I have this post middleware:
mySchema.post('save', function () {
this.set('history.$[].deletedAt', undefined);
this.set('history.$.deletedAt', undefined);
this.set('deletedAt', undefined);
});
This middleware removes deletedAt, but it's not removing history.$.deletedAt.
When I use history.0.deletedAt it works, but only for the first item of the array. How do I make this work for all elements of the array?
Also, in my model I have specified select: false, like this:
...
history: {
type: [{
...
deletedAt: { type: Date, default: null, select: false }
...
}],
}
...
but in any case history[n].deletedAt is still being selected.
You can use Array.map()
// Convert to a plain object first so we work with plain objects, not Mongoose subdocuments
const plainDoc = savedDoc.toObject();
// Remove the deletedAt key from each object in the history array
const historyWithoutDeletedAt = plainDoc.history.map(({ deletedAt, ...rest }) => rest);
// Create a new object without the deletedAt key and with the updated history array
const result = { ...plainDoc, deletedAt: undefined, history: historyWithoutDeletedAt };
// Return the new object
return result;
Solved it this way:
mySchema.post('save', function () {
this.history.forEach((v, i) => this.set(`history.${i}.deletedAt`, undefined));
this.set('deletedAt', undefined);
});
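If the goal is only to hide these fields in API responses (rather than actually unsetting them on the document), a toJSON transform on the schema can do the same thing for every element of the array. This is just a sketch of that alternative, not part of the original solution:
// Strip deletedAt from the serialized output of the document and of every history item
mySchema.set('toJSON', {
  transform(doc, ret) {
    delete ret.deletedAt;
    if (Array.isArray(ret.history)) {
      ret.history.forEach(h => delete h.deletedAt);
    }
    return ret;
  },
});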
I have a unique index like this
code: {
type: String,
index: {
unique: true,
partialFilterExpression: {
code: { $type: 'string' }
}
},
default: null
},
state: { type: Number, default: 0 },
But when the state is 2 (archived) I want to keep the code, and it should be possible to reuse the code, so it cannot be unique when the state is 2.
Is there any way I could accomplish this?
This is possible, though it's through a workaround documented here: https://jira.mongodb.org/browse/SERVER-25023.
In MongoDB 4.7 you will be able to apply different index options to the same field, but for now you can add a non-existent field to separate the two indexes.
Here's an example using the workaround.
(async () => {
const ItemSchema = mongoose.Schema({
code: {
type: String,
default: null
},
state: {
type: Number,
default: 0,
},
});
// Define a unique index for active items
ItemSchema.index({code: 1}, {
name: 'code_1_unique',
partialFilterExpression: {
$and: [
{code: {$type: 'string'}},
{state: {$eq: 0}}
]
},
unique: true
})
// Define a non-unique index for non-active items
ItemSchema.index({code: 1, nonExistantField: 1}, {
name: 'code_1_nonunique',
partialFilterExpression: {
$and: [
{code: {$type: 'string'}},
{state: {$eq: 2}}
]
},
})
const Item = mongoose.model('Item', ItemSchema)
await mongoose.connect('mongodb://localhost:27017/so-unique-compound-indexes')
// Drop the collection for test to run correctly
await Item.deleteMany({})
// Successfully create an item
console.log('\nCreating a unique item')
const itemA = await Item.create({code: 'abc'});
// Throws error when trying to create with the same code
await Item.create({code: 'abc'})
.catch(err => {console.log('\nThrowing a duplicate error when creating with the same code')})
// Change the active code
console.log('\nChanging item state to 2')
itemA.state = 2;
await itemA.save();
// Successfully creates a new doc with the same code
await Item.create({code: 'abc'})
.then(() => console.log('\nSuccessfully created a new doc with the same code'))
.catch(() => console.log('\nThrowing a duplicate error'));
// Throws error when trying to create with the same code
await Item.create({code: 'abc'})
.catch(err => {console.log('\nThrowing a duplicate error when creating with the same code again')})
})();
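Note that if the collection already exists with an older index on code, the new partial indexes won't replace it automatically. Assuming Mongoose 5.2 or later, where Model.syncIndexes() is available, something like this (a sketch, not part of the original answer) rebuilds the indexes to match the schema:
// Drop indexes that are no longer declared in the schema and create the missing ones
await Item.syncIndexes();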
This is not possible using indexes. Even if you use a compound index on code and state, there will still be a case like this:
new document
{
code: 'abc',
state: 0
}
archived document
{
code: 'abc',
state: 2
}
Now, although you have the same code, you will not be able to archive the new document or unarchive the archived document.
You can do something like this:
const checkCode = await this.Model.findOne({ code: 'abc', state: 0 })
if (checkCode) {
  throw new Error('Code has to be unique')
} else {
  .....do something
}
I'm trying to make a notation (rating) system for movies.
A user can note a Movie in their List.
Whenever the user clicks on the frontend, the listId, movieId and note are sent to the server to update the note. The note can be set to null, but that does not remove the entry from the list.
But if the user clicks too many times, the movie's totalNotes and nbNotes end up completely broken. It feels like there is some sort of concurrency problem?
Is this the correct approach to this problem, or am I updating in a wrong way?
The related Mongoose schemas:
// Movie Schema
const movieSchema = new Schema({
// ...
note: { type: Number, default: 0 },
totalNotes: { type: Number, default: 0 },
nbNotes: { type: Number, default: 0 },
})
movieSchema.statics.updateTotalNote = function (movieId, oldNote, newNote) {
if (!oldNote && !newNote) return
const nbNotes = !newNote ? -1 : (!oldNote ? 1 : 0) // If oldNote is null we +1, if newNote is null we -1
return Movie.findOneAndUpdate({ _id: movieId }, { $inc: { nbNotes: nbNotes, totalNotes: (newNote - oldNote) } }, { new: true }).catch(err => console.error("Couldn't update note from movie", err))
}
// List Schema
const movieEntry = new Schema({
_id: false, // movie is already a unique attribute, and is populated on GET
movie: { type: Schema.Types.ObjectId, ref: 'Movies', required: true },
note: { type: Number, default: null, max: 21 },
})
const listSchema = new Schema({
user: { type: Schema.Types.ObjectId, ref: 'Users', required: true },
movies: [movieEntry]
})
The server update API (add/remove movieEntry are similar, with $push and $pull instead of $set):
exports.updateEntry = (req, res) => {
const { listId, movieId } = req.params
const movieEntry = { movieId: movieId, note: req.body.note }
List.findOneAndUpdate({ _id: listId, 'movies.movie': movieId }, { $set: { 'movies.$[elem]': movieEntry } }, { arrayFilters: [{ 'elem.movie': movieId }] })
.exec()
.then(list => {
if (!list) return res.sendStatus(404)
const oldNote = list.getMovieEntryById(movieId).note // getMovieEntryById(movieId) = return this.movies.find(movieEntry => movieEntry.movie == movieId)
Movie.updateTotalNote(movieId, oldNote, movieEntry.note)
let newList = list.movies.find(movieEntry => movieEntry.movie == movieId) // findOneAndUpdate returns the list prior to modification and I needed the oldNote, so I update the returned entry here before sending it back
newList.note = movieEntry.note
newList.status = movieEntry.status
newList.completedDate = movieEntry.completedDate
return res.status(200).json(list)
})
.catch(err => {
console.error(err)
return res.sendStatus(400)
})
}
The entries I needed to update were arrays that could grow indefinitely, so I first had to change my models and use virtuals and a separate model for the list entries.
Doing so made the work easier, and I was able to create, update and delete the entries more easily and without any concurrency problems.
This might also not have been a concurrency problem in the first place, but a transaction problem.
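For illustration, here is a minimal sketch of that kind of restructuring (the names are assumptions, not the poster's actual code): each list entry lives in its own collection, and the list exposes its entries through a virtual populate instead of an embedded, ever-growing array.
const listEntrySchema = new Schema({
  list: { type: Schema.Types.ObjectId, ref: 'Lists', required: true },
  movie: { type: Schema.Types.ObjectId, ref: 'Movies', required: true },
  note: { type: Number, default: null, max: 21 },
});
// One entry per (list, movie) pair
listEntrySchema.index({ list: 1, movie: 1 }, { unique: true });

const newListSchema = new Schema(
  { user: { type: Schema.Types.ObjectId, ref: 'Users', required: true } },
  { toJSON: { virtuals: true }, toObject: { virtuals: true } }
);
// Virtual populate: List.find().populate('movies') pulls the entries from their own collection
newListSchema.virtual('movies', {
  ref: 'ListEntries',
  localField: '_id',
  foreignField: 'list',
});

const ListEntry = mongoose.model('ListEntries', listEntrySchema);
Updating a single entry then becomes one atomic findOneAndUpdate on ListEntry, which avoids rewriting the whole embedded array.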
I need to remove the user's id from all documents in the collection except the one that was passed (in my example it is value: 'Тата'). How do I make such a request?
console.log(result)
[
{
_id: 5fa702b2f18e5723b4c00d9f,
value: 'Тата',
vote: { '36e7da32-f818-4771-bb5e-1807b2954b5f': [Array] },
date: 2020-11-07T20:25:22.611Z,
__v: 0
}
]
console.log(req.body)
{ value: 'Тата', habalkaId: '36e7da32-f818-4771-bb5e-1807b2954b5f' }
console.log(req.user._id)
5f63a251f17f1f38bc92bdab
That's all I could do so far, just a find:
router.post('/', passport.authenticate('jwt', {session: false}), (req, res) => {
FirstName.find({value: req.body.value})
.then(result => {
if (result.length) {
console.log(result)
console.log(req.body)
console.log(req.user._id)
FirstName.find({value: {$ne: 'Слоник'}}, function (err, arr) {
arr.map(e => {
if (e.vote[req.body.habalkaId].length) {
if(e.vote[req.body.habalkaId].includes(String(req.user._id))){
console.log(e.vote[req.body.habalkaId])
}
}
})
})
} else {
new FirstName({
value: req.body.value,
vote: {[req.body.habalkaId]: [String(req.user._id)]}
}).save();
}
})
// res.json({res: req.body})
})
FirstName.js
const mongoose = require('mongoose');
const Schema = mongoose.Schema;
// Create Schema
const FirstNameSchema = new Schema({
value: {
type: String
},
vote: {
type: Object
},
date: {
type: Date,
default: Date.now
}
});
module.exports = FirstName = mongoose.model('firstname', FirstNameSchema);
If I've understood correctly, you want something like this:
db.collection.update({
"value": {
"$ne": "tata"
}
},
{
"$pull": {
"vote.array_name": "id_value"
}
},
{
multi: true
})
First, find all documents whose value does not match the given one. Then, for each document found, remove the id from the array using $pull where the given id matches.
Example here
Please check the playground and confirm that I've used the correct schema and that it shows the expected output.
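For reference, the Mongoose equivalent of that update might look like this (a sketch assuming the FirstName model and the req fields shown in the question):
// Pull the current user's id from the vote array keyed by habalkaId,
// in every document whose value differs from the one that was passed
FirstName.updateMany(
  { value: { $ne: req.body.value } },
  { $pull: { [`vote.${req.body.habalkaId}`]: String(req.user._id) } }
)
  .then(result => console.log(result))
  .catch(err => console.error(err));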
Context:
I am trying to bulk-upsert an array of data, with an additional computed field: 'status'.
Status should be either:
- 'New' for newly inserted docs;
- 'Removed' for docs present in the DB but absent from the incoming dataset;
- a percentage describing the evolution of the price field, comparing the value in the DB to the one in the incoming dataset.
Implementations:
data.model.ts
import { Document, model, Model, models, Schema } from 'mongoose';
import { IPertinentData } from './site.model';
const dataSchema: Schema = new Schema({
sourceId: { type: String, required: true },
name: { type: String, required: true },
price: { type: Number, required: true },
reference: { type: String, required: true },
lastModified: { type: Date, required: true },
status: { type: Schema.Types.Mixed, required: true }
});
export interface IData extends IPertinentData, Document {}
export const Data: Model<IData> = models.Data || model<IData>('Data', dataSchema);
data.service.ts
import { Data, IPertinentData } from '../models';
export class DataService {
static async test() {
// await Data.deleteMany({});
const data = [
{
sourceId: 'Y',
reference: `y0`,
name: 'y0',
price: 30
},
{
sourceId: 'Y',
reference: 'y1',
name: 'y1',
price: 30
}
];
return Data.bulkWrite(
data.map(function(d) {
let status = '';
// @ts-ignore
console.log('price', this);
// @ts-ignore
if (!this.price) status = 'New';
// @ts-ignore
else if (this.price !== d.price) {
// @ts-ignore
status = (d.price - this.price) / this.price;
}
return {
updateOne: {
filter: { sourceId: d.sourceId, reference: d.reference },
update: {
$set: {
// Set percentage value when current price is greater/lower than new price
// Set status to nothing when new and current prices match
status,
name: d.name,
price: d.price
},
$currentDate: {
lastModified: true
}
},
upsert: true
}
};
}
)
);
}
}
... then in my backend controller, I just call it from a route:
try {
const results = await DataService.test();
return new HttpResponseOK(results);
} catch (error) {
return new HttpResponseInternalServerError(error);
}
Problem:
I've tried a lot of implementation syntaxes, but all failed because of type casting, unsupported syntax like the $ symbol, or restrictions due to the aggregation...
I feel like the above solution might be closest to a working scenario, but I'm missing a way to grab the value of the price field BEFORE status is computed and the value is replaced with the updated one.
Here the value of this is undefined, while it is supposed to point to the current document.
Questions:
Am I using the correct Mongoose way for a bulk update?
If yes, how do I get the field value?
Environment:
NodeJS 13.x
Mongoose 5.8.1
MongoDB 4.2.1
EUREKA!
Finally found a working syntax, pfeeeew...
...
return Data.bulkWrite(
data.map(d => ({
updateOne: {
filter: { sourceId: d.sourceId, reference: d.reference },
update: [
{
$set: {
lastModified: Date.now(),
name: d.name,
status: {
$switch: {
branches: [
// Set status to 'New' for newly inserted docs
{
case: { $eq: [{ $type: '$price' }, 'missing'] },
then: 'New'
},
// Set percentage value when current price is greater/lower than new price
{
case: { $ne: ['$price', d.price] },
then: {
$divide: [{ $subtract: [d.price, '$price'] }, '$price']
}
}
],
// Set status to nothing when new and current prices match
default: ''
}
}
}
},
{
$set: { price: d.price }
}
],
upsert: true
}
}))
);
...
Explanations:
Several problems were blocking me:
use '$field_value_to_check' inside the update instead of this.field, since 'this' is undefined there...
the syntax with the $ symbol seems to work only within an aggregation pipeline update, i.e. update: [], even if there is only a single $set inside...
the first condition, used for the docs inserted by the upsert, needs to check for the existence of the price field; only the syntax with the BSON $type worked...
Hope it helps other devs in the same scenario.
I'm trying to save each document in an array as an ObjectId, like this:
{
materials: {
active: "Steel",
description: "List of materials",
text: "Materials",
value: ["5c44ea8163bfea185e5e2dfb", "5c44ea8163bfea185e5e2dfc"]
}
}
I used an array of promises to save each value asynchronously and keep the returned _id:
const reference = {
materials: {
...project.materials,
value: await Promise.all(project.materials.value.map(
async (value) => {
const { _id } = await Material.findOneAndUpdate({ name: value.name }, value, { upsert: true, new: true, setDefaultsOnInsert: true }).exec();
return mongoose.Types.ObjectId(_id);
}
))
},
...
}
Is there a simpler way?