I'm trying to keep a history of states in a subdocument array with Mongoose 4.9.5 and MongoDB 3.2.7.
Example of document structure:
company (Schema)
  employees (Schema): [ ]
    currentState: String
    states (Schema): [ ]
      state: String
      starts: Date
      ends: Date
When I change an employee's state, I want to update the currentState, push the new state into the state-history array, and set the 'ends' timestamp on the previous state.
// I get the last state position from a previous find request
var lastStateIndex = employee.stateHistory.length - 1;
var changeStateDate = new Date();

// Prepare the update (note the computed property key for the indexed path)
var query = { _id: companyId, "employees._id": employeeId };
var update = {
  $set: {
    "employees.$.state": newState,
    [`employees.$.stateHistory.${lastStateIndex}.ends`]: changeStateDate
  },
  $push: {
    "employees.$.stateHistory": {
      state: newState,
      starts: changeStateDate
    }
  }
};

Company.findOneAndUpdate(query, update, { new: true }, ... )
MongoDB returns the following error:
{"name":"MongoError","message":"Cannot update 'employees.0.stateHistory.0.ends' and 'employees.0.stateHistory' at the same time","ok":0,"errmsg":"Cannot update 'employees.0.stateHistory.0.ends' and 'employees.0.stateHistory' at the same time","code":16837}
Any suggestions on how to avoid running two updates for this purpose?
Alternatively, is there a workaround that avoids storing the 'ends' date altogether, calculating it later from the 'starts' of the next item in the array?
Thank you,
I expected this to already be answered elsewhere, but no other reasonable response seems to exist. As commented, you cannot actually do this in a single update operation because the operations "conflict" on the same path. But .bulkWrite() allows "multiple updates" to be applied in a single request and response.
Company.bulkWrite([
  { "updateOne": {
    "filter": { "_id": companyId, "employees._id": employeeId },
    "update": {
      "$set": {
        "employees.$.state": newState,
        [`employees.$.stateHistory.${lastStateIndex}.ends`]: changeStateDate
      }
    }
  }},
  { "updateOne": {
    "filter": { "_id": companyId, "employees._id": employeeId },
    "update": {
      "$push": {
        "employees.$.stateHistory": {
          "state": newState,
          "starts": changeStateDate
        }
      }
    }
  }}
])
Now of course .bulkWrite() does not return the "modified document" like .findOneAndUpdate() does. So if you need to actually return the document, then you need to add to the Promise chain instead:
Company.bulkWrite([
  { "updateOne": {
    "filter": { "_id": companyId, "employees._id": employeeId },
    "update": {
      "$set": {
        "employees.$.state": newState,
        [`employees.$.stateHistory.${lastStateIndex}.ends`]: changeStateDate
      }
    }
  }},
  { "updateOne": {
    "filter": { "_id": companyId, "employees._id": employeeId },
    "update": {
      "$push": {
        "employees.$.stateHistory": {
          "state": newState,
          "starts": changeStateDate
        }
      }
    }
  }}
]).then( result => {
  // maybe inspect the result
  return Company.findById(companyId);
})
Of course it is "possible" that another modification is made to the document between when the .bulkWrite() is applied and when the .findById() executes. But that is the cost of the operation you are doing.
It is generally best to consider whether you actually need the returned document at all. In most instances you already have the information, and you are aware of any "updates" because you are the one "issuing" them; if you want to be "truly reactive", then you should be listening for change events on the data through a socket instead.
Note that you could simply "chain" multiple .findOneAndUpdate() calls instead, but that means multiple calls and responses from the server, as opposed to the single request with .bulkWrite(). So there really isn't anything to gain by doing otherwise.
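For completeness, the chained form would look something like this (a minimal sketch, reusing the same filter and variables as above, just to show the extra round trip):

Company.findOneAndUpdate(
  { "_id": companyId, "employees._id": employeeId },
  { "$set": {
    "employees.$.state": newState,
    [`employees.$.stateHistory.${lastStateIndex}.ends`]: changeStateDate
  }},
  { "new": true }
).then(() =>
  // second round trip: push the new history entry
  Company.findOneAndUpdate(
    { "_id": companyId, "employees._id": employeeId },
    { "$push": {
      "employees.$.stateHistory": { "state": newState, "starts": changeStateDate }
    }},
    { "new": true }
  )
)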
I have a collection MyCollection which basically consists of an _id and a string field called comment.
This collection should be bulk-updatable.
That's done like this:
const bulkObjectsToWrite = []

for (const obj of inputObjects) {
  bulkObjectsToWrite.push({
    updateOne: {
      filter: { _id: obj._id },
      update: {
        $set: {
          comment: obj.comment
        }
      }
    }
  })
}
await MyCollection.bulkWrite(bulkObjectsToWrite)
So far so good.
However, the requirement now is that a commentHistory should be maintained, which should look like [{"oldValue": "oldValueOfComment", "newValue": "newValueOfComment"}, ...]
I know I need to use $push for adding a new object to the commentHistory array. But how do I access the comment of the document updated right now, i.e. its current value?
I've tried
$push: {
commentHistory: {
newValue: obj.comment,
oldValue: '$comment',
},
},
but to no avail. The string $comment is added hard-coded, instead of the field being accessed.
(Using Mongoose 5.12.10 and Mongo 4.4.18)
You need to use an update with an aggregation pipeline.
db.collection.update({
  "key": 1
},
[
  {
    $set: {
      "comment": "New",
      "commentHistory": {
        "$concatArrays": [ // concatenate existing history array with new array entry
          "$commentHistory",
          [
            {
              "newValue": "New",
              "oldValue": "$comment" // referencing the existing value
            }
          ]
        ]
      }
    }
  }
])
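Applied to the bulk update from the question, each updateOne can carry such a pipeline. A minimal sketch, assuming MongoDB 4.2+ (which first allowed pipeline updates; the 4.4.18 in the question qualifies), that your Mongoose version passes pipeline updates through bulkWrite, and the inputObjects array from above:

const bulkObjectsToWrite = inputObjects.map(obj => ({
  updateOne: {
    filter: { _id: obj._id },
    // An array as the update value makes this a pipeline update,
    // so '$comment' refers to the document's current field value
    update: [
      {
        $set: {
          comment: obj.comment,
          commentHistory: {
            $concatArrays: [
              { $ifNull: ['$commentHistory', []] }, // tolerate docs with no history yet
              [{ newValue: obj.comment, oldValue: '$comment' }]
            ]
          }
        }
      }
    ]
  }
}))

await MyCollection.bulkWrite(bulkObjectsToWrite)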
We have a college project in CouchDB and I'm using Node. I want to create a view that returns a count of all my documents by email.
I cannot find anything that works and I'm not sure what I'm missing; I've tried a lot of different reduce functions and emit calls.
Thanks for any answers.
The documents have 2 fields, name and email
Do not use the db endpoint because the response field doc_count includes design documents along with other documents that may not have an email field.
A straightforward way to do this is with a view. The code snippet below demonstrates the difference between the db info doc_count and a view's total_rows using PouchDB. There are probably more interesting uses for the index as well.
The design doc is trivial
{
  _id: '_design/my_index',
  views: {
    email: {
      map: function(doc) {
        if (doc.email) emit(doc.email);
      }.toString()
    }
  }
}
And the view query is very efficient and simple.
db.query('my_index/email', {
  include_docs: false,
  limit: 0
})
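With limit: 0 no rows are returned, but total_rows still reports how many documents the view indexed. The response looks roughly like this (row count assumed from the test documents below):

{
  "total_rows": 6,
  "offset": 0,
  "rows": []
}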
const gel = id => document.getElementById(id);
let db;

function setJsonToText(elId, json) {
  gel(elId).innerText = JSON.stringify(json, undefined, 3);
}

async function view() {
  // display db info
  setJsonToText('info', await db.info());

  // display total number of rows in the email index
  const result = await db.query('my_index/email', {
    include_docs: false,
    limit: 0
  });
  setJsonToText('view', result);
}

// canned test documents
function getDocsToInstall() {
  return [{
      email: 'jerry@garcia.com',
    },
    {
      email: 'bob@weir.com',
    },
    {
      email: 'phil@lesh.com'
    },
    {
      email: 'wavy@gravy.com'
    },
    {
      email: 'samson@delilah.com'
    },
    {
      email: 'cosmic@charlie.com'
    },
    // design doc
    {
      _id: '_design/my_index',
      views: {
        email: {
          map: function(doc) {
            if (doc.email) emit(doc.email);
          }.toString()
        }
      }
    }
  ]
}

// init example db instance
async function initDb() {
  db = new PouchDB('test', {
    adapter: 'memory'
  });
  await db.bulkDocs(getDocsToInstall());
};

(async() => {
  await initDb();
  await view();
})();
<script src="https://github.com/pouchdb/pouchdb/releases/download/7.1.1/pouchdb-7.1.1.min.js"></script>
<script src="https://github.com/pouchdb/pouchdb/releases/download/7.1.1/pouchdb.memory.min.js"></script>
<pre>Info</pre>
<pre id='info'></pre>
<div style='margin-top:2em'></div>
<pre>email view</pre>
<pre id='view'>
</pre>
You can use GET /{db}, which returns information about the specified database. This is a JSON object that contains the property doc_count.
doc_count (number) – A count of the documents in the specified database.
With Angular for example, this could be done with the following method:
async countDocuments(database: string): Promise<number> {
  return this.http.get<any>(this.url('GET', database), this.httpOptions).toPromise()
    .then(info => info['doc_count']);
}
Assumption:
Assuming that following documents are present in the Customers database:
[
  {
    "_id": "93512c6c8585ab360dc7f535ff00bdfa",
    "_rev": "1-299289ee89275a8618cd9470733035f4",
    "name": "Tom",
    "email": "tom@domain.com"
  },
  {
    "_id": "93512c6c8585ab360dc7f535ff00c930",
    "_rev": "1-a676883d6f1b5bce3b0a9ece92da6964",
    "name": "Tom Doe",
    "email": "tom@domain.com"
  },
  {
    "_id": "93512c6c8585ab360dc7f535ff00edc0",
    "_rev": "1-09b5bf64cfe66af7e1134448e1a328c3",
    "name": "John",
    "email": "john@domain.com"
  },
  {
    "_id": "93512c6c8585ab360dc7f535ff010988",
    "_rev": "1-88e347af11cfd1e40e63920fa5806fd2",
    "name": "Alan",
    "email": "alan@domain.com"
  }
]
If I understand your query correctly, then based on the above data you need the result set given below.
{
  "tom@domain.com": 2,
  "alan@domain.com": 1,
  "john@domain.com": 1
}
Solution:
In order to achieve the above, consider the following design document containing a view with map and reduce functions.
{
  "_id": "_design/Customers",
  "views": {
    "by-email": {
      "map": "function (doc) {
        if (doc.email) {
          emit(doc.email, doc._id);
        }
      }",
      "reduce": "_count"
    }
  },
  "language": "javascript"
}
The above map function emits the value of the document's email key whenever that key exists in the document.
The reduce function _count is a built in reducer (provided by CouchDB) that does the counting logic.
Executing View Query:
In order to query this view, you need to select the view, enable reduce (running the reduce step is optional), and set the group level to 1.
Here is how you would do it through the UI (original screenshot not included).
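Equivalently, the view can be queried over HTTP. A sketch, assuming the database is named customers, CouchDB is on localhost:5984, and this runs inside an async function:

// group_level=1 groups the reduce output by email,
// returning one row per distinct address with its count
const response = await fetch(
  'http://localhost:5984/customers/_design/Customers/_view/by-email?reduce=true&group_level=1'
);
const { rows } = await response.json();
// rows: [ { key: "alan@domain.com", value: 1 }, { key: "john@domain.com", value: 1 }, { key: "tom@domain.com", value: 2 } ]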
Result:
Here is the result given by the above query (screenshot omitted); it matches the result set shown earlier, one row per distinct email with its count.
Hope this helped.
For more details about other reduce functions and group levels, please refer to the CouchDB documentation.
Cheers.
I have a collection named products with almost 100k documents. I want to introduce a new key called secondaryKey with a unique UUID value in each document.
I do this using nodejs.
The problem I am facing:
When I try the below query,
db.collection('products').updateMany({},{"$set":{secondaryKey: uuid()}});
Here it updates all the documents with the same uuid value, because uuid() is evaluated once on the client before the query is sent.
I tried a loop to update the documents one by one, but the issue is that I don't have a filter value for updateOne, since I want to update all the documents.
Can anyone please help me here.
Thanks :)
If you are using MongoDB version >= 4.4, you can try this:
db.products.updateMany(
  {},
  [
    {
      $set: {
        secondaryKey: {
          $function: {
            body: function() {
              return UUID().toString().split('"')[1];
            },
            args: [],
            lang: "js"
          }
        }
      }
    }
  ]
);
Output
[
  {
    "_id": ObjectId("..."),
    "secondaryKey": "f41b15b7-a0c5-43ed-9d15-69dbafc0ed29"
  },
  {
    "_id": ObjectId("..."),
    "secondaryKey": "50ae7248-a92e-4b10-be7d-126b8083ff64"
  },
  {
    "_id": ObjectId("..."),
    "secondaryKey": "fa778a1a-371b-422a-b73f-8bcff865ad8e"
  }
]
Since it's not the same value you want to put in each document, you have to use a loop.
In the loop, update the current document of the iteration, filtering by its _id in the updateOne.
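A minimal sketch of that approach, assuming the uuid npm package and an already-connected db handle (names are illustrative):

const { v4: uuidv4 } = require('uuid');

async function addSecondaryKeys(db) {
  // fetch only the _ids; each one becomes the filter of its own updateOne
  const ids = await db.collection('products')
    .find({}, { projection: { _id: 1 } })
    .toArray();

  const ops = ids.map(({ _id }) => ({
    updateOne: {
      filter: { _id },                              // targets exactly one document
      update: { $set: { secondaryKey: uuidv4() } }  // fresh uuid per operation
    }
  }));

  // one bulkWrite call for all the updates (the driver batches internally)
  await db.collection('products').bulkWrite(ops);
}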
The above reply didn't work for me. Plus, $function requires enabling server-side JavaScript on your database, which compromises security. The best way is to not overload your server and do the work locally, as below:
const { customAlphabet } = require('nanoid')

async function addUniqueIds() {
  const movies = client.db("localhost").collection("productpost");
  // create the generator once, outside the loop: 10-character numeric ids
  const nanoid = customAlphabet('1234567890', 10)

  const result2 = []
  const result = await movies.find({}).toArray()
  result.forEach(element => {
    element.id = nanoid() // assign a fresh unique id to each document
    result2.push(element)
  });

  await movies.deleteMany({})
  await movies.insertMany(result2)
}
It will delete all objects in your collection and re-insert them with the new ids, using nanoid for the unique ids.
This is one of the database objects after adding the unique id:
{ "_id": { "$oid": "334a98519a20b05c20574dd1" }, "attach": "[\"http://localhost:8000/be/images/2022/4/bitfinicon.png\"]", "title": "jkn jnjn", "description": "jnjn", "price": 4, "color": "After viewing I am 48.73025772956596% more satisfied with life.", "trademark": "", "category": "[]", "productstate": "Published", "createdat": { "$date": "2022-04-03T17:40:54.743Z" }, "language": "en"}
P.S.: Please back up your collection before doing this, or filter the array to your needs so you don't have to go through the whole collection.
I'm using nodeJS + Express + Mongoose + mongoDB
Here's my mongoDB User Schema:
{
  friends: [ObjectId],
  friends_count: Number
}
Whenever a user adds a friend, a friendId is pushed into the friends array and friends_count is increased by 1.
There are a lot of actions that may change the friends array, and I might forget to update friends_count. I want to make sure that friends_count always equals friends.length.
Is there a good way or framework feature to ensure that?
P.S.
I know how to update friends_count; what I mean is, what if I forget to?
Is there a way to automatically keep these two attributes in sync?
Use the $ne operator as a "query" argument to .update() and the $inc operator to apply when that "friend" did not exist within the array as you $push the new member:
User.update(
  { "_id": docId, "friends": { "$ne": friendId } },
  {
    "$push": { "friends": friendId },
    "$inc": { "friends_count": 1 }
  },
  function(err, numberAffected) {
  }
)
Or to "remove" a friend from the list, do the reverse case with $pull:
User.update(
  { "_id": docId, "friends": friendId },
  {
    "$pull": { "friends": friendId },
    "$inc": { "friends_count": -1 }
  },
  function(err, numberAffected) {
  }
)
That way your friends_count stays in sync with the number of array elements present.
All you need to do is to update friends_count in both add and remove functions. For example:
User.findById(userId, function (err, user) {
  if (user) {
    user.friends.push(friendId);
    user.friends_count++;
    user.save();
  }
});
FYI, I don't think it is necessary to keep friends_count at all when you can get the total number of friends from friends.length.
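If every write goes through Document#save(), a Mongoose pre-save hook can also keep the counter in sync automatically. A minimal sketch, assuming the schema from the question is called UserSchema:

// Recompute the counter on every save, so it cannot drift.
// Note: update queries bypass document middleware, so this only
// protects code paths that load the document and call .save()
UserSchema.pre('save', function (next) {
  this.friends_count = this.friends.length;
  next();
});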
I want to make a query that records whether a user likes or unlikes my status, and I want it in a single query so that I don't call the DB twice from my Node.js server. Does anyone have a solution to my problem?
For add we are using:
collection.update({ _id: id },
  { $push: {
    'user_id': 'xxxx-xxxx-xxxx-xxxx' }
  }
);
For remove we are using:
collection.update({ _id: id },
  { $pull: {
    'user_id': 'xxxx-xxxx-xxxx-xxxx' }
  }
);
Now I want to use both of them in one query: if the id is present in the array, remove it; if not, add it.
MongoDB does not allow both a $pull and a $push, or any other pair of operators, to update the same "path" (and therefore the same array) in a single statement. This is mainly due to the server-side handling, where the update operators in a statement are never considered to be ordered.
Example:
{
  "responses": [
    { "user": "Tom", "status": "like" },
    { "user": "Sarah", "status": "unlike" }
  ]
}
Not that it would make much sense, but you cannot do this:
db.collection.update(
  {},
  {
    "$pull": { "responses": { "user": "Tom", "status": "like" } },
    "$push": { "responses": { "user": "Tom", "status": "unlike" } }
  }
)
The single operation here contains both $push and $pull on the same path, "responses". Regardless of how you construct the statement, neither is guaranteed to execute in any particular order.
While we could "match" the position for "Tom" and change his "status" to "unlike" instead, a better model is to do this:
{
  "likes": ["Tom"],
  "unlikes": ["Sarah"],
  "likesTotal": 1,
  "unlikesTotal": 1,
  "totalScore": 0
}
What this means is that if I want to change the "vote" for "Tom", you build a construct like this, with the help of Bulk operations to get a single request and response:
var bulk = db.collection.initializeOrderedBulkOp();

// Cast "Tom's" unlike where they had a "like" already
bulk.find({
  "likes": "Tom",
  "unlikes": { "$ne": "Tom" }
}).updateOne({
  "$pull": { "likes": "Tom" },
  "$push": { "unlikes": "Tom" },
  "$inc": {
    "likesTotal": -1,
    "unlikesTotal": 1,
    "totalScore": -2   // swings by two: from +1 (like) to -1 (unlike)
  }
});

// Cast "Tom's" new vote where nothing was there at all
bulk.find({
  "unlikes": { "$ne": "Tom" },
  "likes": { "$ne": "Tom" }
}).updateOne({
  "$push": { "unlikes": "Tom" },
  "$inc": {
    "unlikesTotal": 1,
    "totalScore": -1
  }
});

bulk.execute();
This produces a really nice pattern. Not only is each update operation basically "atomic", in that by acting on separate document properties each modifier can execute without conflict, but as a "Bulk" operation both updates covering all the possible conditions are sent in a single request and received in a single response.
Of course your "client" logic should also be aware of the current status for who has "liked/disliked" on a particular item, but enforcing this in the general API is good practice.
It keeps arrays in check, and also keeps useful counters in check for general data and general querying purposes, without the need to "calculate" lengths of arrays or matching types.