How to update collection value - node.js

Assume I have a document in my db like this:
{ name: "alex", id: "1" }
I want to update the collection by prepending "Mr." to the value in the name field:
{ name: "Mr.alex", id: "1" }
How can I do this? Should I write 2 queries, like this?
db.collection("user").find({ id: "1" }).toArray(function(err, result) {
    var name = result[0].name;
    db.collection("user").updateOne({ id: "1" }, { name: "Mr." + name }, function(err, result) {
    });
});
Isn't there a better way to do this, like x = x + 1 in MongoDB?

AFAIK this takes two queries; an update operator can't reference a field's value and needs to be provided the value.
However, if you need to do this for all documents, or a large number of them, you can write a script that iterates a cursor with forEach, changes the name on each document, and calls db.user.save() with the modified user object.
The bottom line remains the same.

Try the following (mongo shell):
var cursor = db.users.find({});
while (cursor.hasNext()) {
    var user = cursor.next();
    user.name = "Mr." + user.name;
    db.users.save(user);
}
Not possible using one query. You have to iterate through the documents and save them with updated result.

Currently there is no way to reference the retrieved document in the same query, which would allow you to find and update a document within the same operation.
Thus, you will have to make multiple queries to accomplish what you are looking for:
/*
 * Single Document
 */
// I'm assuming the id field is unique and can only return one document
var doc = db.collection('user').findOne({ id: '1' });
try {
    db.collection('user').updateOne({ id: '1' }, { $set: { name: 'Mr. ' + doc.name } });
} catch (e) {
    print(e);
}
If you want to handle multiple update operations, you can do so by using a Bulk() operations builder:
/*
 * Multiple Documents
 */
var bulk = db.collection('user').initializeUnorderedBulkOp();
// forEach synchronously iterates the cursor, queuing one update per user
db.collection('user').find({ id: { $gt: 1 } }).forEach(function(user) {
    bulk.find({ id: user.id }).update({ $set: { name: 'Mr. ' + user.name } });
});
bulk.execute();

The key to updating the collection with an existing field's value is to loop through the array returned from the find().toArray() cursor method and update the collection using the Bulk API, which allows you to send many update operations within a single request (as a batch).
Let's see how this pans out with some examples:
a) For MongoDB server version 3.2 and above
db.collection("user").find({ id: "1" }).toArray(function(err, result) {
    var operations = [];
    result.forEach(function(doc) {
        operations.push({
            "updateOne": {
                "filter": {
                    "_id": doc._id,
                    "name": doc.name
                },
                "update": {
                    "$set": { "name": "Mr." + doc.name }
                }
            }
        });
        // Send once every 500 requests only
        if (operations.length % 500 === 0) {
            db.collection("user").bulkWrite(operations, function(err, r) {
                // do something with result
            });
            operations = [];
        }
    });
    // Clear remaining queue
    if (operations.length > 0) {
        db.collection("user").bulkWrite(operations, function(err, r) {
            // do something with result
        });
    }
});
In the above, you initialise an operations array that holds the update operations and is passed to the Bulk API's bulkWrite() function.
The result from the find().toArray() cursor method is then iterated to build the operations array of update objects. The operations are limited to batches of 500.
The reason for choosing a value lower than the default batch limit of 1000 is generally a controlled choice. As noted in the documentation, MongoDB by default will send operations to the server in batches of at most 1000 at a time, and there is no guarantee that these default 1000-operation requests actually fit under the 16MB BSON limit.
So you would still want to stay on the "safe" side and impose a lower batch size that you can effectively manage, so that each batch totals less than the data limit in size when sent to the server.
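The manual modulo bookkeeping above can also be factored into a small helper; here is a sketch in plain JavaScript (the helper name is hypothetical) that splits a queued operations array into batches of at most 500 for successive bulkWrite() calls:

```javascript
// Hypothetical helper: split a large operations array into batches of at
// most `batchSize` elements, mirroring the `% 500` bookkeeping above.
// Each resulting batch can then be passed to a separate bulkWrite() call.
function chunkOperations(operations, batchSize) {
    const batches = [];
    for (let i = 0; i < operations.length; i += batchSize) {
        batches.push(operations.slice(i, i + batchSize));
    }
    return batches;
}
```

With 1201 queued updates, for instance, this yields three batches of 500, 500, and 201 operations.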
b) If using MongoDB v3.0 or below:
// Get the collection
var col = db.collection('user');
// Initialize the unordered Batch
var batch = col.initializeUnorderedBulkOp();
// Initialize counter
var counter = 0;
col.find({ id: "1" }).toArray(function(err, result) {
    result.forEach(function(doc) {
        batch.find({
            "_id": doc._id,
            "name": doc.name
        }).updateOne({
            "$set": { "name": "Mr. " + doc.name }
        });
        counter++;
        if (counter % 500 === 0) {
            batch.execute(function(err, r) {
                // do something with result
            });
            // Re-initialize batch
            batch = col.initializeUnorderedBulkOp();
        }
    });
    if (counter % 500 !== 0) {
        batch.execute(function(err, r) {
            // do something with result
        });
    }
});

Related

MongoDB: Check if a field is the same in all the documents in a collection

How can I check whether a specific field's value is the same in all the documents in a collection?
I have a collection called Game where I store the users' game responses, and the game logic should check whether the responses are all equal or not. How can I do that?
You need to do the opposite:
find({ field: { $ne: expectedValue } })
If this query returns any documents, then there are documents that do not have the desired value.
You can get the count of unique values of that field. If more than 1, they are not the same.
You can also group by the field value. If more than 1 record, they are not the same.
const rows = await myCollection.aggregate([
    {
        $group: {
            _id: '$theField'
        }
    }
]).toArray()
if (rows.length === 1) {
    // same field value
}
Or do the counting in the pipeline to save bandwidth.
const rows = await myCollection.aggregate([
    {
        $group: {
            _id: '$theField'
        }
    },
    { $count: 'count' }
]).toArray()
if (rows.length && rows[0].count === 1) {
    // same value
}
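The idea behind the $group check can be sketched in plain JavaScript over an in-memory array (hypothetical data, not a real query): collect the distinct values of the field and test whether exactly one remains.

```javascript
// Collect the distinct values of `field` across all docs; the field is
// "the same in every document" exactly when one distinct value remains,
// which is what grouping on the field and counting the groups checks.
function allSameField(docs, field) {
    const distinct = new Set(docs.map(doc => doc[field]));
    return distinct.size === 1;
}
```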

How to implement pagination for mongodb in node.js using official mongodb client?

I want to implement pagination for MongoDB in a Node.js environment using the official mongodb package. I tried to find answers on the internet, but they are all Mongoose-based links. I don't want to use Mongoose.
How can I implement pagination using the official client API given at
http://mongodb.github.io/node-mongodb-native/3.1/api/
The offset-based approach has a big flaw: if the results list changes between calls to the API, the indices shift, causing an item to be either returned twice or skipped and never returned.
This problem is demonstrated at
https://www.sitepoint.com/paginating-real-time-data-cursor-based-pagination/
The time-based pagination approach is a little better, because results are no longer skipped. If you query the first page and then a new item is deleted, it won't shift the results in your second page, and all is fine. However, this approach has a major flaw: what if more than one item was created at the same time?
Best would be to use cursor-based pagination,
which can be implemented using any field in the collection that is unique, orderable, and immutable.
_id satisfies all of the unique, orderable, and immutable conditions. Based on this field, we can sort and return a page of results, with the _id of the last document as the cursor for the subsequent request.
curl https://api.mixmax.com/items?limit=2
const items = db.items.find({}).sort({
    _id: -1
}).limit(2);
const next = items[items.length - 1]._id;
res.json({ items, next })
when the user wants to get the second page, they pass the cursor (as next) in the URL:
curl 'https://api.mixmax.com/items?limit=2&next=590e9abd4abbf1165862d342'
const items = db.items.find({
    // note: with the native driver, req.query.next must be converted to an ObjectId
    _id: { $lt: req.query.next }
}).sort({
    _id: -1
}).limit(2);
const next = items[items.length - 1]._id;
res.json({ items, next })
If we want to return results in a different order, such as by the date of the item, we add sort=launchDate to the query string.
curl 'https://api.mixmax.com/items?limit=2&sort=launchDate'
const items = db.items.find({}).sort({
    launchDate: -1
}).limit(2);
const next = items[items.length - 1].launchDate;
res.json({ items, next })
For the subsequent page request:
curl 'https://api.mixmax.com/items?limit=2&sort=launchDate&next=2017-09-11T00%3A44%3A54.036Z'
const items = db.items.find({
    launchDate: { $lt: req.query.next }
}).sort({
    launchDate: -1
}).limit(2);
const next = items[items.length - 1].launchDate;
res.json({ items, next });
But what if we launched a bunch of items on the same day and time? Now our launchDate field is no longer unique and doesn't satisfy the unique, orderable, and immutable conditions, so we can't use it as the cursor field on its own. We can, however, use two fields to generate the cursor. Since we know that the _id field in MongoDB always satisfies the above three conditions, we know that if we use it alongside our launchDate field, the combination of the two fields satisfies the requirements and can be used together as a cursor field.
curl 'https://api.mixmax.com/items?limit=2&sort=launchDate'
const items = db.items.find({}).sort({
    launchDate: -1,
    _id: -1 // secondary sort in case there are duplicate launchDate values
}).limit(2);
const lastItem = items[items.length - 1];
// The cursor is a concatenation of the two cursor fields, since both are needed to satisfy the requirements of being a cursor field
const next = `${lastItem.launchDate}_${lastItem._id}`;
res.json({ items, next });
For the subsequent page request:
curl 'https://api.mixmax.com/items?limit=2&sort=launchDate&next=2017-09-11T00%3A44%3A54.036Z_590e9abd4abbf1165862d342'
const [nextLaunchDate, nextId] = req.query.next.split('_');
const items = db.items.find({
    $or: [{
        launchDate: { $lt: nextLaunchDate }
    }, {
        // If the launchDate is an exact match, we need a tiebreaker, so we use the _id field from the cursor.
        launchDate: nextLaunchDate,
        _id: { $lt: nextId }
    }]
}).sort({
    launchDate: -1,
    _id: -1
}).limit(2);
const lastItem = items[items.length - 1];
// The cursor is a concatenation of the two cursor fields, since both are needed to satisfy the requirements of being a cursor field
const next = `${lastItem.launchDate}_${lastItem._id}`;
res.json({ items, next });
Reference: https://engineering.mixmax.com/blog/api-paging-built-the-right-way/
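The compound cursor above is just a string concatenation of the two fields; a sketch of the encode/decode pair (hypothetical helper names, assuming neither part contains an underscore, which holds for ISO dates and ObjectId hex strings):

```javascript
// Build the "<launchDate>_<_id>" cursor from the last item on the page.
function encodeCursor(lastItem) {
    return `${lastItem.launchDate}_${lastItem._id}`;
}

// Split the cursor back into its two parts for the $or query; assumes
// neither part contains an underscore (true for ISO dates and ObjectIds).
function decodeCursor(next) {
    const [nextLaunchDate, nextId] = next.split('_');
    return { nextLaunchDate, nextId };
}
```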
Using the recommended pagination approach with limit() and skip() (see here):
const MongoClient = require('mongodb').MongoClient;
MongoClient.connect('mongodb://localhost:27017').then((client) => {
    const db = client.db(mongo.db);
    db.collection('my-collection').find({}, { limit: 10, skip: 0 }).toArray().then((documents) => {
        // First 10 documents
        console.log(documents);
    });
    db.collection('my-collection').find({}, { limit: 10, skip: 10 }).toArray().then((documents) => {
        // Documents 11 to 20
        console.log(documents);
    });
});
Here's a pagination function:
function studentsPerPage(pageNumber, nPerPage) {
    return db.collection('students').find({},
        {
            limit: nPerPage,
            skip: pageNumber > 0 ? ((pageNumber - 1) * nPerPage) : 0
        });
}
Here is an API endpoint built on MongoDB and Node.js (Mongoose-style queries):
module.exports.fetchLoans = function(req, res, next) {
    var perPage = 5;
    var page = req.body.page || 1;
    loans
        .find({ userId: req.user._id })
        .select("-emi")
        .skip(perPage * page - perPage)
        .limit(perPage)
        .sort({ timestamp: -1 })
        .exec(function(err, loan) {
            if (loan != null) {
                loans
                    .find({ userId: req.user._id })
                    .count()
                    .exec(function(err, count) {
                        if (count != null) {
                            res.json({
                                success: true,
                                loans: loan,
                                currentpage: page,
                                totalpages: Math.ceil(count / perPage)
                            });
                        } else {
                            console.log("Milestone Error: ", err);
                            res.json({ success: false, error: "Internal Server Error. Please try again." });
                        }
                    });
            } else {
                console.log("Milestone Error: ", err);
                res.json({ success: false, error: "Internal Server Error. Please try again." });
            }
        });
};
In this code, you will have to provide the page number on every request.
You can use skip and limit options to implement pagination
module.exports = (data) => {
    let page = parseInt(data.page);
    let limit = parseInt(data.limit);
    let skip = 0;
    if (page > 1) {
        // skip the documents belonging to the previous pages
        skip = (page - 1) * limit;
    }
    let mongoClient = require('mongodb').MongoClient;
    mongoClient.connect('mongodb://localhost:27017').then((client) => {
        let db = client.db('your-db');
        db.collection('your-collection').find({}, { limit: limit, skip: skip }).toArray().then((documents) => {
            console.log(documents);
        });
    });
};
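The page-to-skip arithmetic is the part that's easy to get wrong, so here it is as a standalone sketch (hypothetical helper name): page 1 must skip 0 documents, so the multiplier is page - 1, not page.

```javascript
// Convert a 1-based page number and page size into the number of
// documents to skip: page 1 skips 0, page 2 skips `limit`, and so on.
function skipForPage(page, limit) {
    return page > 1 ? (page - 1) * limit : 0;
}
```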

How to get all updated documents' values after updateMany()?

Desired Behaviour
After updating a property value in multiple documents, access the value of that updated property for each document updated.
(for context, I am incrementing multiple users' notification counts, and then sending each updated value back to the respective user, if they are logged in, via socket.io)
What I've Tried
let collection = mongo_client.db("users").collection("users");
let filter = { "user_email": { $in: users_to_notify } };
let update = { $inc: { "new_notifications": 1 } };
let result = await collection.updateMany(filter, update);
// get the `new_notifications` value for all updated documents here
There doesn't seem to be a returnOriginal type option applicable to the updateMany() method.
https://docs.mongodb.com/manual/reference/method/db.collection.updateMany
http://mongodb.github.io/node-mongodb-native/3.2/api/Collection.html#updateMany
That is understandable, because returnOriginal only makes sense when updating one document, e.g. in the options for the findOneAndUpdate() method.
Question
If all of the above assumptions are true, what would be the best way to get the new_notifications value for all updated documents after the updateMany() method has finished?
Is it just a matter of making another database call to get the updated values?
for example - this works:
let collection = mongo_client.db("users").collection("users");
let filter = { "user_email": { $in: users_to_notify } };
let update = { $inc: { "new_notifications": 1 } };
await collection.updateMany(filter, update);
// make another database call to get updated values
var options = { projection: { user_email: 1, username: 1, new_notifications: 1 } };
let docs = await collection.find(filter, options).toArray();
/*expected result:
[{
"user_email": "user_1#somedomain.com",
"username": "user_1",
"new_notifications": 17
},
{
"user_email": "user_2#somedomain.com",
"username": "user_2",
"new_notifications": 5
}]*/
// create an object where each doc's user_email is a key with the value of new_notifications
var new_notifications_object = {};
// iterate over docs and add each property and value
for (let obj of docs) {
new_notifications_object[obj.user_email] = obj.new_notifications;
}
/*expected result:
{
"user_1#somedomain.com": 17,
"user_2#somedomain.com": 5
}*/
// iterate over logged_in_users
for (let key of Object.keys(logged_in_users)) {
// get logged in user's email and socket id
let user_email = logged_in_users[key].user_email;
let socket_id = key;
// if the logged in user's email is in the users_to_notify array
if (users_to_notify.indexOf(user_email) !== -1) {
// emit to the socket's personal room
let notifications_count = new_notifications_object[user_email];
io.to(socket_id).emit('new_notification', { "notifications_count": notifications_count });
}
}
updateMany() does not return the updated documents, only some counts. You have to run another query to get those updated documents.
I would suggest you do a find({}) first, then findByIdAndUpdate() in a loop over the found items:
let updatedRecord = [];
let result = await Model.find(query);
// note: Array.prototype.some with an async callback is not awaited, so a
// sequential for...of loop is used here instead
for (const e of result) {
    updatedRecord.push(await Model.findByIdAndUpdate(e._id, updateObj, { new: true }));
}
return updatedRecord;

Get total count along with Mongoose Query skip & limit

I have JSON data which contains many objects. I want to limit the data for pagination, and I need the total items count. Please help.
Model.find().skip((pageNumber-1)*limit).limit(limit).exec()
I want the count and skipped data in response.
You can use the async library to run 2 queries at once. In your case, you can run one query to get the number of documents and another for pagination.
Example with 'User' model:
var async = require('async');
var User = require('./models/user');
var countQuery = function(callback) {
    User.count({}, function(err, count) {
        if (err) { callback(err, null); }
        else {
            callback(null, count);
        }
    });
};
var retrieveQuery = function(callback) {
    User.find({}).skip((page - 1) * PAGE_LIMIT)
        .limit(PAGE_LIMIT)
        .exec(function(err, doc) {
            if (err) { callback(err, null); }
            else {
                callback(null, doc);
            }
        });
};
async.parallel([countQuery, retrieveQuery], function(err, results) {
    // err contains the first error reported by any of the functions
    // results contains an array of all the results
    // results[0] will contain the count from the countQuery function
    // results[1] will contain the doc from the retrieveQuery function
    // You can send the results as
    res.json({ users: results[1], pageLimit: PAGE_LIMIT, page: page, totalCount: results[0] });
});
async allows you to run a number of queries in parallel, depending on the hardware you are using. This is faster than running the two queries one after the other to get the count and then the required documents.
Hope this helps.
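If your Mongoose version returns promises, the same run-in-parallel idea can be expressed with the built-in Promise.all instead of the async library. A sketch with stand-in query functions (in real code these would be the User.count and paginated User.find queries above):

```javascript
// Stand-ins for the two Mongoose queries; both return promises.
const countQuery = () => Promise.resolve(42);
const retrieveQuery = () => Promise.resolve([{ name: 'a' }, { name: 'b' }]);

// Both queries are started immediately, and Promise.all resolves once both
// finish, mirroring async.parallel's [count, docs] results array.
async function getPage() {
    const [totalCount, users] = await Promise.all([countQuery(), retrieveQuery()]);
    return { users, totalCount };
}
```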
I have solved it with $facet and aggregate in the following way (mongoose v3+):
const [{ paginatedResult, totalCount: [{ totalCount }] }] = await Model.aggregate([{
    $facet: {
        paginatedResult: [
            { $match: query },
            { $skip: skip },
            { $limit: limit }
        ],
        totalCount: [
            { $match: query },
            { $count: 'totalCount' }
        ]
    }
}])
where totalCount refers to the total number of records matching the search query, while paginatedResult is only the paginated slice of them.
The problem with these solutions is that for every request you are doing two queries. This becomes problematic when you have a complex data structure and large data set as performance becomes an issue. Consider instead creating a special function that listens for the /resource?count=true or /resource/count GET methods and returns only the count.
You need to perform 2 queries to achieve that: one to get the results and another to get the total items count with .count().
For example code, you can look at a "paginator" plugin for Mongoose, such as mongoose-paginate.
To perform only one query, you may use the find() method with promises and array slices. A small example would be:
getPaginated(query, page, limit) {
    return this.model.find(query)
        .lean()
        .then((value) => {
            if (value.length === 0) return { userMessage: 'Document not found' };
            const count = value.length;
            // note: this loads every matching document into memory and then
            // slices out the requested page; page === 1 must yield start === 0
            const start = parseInt(limit) * (parseInt(page) - 1);
            const end = start + parseInt(limit);
            // slicing the array
            value = value.slice(start, end);
            // could return it another way...
            value.push({ 'querySize': count });
            return value;
        })
        .catch((reason) => {
            // ...handling code
        });
}

express/mongoose update query

I'm having a problem wrapping my head around updating multiple values in my MongoDB using MongooseJS and ExpressJS.
Let's say I submit an array of 2 or more objects from my frontend to "express routing", where I use the req.body parameters to fetch it. My req.body looks like this:
[
    { article: {
        _id: '564209c66c23d5d20c37bd84',
        quantity: 25
    }},
    { article: {
        _id: '564209c66c23d5d20c37bd83',
        quantity: 51
    }}
]
I then need to loop to find each specific article in the db, and when an article is found I want to update its quantity value to the one sent from the frontend.
var id = [];
var body = {};
for (var i = req.body.length - 1; i >= 0; i--) {
    id.push(req.body[i].article._id);
    body[i] = req.body[i].article.quantity;
}
Articles.update(
    { _id: { $in: id } },
    { $set: { quantity: body[0].article.quantity } },
    { multi: true },
    function(err, response) {
        if (err)
            console.log(err);
        console.log(response);
    });
The problem with this code is that it puts the first quantity value on all articles, and I want each article to get its correct value from the frontend. It feels like I'm on the right path, but I'm pretty new to MongoDB and Express, so if there is a better solution, or even a solution, let me know.
Grahlie,
If you are having issues with queries, it's sometimes useful to test them from the mongodb shell itself to work out the logic.
If your article documents are structured like this:
{
    _id: ObjectId("564209c66c23d5d20c37bd84"),
    quantity: 25
}
{
    _id: ObjectId("564209c66c23d5d20c37bd83"),
    quantity: 51
}
If you want to update the quantity of a unique document based on its _id, then you can do so with this query:
db.articles.update(
    { "_id": ObjectId("564209c66c23d5d20c37bd84") },
    { $set: { "quantity": 25 } }
)
If you wanted to update multiple documents with the same quantity, you could use $in, but that's not what you want to do here. You want to loop through your req.body array and update the quantity of each article.
So your code would be as such:
var articles = req.body;
var updateArticle = function(article) {
    Articles.update(
        { _id: article._id },
        { $set: { quantity: article.quantity } },
        function(err, article) {
            // ...
        }
    );
};
for (var i = 0, n = articles.length; i < n; i++) {
    updateArticle(articles[i].article);
}
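As a further refinement (a sketch, not part of the answer above): instead of firing one update() per article, the req.body array could be mapped to a single bulkWrite() payload, so all the updates travel in one request. The collection and field names are taken from the question; the helper name is hypothetical:

```javascript
// Map each submitted { article: { _id, quantity } } wrapper to one
// updateOne operation; the resulting array can be passed to
// Articles.bulkWrite(ops) in a single request.
function buildUpdateOps(articles) {
    return articles.map(({ article }) => ({
        updateOne: {
            filter: { _id: article._id },
            update: { $set: { quantity: article.quantity } }
        }
    }));
}
```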
