MongoDB Node.js update using previous value

I have a MongoDB database and have restructured how I store my date/times.
However, I would like to update any old documents that are not stored in this format.
Using Node.js, how can I update a field based on its previous value?
I'm looking for something along the lines of:
function modifyDateFormat(oldDate) { ... }

let filter = {};
let update = {
  $set: { date_time: modifyDateFormat(previousValue) } // previousValue: the field's current value, which I don't know how to reference here
};
db.collection("collection").updateMany(filter, update);

First find the old documents, then update each one using data from that document. There is no other way.

In this case, you first of all need to find all of them. Then you can work on those items however you want.
To dig deeper, have a look.
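A minimal sketch of that approach with the Node.js driver, assuming a filter that matches only the old-format documents and reusing the modifyDateFormat helper from the question:

const collection = db.collection("collection");
// Find the documents still stored in the old format (the filter here is a placeholder).
const cursor = collection.find({ /* matches old-format documents */ });
for await (const doc of cursor) {
  // Read the previous value from the fetched document, convert it, and write it back.
  await collection.updateOne(
    { _id: doc._id },
    { $set: { date_time: modifyDateFormat(doc.date_time) } }
  );
}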

Related

How can I find as well as update many documents?

I want to update the results of a find query based on certain conditions. What I was wondering is whether MongoDB will search the whole collection again for the update, or whether it can use the pointer from the previous find query. I just want to optimize my queries, which is why I was thinking about this. So, is there any way to achieve this?
Update: I also want the documents.
Example: collection.find({conditions}).forEach({some condition based on which update will be called})
What I want is for the update query called from the forEach callback to use the pointer from the previous find query rather than searching through the collection again.
My point is: when we first run the find query, the collection is searched and a cursor is returned, which is a pointer to the matching documents. Now that we have that pointer, why can't we use it to update the documents rather than searching the collection again and then updating them?
If you want to keep your code, you can use:
collection.find({conditions}).forEach((doc) => {
  if (some_conditions) {
    return collection.findOneAndUpdate({_id: doc._id}, {$set: {updated_fields}});
  }
});
But, as mentioned in the comments, I'm not sure exactly what conditions need to be met; you can probably just use the update method directly to save time.
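If the extra condition can be expressed as part of the query filter, the per-document round trips can be avoided entirely with a single call; a rough sketch (the filter and field names are placeholders):

await collection.updateMany(
  { /* conditions, plus the extra condition checked in the callback */ },
  { $set: { /* updated fields */ } }
);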

Mongoose bulk insert or update documents

I am working on a Node.js app, and I've been searching for a way around using the Model.save() function because I want to save many documents at the same time, so it would be a waste of network and processing to do it one by one.
I found a way to bulk insert. However, my model has two properties that make it unique, an ID and a HASH (I am getting this info from an API, so I believe I need these two pieces of information to make a document unique), so I want an already existing object to be updated instead of inserted.
Is there any way to do that? I was reading something about making concurrent calls to save the objects, using Q, but I still think this would generate unwanted load on the Mongo server, wouldn't it? Does Mongo or Mongoose have a method to bulk insert or update like it does with insert?
Thanks in advance
I think you are looking for the Bulk.find(<query>).upsert().update(<update>) function.
You can use it this way:
var bulk = db.yourCollection.initializeUnorderedBulkOp();
for (<your for statement>) {
  bulk.find({ID: <your id>, HASH: <your hash>}).upsert().update({<your update fields>});
}
bulk.execute(<your callback>);
For each document, it will look for a document matching the {ID: <your id>, HASH: <your hash>} criteria. Then:
If it finds one, it will update that document using {<your update fields>}
Otherwise, it will create a new document
As you want, it will not make a round trip to the Mongo server on each iteration of the for loop; instead, a single call is made on the bulk.execute() line.
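For illustration, a sketch of the same pattern with the Node.js native driver, assuming db is a connected database handle and items is the array of {ID, HASH, ...} objects received from the API:

const bulk = db.collection("yourCollection").initializeUnorderedBulkOp();
for (const item of items) {
  // Match on the two unique properties; insert if missing, otherwise overwrite the fields.
  bulk.find({ ID: item.ID, HASH: item.HASH }).upsert().update({ $set: item });
}
await bulk.execute();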

CouchDB bulk operations

So I've been trying to move data from one database to another. I've already moved the data, but I need to clear the documents I've already moved from the old database. I've been using ektorp's executeBulk to perform bulk operations, but for some reason I keep getting a document update conflict when I try to bulk delete by inserting _deleted.
I might be doing it wrong; here is what I did:
Fetch in bulk with include docs. (For some reason, this doesn't work with just the id and rev.)
Then add the _deleted field to each document.
Post using executeBulk.
It works for some documents, but I keep getting a document update conflict for others.
Any solutions/suggestions please?
This is the preferred way of deleting docs in bulk:
List<Object> bulkDocs = ...
MyClass toBeDeleted = ...
bulkDocs.add(BulkDeleteDocument.of(toBeDeleted));
db.executeBulk(bulkDocs);
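For comparison, the same bulk delete can be issued directly against CouchDB's HTTP API with a single POST to _bulk_docs. A sketch in Node.js (Node 18+ for the global fetch; the database URL is a placeholder, and each document must carry its current _rev, since a stale _rev is the usual cause of the update conflict):

// docs: [{ _id, _rev }, ...] fetched from the old database
const payload = { docs: docs.map(d => ({ _id: d._id, _rev: d._rev, _deleted: true })) };
const res = await fetch("http://localhost:5984/olddb/_bulk_docs", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload)
});
const results = await res.json(); // one status object per document; conflicts are reported individually here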
If you only need a way to delete/update docs in bulk and don't necessarily need to implement it in your own software, you can use the great couchapp at:
https://github.com/harthur/costco
You need to upload it to your own server with a couchapp deployment tool, and use a function like
function(doc) {
if(doc.istodelete) // replace this or remove to delete all docs
return null;
}
Read the instructions and examples there.

How to get the last created document in CouchDB?

How can I get the last created document in CouchDB? Maybe I can somehow use the _changes feature of CouchDB? But the documentation says that I can only get a list of documents ordered starting from the first created document, and there is no way to change the order.
So how can I get the last created document?
You can get the changes feed in descending order as it's also a view.
GET /dbname/_changes?descending=true
You can use limit= as well, so;
GET /dbname/_changes?descending=true&limit=1
will give the latest update.
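If the document body itself is needed rather than just the change row, the changes feed also accepts include_docs; a usage sketch with the same database placeholder:
GET /dbname/_changes?descending=true&limit=1&include_docs=true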
Your only surefire way to get the last created document is to include a timestamp (created_at or something) with your document. From there, you just need a simple view to output all the docs by their creation date.
I was going to suggest using the last_seq information from the database, but the sequence number changes with every single write, and replication also complicates the matter further.
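A sketch of such a view, assuming each document stores a created_at timestamp (the design document and view names here are arbitrary):

// Map function inside a design document, e.g. _design/docs, view "by_created_at"
function (doc) {
  if (doc.created_at) {
    emit(doc.created_at, null);
  }
}
// Newest document:
// GET /dbname/_design/docs/_view/by_created_at?descending=true&limit=1&include_docs=true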

How to efficiently bulk insert and update mongodb document values from an array?

I have a Tags collection which contains documents of the following structure:
{
word:"movie", //tag word
count:1 //count of times tag word has been used
}
I am given an array of new tags that need to be added/updated in the Tags collection:
["music","movie","book"]
I can update the counts of all Tags currently existing in the tags collection by using the following query:
db.Tags.update({word: {$in: ["music", "movies", "books"]}}, {$inc: {count: 1}}, true, true);
While this is an effective strategy to update, I am unable to see which tag values were not found in the collection, and setting the upsert flag to true did not create new documents for the unfound tags.
This is where I am stuck, how should I handle the bulk insert of "new" values into the Tags collection?
Is there any other way I could better utilize the update so that it does upsert the new tag values?
(Note: I am using Node.js with mongoose, solutions using mongoose/node-mongo-native would be nice but not necessary)
Thanks ahead
The concept of using upsert and the $in operator simultaneously is incongruous. This simply will not work, as there is no way to differentiate between upsert if *any* are in and upsert if *none* are in.
In this case, MongoDB is doing the version you don't want it to do, and you can't make it change that behaviour.
I would suggest simply issuing three consecutive writes by looping through the array of tags. I know it's annoying and it has a bad code smell, but that's just how MongoDB works.
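A sketch of that loop with the Node.js driver (assuming db is a connected database handle and tags is the incoming array; older drivers spell updateOne as update with {upsert: true}):

const tags = ["music", "movie", "book"];
for (const word of tags) {
  await db.collection("Tags").updateOne(
    { word },                 // match the tag document by its word
    { $inc: { count: 1 } },   // increment the count if it exists
    { upsert: true }          // otherwise create { word, count: 1 }
  );
}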
