Mongoose - Keep local database in sync with remote database - node.js

I have access to two separate databases that I'd like to keep in sync, a new one and an existing one, which will be in separate physical locations. The new one is going to be used to service an external API, so to cut down on request time, I think it makes sense to only query the local database for API requests.
My initial approach was to use mongoose.createConnection and limit the local collection to minor metadata and directly access the remote collection, but that's what I'm now looking to avoid.
Another approach might be to use mongoose.createConnection to periodically query the remote db and update the local one, but it could be costly if I want to make frequent updates.
There are ways to cut down the cost - for example, there is a lastUpdated property in the relevant collection on the existing database, which could be used to limit the remote query to recently updated records such as:
RemoteCollection.find({
  lastUpdated: { $gte: Date.now() - lookbackPeriod }
})
But I'm wondering if there's any native functionality of mongoose/MongoDB that can be used to make the updates more efficiently. I also thought about mongodump and mongorestore to keep a full local copy of the records I need, but that also seems costly.
Any help is appreciated.

After a bit of reading and thanks to Jake's comment, it looks like it's working. I need to do some more setup, but the code below should work and is based on this section from the docs:
https://mongoosejs.com/docs/models.html#change-streams
The first step is to start mongod with the --replSet flag:
mongod --replSet "rs0" --bind_ip localhost,<hostname(s)|ip address(es)>
Then close and restart mongod, open the mongo shell, and run rs.initiate() on the database. You can then check the status of the replica set with rs.status(). If that command works and returns a result, the replica set functionality should be there.
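For reference, this is roughly what the shell session looks like (a minimal single-member set named rs0, as in the command above):
// in the mongo shell connected to the mongod started above
rs.initiate()   // initialise the replica set
rs.status()     // should report set "rs0" with one member in state PRIMARY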
Then within Node, you can do something like this:
// The docs reference creating a new model but you can just import an existing one
const RemotePerson = require('./models/RemotePerson');
const LocalPerson = require('./models/LocalPerson');
RemotePerson.watch().on('change', data => {
  if (data.operationType === "insert") {
    // mirror newly created remote documents locally
    LocalPerson.create(data.fullDocument);
  } else if (data.operationType === "update") {
    // documentKey is an object of the form { _id: ... }
    LocalPerson.findByIdAndUpdate(data.documentKey._id, {
      $set: data.updateDescription.updatedFields
    });
  }
});
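The handler above only mirrors inserts and updates; deletes can be handled in the same style. A minimal sketch, assuming the same two models (in practice you would add this as another branch of the handler above rather than a second watch()):
RemotePerson.watch().on('change', data => {
  if (data.operationType === "delete") {
    // for deletes only documentKey (the _id) is available, not fullDocument
    LocalPerson.findByIdAndDelete(data.documentKey._id);
  }
});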

Related

Moving specific collections from mongodb atlas to archive db

I did my homework before posting this question
So the case is that I want to create a utility in my Node.js application that will move specific collections from my main database to an archive database and vice versa. I am using MongoDB Atlas for my application. I have been doing my research and found two possible ways: one is to create a mongodump and store it, and the other is to create a backup file myself using my Node application and upload it to the archive db. Using the latter approach would cause me to lose my collection indexes.
I am planning to use mongodump for the purpose but can't find a resource that shows how to achieve that. Any help would be appreciated. Also if any one has any experience with similar situation I am open to suggestions as well.
I recently created a mongodump & mongorestore wrapper for nodejs: node-mongotools
What does it mean?
You have to install the mongo binaries on your host by following the official mongo documentation (example), and then you can use node-mongotools to call them from Node.js.
Here is an example, but the tool doc contains more details:
const { MongoTools } = require('node-mongotools');

var mt = new MongoTools();
// uri: the MongoDB connection string, path: directory to write the dump into
const dumpResult = await mt.mongodump({ uri, path })
                           .catch(console.log);
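The question also mentions moving collections back ("and vice versa"); the wrapper exposes a restore call as well. A hedged sketch, assuming the option is named dumpFile as in the project's README:
// mt is the MongoTools instance created above;
// dumpFile points at an archive produced by mongodump
const restoreResult = await mt.mongorestore({ uri, dumpFile })
                              .catch(console.log);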

Updating pre-existing things already in a database mongodb

I'm currently using MongoDB in an Express application via Mongoose, and a random thought came upon me: what would happen if, say, you had a site with users and later on you made an update and needed to add a new field to all the user models? How would you go about updating all pre-existing users with the new field?
You would use updateMany in your mongo shell (or using driver - the same):
db.user.updateMany({}, { $set: { newField: value } });
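Since the question goes through Mongoose, the same update can be issued from Node. A minimal sketch, assuming a User model and an illustrative default value:
const User = require('./models/User'); // wherever your model lives

// inside an async function: add the new field to every existing user document
await User.updateMany({}, { $set: { newField: 'defaultValue' } });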

MongoDB + node: not authorized to execute command (sometimes works, sometimes doesn't)

I'm facing a problem with my MongoDB environment - the setup is as follows:
My node app provides a restify API which handles user registration (it looks up whether a user exists in a collection based on their mail and, if not, inserts them; note that the insert uses bcrypt to hash the password, so it is probably a bit slower). It uses restify and the Mongoose ORM.
A second benchmark script (also written in node, running on the same machine) accesses this restify API using HTTP PUT.
I'm starting around 20-30 of these requests in the benchmark (with random data) and only some of the API requests correctly insert the new users. For the others, MongoDB produces errors similar to the following:
not authorized on ... to execute command { find: "users", filter: { mail: "rroouksl#hddngrau.de" } }
not authorized on ... to execute command { insert: "users", documents: [ { ... } ], ordered: false, writeConcern: { w: 1 } }
Some other users get inserted perfectly fine. Especially with a low number of simultaneous requests (1-5) no problems occur. Shouldn't Mongo be able to handle this "low" number of requests? Is it a problem because it's running on the same machine? Hasn't the user I created in Mongo for this project got enough txns/second allowed?
Best regards,
Zahlii
It turned out that Mongo was still using the "old" storage engine and not WiredTiger. Since my queries included updating records, the old engine performed collection-level locks, which means the errors were solely down to read-write locks.
I migrated to WiredTiger, which performs document-level locking, and since then the database handles many parallel requests without these errors (although sometimes under heavy load they appear again - but this is part of Mongo being NoSQL, I guess).
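If you want to verify which engine a running mongod uses before migrating, the shell can tell you:
// in the mongo shell
db.serverStatus().storageEngine
// e.g. { "name" : "wiredTiger", ... } or { "name" : "mmapv1", ... }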
You can try:
Db.authenticate(user, password, function(err, res) {
  // callback
});
Also see the source.

User specific database in MongoDB

I am currently working on inventory management software in Node.js and MongoDB. I am pretty new to MongoDB, having worked in Oracle and MySQL for most of my projects.
Is it possible to create a separate database schema for every client who uses my software, with each client having access only to his copy of the database schema and collections?
The equivalent of selecting data in an Oracle database would be
SELECT * FROM User1.table
SELECT * FROM User2.table, etc.
Also, if it were possible, how would it be implemented using a Node.js MongoDB client like Mongoose?
I looked at MongoDB documentation, but it talks mainly about adding users to a database for authorization.
I apologize if it seems like a silly question, but I'd appreciate it if someone could point me in the right direction.
Before starting to invest a lot of time in the development of your project, check out other possible approaches to the scenario that you are trying to build.
I did a quick search on SO and found some additional threads with similar scenarios:
MongoDB Database vs. Collection
MongoDB Web App - Database per User
Additional info about mongoose database creation
Whenever you call the connect method on the mongoose object, you are either connecting to an existing database or you are creating it in case it doesn't already exist.
You could have a function that accepts a name argument and creates databases programmatically:
const mongoose = require('mongoose');

function createDatabase(name) {
  var conn_string = 'mongodb://localhost/';
  if (typeof name == 'string') {
    conn_string += name;
  } else {
    return false;
  }
  mongoose.connect(conn_string);
}
Also, be aware that a database will be created when you first insert a record in a collection of that particular database.
It is not sufficient to only connect to the database; you also have to insert a record.
As per my previous example, you could also pass a schema parameter to the function, tailored to each user's profile and fire an insert statement after you connect to that database.
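Putting those two points together, here is a minimal sketch. It uses mongoose.createConnection (a separate connection per client rather than the single default connection) and an illustrative schema; both are assumptions, not part of the original code above:
const mongoose = require('mongoose');

// illustrative schema - tailor this to each client's profile
const clientSchema = new mongoose.Schema({ createdAt: Date });

async function createClientDatabase(name) {
  if (typeof name !== 'string') return false;
  // one connection (and therefore one database) per client
  const conn = mongoose.createConnection('mongodb://localhost/' + name);
  const Record = conn.model('Record', clientSchema);
  // the database only appears once a first document is written
  await Record.create({ createdAt: new Date() });
  return conn;
}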

Sail.js requires server restart after running command to refresh database

From this question, Sails js using models outside web server, I learned how to run a command from the terminal to update records. However, when I do this the changes don't show up until I restart the server. I'm using the sails-disk adapter and v0.9.
According to the source code, an application using the sails-disk adapter loads the data from the file only once, when the corresponding Waterline collection is created. After that, all updates and destroys happen in memory; the data is then dumped to the file, but never re-read.
That said, what's happening in your case is that once your server is running, it doesn't matter if you change the DB file (.tmp/disk.db) from your CLI instance, because the lifted Sails server won't know about the changes until it's restarted.
Long story short, the solution is simple: use another adapter. I would suggest you check out sails-mongo or sails-redis (though the latter is still in development), as both Mongo and Redis have data auto-expiry functionality (http://docs.mongodb.org/manual/tutorial/expire-data/, http://redis.io/commands/expire). Besides, sails-disk is not production-suitable anyway, so sooner or later you would need something else.
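With MongoDB, for example, the expiry is handled by a TTL index. A quick sketch in the mongo shell (collection and field names are illustrative):
// documents are removed roughly an hour after their createdAt value
db.records.createIndex({ createdAt: 1 }, { expireAfterSeconds: 3600 })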
One way to accomplish deleting "expired records" over time is by rolling your own "cron-like job" in /config/bootstrap.js. In pseudocode it would look something like this:
module.exports.bootstrap = function (cb) {
  setInterval(function () {
    // <insert model delete code here>
  }, 300000); // every 5 minutes
  cb();
};
The downside to this approach is that if it throws, it will stop the server. You might also take a look at kue.
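For the body of that interval, the delete itself could look something like the following. This is only a sketch: the Record model and expiresAt field are illustrative, and the operator syntax follows Waterline's query language, so it may need adjusting for your adapter version:
// remove every record whose expiry date has passed
Record.destroy({ expiresAt: { '<': new Date() } }, function (err) {
  if (err) sails.log.error(err);
});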
