MongoDB with Node using high CPU on Docker - node.js

Hi, I've installed Rocket.Chat on an Ubuntu AWS micro instance. It runs with Nginx, MongoDB, and Node, where MongoDB runs from the Docker image mongo:3.0.
It was running smoothly on the day of installation, but after some time the server started getting slow. I examined the server with the top command: MongoDB was using around 70% CPU, and the next day it fluctuated above 90%.
I've reinstalled everything on the server but it is the same again, no luck.
Here is the screenshot of the top command.
Please let me know if any other stats are needed.
How can I find the main problem here, and how can I optimize it to make it work properly?
Thanks

I got to know why this issue arises. I had started implementing my custom chat platform with Meteor.
The cause of the problem was services.resume.loginTokens in the user object.
We were implementing the Rocket.Chat methods/API in a custom native Android application.
Whenever the application called the login method, it added a new login token without deleting the previous ones (which exist to support multi-system logins),
so if you delete the old tokens with a date check, they won't keep bloating the user object.
Accounts.registerLoginHandler (loginRequest) ->
  # ... Do whatever you need to do to authenticate the user
  stampedToken = Accounts._generateStampedLoginToken();
  Meteor.users.update userId,
    $push: {'services.resume.loginTokens': stampedToken}
  # Delete old resume tokens so they don't clog up the db
  cutoff = +(new Date) - (24*60*60)*1000
  Meteor.users.update userId,
    {$pull: {'services.resume.loginTokens': {when: {$lt: cutoff}}}},
    {multi: true}
  return {
    id: userId,
    token: stampedToken.token
  }
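A quick diagnostic sketch from the mongo shell to see how many resume tokens each user has accumulated (the username field is assumed to exist on the user documents):
db.users.aggregate([
  { $project: { username: 1, tokenCount: { $size: { $ifNull: [ "$services.resume.loginTokens", [] ] } } } },
  { $sort: { tokenCount: -1 } },
  { $limit: 5 }
])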
I got this solution from this SO question.

Related

MongoDB custom user roles - "user is not allowed(...)"

I created a free tier cluster on MongoDB Atlas (it has 3 shards) and I want my Node.js app to connect to a database I created there, using a specific user that will be restricted from using any database other than the one intended for this app.
So step by step.
I create a database called, let's say, test.
I create a role here: I go to Security -> MongoDB Roles -> Add New Custom Role and give it all Collection actions and all Database actions and roles on test.
Time for a user, so again Security -> MongoDB Users -> Add New User, and I assign the previously created role to it so it has access only to the test database. So now I have 2 users - atlasAdmin and my created user.
That's where the problem occurs. When I use the admin user in my app to connect, .find() or .create() works fine every time. With the user that has the custom role, it works for about 10 minutes / one connection (until I shut down the local server my node app runs on), and the next time I get an error that "user is not allowed to perform action (...)".
I tried everything - tinkering with the string I use to connect, updating mongoose (which I use in my app), creating the user and custom role from the mongodb shell - but nothing seems to work.
HOWEVER:
if my app connects to the database with this custom user and it works, then stops working on the next connection, and I go to Atlas and just click UPDATE USER without changing anything (I click edit next to the user and then update) and wait for the cluster to apply the changes, it will work again for roughly one more connection
everything works just fine if my app uses admin account
Has anyone had a similar problem? (Screenshot of the error.) I was also thinking that it might be because of how many times I try to connect with mongo from the app (I use nodemon, so every time I save a file with changes the server restarts and connects to the database again), but I don't think that's the case - if it were, why would it work with the admin user?
The string I use to connect with mongo:
// DATABASE SETUP
var dbURL = 'mongodb://[cluster0:port],[cluster1:port],[cluster2:port]/test?ssl=true&replicaSet=Cluster0-shard-0&authSource=admin&retryWrites=true';
var options = {
  useNewUrlParser: true,
  dbName: "test",
  user: [login],
  pass: [pass]
};
mongoose.connect(dbURL, options);
I have also encountered this problem on the Atlas free tier, not just with Node.js but with Java as well.
For now, you can try mitigating this problem by using a default (built-in) role instead of a custom one.
On the MongoDB Users tab, click "Edit" on your user => Add Default Privileges
Then select "readWrite" and type your database name in the first field, then save the user.
Or, if you want database administration, add another field with the "dbAdmin" role.
At least that's how I solved it. I hope this helps.
Side note: you can also use the shorter connection string (mongodb+srv) and it will still work.
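A minimal sketch of what that looks like with Mongoose (the mongodb+srv host and the [login]/[pass] credentials below are placeholders):
// hypothetical shorter connection string; mongodb+srv resolves the replica set members via DNS
var dbURL = 'mongodb+srv://[login]:[pass]@[cluster0-host]/test?retryWrites=true';
mongoose.connect(dbURL, { useNewUrlParser: true });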

MongoDB is not responding

I have been searching for why this has been happening. I recently reinstalled Ubuntu 16.04, copied a Node + Express project to a flash drive, and pasted it to the exact same corresponding location (~/Programming/project/). Since doing that, everything else works as I would expect, but Mongo is not responding when I make requests to it through Mongoose. I have no reason to believe that Mongoose is the cause of the failure to respond. I have a couple of routes that I know should work; the exact same code works on my friend's machine (same Ubuntu version and everything). I have uninstalled and reinstalled everything (including Ubuntu) multiple times.
The only call that works is finding something by a specific ID, and it returns if and only if the ID does not exist. Mongo won't return all the records or anything else; the website just spins endlessly (locally hosted on my machine). However, using Mongo in the terminal works fine - I can query and get results as if everything is normal. Has this happened to anyone else, or does anyone have any ideas? I can try to include some code.
This does not work
Greeting.find({}, function(err, greetings) {
  res.status(200).json(greetings);
});
This does work.
Greeting.findById(req.params.id, function(err, greeting) {
  if (err)
    return res.status(404).json({"error": "Greeting with that ID does not exist"});
  res.status(200).json(greeting);
});
EDIT:
Sorry, I am new to stack overflow so I am still getting the hang of what should be added or not...
mongoose.connect(database.url);
mongoose.connection.on('error', function() {
  console.info("Could not run mongodb, did you forget to run mongod?");
});
The database.url is what it needs to be; the connection is open as far as I can tell...
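A minimal sketch to double-check that, using Mongoose's standard connection events:
// 'open' fires once the connection is actually established; 'error' reports failures
mongoose.connection.once('open', function() {
  console.info("Mongoose connection is open");
});
mongoose.connection.on('error', function(err) {
  console.error("Mongoose connection error:", err);
});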
I should also mention that while installing Ubuntu, I wiped my previous dual boot in favor of just having Ubuntu, and I opted in to the hard-drive encryption... Could that be preventing Mongo from working properly? If so, how would I fix that?
The issue was in fact the encrypted hard drive. I reinstalled Ubuntu and that fixed it. I'm still not sure how to make it work with an encrypted disk.

MongoDB + node: not authorized to execute command (sometimes works, sometimes doesn't)

I'm facing a problem with my MongoDB environment - the setup is as follows:
My node app provides a restify API which handles user registration: it looks up whether a user already exists in a collection based on their mail and, if not, inserts them (note: the insert uses bcrypt to hash the password, so it is probably a bit slower). It uses restify and the Mongoose ORM.
A second benchmark script (also written in node, running on the same machine) accesses this restify API using HTTP PUT.
I'm starting around 20-30 of these requests in the benchmark (with random data), and only some of the API requests correctly insert the new users. For the others, MongoDB produces errors similar to the following:
not authorized on ... to execute command { find: "users", filter: { mail: "rroouksl#hddngrau.de" } }
not authorized on ... to execute command { insert: "users", documents: [ { ... } ], ordered: false, writeConcern: { w: 1 } }
Some other users get inserted perfectly fine. In particular, with a low number of simultaneous requests (1-5) no problems occur. Shouldn't Mongo be able to handle this "low" number of requests? Is it a problem because it's running on the same machine? Doesn't the user I created in Mongo for this project have enough txns/second allowed?
Best regards,
Zahlii
It turned out that mongo was still using the "old" storage engine (MMAPv1) and not WiredTiger. Since my queries included updating records, the old engine took collection-level locks, which means the errors were solely down to read-write locks.
I migrated to WiredTiger, which performs document-level locking, and since then the database handles many parallel requests without these errors (although they sometimes reappear under heavy load - but I guess that is part of mongo being NoSQL).
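A quick way to confirm which engine a server is actually running is the storageEngine field of serverStatus, from the mongo shell:
db.serverStatus().storageEngine
// e.g. { "name" : "wiredTiger", ... } after migrating, or { "name" : "mmapv1", ... } before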
You can try:
db.authenticate(user, password, function(err, res) {
  // callback (here `db` is an already-connected driver Db instance)
});
Also see the source.

Sail.js requires server restart after running command to refresh database

From this question, Sails js using models outside web server, I learned how to run a command from the terminal to update records. However, when I do this the changes don't show up until I restart the server. I'm using the sails-disk adapter and v0.9.
According to the source code, an application using the sails-disk adapter loads the data from file only once, when the corresponding Waterline collection is being created. After that, all the updates and destroys happen in memory; the data is then dumped to the file, but never re-read.
That said, what's happening in your case is that once your server is running, it doesn't matter whether you change the DB file (.tmp/disk.db) from your CLI instance, because the lifted Sails server won't know about the changes until it's restarted.
Long story short, the solution is simple: use another adapter. I would suggest checking out sails-mongo or sails-redis (though the latter is still in development), since both Mongo and Redis have data auto-expiry functionality (http://docs.mongodb.org/manual/tutorial/expire-data/, http://redis.io/commands/expire). Besides, sails-disk is not production-suitable anyway, so sooner or later you would need something else.
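To illustrate the Mongo auto-expiry mentioned above: it is just a TTL index on a date field. A minimal mongo shell sketch (the collection and field names here are hypothetical):
// documents whose createdAt is more than an hour old are removed by a background task
db.sessions.createIndex({ createdAt: 1 }, { expireAfterSeconds: 3600 })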
One way to accomplish deleting "expired records" over time is by rolling your own "cron-like job" in /config/bootstrap.js. In pseudo code it would look something like this:
module.exports.bootstrap = function (cb) {
  setInterval(function() { /* <Insert Model Delete Code here> */ }, 300000);
  cb();
};
The downside to this approach is that if it throws, it will stop the server. You might also take a look at kue.
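Filled in, that bootstrap job might look roughly like the sketch below. The Session model and its expiresAt attribute are hypothetical names, and Waterline's criteria syntax varies slightly between versions:
module.exports.bootstrap = function (cb) {
  // purge expired records every 5 minutes (300000 ms, as in the pseudo code above)
  setInterval(function () {
    Session.destroy({ expiresAt: { '<': new Date() } }).exec(function (err) {
      if (err) sails.log.error('Failed to purge expired sessions:', err);
    });
  }, 300000);
  cb();
};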

Run mongodb script once to insert initial data

I have a chicken-and-egg problem with my node server: you need a user with a certain role that has certain permissions in order to log in and start creating more users, roles, etc.
I would like to initialize the database such that I create an initial ADMIN role and initial admin user that has that role.
I.e. I started with a script and ran into problems:
use mydb

db.roles.insert({
  name: "ADMIN_ROLE",
  description: "Administrative role",
  permissions: ['ALL']
});

db.users.insert({
  username: "admin",
  password: "password",
  role: ??? (get ADMIN_ROLE _id from above)
});
Basically I ran into a couple of problems:
1. I'm not really sure if I can script like this.
2. How to get the ADMIN_ROLE _id to store in the new admin user (see the sketch below).
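For problem 2, one option in the mongo shell is to look the role back up right after inserting it and reuse its _id; a sketch along the lines of the script above:
// after the db.roles.insert({...}) above:
var adminRole = db.roles.findOne({ name: "ADMIN_ROLE" });

db.users.insert({
  username: "admin",
  password: "password",   // in a real seed you would store a hash, not plain text
  role: adminRole._id
});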
Another idea:
Write a quick node app that connects to mongodb and inserts the proper stuff. Has anyone done this before?
And yet another:
Does anything like Ruby's rake exist for node/mongo? I.e. the initial seed may not be the only data I need to 'manually' mess with - I might need to patch the database at some point. It would be nice to create patch #1 as the initial seed, and then be able to write and apply future patches as necessary. I.e. anything like rake migrate?
Any other ideas on how to seed a mongo database?
Shoot, just found this:
https://github.com/visionmedia/node-migrate
and
https://npmjs.org/package/mongo-migrate
Exactly what I was looking for.
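For reference, a migration with node-migrate is essentially a module exporting up and down functions (the exact shape may vary between versions); a minimal sketch:
// migrations/001-seed-admin.js (hypothetical file name)
exports.up = function (next) {
  // connect to mongo and insert the initial role/admin user here, then signal completion
  next();
};

exports.down = function (next) {
  // remove the seeded role/admin user here
  next();
};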
