Node/Mongo application crashes on Amazon EC2 and cannot access the server - node.js

I have a Node app with a Mongo server on an Amazon EC2 instance. It works great, but I just added a new API call, and every time I call it the server freezes and I cannot access or SSH into it for several hours. While this is happening my server is down, which makes the app that relies on it unusable and my users angry...
This code works perfectly on my localhost, but as soon as I run it on my server it freezes. My thought is that it may be crashing Mongo? I have no idea why this would happen...
If anyone has any ideas what could be going wrong, please let me know.
Node is using Express. The send_error function performs a res.send({some error}). db.CommentModel returns mongoose.model('comment', Comment);
In app.js:
app.get('/puzzle/comment/:id', auth.restrict, puzzle.getComments);
In the file which defines getComments:
exports.getComments = function (req, res)
{
    var userID = _u.stripNonAlphaNum(req.params.id);
    var CommentModel = db.CommentModel;
    // Find every comment left by this user.
    CommentModel.find({user: userID}, function (e, comments) {
        if (e) {
            err.send_error(err.DB_ERROR, res);
        } else if (!comments) {
            err.send_error(err.DB_ERROR, res);
        } else if (comments.length === 0) {
            res.send([]);
        } else {
            // Build one {_id: ...} clause per commented puzzle.
            var commentIDs = [];
            for (var i = 0; i < comments.length; i++) {
                commentIDs.push({_id: comments[i].puzzle});
            }
            var TargetModel = pApp.findPuzzleModel(_u.stripNonAlphaNum(req.apiKey));
            // Fetch all non-removed puzzles matching any of those ids.
            TargetModel.find({removed: false, $or: commentIDs}, function (e, puzzles) {
                if (e) {
                    err.send_error(err.DB_ERROR, res);
                } else if (!puzzles) {
                    err.send_error(err.DB_ERROR, res);
                } else {
                    res.send(puzzles);
                }
            });
        }
    });
}

It sounds like your query is causing something on your server (potentially mongod) to consume a very large amount of CPU, as this is commonly what causes the kind of SSH lockout you are seeing.
You should try reading over the logs of your mongo instance and see whether there are any long-running queries.
MongoDB provides an internal profiler for examining long-running commands. Try setting the profiling level to 1, running the command, and examining the logfile output.
More details on the profiler are available at http://www.mongodb.org/display/DOCS/Database+Profiler
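To confirm that, you can turn the profiler on from the mongo shell and then hit the new API route once. A minimal sketch, assuming a 100 ms slow-query threshold (tune it to your workload):

// in the mongo shell, against the app's database
db.setProfilingLevel(1, 100)   // profile operations slower than 100 ms

// after reproducing the slow request, inspect the most recent entries
db.system.profile.find().sort({ ts: -1 }).limit(5).pretty()

One thing worth checking in the handler itself: it builds one {_id: ...} clause per comment and hands the whole array to $or, which gets expensive for users with many comments. An equivalent $in query over the puzzle ids is usually cheaper (a sketch, not a confirmed fix):

var puzzleIDs = comments.map(function (c) { return c.puzzle; });
TargetModel.find({removed: false, _id: {$in: puzzleIDs}}, ...);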

Related

FATAL ERROR: Committing semi space failed. Allocation failed - process out of memory

I have the following snippet, in which I am trying to execute 10,000 statements to understand how many writes Gremlin can withstand.
var gremlin = require('gremlin');
var async = require('async');

var client = gremlin.createClient(8182, "development.cluster-coeuolcg4r.us-east-1-beta.rds.amazonaws.com", {
    accept: "application/vnd.gremlin-v2.0+json"
});

console.time('load');
async.times(10000, function (t, tCB) {
    client.execute('g.addV("loadtest#1-50000#2").property("idx", ' + new Date().getTime() + ')', function (err) {
        if (err) {
            console.log(err);
        }
        tCB(null, 1);
    });
}, function () {
    console.timeEnd('load');
});
It is a simple snippet, but while running it I get an error and program execution stops.
It works as expected if I run the snippet with 5,000.
Side note: --max-old-space-size didn't work.
This does not look related to Neptune. It looks more like a client-side setting that you need to tweak. I don't see any mention of your environment details (OS or Node version), but I would definitely start by looking at the ulimits and prlimits of the system.
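Independent of OS limits, firing 10,000 requests at once keeps every pending callback, buffer, and socket in memory at the same time, which by itself can exhaust the V8 heap. A minimal sketch that caps the number of in-flight writes using async.timesLimit (the limit of 100 is an assumption to tune):

var async = require('async');

// Run at most 100 addV calls concurrently instead of all 10,000 at once.
async.timesLimit(10000, 100, function (t, tCB) {
    client.execute('g.addV("loadtest").property("idx", ' + Date.now() + ')', function (err) {
        if (err) {
            console.log(err);
        }
        tCB(null, 1);
    });
}, function () {
    console.timeEnd('load');
});

If the process survives with a concurrency cap but dies without one, the problem is client-side resource exhaustion rather than anything on the Neptune side.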

Seed nodejs express heroku app

var mongoose = require('mongoose');               // assumed import; not shown in the original
var Smartphone = require('./models/smartphone');  // assumed path; adjust to your model

mongoose.connect('mongodb://localhost:27017/smartphones');
mongoose.connection.on('error', console.error.bind(console, 'MongoDB connection error:'));

var smartphones = [
    new Smartphone({
        title: "V3",
        usp: "20MP Softlight Camera",
        image_path: "/image/phone_v3max.png",
        qty: 1,
        price: 200
    }),
    new Smartphone({
        title: "V5",
        usp: "Feel the Real Speed",
        image_path: "/image/phone_v5.png",
        qty: 1,
        price: 450
    })
];

var done = 0;
for (var i = 0; i < smartphones.length; i++) {
    smartphones[i].save(function (err, result) {
        if (err) console.error(err);
        // Disconnect once every save has called back.
        done++;
        if (done === smartphones.length) {
            exit();
        }
    });
}

function exit() {
    mongoose.disconnect();
}
When I work locally on the Node.js app I can run the seed file (smartphoneIndex-seeder.js) included in the project from the command line. But how should that be done once the app has been pushed to Heroku? How can I seed the data from there? The app is running, but the pages driven by the seed data unfortunately do not show up. Does anyone have an idea how I can connect the seed data so my entire app runs properly?
It sounds to me like you want some sort of pre-boot process. You probably want to write some code that executes before you tell your application to start listening on a port. This code will look at the database, see whether any data exists, and if it doesn't, call some code (such as your smartphoneIndex-seeder function) which will push data into the application.
Once that decision is made, you can then call your normal app.listen() express code.
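A minimal sketch of that pre-boot check, assuming a Mongoose model named Smartphone and a seed function exported from smartphoneIndex-seeder.js (both names are taken from the question; the export shape is an assumption):

// in app.js, before the server starts listening
var seed = require('./smartphoneIndex-seeder');

Smartphone.countDocuments({}, function (err, count) {
    if (err) throw err;
    if (count === 0) {
        // Empty collection: seed first, then boot.
        seed(function () {
            app.listen(process.env.PORT || 3000);
        });
    } else {
        app.listen(process.env.PORT || 3000);
    }
});

Alternatively, you can run the existing seeder once with a one-off dyno: heroku run node smartphoneIndex-seeder.js. Either way, note that mongodb://localhost:27017 will not exist on Heroku; the connection string needs to come from an environment variable pointing at a hosted MongoDB.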

Opening Maxmind db in Nodejs

I am trying to open the MaxMind open-source database in my Node.js application. My application receives a list of IP addresses from a Java application and then returns the latitude and longitude corresponding to each IP. I have successfully done this synchronously, but I want to do it asynchronously to make things a little faster. I have written code for this, but the application gets killed every time. I am guessing that the reason might be the simultaneous opening of the same database (I might be wrong :D). I am posting the code below. Please take a look at it and make some suggestions on where I am going wrong. Thanks!!!
app.post('/endPoint', function (req, res) {
    var obj = req.body;
    var list = [];
    var ipList = obj.ipList;
    for (var i = 0; i < ipList.length; i++) {
        var ip = ipList[i];
        // NOTE: `ip` is declared with var, so by the time each async callback
        // below runs it sees the value from the last loop iteration.
        maxmind.open('./GeoLite2-City.mmdb', function (err, cityLookup) {
            if (err) throw err;
            console.log("open database");
            var city = cityLookup.get(ip);
            if (city != null) {
                // The original read `geodata.location.longitude`; `geodata` is
                // undefined there, so this should be `city`.
                var cordinates = {'latitude': city.location.latitude, 'longitude': city.location.longitude};
                list.push(cordinates);
            }
            if (list.length == ipList.length) {
                res.json({finalMap: list});
            }
        });
    }
});
You should open the database only once, and reuse it.
The easiest solution would be to synchronously open the database at the top of your file:
const maxmind = require('maxmind');
const cityLookup = maxmind.openSync('./GeoLite2-City.mmdb');
Reading it asynchronously wouldn't speed things up a whole lot, and because loading the database is done only once (during app startup), I don't think it's a big deal that it may temporarily block the event loop for a few seconds.
And use the cityLookup function in your request handler:
app.post('/endPoint', function(req, res) {
...
let city = cityLookup.get(ip);
...
});
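Putting it together, a sketch of the whole handler reusing the single lookup instance (the shape of the response mirrors the original code):

const maxmind = require('maxmind');
const cityLookup = maxmind.openSync('./GeoLite2-City.mmdb');

app.post('/endPoint', function (req, res) {
    const list = [];
    for (const ip of req.body.ipList) {
        const city = cityLookup.get(ip);
        // Skip addresses the database has no record for.
        if (city != null) {
            list.push({
                latitude: city.location.latitude,
                longitude: city.location.longitude
            });
        }
    }
    res.json({finalMap: list});
});

Because each lookup is now a synchronous, in-memory call, there is no callback bookkeeping left in the handler, and the original problem of the response never being sent when a lookup misses disappears as well.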

NodeJS CouchDB Long Polling Debacle

I have a web app that is published via ExpressJS on NodeJS, of course. It uses CouchDB as its data source. I implemented long polling to keep the app in sync at all times between all users. To accomplish this I use the following logic:
User logs into app and an initial long poll request is made to Node via an Express route.
Node in turn makes a long poll request to CouchDB.
When Couch is updated it responds to the request from Node.
Lastly Node responds to the browser.
Simple. What is happening, though, is that when I refresh the browser, it freezes up on every fifth refresh. Huh? Very weird. But I can reproduce it over and over, even in my test environment. Every fifth refresh, without fail, freezes up Node and causes the app to freeze. Rebooting Node fixes the issue.
After much hair pulling I THOUGHT I solved it by changing this:
app.get('/_changes/:since*', security, routes.changes);
To this:
app.get('/_changes/:since*', security, function () { routes.changes });
However, after further testing, this just fails to run routes.changes, so it is not an actual solution. Any ideas why long polling CouchDB from Node would do such a strange thing? On the fifth refresh I can set a breakpoint in Node on the first line of my routing code and it never gets hit. However, in the browser I can break on the long-polling request to Node and it seems to go out. It's like Node is not accepting the connection for some reason...
Should I be approaching long polling from Node to CouchDB in a different way? I'm using feed=longpoll; should I maybe be using feed=continuous? Turning the changes_timeout in CouchDB down to 5 seconds doesn't get rid of the issue, but it does make it easier to cope with, since the freezes last at most 5 seconds. This would seem to indicate that Node can't handle having several outstanding requests to Couch. Maybe I will try a continuous feed and see what happens.
Browser:
self.getChanges = function (since) {
    $.ajax({
        url: "/_changes/" + since,
        type: "GET", dataType: "json", cache: false,
        success: function (data) {
            try {
                self.processChanges(data.results);
                self.lastSeq(data.last_seq);
                self.getChanges(self.lastSeq());
                self.longPollErrorCount(0);
            } catch (e) {
                self.longPollErrorCount(self.longPollErrorCount() + 1);
                if (self.longPollErrorCount() < 10) {
                    setTimeout(function () {
                        self.getChanges(self.lastSeq());
                    }, 3000);
                } else {
                    alert("You have lost contact with the server. Please refresh your browser.");
                }
            }
        },
        error: function (data) {
            self.longPollErrorCount(self.longPollErrorCount() + 1);
            if (self.longPollErrorCount() < 10) {
                setTimeout(function () {
                    self.getChanges(self.lastSeq());
                }, 3000);
            } else {
                alert("You have lost contact with the server. Please refresh your browser.");
            }
        }
    });
}
Node:
Routing:
exports.changes = function (req, res) {
    var args = {};
    args.since = req.params.since;
    db.changes(args, function (err, body, headers) {
        if (err) {
            console.log("Error retrieving changes feed: " + err);
            res.send(err.status_code);
        } else {
            //send my response... code removed here
        }
    });
}
Database long poll calls:
self.changes = function (args, callback) {
    console.log("changes");
    if (args.since == 0) {
        request(self.url + '/work_orders/_changes?descending=true&limit=1', function (err, res, headers) {
            var body = JSON.parse(res.body);
            var since = body.last_seq;
            console.log("Since change: " + since);
            self.longPoll(since, callback);
        });
    } else {
        self.longPoll(args.since, callback);
    }
}

self.longPoll = function (since, callback) {
    console.log("about to request with: " + since);
    request(self.url + '/work_orders/_changes?feed=continuous&include_docs=true&since=' + since,
        function (err, res, headers) {
            console.log("finished request.");
            if (err) { console.log("Error starting long poll: " + err.reason); return; } //if err send it back
            callback(err, res.body);
        });
}
Socket.io will automatically fall back to long polling and doesn't have the problem you are seeing, so just use that. Also, for CouchDB changes use https://github.com/iriscouch/follow or maybe https://npmjs.org/package/changes, as the other answer suggested.
It's very bad practice to reinvent things when popular modules already do what you need. There are currently more than 52,000 node modules on https://npmjs.org/. People make a big deal about copying and pasting code; in my mind, reinventing basic building blocks is even worse than that.
I know that with so many modules it's hard to know about all of them, so I'm not saying you can never solve the same problem as someone else. But take a look at npmjs.org first, and also at sites like http://node-modules.com/, which may give better search results.
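Two hedged observations on the code above. First, the longPoll function actually requests feed=continuous even though the prose says longpoll; a continuous feed never ends the HTTP response, so the request callback never fires and every browser refresh leaves another socket open. On older Node versions http.globalAgent.maxSockets defaulted to 5, which would explain a freeze on exactly the fifth refresh. Second, a minimal sketch of the follow library against the same changes feed (the database URL is a placeholder):

var follow = require('follow');

// Emits one change at a time and handles reconnects and resuming from a seq internally.
follow({db: 'http://localhost:5984/work_orders', include_docs: true, since: 'now'},
    function (error, change) {
        if (error) {
            return console.log('changes feed error: ' + error);
        }
        console.log('seq ' + change.seq + ' doc ' + change.id);
    });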

How to make sure that the Node.js calls to MongoDB are really asynchronous

I am trying to write a Node.js application, and we need to deploy it in production.
We need to make sure that Node.js does not hang when there are any long-running processes or operations, like querying or accessing the database server.
So, I am trying to make a call to Mongo or to the file system that takes a very long time to finish, so that I can verify that the Node.js server is free to serve other requests while that takes place.
Sadly, I am not able to construct an insert that takes Mongo a really long time to finish, or a synchronous call to the file system.
Can someone tell me how to do it?
Thanks
Tuco
The trick is to do a console.log of some data after the block that makes the call, and a console.log in the callback. If the message after the block appears first in the console, the call is actually asynchronous.
I'm using mongojs as the driver for mongo:
collection.find({}, function (err, res) {
    console.log("done");
});
console.log("sending signal");
If it's asynchronous, the console shows:
sending signal
done
Now, for the chained behavior, you can make something like this:
var dbChain = (function () {
    var chain = [], cursor = 0, busy = false;
    var chainin = {
        push : function (aFn) {
            if (!busy) {
                // Nothing running: start immediately and mark the chain busy.
                chainin.reset();
                aFn();
                busy = true;
            } else {
                chain.push(aFn);
            }
        },
        next : function () {
            // Run the next queued call, if any; otherwise the chain is idle again.
            // (The original incremented cursor before the lookup, which skipped
            // the first queued call.)
            if (chain[cursor]) {
                chain[cursor++]();
            } else {
                chainin.reset();
            }
        },
        reset : function () {
            chain = [];
            cursor = 0;
            busy = false;
        }
    };
    return chainin;
})();
Then, for all the db calls, you have to do:
dbChain.push( ...a function... )
and in all your callbacks:
dbChain.next()
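A sketch of how that looks with two mongojs calls (the collection names are placeholders):

dbChain.push(function () {
    users.find({}, function (err, docs) {
        console.log('users loaded');
        dbChain.next(); // release the chain so the next queued call runs
    });
});

dbChain.push(function () {
    orders.find({}, function (err, docs) {
        console.log('orders loaded');
        dbChain.next();
    });
});

The second find does not start until the first one's callback has fired, which serializes access to the database while keeping each individual call asynchronous.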
