Opening a MaxMind DB in Node.js

I am trying to open the MaxMind open-source database in my Node.js application. My application receives a list of IP addresses from a Java application and returns the latitude and longitude corresponding to each IP. I have successfully done this synchronously, but I want to do it asynchronously to make things a little faster. I have written code for this, but the application gets killed every time. I am guessing the reason might be the simultaneous opening of the same database (I might be wrong :D). I am posting the code below. Please take a look at it and make some suggestions on where I am going wrong. Thanks!
app.post('/endPoint', function(req, res){
  var obj = req.body;
  var list = [];
  var ipList = obj.ipList;
  for(var i = 0; i < ipList.length; i++){
    var ip = ipList[i];
    //console.log(i);
    // the database is re-opened once per IP in the list
    maxmind.open('./GeoLite2-City.mmdb', function(err, cityLookup){
      if(err) throw err;
      console.log("open database");
      // note: `ip` is declared with var, so these async callbacks all run
      // after the loop finishes and see its final value
      var city = cityLookup.get(ip);
      if(city != null){
        var cordinates = {'latitude': city.location.latitude, 'longitude': city.location.longitude};
        //console.log(cordinates);
        list.push(cordinates);
      }
      if(list.length == ipList.length){
        res.json({finalMap: list});
      }
    });
  }
});

You should open the database only once, and reuse it.
The easiest solution would be to synchronously open the database at the top of your file:
const maxmind = require('maxmind');
const cityLookup = maxmind.openSync('./GeoLite2-City.mmdb');
Reading it asynchronously wouldn't speed things up a whole lot, and because loading the database is done only once (during app startup), I don't think it's a big deal that it may temporarily block the event loop for a few seconds.
And use the cityLookup function in your request handler:
app.post('/endPoint', function(req, res) {
  ...
  let city = cityLookup.get(ip);
  ...
});
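Putting it together, a minimal sketch of the whole handler with the database opened once at startup (express.json() for body parsing is my assumption; the question must already be using some body parser, since it reads req.body):

const express = require('express');
const maxmind = require('maxmind');

const app = express();
app.use(express.json());

// open the database once, synchronously, at startup
const cityLookup = maxmind.openSync('./GeoLite2-City.mmdb');

app.post('/endPoint', function(req, res) {
  const list = [];
  for (const ip of req.body.ipList) {
    const city = cityLookup.get(ip);
    if (city != null) {
      list.push({
        latitude: city.location.latitude,
        longitude: city.location.longitude
      });
    }
  }
  res.json({finalMap: list});
});

Since the lookup itself is synchronous and in-memory, the loop no longer needs callbacks at all. This also fixes an edge case in the original code: when an IP is missing from the database, list.length never reaches ipList.length and the response is never sent.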

Related

RethinkDB Node.js .changes() unwanted looping

I have a problem with Node.js and the rethinkdb module. I'm currently developing a program for my thesis.
This is a piece of code from db.js:
var r = require("rethinkdb");

var host = "localhost";
var db = "example";
var port = 28015;

class dbs{
  connectToDb(callback){
    r.connect({
      host: host,
      port: port,
      db: db
    }, function(err, connection){
      return callback(err, connection);
    });
  }

  streamAllData(tableName, callback){
    this.connectToDb(function(err, conn){
      r.table(tableName).changes().run(conn, function(err, cursor){
        if(err){
          return callback(true, err);
        }
        else{
          cursor.next(function(err, rows){
            return callback(null, rows);
          });
        }
      });
    });
  }
}
And this is a piece of code from server.js:
var dbrs = require("./db");
var rdb = new dbrs();

var io = require("socket.io").listen(https.createServer(options, app).listen(port));

io.on("connection", function(socket){
  rdb.streamAllData("userlocation", function(err, data){
    socket.emit("broadcast:userlocation", data);
  });
});
The result is that the same data is always sent 7 times, even though the mobile phone sends coordinates to the server cleanly at the configured interval.
This unwanted looping always crashes my browser when I try to draw the driver location on the map.
(A screenshot from the Chrome console was attached here.)
Your method name streamAllData does not match your usage of cursor.next, which only fetches a single result. Perhaps you meant to use cursor.each instead?
See https://www.rethinkdb.com/api/javascript/next/
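For illustration, a minimal sketch of streamAllData using cursor.each (the surrounding class and error handling are unchanged from the question):

streamAllData(tableName, callback){
  this.connectToDb(function(err, conn){
    r.table(tableName).changes().run(conn, function(err, cursor){
      if(err){
        return callback(true, err);
      }
      // each() fires for every change the cursor delivers,
      // instead of fetching only the first one like next()
      cursor.each(function(err, row){
        return callback(err, row);
      });
    });
  });
}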

Node.js readdir() function always being run twice

I've been trying to pick up Node.js and learn more for backend development purposes. I can't seem to wrap my mind around async tasks, though, and I have an example here that I've spent hours on trying to find a solution for.
app.get('/initialize_all_pictures', function(req, res){
  var path = './images/';
  fs.readdir(path, function(err, items){
    if (err){
      console.log("there was an error");
      return;
    }
    console.log(items.length);
    for(var i = 0; i < items.length; i++){
      var photo = new Photo(path + items[i], 0, 0, Math.floor(Math.random()*1000));
      photoArray.push(photo);
    }
  });
  // note: this response is sent before readdir's async callback has run
  res.json({"Success" : "Done"});
});
Currently, I have this endpoint that is supposed to look through a directory called images, create "Photo" objects, and push them into a global array called photoArray. It works, except that the readdir callback is always being called twice.
console.log would always give the output
2
2
(I have two items in the directory.)
Why is this?
Just figured out the problem.
I had a Chrome extension that helps me format JSON values from HTTP requests. Unfortunately, the extension actually made an additional call to the endpoint, so whenever I pointed my browser at the endpoint, the function ended up getting called twice!
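If you want to verify something like this yourself, a tiny request-logging middleware makes duplicate calls visible (a sketch; which fields to log is just illustrative):

// log every incoming request so duplicates (e.g. from a browser
// extension re-fetching the URL) show up immediately
app.use(function(req, res, next){
  console.log(new Date().toISOString(), req.method, req.url, req.headers['user-agent']);
  next();
});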

Store setTimeout id from Node.js in MongoDB

I am running a web application using Express and Node.js. I have a request to a particular endpoint in which I use setTimeout to call a particular function repeatedly after varying time intervals.
For example
router.get("/playback", function(req, res) {
  // Define callback here ...
  ....
  var timeoutone = setTimeout(callback, 1000);
  var timeouttwo = setTimeout(callback, 2000);
  var timeoutthree = setTimeout(callback, 3000);
});
The setTimeout function returns an object with a circular reference. When trying to save this into MongoDB I get a stack overflow error. My aim is to be able to save these objects returned by setTimeout into the database.
I have another endpoint called cancel playback which, when called, will retrieve these timeout objects and call clearTimeout, passing them in as an argument. How do I go about saving these timeout objects to the database? Or is there a better way of clearing the timeouts than having to save them to the database? Thanks in advance for any help provided.
You cannot save live JavaScript objects in the database! You may be able to store a string, JSON, or a similar reference to them, but not the actual object, and you cannot reload them later.
Edit: Also, I've just noticed you're using setTimeout for repeating stuff. If you need to repeat something at regular intervals, why not use setInterval instead?
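For example (a sketch; `callback` is the playback function from the question):

// one repeating timer instead of a chain of one-shot timeouts
var interval = setInterval(callback, 1000);

// later, a single handle is enough to stop the repetition
clearInterval(interval);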
Here is a simple solution that would keep the indexes in memory:
var timeouts = {};
var index = 0;

// route to set the timeout somewhere
router.get('/playback', function(req, res) {
  timeouts['timeout-' + index] = setTimeout(callback, 1000);
  storeIndexValueSomewhere(index)
    .then(function(){
      res.json({timeoutIndex: index});
      index++;
    });
});

// another route that gets timeout indexes from that mongodb somehow
router.get('/playback/indexes', handler);

// finally a delete route
router.delete('/playback/:index', function(req, res) {
  var index = 'timeout-' + req.params.index;
  if (!timeouts[index]) {
    return res.status(404).json({message: 'No job with that index'});
  }
  // timers are cleared with the global clearTimeout, not a method on the handle
  clearTimeout(timeouts[index]);
  delete timeouts[index];
  return res.json({message: 'Removed job'});
});
But this probably would not scale to many millions of jobs.
A more complex solution, and perhaps one more appropriate to your needs (depending on your playback job type), could involve job brokers or message queues: clusters of workers subscribing to a channel they listen on for their own job-cancel signals, etc.
I hope this helps you clarify your requirements a little.

Mongoose Trying to open unclosed connection (callback hell)

I want to add records to the database in a loop, but Mongoose complains that I did not close the open connection: "Mongoose Trying to open unclosed connection." How can I make the whole thing run sequentially? My code is callback hell.
app.get("/dobavit/", function(req, res) {
for(var i=50; i>0; i--)
{
InsertAXIXA("analitika",i,function(data){
});
}
res.end();
});
function InsertAXIXA(qwerty,page,callback){
mongoose.connect('mongodb://localhost/gazprom')
var parsedResults = [];
var req = request('http://xxx.ru/'+qwerty+"?page="+page, function (error, response, html) {
if (!error && response.statusCode == 200) {
// str = iconv.decode(html, 'utf-8');
var $ = cheerio.load(html);
$('.col-1 .col-first').each(function(i, element){
var buf = $(this);
var zagolovok = buf.children(0).children().children().eq(0).text();
var previewText = buf.children(2).children().eq(0).text();
var url = buf.children(0).children().children().eq(0).attr('href');
var picUrl = buf.children(1).children().eq(0).children().children().eq(0).attr('src');
var metadata = {
zagolovok:zagolovok,
previewText:previewText,
url:url,
typeOfnews:qwerty,
picUrl:picUrl,
qwerty:qwerty
};
var news =new News({
zagolovok: zagolovok,
previewText: previewText,
url:url,
picUrl:picUrl,
qwerty:qwerty
// created_at:Date.today()
});
news.save(function(err, news,affected){
});
parsedResults.push(metadata);
});
callback(parsedResults);
}
mongoose.connection.close()
});
You shouldn't actually need to open/close your connection on every request (see here for more about that).
Instead, you can just open your connection once when your app starts and then just let it close when the app closes.
If you leave the connection open, you can reuse the connections instead of wasting time/resources establishing a new one every time that function is called.
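A minimal sketch of that pattern, using the connection string from the question:

// app startup: connect once and let Mongoose manage the connection
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/gazprom');

// InsertAXIXA (and every other function) then just uses its models,
// with no connect()/close() calls of its own
function InsertAXIXA(qwerty, page, callback){
  // ...scrape and save as before...
}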
In my opinion, you are trying to create another connection without closing the current one. So you might want to use createConnection() instead of connect().
In your case, it would look like this:
db = mongoose.createConnection('mongodb://localhost/mydb');
I was getting the same error today. The solution I found is that we should not call mongoose.connect in a loop, or anywhere in code that is executed again and again.
In my case, I was calling mongoose.connect on every request to the app:
app.all("*", function(){
  mongoose.connect();
});
Your code is similar, because in the loop you are calling a function, and in that function you are opening a connection.
Thanks

Node RAM usage keeps increasing

I have a page built with Node that receives 15 requests/second.
My function is like this:
var somepage = function(req, res){
  res.send(200);
  call_mongo_to_save_some_data(req.somedata);
};

var call_mongo_to_save_some_data = function(data){
  var needToSave = {};
  needToSave.val1 = data.val1;
  needToSave.val2 = data.val2;
  needToSave.val3 = data.val3;
  needToSave.val4 = data.val4;
  needToSave.val5 = data.val5;
  var db = mongoskin();
  db.collection.insert(needToSave).success(function(){
    db.close();
  }).fail(function(err){ throw err; });
};
So you can see I do something after I send the response. I do this to reduce the response time, so the client isn't kept waiting while I save something to Mongo.
But after I launched the page, I found that the RAM usage keeps increasing. I did some research saying that res.write clears the output buffer, and in my code I do something after res.write (res.send), so I am not sure whether that is the reason or some other issue.
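One thing worth checking, as in the Mongoose question above: a new database handle is created on every request. A sketch (mirroring the question's own abbreviated mongoskin-style API) that creates the handle once and reuses it:

// create the handle once at startup instead of once per request
var db = mongoskin();

var call_mongo_to_save_some_data = function(data){
  var needToSave = {
    val1: data.val1, val2: data.val2, val3: data.val3,
    val4: data.val4, val5: data.val5
  };
  // reuse the shared handle; no open/close per request, and don't
  // throw inside an async callback (that kills the process)
  db.collection.insert(needToSave).fail(function(err){ console.error(err); });
};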
