Meteor client synchronous server database calls - node.js

I am building an application in Meteor that relies on real-time updates from the database. The way Meteor lays out its examples is to have the database call under the Template call. I've found that when dealing with medium-sized datasets this becomes impractical. I am trying to move the request to the server and have the results passed back to the client.
I have looked at similar questions on SO but have found no immediate answers.
Here is my server side function:
Meteor.methods({
  "getTest": function () {
    var res = Data.find({}, { sort: { time: -1 }, limit: 10 });
    var r = res.fetch();
    return r;
  }
});
And client side:
Template.matches._matches = function () {
  var res = {};
  Meteor.call("getTest", function (error, result) {
    res = result;
  });
  return res;
};
I have tried variations of the above code, such as returning from inside the callback. As far as I can tell, providing a callback makes the call asynchronous, so the helper returns its (still empty) value before the result ever comes back from the server.
I would like to move all database queries server-side to lighten the front-end load. Is this possible in Meteor?
Thanks

The way to do this is to use subscriptions instead of method calls. See the counts-by-room example in the docs: for every query you want to run server-side you keep a collection that exists client-side only, and the server decides which records are in it (via the publish function's set and unset calls).
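A minimal sketch of that approach, assuming the Data collection is declared in code shared by client and server, and using latestData as a hypothetical publication name (in current Meteor a publish function can simply return a cursor instead of calling set/unset by hand):

// server/publications.js
Meteor.publish("latestData", function () {
  // Only the ten most recent documents are pushed to the subscribing client.
  return Data.find({}, { sort: { time: -1 }, limit: 10 });
});

// client/matches.js
Meteor.subscribe("latestData");

Template.matches._matches = function () {
  // The client-side collection only ever holds what the server published,
  // so this helper stays cheap and still updates reactively.
  return Data.find({}, { sort: { time: -1 } }).fetch();
};

The heavy lifting (sorting and limiting) happens on the server; the client just renders whatever subset it has been sent.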

Related

How to deal with events in nodejs/node-red

I work with node-red and am currently developing a custom node that uses websockets to connect to a device and request data from it.
var WebSocket = require('ws'); // websocket client library (assumed; matches the .on()/.send() API used below)

function query(node, msg, callback) {
  var uri = 'ws://' + node.config.host + ':' + node.config.port;
  var protocol = 'Lux_WS';
  node.ws = new WebSocket(uri, protocol);
  var login = "LOGIN;" + node.config.password;

  node.ws.on('open', function open() {
    node.status({fill: "green", shape: "dot", text: "connected"});
    node.ws.send(login);
    node.ws.send("REFRESH");
  });

  node.ws.on('message', function (data, flags) {
    processResponse(data, node);
  });

  node.ws.on('close', function (code, reason) {
    node.status({fill: "grey", shape: "dot", text: "disconnected"});
  });

  node.ws.on('error', function (error) {
    node.status({fill: "red", shape: "dot", text: "Error " + error});
  });
}
In the processResponse function I need to process the first response. It contains an XML document with several ids, for which I then need to request further data.
I plan to set up a structure that holds all the data from the first request, and populate it further with the data that results from the id requests.
And that's where my problem starts: whenever I send a query from within processResponse, the reply triggers another 'message' event, so the same function is called again, but by that point my structure is empty.
I know this is due to the async nature of Node.js and its event system, but I simply don't see how to work around this behaviour or structure my code correctly.
If anybody can recommend examples on how to deal with situations like this or even better could give an example, that would be great!
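One common pattern for this situation is to keep explicit state on the node, so that the single 'message' handler can tell the initial REFRESH reply apart from the follow-up id replies. A rough sketch, in which extractIds, the "GET;" request format and the assumption that replies arrive in request order are all placeholders:

function processResponse(data, node) {
  if (!node.state) {
    // First reply: remember which ids still need to be requested.
    node.state = { pendingIds: extractIds(data), values: {} }; // extractIds: your XML parsing step
    node.state.pendingIds.forEach(function (id) {
      node.ws.send("GET;" + id); // hypothetical request format for a single id
    });
    return;
  }

  // Follow-up replies: fill the structure one id at a time.
  var id = node.state.pendingIds.shift();
  node.state.values[id] = data;

  if (node.state.pendingIds.length === 0) {
    node.send({ payload: node.state.values }); // hand the completed structure on to the flow
    node.state = null;
  }
}

The key point is that nothing is waited for synchronously; each incoming event just advances the stored state until the structure is complete.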

How to get a result from MariaDB in Node.js

This is my code:
var Maria = require('mariasql');

c = new Maria({
  host: '127.0.0.1',
  user: 'root',
  password: 'maria',
  db: 'gim'
});

c.query('SELECT * FROM contact WHERE id = ? AND nom = ?', [4, 'dupont'], function(err, rows) {
  if (err) {
    throw err;
  } else {
    function getResult() {
      return rows;
    }
  }
});
c.end();
//get overs file
console.log(getResult());
I want the resulting data, but getResult() is not defined.
How can I get the result with node.js?
Well, there are several errors in your code.
Most of them relate to basic JS programming concepts, so I would suggest looking for some JS courses or tutorials.
When you define the getResult function, you do it inside another function's closure, which means getResult is only accessible inside that function.
The function where you declared getResult is actually a callback: a function that gets executed as soon as some operation completes, usually an I/O-intensive one -- in your case, reading records from the database.
Because of Node.js's non-blocking I/O, the query has not returned by the time the last statement is reached. The program executes c.query(...), where you supply that callback we mentioned earlier, and then carries on with the remaining statements; when the query eventually returns with the records, the callback is executed with them as its result. But by then console.log(getResult()) has already run.
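In practice this means the rows have to be consumed inside the callback itself (or passed onward from there). A minimal sketch, matching the callback style used in the question's code:

var Maria = require('mariasql');

var c = new Maria({
  host: '127.0.0.1',
  user: 'root',
  password: 'maria',
  db: 'gim'
});

c.query('SELECT * FROM contact WHERE id = ? AND nom = ?', [4, 'dupont'], function (err, rows) {
  if (err) throw err;
  // The rows only exist here, once the database has answered,
  // so log them, render them or pass them to another function from inside the callback.
  console.log(rows);
  c.end();
});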
I'm not sure how understandable this answer is, partly because the concepts I've tried to explain are really fundamental to the language you're using, so once more I'd suggest starting with a tutorial or course. There are good ones on CodeSchool and many other sites.

Node Postgres Module not responding

I have an Amazon Beanstalk node app that uses a Postgres Amazon RDS instance. To interface node with Postgres I use node-postgres. The code looks like this:
var pg = require('pg'),
    done, client;

function DataObject(config, success, error) {
  var PG_CONNECT = "postgres://" + config.username + ":" + config.password + "@" +
                   config.server + ":" + config.port + "/" + config.database;
  self = this;
  pg.connect(PG_CONNECT, function(_error, client, done) {
    if (_error) { error(); }
    else {
      self.client = client;
      self.done = done;
      success();
    }
  });
}

DataObject.prototype.add_data = function(data, success, error) {
  self = this;
  this.client.query('INSERT INTO sample (data) VALUES ($1,$2)',
    [data], function(_error, result) {
      self.done();
      success();
    });
};
To use it I create my data object and then call add_data every time new data comes along. Within add_data I call this/self.done() to release the connection back to the pool. Now, when I repeatedly make those requests, the client.query callback never comes back. Under what circumstances could this lead to a blocking / non-responding database interface?
The way you are using the pool is incorrect.
You are asking for a connection from the pool in the DataObject function. This function acts as a constructor and is executed once per data object, so only one connection is ever taken from the pool.
When add_data is called the first time, the query is executed and the connection is returned to the pool. The subsequent calls therefore fail, because the connection has already been returned.
You can verify this by logging _error:
DataObject.prototype.add_data = function(data, success, error) {
  self = this;
  this.client.query('INSERT INTO sample (data) VALUES ($1,$2)',
    [data], function(_error, result) {
      if (_error) console.log(_error); // log the error to console
      self.done();
      success();
    });
};
There are a couple of ways you could do it differently:
Ask for a connection for every query made. This means moving the code that asks the pool for a client into add_data (see the sketch below).
Release the client only after performing all queries. This is trickier: because the calls are made asynchronously, you need to make sure the client is not shared, i.e. that no new request is made until the previous client.query callback has finished.
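A minimal sketch of the first option, assuming the constructor stores the connection string on the object (here as this.connectionString, a name not in the original code) instead of connecting up front; note the placeholder count now matches the single bound value:

DataObject.prototype.add_data = function (data, success, error) {
  var self = this;
  // Take a client from the pool for this query only...
  pg.connect(self.connectionString, function (_error, client, done) {
    if (_error) { return error(_error); }
    client.query('INSERT INTO sample (data) VALUES ($1)', [data], function (queryError, result) {
      done(); // ...and always hand it back, success or failure
      if (queryError) { return error(queryError); }
      success(result);
    });
  });
};

Each call now borrows and releases its own client, so repeated add_data calls no longer depend on a single connection that has already been returned.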

Store setTimeout id from Node.js in MongoDB

I am running a web application using Express and Node.js. I have a request to a particular endpoint in which I use setTimeout to call a particular function repeatedly after varying time intervals.
For example
router.get ("/playback", function(req, res) {
// Define callback here ...
....
var timeoutone = settimeout(callback, 1000);
var timeouttwo = settimeout(callback, 2000);
var timeoutthree = settimeout(callback, 3000);
});
The setTimeout function returns an object with a circular reference, so when trying to save it into MongoDB I get a stack overflow error. My aim is to be able to save these objects returned by setTimeout into the database.
I have another endpoint, cancel playback, which when called will retrieve these timeout objects and pass them to clearTimeout. How do I go about saving these timeout objects to the database? Or is there a better way of clearing the timeouts than having to save them to the database? Thanks in advance for any help provided.
You cannot save live JavaScript objects in the database! Maybe you can store a string or JSON or similar reference to them, but not the actual object, and you cannot reload them later.
Edit: Also, I've just noticed you're using setTimeout for repeating stuff. If you need to repeat it on regular intervals, why not use setInterval instead?
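For the repeating case, a tiny sketch of that variant (the interval id is kept in memory here, just like the timeout ids below):

// Hypothetical: repeat the playback callback every second until cancelled.
var intervalId = setInterval(callback, 1000);

// ...later, e.g. in a cancel-playback route:
clearInterval(intervalId);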
Here is a simple solution that keeps the indexes in memory:
var timeouts = {};
var index = 0;

// route to set the timeout somewhere
router.get('/playback', function(req, res) {
  timeouts['timeout-' + index] = setTimeout(ccb, 1000); // ccb: your playback callback
  storeIndexValueSomewhere(index) // your own persistence step, e.g. a mongodb insert
    .then(function() {
      res.json({timeoutIndex: index});
      index++;
    });
});

// another route that gets timeout indexes from that mongodb somehow
router.get('/playback/indexes', handler);

// finally a delete route
router.delete('/playback/:index', function(req, res) {
  var index = 'timeout-' + req.params.index;
  if (!timeouts[index]) {
    return res.status(404).json({message: 'No job with that index'});
  } else {
    clearTimeout(timeouts[index]);
    timeouts[index] = undefined;
    return res.json({message: 'Removed job'});
  }
});
But this probably would not scale to many millions of jobs.
A more complex solution, perhaps more appropriate to your needs (depending on your playback job type), could involve job brokers or message queues, with clusters and workers that subscribe to a channel they can listen on for their own job-cancel signals, and so on.
I hope this helps you clarify your requirements a little.

Nodejs + mikeal/Request module, how to close request or increase MaxSockets

I have a Nodejs app that's designed to perform simple end-to-end testing of a large web application. This app uses the mikeal/Request and Cheerio modules to navigate, request, traverse and inspect web pages across the application.
We are refactoring some tests, and are hitting a problem when multiple request functions are called in series. I believe this may be due to the Node.js process hitting the MaxSockets limit, but am not entirely sure.
Some code...
var request = require('request');
var cheerio = require('cheerio);
var async = require('async');

var getPages_FromMenuLinks = function() {
  var pageUrl = 'http://www.example.com/app';
  async.waterfall([
    function topPageRequest(cb1) {
      var menuLinks = [];
      request(pageUrl, function(err, resp, page) {
        var $ = cheerio.load(page);
        $('div[class*="sub-menu"]').each(function (i, elem) {
          menuLinks.push($(this).find('a').attr('href');
        });
        cb1(null, menuLinks);
      });
    }, function subMenuRequests(menuLinks, cb2) {
      async.eachSeries(menuLinks, functionv(link, callback) {
        request(link, function(err, resp, page) {
          var $ = cheerio.load(page);
          // do some quick validation testing of elements on the expected page
          callback();
        });
      }, function() { cb2(null) });
    }
  ], function () { });
};

module.export = getPages_FromMenuLinks;
Now, if I run this Node script, it runs through the first topPageRequest and starts the subMenuRequests, but then freezes after completing the request for the third sub-menu item.
It seems that I might be hitting a max-sockets limit, either in Node or on my machine (?) -- I'm testing this on a standard Windows 8 machine, running Node v0.10.26.
I've tried using request({pool:{maxSockets:25}, url:link}, function(err, resp..., but it does not seem to make any difference.
It also seems there's a way to abort the request object, if I first instantiate it (as found here). But I have no idea how I would "parse" the page, similar to what's happening in the above code. In other words, from the solution found in the link...
var theRequest = request({ ... });
theRequest.pipe(parser);
theRequest.abort();
..., how would I re-write my code to pipe and "parse" the request?
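One possible way (a sketch; fetchAndParse is a made-up helper name) to keep a handle on the request while still feeding the page to Cheerio is to consume the response stream manually and only call cheerio.load once the body is complete; the request object then stays available for .abort():

var request = require('request');
var cheerio = require('cheerio');

function fetchAndParse(url, done) {
  var body = '';
  var req = request(url); // keep the request object so it can be aborted later

  req.on('data', function (chunk) { body += chunk; });
  req.on('end', function () { done(null, cheerio.load(body)); });
  req.on('error', done);

  return req; // caller may call req.abort() to cancel the in-flight request
}

// usage sketch
var req = fetchAndParse('http://www.example.com/app', function (err, $) {
  if (err) { return console.error(err); }
  console.log($('title').text());
});
// req.abort(); // e.g. from a timeout or cancellation path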
You can easily make thousands of requests at the same time (e.g. from a single for loop); they will be queued and terminated automatically one by one, once each particular request is served.
I think by default there are 5 sockets per domain, and in your case this limit should be more than enough.
It is highly probable that your server does not handle your requests properly (e.g. on error they are not terminated and hang indefinitely).
There are three steps you can take to find out what is going on:
Check that you are sending a proper request -- as #mattyice observed, there are some bugs in your code.
Investigate the server code and the way your requests are handled there -- to me it seems the server does not serve/terminate them in the first place.
Try setting a timeout when sending the request; 5000ms should be a reasonable amount of time to wait. On timeout the request will be aborted with an appropriate error code (see the sketch below).
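A sketch of that last step, using the request module's timeout option inside the existing subMenuRequests loop (link and callback come from the question's code):

// On timeout the callback receives an error (e.g. ETIMEDOUT) instead of hanging forever.
request({ url: link, timeout: 5000 }, function (err, resp, page) {
  if (err) {
    console.error('Request failed or timed out:', err.code);
    return callback(err);
  }
  var $ = cheerio.load(page);
  // ...do the quick validation testing of elements as before...
  callback();
});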
As a piece of advice: I would recommend using more suitable, easier-to-use and more accurate tools for this kind of testing, e.g. PhantomJS.
