Maintain an open session using request - node.js

I'm pretty new to this...
I have the following code:
request("http://somePythonServer.com?param1=1&param2=2", function (err, response, body) {
if (err){
console.error(err);
next(err);
}
console.log("Response Ended - SUCCESS");
});
This request is made each time with different parameters and values, and currently a new connection is opened every time it runs.
My question is: how do I maintain an open connection?
EDIT: I have control over both the sending and receiving ends of the request.
MY GOAL:
I'm trying to boost speed between two servers (Node.js sends HTTP GETs to a simple Python HTTP server). I want the connection between those two to remain open at all times so the requests are as fast as possible.
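A minimal sketch of the keep-alive setup I'm after (untested; assumes the request module and a Node version whose http.Agent supports keepAlive):

var http = require('http');
var request = require('request');

// Reuse sockets across requests instead of opening a new TCP connection each time.
var keepAliveAgent = new http.Agent({ keepAlive: true });

function callPythonServer(params, done) {
    request({
        url: 'http://somePythonServer.com',
        qs: params,            // e.g. { param1: 1, param2: 2 }
        agent: keepAliveAgent  // request also accepts forever: true for a similar effect
    }, done);
}

Note that the Python side has to honor keep-alive too: the stock SimpleHTTPServer speaks HTTP/1.0 and closes the connection after each response unless its protocol_version is set to HTTP/1.1.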

Related

Node.js http server: "getifaddres: Too many open files"

I'm currently running a nodejs server, and using GazeboJS to connect to the Gazebo server in order to send messages.
The problem is:
From my searches it seems like it's due to the Linux open file limit, which defaults to 1024 (I'm using Ubuntu 14.04). Most solutions seem to be to increase the open file limit.
However, I don't know why my script is opening files and not closing them. It seems like each HTTP request opens a connection which is not closed even though a response is sent? The HTTP requests are coming from a Lua script using async.
The error
getifaddres: Too many open files
occurs after exactly 1024 requests.
I have no experience with webservers so I hope someone could give an explanation.
The details of the nodejs server I'm running:
The server is created using
http.createServer(function (req, res) { ... })
When an HTTP GET request is received, the response is sent as a string. An example of one response:
gazebo.subscribe(obj.states[select].type, obj.states[select].topic, function (err, msg) // msg is a JSON object
{
    if (err)
    {
        console.log('Error: ' + err);
        return; // note: this path never unsubscribes
    }
    res.setHeader('Content-Type', 'text/html');
    res.end(JSON.stringify(msg));
    gazebo.unsubscribe(obj.states[select].topic);
});
The script makes use of the publish/subscribe topics in the Gazebo server to extract information or publish actions. More information about Gazebo communication is here.
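One thing worth noticing in the snippet above: the error path returns without ever calling gazebo.unsubscribe, so a failing request may keep its subscription (and whatever socket backs it) alive. A defensive sketch using the same GazeboJS calls as above (an untested assumption that unsubscribing releases the underlying descriptor):

gazebo.subscribe(obj.states[select].type, obj.states[select].topic, function (err, msg)
{
    // Unsubscribe in every code path so each request releases its resources.
    gazebo.unsubscribe(obj.states[select].topic);
    if (err)
    {
        console.log('Error: ' + err);
        res.statusCode = 500;
        return res.end('subscribe failed');
    }
    res.setHeader('Content-Type', 'text/html');
    res.end(JSON.stringify(msg));
});

Raising the limit with ulimit -n only postpones the error if descriptors are genuinely leaking.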

Node JS & Socket.io Advice - Is it better to post information to a database through a route, or through the socket?

I am currently building a new application which, at a basic level, lets users add a task which needs to be completed, and then lets a different group of users pick up the task and complete it.
Previously, I took over building a real-time chat application written with NodeJS and Socket.io and on that, all the messages were posted to a database over the socket connection.
In the current application I am doing the same, but I wondered whether it might be better to post the information to the database via a route instead, then emit the socket event on success to update the list of available tasks.
I was just looking for advice, how would you guys do this? Commit info to the database through a route or over the socket?
If you go the route way, things are pretty much laid out for you: the HTTP response tells you whether or not your update worked. Sockets, by default, guarantee neither success nor failure.
But you can build that acknowledgement in yourself. For example:
client: send data
socket.emit('update', data); // send data

server: receive data & send back an update as to whether the operation was successful or not
socket.on('update', function (data) {
    findOrUpdateOrWhateverAsync(function (err) {
        if (!err) socket.emit('update', null);   // send back a "null" on success
        else socket.emit('update', err.message); // or send back the error message
    });
});

client: receive update on error/success
socket.on('update', function (err) {
    if (err) alert(err);
});
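For comparison, a sketch of the route-based approach from the question: commit over HTTP, then emit on success (this assumes Express and Socket.IO; the route, event name, and db helper are illustrative, not from the question):

app.post('/tasks', function (req, res) {
    db.insert('tasks', req.body, function (err, task) { // hypothetical database helper
        if (err) return res.status(500).json({ error: err.message });
        io.emit('task:new', task); // tell every connected client the task list changed
        res.status(201).json(task);
    });
});

Here the HTTP status code gives the submitting client its acknowledgement, and the socket is used purely for the broadcast.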

Is there any kind of limit with node for I/O?

I am writing code that downloads a file from one place and streams it to the client in real time. The full file is never on my server, only chunks. Here is the code:
downloader.getLink(link, cookies[acc], function (err, location) {
    if (!err) {
        downloader.downloadLink(location, cookies[acc], function (err, response) {
            if (!err) {
                res.writeHead(200, response.headers);
                response.pipe(res); // stream the chunks straight through to the client
            } else {
                res.end(JSON.stringify(err));
            }
        });
    } else {
        res.end(JSON.stringify(err));
    }
});
As far as I can see there is nothing blocking in this code, since response is coming from a simple http response...
The problem is, this way I can only stream 6 files at the same time. But the server is not using all its resources (CPU 10%, memory 10%) and it is a single core. After roughly 5 files I only get the loading page and the stream doesn't start until some of the others have completed.
This is not a limitation on the first server, where I am downloading the files from, because using my browser, for example, I can download as many as I want. Am I doing something wrong, or is this some limitation in Node that I can change? Thanks
If your code is using the node.js core http module's http.Agent, it has an initial limit of 5 simultaneous outgoing connections to the same remote server. Try reading substack's rant in the hyperquest README for the details. But in short, try using a different module for your connections (I recommend superagent or hyperquest), or adjust the http Agent's maxSockets setting for the node core http module.
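A sketch of that last suggestion (defaults vary by Node version: old releases capped maxSockets at 5, while 0.12+ defaults to Infinity):

var http = require('http');

// Raise the per-host connection cap on the shared agent.
http.globalAgent.maxSockets = 50;

// Or bypass the shared agent's pool entirely for a single request:
http.get({ host: 'example.com', path: '/file', agent: false }, function (res) {
    res.pipe(process.stdout);
});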

Node.js sync/async callback return a value from a database?

I apologise immensely for posting this question because I'm sure the answer is out there, but I'm simply unable to find the answer I'm looking for.
The question is: if you are making an HTTP request to a node.js server, how do you keep the connection open so that the database can return something?
Consider the following code;
app.get("/myRequest", function (req, response) {
database.query('SELECT * FROM table', function(err, rows) {
// Cannot return rows because the connection will have already been closed
response.JSON(rows); // Doh!
});
});
I suppose my question is, should I be tackling this by adapting the above example, or by editing the way the actual webserver is setup?
I'm using a simple httpserver, perhaps express.js would address this problem?
When you write a route handler for a Node server, the connection does not close until you send a response. So if you wait to respond until after your database query returns, the connection to the client will still be sitting there waiting for you to respond. The whole point of Node is that it can handle many simultaneous connections in parallel instead of in series.
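In other words, the question's example already works once its typo is fixed; a sketch with error handling added (assuming Express and the same hypothetical database object):

app.get('/myRequest', function (req, response) {
    database.query('SELECT * FROM table', function (err, rows) {
        if (err) return response.status(500).json({ error: err.message });
        response.json(rows); // the client has been waiting on the open connection for this
    });
});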

Node.js Outgoing Http request connection limit (cannot make connections more than five)

I'm building a data transfer proxy server using node.js.
It pipes client's request to swift object storage server using http(s) REST API.
It works fine for individual requests, but once the number of outgoing ESTABLISHED TCP connections to the same destination and port (443) reaches five, it cannot create any new connections.
It does not seem to be an OS problem, because I've tried creating more than 10 connections from a Java servlet and that works fine.
I've tried to set maximum sockets for globalAgent like below, but it does not change anything.
http.globalAgent.maxSockets = 500;
https.globalAgent.maxSockets = 500;
Here is a part of my source code.
app.post('/download*', function (req, res) {
    /***********************************************************
     * Some code here to create the new request options
     ***********************************************************/
    var client = https.request(reqOptions, function (swiftRes) {
        var buffers = [];
        res.header('Content-Length', swiftRes.headers['content-length']);
        res.header('Content-Type', swiftRes.headers['content-type']);
        res.header('range', swiftRes.headers['range']);
        res.header('connection', swiftRes.headers['connection']);
        swiftRes.pipe(res);
        swiftRes.on('end', function (err) {
            res.end();
        });
    });
    client.on('error', function (err) {
        callback && callback(err);
        client.end(err);
        clog.error('######### Swift Client Error event occurred. Process EXIT ');
    });
    client.end();
});
I hope I can get the solution for this problem.
Thanks in advance.
Usually, changing maxSockets should solve your problem; try it with a somewhat lower value.
https.globalAgent.maxSockets = 20;
If that does not solve your problem, try turning off pooling for the connections: add the key agent with the value false to the request options. Keep in mind that Node.js uses pooling to provide keep-alive connections.
// Your option code
reqOptions.agent = false;
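Putting that into context, a sketch of the question's request with pooling disabled (host and path are placeholders, not from the question):

var https = require('https');

var reqOptions = {
    host: 'swift.example.com',    // placeholder for the Swift storage host
    path: '/v1/container/object', // placeholder object path
    method: 'GET',
    agent: false                  // fresh socket per request, no shared pool
};

var client = https.request(reqOptions, function (swiftRes) {
    swiftRes.pipe(process.stdout);
});
client.end();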
