I'm currently running a Node.js server and using GazeboJS to connect to the Gazebo server in order to send messages.

The problem: the error

getifaddres: Too many open files

occurs after exactly 1024 requests.

From my searches it seems this is caused by the Linux open-file limit, which defaults to 1024 (I'm using Ubuntu 14.04). Most suggested solutions are simply to raise that limit. However, I don't understand why my script opens files without closing them in the first place. It looks as if each HTTP request opens a connection that is never closed, even though a response is sent. The HTTP requests come from a Lua script using async.

I have no experience with web servers, so I hope someone can explain what is going on.
Details of the Node.js server I'm running:

The server is created using

http.createServer(function (req, res) { ... })

When an HTTP GET request is received, the response is sent as a string. Here is an example of one response handler:
gazebo.subscribe(obj.states[select].type, obj.states[select].topic, function (err, msg) {
  // msg is a JSON object
  if (err) {
    console.log('Error: ' + err);
    return;
  }
  res.setHeader('Content-Type', 'text/html');
  res.end(JSON.stringify(msg));
  gazebo.unsubscribe(obj.states[select].topic);
});
The script makes use of the publish/subscribe topics in the Gazebo server to extract information or publish actions. More information about Gazebo communication is here.
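To confirm the descriptor-leak hypothesis before raising any limits, one option is to log the process's open descriptor count on every request. A minimal sketch, assuming Linux (where /proc/self/fd lists one entry per open descriptor) and independent of Gazebo:

var fs = require('fs');

// On Linux, /proc/self/fd holds one entry per open file descriptor,
// so its length tracks the sockets and files the process is holding open.
function countOpenFds() {
  return fs.readdirSync('/proc/self/fd').length;
}

// Call this inside the request handler; if the count grows by one per
// request and never shrinks, something (e.g. a subscription) is leaking.
console.log('open fds: ' + countOpenFds());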
Related
I'm pretty new to this...
I have the following code:
request("http://somePythonServer.com?param1=1¶m2=2", function (err, response, body) {
if (err){
console.error(err);
next(err);
}
console.log("Response Ended - SUCCESS");
});
This request is made each time with different parameters and values, and currently a new connection is opened every time I run it.

My question is: how do I maintain a persistent connection?

EDIT: I have control over both the sending and receiving ends of the request.

MY GOAL: I'm trying to boost speed between two servers (Node.js sends HTTP GETs to a simple Python HTTP server). I want the connection between the two to remain open at all times so that requests are as fast as possible.
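The usual way to do this from Node's side is a keep-alive http.Agent, so sockets are reused across requests. A minimal sketch, assuming Node's core http module (the keepAlive option exists on http.Agent from Node 0.12 onward) and reusing the hypothetical host and parameters from the snippet above; the Python side must also honor Connection: keep-alive:

var http = require('http');

// Reuse TCP connections across requests instead of opening a new one each time.
var keepAliveAgent = new http.Agent({ keepAlive: true, maxSockets: 10 });

function query(param1, param2, callback) {
  http.get({
    host: 'somePythonServer.com', // hypothetical host from the question
    path: '/?param1=' + param1 + '&param2=' + param2,
    agent: keepAliveAgent // all requests through this agent share sockets
  }, function (response) {
    var body = '';
    response.on('data', function (chunk) { body += chunk; });
    response.on('end', function () { callback(null, body); });
  }).on('error', callback);
}

Note that Python's SimpleHTTPServer speaks HTTP/1.0 and closes the connection after each response, so the server side may also need a keep-alive capable handler for this to help.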
I am creating a very simple DICOM ECHO server with Node.js, but clients always report that they can't connect, and I am unsure what I am missing. Does anyone here have experience writing a DICOM ECHO server?

This is the code I have:
var net = require('net');

net.createServer(function (socket) {
  socket.on('data', function (data) {
    var text = String.fromCharCode.apply(null, new Uint16Array(data));
    console.log(text);
    socket.write(data); // echo the raw bytes back
    socket.end();
  });

  socket.on('error', function (error) {
    console.log("Caught server socket error: ");
    console.log(error.stack);
  });
}).listen(8041);

console.log('Server running at 127.0.0.1 on port 8041');
I have tried responding with binary data and also with text data, but neither seems to work.
DICOM Echo is not as simple as a ping. You must implement a subset of the full stack of the DICOM network protocol. Instead of writing your own server with node.js, I would advise you to rely on an existing DICOM server. Orthanc is an example of a free DICOM server designed to act as a back-end service to Web applications. Orthanc has built-in support of DICOM C-Echo, which can be triggered by an AJAX request to its REST API (URI /modalities/{dicom}/echo).
Disclaimer: I am the author of Orthanc.
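For illustration, here is roughly what triggering a C-Echo through Orthanc's REST API could look like from Node. A sketch, assuming Orthanc is running on its default HTTP port 8042 and that a modality has been registered under the hypothetical name 'sample':

var http = require('http');

// POST /modalities/{name}/echo asks Orthanc to send a DICOM C-Echo
// to the registered modality; the HTTP status reports the outcome.
var req = http.request({
  host: 'localhost',
  port: 8042, // Orthanc's default HTTP port
  path: '/modalities/sample/echo', // 'sample' is a hypothetical modality name
  method: 'POST'
}, function (res) {
  console.log('C-Echo ' + (res.statusCode === 200 ? 'succeeded' : 'failed'));
});
req.end();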
As I asked,

I just want to know how to receive an invalid message from a client in a Node.js HTTP server.

For example, a client normally sends "GET / HTTP/1.1\r\n\r\n" to the server. But if I send "VID / HTTP/1.1\r\n\r\n" instead, the server does not react at all. I have already confirmed with Wireshark that the server machine receives the "VID / HTTP/1.1\r\n\r\n" message.

Thank you for your help.

I looked at the suggested solution of returning a 404 response, but it did not work.
Here is the server creation code:
function server(Route, connect) {
  console.log("start server function");

  function onRequest(req, response) { // req: client request, response: server response
    console.log("ans server");
    Route(req, connect, response, hnd.Hnd);
  }

  http.createServer(onRequest).listen(port);
}
When I send "GET / HTTP/1.1\r\n\r\n" from the client, the server console prints "ans server", but nothing is printed for VID.

VID is just an example of a made-up method; assume it belongs to some custom protocol.
@slebetman: Thank you for your solution.
I've voted to close and link it to the appropriate question (see above). Unfortunately, you'll have to modify node in C and compile your own custom node.js. There have been requests for adding an API for this but so far it looks like it's not going anywhere: github.com/joyent/node/issues/3192 and github.com/joyent/http-parser/pull/158 – slebetman
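For what it's worth, later Node versions do expose this without patching: http.Server emits a 'clientError' event when the built-in parser rejects a request, and an unknown method such as VID is a parse error. A sketch, assuming a reasonably recent Node:

var http = require('http');

var server = http.createServer(function (req, res) {
  res.end('ok');
});

// Fired when the HTTP parser rejects the request
// (for example the made-up VID method).
server.on('clientError', function (err, socket) {
  console.log('Invalid request from client: ' + err.message);
  socket.end('HTTP/1.1 400 Bad Request\r\n\r\n');
});

server.listen(8080); // any free port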
I am writing code that downloads a file from somewhere and streams it to the client in real time. The file is never fully on my server, only chunks. Here is the code:
downloader.getLink(link, cookies[acc], function (err, location) {
  if (!err) {
    downloader.downloadLink(location, cookies[acc], function (err, response) {
      if (!err) {
        res.writeHead(200, response.headers);
        response.pipe(res); // stream chunks straight through to the client
      } else {
        res.end(JSON.stringify(err));
      }
    });
  } else {
    res.end(JSON.stringify(err));
  }
});
As far as I can see, there is nothing blocking in this code, since the response comes from a plain http response...

The problem is that this way I can only stream 6 files at the same time, yet the server is not using all of its resources (CPU 10%, memory 10%) and it is a single core. After roughly 5 files I only get the loading page, and the stream doesn't start until some of the others have completed.

This is not a limitation on the first server, the one I am downloading the files from, because using my browser, for example, I can download as many as I want. Am I doing something wrong, or is this some limitation in Node that I can change? Thanks.
If your code is using the node.js core http module's http.Agent, it has an initial limit of 5 simultaneous outgoing connections to the same remote server. Try reading substack's rant in the hyperquest README for the details. But in short, try using a different module for your connections (I recommend superagent or hyperquest), or adjust the http Agent's maxSockets setting for the node core http module.
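For example, a sketch of raising that limit on the core agent (older Node versions defaulted maxSockets to 5; newer ones default it to Infinity):

var http = require('http');

// Allow more simultaneous outgoing connections per host on the shared agent...
http.globalAgent.maxSockets = 20;

// ...or pass a dedicated agent per request instead:
var agent = new http.Agent();
agent.maxSockets = 20;
http.get({ host: 'example.com', path: '/file', agent: agent }, function (response) {
  // 'example.com' is a hypothetical upstream host;
  // pipe the upstream response to the client as in the question.
});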
I'd like to add live functionality to a PHP-based forum: new posts would be shown to users automatically, as soon as they are created.
What I find a bit confusing is the interaction between the PHP code and NodeJS+socket.io.
How would I go about informing the NodeJS server about new posts and have the server inform the clients that are watching the thread in which the post was posted?
Edit
I tried the following code, and it seems to work. My only question is whether this is considered a good solution, as it looks kind of messy to me.

I use socket.io to listen on port 81 for clients, and the server running on port 82 is only intended to be used by the forum: when a new post is created, a PHP script sends a POST request to localhost on port 82, along with the data.

Is this OK?
var io = require('socket.io').listen(81);

io.sockets.on('connection', function (socket) {
  socket.on('init', function (threadid) {
    socket.join(threadid);
  });
});
var forumserver = require('http').createServer(function (req, res) {
  if (res.socket.remoteAddress == '127.0.0.1' && req.method == 'POST') {
    req.on('data', function (chunk) {
      var data = JSON.parse(chunk.toString());
      io.sockets.in(data.threadid).emit('new-post', data.content);
    });
  }
  res.end();
}).listen(82);
Your solution of a HTTP server running on a special port is exactly the solution I ended up with when faced with a similar problem. The PHP app simply uses curl to POST to the Node server, which then pushes a message out to socket.io.
However, your HTTP server implementation is broken. The data event is a Stream event; streams do not emit complete messages, they emit arbitrary chunks of data. In other words, the request entity may be split up and emitted across multiple data events.

If a data event delivers only part of the JSON, JSON.parse will almost assuredly throw an exception, and your Node server will crash.
You either need to manually buffer data, or (my recommendation) use a more robust framework for your HTTP server like Express:
var express = require('express'), forumserver = express();

forumserver.use(express.bodyParser()); // handles buffering and parsing of the
                                       // request entity for you

forumserver.post('/post/:threadid', function (req, res) {
  io.sockets.in(req.params.threadid).emit('new-post', req.body.content);
  res.send(204); // HTTP 204 No Content (empty response)
});

forumserver.listen(82);
PHP simply needs to post to http://localhost:82/post/1234 with an entity body containing content. (JSON, URL-encoded, or multipart-encoded entities are acceptable.) Make sure your firewall blocks port 82 on your public interface.
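For completeness, the manual-buffering alternative mentioned above would look roughly like this, applied to the same port-82 setup (a sketch; it still assumes the forum sends valid JSON):

var forumserver = require('http').createServer(function (req, res) {
  if (req.socket.remoteAddress === '127.0.0.1' && req.method === 'POST') {
    var body = '';
    req.on('data', function (chunk) {
      body += chunk; // buffer until the whole request entity has arrived
    });
    req.on('end', function () {
      var data = JSON.parse(body); // safe now: the body is complete
      io.sockets.in(data.threadid).emit('new-post', data.content);
      res.end();
    });
  } else {
    res.end();
  }
}).listen(82);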
Regarding the PHP code / forum's interaction with Node.JS, you probably need to create an API endpoint of sorts that can listen for changes made to the forum. Depending on your forum software, you would want to hook into the process of creating a new post and perform the API callback to Node.js at this time.
Socket.io out of the box is geared towards visitors of the site being connected on the frontend via Javascript. Upon the Node server receiving notification of a new post update, it would then notify connected clients of this new post and its details, at which point it would probably add new HTML to the DOM of the page the visitor is viewing.
You may want to arrange the Socket.io part so that users subscribe only to events for the specific room they have joined, such as "subforum123", so that they only receive notifications for applicable posts; a client-side sketch follows.
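On the front end, the matching client code might look like the following sketch; it pairs with the 'init' and 'new-post' events from the server code above, and the threadid variable and 'posts' element ID are made up for illustration:

var socket = io.connect('http://forum.example.com:81'); // hypothetical host

// Join the room for the thread currently being viewed (threadid is assumed
// to be printed into the page by PHP).
socket.emit('init', threadid);

// Append each new post to the thread as it arrives.
socket.on('new-post', function (content) {
  var post = document.createElement('div');
  post.textContent = content;
  document.getElementById('posts').appendChild(post);
});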