Node.js - Server to server data transfer memory leak and slow network speeds

Ok, this is not going to be easy to explain, but I'll give it my best..
We have a file sharing solution serving everything from small images to big 3D drawings/videos (5-10 GB+), running on a strange network setup (an IT Security request).
We have the internal zone (LAN) with a Node.js server:
Backend (Windows Server 2008, 2x2.60 GHz, 16 GB RAM, disk read speed of up to 122 MB/s), Node 0.8.22 (some MSNodeSQL stuff, so we can't use newer Node versions; converting to TDS now).
We then have an external zone (DMZ) where the proxy and web server live.
Proxy (Ubuntu 12.10, 1x2.60 GHz, 32 GB RAM, rest same as above), Node 0.8.22. The proxy is the only server that can access the LAN, but only one port is open and the connection has to be initiated by the backend.
Web (Windows Server 2008, 8x2.60 GHz, 16 GB RAM, rest same as above), Node 0.10.12.
So, we need to transfer a file to the client connected to the web-server.
The library I'm using now is BinaryJS, which works well, but we have an extremely slow transfer speed of about 12 MB/s, and it drops when new clients connect (usually to around 4-6 MB/s).
Request flow:
- Client (browser) -> Web server over HTTPS
- Web server -> Proxy over BinaryJS
- Proxy -> Backend over a BinaryJS createStream
- Backend -> Proxy with a stream object that is piped to the web server socket
- Proxy -> Web server via pipe (the original request)
- Web server -> Client (browser) via binaryStream.pipe(res), where res is the Express response
From testing it seems like we lose 50% of the transfer speed for each server (backend, proxy, web) we go through, and I can't figure out why. We have enough bandwidth and disk speed to saturate the Gigabit-connected clients, and memory and CPU are not working that hard.
Code:
Backend (basic) - DB queries for the file path etc. are removed:
client = BinaryClient('wss://' + config.proxy + ':' + config.port + '/binary');
client.on('stream', function(msgStream, msgMeta){
    console.log("Got stream with meta: ", msgMeta);
    var stream = require('fs').createReadStream(msgMeta.filepath);
    stream.pipe(msgStream);
    msgStream.on('error', function(e){
        console.log("Bin stream failed: ", e);
    });
    msgStream.on('close', function(){
        console.log("Stream closed");
    });
    stream.on('end', function(){
        console.log("Ending file stream of file: ", msgMeta.meta.itemID);
        msgStream.end();
    });
});
Proxy:
var binServer = BinaryServer({port: 8001});
binServer.on('connection', function(client){
    client.on('stream', function(stream, meta){
        if(meta.register == true){
            if(meta.who == "master"){
                config.masterBinary = client;
            }
        } else {
            // It's a stream request.
            if(meta.what == "GET"){
                config.masterBinary.createStream(meta).pipe(stream).on('close', function(){
                    console.log("CLOSED");
                }).on('end', function(){
                    console.log("ENDED");
                });
            }
        }
    });
});
Web-Server:
client = binaryClient('wss://' + config.proxy + ':' + config.port + '/binary');
client.on('open', function(){
    ready = true;
});
client.on('close', function(){
    ready = false;
});
exports.get = function(file, res, cb){
    // This gets called from the Express request; the client will wait for the stream to start.
    console.log("Sending request to backend: ", file);
    if(file.itemID || file.requestID){
        client.createStream({filepath: "C:/SomePathToAFile", register: false, what: "GET"}).pipe(res);
    } else {
        cb("Parameter not present", false);
    }
};
I've been wondering if I should try to write my own TCP server and clients to better fit what we are trying to do. Unfortunately I'm quite new to Node and haven't figured out how to implement an EventEmitter on top of TCP so I can send data with metadata for identification.
Is there a better way, or have I completely misunderstood the BinaryJS server?
Thanks for reading!
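(Aside on the "write my own TCP server" idea above: one common pattern is to prefix each transfer with a length-prefixed JSON header carrying the metadata, then stream the raw file bytes after it. Below is a minimal sketch written against modern Node; the framing and all names in it are assumptions for illustration, not BinaryJS's wire format.)
var net = require("net");
var fs = require("fs");

// Sender: write a 4-byte big-endian header length, the JSON metadata, then the file itself.
function sendFile(socket, filepath, meta) {
    var header = Buffer.from(JSON.stringify(meta));
    var len = Buffer.alloc(4);
    len.writeUInt32BE(header.length, 0);
    socket.write(len);
    socket.write(header);
    fs.createReadStream(filepath).pipe(socket); // pipe() handles backpressure for the big payload
}

// Receiver: buffer until the header is complete, parse it, then pipe the rest to disk.
var server = net.createServer(function (socket) {
    var buffered = Buffer.alloc(0);
    var headerLen = null;

    socket.on("data", function onData(chunk) {
        buffered = Buffer.concat([buffered, chunk]);
        if (headerLen === null && buffered.length >= 4) {
            headerLen = buffered.readUInt32BE(0);
        }
        if (headerLen !== null && buffered.length >= 4 + headerLen) {
            var meta = JSON.parse(buffered.slice(4, 4 + headerLen).toString());
            console.log("Got meta:", meta);
            socket.removeListener("data", onData);
            var out = fs.createWriteStream(meta.saveAs || "download.bin"); // meta.saveAs is hypothetical
            var rest = buffered.slice(4 + headerLen);
            if (rest.length) out.write(rest);
            socket.pipe(out); // the remaining file bytes flow straight to disk
        }
    });
});
server.listen(8001);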

Related

Persist NodeJS Sockets

Hello community,
Since this morning I've been faced with an idea, not to say a problem: I want to store clients (sockets) in order to re-use them later. I know that's not very clear, so I'll explain in detail:
First I have a server (net.Server()) which receives clients (net.Socket()), and each client is stored in a Map() with a unique ID. Each time I want to communicate with one of them I call map.get(id).write(...) or another method.
Everything works fine until I close the server; the sockets are then automatically killed. I have found a way to store the client data for my case (Vuex or localStorage), but to put it simply, what I want is that when I restart the server and invoke one of the clients, it is still active.
So my main questions:
How can I keep clients active after the server is closed?
How can I store sockets and check whether they are still active after restarting the server?
var net = require("net");
var server = new net.Server();
var sockets = new Map();
/**
* This events is called when server is successfully listened.
*/
server.on("listening", () => {
console.log("Server Listen in port 4444");
});
/**
* This events is called when error occur in server
*/
server.on("error", (e) => {
if (e.code === "EADDRINUSE") {
console.log("Address in use, retrying...");
}
});
/**
* This events is called when a new client is connected to server.
*/
server.on("connection", (socket) => {
var alreadyExist = false;
sockets.forEach((soc) => {
if (soc.remoteAddress === socket.remoteAddress) {
alreadyExist = true;
}
});
if (alreadyExist) {
socket.end();
} else {
socket.setKeepAlive(true, Infinity);
socket.setDefaultEncoding("utf8");
socket.id = nanoid(10);
sockets.set(socket.id, socket);
socket.on("error", (e) => {
console.log(e);
if (e.code === "ECONNRESET") {
console.log("Socket end shell with CTRL+C");
console.log("DEL[ERROR]: " + socket.id);
}
});
socket.on("close", () => {
console.log("DEL[CLOSE]: " + socket.id);
});
socket.on("end", () => {
console.log("DEL[END]: " + socket.id);
});
socket.on("timeout", () => {
console.log("timeout !");
});
var child = sockets.get(res.id);
child.write(/* HERE I SEND COMMAND NOT IMPORTANT ! */);
socket.on("data", (data) => {
console.log("Received data from socket " + data);
});
}
});
How can I keep clients active after the server is closed?
A client can't maintain a connection to a server that is not running. The client is free to do whatever it wants on its own when the server shuts down, but it cannot maintain a connection to a server that is down. The whole definition of a "connection" is between two live endpoints.
How can I store sockets and check whether they are still active after restarting the server?
You can't store sockets when the server goes down. A socket is an OS representation of a live connection, TCP state, etc. When the server goes down and then restarts, that previous socket is gone. If it wasn't closed by the server before it shut down, then it was cleaned up by the OS when the server process exited. It's gone.
I would suggest that you're asking for the wrong thing here. Sockets don't outlive their process and don't stay alive when one end of the connection goes down.
Instead, the usual architecture for this is automatic reconnection. When the client gets notified that the server is no longer there, the client attempts to reconnect on some time interval. When, at some future time, the server starts up again, the client can then connect back to it and re-establish the connection.
If part of your connection initiation is an exchange of some sort of clientID, then a client can reconnect, present its identifier, and the server can know exactly which client it is. Things then continue with a new socket connection, but everything else proceeds as if the previous socket shutdown never happened. You just rebuild your Map object on the server as the clients reconnect.
For scalability reasons, your clients would generally implement some sort of back-off (often with some random jitter added) so that when your server comes back online, it doesn't get immediately hammered by hundreds of clients all trying to connect at the exact same moment.
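A minimal sketch of that reconnect-and-re-register pattern on the client side (the registration message, host and port here are made-up placeholders):
var net = require("net");

var CLIENT_ID = "my-stable-client-id"; // stable identifier the server can key its Map on
var attempt = 0;

function connect() {
    var socket = net.connect({ host: "server.example", port: 4444 }, function () {
        attempt = 0; // connected: reset the backoff
        socket.write(JSON.stringify({ register: CLIENT_ID }) + "\n");
    });

    socket.on("data", function (data) {
        console.log("server says:", data.toString());
    });

    socket.on("error", function () {
        // 'close' will fire right after this, which schedules the reconnect
    });

    socket.on("close", function () {
        var base = Math.min(30000, 1000 * Math.pow(2, attempt++)); // exponential backoff, capped at 30s
        var delay = base / 2 + Math.random() * (base / 2);         // random jitter
        console.log("reconnecting in " + Math.round(delay) + "ms");
        setTimeout(connect, delay);
    });
}

connect();
On the server side, the connection handler would read that register message and call sockets.set(CLIENT_ID, socket), rebuilding the Map as clients come back.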

Print-Server written with Electron/Node.js

I'm trying to create a print server written with Electron and Node.js.
My goal is to capture the body of a print job sent from a POS to an Epson thermal printer.
As I understand from Epson's documentation, the printer communicates on TCP port 9100 and on UDP 3289 by default.
So I created a server socket listening on the TCP port with the "net" module.
The connection is established successfully and I also receive some Buffer data.
My question for now is: how can I decode this buffer? It doesn't seem possible with the default encodings Node.js supports.
Or would you recommend using a virtual printer which prints to a file, and then trying to read the data from it?
Which modules or virtual printers are recommended?
I've already searched for quite a while without finding any positive results.
Here is my current code from the net server:
var net = require('net') // required for net.createServer below

var server = net.createServer(function(socket) {
    socket.setEncoding('utf8')
    socket.on('data', function(buffer) {
        var decoded = buffer // already a string here because of setEncoding('utf8')
        console.log(decoded)
    })
    socket.on('end', socket.end)
});

server.on('connection', handleConnection);

server.listen(9100, function() {
    console.log('server listening to %j', server.address());
});

function handleConnection(conn) {
    var remoteAddress = conn.remoteAddress + ':' + conn.remotePort;
    console.log('new client connection from %s', remoteAddress);
    conn.on('data', onConnData);
    conn.once('close', onConnClose);
    conn.on('error', onConnError);
}

// Stub handlers (not shown in the original post):
function onConnData(data) { console.log('data:', data); }
function onConnClose() { console.log('connection closed'); }
function onConnError(err) { console.log('connection error:', err.message); }
OK, I've got this running.
The problem was that the cash register system first made a request for the printer status ("DLE EOT n").
So I responded to the cash register system with the corresponding status byte (0x16).
Afterwards the POS sent the print job, which I decoded from CP437 to UTF-8 so my script could capture and read the incoming print request.
Hope this post helps anyone who is developing anything similar, like kitchen monitors, print servers etc., as I found very little information on the web about this topic.
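For anyone following along, here is a rough sketch of those two steps. The 0x16 status byte is taken from the description above, but your POS may expect a different value, and the decoding uses the iconv-lite package:
var net = require("net");
var iconv = require("iconv-lite"); // npm install iconv-lite

var server = net.createServer(function (socket) {
    socket.on("data", function (chunk) {
        // ESC/POS real-time status request: DLE EOT n (0x10 0x04 n)
        if (chunk.length >= 2 && chunk[0] === 0x10 && chunk[1] === 0x04) {
            socket.write(Buffer.from([0x16])); // reply with the status byte the POS expects
            return;
        }
        // Anything else is print-job data in the printer's code page (CP437 here).
        console.log("print job chunk:", iconv.decode(chunk, "cp437"));
    });
});

server.listen(9100, function () {
    console.log("fake printer listening on TCP 9100");
});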

How does one correctly set up a server based deepstream RPC provider?

I am building an SOA with deepstream and I want to use a deepstream client on the server to perform API-key based lookups that the user should not know about. How do I actually set up an RPC provider? I have looked in the deepstream docs and on Google, but there is no full code example of how to do this. I have created the file below and run it with node. The output I get is shown underneath it:
var deepstream = require('deepstream.io-client-js')
const client = deepstream('localhost:6020').login()

console.log('Starting up')

client.on('error', (error, event, topic) => {
    console.log(error, event, topic);
})

client.on('connectionStateChanged', connectionState => {
    console.log(connectionState);
})

client.login({username: 'USER', password: 'PASSWORD'}, (success, data) => {
    if (success) {
        client.rpc.provide('the-rpc', function( data, response ){
            response.send(data);
        });
    } else {
        console.log(data);
    }
})
--
Starting up
AWAITING_CONNECTION
As you can see it runs the code, but does not actually connect to the deepstream server. I already have the deepstream server running, and a browser client that connects to it, so the config is correct. Please help!
I think your issue is that you're trying to connect Node via the web port. Try using port 6021 instead for TCP (the port used by the Node client).
const client = deepstream('localhost:6021').login()
You should also only call .login() once, so the line would be:
const client = deepstream('localhost:6021')
We are working on a 2.0 release coming out very soon which will remove tcp entirely and only require a single port to make life easier in terms of deployment and performance.
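Putting both fixes together, the provider might look roughly like this (port, credentials and RPC name are placeholders):
var deepstream = require('deepstream.io-client-js')

// Connect once over the TCP port used by Node clients; do not chain .login() here.
const client = deepstream('localhost:6021')

client.on('error', (error, event, topic) => {
    console.log(error, event, topic);
})

client.on('connectionStateChanged', connectionState => {
    console.log(connectionState);
})

// Log in once, then register the RPC provider.
client.login({username: 'USER', password: 'PASSWORD'}, (success, data) => {
    if (!success) {
        console.log(data);
        return;
    }
    client.rpc.provide('the-rpc', (data, response) => {
        response.send(data); // echo the payload back to the caller
    });
})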

Non-http TCP connection on Cloudfoundry

I'm a newbie mobile developer trying to take advantage of Cloud Foundry's service to run my server, which handles some chats and character movements.
I'm using Noobhub to achieve this (a TCP connection between server and client using Node.js and Corona SDK's TCP connection API).
So basically I'm trying a non-HTTP TCP connection between Cloud Foundry (Node.js) and my machine (Lua).
Link to Noobhub (there is a GitHub repo with server AND client side implementations).
I am doing
Client
...
socket.connect("myappname.cloudfoundry.com", 45234)
...
(45234 is the server's process.env.VCAP_APP_PORT value, which I retrieved from the console output of "vmc logs myappname" after running the application.)
Server
...
server.listen(process.env.VCAP_APP_PORT)
When I try to connect, it just times out.
On my local machine, doing
Client
...
socket.connect("localhost",8989)
Server
...
server.listen(8989)
works as expected. It's just on Cloud Foundry that it doesn't work.
I tried a bunch of other approaches, such as setting the client's connection port to 80, but none of them worked. I saw a few resources but none of them solved it.
I'm usually bad at asking questions, so if you need more information, please ask!
P.S.
Before you throw this link at me with an angry face D:< , here's a question that shows a similar problem that another person posted.
cannot connect to TCP server on CloudFoundry (localhost node.js works fine)
From here, I can see that this person was trying to do something similar to what I was doing.
Does the selected answer mean that I MUST use a Host header (i.e. use the HTTP protocol) to connect? Does that also mean Cloud Foundry will not support a "TRUE" TCP socket, much like Heroku or AppFog?
Actually, the process.env.VCAP_APP_PORT environment variable gives you the port to which your HTTP traffic is redirected by the Cloud Foundry L7 router (nginx), based on your app's route (e.g. nodejsapp.vcap.me:80 is redirected to the process.env.VCAP_APP_PORT port on the virtual machine), so you definitely should not use it for a raw TCP connection. This port should be used to listen for HTTP traffic. That is why your example works locally but does not work on Cloud Foundry.
The approach that worked for me is to listen on the port provided by CF with an HTTP server and then attach a WebSocket server (websocket.io in my example below) to it. I've created a sample echo server that works both locally and in CF. The content of my Node.js file, named example.js, is:
var host = process.env.VCAP_APP_HOST || "localhost";
var port = process.env.VCAP_APP_PORT || 1245;

var webServerApp = require("http").createServer(webServerHandler);
var websocket = require("websocket.io");
var http = webServerApp.listen(port, host);
var webSocketServer = websocket.attach(http);

function webServerHandler (req, res) {
    res.writeHead(200);
    res.end("Node.js websockets.");
}

console.log("Web server running at " + host + ":" + port);

//Web Socket part
webSocketServer.on("connection", function (socket) {
    console.log("Connection established.");
    socket.send("Hi from webSocketServer on connect");
    socket.on("message", function (message) {
        console.log("Message to echo: " + message);
        //Echo back
        socket.send(message);
    });
    socket.on("error", function (error) {
        console.log("Error: " + error);
    });
    socket.on("close", function () { console.log("Connection closed."); });
});
The dependency websocket.io can be installed by running the npm install websocket.io command in the same directory. There is also a manifest.yml file which describes the CF deployment arguments:
---
applications:
- name: websocket
  command: node example.js
  memory: 128M
  instances: 1
  host: websocket
  domain: vcap.me
  path: .
So, running cf push from this directory deployed the app to my local CF v2 instance (set up with the help of cf_nise_installer).
To test this echo WebSocket server, I used a simple index.html file, which connects to the server and sends messages (everything is logged to the console):
<!DOCTYPE html>
<html>
<head>
    <script>
        var socket = null;
        var pingData = 1;
        var prefix = "ws://";

        function connect() {
            socket = new WebSocket(prefix + document.getElementById("websocket_url").value);
            socket.onopen = function() {
                console.log("Connection established");
            };
            socket.onclose = function(event) {
                if (event.wasClean) {
                    console.log("Connection closed clean");
                } else {
                    console.log("Connection aborted (e.g. server process killed)");
                }
                console.log("Code: " + event.code + " reason: " + event.reason);
            };
            socket.onmessage = function(event) {
                console.log("Data received: " + event.data);
            };
            socket.onerror = function(error) {
                console.log("Error: " + error.message);
            };
        }

        function ping() {
            if (!socket || (socket.readyState != WebSocket.OPEN)) {
                console.log("Websocket connection not established");
                return;
            }
            socket.send(pingData++);
        }
    </script>
</head>
<body>
    ws://<input id="websocket_url">
    <button onclick="connect()">connect</button>
    <button onclick="ping()">ping</button>
</body>
</html>
The only thing left to do is to enter the server address into the textbox on the index page (websocket.vcap.me in my case) and press the Connect button, and we have a working WebSocket connection over TCP, which can be tested by sending a ping and receiving the echo. That worked well in Chrome; however, there were some issues with IE 10 and Firefox.
As for a "TRUE" TCP socket, there is no exact info: according to the last paragraph here, you cannot use any port except 80 and 443 (HTTP and HTTPS) to communicate with your app from outside of Cloud Foundry, which makes me think a raw TCP socket cannot be implemented. However, according to this answer, you can actually use other ports... It seems some deeper investigation of this question is required.
"Cloud Foundry uses an L7 router (ngnix) between clients and apps. The router needs to parse HTTP before it can route requests to apps. This approach does not work for non-HTTP protocols like WebSockets. Folks running node.js are going to run into this issue but there are no easy fixes in the current architecture of Cloud Foundry."
- http://www.subbu.org/blog/2012/03/my-gripes-with-cloud-foundry
I decided to go with pubnub for all my messaging needs.

Change port without losing data

I'm building a settings manager for my HTTP server. I want to be able to change settings without having to kill the whole process. One of the settings I would like to be able to change is the port number, and I've come up with a variety of solutions:
Kill the process and restart it
Call server.close() and then do the first approach
Call server.close() and initialize a new server in the same process
The problem is, I'm not sure what the repercussions of each approach are. I know that the first will work, but I'd really like to accomplish these things:
Respond to existing requests without accepting new ones
Maintain data in memory on the new server
Lose as little uptime as possible
Is there any way to get everything I want? The API for server.close() gives me hope:
server.close(): Stops the server from accepting new connections.
My server will only be accessible by clients I create and by a very limited number of clients connecting through a browser, so I will be able to notify them of a port change. I understand that changing ports is generally a bad idea, but I want to allow for the edge-case where it is convenient or possibly necessary.
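For what it's worth, here is a minimal sketch of the third approach: close the listener, keep the process (and its in-memory data) alive, and re-listen on the new port once existing connections have drained. The callback form of server.close() is standard Node; the rest is just one way it could be wired up.
var http = require("http");

var cache = { hits: 0 }; // in-memory state survives the port change because the process stays up

var server = http.createServer(function (req, res) {
    cache.hits += 1;
    res.end("hits so far: " + cache.hits);
});
server.listen(8888);

function changePort(newPort) {
    // Stop accepting new connections; in-flight requests are allowed to finish.
    server.close(function () {
        server.listen(newPort); // same server object, same handler, same memory
        console.log("Now listening on " + newPort);
    });
}
Note that keep-alive connections from browsers can hold the old port open for a while, which delays the close() callback; the keep-alive discussion further down touches on exactly that.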
P.S. I'm using connect if that changes anything.
P.P.S. Relatively unrelated, but what would change if I were to use UNIX server sockets or change the host name? This might be a more relevant use-case.
P.P.P.S. This code illustrates the problem of using server.close(). None of the previous servers are killed, but more are created with access to the same resources...
var http = require("http");
var server = false,
curPort = 8888;
function OnRequest(req,res){
res.end("You are on port " + curPort);
CreateServer(curPort + 1);
}
function CreateServer(port){
if(server){
server.close();
server = false;
}
curPort = port;
server = http.createServer(OnRequest);
server.listen(curPort);
}
CreateServer(curPort);
Resources:
http://nodejs.org/docs/v0.4.4/api/http.html#server.close
I tested the close() function. It seems to do absolutely nothing: the server still accepts connections on its port. Restarting the process was the only way for me.
I used the following code:
var http = require("http");
var server = false;
function OnRequest(req,res){
res.end("server now listens on port "+8889);
CreateServer(8889);
}
function CreateServer(port){
if(server){
server.close();
server = false;
}
server = http.createServer(OnRequest);
server.listen(port);
}
CreateServer(8888);
I was about to file an issue on the Node GitHub page when I decided to test my code thoroughly to see if it really is a bug (I hate filing bug reports when it's user error). I realized that the problem only manifests itself in the browser, because browsers apparently keep HTTP connections alive, so they can still reach the old port through the connection that is still open to the server.
What I've learned is this:
Browsers' keep-alive connections keep old ports reachable unless the server process is killed
Utilities that do not keep connections alive by default (curl, wget, etc.) work as expected
HTTP requests in Node also don't keep the same kind of persistent connection that browsers do
For example, I used this code to prove that node http clients don't have access to old ports:
Client-side code:
var http = require('http'),
    client,
    request;

function createClient (port) {
    client = http.createClient(port, 'localhost');
    request = client.request('GET', '/create');
    request.end();
    request.on('response', function (response) {
        response.on('end', function () {
            console.log("Request ended on port " + port);
            setTimeout(function () {
                createClient(port);
            }, 5000);
        });
    });
}

createClient(8888);
And server-side code:
var http = require("http");
var server,
curPort = 8888;
function CreateServer(port){
if(server){
server.close();
server = undefined;
}
curPort = port;
server = http.createServer(function (req, res) {
res.end("You are on port " + curPort);
if (req.url === "/create") {
CreateServer(curPort);
}
});
server.listen(curPort);
console.log("Server listening on port " + curPort);
}
CreateServer(curPort);
Thanks everyone for the responses.
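As an aside on the keep-alive finding above: if you want close() to actually drop those persistent browser connections too, you can track the sockets yourself and destroy them. A minimal sketch (the tracking approach is an assumption, not something from the answers above):
var http = require("http");

var sockets = [];

var server = http.createServer(function (req, res) {
    res.end("hello");
});

// Remember every connection so it can be torn down explicitly later.
server.on("connection", function (socket) {
    sockets.push(socket);
    socket.on("close", function () {
        sockets.splice(sockets.indexOf(socket), 1);
    });
});

server.listen(8888);

function hardClose(callback) {
    server.close(callback); // stop accepting new connections
    sockets.forEach(function (socket) {
        socket.destroy();   // kill keep-alive connections so close() can finish
    });
}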
What about using cluster?
http://learnboost.github.com/cluster/docs/reload.html
It looks interesting!
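The linked cluster package is ancient, but the same idea works with Node's built-in cluster module: fork a few workers and replace them one at a time, so the service keeps answering while each worker restarts. A rough sketch (the SIGUSR2 trigger is just an assumption):
var cluster = require("cluster");
var http = require("http");

if (cluster.isMaster) {
    for (var i = 0; i < 2; i++) cluster.fork();

    // On SIGUSR2, replace workers one by one: fork the new worker first,
    // then gracefully disconnect the old one once its replacement is listening.
    process.on("SIGUSR2", function () {
        var ids = Object.keys(cluster.workers);
        (function replace(i) {
            if (i >= ids.length) return;
            var oldWorker = cluster.workers[ids[i]];
            cluster.fork().on("listening", function () {
                oldWorker.disconnect(); // finishes in-flight requests, then exits
                replace(i + 1);
            });
        })(0);
    });
} else {
    http.createServer(function (req, res) {
        res.end("served by pid " + process.pid);
    }).listen(8888);
}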
