Serve flash policy requests on port 80 beside HTTP in node.js

I want to serve socket connections from a Flash browser client, and therefore I need to add support for the policy-file-request protocol. I can't run the policy-file-request service on the default port 843 because of firewalls etc. The only option I have is to serve the protocol on port 80, beside my HTTP server.
My app is written in node.js and the following code works:
var httpServer = http.createServer();

net.createServer(function(socket){
    httpServer.emit('connection', socket);
}).listen(80);
I open a socket server on port 80, and for now I just emit the connection event on the httpServer, no problem so far. Now I want to check if the new socket is a policy-file-request which will just send the plain string <policy-file-request /> over a TCP connection. When I notice this string I know it isn't HTTP and I can return the crossdomain file and close the socket. So what I try now is this:
net.createServer(function(socket){
    socket.once('readable', function(){
        var chunk = socket.read(1);
        // chunk[0] === 60 corresponds to the opening bracket '<'
        if(chunk !== null && chunk[0] === 60) {
            socket.end(crossdomain);
        } else {
            socket.unshift(chunk);
            httpServer.emit('connection', socket);
        }
    });
}).listen(80);
Now I check if the first byte is the opening bracket '<' and then write the crossdomain file to the socket. Otherwise I unshift the chunk onto the stream and emit the socket as a connection on the HTTP-server. Now the problem is that the HTTP-server doesn't emit a request event anymore and my regular HTTP-requests aren't handled as a result.
I also tried the following, but without success either:
httpServer.on('connection', function(socket){
    socket.once('data', function(chunk){
        if(chunk[0] === 60) {
            socket.end(crossdomain);
        }
    });
});
When the socket emits the data event, the readyState of the socket is already 'closed' and a clientError event has already been emitted by the httpServer. I searched everywhere and didn't find a solution. I also don't want to pipe the data through another socket to another port where my HTTP server is running locally, because I think that adds too much unnecessary overhead. Is there a clean way to do this in node.js? I tested all this on node.js version 0.10.26.
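One approach worth trying (a sketch only; I have not verified it on 0.10.26, and the crossdomain XML below is just a permissive placeholder) is to sniff the first byte from a single data listener, pause the socket, and either answer the policy request or unshift the byte, hand the socket to the HTTP server, and resume it:

var net = require('net');
var http = require('http');

var crossdomain = '<?xml version="1.0"?>' +
    '<cross-domain-policy>' +
    '<allow-access-from domain="*" to-ports="*"/>' +
    '</cross-domain-policy>\0';

var httpServer = http.createServer(function(req, res){
    res.end('hello');
});

net.createServer(function(socket){
    socket.once('data', function(chunk){
        socket.pause();
        if(chunk[0] === 60) {                       // '<' means policy-file-request, not HTTP
            socket.end(crossdomain);
        } else {
            socket.unshift(chunk);                  // put the sniffed bytes back
            httpServer.emit('connection', socket);  // let the HTTP parser take over
            socket.resume();
        }
    });
}).listen(80);

The difference from the snippets above is that the socket is explicitly paused while the decision is made and only resumed after the HTTP server has attached its own listeners.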

Related

Combining Nodejs Net socket and Socket IO

I have a Windows application (built in C# as a Windows service) that sends data to a Node.js net socket. Since Socket.IO makes a web application live, without the need to reload, how can I have Socket.IO stream the data received by the Node.js net socket to the web application at the exact moment the net socket receives it from C#?
So in the code that receives the socket data from C#:
var net = require('net');

net.createServer(function (socket) {
    socket.on('data', function (data) {
        broadcast(socket.name + "> \n" + data + " \n", socket);
        socket.end("<EOF>");
        //send data to web interface, does it work that way?
        //SomeFooToSendDataToWebApp(Data)
    });
});
Furthermore, for Socket.IO I have these lines, which I can't really figure out how to deal with:
//Should it listen to the net socket or the web socket?
server.listen(8080);

// Loading socket.io
var io = require('socket.io').listen(server);

// It works but only for one request
io.sockets.on('connection', function (socket2) {
    socket2.emit('message', 'Message Text');
});
P.S.: I am new to node.js and socket.io, so it would also help if you could explain their behavior.
Edit 1: My front-end JavaScript, in case it has any problems:
//for now it listens to the http port, which Socket.IO listens to
var socket = io.connect('http://localhost:8080');
var myElement = document.getElementById("news");

socket.on('message', function(message) {
    document.getElementById("news").innerHTML = message;
});
Edit 2: I followed jfriend00's answer; it seems my previous attempts were sending messages to an unknown socket. Since I needed the data sent to all connected clients, a single added line fixed it:
socket.on('data', function (data) {
    broadcast(socket.name + "> \n" + data + " \n", socket);
    socket.end("<EOF>");
    //send data to web interface
    //The added line:
    io.emit('message', data + " more string");
});
It's a bit hard to tell exactly what you're asking.
If you have some data you want to send to all connected socket.io clients (no matter where the data came from), then you can do that with:
io.emit("someMessage", dataToSend);
If you want to send to only one specific connected client, then you have to somehow get the socket object for that specific client and then do:
socket.emit("someMessage", dataToSend);
How you get the specific socket object for the desired connected client depends entirely upon how your app works and how you know which client it is. Every socket connection on the server has a socket.id associated with it. In some cases, server code uses that id to keep track of a given client (such as putting the id in the session or saving it in some other server-side data). If you have the id for a socket, you can get to the socket with the .to() method such as:
io.to(someId).emit("someMessage", dataToSend);
Your question asked about how you send data received from some C# service over a normal TCP socket. As far as sending it to a socket.io client goes, it does not matter at all where the data came from or how you received it. Once you have the data in some Javascript variable, it's all the same from there whether it came from a file, from an http request, from an incoming TCP connection from your C# service, etc. It's just data you want to send.
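As an illustration of the per-client case (a rough sketch; the 'register' event, the userName key and the clients map are made-up names, not something your code already has), the server can remember each socket.id and target it later with io.to():

const io = require('socket.io')(8080);
const clients = {};                         // userName -> socket.id (assumed scheme)

io.on('connection', (socket) => {
    socket.on('register', (userName) => {
        clients[userName] = socket.id;      // remember which socket belongs to whom
    });
    socket.on('disconnect', () => {
        Object.keys(clients).forEach((name) => {
            if (clients[name] === socket.id) delete clients[name];
        });
    });
});

// later, when data for a specific user arrives (for example from the C# service):
function sendToUser(userName, data) {
    const id = clients[userName];
    if (id) io.to(id).emit('someMessage', data);
}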
You can try the following simple server:
const io = require('socket.io')(8080);

io.on('connection', socket => {
    console.log('client connected');
    socket.on('data', data => {
        io.emit('message', data);
    });
});

console.log('server started at port 8080');
It should work if I understand the problem correctly.
And maybe use document.getElementById("news").innerHTML += message; in the HTML client code to see what really happens there?
socket2 is the client that just connected, so you can store these connections to send data to them later (helpful for broadcasting).
If you get data from the Windows service via some polling mechanism, at that point you can send the message to your connected clients. So keep your connections in an array so you can send specific messages to each client afterwards.

Socket.IO, SSL Problems With cloudflare

I have a socket.io app that basically receives signals from a frontend in order to kill and start a new ffmpeg process (based on .spawn()).
Everything works as expected, but I often get a 525 error from Cloudflare. The error message is: Cloudflare is unable to establish an SSL connection to the origin server.
It works about 9 out of 10 times. I noticed that more of these errors pop up whenever a kill + spawn is done. Could it be that something blocks the event loop, and because of this all incoming requests are blocked and Cloudflare logs them as handshake failures?
Contacting Cloudflare support gives me back this info (this is the request they make to my server):
Time id host message upstream
2017-08-16T09:14:24.000Z 38f34880faf04433 xxxxxx.com:2096 peer closed connection in SSL handshake while SSL handshaking to upstream https://xxx.xxx.xxx.xxx:2096/socket.io/?EIO=3&transport=polling&t=LtgKens
I've been debugging for some time now, but can't seem to find a solution myself.
This is how I initialize my socketIO server.
/**
 * Start the socket server
 */
var startSocketIO = function() {
    var ssl_options = {
        key: fs.readFileSync(sslConfig.keyFile, 'utf8'),
        cert: fs.readFileSync(sslConfig.certificateFile, 'utf8')
    };
    self.app = require('https').createServer(ssl_options, express);
    self.io = require('socket.io')(self.app);
    self.io.set('transports', ['websocket', 'polling']);
    self.app.listen(2096, function() {
        console.log('Socket.IO Started on port 2096');
    });
};
This is the listener code on the server side
this.io.on('connection', function (socket) {
    console.log('new connection');
    /**
     * Connection to the room
     */
    socket.on('changeVideo', function (data) {
        //Send to start.js and start.js will kill the ffmpeg process and
        //start a new one
        socket.emit('changeVideo');
    });
});
Another thing that I observed while debugging (I only got this a few times): the text 'new connection' is displayed on the server and the connected client emits the changeVideo event, but nothing happens on the server side; instead the client just keeps reconnecting.
This is a simplified version of the nodejs code. If you have more questions, just let me know.
Thanks!
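One way to test the event-loop hypothesis mentioned above is to log whenever a timer fires noticeably late while the kill/spawn happens. This is only a rough diagnostic sketch; the 1 second interval and 100 ms threshold are arbitrary values, not taken from the project:

// warn whenever the event loop is stalled long enough to delay a 1 s timer
var last = Date.now();
setInterval(function() {
    var lag = Date.now() - last - 1000;
    if (lag > 100) {
        console.warn('event loop was blocked for roughly ' + lag + ' ms');
    }
    last = Date.now();
}, 1000);

If the warnings line up with the 525 errors, the kill/spawn path would be a likely suspect.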

Nodejs Error: This socket has been ended by the other party

I have setup a simple server and client, however whenever I close the client, it seems not possible to reconnect. Here's my client:
const net = require('net');
var client = net.connect({port: 8080, host: '127.0.0.1'});
var response = '';

// events
client.on('data', function(chunk) { response += chunk; });
client.on('end', function() {
    console.log(response);
    client.end();
});

// main execution
client.write('test');
And here's my server:
const net = require('net');
var server = net.createServer();
server.listen(8080, '127.0.0.1');

server.on('connection', function(sock) {
    sock.on('data', function(chunk) {
        sock.write('test received');
        sock.end();
    });
});
This is just outline code representing my issue. When I execute my client the first time, everything works correctly. However, when I execute it again, the server outputs the error mentioned in the title and crashes. The same happens if I remove 'client.end()' and instead Ctrl+C out of the client program to cause it to end.
My understanding of sockets is that they represent endpoints in a stream between the client and the server. When that stream is no longer necessary (i.e., when the client has done what it needs to do), I want that stream to be completely removed. I would think that calling end() on both the client and server endpoints of that single stream would achieve this, like sending two FIN messages, but as explained it does not. The reasons I want to do this are (a) so the client file will actually finish execution, and (b) so the server no longer keeps its endpoint of that stream around, wasting resources listening to it.
Any insight into the source of my problem would be appreciated.
You should use the 'connect' event on the client side to be sure that you perform requests only when your socket is ready. So, in the callback for that event, you can invoke the write() function.
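For example, the client from the question could be adjusted roughly like this (a sketch; everything except the 'connect' handler is unchanged):

const net = require('net');
var client = net.connect({port: 8080, host: '127.0.0.1'});
var response = '';

// events
client.on('connect', function() {
    client.write('test');               // only write once the socket is ready
});
client.on('data', function(chunk) { response += chunk; });
client.on('end', function() {
    console.log(response);
    client.end();
});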
You can check that the socket is not destroyed before writing:
if (!socket.destroyed) socket.write("something");
Your server only closes the socket when data is received, and your client never sends any.

Setting up a Stateless connection with NodeJS and Socket.IO

After prototyping my project using PHP and Unity3D, I've decided to build the production version using Cordova and NodeJS.
I'm currently using Socket.io with NodeJS and having some confusion with connections. The way that I had expected this to work out was the following procedure:
The client would connect to the server with a request
The server would respond to the request
The connection would be closed
However, it seems that the connection likes to stay open, and if the connection is closed, it continuously attempts to reconnect, which is not what I am looking for. I'm attempting to establish a single, stateless data transfer, similar to what happens when you make a web request to a PHP file.
The source code of the project is pretty much boilerplate code:
var application = require('express')();
var http = require('http').Server(application);
var server = require('socket.io')(http);

http.listen(8080, function() {
    console.log('Listening on *:8080');
});

server.on('connection', function(socket) {
    console.log('SERVER: A new connection has been received.');
    socket.on('disconnect', function() {
        console.log('SERVER: A connection has been closed.');
    });
});
I do not need a persistent connection, nor do I want one.
Thoughts: I could send a close handshake from the client. For example:
Send some data to the server
Receive some data from the server
Send a close request to the server / just close the socket
Continue application logic once the socket is closed
Would this be the proper way to handle it? But then the question arises: what if the data gets lost and there's a permanently open socket? Would implementing a basic timeout be ideal in this situation (i.e. if a response isn't received within 10 seconds, assume there was an error or the server was not available)?
Then Socket.io is the wrong tool for your scenario. socket.io needs to keep the socket open to get events from the server back to the client (and vice versa). As a matter of fact, even if the server does not support WebSockets, socket.io will fall back to other mechanisms, such as polling.
Not sure why you're using socket.io for this. Socket.IO serves a different purpose and doesn't fit your criteria here; I have mainly seen it used in real-time applications and binary streaming. You can try a TCP socket in node.js:
var net = require('net');

var HOST = '127.0.0.1';
var PORT = 6969;

// Create a server instance, and chain the listen function to it
// The function passed to net.createServer() becomes the event handler for the 'connection' event
// The sock object the callback function receives is UNIQUE for each connection
net.createServer(function(sock) {
    // We have a connection - a socket object is assigned to the connection automatically
    console.log('CONNECTED: ' + sock.remoteAddress + ':' + sock.remotePort);

    // Add a 'data' event handler to this instance of socket
    sock.on('data', function(data) {
        console.log('DATA ' + sock.remoteAddress + ': ' + data);
        // Write the data back to the socket, the client will receive it as data from the server
        sock.write('You said "' + data + '"');
    });

    // Add a 'close' event handler to this instance of socket
    sock.on('close', function(data) {
        console.log('CLOSED: ' + sock.remoteAddress + ' ' + sock.remotePort);
    });
}).listen(PORT, HOST);

console.log('Server listening on ' + HOST + ':' + PORT);
Check out here
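To get the request/response/close flow from the question (including the timeout idea) on top of a plain TCP socket, a one-shot client might look roughly like this. This is only a sketch; the host, port and the 10 second limit simply mirror the examples above:

var net = require('net');

function oneShotRequest(payload, callback) {
    var socket = net.connect({host: '127.0.0.1', port: 6969});
    var done = false;

    function finish(err, result) {
        if (done) return;
        done = true;
        socket.destroy();                  // nothing stays open afterwards
        callback(err, result);
    }

    socket.setTimeout(10000);              // give up after 10 seconds
    socket.on('timeout', function() { finish(new Error('timed out')); });
    socket.on('error', function(err) { finish(err); });
    socket.on('connect', function() { socket.write(payload); });
    socket.on('data', function(chunk) {
        // treat the first chunk as the whole response (good enough for a sketch)
        finish(null, chunk.toString());
    });
}

// usage
oneShotRequest('hello', function(err, reply) {
    if (err) {
        console.error(err.message);
    } else {
        console.log('server said:', reply);
    }
});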

Is it possible to enable tcp, http and websocket all using the same port?

I am trying to enable tcp, http and websocket.io communication on the same port. I started out with the TCP server (the part above the //// line), and it worked. Then I ran the echo server example found on websocket.io (the part below the //// line), and it also worked. But when I try to merge them together, TCP doesn't work anymore.
So, is it possible to enable tcp, http and websockets all on the same port? Or do I have to listen on another port for tcp connections?
var net = require('net');
var http = require('http');
var wsio = require('websocket.io');

var conn = [];

var server = net.createServer(function(client) { //'connection' listener
    var info = {
        remote : client.remoteAddress + ':' + client.remotePort
    };
    var i = conn.push(info) - 1;
    console.log('[conn] ' + conn[i].remote);

    client.on('end', function() {
        console.log('[disc] ' + conn[i].remote);
    });

    client.on('data', function(msg) {
        console.log('[data] ' + conn[i].remote + ' ' + msg.toString());
    });

    client.write('hello\r\n');
});
server.listen(8080);
///////////////////////////////////////////////////////////
var hs = http.createServer(function(req, res) {
    res.writeHead(200, {
        'Content-Type' : 'text/html'
    });
    res.end(['<script>', "var ws = new WebSocket('ws://127.0.0.1:8080');", 'ws.onmessage = function (data) { ws.send(data); };', '</script>'].join(''));
});
hs.listen(server);

var ws = wsio.attach(hs);
var i = 0, last;

ws.on('connection', function(client) {
    var id = ++i, last;
    console.log('Client %d connected', id);

    function ping() {
        client.send('ping!');
        if (last)
            console.log('Latency for client %d: %d ', id, Date.now() - last);
        last = Date.now();
    }

    ping();
    client.on('message', ping);
});
You can have multiple different protocols handled by the same port but there are some caveats:
There must be some way for the server to detect (or negotiate) the protocol that the client wishes to speak. You can think of separate ports as the normal way of detecting the protocol the client wishes to speak.
Only one server process can be actually listening on the port. This server might only serve the purpose of detecting the type of protocol and then forwarding to multiple other servers, but each port is owned by a single server process.
You can't support multiple protocols where the server speaks first (because there is no way to detect the protocol of the client). You can support a single server-first protocol with multiple client-first protocols (by adding a short delay after accept to see if the client will send data), but that's a bit wonky.
An explicit design goal of the WebSocket protocol was to allow WebSocket and HTTP protocols to share the same server port. The initial WebSocket handshake is an HTTP compatible upgrade request.
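For instance, a plain node http server already routes the two cases itself: ordinary requests fire the request handler, while a WebSocket handshake arrives as an HTTP Upgrade request and fires the 'upgrade' event. A minimal sketch (it only answers the handshake; real WebSocket frame parsing is omitted, so a library would normally take over from here):

var http = require('http');
var crypto = require('crypto');

var server = http.createServer(function(req, res) {
    res.end('plain HTTP response\n');              // normal requests land here
});

server.on('upgrade', function(req, socket) {       // WebSocket handshakes land here
    var key = req.headers['sec-websocket-key'];
    var accept = crypto.createHash('sha1')
        .update(key + '258EAFA5-E914-47DA-95CA-C5AB0DC85B11')   // GUID from RFC 6455
        .digest('base64');
    socket.write('HTTP/1.1 101 Switching Protocols\r\n' +
                 'Upgrade: websocket\r\n' +
                 'Connection: Upgrade\r\n' +
                 'Sec-WebSocket-Accept: ' + accept + '\r\n\r\n');
    // from here on the socket speaks the WebSocket framing protocol
});

server.listen(8080);

This is why socket.io, websocket.io and similar libraries can attach to an existing HTTP server: they hook this upgrade path rather than binding a port of their own.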
The websockify server/bridge is an example of a server that can speak 5 different protocols on the same port: HTTP, HTTPS (encrypted HTTP), WS (WebSockets), WSS (encrypted WebSockets), and Flash policy response. The server peeks at the first character of the incoming request to determine if it is TLS encrypted (HTTPS, or WSS) or whether it begins with "<" (Flash policy request). If it is a Flash policy request, then it reads the request, responds and closes the connection. Otherwise, it reads the HTTP handshake (either encrypted or not) and the Connection and Upgrade headers determine whether it is a WebSocket request or a plain HTTP request.
Disclaimer: I made websockify
Short answer - NO, you can't have different TCP/HTTP/Websocket servers running on the same port.
Longish answer -
Both websockets and HTTP work on top of TCP, so you can think of an HTTP server or WebSocket server as a custom TCP server (with some state management and protocol-specific encoding/decoding). It is not possible to have multiple sockets bind to the same port/protocol pair on a machine, so the first one wins and the following ones get socket bind exceptions.
nginx allows you to run http and websocket on the same port, and it forwards to the correct application:
https://medium.com/localhost-run/using-nginx-to-host-websockets-and-http-on-the-same-domain-port-d9beefbfa95d
