SockJS/Socket.IO disconnect delay behind Apache proxy - node.js

I need to handle users disconnecting from my SockJS application running in xhr-polling mode. When I connect to localhost, everything works as expected. When I put Apache between Node.js and the browser, I get a ~20-second delay between the browser closing and the disconnect event firing inside Node.js. My Apache proxy config is the following:
<Location />
ProxyPass http://127.0.0.1:8080/
ProxyPassReverse http://127.0.0.1:8080/
</Location>
The rest of the file is the default; you can see it here. I tried playing with the ttl=2 and timeout=2 options, but either nothing changes, or I get reconnected every 2 seconds without closing the browser. How can I reduce the additional disconnect timeout that Apache introduces somewhere in its defaults?

It's possible that your Apache server is configured to use HTTP Keep-Alive, which keeps a persistent connection open. In that case I would try disabling KeepAlive, or lowering the KeepAliveTimeout setting in your Apache configuration, to see if this solves the problem.
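For example, a minimal sketch of what to try in the main Apache configuration (the exact file and default values depend on your distribution and Apache version):
KeepAlive Off
or, if you want to keep persistent connections but shorten them:
KeepAliveTimeout 2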
If that doesn't work, I would also take a look at netstat, check the status of each socket, and start a root-cause analysis. This chart shows the TCP state machine and can tell you where each connection is. Wireshark can also give you some information on what is going on.

In long polling the connection looks like this:
<client> ---> apache ---> <node.js>
When the client breaks the connection:
<client> -X-> apache ---> <node.js>
Apache still keeps the connection to Node.js open. There are two workarounds for this.
ProxyTimeout
You can add the following to your Apache config:
ProxyTimeout 10
This will break the connection after 10 seconds, but it also breaks every long-polling connection after 10 seconds, which you don't want.
Ping
The next option is to ping the client:
pingTimeout (Number): how many ms without a pong packet to consider the connection closed (60000)
pingInterval (Number): how many ms before sending a new ping packet (25000).
var io = require('socket.io')(server, { 'transports': ['polling'], pingTimeout: 5000, pingInterval: 2500});
The above will make sure the client is detected as disconnected within about 5 seconds of going away. You can lower the values further, but that may impact normal loading scenarios.
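To see how quickly the server actually notices the client going away with these settings, a minimal server-side sketch (handler contents are illustrative) would be:
io.on('connection', function (socket) {
  console.log('client connected', socket.id);
  socket.on('disconnect', function (reason) {
    // with pingTimeout: 5000 and pingInterval: 2500 this should fire within a few seconds of the client disappearing
    console.log('client disconnected', socket.id, reason);
  });
});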
But reading through all the posts, threads, and sites, I don't think you can replicate the behavior you get when connecting to socket.io directly, because in that case the connection break can be detected easily by socket.io.

The 20-second delay is not in the Apache proxy. I got the same issue: the delay did not happen on the local URL, but it did happen on the global URL.
The issue was solved in Node.js itself. You need to send data once from the server to the client to make sure the connection is initialized. This problem is not mentioned in the documentation or the issues for the WebSocket plugin.
Send a dummy message after the request is accepted on the server, like below:
let connection = request.accept(null, request.origin);
connection.on('message', function (evt) {
console.log(evt);
});
connection.on('close', function (evt) {
console.log(evt);
});
connection.send("Success"); //dummy message to the client from server
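On the client side this corresponds to something like the following sketch (plain browser WebSocket API; the URL is a placeholder):
var ws = new WebSocket('ws://example.com:8080/');
ws.onopen = function () { console.log('open'); };
ws.onmessage = function (evt) { console.log(evt.data); }; // should log "Success" once the server sends the dummy message
ws.onclose = function (evt) { console.log('closed', evt.code); };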

Related

NodeJs websocket getting disconnected

My NodeJs server is using websocket and accepting 2 connections, one from a web client (Firefox v83.0) and the other from an Android application.
Everything works fine and messages flow back and forth from the web client to the mobile app and vice versa.
However, after what looks to me like some period of inactivity (around 30 or 60 seconds), the websocket connection gets disconnected.
I have logged the close event in the server side like this:
wsServer.on('close', (data) => {
console.log("CONNECTION CLOSED")
console.log(data.closeReasonCode)
console.log(data.closeDescription)
})
And what I get is:
1006
Peer not responding.
I googled the 1006 error and the docs only tell me the connection was closed abnormally... great explanation... thanks :S
I need some help figuring out what is going on here and what this "Peer not responding" message means. I do not know whether the issue is in the server or in either of the 2 clients; is there some timer that closes the connection that I am not aware of?
Thanks.
Judging by the "symptoms", I would say you are using nginx as a reverse proxy.
Try to set
proxy_read_timeout 600s; # default is 60 seconds
Check the docs.
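If changing the proxy timeout is not an option, another workaround (a sketch only, assuming the server uses the 'websocket' npm package as in the code above) is an application-level keepalive, so the connection never looks idle to the proxy:
wsServer.on('request', function (request) {
  var connection = request.accept(null, request.origin);
  // send a small message periodically so intermediaries never see an idle connection
  var keepAlive = setInterval(function () {
    if (connection.connected) {
      connection.sendUTF('keepalive'); // the client can simply ignore this message
    }
  }, 30000);
  connection.on('close', function () {
    clearInterval(keepAlive);
  });
});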

Debugging lost HTTPS UPGRADE requests on node

I have a Node HTTPS Linux server which handles UPGRADE requests to allow secure WebSocket connections as well as other HTTPS requests. It works well 99% of the time.
Periodically and unpredictably, client websocket connection attempts to the server time out (the client gives up after 30 seconds).
I listen for upgrade requests on the HTTPS server as follows:
server.on('upgrade', function upgrade(req, socket, head) {
console.log('Upgrade - Beg - '+req.url);
...
In the cases where the client timeout occurs, I never get the 'upgrade' event.
How can I debug this? Is there a lower-level Node https server event that I can listen for that might indicate something (so that I know the connection is actually getting to the server, for example)?
Notes:
When I detect the timeout on the client side (actually, even before the 30 seconds), I attempt another HTTPS connection to the same server (a POST). It works! The problem only seems to happen with websocket connections.
I have code that retries the websocket connection when it experiences the timeout, but usually it takes several retries before the timeout magically disappears.
Any help in how to debug this would be greatly appreciated.
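One way to narrow this down (a sketch; event availability may vary with your Node version) is to log the lower-level events the HTTPS server already emits, so you can tell whether the failing connection's TCP and TLS handshakes ever reach the process at all:
server.on('connection', function (socket) {
  console.log('TCP connection from ' + socket.remoteAddress);
});
server.on('secureConnection', function (tlsSocket) {
  console.log('TLS handshake completed');
});
server.on('tlsClientError', function (err) {
  console.log('TLS client error: ' + err.message);
});
server.on('clientError', function (err, socket) {
  console.log('HTTP client error: ' + err.message);
});
If 'connection' and 'secureConnection' fire but 'upgrade' never does, the request reaches Node and is lost later; if they never fire, the problem is likely in front of the server.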

Socket.io - when I run my node app on localhost I can connect, but on the production server I can't?

I'm using passport.socketio. The debug statements for the Authorization success are firing, but not the connect event. I get these debug statements on the server:
setting poll timeout
discarding transport
cleared post timeout for client laknraalkn3but
clearing poll timeout
jsonpolling writing to io.j[o]("8::");
set post timeout for client laknraalkn3but
jsonpolling closed due to exceeded duration
setting request GET /socket.io/1/jsonp-polling/laknraalkn3but
Not sure if this is the problem, but I'm trying to connect on port 8086, which some have told me could be too high a port for a university web server. Where can I allow the ports? In the server firewall? I'm using Windows Server 2008 and IIS 7, and I've set up a reverse proxy to forward all traffic to port 8086.
Thanks!
This is definitely a firewall issue, but without knowing your full setup it's kind of impossible to diagnose. Make sure port 8086 is open on the server, then see if there is a physical hardware firewall between you and the server and whether port 8086 is open on that as well.

Socket.io constantly polling... should it be doing this?

I have a node server and a web page connected via socket.io. I noticed in the browser console that it is outputting
XHR finished loading: GET "http://my_url/socket.io/?EIO=3&transport=polling&t=1418944327412-412&sid=vqLTUtW3QhNLwQG8AAAA".
and
XHR finished loading: POST "http://my_url/socket.io/?EIO=3&transport=polling&t=1418944385398-415&sid=vqLTUtW3QhNLwQG8AAAA".
every few seconds. Should it be doing this, or am I missing a setting? I'm really only looking to send data back and forth explicitly via the socket. Perhaps I'm missing something in the setup.
Client side is basically
var socket = io("http://my_url");
with the usual event listeners. Server side is
var io = require('socket.io')(server);
I tried placing this on the server side
io.set('transports', ['websocket']);
but that seemed to kill it.
The socket.io implementation (when using webSockets) sends regular (every few seconds) heartbeat and response packets to constantly verify that the connection is alive and well. This is normal.
These packets are not actual http requests (they are websocket data packets) so there should not be full-on http packets going on unless socket.io is not actually using the webSocket protocol, but is instead using HTTP long polling. socket.io will use the webSocket protocol as long as it is supported in the client (which it should be in all modern browsers nowadays).
You may have to be careful about how you interpret requests in a debugger. A socket.io connection starts its life as an http request with some custom headers, and all debuggers will show this initial http request. If webSocket is supported at both ends, then the server returns a response which "upgrades" the connection to the webSocket protocol. The same TCP socket that started out carrying an http request then becomes a webSocket connection, and subsequent webSocket messages flow over that TCP socket. It is up to the debugger how it displays that traffic. In the Chrome debugger, you have to open the original http connection and then ask to see websocket traffic; only then can you actually see webSocket packets. But I could imagine that debuggers that aren't as webSocket savvy might show subsequent packets as related to that original HTTP connection (I haven't looked at how debuggers other than Chrome show webSocket traffic).
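If you want to confirm which transport your client actually ended up on, a small client-side sketch (using the engine accessor exposed by socket.io-client) is:
var socket = io("http://my_url");
socket.on('connect', function () {
  console.log('transport:', socket.io.engine.transport.name); // usually "polling" right after connect
  socket.io.engine.on('upgrade', function (transport) {
    console.log('upgraded to:', transport.name); // "websocket" if the upgrade succeeds
  });
});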
The only other reason I can think of that a client would be repeatedly sending HTTP connection requests is if the connection keeps dropping for some reason, so the client keeps reconnecting every time it drops. socket.io has settings that control how often/vigorously the client tries to reconnect when the connection is lost, though if you have connection issues, you really need to figure out why they happen rather than change the reconnect settings.
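For reference, a sketch of those client-side reconnection options (the values are illustrative, not recommendations):
var socket = io("http://my_url", {
  reconnection: true,          // enabled by default
  reconnectionAttempts: 5,     // give up after 5 tries instead of retrying forever
  reconnectionDelay: 1000,     // wait 1 second before the first retry
  reconnectionDelayMax: 5000   // cap the backoff between retries at 5 seconds
});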

Socket.io connection to server doesn't work sometimes

I have set up a Node.js server with socket.io on a VPS, and every 10 seconds I broadcast the number of connected clients to everyone. This usually works fine, though oftentimes the connection can't be established and I get this error (I changed the IP a bit):
GET http://166.0.0.55:8177/socket.io/1/?t=1385120872574
After reloading the site the connection can usually be established, though I have no idea why the failed connection happens in the first place; I also don't know how to debug the socket.io code. Sometimes I can't connect to the server anymore and I have to restart it.
Additional information:
My main site runs on a different server (using a LAMP environment with CakePHP) than the Node.js server.
I use forever to run the server
I have a lot of connected clients (around 1000)
My VPS has 512 MB RAM and the CPU is never higher than 25%
After running the top command, try:
socket.on('error', function (err) {
console.log("Socket.IO Error");
console.log(err.stack); // this is changed from your code in last comment
});
Also, you could try a slower transport. Socket.io uses WebSocket by default, but if your server cannot allocate enough resources, you can try another transport that is slower but uses fewer resources:
io = socketIo.listen(80);
io.set('transports', ['xhr-polling']);
