client not handshaken client should reconnect, socket.io in cluster - node.js

My node.js app with express, redis, mongo and socket.io works perfectly.
Today when I introduced cluster, I found that it works, but the app logs a lot of messages:
'client not handshaken' 'client should reconnect'
Often the response time from socket.io is very bad, up to several seconds.
Then I put http-proxy in front of the requests from the browsers. Chrome works intermittently without throwing these messages; sometimes if I open the same URL again, it starts throwing them and the response is delayed.
Firefox behaves the same way: at random, it starts throwing these messages continuously.
Looks like some problem with websockets in a clustered environment.
My versions: node.js 0.6.10, socket.io 0.9.0, express 2.5.9, http-proxy 0.8.0

This is most probably because Socket.IO keeps your connections in memory, so each server instance has its own set of clients. To share Socket.IO state across multiple server instances, look into using its RedisStore. The same applies to Express sessions, where you have connect-redis as an option.
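For reference, a minimal sketch of the 0.9-era RedisStore setup described above, matching the express 2.x API the question mentions (the variable names and port are assumptions, and a Redis server must be running):

```javascript
var express = require('express');
var redis = require('redis');
var sio = require('socket.io');

var app = express.createServer();   // express 2.x style
var io = sio.listen(app);

// Give every worker the same store so handshakes made on one
// process are visible to the others.
io.set('store', new sio.RedisStore({
  redisPub: redis.createClient(),
  redisSub: redis.createClient(),
  redisClient: redis.createClient()
}));

app.listen(80);
```

Each clustered worker runs this same code; the three Redis clients carry pub/sub traffic and shared handshake data between the workers.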

Related

Why does socket.io disconnect every 5 seconds and then reconnect on the client side?

This might be a simple question. Any help is really appreciated.
I am using socket.io with the express framework on AWS.
I am running a timer using socket.io: I need to constantly emit the server time every second. My sockets get disconnected every 5 seconds and then reconnect automatically, which causes a break in my timer. I really believe this isn't a problem with the code, as everything worked fine on my previous server.
Is there any configuration that needs to be handled in AWS to avoid such disconnects?
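For context, the emit loop described in the question is usually as simple as the sketch below (the `io` instance and the 'time' event name are assumptions, not from the question's code):

```javascript
// Broadcast the server time to every connected client once per second.
setInterval(function () {
  io.sockets.emit('time', { now: Date.now() });
}, 1000);
```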

How to run HTTP server, UDP Server and WebSocket Server from a single NodeJS app?

As NodeJS is a single-threaded runtime platform, how can I run the following servers in parallel within a single NodeJS app:
NodeJS's http server: to serve the HTML5 app
A WebSocket server: to serve WebSocket connections to the HTML5 app over the same http connection opened at the http server.
A UDP server: to expose a service discovery endpoint for other independently running NodeJS apps on the same machine or on other machines/docker containers.
I was thinking about somehow achieving the above with RxJS, but would rather hear from the community about their solutions/experiences.
Node.js is not single-threaded. The developer only has access to one thread, but under the hood node.js is multi-threaded.
Specifically for your question: you can start multiple servers in the same process. The Socket.io getting-started example shows running websockets alongside an http server. The same can be done with UDP.
Hope that helps.
First off, you can have as many listening servers as you want in your node.js process. As long as you write proper asynchronous code in your handlers and don't have any CPU-hogging algorithms to run, you should be just fine.
Second, your webSocket and http server can be the exact same server, as that's how webSocket was designed to work.
Your UDP listener then just needs to be on some different port from your web server.
The single-threaded aspect of node.js applies only to your Javascript. You can run multiple server listeners just fine. If requests come in on two different servers at the same time, the one that arrives slightly before the other will get its handler called first, and the one arriving a bit later will be queued until the first handler finishes or returns while waiting for an asynchronous operation of its own. In this way, single-threaded node.js can handle many requests.

User not disconnecting in Socket.IO Node.Js app ONLY on https connection - caching issue?

I'm having an issue with my Socket.Io / Node.Js app: when I reload a page, Socket.IO still thinks the "previous" me is connected to the chat room, so it counts two people in the room instead of one. After a while (maybe a minute) the problem disappears.
I tried to resolve it, but I think there's some caching issue: even though I reload the browser window, the previous session is still active, so it "thinks" two people are connected.
This issue doesn't occur all the time, but it always happens over HTTPS, almost always on iOS, and sometimes in Safari / Chrome and other browsers (on all systems, over an https connection).
Do you know what the issue might really be and what would be the best way to resolve it?
I use all the standard code for setting up the Socket.IO connection, and the app runs on Express / Node.Js.
I can put the code here, but it's quite a lot; the open source code is available at http://github.com/noduslabs/infranodus (app.js and public/entries.js are where the socket.io code is).
Thank you!

socket.io security issues and differences between socket.io and long polling

What are the security issues for socket.io in nodejs?
Which is better for real-time updates using node.js: socket.io or long polling?
Socket.io uses websockets. If you deploy your code in a shared hosting environment, or if you are going through a firewall, the websocket protocol might not work.
You can configure socket.io to fall back to a long-polling strategy in that case (which uses XHR requests). You send your data to the socket.io API as usual, and it decides which strategy to use. Long-polling is more CPU-consuming and uses 2 sockets, as it establishes two-way communication with the server.
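With 0.9-era Socket.IO, the fallback order described above can be set explicitly on the server (a sketch; `io` is the server instance and the transport names follow the 0.9 documentation):

```javascript
io.set('transports', [
  'websocket',      // preferred when the network allows it
  'xhr-polling',    // long-polling fallback for firewalls/shared hosting
  'jsonp-polling'   // last resort for older browsers
]);
```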

socket.io, node cluster, express, session store with redis

I'm not exactly sure how to describe this, but I'm running a node app clustered across 4 cores on port 80, using RedisStore as the socket.io store, with express.js, and socket.io listening on it.
Some interesting behavior: on around 40% of clients that connect to socket.io using Chrome (and Firefox, but we stopped bothering with different browsers because it seems to happen across the board), the connection works fine for the first 25-30 seconds; then there are 60 seconds of dead time where requests are sent from the client but none are received or acknowledged by the server, and at around 1.5-1.6 minutes the client starts a new websocket connection. Sometimes the new connection shows the same behavior; other times it "catches" and stays up, absolutely fine, for the next few hours.
What's odd is that this behavior doesn't happen on our test server, which runs on a different port (and also uses a cluster); moreover, it doesn't happen on any of our local development servers (some of which use clusters, others don't).
Any ideas?
