Persist websocket connection object across multiple servers - node.js

I am using a websocket library on the server for establishing socket connections.
https://github.com/websockets/ws
I have more than one server in a cluster, and I want to know how I can use the same socket connection object on another server in the cluster.
I also want to know what the best option is for a web chat implementation: native websocket or socket.io?

You cannot use the same actual socket object across multiple servers. The socket object represents a socket connection between a client and one physical server process. It is possible to build a virtual socket object that knows which server its connection is on and sends that server a message to relay out over the actual socket from that other server.
The socket.io redis adapter is one such virtual way of doing this. You set up a node.js cluster and you use the redis adapter with socket.io. It uses a central redis-based store to keep track of which server process each physical connection is on. Then, when you want to send a message to a particular client from any of the server processes, you send that message through socket.io; it looks up in the redis database where that socket is connected, contacts that actual server, and asks it to send the message to that particular client over the socket.io connection that is currently present on that other server process. Similarly, you can broadcast to groups of sockets and it will do all the work under the covers of making sure the message gets to the clients no matter which actual server they are connected to.
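Here is a minimal sketch of that setup, assuming socket.io with the socket.io-redis adapter and a redis server on localhost (the port and event name are placeholders):

    // Run this on every server process in the cluster.
    const io = require('socket.io')(3000);

    // The adapter keeps connection/room state in redis so any process can reach any client.
    io.adapter(require('socket.io-redis')({ host: 'localhost', port: 6379 }));

    io.on('connection', (socket) => {
      socket.on('chat message', (msg) => {
        // This broadcast reaches every connected client, even those whose
        // sockets live on a different server process.
        io.emit('chat message', msg);
      });
    });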
You could surely build something similar yourself for plain webSocket connections, and others have built pieces of it. I'm not familiar enough with what exists out there in the wild to offer any recommendations for a plain webSocket. There are plenty of articles on scaling webSocket servers horizontally which you can find with Google and read to get started if you want to do it with a plain webSocket.

Related

Multiple Socket.io app processes cause each client socket to connect and disconnect repeatedly

I am working on a nodejs app with Socket.io. I did a test in a single process using PM2 and there were no errors. Then I moved to our production environment (we use a Google Cloud Compute instance).
I run 3 app processes and an iOS client connects to the server.
The iOS client doesn't keep the socket connection. It doesn't send a disconnect to the server, but it gets disconnected and then reconnects, and this happens continuously.
I am not sure why the server disconnects the client.
If you have any hint or answer for this, I would appreciate it.
That's probably because requests end up on a different machine than the one they originated from.
Straight from Socket.io Docs: Using Multiple Nodes:
If you plan to distribute the load of connections among different processes or machines, you have to make sure that requests associated with a particular session id connect to the process that originated them.
What you need to do:
Enable session affinity, a.k.a. sticky sessions.
If you want to work with rooms/namespaces you also need to use a centralised memory store to keep track of namespace information, such as Redis with the Redis adapter.
But I'd advise you to read the documentation piece I posted; things might have changed a bit since the last time I implemented something like this.
By default, the socket.io client "tests" out the connection to its server with a couple of http requests. If you have multiple server processes and those initial http requests don't go to the exact same server each time, then the socket.io connection will never get established properly and will not switch over to webSocket; it will keep attempting to use http polling.
There are two ways to fix this.
You can configure your clients to just assume the webSocket protocol will work. This initiates the connection with one and only one http request, which is then immediately upgraded to the webSocket protocol (with socket.io running on top of that). In socket.io, this is a transport option specified with the initial connection (see the sketch after these options).
You can configure your server infrastructure to be sticky so that a request from a given client always goes back to the exact same server. There are lots of ways to do this depending upon your server architecture and how the load balancing is done between your servers.
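A minimal client-side sketch of the first option, assuming the standard socket.io client (the URL is a placeholder):

    // Skip the initial http polling phase and connect over webSocket directly.
    const socket = io('https://chat.example.com', {
      transports: ['websocket']
    });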
If your servers keep any client state local to the server (and not in a shared database that all servers access), then you will need even a dropped connection and reconnect to go back to the same server, and sticky connections are your only solution. You can read more about sticky sessions on the socket.io website.
Thanks for your replies.
I finally figured out the issue. It was caused by the TTL of the backend service in the Google Cloud load balancer. The default TTL was 30 seconds, which made each socket connection disconnect and reconnect.
So I updated the value to 3600s and then I could keep the connection.

node js, socket.io, redis and pm2

Our system includes a NodeJS restful API server. This server also serves as a socket.io server. Many devices connect to the server via socket.io, the user can control a device by calling the restful API, and the server transfers the command to the device through socket.io. We used pm2 to cluster the API server. Can you help with how to use a Redis server to send a message from any process in the cluster to a specific socket instance?
If you already have a redis server setup, all you have to do is setup the socket.io-redis adapter: https://www.npmjs.com/package/socket.io-redis
From there, it's best to have the incoming socket.io connections join a device specific room. This is typically achieved with a socket.join() from the connection event.
From there, you can call .to().emit() to send the data to the device. Using rooms allows you to ignore the individual socket and transmit the message to the device no matter which instance it is connected to.
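A sketch of that room-per-device pattern, assuming the socket.io-redis adapter is already attached and that each device identifies itself with a deviceId query parameter (both names are placeholders):

    // io: the socket.io server instance with the redis adapter attached (see the npm link above).
    io.on('connection', (socket) => {
      // Put each device into its own room, keyed by the id it sends when connecting.
      const deviceId = socket.handshake.query.deviceId;
      socket.join(deviceId);
    });

    // Callable from any pm2 process, e.g. inside a restful route handler:
    function sendCommand(deviceId, command) {
      // The redis adapter relays this to whichever process holds the device's socket.
      io.to(deviceId).emit('command', command);
    }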

Can a socket.io server communicate with a non-socket.io client

I am building a chat server which uses a custom protocol for communicating with clients over sockets. The client sends specific strings to the server and the server must take an appropriate action based on these non-standard messages. I can't change this protocol, nor do I have any access to the underlying client code.
My question is, can I use the node.js socket.io package to power the server socket communication if I have no idea how the client is handling its socket activity? I'm asking because, reading through the socket.io docs, each time I push anything through a socket an 'event' is associated with each action.
Is it possible to send a very exact message to the client from the server with this 'event' bundled in? Am I better off using the websockets package?
Can a socket.io server communicate with a non-socket.io client
No. A socket.io server requires both the webSocket protocol for connection initiation and the socket.io format on top of that. So, a socket.io server can only talk to a socket.io client and vice versa.
If your chat client uses a custom protocol, then you need to implement a TCP server that also speaks that custom protocol (whatever it is).
If you can modify the client, then you can modify it to use a socket.io client and then you can send your chat messages via socket.io where your socket.io server can then receive those messages.
The client sends specific strings to the server and the server must take an appropriate action based on these non-standard messages. I can't change this protocol, nor do I have any access to the underlying client code.
Then, you have to implement a server that speaks your custom client protocol based on whatever the underlying protocol is for the client. There is no other way around it.
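As a rough sketch, if the underlying transport turned out to be a raw TCP socket, a server built on Node's built-in net module could react to those protocol strings (the 'PING'/'PONG' strings below are hypothetical placeholders, not the real protocol):

    const net = require('net');

    const server = net.createServer((socket) => {
      socket.on('data', (chunk) => {
        // Interpret the client's custom protocol string and respond to it.
        const message = chunk.toString().trim();
        if (message === 'PING') {
          socket.write('PONG\n');
        }
      });
    });

    server.listen(9000);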
I'm asking because, reading through the socket.io docs, each time I push anything through a socket an 'event' is associated with each action.
This is how the socket.io layer works. It sends a message and (optional) data. This can always be used to just send data by declaring a generic data message and then listening for that data message on the other end. But this assumes you can modify both client and server to work this way. If you can't modify your client to use the socket.io protocol, then you can't use socket.io.
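For illustration, a generic data message looks like this (a sketch that only applies if both ends could be switched to socket.io, which is not the case here; the event name 'data' is arbitrary):

    // Server side: treat one generic 'data' event as a raw data channel.
    io.on('connection', (socket) => {
      socket.on('data', (payload) => {
        console.log('received from client:', payload);
      });
      socket.emit('data', 'hello from server');
    });

    // The socket.io client would mirror this:
    //   socket.emit('data', 'hello from client');
    //   socket.on('data', (payload) => { /* handle it */ });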

socketio inter server communication with redis and haproxy

I'm working on a project which uses Socket.IO and should be horizontally scalable. I'm using:
A load balancer using HAProxy
Multiple Node servers (2-4)
Database servers (Redis and MongoDB)
I'm able to redirect my incoming socket connections to the Node servers using the roundrobin method. The socket connection is stable and if I use socket.emit() I'm receiving the data. I'm also able to emit to other socket connections connected to the same Node server.
I'm facing issue in the following scenario:
User A connected to Node server 1 and User B connected to Node Server 2
My intention is to store the socket data in redis.
If User A wants to send some data to User B, how can I tell Node server 2, from Node server 1, to emit the data to User B?
Please let me know how can I achieve this (with ref if possible).
Thanks in advance.
This scenario is a match for Redis Pub/Sub.
If you haven't already, you should try Pub/Sub.
Have a look at the socket.io Redis adapter. It should be exactly what you need.
The clients() method in particular looks promising. Keep in mind that socket.io creates a unique room for each client.
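A sketch, assuming both Node servers attach the socket.io-redis adapter: because socket.io puts every client in a room named after its own socket id, Node server 1 can target User B even though B's socket lives on Node server 2 (the socket id would be looked up from wherever you stored it, e.g. redis):

    // On every Node server behind HAProxy:
    const io = require('socket.io')(3000);
    io.adapter(require('socket.io-redis')({ host: 'localhost', port: 6379 }));

    // Callable from Node server 1 even when User B is connected to Node server 2:
    function sendToUser(userBSocketId, payload) {
      // Emitting to the room named after B's socket id reaches B wherever B is connected.
      io.to(userBSocketId).emit('private message', payload);
    }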

Is it possible to both listen and send messages to a remote socket using node-red's websocket node?

The Almond Plus router and home automation hub now exposes the state of its registered z-wave and zigbee sensors via a websocket.
The websocket API is here:
https://wiki.securifi.com/index.php/Websockets_Documentation
I've aimed a node-red websocket node at the router's local IP address, and have included authentication information in the URL. This seems to work for receiving status changes to devices.
But I need to also be able to send commands over the websocket to flip switches and whatnot. When I create both 'listen on' and 'connect to' websocket nodes in node-red, only the node that's listening connects. Do I need to make two nodes at all? I would have hoped there'd be a way to make one connection to the websocket and use it for two-way communication, but maybe this question just exposes my ignorance of how either websockets or node-red function.
Thanks for any help or information.
You will need both a Websocket in node and a Websocket out node, but both should be set to "connect to" mode. These handle the input and output sides of the connection.
I'd have to double check the code, but they should share the same client connection under the covers, so there will only be one real Websocket connection. But even if the 2 nodes don't share the connection, having 2 separate websocket connections to the same address shouldn't be a problem.
