Server Sent Events NodeJS limits - node.js

Is there a limit to how many clients a Node.js server can handle with server-sent events? As far as I understand, the server has to keep a connection open to be able to send messages to the browser. How can I know how many connections can be held open without pen-testing, since it will be on shared hosting for now?
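For context, the kind of setup I mean is roughly this minimal sketch using Node's built-in http module (the path, port, and payload are just placeholders): every subscribing browser keeps one response open, and the server writes events into it.

    const http = require('http');

    // Every connected browser holds one long-lived response open on the server.
    const clients = new Set();

    http.createServer((req, res) => {
      if (req.url === '/events') {
        res.writeHead(200, {
          'Content-Type': 'text/event-stream',
          'Cache-Control': 'no-cache',
          Connection: 'keep-alive',
        });
        clients.add(res);
        // Free the slot as soon as the browser goes away.
        req.on('close', () => clients.delete(res));
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(3000);

    // Push something to every open connection once per second.
    setInterval(() => {
      for (const res of clients) {
        res.write(`data: ${JSON.stringify({ time: Date.now() })}\n\n`);
      }
    }, 1000);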

Related

Connection between multiple clients to a single server in Nodejs

I need to create 20 clients which make requests simultaneously to a server in Node.js using WebSocket. I am able to create the connection between a single server and a client using WebSocket, but when it comes to creating 20 clients I have no idea how to proceed. Please give any suggestions.
You wouldn't need to create 20 HTML pages; the same HTML page can be loaded by multiple clients.
On server-side, the 'request' event will fire every time a client connects to your websocket server. Your websocket server will be able to handle multiple clients out of the box. However, you will need to ascertain 'which' client this particular request originated from. That can be done by using tokens or credentials, or any other custom protocol that you want to establish between your client and server.
Check the server-side usage example for websocket module here: https://www.npmjs.com/package/websocket#server-example
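Condensed from that example, the pattern looks roughly like this (this assumes the websocket npm package linked above; the port and the echo logic are only illustrative):

    const http = require('http');
    const WebSocketServer = require('websocket').server;

    const httpServer = http.createServer();
    httpServer.listen(8080);

    const wsServer = new WebSocketServer({ httpServer: httpServer });

    // Fires once per connecting client; the same server object handles all of them.
    wsServer.on('request', (request) => {
      const connection = request.accept(null, request.origin);

      connection.on('message', (message) => {
        if (message.type === 'utf8') {
          // Work out which client this is (token, credentials, etc.) before trusting the data.
          connection.sendUTF('echo: ' + message.utf8Data);
        }
      });

      connection.on('close', () => {
        // Per-client cleanup goes here.
      });
    });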

Which is the better way to implement heartbeat on the client side for websockets?

On the server side for WebSockets there is already a ping/pong implementation where the server sends a ping and the client replies with a pong, to let the server know whether a client is still connected. But there is nothing implemented in the other direction to let the client know whether the server is still connected to it.
There are two ways to go about this that I have read:

1. Every client sends a message to the server every x seconds; whenever an error is thrown while sending, that means the server is down, so reconnect.
2. The server sends a message to every client every x seconds; the client receives this message and updates a variable, and a timer on the client checks every x seconds whether that variable has changed recently. If it hasn't changed in a while, the client assumes the server is down and re-establishes the connection.

Either method lets the client figure out whether the server is still online. With the first you send traffic to the server, with the second you send traffic out of the server. Both seem easy enough to implement (the second is sketched below), but I'm not sure which is the better way in terms of being more efficient/cost-effective.
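Roughly, I imagine the second approach looking something like this on the client (the URL, timeout, and check interval are just placeholders):

    // Second approach: the server is expected to send something every X seconds;
    // the client remembers when it last heard from the server and reconnects
    // if that gap grows too large.
    const TIMEOUT_MS = 30000;
    let lastSeen = Date.now();
    let socket;

    function connect() {
      socket = new WebSocket('wss://example.com/ws'); // placeholder URL
      socket.onopen = () => { lastSeen = Date.now(); };
      socket.onmessage = () => { lastSeen = Date.now(); }; // any message counts as a heartbeat
    }

    setInterval(() => {
      if (Date.now() - lastSeen > TIMEOUT_MS) {
        // Nothing heard in a while: assume the server is down and reconnect.
        socket.close();
        connect();
      }
    }, 5000);

    connect();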
Server upload speeds are higher than client upload speeds, but server CPUs are an expensive resource while client CPUs are relatively cheap. Unloading logic onto the client is a more cost-effective approach...
Having said that, servers must implement this specific logic (actually, all ping/timeout logic), otherwise they might be left with "half-open" sockets that drain resources but aren't connected to any client.
Remember that sockets (file descriptors) are a limited resource. Not only do they use memory even when no traffic is present, but they prevent new clients from connecting when the resource is maxed out.
Hence, servers must clear out dead sockets, either using timeouts or by implementing ping.
P.S.
I'm not a node.js expert, but this type of logic should be implemented using the WebSocket protocol ping rather than by your application. You should probably look into your node.js server / WebSocket framework and check how to enable pinging.
You should set pings to accommodate your specific environment, i.e. if you host on Heroku, Heroku imposes a timeout of ~55 seconds, so your pings should be sent before that timeout occurs.
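For illustration only, with the ws package (just one possible framework, not necessarily the one you use; the 30-second interval is arbitrary), server-side pinging could look roughly like this:

    const WebSocket = require('ws');
    const wss = new WebSocket.Server({ port: 8080 });

    wss.on('connection', (ws) => {
      ws.isAlive = true;
      // Browsers answer protocol pings with pongs automatically.
      ws.on('pong', () => { ws.isAlive = true; });
    });

    setInterval(() => {
      wss.clients.forEach((ws) => {
        // No pong since the last round: the socket is dead, so drop it.
        if (!ws.isAlive) return ws.terminate();
        ws.isAlive = false;
        ws.ping();
      });
    }, 30000);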

Why are messages from a Socket.IO server only received from time to time?

I have a socket.io server in node.js. All connections come through NGINX. The client is written in C# with the Quobject/SocketIoClientDotNet library.
The problem is that the client receives messages from the server only from time to time.
I have logging in the node.js code, so I can see that the server does try to send the messages. Moreover, netstat shows multiple connections in the TIME_WAIT state on the server, and the number of those connections equals the number of unsuccessful send attempts by the socket.io server.
In the other direction, the server always receives messages from the clients.
I applied the NGINX settings for WebSockets ("Upgrade" headers, etc.), but it didn't help.
I turned off the Windows Firewall, but it didn't help.
So I don't know why this happens or where else to look, and I would appreciate any help from the community.

Sending data from RabbitMQ to Node.JS via Socket.IO

I am going to design a system with two-way communication between clients and a web application. The web application can receive data from the client so it can persist it to a DB and so forth, while it can also send instructions to the client. For this reason, I am going to use Node.JS and Socket.IO.
I also need to use RabbitMQ because, if the web application sends an instruction to a client while the client is down (and hence the socket has dropped), I want the instruction to be queued so it can be sent whenever the client connects again and creates a new socket.
From the client to the web application it should be pretty straightforward: the client uses the socket to send the data to the Node.JS app, which in turn sends it to the queue so it can ultimately be forwarded to the web application. In this direction, if the socket is down there is no internet connection, and hence the data is either not sent in the first place or is cached on the client.
My concern lies with the other direction, and I would like an answer before I design it this way and actually implement it, so I can avoid hitting any brick walls. Let's say that the web application tries to send an instruction to the client. If the socket is available, the web app forwards the instruction to the queue, which in turn forwards it to the Node.JS app, which in turn uses the socket to forward it to the client. So far so good. If on the other hand, the internet connection from the client has dropped, and hence the socket is currently down, the web app will still send the instruction to the queue. My question is, when the queue forwards the instruction to Node.JS, and Node.JS figures out that the socket does not exist, and hence cannot send the instruction, will the queue receive a reply from Node.JS that it could not forward the data, and hence that it should remain in the queue? If that is the case, it would be perfect. When the client manages to connect to the internet, it will perform a handshake once again, the queue will once again try to send to Node.JS, only this time Node.JS manages to send the instruction to the client.
Is this the correct reasoning of how those components would interact together?
This won't work the way you want it to.
When the node process receives the message from RabbitMQ and sees that the socket is gone, you can easily nack the message back to the queue.
However, that message will be redelivered immediately; it won't sit there doing nothing. The node process will just pick it up again, and you'll end up with your node / RabbitMQ setup thrashing as it nacks the same message over and over, waiting for the socket to come back online.
If you have dozens or hundreds of messages for a client that isn't connected, you'll have dozens or hundreds of messages thrashing around in circles like this. It will destroy the performance of both your node process and RabbitMQ.
My recommendation:
When the node app receives the message from RabbitMQ and the socket is not available for the client, put the message in a database table and mark it as waiting for that client.
When the client reconnects, check the database for any pending messages and forward them all at that point.
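Roughly, the consumer could look something like this (this assumes amqplib and socket.io; the queue name, the 'handshake' event, and the savePending / takePending helpers are placeholders for your own protocol and database layer):

    const amqp = require('amqplib');
    const { Server } = require('socket.io');

    const io = new Server(3000);
    const sockets = new Map(); // clientId -> socket, filled in during the handshake

    // Placeholder DB layer: persist / drain instructions waiting for an offline client.
    const db = {
      async savePending(clientId, payload) { /* INSERT into a "pending" table */ },
      async takePending(clientId) { /* SELECT + DELETE pending rows */ return []; },
    };

    io.on('connection', (socket) => {
      socket.on('handshake', async (clientId) => {
        sockets.set(clientId, socket);
        socket.on('disconnect', () => sockets.delete(clientId));

        // On reconnect, flush whatever queued up while the client was offline.
        for (const payload of await db.takePending(clientId)) {
          socket.emit('instruction', payload);
        }
      });
    });

    (async () => {
      const conn = await amqp.connect('amqp://localhost');
      const channel = await conn.createChannel();
      await channel.assertQueue('instructions');

      channel.consume('instructions', async (msg) => {
        const { clientId, payload } = JSON.parse(msg.content.toString());
        const socket = sockets.get(clientId);

        if (socket) {
          socket.emit('instruction', payload);
        } else {
          // Don't nack -- that would requeue and thrash. Park it in the DB instead.
          await db.savePending(clientId, payload);
        }
        channel.ack(msg); // either way, the queue is done with this message
      });
    })();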

NodeJS Synchronize clients

I'm using socket.io and node.js.
I have a server that I use as my node.js server. What I'm trying to do is move clients according to messages sent as client -> server -> clients.
For example, client1 sends the message "MOVE-RIGHT" to the server. The server rebroadcasts this message to all clients as something like "MOVE-RIGHT-CLIENT1", and according to this message all clients start moving client1 to the right.
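Roughly, the relay part looks like the following on the server (the 'move' event name is just a placeholder):

    const { Server } = require('socket.io');
    const io = new Server(3000);

    io.on('connection', (socket) => {
      socket.on('move', (direction) => {
        // e.g. 'MOVE-RIGHT' from client1 is rebroadcast as 'MOVE-RIGHT-<client1's id>'
        io.emit('move', direction + '-' + socket.id);
      });
    });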
The problem is that clients may have different latency depending on their network conditions. For example, if server -> client1 communication takes 50 ms, server -> client2 communication may take 250 ms, so client1 performs the movement nearly 200 ms earlier. We can say these two movements are not synchronized because one happens earlier than the other.
As you know, latency between clients and the server may differ for each client, and it can also differ between messages for the same client.
My question is: which method should I use to synchronize these clients so they do their jobs at the same time? Is there any feature of socket.io or node.js for this? What would you recommend?
