I am working on Socket.IO; the connection between the client and the server is established successfully. I am facing two problems:
1 - When the initial connection is made between the client and the server, socket.client.id on the server and socket.id on the client are the same. But when I refresh the client page, the client's id changes to a new one while the id on the server stays the same. Does this cause any issue when communicating with the server or the client over sockets while the ids don't match? Or should the id on the server change when the client page is refreshed?
2 - On the initial connection the server sends a message with socket.emit() and the client receives it with socket.on(). But when I try to emit anything from the client, it is not received on the server.
Server Side Connection
function Globals() {
  this.socketConnection = async function() {
    let p = new Promise(function(res, rej) {
      io.on("connection", function(socket) {
        if (socket.connected) {
          res(socket);
        } else {
          rej("Socket Connection Error !");
        }
      });
    });
    return await p;
  };
}

new Globals().socketConnection().then(function(soc) {
  console.log(soc.client.id);
  socket = soc;
  soc.emit("Hi");
  soc.on("Nady", function() {
    console.log("I am called");
  });
});
Client Side Connection
function Globals() {
  this.socketConnection = async function() {
    var socket = io('http://localhost:8080');
    let p = new Promise(function(res, rej) {
      socket.on('connect', function() {
        if (socket.connected) {
          console.log(socket.id);
          res(socket);
        }
      });
    });
    return await p;
  };
}

var socket;
new App().socketConnection().then(function(s) {
  socket = s;
});

function ScrapJobs() {
  var socket;
  new App().socketConnection().then(function(s) {
    socket = s;
  });
  var _this = this;
  this.attachListeners = function() {
    qs("#init-scrap").ev("click", _this.startScrapping);
  };
  this.startScrapping = function() {
    console.log("I am clicked");
    socket.on("Hi", function() {
      console.log("Hi Nadeem");
    });
    socket.emit("Nady");
  };
}
When the initial connection is made between the client and the server, socket.client.id on the server and socket.id on the client are the same, but when I refresh the client page, the client's id changes while on the server it stays the same. Does this cause any issue?
The client-side socket.id value is set on the client socket object after the connect event is received and is updated upon a reconnect event.
It appears that the socket.io infrastructure will keep them the same on client and server. If the client disconnects and then reconnects, there will be a new connection with a new id on both client and server. It is possible you are attempting to hang onto the old socket object on the server after the client has disconnected it (we can't really see enough of your server code to evaluate that).
On the initial connection the server sends a message with socket.emit() and the client receives it with socket.on(). But when I try to emit anything from the client, it is not received on the server.
You'd have to show us a reproducible case. This does not happen if you are coding things correctly. I would guess that you do not have the right listeners for messages on the right socket in order to see the messages you are sending. I promise you that sending a message from client to server works just fine when implemented properly.
A general comment about your code: both code blocks you show appear to be stuffing a socket object into a higher-scoped (or perhaps even global) variable. That is likely part of the cause of your problem, because that socket object can become dead if the client reconnects for any reason. Also, putting any sort of socket object into a global or module-level variable makes your server capable of serving only one client - it's simply not how you design multi-client servers.
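To illustrate the point, here is a minimal sketch (assuming a plain Express + Socket.IO server on port 8080; the event names are taken from your snippets) that keeps each socket inside the connection handler instead of a shared variable:

const express = require('express');
const http = require('http');
const { Server } = require('socket.io');

const app = express();
const httpServer = http.createServer(app);
const io = new Server(httpServer);

io.on('connection', function(socket) {
  console.log('client connected:', socket.id);

  // Register listeners per connection so every client gets its own handlers.
  socket.on('Nady', function() {
    console.log('received "Nady" from', socket.id);
    socket.emit('Hi');  // reply to this client only
  });

  socket.on('disconnect', function() {
    console.log('client disconnected:', socket.id);
  });
});

httpServer.listen(8080);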
Environment:
Backend
node:latest
socket.io | 4.5.2
Frontend
React Native | 0.70.4
socket.io-client | 4.6.0
both Android and iOS
Here is my NodeJs entry file:
// Imports are not shown in the original snippet; these are the usual ones
// for this cluster setup (@socket.io/sticky and @socket.io/cluster-adapter).
import cluster from 'cluster'
import http from 'http'
import express from 'express'
import { cpus } from 'os'
import { Server } from 'socket.io'
import { setupMaster, setupWorker } from '@socket.io/sticky'
import { createAdapter, setupPrimary } from '@socket.io/cluster-adapter'

const numCPUs = cpus().length

if (cluster.isPrimary) {
  const app = express()
  const httpServer = http.createServer(app)
  setupMaster(httpServer, { loadBalancingMethod: 'least-connection' })
  setupPrimary()
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork()
  }
  cluster.on('exit', (worker) => {
    cluster.fork()
  })
} else {
  const app = express()
  const httpServer = http.createServer(app)
  const io = new Server(httpServer, { maxHttpBufferSize: 1e8 })
  io.adapter(createAdapter())
  setupWorker(io)
  API.Socket.init(io, process.pid)
  middlewares.forEach((middleware: any) => app.use(middleware))
  routes.forEach((route) => app.use(route.path, route.handler))
  httpServer.listen(CONFIG.PORT, () => {})
}
I have a simple chat application.
When user A sends a message to user B, a new chat message and a notification are recorded in the database. That chat message and notification should then be sent to user B. There are two socket emit functions for that:
sendNewNotification(
  notification: BE.Entities.TNotification,
  toUser: string,
) {
  this.io
    ?.to(toUser)
    .volatile.emit(ECustomEvents.NewNotification, notification)
}

sendPrivateMessage(
  toUser: string | Array<string>,
  chatMessage: BE.Entities.TChatMessage,
  sourceUser: BE.Entities.TUser,
) {
  this.io
    ?.to(toUser)
    .volatile.emit(ECustomEvents.PrivateMessage, chatMessage, sourceUser)
}
If I call them like this, the target user does not receive the event with the newChatMessage, but does receive the savedNotification:
API.Socket.sendPrivateMessage(targetUserId, newChatMessage, userToPass)
API.Socket.sendNewNotification(savedNotification, targetUserId)
Now, if I switch these lines:
API.Socket.sendNewNotification(savedNotification, targetUserId)
API.Socket.sendPrivateMessage(targetUserId, newChatMessage, userToPass)
the behavior is as expected: target user B receives both the saved notification and the new chat message.
How is that possible? What could be wrong?
Thank you mates in advance!
With the current information, I'm not sure the order actually matters; it may be a side effect or coincidence. Are you checking anywhere to make sure the server-side socket is ready before the client emits?
Consider this super simple WebSocket chat sandbox:
One of the issues I noticed when writing this is that when the server WebSocket is not ready, I could not emit from the client to the server. To make sure the server was ready, I sent a ping from the server to the client notifying it that the server is ready:
wss.on("connection", async function connection(client, request) {
console.log("user connected", Date.now());
client.send(JSON.stringify({ ready: true }));
...
});
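On the client side, the counterpart might look roughly like this (a sketch using the browser WebSocket API; the URL and message shapes are assumptions):

const ws = new WebSocket("ws://localhost:8080");
let serverReady = false;

ws.addEventListener("message", function(event) {
  const msg = JSON.parse(event.data);
  if (msg.ready) {
    serverReady = true;
    // Now it is safe to send; anything sent earlier could have been dropped.
    ws.send(JSON.stringify({ type: "chat", text: "hello" }));
  }
});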
I also notice you are using volatile.emit, which according to the documentation:
Volatile events
Volatile events are events that will not be sent if the underlying connection is not ready (a bit like UDP, in terms of reliability).
This can be interesting for example if you need to send the position of the characters in an online game (as only the latest values are useful).
socket.volatile.emit("hello", "might or might not be received");
The Socket.IO docs have a similar listener which lets you know when the server is ready.
If you prevent the client from emitting until the server is ready, you can avoid this issue. You also should not need volatile.emit for something that must be delivered.
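For instance, dropping .volatile from the methods in the question would look like this (a sketch reusing the names from your snippet):

sendPrivateMessage(
  toUser: string | Array<string>,
  chatMessage: BE.Entities.TChatMessage,
  sourceUser: BE.Entities.TUser,
) {
  // A plain emit is buffered by Socket.IO instead of being dropped
  // when the underlying connection is not ready.
  this.io?.to(toUser).emit(ECustomEvents.PrivateMessage, chatMessage, sourceUser)
}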
Hello community,
Since this morning I have been wrestling with an idea, not to say a problem: I want to store clients (sockets) in order to re-use them later. I know that is not too clear, so I will explain in detail:
First I have a server (net.Server()) which receives clients (net.Socket()); each client is stored in a Map() with a unique ID, and whenever I want to communicate with one of them I call map.get(id).write(...) or another method.
Everything works fine until I close the server, at which point the sockets are automatically killed. I have found a way to store the client data for my case (Vuex or localStorage). To simplify, what I want is for a client to still be active when I restart the server and invoke it again.
So my main questions:
How can I keep clients active after the server is closed?
How can I store sockets and check whether they are active after restarting the server?
var net = require("net");
var server = new net.Server();
var sockets = new Map();
/**
* This events is called when server is successfully listened.
*/
server.on("listening", () => {
console.log("Server Listen in port 4444");
});
/**
* This events is called when error occur in server
*/
server.on("error", (e) => {
if (e.code === "EADDRINUSE") {
console.log("Address in use, retrying...");
}
});
/**
* This events is called when a new client is connected to server.
*/
server.on("connection", (socket) => {
var alreadyExist = false;
sockets.forEach((soc) => {
if (soc.remoteAddress === socket.remoteAddress) {
alreadyExist = true;
}
});
if (alreadyExist) {
socket.end();
} else {
socket.setKeepAlive(true, Infinity);
socket.setDefaultEncoding("utf8");
socket.id = nanoid(10);
sockets.set(socket.id, socket);
socket.on("error", (e) => {
console.log(e);
if (e.code === "ECONNRESET") {
console.log("Socket end shell with CTRL+C");
console.log("DEL[ERROR]: " + socket.id);
}
});
socket.on("close", () => {
console.log("DEL[CLOSE]: " + socket.id);
});
socket.on("end", () => {
console.log("DEL[END]: " + socket.id);
});
socket.on("timeout", () => {
console.log("timeout !");
});
var child = sockets.get(res.id);
child.write(/* HERE I SEND COMMAND NOT IMPORTANT ! */);
socket.on("data", (data) => {
console.log("Received data from socket " + data);
});
}
});
How can I keep clients still active after the server is closed?
A client can't maintain a connection to a server that is not running. The client is free to do whatever it wants on its own when the server shuts down, but it cannot maintain a connection to that server that is down. The whole definition of a "connection" is between two live endpoints.
How can I store sockets and check whether they are active after restarting the server?
You can't store sockets when the server goes down. A socket is an OS representation of a live connection, TCP state, etc... When the server goes down and then restarts, that previous socket is gone. If it wasn't closed by the server before it shut-down, then it was cleaned up by the OS when the server process closed. It's gone.
I would suggest that you're asking for the wrong thing here. Sockets don't outlive their process and don't stay alive when one end of the connection goes down.
Instead, the usual architecture for this is automatic reconnection. When the client gets notified that the server is no longer there, the client attempts to reconnect on some time interval. When, at some future time, the server starts up again, the client can then connect back to it and re-establish the connection.
If part of your connection initiation is an exchange of some sort of clientID, then a client can reconnect, present its identifier, and the server will know exactly which client it is; things can then continue as before with a new socket connection, but everything else proceeding as if the previous socket connection shutdown never happened. You just rebuild your Map object on the server as the clients reconnect.
For scalability reasons, your clients would generally implement some sort of back-off (often with some random jitter added) so that when your server comes back online, it doesn't get immediately hammered by hundreds of clients all trying to connect at the exact same moment.
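A client-side reconnect loop along those lines might look like this (a sketch with plain net; the host, port, id format, and backoff values are assumptions):

const net = require("net");

const CLIENT_ID = "my-persistent-id"; // would be loaded from disk in a real client
let delay = 1000;                     // initial backoff in ms

function connect() {
  const socket = net.connect({ host: "localhost", port: 4444 }, () => {
    delay = 1000; // reset backoff once connected
    // Re-present the persistent id so the server can rebuild its Map entry.
    socket.write(JSON.stringify({ id: CLIENT_ID }) + "\n");
  });

  socket.on("error", () => {}); // 'close' always follows, so reconnect there

  socket.on("close", () => {
    const jitter = Math.random() * 500;
    setTimeout(connect, delay + jitter);
    delay = Math.min(delay * 2, 30000); // exponential backoff, capped at 30s
  });
}

connect();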
I have a socket server running in Node.js and I use it from an HTML page. This works fine, but sometimes I receive an error in the developer console like
failed: Connection closed before receiving a handshake response. When this happens, my update is not reflected on the user's screen. Whenever changes are made on the admin screen, my Laravel code stores the values in Redis and fires a Laravel event broadcast; in Node.js, socket.io watches for the Redis value change and pushes the values to the user screens.
My Laravel code looks like this.
Laravel controller:
public function updatecommoditygroup(Request $request)
{
    $request_data = array();
    parse_str($request, $request_data);
    app('redis')->set("ssahaitrdcommoditydata", json_encode($request_data['commodity']));
    event(new SSAHAITRDCommodityUpdates($request_data['commodity']));
}
In the controller above, when the API call is received, the values are stored under this Redis key and the event is broadcast.
In my event class,
public $updatedata;

public function __construct($updatedata)
{
    $this->updatedata = $updatedata;
}

public function broadcastOn()
{
    return ['ssahaitrdupdatecommodity'];
}
Finally, I have written my socket.io file as below:
var app = require('express')();
var http = require('http').Server(app);
var io = require('socket.io')(http);
var Redis = require('ioredis');
var redis = new Redis({ port: 6379 });

redis.subscribe('ssahaitrdupdatecommodity', function(err, count) {
});

io.on('connection', function(socket) {
  console.log('A client connected');
});

redis.on('pmessage', function(subscribed, channel, data) {
  data = JSON.parse(data);
  io.emit(channel + ':' + data.event, data.data);
});

redis.on('message', function(channel, message) {
  message = JSON.parse(message);
  io.emit(channel + ':' + message.event, message.data);
});

http.listen(3001, function() {
  console.log('Listening on Port 3001');
});
When I update the data from the admin screen, it is passed to the Laravel controller; the controller stores the received data in Redis and fires the event broadcast. The event broadcast passes the values to the socket server, and the socket server pushes the data to the client page whenever the Redis key changes.
On the client page I have written the code below:
<script src="../assets/js/socket.io.js"></script>

var socket = io('http://ip:3001/');
socket.on("novnathupdatecommodity:App\\Events\\NOVNATHCommodityUpdates", function(data) {
  // received data processing in client
});
Everything works fine most of the time, but sometimes I face an issue like
**VM35846 socket.io.js:7 WebSocket connection to 'ws://host:3001/socket.io/?EIO=3&transport=websocket&sid=p8EsriJGGCemaon3ASuh' failed: Connection closed before receiving a handshake response**
When this happens, the user page does not get updated with the new data. Could anyone please help me solve this issue and suggest the best solution?
I think this is because your socket connection timed out.
new io({
path:,
serveClient:,
origins:,
pingTimeout:,
pingInterval:
});
The above is the socket configuration. If you do not configure the socket, it sometimes behaves strangely. I do not know the core reason, but I too have faced similar issues that adding the socket configuration solved.
Socket.io Server
A similar configuration should be done on the client side; there is a timeout option on the client side.
Socket.io Client
For example, say this is your front-end code. You connect to the socket server using the following command:
io('http://ip:3001', { path: '/demo/socket' });
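If you also want the client-side timeout mentioned above, the options object might look like this (a sketch; the values are assumptions):

var socket = io('http://ip:3001', {
  path: '/demo/socket',
  timeout: 10000,          // ms to wait for the connection before failing
  reconnection: true,
  reconnectionDelay: 1000  // ms before attempting to reconnect
});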
On your server side, when creating the connection:
const io = require("socket.io");
const socket = new io({
path: "/demo/socket",
serveClient: false /*whether to serve the client files (true/false)*/,
orgins: "*" /*Supports cross orgine i.e) it helps to work in different browser*/,
pingTimeout: 6000 /*how many ms the connection needs to be opened before we receive a ping from client i.e) If the client/ front end doesnt send a ping to the server for x amount of ms the connection will be closed in the server end for that specific client*/,
pingInterval: 6000 /* how many ms before sending a new ping packet */
});
socket.listen(http);
Note:
To avoid complications, start your HTTP server first and then start your sockets.
There are other options available, but the above are the most common ones.
I am just describing what I see in the socket.io documentation available on GitHub. Hope this helps.
I'm trying to implement a TCP proxy in Node.js. I only have some experience with JavaScript, so I've run into a lot of problems along the way. I've done a lot of searching for this one but had no luck.
The problem occurs when the browser sends a CONNECT request for HTTPS. My proxy parses the host name and port, and then creates a new socket that connects to the server. If all these steps go well, I start forwarding messages.
Part of my code looks like this:
var net = require('net');

var server = net.createServer(function(clientSock) {
  clientSock.on('data', function(clientData) {
    var host = // get from data
    var port = // get from data
    if (clientData is a CONNECT request) {
      // Create a new socket to the server
      var serverSock = new net.Socket();
      serverSock.connect(port, host, function() {
        serverSock.write(clientData);
        clientSock.write('HTTP/1.1 200 OK\r\n');
      });
      serverSock.on('data', function(serverData) {
        clientSock.write(serverData);
      });
    }
  });
});
The CONNECT request needs both the client socket and the server socket to stay open until one side closes the connection, but the code above doesn't have this behavior: every time I receive some data from the client, I create a new socket to the server and the old one is closed.
Is there a way to store the server socket as a global variable so that the data event handler can reuse it? Or is there any other way to solve this?
Thanks a lot!!!!
You can just move the variable up to a higher scope so it survives across multiple events, and then you can test whether its value is already there:
var net = require('net');

var server = net.createServer(function(clientSock) {
  var serverSock;
  clientSock.on('data', function(clientData) {
    var host = // get from data
    var port = // get from data
    if (clientData is a CONNECT request) {
      // Create a new socket to the server only if we don't already have one
      if (!serverSock) {
        serverSock = new net.Socket();
        serverSock.connect(port, host, function() {
          serverSock.write(clientData);
          clientSock.write('HTTP/1.1 200 OK\r\n');
        });
        serverSock.on('data', function(serverData) {
          clientSock.write(serverData);
        });
      } else {
        serverSock.write(clientData);
      }
    }
  });
});
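As a side note, once the tunnel is established you can let Node handle the bidirectional forwarding; here is a sketch of just the CONNECT branch using pipe() (header parsing still omitted, status line per the usual CONNECT convention):

if (!serverSock) {
  serverSock = net.connect(port, host, function() {
    // Acknowledge the tunnel, then stream both directions verbatim.
    clientSock.write('HTTP/1.1 200 Connection Established\r\n\r\n');
    clientSock.pipe(serverSock);  // client -> target server
    serverSock.pipe(clientSock);  // target server -> client
  });
  serverSock.on('error', function() { clientSock.end(); });
}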
In my normal setup, the client will emit data to my server regardless of whether or not there is another client to receive it. How can I make it so that it only sends packets when the user-count is > 1? I'm using node with socket.io.
To do this you would want to listen to the connection event on your server (as well as disconnect) and maintain a list of connected clients in a 'global' variable. When more than one client is connected, send out a message to all connected clients so they know they can start sending messages, like so:
var app = require('express').createServer(),
    io = require('socket.io').listen(app);

app.listen(80);

//setup express

var clients = [];

io.sockets.on('connection', function (socket) {
  clients.push(socket);
  if (clients.length > 1) {
    io.sockets.emit('start talking');
  }

  socket.on('disconnect', function () {
    var index = clients.indexOf(socket);
    clients = clients.slice(0, index).concat(clients.slice(index + 1));
    if (clients.length <= 1) {
      io.sockets.emit('quiet time');
    }
  });
});
Note: I'm making an assumption here that the socket is passed to the disconnect event; I'm pretty sure it is but haven't had a chance to test.
The disconnect event won't receive the socket passed into it, but because the event handler is registered within the closure scope of the initial connection, you will have access to it.
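For what it's worth, newer Socket.IO versions expose a connection count directly, so the bookkeeping array can be dropped; a sketch assuming Socket.IO 3+/4:

io.on('connection', (socket) => {
  // io.engine.clientsCount reflects all currently connected clients.
  if (io.engine.clientsCount > 1) {
    io.emit('start talking');
  }

  socket.on('disconnect', () => {
    if (io.engine.clientsCount <= 1) {
      io.emit('quiet time');
    }
  });
});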