Hello, I have a simple example with sockets:
const express = require("express");
const app = express();
const server = require("http").createServer(app);
const io = require("socket.io")(server, {
  cors: {
    origin: "http://localhost:3000",
    methods: ["GET", "POST"],
    credentials: true
  }
});
let userCount = 0;
io.on('connection', function (socket) {
  userCount++;
  io.emit('userCount', { userCount: userCount });
  socket.on('disconnect', function () {
    userCount--;
    io.emit('userCount', { userCount: userCount });
  });
});
and the frontend:
const [userCount, setuserCount] = useState(0);
socket.on('userCount', function (data) {
  setuserCount(data.userCount);
});
I don't understand why, but it fires so many requests. Is this the proper way to work with sockets?
The issue seems to be with your frontend. The following code runs again every time your component renders:
socket.on('userCount', function (data) {
  setuserCount(data.userCount);
});
This means that you're adding multiple listener functions for the single userCount event. To fix this, use React's useEffect() hook to run the subscription code once, when your component mounts:
import React, { useState, useEffect } from "react";

// ...

// Inside your component:
const [userCount, setuserCount] = useState(0);

useEffect(() => {
  const listener = data => setuserCount(data.userCount);
  socket.on('userCount', listener);
  return () => socket.off('userCount', listener);
}, [setuserCount]);
This way your listener is added only once, when your component mounts, rather than on every render. The cleanup function returned from the useEffect hook also removes the listener when the component unmounts (thanks @Codebling for this suggestion). Your socket.on callback will still execute multiple times, since socket.io calls it whenever the event occurs; the point is that only one copy of it is registered.
I have found similar code here (http://sahatyalkabov.com/jsrecipes/#!/backend/who-is-online-with-socketio) and yes, this is the correct way to use sockets. It fires so many requests because a message is emitted every time a user connects and every time a user disconnects. This includes reloads: a reload fires twice, once for leaving the site and once for coming back onto it.
I have a Next.js project that has the simplest Socket.IO implementation set up. Below is the code.
// pages/index.tsx
import { useEffect } from "react";
import { io, Socket } from "socket.io-client";
import type { NextPage } from "next";

let socket: Socket;

const Home: NextPage = () => {
  useEffect(() => {
    async function socketInit() {
      // start the server
      await fetch("http://localhost:3000/api/socket");
      // connect to the socket
      socket = io();
      socket.on("connect", () => {
        console.log("hello");
      });
    }
    socketInit();
  }, []);

  return (
    <button
      onClick={() => {
        socket.emit("test");
      }}
    >
      hello
    </button>
  );
};
// pages/api/socket.ts
import type { NextApiRequest } from "next";
import { Server as IOServer } from "socket.io";

export default function handler(
  req: NextApiRequest,
  res: Res
) {
  if (res.socket.server.io) {
    res.end();
    return;
  }
  const io = new IOServer(res.socket.server);
  res.socket.server.io = io;
  io.on('connection', socket => {
    socket.on('test', () => {
      console.log("1"); // Changing it to "2" does nothing until the dev server is restarted.
    });
  });
  res.end();
}
For some reason, the listener on the server does not update after a hot reload; restarting the dev server is the only way to pick up the change. Why is that?
I think there are 2 issues here:
1. The response object that you instantiate the IOServer on is not recreated after an HMR cycle; it still refers to the old callback function that prints 1, which is kept alive in memory somewhere.
2. To fix this, you need to actively call the handler, unsubscribe the old callback function, and resubscribe the new (replaced) one. Just interacting with the socket is not enough. Unfortunately, all the tutorials I have seen call this handler socket, which is misleading; it should be called setup-socket-handler instead. What it actually does is retrieve the underlying server from the response object and register a Socket.IO server with it, which in turn registers a new handler/endpoint /socket.io that is used for the communication between client and server.
Here is what I came up with. This should not be used as is in production (make sure the replacement happens only once in production, as it did in the original):
const SocketHandler = (
  req: NextApiRequest,
  res: NextApiResponseWithSocket
): void => {
  if (res.socket.server.io != null) {
    logger.info("Socket is already running");
    res.socket.server.io.removeAllListeners("connection");
    res.socket.server.io.on("connection", onConnection);
  } else {
    logger.info("Socket is initializing");
    const io = new Server<ClientToServerEvents, ServerToClientEvents>(
      res.socket.server
    );
    io.engine.on("connection_error", (err: unknown) => {
      logger.error(`Connection error: ${err}`);
    });
    res.socket.server.io = io;
    io.on("connection", onConnection);
  }
  res.end();
};
After changing the callback function and letting Next.js do its HMR, the handler must be called once, as described in point 2. I do this by reloading my frontend page, which sends a request to the socket handler.
I'm trying to figure out how to make two separate websockets work together; I'm not sure if this is possible or not. I have a socket.io connection that works on its own from a node.js file to Angular, and another node.js file using the Kraken (crypto exchange) websocket that also works on its own. I'm trying to consolidate them so that whenever an onChange event comes from the Kraken websocket, I can relay that data to Angular through the socket.io connection. I'm trying to do something like this:
const webSocketClient = new WebSocket(connectionURL);

webSocketClient.on("open", function open() {
  webSocketClient.send(webSocketSubscription);
});

webSocketClient.on("message", function incoming(wsMsg) {
  const data = JSON.parse(wsMsg);
  let io = require("socket.io")(server, {
    cors: {
      origin: "http://localhost:4200",
      methods: ["GET", "POST"],
      allowedHeaders: ["*"],
      credentials: true,
    },
  });
  io.on("connection", (socket) => {
    const changes = parseTrades(data);
    socketIo.sockets.emit(connection.change, changes);
    // Log whenever a user connects
    console.log("user connected");
    socket.emit("test event", JSON.stringify(changes));
  });
  console.log("DATA HERE", data[0]);
});

webSocketClient.on("close", function close() {
  console.log("kraken websocket closed");
});
Doing this doesn't relay the data to the frontend, though, and it gives me a memory leak. Is there some way I can accomplish this?
I would probably split up the task a little bit: have a service for the Kraken websocket and a service for your own socket, then have them communicate via observables, which you can also tap into from the front end to display the data you want.
@Injectable()
export class KrakenService {
  private webSocketClient: WebSocket | null = null;
  private messages$ = new BehaviorSubject<any>(null); // give it a proper type

  openConnection() {
    // is connectionURL from environment ??
    this.webSocketClient = new WebSocket(connectionURL);

    // Arrow functions keep `this` bound to the service; the browser
    // WebSocket API uses onopen/onmessage/onclose rather than .on().
    this.webSocketClient.onopen = () => {
      // is webSocketSubscription from environment ??
      this.webSocketClient?.send(webSocketSubscription);
    };

    this.webSocketClient.onmessage = (event: MessageEvent) => {
      const data = JSON.parse(event.data);
      this.messages$.next(data);
      console.log("DATA HERE", data[0]);
    };

    this.webSocketClient.onclose = () => {
      console.log("kraken websocket closed");
      this.webSocketClient = null;
    };
  }

  getKrakenMessages() {
    if (this.webSocketClient == null) this.openConnection();
    return this.messages$.asObservable();
  }
}
So now, whenever you want to read the websocket messages or feed them to the other socket, you just subscribe to krakenService.getKrakenMessages(). Then you can do something similar with your local socket service as well: one method that opens the connection, another that emits messages. In the example you showed, a new connection would be opened every time you got a message, so keep those concerns separated and reuse the existing connection.
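Stripped of Angular and RxJS, the relay pattern is just a subject that the Kraken handler pushes into and the Socket.IO side subscribes to once. A dependency-free sketch of that wiring (SimpleSubject is a hypothetical stand-in for RxJS's BehaviorSubject):

```javascript
// Minimal BehaviorSubject-like class: remembers the last value and
// replays it to new subscribers.
class SimpleSubject {
  constructor(initial) { this.value = initial; this.subs = []; }
  next(v) { this.value = v; this.subs.forEach((fn) => fn(v)); }
  subscribe(fn) { this.subs.push(fn); fn(this.value); }
}

const krakenMessages = new SimpleSubject(null);

// The local socket service subscribes ONCE, instead of creating a new
// Socket.IO server per incoming Kraken message as in the question.
const relayed = [];
krakenMessages.subscribe((msg) => { if (msg) relayed.push(msg); });

// Each Kraken "message" event just pushes into the subject:
krakenMessages.next({ price: 100 });
krakenMessages.next({ price: 101 });

console.log(relayed); // [{ price: 100 }, { price: 101 }]
```

The single long-lived subscription is what replaces the per-message `require("socket.io")(server, ...)` call, which is where the memory leak came from.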
I'm making a tweet deleter, and I want to update the user on the progress.
I'm new to socket.io, but I managed to connect the React frontend to the Node.js/Express backend.
io.on("connection", (socket) => {
  console.log("new connection");
  socket.on("disconnect", () => console.log("disconnected"));
});
When a user clicks the delete button, a delete request goes to the backend, the file containing the tweets is then processed and thanks to Bull, each tweet is queued as a job.
Because I passed io to my routes, I can use it inside them, but io.emit() emits to all connected clients. I only want to emit to the sender, by using socket.emit() inside my routes as well as inside my jobs in the queue.
The approach I tried was to write a function inside io.on("connection") like this, and to make it global:
io.on("connection", (socket) => {
  console.log("new connection");
  socket.on("disconnect", () => console.log("disconnected"));

  global.emitCustom = function (event, payload) {
    socket.emit(event, payload);
  };
});
which allowed me to use it in the queue's process function:
const deletionProcess = async ({
  data: { tweetId, tokens, twitterId, numberOfTweets, index },
}) => {
  emitCustom("deleting", {
    type: "deleting",
    progress: Math.round(((index + 1) / numberOfTweets) * 100),
  });
};
Is there a better way to do this? is my approach wrong or does it present any flaws?
It worked in the few tests I did.
In my case, Socket.IO behaves very unstably, and I need to make it more stable and easier to debug.
Introduction
There is a Node.js-based backend (Strapi CMS) where, in bootstrap.js, I can run my own custom code, in my case the socket server:
module.exports = async () => {
  process.nextTick(() => {
    var io = require('socket.io')(strapi.server);

    io.on('connection', async function (socket) {
      console.log(`a user connected`);

      // send a message on user connection
      socket.emit('question', { message: await strapi.services.question.findOne({ id: 1 }) });

      // send a message after a timeout of 4 seconds
      setTimeout(function () {
        socket.emit('endEvent', { description: 'A custom event named EndEvent!' });
      }, 4000);

      // listen for user disconnect
      socket.on('disconnect', () => {
        console.log('a user disconnected');
      });
    });

    strapi.io = io; // register socket.io inside the strapi main object to use it globally anywhere
  });
};
This server is expected to send a question from the database to the socket via the custom event question. The frontend runs on ReactJS. Whenever the page is loaded, a socket is created; it establishes a connection with the server, receives the question event with the quiz question's data, and prints it as an object.
import React, { useState, useEffect } from "react";
import io from "socket.io-client";

const ENDPOINT = "http://localhost:1337";
const socket = io(ENDPOINT);

function Question() {
  const [questionID, setQuestionID] = useState(0);

  useEffect(() => {
    socket.on('message', (data) => { setResponse(data); });
    socket.on('endEvent', () => { console.log('End event done.'); });
    socket.on('question', (msg, cb) => {
      console.log(msg.message);
    });
    // CLEAN UP THE EFFECT
    return () => socket.disconnect();
  }, [questionID]);

  return (<p>The question is debuggable from the console</p>);
}

export default Question;
Problem.
When I refresh the page, the console.log(msg.message) does print on time, but not every time. Sometimes it only prints when some other React component is triggered. For detailed info please see the screenshots. This is expected:
It prints the question object and also End event done., which the server triggers 4 seconds after the connection. The instability shows itself when both of these run twice, or when the question object is delayed in appearing.
I am not sure whether the question is not being sent from the server or simply not printed in the console on the client side. The useEffect() runs on first load, then again each time questionID changes.
Unexplained behavior
When the countdown timer component signals that the time has expired, the two key actions (printing the question and the 4-second endEvent) somehow fire, which I cannot explain: two supposedly independent React components should not affect each other like this.
On the server side, I receive "connected" and "disconnected" messages when refreshing. Please comment if more information is needed, or if you have a suggestion for a better title.
I'm trying to set up a realtime application using socket.io in Angular and node.js, which is not working as intended.
Whenever a client is making a new post, the other clients won't update until you interact with the client (e.g. clicking somewhere on the page, or clicking on the browsers tab).
However, with the browser console open, I can see the new post when I log the posts/objects, without needing to interact with the clients.
Angular:
import io from 'socket.io-client';

const socket = io('http://localhost:3000');

posts: Post[] = [];
...
// Inside ngOnInit:
socket.on('data123', (res) => {
  console.log('Updating list..', res);
  this.postService.getPosts();
  this.postsSub = this.postService.getPostUpdateListener()
    .subscribe((posts: Post[]) => {
      this.posts = posts;
    });
});
Displaying in the template:
<... *ngFor="let item of posts">
Inside PostsService:
getPosts() {
  this.http.get<{ message: string, posts: Post[] }>('http://localhost:3000/api/posts')
    .subscribe((postData) => {
      this.posts = postData.posts;
      this.postsUpdate.next([...this.posts]);
    });
}
Node.js - this socket.io solution is not yet sending the actual list:
const io = socket(server);

io.sockets.on('connection', (socket) => {
  console.log(`new connection id: ${socket.id}`);
  sendData(socket);
});

function sendData(socket) {
  socket.emit('data123', 'TODO: send the actual updated list');
  setTimeout(() => {
    console.log('sending to client');
    sendData(socket);
  }, 3000);
}
What worked as intended:
Using setInterval instead of socket.on(...) on the front end gave the intended result: the clients update automatically without any interaction. I'm fully aware this solution is horrible, but I assume it pinpoints that something is wrong with the socket handling in the Angular code above.
Wait, every time socket.on('data123', (res) => { ... fires, you are creating a new subscription? That's the wrong way to do it. You should create the subscription once, in the code that sets up your socket connection, not inside the message handler.