How to use socket.io with node.js worker_threads - node.js

Currently I am introducing worker_threads to a game I am developing. The idea is to have each game room in a separate thread, so that rooms can be processed in parallel (yes, I know that will only really happen if there aren't more threads than cores).
However, I cannot get socket.io to work with worker_threads. So far I have tried the following:
1. Create the socket.io server inside the worker:
server.js
const { Worker, isMainThread } = require("worker_threads");
const express = require("express");
const app = express();

app.use(express.static(__dirname));

app.post("/newGame", (request, response) => {
  let roomCode = request.query.room_code;
  const worker = new Worker("./server-worker.js", {
    workerData: {
      roomCode: roomCode
    },
  });
});

app.listen(8080, () => {
  console.info("Server started on http://localhost:8080");
});
server-worker.js
const { parentPort, workerData, threadId } = require("worker_threads");
const server = require("http").createServer();
const io = require("socket.io")(server);

server.listen(8080, () => {
  console.info("Server socket.io on port 8080");
});

io.on("connection", (socket) => {
  console.log(`Client with ID ${socket.id} connected to thread with ID ${threadId}`);
});
This results in the browser logging this:
GET http://localhost:8080/socket.io/?EIO=3&transport=polling&t= 404 (Not Found)
Also, even if I got Express to forward the polling request to the worker, I am guessing I would not be able to open more than one room, since the port would already be in use.
2. Create socket.io instance in main thread and pass it on
server.js
const { Worker, isMainThread } = require("worker_threads");
const express = require("express");
const app = express();

app.use(express.static(__dirname));

const server = require("http").createServer(app);
const io = require("socket.io")(server);

app.post("/newGame", (request, response) => {
  let roomCode = request.query.room_code;
  const worker = new Worker("./server-worker.js", {
    workerData: {
      roomCode: roomCode,
      io: io
    },
  });
});
However, this does not work: I get a DataCloneError, because workerData can only contain values supported by the structured clone algorithm, and a live socket.io server instance is not one of them.
Questions:
Is there an easier way to accomplish what I am doing? I.e.: am I just using the wrong frameworks?
Is there a way to get socket.io working with threads? I already saw documentation on using it with Node.js clusters via a Redis instance, but that seems like overkill for the problem at hand, since I do not have multiple processes/nodes.

Node's worker threads are not suitable for your purpose. They are meant for long-running, CPU-bound computations, not for handling network connections.
Also, it probably does not make sense to assign your rooms to particular server resources: Node is good at concurrently handling multiple app contexts, and so is socket.io.
To scale your app across more processor cores, Node offers clustering. To use a cluster with socket.io you will also need to sort out session stickiness.

Related

Implement socket.io in node.js application controller

Good afternoon. I am new to socket programming in Node.js and I need to implement socket.io in a controller of my application. The architecture I have is the following:
The file that starts the server is index.js
const express = require('express');
const app = express();
const port = 3000;
const socketRouter = require('./routes/socket');

app.use(express.json());

// Route
app.use('/socket', socketRouter);

app.listen(port, () => {
  console.log(`Server connection on http://127.0.0.1:${port}`); // Server connected
});
The file where I define the routes is socket.js
const { Router } = require('express');
const { showData } = require('../controllers/socket');
const router = Router();

router.post('/send-notification', showData);

module.exports = router;
And my controller is:
const { response } = require('express');

const showData = (req, res = response) => {
  const notify = { data: req.body };
  //socket.emit('notification', notify); // Updates live notification
  res.send(notify);
}

module.exports = {
  showData
}
I need to implement socket.io in this controller to be able to emit from it, but I can't get it to work. Could you tell me how to do it?
Thanks a lot.
CLARIFICATION: if I implement socket.io in the main file it works, but I want to keep some order and separate things. This is how it works:
const express = require('express');
const app = express();
const port = 3000;

app.use(express.json());

app.post('/send-notification', (req, res) => {
  const notify = { data: req.body };
  socket.emit('notification', notify); // Updates live notification
  res.send(notify);
});

const server = app.listen(port, () => {
  console.log(`Server connection on http://127.0.0.1:${port}`); // Server connected
});

const socket = require('socket.io')(server);
socket.on('connection', socket => {
  console.log('Socket: client connected');
});
Move your socket.io code to its own module where you can export a method that shares the socket.io server instance:
// local socketio.js module
const socketio = require('socket.io');

let io;

module.exports = {
  init: function(server) {
    io = socketio(server);
    return io;
  },
  getIO: function() {
    if (!io) {
      throw new Error("Can't get io instance before calling .init()");
    }
    return io;
  }
};
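The init/getIO guard can be exercised without a real HTTP server by injecting a stand-in factory in place of socketio(server). makeHolder and the fake server object below are illustrative only, not part of the answer's module:

```javascript
// same shape as the socketio.js module above, with the factory injected
function makeHolder(factory) {
  let io;
  return {
    init: function (server) {
      io = factory(server);
      return io;
    },
    getIO: function () {
      if (!io) {
        throw new Error("Can't get io instance before calling .init()");
      }
      return io;
    },
  };
}

// stand-in for socketio(server)
const holder = makeHolder((server) => ({ server }));
```

Because Node caches modules, every require('./socketio.js') sees the same io variable, which is what makes the real pattern work.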
Then, initialize the socketio.js module in your main app file:
const express = require('express');
const app = express();
const port = 3000;

app.use(express.json());

const server = app.listen(port, () => {
  console.log(`Server connection on http://127.0.0.1:${port}`); // Server connected
});

// initialize your local socket.io module
const sio = require('./socketio.js');
sio.init(server);

// now load socket.io dependent routes,
// only after .init() has been called on the socket.io module
const socketRouter = require('./routes/socket');
app.use('/socket', socketRouter);
Then, anywhere you want to access the socket.io server instance, you can
require("./socketio.js") and use the .getIO() method to get the socket.io instance:
// use the correct path to socketio.js depending upon where this module
// is located in the file system
const io = require("../../socketio.js").getIO();

// some Express route in a controller
const showData = (req, res) => {
  const notify = { data: req.body };
  // send notification to all connected clients
  io.emit('notification', notify);
  res.send(notify);
};

module.exports = {
  showData
};
Note: A typical socket.io usage convention on the server is to use io as the server instance and socket as an individual client connection socket instance. Please don't try to use socket for both. This makes it clear that io.emit(...) is attempting to send to all connected clients and socket.emit() is attempting to send to a single connected client.
Also note that if your route is triggered by a form post where the browser itself submits the form, that particular client will not receive the results of the io.emit(...) done in the route handler: the browser will be loading a new page based on the form post's response, destroying its current socket.io connection in the process. If the form post is done entirely via JavaScript using an Ajax call, the page stays active and will receive the results of the io.emit(...).
You can use the same socket and app (if you need to expose APIs as well) in other files if you want to separate socket messages and REST endpoints by functionality or however you choose to organize it. Here's an example of how this can be done:
Create a new file, let's say controller1.js:
function initialize(socket, app) {
  socket.on('some-socket-message', data => {
    // whatever you want to do with the message data
  });
  app.get('/some-endpoint', (req, res) => {
    // whatever you want to do
  });
}

module.exports = { initialize };
And then add the following to your controller.js
const controller1 = require('path/to/controller1');
...
// At some point after socket and app have been defined
controller1.initialize(socket, app);
This can be the basis for separating your controllers however you want, while still using the same socket connection and API port in all of them. You can also refactor the initialize method into several methods, but that is at your own discretion. It also does not need to be called initialize; that was just my name of preference.

No response from Socket.io server when client is making a connection

I have been fighting with setting this up for longer than I would like to admit.
At first I was having CORS issues; after following the socket.io documentation and other Stack Overflow threads, I was instead getting hit with GET / POST 400 errors.
Finally, after that, I noticed a few threads mention passing {transports: ['websocket']} on the server and in the client.
Once I did that, I stopped getting error messages, however, I am still not able to make a connection from my client to my socket.io server. I am hoping I can get some guidance.
I am running Socket.io 3.0 and express 4+
Here is what my server / client looks like at the moment..
SERVER (As an express router)
const express = require('express');
const socketIO = require("socket.io");
const http = require('http');

let app = express();
let router = express.Router();
let server = http.createServer(app);

// The event will be called when a client is connected.
let io = socketIO(server, { transports: ['websocket'] });

io.on("connection", socket => {
  console.log("connection");
  socket.emit("hello", { data: "more data" });
  socket.on("disconnect", () => {
    console.log("user left");
  });
});

server.listen(8081, () => console.log('Socket.io listening on *:8081'));

module.exports = router;
Client (React)
// Socket.IO
import io from 'socket.io-client';
const socket_io = io('localhost:8081', { transports: ['websocket'] });

// Socket.io useEffect
useEffect(() => {
  const initSocket = () => {
    console.log(socket_io);
    socket_io.on("hello", data => {
      setSocket(data);
      console.log(data);
    });
    // CLEAN UP THE EFFECT
    return () => socket_io.disconnect();
  };
  initSocket();
}, []);
Here is what my Console currently looks like when I log out the socket connection:
So, as embarrassing as this is, the breaking change was that the socket.io-client module in the React client application wasn't 3.0 like the one on the server, so the two could not complete the handshake.
My advice: even if you have the CORS rule added or transports: ['websocket'] set, check the package.json files in your server and client apps to make sure the socket.io package versions match.
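A quick way to compare the two majors programmatically. The version strings below are made up; in a real project you would read them from each side, e.g. require('socket.io/package.json').version on the server and require('socket.io-client/package.json').version in the client:

```javascript
// extract the major version from a semver string
const major = (v) => parseInt(v.split(".")[0], 10);

// hypothetical installed versions
const serverVersion = "3.0.4"; // socket.io on the server
const clientVersion = "2.3.0"; // socket.io-client in the React app

if (major(serverVersion) !== major(clientVersion)) {
  console.log("Mismatched socket.io majors: a v2 client cannot handshake with a v3 server by default");
}
```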

Should I use a global variable to share socket.io instance across entire server

The following is my server.js file in my node.js application. I want my socket.io instance to be accessed by other files on my server in order to emit events from my API (listingRoutesApi, userRoutesApi etc.) (refer to code).
The problem I have is that my routes are declared before the server is created; however, the socket.io instance is created after the server is created.
The solution I've used is to declare a global io variable that will allow me to emit events from anywhere within my web app like so:
global.io.of('/analytics').to(listing._id).emit('message', "There was a post.");
My question is: are there any pitfalls / dangers from doing this and will I encounter any scalability issues in the long-term? Additionally, is there a better way of achieving my objective?
Code within my server.js file:
const app = express();

app.use('/api', listingRoutesApi);
app.use('/api', userRoutesApi);
app.use('/api', imageRoutesApi);
// ...plenty more endpoints here...

app.use(serveStatic(path.join(__dirname, "/dist")));
app.use(history());
app.use(serveStatic(path.join(__dirname, "/dist")));

const server = app.listen(port, () => { console.log('server started ' + port); });

/* Start socket. */
global.io = socketio(server);
const analytics = global.io.of("/analytics");

analytics.on('connection', (socket) => {
  socket.on('join', (data) => {
    socket.join(data.room);
    analytics.in(data.room).emit('message', `New user joined ${data.room}`);
  });
  socket.on('leave', (data) => {
    analytics.in(data.room).emit('message', `User leaving ${data.room}`);
    socket.leave(data.room);
  });
  socket.on('disconnect', () => {
    console.log('user disconnected');
  });
});
I'm asking this question because this SO post answers a similar one by declaring a getIOInstance function and passing it to all modules that require it. While that works, it doesn't feel very elegant and seems a little unnecessary, given that I expect to only ever have exactly one socket.io instance in my application.
Additionally, I would assume the challenge I'm having is a very common one; however, I haven't been able to find many solutions to address it and none that suggest using a global variable.
Node.js is a modular environment. Modules are supposed to address some of the flaws that globals have.
Modules naturally provide singleton instances, in case there's a need to have only one instance:
app.js
module.exports = express();
server.js
const app = require('./app');
// app.use() calls can go in app.js if preferred, but an unlistened app is needed for reusability and testing
app.use(/* router */);
module.exports = app.listen(...);
socketio.js
const server = require('./server');
module.exports = socketio(server);
index.js
const app = require('./app');
const io = require('./socketio');
...
Express also provides a container for application-global dependencies: the application settings table. It can be used everywhere the application instance is available, e.g. as req.app.get(...) inside middleware. Accessing the Express application instance outside middleware won't be a problem if it's a singleton as well:
app.js
module.exports = express();
index.js
const app = require('./app');
app.use(/* router */);
...
const server = app.listen(...);
const io = socketio(server);
app.set('io', io);
// available as req.app.get('io') inside middlewares
// and as require('./app').get('io') outside them
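Under the hood, app.set/app.get is just a per-application key/value table, which a minimal stand-in makes clear (plain objects here instead of a real Express app or socket.io server):

```javascript
// minimal stand-in for Express's settings table
function makeApp() {
  const settings = new Map();
  return {
    set: (key, value) => settings.set(key, value),
    get: (key) => settings.get(key),
  };
}

const app = makeApp();
const io = { emit: () => {} }; // stand-in for the socket.io instance

app.set("io", io);
// inside a middleware this would be req.app.get("io")
const fetched = app.get("io");
```

Every module that can reach the app instance gets the same io reference back, without touching globals.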

Multiple socket.io instances at different paths

I'm making a REST API that works with routes and actions like /api/route/action, but I want to add WebSocket functionality. So I want WebSockets to also be addressable by URL.
I have this code:
const socketio = require('socket.io');

// server is an http.createServer() instance
module.exports = server => {
  const io = socketio(server, { route: '/socketapi/test' });

  io.on('connection', s => {
    s.on('a', () => s.emit('b'));
    s.emit('message', 'You connected to /test.');
  });

  const io2 = socketio(server, { route: '/socketapi/something_else' });

  io2.on('connection', s => {
    s.on('z', () => s.emit('y'));
    s.emit('message', 'Hi');
  });
};
The reason I want to split them is so I don't have to keep track of event names I've already used, and so I can separate the logic in the connection event.
But it seems this is not possible: if I have two socket.io instances running, I can't connect to either.
Is this possible or will I have to use some tricks and perhaps an event that the client can send to let me know what it wants to subscribe to?
You can use a built-in feature of socket.io called namespaces to achieve this behaviour.
Here is a basic example:
Server side:
const nsp = io.of('/my-namespace');

nsp.on('connection', function(socket){
  console.log('someone connected');
});

nsp.emit('hi', 'everyone!');
Client side:
const socket = io('/my-namespace');
Now the client can emit and receive messages that are specific to a namespace. With namespaces, your problem of name conflicts between events is solved.

Share express port over processes in node.js

So I switched from child_process.fork to cluster.fork to spawn the chatbots that I run, since cluster workers can share TCP ports. However, I can't seem to get the workers to listen on the same port.
code:
var cluster = require('cluster');
var express = require('express');

if (cluster.isMaster) {
  cluster.fork({path:'hello'});
  cluster.fork({path:'goodbye'});
} else {
  var web = express();
  web.get("/" + process.env.path, function (req, res) {
    return res.end("hello, " + process.env.path);
  });
  web.listen(3000);
}
This is half working. I get no EADDRINUSE errors now, but only one of the paths is showing up.
It's not working for lots of reasons:
express.express doesn't exist. You are looking for the plain express() method.
process.evn doesn't exist. You are looking for process.env.
You are not returning anything to the client in your route. You should use res.end or another method to respond to the client. Check the http module documentation or the Express one.
The workers can share TCP connections. The master cannot.
Some code that works:
var cluster = require('cluster');
var express = require('express');

if (cluster.isMaster) {
  cluster.fork({path:'hello'});
} else {
  // this is in a required file
  var web = express();
  web.get("/" + process.env.path, function(req, res){
    res.end("hello world!");
  });
  web.listen(3000);
}
If you want to use more than one worker, just fork more inside the if (cluster.isMaster) branch.