Should I create multiple WebSocket server instances or use noServer mode when I want to separate WebSocket connections? - node.js

I have a Node.js Express backend. I need one WebSocket for real-time chat and another for a multiplayer Go game. I use the https://github.com/websockets/ws library.
There are two solutions:
1. Create multiple WebSocket server instances by passing different ports, for example:
let chatServer = new WebSocketServer({ port: 8081 });
let goServer = new WebSocketServer({ port: 8082 });
From the front end, each page's useEffect connects to a different WebSocket server.
2. Multiple WebSocket servers sharing a single HTTP/S server, as the ws documentation suggests:
const { createServer } = require('http');
const { WebSocketServer } = require('ws');

const server = createServer();
const wss1 = new WebSocketServer({ noServer: true });
const wss2 = new WebSocketServer({ noServer: true });

wss1.on('connection', function connection(ws) {
  // ... handle chat connections
});

wss2.on('connection', function connection(ws) {
  // ... handle Go game connections
});
From the front end, connect to a different server by pathname.
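The ws documentation pairs this with a manual 'upgrade' handler on the HTTP server; a minimal sketch continuing the snippet above (the /chat and /go paths and port 8080 are placeholders I chose) looks like:
server.on('upgrade', (request, socket, head) => {
  const { pathname } = new URL(request.url, `http://${request.headers.host}`);

  if (pathname === '/chat') {
    wss1.handleUpgrade(request, socket, head, (ws) => {
      wss1.emit('connection', ws, request);
    });
  } else if (pathname === '/go') {
    wss2.handleUpgrade(request, socket, head, (ws) => {
      wss2.emit('connection', ws, request);
    });
  } else {
    // unknown path: reject the upgrade
    socket.destroy();
  }
});

server.listen(8080);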
My question is: which one should I choose, and what is the difference? I really want to know what "noServer" mode means. Thanks.

Related

Websocket + Redis: multiple channels, specific subscriptions/publishing

I'm new to websockets, and am wondering how best to go about this.
My scenario: I have a server that handles different classes of users. For this example, let's say the classes are "mice", "cats", and "dogs".
Each of those classes should have its own channel to listen to for changes, e.g. "mice-feed", "cat-feed", and "dog-feed".
My question is: after the server authenticates and determines the class of the current user, what's the best way to subscribe them to a specific channel, or channels, so that when I broadcast messages to said channel(s), I can make sure that only members of particular classes get them (as opposed to everyone currently connected to that server)?
My current code setup looks like this:
var ws = require('ws');
var redis = require('redis');
/* LOCATION 1 */
// prep redis, for websocket channels
var pub = redis.createClient();
var sub = redis.createClient();
// subscribe to our channels
sub.subscribe('mice-feed');
sub.subscribe('cat-feed');
sub.subscribe('dog-feed');
// declare the server
const wsServer = new ws.Server({
  noServer: true,
  path: "/",
});
/* ... removing some code for brevity... */
wsServer.on("connection", function connection(websocketConnection, connectionRequest) {
  /* LOCATION 2 */
});
Do I put the redis declarations in LOCATION 1 (where it currently is), or in LOCATION 2 (when a successful connection is established)? Or neither of the above?
(Also: I know it's possible to do this on the websocket end directly, i.e. iterate through every client and ws.send if some criterion is matched, but iteration can become costly, and I'm wondering if I can do it as a Redis-channel-wide operation instead.)
If I were building this, my first approach would be this:
// connect to Redis (node-redis v4)
const { createClient } = require('redis');
const client = createClient();
client.on('error', (err) => console.log('Redis Client Error', err));
await client.connect();

// declare the server
const wsServer = new ws.Server(...elided...);

// handle connection
wsServer.on('connection', async (websocketConnection, connectionRequest) => {
  // each connection gets its own subscriber, duplicated from the main client
  const sub = client.duplicate();
  await sub.connect();

  // figure out the feed
  const feed = 'animal-feed';
  await sub.subscribe(feed, message => {
    ...do stuff...
  });
});
It's pretty straightforward but would result in every user having a dedicated connection to Redis. That may or may not matter depending on how many users you anticipate having.
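If that connection count becomes a concern, a rough alternative sketch (my assumption, reusing the node-redis v4 client from above and running in the same async context) is one shared subscriber per feed that fans messages out to the WebSocket clients registered for it:
// one shared subscriber connection for all feeds
const sub = client.duplicate();
await sub.connect();

// feed name -> set of WebSocket connections interested in it
const listeners = new Map();

for (const feed of ['mice-feed', 'cat-feed', 'dog-feed']) {
  listeners.set(feed, new Set());
  await sub.subscribe(feed, (message) => {
    for (const socket of listeners.get(feed)) socket.send(message);
  });
}

wsServer.on('connection', (websocketConnection, connectionRequest) => {
  const feed = 'mice-feed'; // placeholder: derive the feed from your auth logic
  listeners.get(feed).add(websocketConnection);
  websocketConnection.on('close', () => listeners.get(feed).delete(websocketConnection));
});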

WebSocket: Handle multiple connections for separate client apps

I would like to know how to best create multiple WebSocket connections for separate client applications. For example let's say I have one client app called 'racing game' and another app called 'football game', and these two games require entirely different JSON data. Would it be best to create new WebSocket instances for each game:
const SocketServer = require('ws').Server;
new SocketServer({ server, path: '/racing' });
new SocketServer({ server, path: '/football' });
Or would this create unnecessary overhead? Should I just create one WebSocket instance and handle messages from separate client apps in the 'message' callback?
ws.on('message', function incoming(message) {
  let messageJSON = JSON.parse(message);
  if (messageJSON.game == 'racing') {
    // run this code
  } else if (messageJSON.game == 'football') {
    // run that code
  }
});
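Under the first approach, each client app would simply open its socket against its own path, something like (host and port are illustrative):
// racing game client
const racingSocket = new WebSocket('ws://localhost:8080/racing');

// football game client
const footballSocket = new WebSocket('ws://localhost:8080/football');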

How do I send a message to a specific user with the ws library?

I'm exploring different websocket libraries for self-learning, and I found that this library, ws-node, is really amazing. I'm building a basic 1-on-1 chat with the ws-node library.
My question is: what is the equivalent in ws of the socket.io function socket.to().emit()? I want to send a message to a specific user.
Frontend - socket.io
socket.emit("message", { message: "my name is dragon", userID: "123"});
Serverside - socket.io
// listening on Message sent by users
socket.on("message", (data) => {
  // Send to a specific user for 1 on 1 chat
  socket.to(data.userID).emit(data.message);
});
WS - backend
const express = require('express');
const http = require('http');
const WebSocket = require('ws');

const app = express();
const server = http.createServer(app);
const wss = new WebSocket.Server({ server });

wss.on('connection', (ws) => {
  ws.on('message', (data) => {
    // I can't give it an extra parameter to listen on from the client side, and how do I send to a specific user?
    ws.send(`Hello, you sent -> ${data.message}`);
  });
});
Honestly, the best approach is to abstract away the WebSocket using a pub/sub service.
The issue with client<=(server)=>client communication using WebSockets is that client connections are specific to the process (and machine) that "owns" the connection.
The moment your application expands beyond a single process (i.e., due to horizontal scaling requirements), the WebSocket "collection" becomes irrelevant at best. The array / dictionary in which you stored all your WebSocket connections now only stores some of the connections.
The correct approach would be to use pub/sub, perhaps using something similar to Redis.
This allows every User to "subscribe" to a private "channel" (or "subject"). Users can subscribe to more than one "channel" (for example, a global notification channel).
To send a private message, another user "publishes" to that private "channel" - and that's it.
The pub/sub service routes the messages from the "channels" to the correct subscribers - even if they don't share the same process or the same machine.
This allows a client connected to your server in Germany to send a private message to a client connected to your server in Oregon (USA) without anyone being worried about the identity of the server / process that "owns" the connection.
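As a rough sketch of that idea with ws and node-redis v4 (the user:<id> channel naming and the getUserId helper are placeholders, not a drop-in implementation):
const { WebSocketServer } = require('ws');
const { createClient } = require('redis');

async function main() {
  // separate connections: a subscribed Redis client can't publish
  const pub = createClient();
  const sub = createClient();
  await pub.connect();
  await sub.connect();

  const wss = new WebSocketServer({ port: 8080 });

  wss.on('connection', async (ws, req) => {
    const userId = getUserId(req); // placeholder: however you authenticate the socket

    // every user listens on a private channel
    await sub.subscribe(`user:${userId}`, (message) => ws.send(message));

    // sending a private message is just a publish to the recipient's channel
    ws.on('message', (raw) => {
      const { to, text } = JSON.parse(raw);
      pub.publish(`user:${to}`, text);
    });

    ws.on('close', () => sub.unsubscribe(`user:${userId}`));
  });
}

main();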
There isn't an equivalent method. socket.io comes with a lot of helpers and functionality that make your life easier, such as rooms, events...
socket.io is a realtime application framework, while ws is just a bare WebSocket implementation.
You will need to make your custom wrapper:
const sockets = {};

function to(user, data) {
  if (sockets[user] && sockets[user].readyState === WebSocket.OPEN) {
    sockets[user].send(data);
  }
}

wss.on('connection', (ws) => {
  const userId = getUserIdSomehow(ws);
  sockets[userId] = ws;

  ws.on('message', function incoming(message) {
    // Or get the user in here
  });

  ws.on('close', function incoming(message) {
    delete sockets[userId];
  });
});
And then use it like this:
to('userId', 'some data');
In my opinion, if you need that functionality, you should use socket.io. It's easy to integrate, has a lot of support, and has client libraries for multiple languages.
If your front end uses socket.io, you must use it on the server too.

How to connect multiple sockets to sails in test

I have a messaging controller set up in sails.js and want to test it with multiple clients to see if the pub/sub code works. I set up a test file:
var socketIOClient = require('socket.io-client');
var sailsIOClient = require('sails.io.js');
var socket1 = socketIOClient;
var client1 = sailsIOClient(socket1);
var socket2 = socketIOClient;
var client2 = sailsIOClient(socket2);
var socket3 = socketIOClient('http://localhost:1337', {'force new connection': true});
var client3 = sailsIOClient(socket2);
...
client1.socket.get... works and says it is subscribed.
client1.socket.post... works and posts a message to the DB.
So I want to test that a client can receive the notification when a new message is posted. However, when I post from either client1 or client2, it posts from both. Essentially, they are linked to the same socket object or something like that, but I don't know where. So I want to connect multiple sockets, and I've tried variations like socket3 and client3, but get the following problem:
client3.socket.get... and client3.socket.post... and other variations (forceNew, multiplexing, etc.) each hang up and don't resolve.
Example of hang up:
sails.log('posting...');
client3.socket.post('/v1.0/messaging', data, function(body, JWR) {
  sails.log('posted');
  done();
});
Only 'posting...' is logged this way, but 'posted' is logged when using client1 or client2.
My question:
How can I connect multiple clients to my sails api to test if my pub/sub controller works?
I can't test it right now, but you could try this:
var socketIOClient = require('socket.io-client');
var sailsIOClient = require('sails.io.js');

// Instantiate the socket client (`io`)
var io = sailsIOClient(socketIOClient);

// Prevent the socket from connecting with its default origin
io.sails.autoConnect = false;

// Ask the client to create two socket connections
var socket1 = io.sails.connect('http://localhost:1337');
var socket2 = io.sails.connect('http://localhost:1337');

// Test it
socket1.get(url, data, cb);
socket1.post(url, data, cb);
socket2.get(url, data, cb);
socket2.post(url, data, cb);

// If you want to configure and use the "eager" instance
io.sails.url = 'http://localhost:1337';
io.socket.get(url, data, cb);
This way, you would create several SailsSocket instances instead of using the "eager" instance.
When you use sails.io.js in a browser, io.socket contains the socket instance (called "eager instance" in the comments) which will automatically try to connect using the host that the js file was served from. io.sails.connect() allows you to create other instances.
The correct syntax for the current version of socket.io would be:
// first socket
var socket1 = socketIOClient('http://localhost:1337', { forceNew: true });
// second socket
var socket2 = socketIOClient('http://localhost:1337', { forceNew: true });
See the socket.io docs: http://socket.io/blog/socket-io-1-2-0/#

Socket.io with multiple Node.js hosts, emit to all clients

I am new to Socket.io and trying to get my head around the best approach to solve this issue.
We have four instances of a Node.js app running behind a load balancer.
What I am trying to achieve is for another app to POST some data to the load balancer URL which will hand if off to one of the instances.
The receiving instance will store the data, then use Socket.io to emit the data to the connected clients.
The issue is that a browser/client can only be connected to a single instance at a time.
I am trying to determine if there is a way to emit to all clients at once?
Or have the clients connect to multiple servers using io.connect?
Or is this a case for Redis?
Publish/Subscribe is what you need here. Redis will give you the functionality you're looking for out of the box. You just need to create a Redis client and subscribe to an update channel on each of your app server nodes. Then, publish the update when a POST is successful (or whatever). Finally, have the Redis client subscribe to the update channel and, on message, emit a socket.io event:
(truncated for brevity)
var express = require('express')
  , http = require('http')
  , socketio = require('socket.io')
  , redis = require('redis')
  , pub = redis.createClient()  // publisher connection
  , sub = redis.createClient()  // subscriber connection (a subscribed client can't issue other commands)
  ;

var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(3000);

app.post('/targets', function(req, res) {
  // assumes a body parser is configured
  pub.publish('update', JSON.stringify(req.body));
  res.end();
});

sub.on('connect', function() {
  // subscribe to the update channel
  sub.subscribe('update');
});

sub.on('message', function(channel, msg) {
  // util.log('Channel: ' + channel + ' msg: ' + msg);
  var msg = JSON.parse(msg);
  // emit to all connected clients
  io.sockets.emit('message', {
    channel: channel,
    msg: msg
  });
});
Then in the JS app, listen for that emitted message:
socket.on('message', function(data) {
  debugger;
  // do something with the updated data
});
Of course, introducing this new Redis server adds another single point of failure. A more robust implementation might use a message broker with AMQP or ZeroMQ, or a similar networking library that provides pub/sub capabilities.
