I'm working with MEAN.js and I need some real-time features in my app; to accomplish that I'm going to use the socket.io library.
Here is my idea on how to integrate it while still keeping a good structure in the app.
MEAN.js uses a server.js file that does a lot of the configuration, so I want to do the following:
// Expose app
exports = module.exports = app;
// Add my reference to the socketServer
var io = require('./socketServer')(app);
The file './socketServer.js' is going to be my starting point and the configuration point for my socket; it could look something like this:
var http = require('http');
var socketio = require('socket.io');

module.exports = function (app) {
  // Wrap the Express app in a plain Node http server and attach socket.io to it
  var server = http.Server(app);
  var io = socketio(server);

  io.path('/');

  io.on('connect', function (socket) {
    socket.emit('connected', { msg: 'You are connected now.' });

    socket.on('upvote', function (data) {
      socket.emit('upvoteR', 'newConnected');
      socket.broadcast.emit('upvoteR', 'newConnected');
    });
  });

  server.listen(8080);
  return io;
};
I feel it could be useful to separate the default server config from my socket config, and to use this file (socketServer.js) as the starting point for developing all my socket logic, injecting the dependencies I want.
I don't know if there is a better approach to this problem out there, structural best practices I should follow, or drawbacks to doing it this way.
So besides this structure, these are my other doubts:
How can I use sockets and the Express server on the same port?
It seems that with Express 4 I'm not able to link the Express server with the socket, because the Express 4 app no longer inherits from Node.js's http.Server. So now I have to call server.listen(socketPort), and if I use the same app port as MEAN.js I just get an EADDRINUSE error. Is it still possible to have both working on the same port?
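To be concrete about this first doubt, this is the kind of single-port wiring I have in mind: a sketch where the socket server reuses the Express app and MEAN's config.port instead of its own port (config.port here is an assumption based on the MEAN.js config):

// sketch: share one http.Server between Express and socket.io
var http = require('http');
var socketio = require('socket.io');

module.exports = function (app, config) {
  var server = http.Server(app);   // wrap the Express 4 app
  var io = socketio(server);       // attach socket.io to the same server

  server.listen(config.port);      // listen once here; no app.listen() anywhere else
  return io;
};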
How can I use the Express session to authenticate each socket connection? If that's not possible, what's the better approach? An example or a documentation reference would be nice.
Thanks in advance.
I would like to share my solution in case someone in the future has the same requirement I had.
How to authenticate each socket connection based on the Express session information.
First I configure Express to use the passport.js library in the following way:
// CookieParser should be above session
// (cookieParser, session, mongoStore, passport and config are required earlier in the file)
var cp = cookieParser;
app.use(cp());

// Express MongoDB session storage
var mStore = new mongoStore({
  db: db.connection.db,
  collection: config.sessionCollection
});

app.use(session({
  secret: config.sessionSecret,
  store: mStore
}));

// use passport session
app.use(passport.initialize());
app.use(passport.session());
So far this is the normal implementation of Passport over Express. Besides this configuration, I added passport-socket.io to my project. This is my working configuration:
var http = require('http');
var IO = require('socket.io');
var passportSocketIo = require('passport.socketio');

var server = http.Server(app);
var io = IO(server);

io.use(function (socket, next) {
  passportSocketIo.authorize({
    cookieParser: cp,               // the same cookie-parser used by Express
    key: 'connect.sid',             // the name of the Express session cookie
    secret: config.sessionSecret,   // the session secret used to parse the cookie
    store: mStore,                  // the Mongo session storage defined above
    success: onAuthorizeSuccess,    // *optional* callback on success
    fail: onAuthorizeFail           // *optional* callback on fail/error
  })(socket, next);
});

app.io = io;
server.listen(config.port);
Where "onAuthorizeSuccess" and "onAuthorizeFail" are functions to allow the conections and develop the sockets logics.. well,with this my socket.io connection is authenticated with my passport session information and if the user is not logged the socket would not connect..
And if we need some authorization logic based on user roles, the passport.socketio creates a socket.request.user where you can find yours users roles to use in your roles sockets logics..
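As a minimal sketch of those two callbacks and of a role-based check (the 'adminAction' event name is made up, and user.roles assumes MEAN's default user model):

function onAuthorizeSuccess(data, accept) {
  console.log('socket.io connection authorized');
  accept();                                     // let the connection through
}

function onAuthorizeFail(data, message, error, accept) {
  if (error) return accept(new Error(message));
  console.log('socket.io connection refused:', message);
  accept(null, false);                          // reject the connection
}

io.on('connection', function (socket) {
  // passport.socketio has already attached the logged-in user here
  var user = socket.request.user;

  socket.on('adminAction', function (data) {    // 'adminAction' is a made-up event name
    if (user.roles && user.roles.indexOf('admin') !== -1) {
      // ... admin-only socket logic ...
    }
  });
});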
Related
Pretty much, I'm wondering how to create and maintain a replica of a Redis session store in a Node.js app with a microservices architecture.
Short background (somewhat)
I'm planning the architecture of a project I'm going to start working on, and I have decided to use a Redis store for user sessions. I'm trying out a microservices architecture, and essentially there's going to be an authentication service that writes to and reads from the session store as needed. Instantiating the store usually looks like this:
const express = require('express');
const session = require('express-session');
const redis = require('redis');
const redisStore = require('connect-redis')(session);
const redisClient = redis.createClient();
const { SESSION_OPTIONS } = require('./configs/session');
const app = express();
app.use(session({
  ...SESSION_OPTIONS,
  store: new redisStore({ host: 'localhost', port: 6379, client: redisClient })
}));
However, throughout my app almost all user actions will have to be authorised, which involves checking user credentials stored in the session. In my monolithic projects it usually looks something like:
router.get('/someUserAction', ensureAuthorisation, (req, res) => {
  ...
});
where ensureAuthorisation is a function like:
ensureAuthorisation: (req, res, next) => {
  if (req.user.isAuthorised) {
    return next();
  }
  return res.status(401).json({ success: false, msg: 'fail msg' });
}
Because almost every user action will require authorisation, and hence reads from the Redis session store, I'm assuming this wouldn't be good for the authentication service, which uses the same store but writes and reads less frequently. So I want to create a separate service, the authorisation service, which uses its own Redis store. This store for the authorisation service should be a copy of the authentication service's store, removing and updating sessions accordingly.
My initial thought was to have some form of asynchronous communication between both stores, with the authentication store writing new sessions to the authorisation one... The question is, is this even possible, or would the copy store be more of a cache?
This was a bit long, and maybe confusing, so please ask if you need further explanation (be kind to me pls lol). ANY HELP IS APPRECIATED 🙏
You can go ahead and use a single Redis rather than creating a cache for the other service.
Use Redis in Sentinel mode: writes go to the master and the other services read from the replicas.
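A minimal sketch of what that could look like with the ioredis client (the Sentinel addresses and the master name 'mymaster' are placeholders):

const Redis = require('ioredis');

// Authentication service: connect to the current master (writes and reads)
const writer = new Redis({
  sentinels: [{ host: 'sentinel-1', port: 26379 }, { host: 'sentinel-2', port: 26379 }], // placeholder addresses
  name: 'mymaster'
});

// Authorisation service: do the frequent session reads against a replica
const reader = new Redis({
  sentinels: [{ host: 'sentinel-1', port: 26379 }, { host: 'sentinel-2', port: 26379 }],
  name: 'mymaster',
  role: 'slave'
});

The writer could then be handed to connect-redis through its client option, while the authorisation service does its lookups through the reader; there is one logical store, so nothing has to be replicated by hand.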
I have an Express 4.17.1 app that uses express-session 1.17.2 and Socket.io 4.1.2.
To share Express's session with Socket.io, before a connection is made, we usually do:
import session from "../../loaders/session.js";

const init = function (io) {
  io.use(function (socket, next) {
    session(socket.request, {}, next);
  });

  io.on("connection", /* bla bla bla */);
};

export default init;
The problem is that whenever Express routes update the session data, the Socket.io side doesn't sync; it keeps the original values from when we ran the middleware in io.use. Is there a way to call a sync/update method or something?
(The reason I'm asking is that the answers online are all for Socket.io 3.x, and there is little documentation on using Socket.io with express-session.)
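For what it's worth, here is a sketch of the kind of sync I'm after, using express-session's reload() to re-read the store inside a handler; I don't know whether this is the recommended way with Socket.io 4:

io.on("connection", (socket) => {
  socket.on("someEvent", (payload) => {      // "someEvent" is just a placeholder
    // re-read the session from the store so changes made by Express routes become visible
    socket.request.session.reload((err) => {
      if (err) return console.error(err);
      console.log(socket.request.session);   // fresh data
    });
  });
});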
I'm building an app deployed to Heroku which uses WebSockets.
The WebSocket connection works properly when I use only one dyno, but when I scale to more than one, I get the following errors:
POST
http://****.herokuapp.com/socket.io/?EIO=2&transport=polling&t=1412600135378-1&sid=zQzJJ8oPo5p3yiwIAAAC
400 (Bad Request) socket.io-1.0.4.js:2
WebSocket connection to
'ws://****.herokuapp.com/socket.io/?EIO=2&transport=websocket&sid=zQzJJ8oPo5p3yiwIAAAC'
failed: WebSocket is closed before the connection is established.
socket.io-1.0.4.js:2
I am using the Redis adapter to enable multiple web processes:
var io = socket.listen(server);
var redisAdapter = require('socket.io-redis');
var redis = require('redis');
var pub = redis.createClient(18049, '[URI]', {auth_pass:"[PASS]"});
var sub = redis.createClient(18049, '[URI]', {detect_buffers: true, auth_pass:"[PASS]"} );
io.adapter( redisAdapter({pubClient: pub, subClient: sub}) );
This works on localhost (which I run with foreman, as Heroku does, launching two web processes, same as on Heroku).
Before I implemented the Redis adapter I got a WebSocket handshake error, so the adapter has had some effect. It also works occasionally now, I assume when the sockets happen to hit the same web dyno.
I have also tried to enable sticky sessions, but then it never works:
var sticky = require('sticky-session');

sticky(1, server).listen(port, function (err) {
  if (err) {
    console.error(err);
    return process.exit(1);
  }
  console.log('Worker listening on %s', port);
});
I'm the Node.js Platform Owner at Heroku.
WebSockets work on Heroku out of the box across multiple dynos; socket.io (and other realtime libs) use fallbacks to stateless transports like XHR polling that break without session affinity.
To scale up socket.io apps, first follow all the instructions from socket.io:
http://socket.io/docs/using-multiple-nodes/
Then, enable session affinity on your app (this is a free feature):
https://devcenter.heroku.com/articles/session-affinity
I spent a while trying to make socket.io work in a multi-server architecture, first on Heroku and then on OpenShift, as many suggest.
The only way to make it work on both PaaS providers is to disable xhr-polling and set transports: ['websocket'] on both client and server.
On OpenShift, you must explicitly set the port of the server to 8000 for ws (8443 for wss) in the socket.io client initialization, using the *.rhcloud.com server, as explained in this post: http://tamas.io/deploying-a-node-jssocket-io-app-to-openshift/.
The polling strategy doesn't work on Heroku because it does not support sticky sessions (https://github.com/Automattic/engine.io/issues/261), and on OpenShift it fails because of this issue: https://github.com/Automattic/engine.io/issues/279, which will hopefully be fixed soon.
So, the only solution I have found so far is to disable polling and use the websocket transport only.
To do that with socket.io > 1.0:
server-side:
var app = express();
var server = require('http').createServer(app);
var socketio = require('socket.io')(server, {
  path: '/socket.io-client'
});
socketio.set('transports', ['websocket']);
client-side:
// pick the host that matches your deployment
var ioSocket = io('<your-openshift-app>.rhcloud.com:8000' || '<your-heroku-app>.herokuapp.com', {
  path: '/socket.io-client',
  transports: ['websocket']
});
Hope this will help.
It could be that you need to be running RedisStore:
var session = require('express-session');
var RedisStore = require('connect-redis')(session);
app.use(session({
  store: new RedisStore(options),
  secret: 'keyboard cat'
}));
per an earlier question here: Multiple dynos on Heroku + socket.io broadcasts
I know this isn't a normal answer, but I've tried to get WebSockets working on Heroku for more than a week. After many long conversations with customer support I finally tried out OpenShift. Heroku WebSockets are in beta, but OpenShift WebSockets are stable. I got my code working on OpenShift in under an hour.
http://www.openshift.com
I am not affiliated with OpenShift in any way. I'm just a satisfied (non-paying) customer.
I was having huge problems with this. There were a number of issues failing simultaneously, making it a huge nightmare. Make sure you do the following to scale socket.io on Heroku:
- if you're using clusters, make sure you implement socketio-sticky-session or something similar
- the client's connect URL should not be https://example.com/socket.io/?EIO=3&transport=polling but rather https://example.com/ (notably, I'm using https because Heroku supports it)
- enable CORS in socket.io
- specify websocket-only connections
For you and others, it could be any one of these.
If you're having trouble setting up sticky-session clusters, here's my working code:
var http = require('http');
var cluster = require('cluster');
var numCPUs = require('os').cpus().length;
var sticky = require('socketio-sticky-session');
var redis = require('socket.io-redis');
var io;

if (cluster.isMaster) {
  console.log('Inside Master');
  // create the worker processes
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
} else {
  // The worker code to be run is written inside
  // the sticky().
}

sticky(function () {
  // This code runs inside the workers.
  // sticky-session balances connections between workers based on their IP,
  // so all the requests from the same client go to the same worker.
  // If multiple browser windows are opened on the same client, all are
  // redirected to the same worker.
  io = require('socket.io')({ transports: ['websocket'], origins: '*:*' });

  var server = http.createServer(function (req, res) {
    res.end('socket.io');
  });

  io.listen(server);

  // The Redis server can also be used to store the socket state
  //io.adapter(redis({host:'localhost', port:6379}));

  console.log('Worker: ' + cluster.worker.id);

  // when multiple workers are spawned, the client
  // cannot connect to the cloudlet.
  StartConnect(); // this function connects my mongodb, then calls a function with
                  // io.on('connection', ... socket.on('message', ... on the io variable above

  return server;
}).listen(process.env.PORT || 4567, function () {
  console.log('Socket.io server is up');
});
More information:
Personally, it would work flawlessly from a session not using websockets (I'm using socket.io for a Unity game; it worked flawlessly from the editor only!). When connecting through the browser, whether Chrome or Firefox, it would show these handshaking errors, along with 503 and 400 errors.
I have a node/socket.io/express server that's connected to an HTML file (like so), so visiting the web address connects you to the server. I am trying to set up a system whereby that server is run on multiple computers at a time and, by way of some sort of username and password authentication, visiting the webpage with specific credentials connects you to the one of those computers running the server with the same credentials.
I've seen mention of "Redis" in previous similar questions, but they are pretty old, and I'm wondering if there is a newer or better way of achieving this.
You won't find a lot of up-to-date documentation since Express 4 is kind of new, so let me try to remedy that here:
Authentication in Express 4.x and Socket.IO 1.x
Let's start with a confusion I think you're making:
What is Redis?
Redis is a data structure engine. It allows you to store key/value pairs, nothing more (in this context). The only thing it can do for you when building your authentication system is store the data: user info, session IDs, etc. In your case, you can share a store between multiple machines the same way you'd share a database or a text file.
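For instance, a minimal sketch with the node redis client (the key and value are made up):

var redis = require('redis');
var client = redis.createClient();

// nothing fancier than key/value pairs; 'session:abc123' is a made-up key
client.set('session:abc123', JSON.stringify({ userId: 42 }), function (err) {
  if (err) return console.error(err);
  client.get('session:abc123', function (err, value) {
    console.log(JSON.parse(value)); // { userId: 42 }
  });
});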
Authenticate user to node/express server
One of the ways you can do that is by using Passport. Passport is a middleware dedicated to authentication in Node.js. It is made for use with Express and is relatively easy to set up. There is an excellent tutorial series on how to set up Passport with your Express application, so I won't detail this part; please take the time to go through the series, it's invaluable knowledge.
Here's the link to the first part, which is the one I'll focus on for the next step.
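For orientation only, here is a bare-bones sketch of the kind of setup that series walks through (findUserAndCheckPassword and findUserById are placeholders for your own lookup logic):

var passport = require('passport');
var LocalStrategy = require('passport-local').Strategy;

passport.use(new LocalStrategy(function (username, password, done) {
  // findUserAndCheckPassword is a placeholder: replace with your own user lookup / password check
  findUserAndCheckPassword(username, password, function (err, user) {
    if (err) return done(err);
    if (!user) return done(null, false, { message: 'Wrong credentials' });
    return done(null, user);
  });
}));

passport.serializeUser(function (user, done) { done(null, user.id); });
passport.deserializeUser(function (id, done) { findUserById(id, done); }); // placeholder lookup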
Add socket.io to the mix
Socket.io doesn't have access to the session cookies that you create in part 1. To remedy that, we will use the passport-socketio module.
Passport-socketio requires a local session store, as opposed to a memory store. This means we need some way to store the session data somewhere; does that ring a bell?
Exactly: Redis.
You can try other stores, like MongoDB or MySQL, but Redis is the fastest.
In this example, I'll assume that your Express app and Passport are already operational, and I will focus on adding socket.io to the app.
Setup:
var session = require('express-session');       // You should already have this line in your app
var cookieParser = require('cookie-parser');    // passport.socketio needs the same cookie parser as Express
var passportSocketIo = require("passport.socketio");
var io = require("socket.io")(server);          // 'server' is the http.Server wrapping your Express app
var RedisStore = require('connect-redis')(session);

var sessionStore = new RedisStore({             // Create a session store
  host: 'localhost',
  port: 6379
});

app.use(session({
  store: sessionStore,                          // tell Express to store session info in the Redis store
  secret: 'mysecret'
}));

io.use(passportSocketIo.authorize({             // configure socket.io
  cookieParser: cookieParser,
  secret: 'mysecret',                           // make sure it's the same one you gave to Express
  store: sessionStore,
  success: onAuthorizeSuccess,                  // *optional* callback on success
  fail: onAuthorizeFail                         // *optional* callback on fail/error
}));
Connect-redis is a session store package that uses redis (in case the name isn't obvious).
Final step:
function onAuthorizeSuccess(data, accept) {
  console.log('successful connection to socket.io');
  accept();                                     // Let the user through
}

function onAuthorizeFail(data, message, error, accept) {
  if (error) return accept(new Error(message));
  console.log('failed connection to socket.io:', message);
  accept(null, false);
}

io.sockets.on('connection', function (socket) {
  console.log(socket.request.user);
});
The user object found in socket.request will contain all the user info for the logged-in user; you can pass it around or do whatever you need with it from this point.
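For example, with a hypothetical 'chat message' event (and assuming your user objects have a username field):

io.sockets.on('connection', function (socket) {
  var user = socket.request.user;

  socket.on('chat message', function (text) {   // 'chat message' is a made-up event name
    // attach the authenticated username to everything this socket sends
    io.emit('chat message', { from: user.username, text: text });
  });
});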
Note: This setup will be slightly different for Socket.IO < 1.x
I'm adapting an existing app to Express 4.4.x and trying to implement a Redis session store, using the following code:
var express = require('express');
var session = require('express-session');
var RedisStore = require('connect-redis')(session);
var redis = require('redis').createClient();

var app = express();

app.use(session({
  store: new RedisStore({
    host: '1.1.1.1',
    port: 1234,
    prefix: 'yourprefix:',
    client: redis
  }),
  secret: '.......',
  resave: true,
  saveUninitialized: true
}));
However, when I run this, no key is stored in Redis and I cannot access req.session. I'm sure it's something really simple I've missed, or I've included something in the code that is restricting it. Could it also be my Redis settings?
Thanks
OK, I just got lost in the documentation. It's difficult to keep track of express-session, redis, and connect-redis and remember the difference between everything.
The simple solution was:
var redis = require('redis').createClient(port, host, auth);
Now it works. I believe I was able to use .createClient(); when the Redis server was hosted locally; however, now that we've moved to a dedicated Redis server, these arguments are required.
Thanks, hope this helps somebody else!
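For anyone copying this, a fuller sketch of what those arguments typically look like with the classic node_redis API (the host, port and password are placeholders; in older node_redis versions the password goes in the options object as auth_pass):

var redis = require('redis');

// placeholders: substitute your dedicated Redis server's details
var client = redis.createClient(6379, 'redis.example.com', { auth_pass: 'yourpassword' });

var session = require('express-session');
var RedisStore = require('connect-redis')(session);

app.use(session({
  store: new RedisStore({ client: client, prefix: 'yourprefix:' }),
  secret: '.......',
  resave: true,
  saveUninitialized: true
}));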