How to connect multiple sockets to sails in test - node.js

I have a messaging controller set up in sails.js and want to test it with multiple clients to see if the pub/sub code works. I set up a test file:
var socketIOClient = require('socket.io-client');
var sailsIOClient = require('sails.io.js');
var socket1 = socketIOClient;
var client1 = sailsIOClient(socket1);
var socket2 = socketIOClient;
var client2 = sailsIOClient(socket2);
var socket3 = socketIOClient('http://localhost:1337', {'force new connection': true});
var client3 = sailsIOClient(socket2);
...
client1.socket.get... works and says it is subscribed.
client1.socket.post... works and posts a message to the DB.
So I want to test that a client can receive the notification when a new message is posted. However, when I post from either client1 or client2, it posts from both. Essentially, they are linked to the same socket object or something like that, but I don't know where. So I want to connect multiple sockets, and I've tried variations like socket3 and client3, but get the following problem:
client3.socket.get... and client3.socket.post... and other variations (forceNew, multiplexing, etc.) each hang up and don't resolve.
Example of hang up:
sails.log('posting...');
client3.socket.post('/v1.0/messaging', data, function(body, JWR){
  sails.log('posted');
  done();
});
Only posting... is logged in this way, but posted is logged if using client1 or client2.
My question:
How can I connect multiple clients to my sails api to test if my pub/sub controller works?

I can't test it right now, but you could try this:
var socketIOClient = require('socket.io-client');
var sailsIOClient = require('sails.io.js');
// Instantiate the socket client (`io`)
var io = sailsIOClient(socketIOClient);
// prevent the socket from auto-connecting to its default origin
io.sails.autoConnect = false;
// Ask the client to create two socket connections
var socket1 = io.sails.connect('http://localhost:1337');
var socket2 = io.sails.connect('http://localhost:1337');
// Test it
socket1.get(url, data, cb)
socket1.post(url, data, cb)
socket2.get(url, data, cb)
socket2.post(url, data, cb)
// If you want to configure and use the "eager" instance
io.sails.url = 'http://localhost:1337';
io.socket.get(url, data, cb)
This way, you create several SailsSocket instances instead of using the "eager" instance.
When you use sails.io.js in a browser, io.socket contains the socket instance (called the "eager instance" in the library's comments), which automatically tries to connect using the host that the js file was served from. io.sails.connect() lets you create additional instances.
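To put that together in a test, here is a rough Mocha-style sketch (untested). It assumes your GET route subscribes the requesting socket and that your controller broadcasts a 'message' event when something is posted; adjust the URL and the event name to whatever your pub/sub code actually uses:
var socketIOClient = require('socket.io-client');
var sailsIOClient = require('sails.io.js');

describe('messaging pub/sub', function () {
  var io = sailsIOClient(socketIOClient);
  io.sails.autoConnect = false;

  var sender, receiver;

  before(function () {
    sender = io.sails.connect('http://localhost:1337');
    receiver = io.sails.connect('http://localhost:1337');
  });

  it('notifies a subscribed socket when another socket posts', function (done) {
    // Subscribe the receiver first...
    receiver.get('/v1.0/messaging', {}, function (body, jwr) {
      receiver.on('message', function (event) {
        // ...then expect the broadcast triggered by the sender's post
        done();
      });
      sender.post('/v1.0/messaging', { text: 'hello' }, function (body, jwr) {
        // post acknowledged; the assertion happens in the 'message' handler
      });
    });
  });
});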

The correct syntax for the current version of socket.io should be:
// first socket
var socket1 = socketIOClient('http://localhost:1337', {forceNew: true});
// second socket
var socket2 = socketIOClient('http://localhost:1337', {forceNew: true});
See socket.io docs http://socket.io/blog/socket-io-1-2-0/#

Related

Can't receive redis data from socket io

I'm building a realtime visualization using redis as a pub/sub messenger between python and node. There's a python script always running which sets a redis hash with hmset. That side of the app is working fine: if I enter the example command "HGETALL 'sellers-80183917'" in a redis client, I get the proper data back.
The problem is on the js side. I'm using the socket.io and redis node.js libraries to listen to the redis instance and publish the results online through a d3js viz.
I run the following code with node:
var express = require('express');
var app = express();
var redis = require('redis');

app.use(express.static(__dirname + '/public'));

var http = require('http').Server(app);
var io = require('socket.io')(http);
var sredis = require('socket.io-redis');
io.adapter(sredis({ host: 'localhost', port: 6379 }));

redisSubscriber = redis.createClient(6379, 'localhost', {});
redisSubscriber.on('message', function(channel, message) {
  io.emit(channel, message);
});

app.get('/sellers/:seller_id', function(req, res){
  var seller_id = req.params.seller_id;
  redisSubscriber.subscribe('sellers-'.concat(seller_id));
  res.render('seller.ejs', { seller: seller_id });
});

http.listen(3000, '127.0.0.1', function(){
  console.log('listening on *:3000');
});
And this is the relevant part of the seller.ejs file that's receiving the user requests and outputting the viz:
var socket = io('http://localhost:3000');
var stats;
var seller_key = 'sellers-'.concat(<%= seller %>);

socket.on(seller_key, function(msg){
  stats = [];
  console.log('Im in');
  var seller = $.parseJSON(msg);
  var items = seller['items'];
  for (item in items) {
    var item_data = items[item];
    stats.push({
      'title': item_data['title'],
      'today_visits': item_data['today_visits'],
      'sold_today': item_data['sold_today'],
      'conversion_rate': item_data['conversion_rate']
    });
  }
  setupData(stats);
});
The problem is that the socket.on() handler never receives anything, and I don't see where the problem is, as everything else seems to be working fine.
I think that you might be confused as to what Pub/Sub in Redis actually is. It's not a way to listen to changes on hashes; you can have a Pub/Sub channel called sellers-1, and you can have a hash with the key sellers-1, but those are unrelated to each other.
As documented here:
Pub/Sub has no relation to the key space.
There is a thing called keyspace notifications that can be used to listen to changes in the key space (through Pub/Sub channels); however, this feature isn't enabled by default because it'll take up more resources.
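For reference, here is a rough sketch (not from the original question) of what listening via keyspace notifications could look like with node_redis. It assumes notifications have been enabled, e.g. with CONFIG SET notify-keyspace-events Kh, and that the hashes live in database 0:
var redis = require('redis');

// Dedicated connection for the pattern subscription
var keyspaceListener = redis.createClient(6379, 'localhost');

// __keyspace@0__:<key> channels carry the name of the command that touched the key
keyspaceListener.psubscribe('__keyspace@0__:sellers-*');

keyspaceListener.on('pmessage', function (pattern, channel, event) {
  // e.g. channel = '__keyspace@0__:sellers-80183917', event = 'hmset'
  var key = channel.replace('__keyspace@0__:', '');
  console.log(key + ' was modified by ' + event);
});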
Perhaps an easier method would be to publish a message after the HMSET, so any subscribers would know that the hash got changed (they would then retrieve the hash contents themselves, or the published message would contain the relevant data).
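Your writer is a Python script, but the pattern itself is simple. In Node it would look roughly like this (the field name and payload below are just placeholders):
var redis = require('redis');
var writer = redis.createClient(6379, 'localhost');

// Placeholder payload standing in for whatever the Python script stores
var sellerData = JSON.stringify({ items: [] });

// Update the hash, then notify subscribers on a Pub/Sub channel of the same name
writer.hmset('sellers-80183917', 'data', sellerData, function (err) {
  if (err) throw err;
  writer.publish('sellers-80183917', sellerData);
});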
This brings us to the next possible issue: you only have one subscriber connection, redisSubscriber.
From what I understand from the Node.js Redis driver, calling .subscribe() on such a connection would remove any previous subscriptions in favor of the new one. So if you were previously subscribed to the sellers-1 channel and subscribe to sellers-2, you wouldn't be receiving messages from the sellers-1 channel anymore.
You can listen on multiple channels by either passing an array of channels, or by passing them as arguments:
redisSubscriber.subscribe([ 'sellers-1', 'sellers-2', ... ])
// Or:
redisSubscriber.subscribe('sellers-1', 'sellers-2', ... )
You would obviously have to track each "active" seller subscription. Either that, or create a new connection for each subscription, which also isn't ideal.
It's probably a better idea to have a single Pub/Sub channel on which all changes would get published, instead of a separate channel for each seller.
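As a sketch (the channel name and payload shape here are made up, and "publisher" stands for whichever client publishes the updates), the publisher puts the seller id inside the message, and the single subscriber routes it to the right socket.io event:
// Publisher side: one channel for all sellers
publisher.publish('seller-updates', JSON.stringify({ seller_id: '80183917', items: [] }));

// Subscriber side: a single subscription, routed to the right socket.io event
redisSubscriber.subscribe('seller-updates');
redisSubscriber.on('message', function (channel, message) {
  var update = JSON.parse(message);
  io.emit('sellers-' + update.seller_id, message);
});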
Finally: if your seller IDs aren't hard to guess (for instance, if they're based on an incremental integer value), it would be trivial for someone to write a client that listens in on any seller channel they'd like. It might not be a problem, but it is something to be aware of.

Multiple connections, but single websocket.on("message") event emitter

I have a single device establishing two WebSocket connections to the Einaros/ws WebSocket server. Whenever the second WebSocket connection sends a message to the server, only the first websocket.on("message") event emitter responds. There is no way to differentiate which WebSocket the message is coming from because there seems to be only a single websocket.on("message") event emitter object.
How can I differentiate from which WebSocket connection the message is being received from without passing an ID from the client side?
I apologize if I am overlooking something simple; I am a node.js and coding novice. From the code below, it looks like there should be separate event emitter objects created for each WebSocket connection, so that the server knows which connection the message is coming from. My code looks like this:
var connections = new Map();
var idCounter = 0;

wss.on("connection", function connection(ws) {
  var connectionID = idCounter++;
  connections.set(connectionID, ws);
  var session = connections.get(connectionID);
  session.on("message", function incoming(message) {
    session.send(message);
  });
});
--- Update ---
I have performed another test. With the code below, "objectTest" contains the unique WebSocket connection (distinguished by its 'sec-websocket-key') printed to the console. However, "this.send(message);" and "console.log(this);" both refer to the first established WebSocket connection, even though "objectTestMap" contains the second, unique "objectTest".
var connections = new Map();
var idCounter = 0;

wss.on("connection", function connection(ws) {
  var connectionID = idCounter++;
  connections.set(connectionID, ws);
  var session = connections.get(connectionID);
  var sendThis = String(connectionID);
  session.send(sendThis);

  var objectTestMap = new Map();
  var objectTest = session.on("message", function incoming(message) {
    this.send(message);
    console.log(this);
  });
  objectTestMap.set(connectionID, objectTest);
  console.log(objectTestMap.get(connectionID));
});
There was an error in my client application that was connecting to the server. There are no problems with ws, and the above code works as it should.

How to instantiate Multiple Redis Connections for Publish Subscribe (node.js + node_redis)

Scenario
Using node_redis to build a simple Redis Publish Subscribe (chat) example: https://github.com/nelsonic/hapi-socketio-redis-chat-example (with Hapi.js and Socket.io)
We have created a node module redis_connection.js in our project (see: http://git.io/vqaos) to instantiate the Redis connection, because we don't want to repeat the code which connects (to RedisCloud) multiple times:
var redis = require('redis');
var url = require('url');
var redisURL = url.parse(process.env.REDISCLOUD_URL);
var redisClient = redis.createClient(redisURL.port, redisURL.hostname,
{no_ready_check: true});
redisClient.auth(redisURL.auth.split(":")[1]);
module.exports = redisClient;
Which we then use like this:
var redisClient = require('./redis_connection.js');

// Confirm we are able to connect to RedisCloud:
redisClient.set('redis', 'working', redisClient.print);
redisClient.get('redis', function (err, reply) {
  console.log('RedisCLOUD is ' + reply.toString());
});
This works fine for normal GET/SET operations with Redis,
but when we try to instantiate multiple connections to Redis (e.g: one to publish, another to subscribe and a third just to GET/SET keys/values) we get an error:
Issue
We are seeing the following error:
Error: Connection in subscriber mode, only subscriber commands may be used
What are we doing wrong?
Full code at the point where we see this issue: http://git.io/vqa6y
Note
We tried to dig through existing SO Q/A on this, e.g:
Publish subscribe with nodejs and redis(node_redis)
Redis publish/subscribe: see what channels are currently subscribed to
how to use the redis publish/subscribe
but did not find a solution that exactly matched our situation...
(any suggestions/help much appreciated!)
Not tested, but too long for a comment.
Try defining another redis connection module: one for your regular usage and a second one solely for your pub/sub subscription usage.
Add a redis_pubsub_connection.js to your project:
var redis = require('redis');
var url = require('url');
var redisURL = url.parse(process.env.REDISCLOUD_URL);
var redisPubSubClient = redis.createClient(redisURL.port, redisURL.hostname,
{no_ready_check: true});
redisPubSubClient.auth(redisURL.auth.split(":")[1]);
module.exports = redisPubSubClient;
And change your publish.js require statement to:
var redis = require('./redis_pubsub_connection'); // RedisCloud
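Usage would then look something like this (untested sketch, using the two modules defined above and a channel name only as an example):
// Regular commands keep using the original connection
var redisClient = require('./redis_connection.js');
redisClient.set('redis', 'working', redisClient.print);

// Subscriptions live on their own dedicated connection
var redisSub = require('./redis_pubsub_connection.js');
redisSub.subscribe('chat:messages:latest');
redisSub.on('message', function (channel, message) {
  console.log('got message on', channel, message);
});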
redis-connection node.js module
In the interest of keeping this re-useable across our projects we wrote a (mini) node.js module to initialize Redis connections: https://github.com/dwyl/redis-connection
The code is simple and tested and takes care of authentication if required.
(not copy-pasting the module here to avoid duplication)
see: https://github.com/dwyl/redis-connection/blob/master/index.js
Usage:
Install from NPM
npm install redis-connection --save
Use in your script
var redisClient = require('redis-connection')();
redisClient.set('hello', 'world');
redisClient.get('hello', function (err, reply) {
  console.log('hello', reply.toString()); // hello world
});
Publish Subscribe
var redisClient = require('redis-connection')(); // Publisher
var redisSub = require('redis-connection')('subscriber');
redisSub.subscribe("chat:messages:latest", "chat:people:new");
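To round out the example, publishing and receiving with those two connections is just standard node_redis usage (sketch):
// Publish on the regular connection...
redisClient.publish('chat:messages:latest', JSON.stringify({ name: 'alice', message: 'hi' }));

// ...and handle incoming messages on the subscriber connection
redisSub.on('message', function (channel, message) {
  console.log(channel, '->', message);
});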
For a working example see: https://github.com/dwyl/hapi-socketio-redis-chat-example
The advantage is that we can re-use the same redisClient across multiple files in the same project without creating new connections (the single or pub/sub connection is cached and re-used)
Credit: We borrowed ideas from several places so have up-voted all the answers. But ultimately we wrote a slightly different solution so we have shared it with everyone on NPM/GitHub. Thanks again everyone!
If you want to supply a regular connection and a sub one, and you want to ensure you only have one of each across the application, then you could use a combination of the two solutions that includes the notion of a singleton, something like this:
var subConnection, con;

var createConnection = module.exports.createConnection = function(){
  var redis = require('redis');
  var url = require('url');
  var redisURL = url.parse(process.env.REDISCLOUD_URL);
  var redisClient = redis.createClient(redisURL.port, redisURL.hostname, {no_ready_check: true});
  redisClient.auth(redisURL.auth.split(":")[1]);
  return redisClient;
};

module.exports.getSubConnection = function(){
  if (!subConnection)
    subConnection = createConnection();
  return subConnection;
};

module.exports.getConnection = function(){
  if (!con)
    con = createConnection();
  return con;
};
Repeat for the other two connection types and call it like:
var con = require('./redis_connection.js').getConnection();
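And the dedicated subscriber connection the same way (channel name only as an example):
var sub = require('./redis_connection.js').getSubConnection();
sub.subscribe('chat:messages:latest');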
The problem is that your redis client creation code is being cached by require, so you reuse the same connection again and again. Instead of returning the connection in your redis_connection module, you could return a function:
module.exports = function(){
  var redis = require('redis');
  var url = require('url');
  var redisURL = url.parse(process.env.REDISCLOUD_URL);
  var redisClient = redis.createClient(redisURL.port, redisURL.hostname, {no_ready_check: true});
  redisClient.auth(redisURL.auth.split(":")[1]);
  return redisClient;
};
And then call it like so:
var redisClient = require('./redis_connection.js')();
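Note that each call to the factory opens a new connection, which is exactly what you want for pub/sub: keep one client for regular commands and a separate one just for subscriptions, roughly like this (sketch):
var redisClient = require('./redis_connection.js')(); // regular GET/SET/PUBLISH
var redisSub = require('./redis_connection.js')();    // subscriptions only

redisSub.subscribe('chat:messages:latest');
redisClient.publish('chat:messages:latest', 'hello');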

Socket.io with multiple Node.js hosts, emit to all clients

I am new to Socket.io and trying to get my head around the best approach to solve this issue.
We have four instances of a Node.js app running behind a load balancer.
What I am trying to achieve is for another app to POST some data to the load balancer URL, which will hand it off to one of the instances.
The receiving instance will store the data, then use Socket.io to emit the data to the connected clients.
The issue is that browser/client can only be connected to a single instance at one time.
I am trying to determine if there is a way to emit to all clients at once?
Or have the clients connect to multiple servers using io.connect?
Or is this a case for Redis?
Publish/Subscribe is what you need here. Redis will give you the functionality you're looking for out of the box. You just need to create a redis client and subscribe to an update channel on each of your app server nodes. Then, publish the update when a POST is successful (or whatever). Finally, have the redis client subscribe to the update channel and, on message, emit a socket.io event:
(truncated for brevity)
var express = require('express')
  , http = require('http')
  , socketio = require('socket.io')
  , redis = require('redis')
  , rc = redis.createClient()      // subscriber connection
  , rcPub = redis.createClient()   // separate connection for publishing
  ;
var app = express();
var server = http.createServer(app);
var io = socketio.listen(server);
server.listen(3000);

app.post('/targets', function(req, res){
  // a client in subscriber mode can't issue PUBLISH, so use the second connection
  // (assumes body-parsing middleware is configured)
  rcPub.publish('update', JSON.stringify(req.body));
});

rc.on('connect', function(){
  // subscribe to the update channel
  rc.subscribe('update');
});

rc.on('message', function(channel, msg){
  // util.log('Channel: ' + channel + ' msg: ' + msg);
  var msg = JSON.parse(msg);
  // emits to sockets that have joined the 'update' room
  io.sockets.in('update').emit('message', {
    channel: channel,
    msg: msg
  });
});
Then in the JS app, listen for that emitted message:
socket.on('message', function(data){
  debugger;
  // do something with the updated data
});
Of course, introducing this new Redis Server adds another single point of failure. A more robust implementation may use something like a message broker with AMQP or ZeroMQ or some similar networking library which provides pub/sub capabilities.

Socket.IO & private messages

This must have been asked already a thousand times, but I do not find any of the answers satisfying, so I'll try having another go, being as clear as possible.
I am starting out with a clean Express; the one that is usually done via the following terminal commands:
user$ express
user$ npm install
then I proceed installing socket.io, this way:
user$ npm install socket.io --save
on my main.js file I then have the following:
//app.js
var express = require('express'),
    http = require('http'),
    path = require('path'),
    io = require('socket.io'),
    routes = require('./routes');
var app = express();
I start my socket.io server by attaching it to my express one:
//app.js
var server = http.createServer(app).listen(app.get('port'), function(){
  console.log('express server started!');
});
var sIo = io.listen(server);
What I do now is to set the usual routes for Express to work with:
//app.js
app.get('/', routes.index);
app.get('/send/:recipient/:text', routes.sendMessage);
Now, since I like to keep things organized, I want to put my socket.io code in another file, so instead of using the usual code:
//app.js
sIo.sockets.on('connection', function(socket){
  console.log('got a connection');
});
I use the following to be able to access both the socket and the sIo object (as that object contains all the connection info (important)):
//app.js
sIo.sockets.on('connection', function(socket){
  routes.connection(sIo, socket);
});

// index.js (./routes)
exports.connection = function(sIo, socket){
  console.log('got a connection.');
};
This way I can do all my socket.io jobs in here. I know that I can access all my clients' information from the sIo object, but of course it does not contain any of their session data.
My questions now are the following:
Suppose a user makes an HTTP request to send a message and the handler in my routes is like this:
exports.sendMessage = function(req, res){
  //do stuff here
};
How can I get this to "fire" something in my socket.io to send a message? I do not want to know all the underlying work that needs to be done, like keeping track of messages, users, etc. I only want to understand how to "fire" socket.io to do something.
How can I make sure that socket.io sends the message only to one particular person, and be 100% sure that nobody else gets it? From what I can see, there is no way to get the session info from the sIo object.
Thanks in advance.
Question one: the cleanest way to separate the two would probably be to use an EventEmitter: create an EventEmitter that emits when an HTTP message comes in. You can pass session information along with the event to tie it back to the user who sent the message if necessary.
// index.js (./routes)
var EventEmitter = require('events').EventEmitter;
var messageEmitter = module.exports.messageEmitter = new EventEmitter();

module.exports.sendMessage = function(req, res) {
  messageEmitter.emit('private_message', req.params.recipient, req.params.text);
};
Question two: you can access the socket when the initial connection is made. An example, mostly borrowed from this answer:
var connect = require('connect'),
    userMap = {};

routes.messageEmitter.on('private_message', function(recipient, text) {
  userMap[recipient].emit('private_message', text);
});

io.on('connection', function(socket_client) {
  var cookie_string = socket_client.request.headers.cookie;
  var parsed_cookies = connect.utils.parseCookie(cookie_string);
  var connect_sid = parsed_cookies['connect.sid'];
  if (connect_sid) {
    // session_store is your Express session store instance
    session_store.get(connect_sid, function (error, session) {
      userMap[session.username] = socket_client;
    });
  }
  socket_client.on('private_message', function(username, message) {
    userMap[username].emit('private_message', message);
  });
});
So we're just creating a map between a session's username and a socket connection. Now, whenever you need to send a message, you can easily look up which socket is associated with that user and send the message to them using their socket. Just make sure to handle disconnects, reconnects, connecting in multiple tabs, etc.
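For example, a minimal cleanup on disconnect (a sketch, placed inside the connection handler and reusing the userMap from above) might look like:
socket_client.on('disconnect', function() {
  // Remove the mapping so we don't try to emit to a dead socket.
  // (With multiple tabs you'd only remove it if this socket is the one stored.)
  for (var username in userMap) {
    if (userMap[username] === socket_client) {
      delete userMap[username];
    }
  }
});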
I have built something like what you are describing. If a user can make a socket request, the client pushes the message via the socket, and then the server broadcasts or emits it. But if a user can't connect to the socket, the client falls back to an HTTP POST, like what you are describing with sendMessage. Rather than having sendMessage fire off a socket event, I also have my clients make an ajax request every 5 seconds or so. That brings back new messages, and if any of them were not received via socket.io, I add them to my client-side array. This acts as a safety net, so I don't have to fully trust socket.io.
see below in pseudo code
client

  if canSendSocketMessage()
    sendSocketMessage(message)
  else
    sendAjaxMessage(message)

  setInterval( ->
    // ajax call
    getNewMessages()
  ), 5000

server

  // socket stuff
  socket.on 'message' ->
    saveMessage()
    socket.emit(message)

  // ajax endpoints
  app.post 'sendMessage'
    saveMessage()
  app.get 'getNewMessages'
    res.send getNewMessages()
