Node.js server side connection to Socket.io

I have a Node.js application with a frontend app and a backend app. The backend manages the list and "pushes" an update to the frontend app; that call triggers a list update so that all clients receive the correct list data.
The problem is on the backend side. When I press the button, I perform an AJAX call, and that AJAX call runs the following code (some operations trimmed out):
Lists.findOne({_id: active_settings.active_id}, function(error, lists_result) {
    var song_list = new Array();
    for (i = 0; i < lists_result.songs.length; i++) {
        song_list.push(lists_result.songs[i].ref);
    }
    Song.find({
        '_id': {$in: song_list}
    }, function(error, songs) {
        // DO STUFF WITH THE SONGS
        // UPDATE SETTINGS (code trimmed)
        active_settings.save(function(error, updated_settings) {
            list = {
                settings: updated_settings,
            };
            var io = require('socket.io-client');
            var socket = io.connect(config.app_url);
            socket.on('connect', function () {
                socket.emit('update_list', {key: config.socket_key});
            });
            response.json({
                status: true,
                list: list
            });
            response.end();
        });
    });
});
However, response.end never seems to work and the call keeps hanging. Furthermore, the list doesn't always get refreshed, so there is also an issue with the socket.emit code. And I assume the socket connection stays open because the response isn't ended?
I have never done this server side before, so any help would be much appreciated. (active_settings etc. do exist.)

I see some issues that might or might not be causing your problems:
list isn't properly scoped, since you don't prefix it with var; essentially, you're creating a global variable which might get overwritten when there are multiple requests being handled;
response.json() calls .end() itself; calling response.end() again yourself doesn't hurt, but it isn't necessary;
since you're not closing the socket(.io) connection anywhere, it will probably stay open forever;
it would be more appropriate not to set up a new socket.io connection for each request, but to create one just once at app startup and re-use it (see the sketch below).
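To illustrate that last point, here is a minimal sketch of what a single shared connection could look like; it reuses config.app_url and config.socket_key from your code, and the shared-socket.js module name is just an example:
// shared-socket.js: create one socket.io-client connection at app startup
var io = require('socket.io-client');
var config = require('./config'); // assumed to be where your config lives
var socket = io.connect(config.app_url);
module.exports = socket;
In each request handler you would then require the module (Node caches modules, so every handler gets the same instance) and emit directly; socket.io-client buffers emits until the connection is up, so you don't need to wait for a connect event:
var socket = require('./shared-socket');
// inside the active_settings.save callback:
socket.emit('update_list', {key: config.socket_key});
response.json({
    status: true,
    list: list
}); // no response.end() needed; .json() ends the response itself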

Related

NodeJS - Response stream

I built a simple API endpoint with NodeJS using Sails.js.
When someone accesses my API endpoint, the server starts to wait for data, and whenever new data appears, it broadcasts it using sockets. Each client should receive his own stream of data based on his user input.
var Cap = require('cap').Cap;

collect: function (req, res) {
    var iface = req.param("ip");
    var c = new Cap(),
        device = Cap.findDevice(iface);
    c.on('data', function(myData) {
        sails.sockets.blast('message', {"host": myData});
    });
}
The response does not complete (I never send a res.json(), so the browser keeps loading), but the above functionality works.
2 problems:
I'm trying to subscribe and unsubscribe to this API endpoint from my client (using RxJS). When I subscribe, I start to receive data via sockets, but I can't unsubscribe from the API endpoint (the browser expects the request to be completed).
Each client should subscribe to his own socket room based on the request IP parameter (see updated code). Currently it blasts the message to everyone.
How can I create a stream/service-like API endpoint with Sails.js that will emit new data to each user based on his input?
My goal is to be able to subscribe / unsubscribe to this API endpoint from each client.
Revised Answer
Let's assume your API endpoint is defined in config/routes.js like this:
...
    'get /collect': 'SomeController.collectSubscribe',
    'delete /collect': 'SomeController.collectUnsubscribe',
Since each Cap instance is tied to one device, we need one instance for each subscription. Instead of using the sails join/leave methods, we keep track of Cap instances in memory and just broadcast to the request socket's id. This works because Sails sockets are subscribed to their own ids by default.
In api/controllers/SomeController.js:
// In order for the `Cap` instances to persist after `collectSubscribe`
// finishes, we store them all in an Object, keyed by the socket they
// were created for.
var Cap = require('cap').Cap;
var caps = {/* req.socket.id: <instance of Cap>, */};

module.exports = {
    ...
    collectSubscribe: function(req, res) {
        if (!req.isSocket) return res.badRequest("I need a websocket! Help!");
        if (!!caps[req.socket.id]) return res.badRequest("Dude, you are already subscribed.");
        caps[req.socket.id] = new Cap();
        var c = caps[req.socket.id]; // remember that `c` is a reference to our new `Cap`, not a copy.
        var device = Cap.findDevice(req.param('ip')); // findDevice is a static method on Cap
        c.open(device, ...);
        c.on('data', function(myData) {
            sails.sockets.broadcast(req.socket.id, 'message', {host: myData});
        });
        return res.ok();
    },

    collectUnsubscribe: function(req, res) {
        if (!req.isSocket) return res.badRequest("I need a websocket! Help!");
        if (!caps[req.socket.id]) return res.badRequest("I can't unsubscribe you unless you actually subscribe first.");
        caps[req.socket.id].removeAllListeners('data');
        delete caps[req.socket.id];
        return res.ok();
    }
};
Basically, it goes like this: when a browser request triggers collectSubscribe, a new Cap instance listens to the provided IP. When the browser triggers collectUnsubscribe, the server retrieves that Cap instance, tells it to stop listening, and then deletes it.
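For completeness, here is a sketch of how a browser client could drive these endpoints with the sails.io.js socket client (the 'message' event name matches the broadcast above; the IP value is just an example):
// subscribe: a socket GET to /collect starts the Cap stream for this socket
io.socket.get('/collect?ip=192.168.1.1', function (data, jwres) {
    console.log('subscribed:', jwres.statusCode);
});
// each broadcast from the server arrives as a 'message' event
io.socket.on('message', function (data) {
    console.log('new host data:', data.host);
});
// unsubscribe: a socket DELETE to /collect tears the stream down
io.socket.delete('/collect', function (data, jwres) {
    console.log('unsubscribed:', jwres.statusCode);
});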
Production considerations: please be aware that the list of Caps is NOT PERSISTENT, since it is stored in memory and not in a DB! So if your server is turned off and rebooted (due to a lightning storm, etc.), the list will be cleared; but considering that all websocket connections will be dropped anyway, I don't see any need to worry about this.
Old Answer, Kept for Reference
You can use sails.sockets.join(req, room) and sails.sockets.leave(req, room) to manage socket rooms. Essentially you have a room called "collect", and only sockets joined in that room will receive a sails.sockets.broadcast(room, eventName, data).
More info on how to use sails.sockets here.
In api/controllers/SomeController.js:
collectSubscribe: function(req, res) {
    if (!req.isSocket) return res.badRequest();
    sails.sockets.join(req, 'collect');
    return res.ok();
},

collectUnsubscribe: function(req, res) {
    if (!req.isSocket) return res.badRequest();
    sails.sockets.leave(req, 'collect');
    return res.ok();
}
Finally, we need to tell the server to broadcast messages to our 'collect' room.
Note that this only needs to happen once, so you can do it in a file under the config/ directory.
For this example, I'll put it in config/sockets.js:
module.exports = {
    // ...
};

c.on('data', function(myData) {
    var eventName = 'message';
    var data = {host: myData};
    sails.sockets.broadcast('collect', eventName, data);
});
I am assuming that c is accessible here; if not, you could define it as sails.c = ... to make it globally accessible.

How to inform client of current state of execution

I wish to have a progress bar on the client-side built using AngularJS. This progress bar will inform the end-user of the current state of execution of a query on the server.
The server, in this case, is ExpressJS.
So, using AngularJS I will make a request to the server such as:
$http.post('/data/aPath')
    .success(function (result) {
        // Update the progress here.
    });
What I wish to know is: how can I send responses without ending them, so that AngularJS can receive them as shown above and I can update the progress bar? That is, on the Node.js side:
app.post('/data/aPath', function (request, response) {
    // What should I do here to update the client on the
    // current execution state?
    // Something along the lines of:
    // response.write("finished fetching user details, moving on to updating records")
});
so that the client can then update the progress bar?
I haven't done this myself, to be honest, but I believe one way to approach it would be like this.
On the client side, modify your $http.post function:
function getStatusFromProcessing() {
    $http.post('/data/aPath').success(function (result) {
        // Pseudocode:
        if (result.status is something) { // ex: (result >= 20)
            // update progressbar with new value
        }
        if (result.status is somethingElse) { // ex: (result >= 40)
            // update progressbar with new value
        }
        // All updating of progressbar done for now
        if (result.status !== 100) { // Progress is not done
            return getStatusFromProcessing(); // run the function again
        } else {
            return; // All done, move on
        }
    });
}
Now, the server side:
app.post('/data/aPath', function (request, response) {
    // You need some kind of function/service which returns
    // the current state of the processing.
    var currentStatus = getStatusFromRunningProcess();
    // getStatusFromRunningProcess() should return a value between 0 and 100.
    return response.json({ status: currentStatus });
});
A few notes:
Code is not tested (obviously).
Should you choose to do something like this, the processing should take a fairly substantial amount of time; otherwise the polling isn't worth it.
Maybe add a small timeout of a couple of (hundred) milliseconds in your client code inside if(result.status !== 100), to avoid spamming the server with requests; see the sketch below. But that is fine-tuning :)
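As a minimal sketch of that fine-tuning (assuming Angular's $timeout service is injected, and with updateProgressBar standing in for whatever actually updates the bar):
function getStatusFromProcessing() {
    $http.post('/data/aPath').success(function (result) {
        updateProgressBar(result.status); // hypothetical helper
        if (result.status !== 100) {
            // wait 200 ms before polling again, instead of hammering the server
            $timeout(getStatusFromProcessing, 200);
        }
    });
}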
Update, alternative solution using sockets
If you don't want to make several requests from the client to the server, you can turn it around: the server sends a message to the client whenever the process is updated. This most likely requires less bandwidth and fewer requests to the server. This method is possible using sockets.
When using sockets, the socket.io framework is very popular, and there are lots of tutorials online.
In short:
The server sends a message when the status of the processing is updated
The client receives this information and updates the progress bar
Here is a SO post regarding sockets and a progress bar. It is about file uploads, but the concept is the same.
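A minimal sketch of that flow with socket.io (the 'progress' event name and the doWork/onProgress shape are assumptions for illustration):
// server: emit a progress event whenever the long-running work advances
io.sockets.on('connection', function (socket) {
    doWork(function onProgress(percent) { // doWork is a hypothetical job runner
        socket.emit('progress', { status: percent });
    });
});

// client: receive the event and update the bar
socket.on('progress', function (data) {
    updateProgressBar(data.status); // hypothetical helper, as above
});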

Reasonable design of using socket.io for RPC

I am building a web app that uses socket.io to trigger remote procedure calls that pass session-specific data back to the client. As my app gets bigger and gains more users, I wanted to check whether my design is reasonable.
The server that receives websocket communications and triggers RPCs looks something like this:
s.on('run', function(input) {
    client.invoke(input.method, input.params, s.id, function(error, res, more) {
        s.emit('output', {
            method: input.method,
            error: error,
            response: res,
            more: more,
            id: s.id
        });
    });
});
However, this means that the client has to first emit the method invocation, and then listen to all method returns and pluck out its correct return value:
socket.on('output', function(res) {
    if (res.id === socket.sessionid) {
        if (!res.error) {
            if (res.method === 'myMethod') {
                var myResponse = res.response;
                // Do more stuff with my response
            }
        }
    }
});
It is starting to seem like a messy design as I add more and more functions... Is there a better way to do this?
The traditional AJAX way of attaching a callback to each function would be a lot nicer, but I want to keep the benefits of websockets (e.g. less overhead for rapid communications).
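Note that socket.io supports acknowledgement callbacks, which provide exactly this per-invocation callback style over a websocket: when the client passes a function as the last argument to emit, the server receives it as the last argument of its handler and can call it to reply to just that caller. A minimal sketch, reusing the names from the code above:
// client: the callback fires only for this particular invocation
socket.emit('run', { method: 'myMethod', params: [/* ... */] }, function (error, response) {
    // Do more stuff with my response
});

// server: `ack` is the client's callback
s.on('run', function (input, ack) {
    client.invoke(input.method, input.params, s.id, function (error, res, more) {
        ack(error, { response: res, more: more });
    });
});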

Node.js Express rendering multiple subsequent views

I want to do something like:
// client -> notifies server that client is connected.
// server -> begins fetching information from the DB (a series of both async and synchronous requests).
// as sets of data become available on the server -> server pushes updates to the client via res.render()
Basically I have a menu item on the client, and I want to update that menu as the data the server fetches gets ready. Is there any way to do this? I notice I can't do
res.render('something');
// again
res.render('somethingElse');
Because once render is called, the response is sent, and render cannot be called again:
"Error: Can't set headers after they are sent."
Any suggestions?
You might benefit from using WebSockets:
http://en.wikipedia.org/wiki/WebSocket
This post has a little bit of info:
Which websocket library to use with Node.js?
HTTP works via request/response. Typically once the response is sent, the connection is terminated.
To stream data from the server to the client, you can use websockets. There is a very popular node.js module called socket.io, which simplifies using websockets.
Using socket.io, the client code would look like this:
var socket = io.connect('http://yourserver.com');

socket.on('data', function (data) {
    updateMenu(data);
});
And the server code:
var io = require('socket.io').listen(80);

io.sockets.on('connection', function (socket) {
    socket.emit('data', data);
    getMoreDataFromDb(function(data) {
        socket.emit('data', data);
    });
    // etc..
});
Alternatively, if you want a simpler solution, you can just make multiple small ajax requests to the server, until you get all your data:
(function getData(dataId) {
    $.ajax({
        url: "yourserver.com/getdata",
        data: dataId || {},
        success: function(data) {
            updateMenu(data);
            if (data) getData({ lastDataReceived: data.lastId }); // server is still returning data, request more
        }
    });
})();

How to disable Multiplexing with Socket.io

I am using Socket.io to stream live tweets to my users using Twitter's Streaming API (my implementation is more or less based on this tutorial).
The problem is that every time a connection event is fired by Socket.io, the newly connected client causes every other client connected to the server to cease updating. It would take too long to go through all the hacks I tried, but I have played with it enough to believe the problem is caused by Socket.io's multiplexing of connections from multiple clients (enabled by default as a performance boost, so that multiple clients or connections share the same underlying socket). In short, I don't think it would be possible for new connections to affect older connections in this manner without connection multiplexing: if a new, independent connection with its own underlying (TCP) socket were created every time a client connected, one connection would know nothing about the other and therefore couldn't affect any other client's state, as is currently happening. This also leads me to believe that simply disabling the multiplexing functionality would be the simplest way around the problem. I am not concerned about scaling, because Node.js already handles all the concurrency I'm likely to need quite adequately.
I have gone through Socket.io's documentation but could not see where the ability to "demultiplex" the connections is exposed via the API, so if anyone knows how to do this I'd greatly appreciate your response.
My code below is pretty standard and simple. But just to be clear, the issue is that whenever a new client connects to Socket.io, every other client stops receiving new tweets; updates are no longer pushed to the older clients unless I refresh the browser, in which case the newly refreshed client will begin to receive fresh tweets again, while the other still-connected clients stop updating.
Server-side Code:
// Code also uses ntwitter (https://github.com/AvianFlu/ntwitter) as an abstraction over Twitter's Streaming API
io.sockets.on('connection', function (socket) {
    tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
        stream.on('data', function (data) {
            // The following lines simply pre-process data sent from Twitter so junk isn't
            // unnecessarily sent to the client.
            if (data.user) {
                var tweets = {
                    text: data.text,
                    image: data.user.profile_image_url,
                    user: data.user.name
                };
                var t = JSON.stringify(tweets);
                console.log(t);
                socket.send(t);
            }
        });
    });
});
Client-side Code:
// Notice the option that I passed in as the second argument. This supposedly forces
// every new client to create a new connection with the server, but it either doesn't
// work or I'm implementing it incorrectly. It is the very last configuration option
// listed in the documentation linked to above.
var socket = io.connect('http://' + location.host, {'force new connection': true});

socket.on('message', function (tweet) {
    var t = JSON.parse(tweet);
    if (t.image) {
        $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
    }
});
If I am thinking of this incorrectly or if there's something wrong with my code I am definitely open to any suggestions. I'd also be happy to reply with any additional details.
I would try something like this.
Server-side:
io.sockets.on('connection', function (socket) {
    // Other connection-y goodness here.
});
tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
    stream.on('data', function (data) {
        // The following lines simply pre-process data sent from Twitter so junk isn't
        // unnecessarily sent to the client.
        if (data.user) {
            var tweets = {
                text: data.text,
                image: data.user.profile_image_url,
                user: data.user.name
            };
            var t = JSON.stringify(tweets);
            console.log(t);
            io.sockets.emit("tweet", t);
        }
    });
});
Client-side:
var socket = io.connect('http://' + location.host, {'force new connection': true});

socket.on('tweet', function (tweet) {
    var t = JSON.parse(tweet);
    if (t.image) {
        $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
    }
});
Basically: have the stream from Twitter outside your socket handler, and on each new tweet emit a message to all connected clients.
