Node.js Express rendering multiple subsequent views

I want to do something like:
//client -> notifies server that client is connected.
//server -> begins fetching information from DB (series of both async and synchronous requests).
//as sets of data become available on server -> server pushes updates to client via res.render()
Basically I have a menu item on the client, and I want to update that menu as the data the server fetches becomes ready. Is there any way to do this? I notice I can't do:
res.render('something');
// again
res.render('somethingElse');
Because once render is called, the response is sent, and render cannot be called again:
"Error: Can't set headers after they are sent."
Any suggestions?

You might benefit from using WebSockets:
http://en.wikipedia.org/wiki/WebSocket
This post has a little bit of info:
Which websocket library to use with Node.js?

HTTP works via request/response. Typically once the response is sent, the connection is terminated.
To stream data from the server to the client, you can use websockets. There is a very popular Node.js module called socket.io, which simplifies working with websockets.
Using socket.io, the client code would look like this:
var socket = io.connect('http://yourserver.com');
socket.on('data', function (data) {
  updateMenu(data);
});
And the server code:
var io = require('socket.io').listen(80);
io.sockets.on('connection', function (socket) {
  socket.emit('data', initialData); // push whatever data is already available
  getMoreDataFromDb(function (data) {
    socket.emit('data', data);
  });
  // etc..
});
Alternatively, if you want a simpler solution, you can just make multiple small ajax requests to the server, until you get all your data:
(function getData(dataId) {
  $.ajax({
    url: "yourserver.com/getdata",
    data: dataId || {},
    success: function (data) {
      updateMenu(data);
      if (data) getData({ lastDataReceived: data.lastId }); // server is still returning data, request more
    }
  });
})();
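For completeness, the server side of that polling loop could look something like this; a minimal sketch assuming an Express app and a hypothetical fetchNextBatch helper that looks up the next chunk of data after a given id (neither is part of the original answer):
app.get('/getdata', function (req, res) {
  // fetchNextBatch is a hypothetical helper: it passes the next chunk of
  // data after the given id, or null when there is nothing left to send
  fetchNextBatch(req.query.lastDataReceived, function (err, batch) {
    res.json(err ? null : batch); // a null body makes the client's `if (data)` check stop polling
  });
});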

Related

NodeJS - Response stream

I built a simple API endpoint with NodeJS using Sails.js.
When someone accesses my API endpoint, the server starts to wait for data, and whenever new data appears, it broadcasts it using sockets. Each client should receive his own stream of data based on his user input.
var Cap = require('cap').Cap;

collect: function (req, res) {
  var iface = req.param("ip");
  var c = new Cap(),
      device = Cap.findDevice(iface);
  c.on('data', function (myData) {
    sails.sockets.blast('message', {"host": myData});
  });
}
The response does not complete (I never send a res.json()): the browser keeps loading, but the above functionality works.
2 Problems:
I'm trying to subscribe and unsubscribe to this API endpoint from my client (using RxJS). When I subscribe, I start to receive data via sockets, but I can't unsubscribe from the API endpoint (the browser expects the request to complete).
Each client should subscribe to his own socket room based on the request IP parameter (see updated code). Currently it blasts the message to everyone.
How I can create a stream/service-like API endpoint with Sails.js that will emit new data to each user based on his input?
My goal is to be able to subscribe / unsubscribe to this API endpoint from each client.
Revised Answer
Let's assume your API endpoint is defined in config/routes.js like this:
...
'get /collect': 'SomeController.collectSubscribe',
'delete /collect': 'SomeController.collectUnsubscribe',
Since each Cap instance is tied to one device, we need one instance for each subscription. Instead of using the sails join/leave methods, we keep track of Cap instances in memory and just broadcast to the request socket's id. This works because Sails sockets are subscribed to their own ids by default.
In api/controllers/SomeController.js:
// In order for the `Cap` instances to persist after `collectSubscribe` finishes,
// we store them all in an Object, keyed by the socket they were created for.
var caps = {/* req.socket.id: <instance of Cap>, */};

module.exports = {
  ...
  collectSubscribe: function(req, res) {
    if (!req.isSocket) return res.badRequest("I need a websocket! Help!");
    if (!!caps[req.socket.id]) return res.badRequest("Dude, you are already subscribed.");
    caps[req.socket.id] = new Cap();
    var c = caps[req.socket.id]; // remember that `c` is a reference to our new `Cap`, not a copy.
    var device = Cap.findDevice(req.param('ip')); // findDevice is a static method on Cap
    c.open(device, ...);
    c.on('data', function(myData) {
      sails.sockets.broadcast(req.socket.id, 'message', {host: myData});
    });
    return res.ok();
  },

  collectUnsubscribe: function(req, res) {
    if (!req.isSocket) return res.badRequest("I need a websocket! Help!");
    if (!caps[req.socket.id]) return res.badRequest("I can't unsubscribe you unless you actually subscribe first.");
    caps[req.socket.id].removeAllListeners('data');
    delete caps[req.socket.id];
    return res.ok();
  }
};
Basically, it goes like this: when a browser request triggers collectSubscribe, a new Cap instance listens to the provided IP. When the browser triggers collectUnsubscribe, the server retrieves that Cap instance, tells it to stop listening, and then deletes it.
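On the browser side, with the default sails.io.js socket client, subscribing and unsubscribing would then look something like this (a sketch; the IP value and the logging are just illustrations):
// subscribe: requests made through io.socket arrive with req.isSocket === true
io.socket.get('/collect', { ip: '127.0.0.1' }, function (body, jwr) {
  console.log('subscribed:', jwr.statusCode);
});

// receive the per-socket broadcasts sent by the controller
io.socket.on('message', function (data) {
  console.log('got host data:', data.host);
});

// later, unsubscribe
io.socket.delete('/collect', {}, function (body, jwr) {
  console.log('unsubscribed:', jwr.statusCode);
});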
Production Considerations: please be aware that the list of Caps is NOT PERSISTENT (since it is stored in memory and not a DB)! So if your server is turned off and rebooted (due to lightning storm, etc), the list will be cleared, but considering that all websocket connections will be dropped anyway, I don't see any need to worry about this.
Old Answer, Kept for Reference
You can use sails.sockets.join(req, room) and sails.sockets.leave(req, room) to manage socket rooms. Essentially you have a room called "collect", and only sockets joined in that room will receive a sails.sockets.broadcast(room, eventName, data).
More info on how to use sails.sockets is available here.
In api/controllers/SomeController.js:
collectSubscribe: function(req, res) {
  if (!req.isSocket) return res.badRequest();
  sails.sockets.join(req, 'collect');
  return res.ok();
},

collectUnsubscribe: function(req, res) {
  if (!req.isSocket) return res.badRequest();
  sails.sockets.leave(req, 'collect');
  return res.ok();
}
Finally, we need to tell the server to broadcast messages to our 'collect' room.
Note that this only needs to happen once, so you can do it in a file under the config/ directory.
For this example, I'll put this in config/sockets.js:
// config/sockets.js
module.exports = {
  // ... existing socket config ...
};

// below the config export, wire the device stream to the room:
c.on('data', function(myData) {
  var eventName = 'message';
  var data = {host: myData};
  sails.sockets.broadcast('collect', eventName, data);
});
I am assuming that c is accessible here; if not, you could define it as sails.c = ... to make it globally accessible.

Terminate EventSource event listener?

I'm trying to work around a problem to do with rest streaming between the Nest API and a service (ST) that does not support streaming.
To get around this, I have built a service on Sails which takes a post request from ST containing the Nest Token, and then triggers an EventSource event listener that sends the data back to ST.
It is heavily based off the Nest rest-streaming example here:
https://github.com/nestlabs/rest-streaming and my code is as follows:
// EventSource comes from the `eventsource` npm module used in the Nest example:
// var EventSource = require('eventsource');
module.exports = {
  startStream: function(req, res) {
    var nestToken = req.body.nestToken,
        stToken = req.body.stToken,
        endpointURL = req.body.endpointURL,
        source = new EventSource(sails.config.nest.nest_api_url + '?auth=' + nestToken);

    source.addEventListener('put', function(e) {
      var d = JSON.parse(e.data);
      var data = { devices: d.data.devices, structures: d.data.structures },
          config = { headers: {'Authorization': 'Bearer ' + stToken} };
      sendData(endpointURL, data, config); // helper defined elsewhere
    });

    source.addEventListener('open', function(e) {
      console.log("Connection opened");
    });

    source.addEventListener('auth_revoked', function(e) {
      console.log("Auth token revoked");
    });

    source.addEventListener('error', function(e) {
      if (e.readyState == EventSource.CLOSED) {
        console.error('Connection was closed! ', e);
      } else {
        console.error('An unknown error occurred: ', e);
      }
    }, false);
  }
};
The problem I foresee, though, is that once a request is received by the Node server, it starts the event listener; however, I cannot for the life of me figure out how to kill the event listener.
If I cannot figure out a way to stop this, then every EventListener will run indefinitely, which is obviously not suitable.
Has anyone got any suggestions on how to overcome the issue?
Each SSE client connection is a dedicated socket.
If a particular client doesn't want event streaming, don't make the connection. If they start event streaming but want to turn it off, call source.close(); source = null; on the client.
If from the server side you want to stop sending the messages, close the socket.
You didn't show the server-side code, but if it is running a dedicated process per SSE client then you just exit the process. If you are maintaining a list of sockets, one per connected client, close the socket. On node.js you might be running a function on setInterval; to close the connection you call clearInterval() and response.end().
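Applied to the Sails code above, one way to make the listener killable is to keep each EventSource in memory, keyed by something that identifies the client, and add a stop action that closes it. A minimal sketch, assuming the client is identified by its stToken and that a stopStream route exists (both are assumptions, not part of the original code):
var streams = {}; // stToken -> EventSource, lives for the life of the process

module.exports = {
  startStream: function(req, res) {
    var stToken = req.body.stToken;
    var source = new EventSource(sails.config.nest.nest_api_url + '?auth=' + req.body.nestToken);
    // ... attach the 'put'/'open'/'error' listeners as above ...
    streams[stToken] = source; // remember it so we can close it later
    return res.ok();
  },

  stopStream: function(req, res) {
    var source = streams[req.body.stToken];
    if (!source) return res.badRequest("No stream running for this token.");
    source.close(); // closes the underlying connection and stops all events
    delete streams[req.body.stToken];
    return res.ok();
  }
};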

SocketIO - processing data before broadcasting

I am currently having a problem with a small Node.JS application. It is extremely similar to, for example, the Socket.IO sample chat application: whenever an user sends data to the server, I need to process the data and then broadcast it to all users. However, the way data is processed depends on the user that receives it. I tried altering the Socket#emit method (in order to process the data before sending it) for each socket, but the broadcasting doesn't use it. Is there any way I can get this solved?
Thanks in advance.
EDIT
Here is what I tried:
var io = require('socket.io');

io.use(function(socket, next) {
  var old_emit = socket.emit;
  socket.emit = function() {
    // do something with the data (arguments[1] in this case)
    // the processing of the data depends on a value that is
    // unique to each connection, and can't be sent over to the client
    old_emit.apply(socket, arguments);
  };
});
I tried the code above because I thought, initially, that broadcasting would call emit somewhere else for each connection.
I am not sure what you want to do; there is no "io()"-like function there, so you are doing it wrong.
Anyway, to process data, based on the sample chat application:
// when the client emits 'new message', this listens and executes
socket.on('new message', function (data) {
  // call a function which you want to process the data with
  var newData = messageProcessingFunc(data);
  // and then broadcast it:
  // we tell the client to execute 'new message'
  socket.broadcast.emit('new message', {
    username: socket.username,
    message: newData
  });
});
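Since the processing here needs to depend on the user that receives the message, a single broadcast.emit (one payload for everyone) isn't enough. One option is to loop over the connected sockets and emit to each one individually; a sketch assuming socket.io 1.x, where io.sockets.connected is an object of sockets keyed by id, and where processFor is a hypothetical per-recipient processing function:
socket.on('new message', function (data) {
  Object.keys(io.sockets.connected).forEach(function (id) {
    var target = io.sockets.connected[id];
    if (target === socket) return; // skip the sender, like broadcast.emit does
    target.emit('new message', {
      username: socket.username,
      message: processFor(target, data) // customize the payload per recipient
    });
  });
});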

Node.js server side connection to Socket.io

I have a Node.js application with a frontend app and a backend app. The backend manages the list and "pushes" an update to the frontend app; the call to the frontend app triggers a list update so that all clients receive the correct list data.
The problem is on the backend side: when I press the button, I perform an AJAX call, and that AJAX call runs the following code (I trimmed some operations out of it):
Lists.findOne({_id: active_settings.active_id}, function(error, lists_result) {
  var song_list = new Array();
  for (i = 0; i < lists_result.songs.length; i++) {
    song_list.push(lists_result.songs[i].ref);
  }
  Song.find({
    '_id': {$in: song_list}
  }, function(error, songs) {
    // DO STUFF WITH THE SONGS
    // UPDATE SETTINGS (code trimmed)
    active_settings.save(function(error, updated_settings) {
      list = {
        settings: updated_settings,
      };
      var io = require('socket.io-client');
      var socket = io.connect(config.app_url);
      socket.on('connect', function () {
        socket.emit('update_list', {key: config.socket_key});
      });
      response.json({
        status: true,
        list: list
      });
      response.end();
    });
  });
});
However, response.end never seems to work and the call keeps hanging; furthermore, the list doesn't always get refreshed, so there is an issue with the socket.emit code. And the socket connection stays open, I assume because the response isn't ended?
I have never done this server side before so any help would be much appreciated. (the active_settings etc exists)
I see some issues that might or might not be causing your problems:
list isn't properly scoped, since you don't prefix it with var; essentially, you're creating a global variable which might get overwritten when there are multiple requests being handled;
response.json() calls .end() itself; it doesn't hurt to call response.end() again yourself, but it's not necessary;
since you're not closing the socket(.io) connection anywhere, it will probably always stay open;
it sounds more appropriate to not set up a new socket.io connection for each request, but to create one just once at app startup and re-use it (see the sketch below).
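A minimal sketch of that singleton approach, re-using the config object from the question (the lib/socket.js filename is just an illustration):
// lib/socket.js -- evaluated once, when the app starts
var io = require('socket.io-client');
var socket = io.connect(config.app_url);
module.exports = socket;

// in the request handler, re-use the shared connection:
var socket = require('./lib/socket');
socket.emit('update_list', {key: config.socket_key});
response.json({status: true, list: list});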

How to disable Multiplexing with Socket.io

I am using Socket.io to stream live tweets to my users using Twitter's Streaming API (my implementation is more or less based on this tutorial).
The problem is that every time a connection event is fired by Socket.io, the newly connected client causes every other client connected to the server to cease updating.

While it would take too long to go through all the hacks that I tried, I will say that I played with it enough that I believe the problem is caused by Socket.io's multiplexing of the connections from multiple clients (enabled by default) as a performance boost to allow multiple clients or connections to share the same underlying socket. In short, I believe this to be the case because I don't think it would be possible for new connections to affect older connections in this manner if not for the connection multiplexing. If a new, independent connection with its own underlying (TCP) socket were created every time a client connected, it would be impossible for this to occur, since one connection would know nothing about the other and therefore couldn't affect any other client's state as is currently happening.

This also leads me to believe that simply disabling the multiplexing functionality would be the simplest way to get around this problem, since I am not concerned about scaling: Node.js already handles all the concurrency I'm likely to need quite adequately.
I have gone through Socket.io's documentation but could not see where the ability to "demultiplex" the connections is exposed via the API, so if anyone knows how to do this I'd greatly appreciate your response.
My code below is pretty standard and simple. But just to be clear: whenever a new client connects to Socket.io, every other client stops receiving new tweets, and updates are no longer pushed to the older clients unless I refresh the browser; in that case the newly refreshed client will begin to update and receive fresh tweets again, but the other still-connected clients will then stop updating.
Server-side Code:
// Code also uses ntwitter (https://github.com/AvianFlu/ntwitter) as an abstraction over Twitter's Streaming API
io.sockets.on('connection', function (socket) {
  tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
    stream.on('data', function (data) {
      // The following lines simply pre-process data sent from Twitter so junk isn't
      // unnecessarily sent to the client.
      if (data.user) {
        tweets = {
          text: data.text,
          image: data.user.profile_image_url,
          user: data.user.name
        };
        var t = JSON.stringify(tweets);
        console.log(t);
        socket.send(t);
      }
    });
  });
});
Client-Side Code
// Notice the option that I passed in as the second argument. This supposedly forces every
// new client to create a new connection with the server but it either doesn't work or I'm
// implementing it incorrectly. It is the very last configuration option listed in the
// documentation linked to above.
var socket = io.connect('http://' + location.host, {'force new connection': true});
socket.on('message', function (tweet) {
  var t = JSON.parse(tweet);
  if (t.image) {
    $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
  }
});
If I am thinking of this incorrectly or if there's something wrong with my code I am definitely open to any suggestions. I'd also be happy to reply with any additional details.
I would try something like this:
Serverside:
io.sockets.on('connection', function (socket) {
  // Other connection-y goodness here.
});

tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
  stream.on('data', function (data) {
    // The following lines simply pre-process data sent from Twitter so junk isn't
    // unnecessarily sent to the client.
    if (data.user) {
      var tweets = {
        text: data.text,
        image: data.user.profile_image_url,
        user: data.user.name
      };
      var t = JSON.stringify(tweets);
      console.log(t);
      io.sockets.emit("tweet", t);
    }
  });
});
Client-side:
var socket = io.connect('http://' + location.host, {'force new connection': true});
socket.on('tweet', function (tweet) {
  var t = JSON.parse(tweet);
  if (t.image) {
    $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
  }
});
Basically, have the stream from Twitter outside your connection handler, and then on a new tweet emit a message to all connected clients.
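If you also want the Twitter stream to run only while at least one client is connected, a possible refinement is to count clients and tear the stream down when the last one disconnects. A sketch, assuming ntwitter's stream object exposes a destroy() method (as its README documents):
var clientCount = 0;
var twitterStream = null;

io.sockets.on('connection', function (socket) {
  clientCount++;
  if (!twitterStream) {
    tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
      twitterStream = stream;
      stream.on('data', function (data) {
        // ... same pre-processing and io.sockets.emit("tweet", t) as above ...
      });
    });
  }
  socket.on('disconnect', function () {
    clientCount--;
    if (clientCount === 0 && twitterStream) {
      twitterStream.destroy(); // stop the underlying connection to Twitter
      twitterStream = null;
    }
  });
});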
