Terminate EventSource event listener? - node.js

I'm trying to work around a problem with REST streaming between the Nest API and a service (ST) that does not support streaming.
To get around this, I have built a service on Sails which takes a POST request from ST containing the Nest token, and then starts an EventSource event listener that sends the data back to ST.
It is heavily based on the Nest REST streaming example here:
https://github.com/nestlabs/rest-streaming and my code is as follows:
startStream: function(req, res) {
    var nestToken = req.body.nestToken,
        stToken = req.body.stToken,
        endpointURL = req.body.endpointURL,
        source = new EventSource(sails.config.nest.nest_api_url + '?auth=' + nestToken);

    source.addEventListener('put', function(e) {
        var d = JSON.parse(e.data);
        var data = { devices: d.data.devices, structures: d.data.structures },
            config = { headers: { 'Authorization': 'Bearer ' + stToken } };
        sendData(endpointURL, data, config);
    });

    source.addEventListener('open', function(e) {
        console.log("Connection opened");
    });

    source.addEventListener('auth_revoked', function(e) {
        console.log("Auth token revoked");
    });

    source.addEventListener('error', function(e) {
        if (e.readyState == EventSource.CLOSED) {
            console.error('Connection was closed! ', e);
        } else {
            console.error('An unknown error occurred: ', e);
        }
    }, false);
}
};
The problem I foresee, though, is that once a request is received by the node server, it starts the event listener, but I cannot for the life of me figure out how to kill the event listener.
If I cannot figure out a way to stop this, then every EventSource will run indefinitely, which is obviously not suitable.
Has anyone got any suggestions on how to overcome this issue?

Each SSE client connection is a dedicated socket.
If a particular client doesn't want event streaming, don't make the connection. If they start event streaming but want to turn it off, call source.close(); source = null;
If you want to stop sending messages from the server side, close the socket.
You didn't show the server-side code, but if it is running a dedicated process per SSE client then you just exit the process. If you are maintaining a list of sockets, one per connected client, close the socket. On node.js you might be running a function on setInterval; to close the connection you call clearInterval() and response.end();.
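To make the teardown concrete: one approach is to keep a registry of open EventSource connections, keyed by something the client can send back later (here the ST token, as an assumption), and close them when a stop request comes in. A minimal sketch, not tied to the Sails controller above:

```javascript
// Sketch (assumption: streams are keyed by the ST token from the request).
var streams = {};

function registerStream(token, source) {
    // Close any previous stream for this token so listeners don't pile up.
    if (streams[token]) streams[token].close();
    streams[token] = source;
}

function stopStream(token) {
    var source = streams[token];
    if (!source) return false;
    source.close();        // tears down the underlying connection
    delete streams[token];
    return true;
}
```

A second route in the Sails app could then call stopStream(req.body.stToken) when ST no longer wants updates.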

Related

How to poll another server periodically from a node.js server?

I have a node.js server A with mongodb as the database.
There is another remote server B (it doesn't need to be node based) which exposes an HTTP GET API '/status' and returns either 'FREE' or 'BUSY' as the response.
When a user hits a particular API endpoint in server A (say POST /test), I wish to start polling server B's status API every minute, until server B returns 'FREE' as the response. The user doesn't need to wait till server B returns a 'FREE' response (polling B is a background job in server A). Once server A gets a 'FREE' response from B, it shall send out an email to the user.
How can this be achieved in server A, keeping in mind that the number of concurrent users can grow large?
I suggest you use Agenda: https://www.npmjs.com/package/agenda
With Agenda you can create recurring schedules, under which you can schedule pretty much anything, flexibly.
I also suggest the request module for making HTTP GET/POST requests:
https://www.npmjs.com/package/request
Going from the example in the node.js docs, I'd go with something like the code below. I tested it and it works. BTW, I'm assuming here that the API response is something like {"status":"BUSY"} or {"status":"FREE"}.
const http = require('http');

const poll = {
    pollB: function() {
        http.get('http://serverB/status', (res) => {
            const { statusCode } = res;
            let error;
            if (statusCode !== 200) {
                error = new Error(`Request Failed.\n` +
                                  `Status Code: ${statusCode}`);
            }
            if (error) {
                console.error(error.message);
                res.resume();
            } else {
                res.setEncoding('utf8');
                let rawData = '';
                res.on('data', (chunk) => { rawData += chunk; });
                res.on('end', () => {
                    try {
                        const parsedData = JSON.parse(rawData);
                        // The important logic comes here
                        if (parsedData.status === 'BUSY') {
                            setTimeout(poll.pollB, 10000); // request again in 10 secs
                        } else {
                            // Call the background process you need to
                        }
                    } catch (e) {
                        console.error(e.message);
                    }
                });
            }
        }).on('error', (e) => {
            console.error(`Got error: ${e.message}`);
        });
    }
};

poll.pollB();
You probably want to play with this script and strip out the code you don't need, but that's homework ;)
Update:
For coping with a lot of concurrency in node.js, I'd recommend implementing a cluster or using a framework. Here are some links to start researching the subject:
How to fully utilise server capacity for Node.js Web Apps
How to Create a Node.js Cluster for Speeding Up Your Apps
Node.js v7.10.0 Documentation :: cluster
ActionHero.js :: Fantastic node.js framework for implementing an API, background tasks, cluster using http, sockets, websockets
Use a library like request, superagent, or restify-clients to call server B. I would recommend you avoid polling and instead use a webhook when calling B (assuming you are also authoring B). If you can't change B, then setTimeout can be used to schedule subsequent calls on a 1 second interval.

SocketIO - processing data before broadcasting

I am currently having a problem with a small Node.JS application. It is extremely similar to, for example, the Socket.IO sample chat application: whenever an user sends data to the server, I need to process the data and then broadcast it to all users. However, the way data is processed depends on the user that receives it. I tried altering the Socket#emit method (in order to process the data before sending it) for each socket, but the broadcasting doesn't use it. Is there any way I can get this solved?
Thanks in advance.
EDIT
Here is what I tried:
var io = require('socket.io');

io.use(function(socket, next) {
    var old_emit = socket.emit;
    socket.emit = function() {
        // do something with the data (arguments[1] in this case)
        // the processing of the data depends on a value that is
        // unique to each connection, and can't be sent over to the client
        old_emit.apply(socket, arguments);
    };
});
I tried the code above because I thought, initially, that broadcasting would call emit somewhere else for each connection.
I am not sure what you want to do, but there is no io() function used like that; you are doing it wrong.
BTW, to process data, from the sample chat application:
// when the client emits 'new message', this listens and executes
socket.on('new message', function (data) {
    // call a function which you want to process the data with
    var newData = messageProcessingFunc(data);
    // and then broadcast it:
    // we tell the client to execute 'new message'
    socket.broadcast.emit('new message', {
        username: socket.username,
        message: newData
    });
});
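If the processing really has to differ per recipient, broadcast.emit won't help, since it sends one payload to everyone. A sketch of the alternative, looping over connected sockets and emitting individually (processForUser and userValue are hypothetical names, not socket.io API):

```javascript
// Hypothetical per-recipient transform: the result depends on a value
// that is unique to each receiving connection (here called userValue).
function processForUser(message, userValue) {
    return { text: message.text, tag: userValue };
}

// With socket.io you would then emit per socket instead of broadcasting,
// e.g. (sketch, assuming each socket object stores its own userValue):
//
// Object.keys(io.sockets.sockets).forEach(function (id) {
//     var s = io.sockets.sockets[id];
//     if (s !== sender) s.emit('new message', processForUser(data, s.userValue));
// });
```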

Node.js server side connection to Socket.io

I have a Node.js application with a frontend app and a backend app. The backend manages the list and "pushes" an update to the frontend app; the call to the frontend app triggers a list update so that all clients receive the correct list data.
The problem is on the backend side: when I press the button, I perform an AJAX call, and that AJAX call runs the following code (trimmed some operations out of it):
Lists.findOne({ _id: active_settings.active_id }, function(error, lists_result) {
    var song_list = new Array();
    for (i = 0; i < lists_result.songs.length; i++) {
        song_list.push(lists_result.songs[i].ref);
    }
    Song.find({
        '_id': { $in: song_list }
    }, function(error, songs) {
        // DO STUFF WITH THE SONGS
        // UPDATE SETTINGS (code trimmed)
        active_settings.save(function(error, updated_settings) {
            list = {
                settings: updated_settings,
            };
            var io = require('socket.io-client');
            var socket = io.connect(config.app_url);
            socket.on('connect', function () {
                socket.emit('update_list', { key: config.socket_key });
            });
            response.json({
                status: true,
                list: list
            });
            response.end();
        });
    });
});
However the response.end never seems to work; the call keeps hanging. Furthermore, the list doesn't always get refreshed, so there is an issue with the socket.emit code. And the socket connection stays open, I assume because the response isn't ended?
I have never done this server side before so any help would be much appreciated. (the active_settings etc exists)
I see some issues that might or might not be causing your problems:
list isn't properly scoped, since you don't prefix it with var; essentially, you're creating a global variable which might get overwritten when there are multiple requests being handled;
response.json() calls .end() itself; it doesn't hurt to call response.end() again yourself, but it's not necessary;
since you're not closing the socket(.io) connection anywhere, it will probably always stay open;
it seems more appropriate to not set up a new socket.io connection for each request, but to do it just once at app startup and re-use that connection.
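The last point can be sketched as a small provider that creates the socket.io-client connection once and hands back the same instance afterwards (the connect factory is injected here so the pattern itself doesn't depend on socket.io):

```javascript
// Sketch: lazily create one shared connection and reuse it on every request.
function makeSocketProvider(connect) {
    var socket = null;
    return function getSocket(url) {
        if (!socket) socket = connect(url); // created once, on first use
        return socket;
    };
}

// Usage (assumption):
// var getSocket = makeSocketProvider(require('socket.io-client').connect);
// var socket = getSocket(config.app_url); // same instance on later calls
```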

How to disable Multiplexing with Socket.io

I am using Socket.io to stream live tweets to my users using Twitter's Streaming API (my implementation is more or less based on this tutorial).
The problem is that every time a connection event is fired by Socket.io the newly connected client causes every other client connected to the server to cease updating. While it would take too long to go through all the hacks that I tried, I will say that I played with it enough that I believe the problem is caused by Socket.io's multiplexing of the connections from multiple clients (enabled by default) as a performance boost to allow multiple clients or connections to share the same underlying socket.
In short, I believe this to be the case because I don't think it would be possible for new connections to affect older connections in this manner if not for the connection multiplexing. In other words, if a new, independent connection with its own underlying (TCP) socket were created every time a client connected, it would be impossible for this to occur, since one connection would know nothing about the other and therefore couldn't affect any other client's state as is currently happening. This also leads me to believe that simply disabling the multiplexing functionality would be the simplest way to get around this problem, since I am not concerned about scaling: Node.js already handles all the concurrency I'm likely to need very adequately.
I have gone through Socket.io's documentation but could not see where the ability to "demultiplex" the connections is exposed via the API, so if anyone knows how to do this I'd greatly appreciate your response.
My code below is pretty standard and simple. But just to be clear, the issue is that whenever a new client connects to Socket.io every other client stops receiving new tweets and updates are no longer pushed to the older client unless I refresh the browser in which case the newly refreshed client will begin to update and receive fresh tweets again, but the other still connected clients will then stop updating.
Server-side Code:
// Code also uses ntwitter (https://github.com/AvianFlu/ntwitter) as an abstraction over Twitter's Streaming API
io.sockets.on('connection', function (socket) {
    tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
        stream.on('data', function (data) {
            // The following lines simply pre-process data sent from Twitter so junk isn't
            // unnecessarily sent to the client.
            if (data.user) {
                tweets = {
                    text  : data.text,
                    image : data.user.profile_image_url,
                    user  : data.user.name
                };
                var t = JSON.stringify(tweets);
                console.log(t);
                socket.send(t);
            }
        });
    });
});
Client-side Code:
// Notice the option that I passed in as the second argument. This supposedly forces every
// new client to create a new connection with the server but it either doesn't work or I'm
// implementing it incorrectly. It is the very last configuration option listed in the
// documentation linked to above.
var socket = io.connect('http://' + location.host, { 'force new connection': true });

socket.on('message', function (tweet) {
    var t = JSON.parse(tweet);
    if (t.image) {
        $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
    }
});
If I am thinking of this incorrectly or if there's something wrong with my code I am definitely open to any suggestions. I'd also be happy to reply with any additional details.
I would try something like this.
Server-side:
io.sockets.on('connection', function (socket) {
    // Other connection-y goodness here.
});

tweet.stream('statuses/filter', { track: 'new orleans' }, function (stream) {
    stream.on('data', function (data) {
        // The following lines simply pre-process data sent from Twitter so junk isn't
        // unnecessarily sent to the client.
        if (data.user) {
            tweets = {
                text  : data.text,
                image : data.user.profile_image_url,
                user  : data.user.name
            };
            var t = JSON.stringify(tweets);
            console.log(t);
            io.sockets.emit("tweet", t);
        }
    });
});
Client-side:
var socket = io.connect('http://' + location.host, { 'force new connection': true });

socket.on('tweet', function (tweet) {
    var t = JSON.parse(tweet);
    if (t.image) {
        $('.hero-unit').prepend('<div class="media"><a class="pull-left" href="#"><img class="media-object" alt="64x64" style="width: 64px; height: 64px;" src="' + t.image + '"></a><div class="media-body"><h4 class="media-heading">' + t.user + '</h4>' + t.text + '</div></div>');
    }
});
Basically, have the stream from Twitter outside your connection handler, and then on each new tweet emit a message to all connected clients.

twitter-node detecting connection end

I'm using the twitter-node library for node.js and it works well, however I'm having some minor difficulty handling disconnects.
When Twitter disconnects me (I'm connecting a second time from the same server to force a disconnect, so I can make sure I'm handling these sorts of issues) it doesn't produce an error or an end event. I thought the following would handle it:
var twitter = new TwitterNode({
    user     : opts.account,
    password : opts.password,
    track    : opts.hashtags,
    follow   : opts.follow
});

// omitted handlers for receiving tweets/deletes/limit info, but it's there
twitter.addListener('error', function(error) {
    console.log('error occurred: ' + error.message);
}).addListener('end', function(resp) {
    sys.puts("wave goodbye... " + resp.statusCode);
}).stream();
However, I don't get either the message from 'end' or 'error' when I'm disconnected. Anyone familiar with this issue?
For anyone having this same issue:
There's no notification from twitter-node because it doesn't handle the https library's close event. Going into the source and adding:
response.on('close', function() { twit.emit('close', this); });
The library now emits a close event when the connection is closed by the remote server (Twitter), and you can handle it with a listener in your code like this:
twitterStreamReader = new TwitterNode({...});
twitterStreamReader.addListener('close', function(resp) {
    sys.puts('The server connection has been closed. You may want to do something about that.');
});
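A common follow-up once the close event is available is to reconnect with a growing delay instead of hammering Twitter. A sketch of the delay calculation (the reconnect wiring in the comment is an assumption, not twitter-node API):

```javascript
// Exponential backoff, capped: 1s, 2s, 4s, ... up to 60s.
function backoffDelay(attempt, baseMs, maxMs) {
    baseMs = baseMs || 1000;
    maxMs = maxMs || 60000;
    return Math.min(baseMs * Math.pow(2, attempt), maxMs);
}

// Sketch of the wiring (assumes .stream() re-opens the connection):
// var attempts = 0;
// twitterStreamReader.addListener('close', function () {
//     setTimeout(function () { twitterStreamReader.stream(); },
//                backoffDelay(attempts++));
// });
```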
