Test NodeJS socket.io with custom headers - node.js

I'm trying to write a simple test for my NodeJS socket.io app. The problem is that during the handshake phase I require certain values to be present (two cookies and some headers). This is what I have now:
var options = {
  transports: ['websocket'],
  'force new connection': true,
  headers: {'accept-language': 'foo'}
};
it("send data", function(done) {
var client = io.connect('http://localhost:3000', options);
client.once("connect", function(s) {
expect(s.handshake).to.not.be(undefined);
expect(s.handshake.headers).to.be.an('object');
expect(s.handshake.headers['accept-language']).to.be('en');
client.once("send_premise_to_snet", function(id) {
id.should.equal("123");
client.disconnect();
done();
});
client.emit("send_data", 123);
});
});
I would like to be able to set accept-language and the cookies so that they appear in the handshake and are therefore accessible through the handshake property.
In a normal browser request the browser fills in these headers itself, and I would like to do the same in the test phase.

After a successful hit with the right Google query words I managed to find out more. It seems that this is not possible with the current socket.io and socket.io-client implementations.
A few alternative approaches have been attempted, but none of them have made it into the actual packages.
For further information:
https://github.com/Automattic/socket.io/issues/2036
https://github.com/Automattic/socket.io-client/issues/648
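As a workaround (not part of the linked issues), the values can be passed through the connection query string instead of headers, since the query string does show up in the handshake. A minimal sketch, assuming the server reads socket.handshake.query; the parameter names are made up for illustration:
var options = {
  transports: ['websocket'],
  'force new connection': true,
  // headers cannot be set from socket.io-client, but query parameters survive the handshake
  query: 'lang=en&session=abc123'
};

var client = io.connect('http://localhost:3000', options);

// on the server side the values are then available as
// socket.handshake.query.lang and socket.handshake.query.session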

Related

Sinon fake server not intercepting requests

I'm trying to use Sinon for the first time because of its fake server functionality, which lets me stub an API response. The test itself is written for Mocha.
However, the fake server doesn't seem to be intercepting the requests.
Code:
describe('when integrated', function() {
  var server;

  beforeEach(function() {
    server = sinon.createFakeServer();
  });

  afterEach(function() {
    server.restore();
  });

  it('can send a message to the notification service', function() {
    server.respondWith("POST", new RegExp('.*/api/notificationmanager/messages.*'),
      [200,
       { "Content-Type": "application/json" },
       '{ "messageId":23561}'
      ]);

    var messageOnly = new PushMessage(initMessageObj);
    var originalUrl = PushMessage.serverUrl;
    messageOnly.setServerAPI("http://a.fake.server/api/notificationmanager/messages");
    console.log("fake server is: ", server);

    messageOnly.notify()
      .then(function(response) {
        messageOnly.setServerAPI(originalUrl);
        return response;
      })
      .then(function(response) {
        response.should.be.above(0);
      });

    console.log(server.requests);
    server.respond();
  });
});
For reference, PushMessage is an object that has a static property serverUrl. I'm just setting that value to a fake URL and then resetting it.
The notify() function makes a POST request with request-promise-native to the serverUrl set in PushMessage's static property.
What seems to be happening is that the POST request is actually attempted against http://a.fake.server/api/notificationmanager/messages, which fails with an error that the address doesn't exist.
Any idea what I'm doing wrong? Thanks!
There have been several issues on the Sinon GitHub repository about this. Sinon's fake server:
Provides a fake implementation of XMLHttpRequest and provides several interfaces for manipulating objects created by it.
Also fakes native XMLHttpRequest and ActiveXObject (when available, and only for XMLHTTP progids). Helps with testing requests made with XHR.
Node doesn't use XHR requests, so Sinon doesn't work for this use case. I wish it did too.
Here's an issue that breaks it down: https://github.com/sinonjs/sinon/issues/1049
Nock is a good alternative that works with Node: https://www.npmjs.com/package/nock
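For illustration, a rough sketch of the same stub with nock, assuming the PushMessage setup from the question stays the same:
var nock = require('nock');

// intercept the POST made by request-promise-native at the HTTP layer
var scope = nock('http://a.fake.server')
  .post('/api/notificationmanager/messages')
  .reply(200, { messageId: 23561 });

// messageOnly.notify() now resolves against the stubbed response,
// and scope.isDone() can be asserted to verify the request was actually made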

How to cancel previous ES request before making a new one

I am using NodeJS with the https://www.npmjs.com/package/elasticsearch package.
The use case is this: when a link is clicked on the page, I make a request to the NodeJS server, which in turn uses the ES node package to fetch the data from the ES server and sends it back to the client.
The issue is that when two requests are made in quick succession (two links clicked in a short span), the response of the first request and then the response of the second request both reach the client. The UI depends on this response, and I would like to show only the second request's response.
So the question is: is there any way to cancel the previous request made to the ES server before starting a new one?
Code:
ES Client:
var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({
  host: 'HostName',
  log: 'trace'
});
Route:
app.get('/data/:reportName', dataController.getReportData);
DataController:
function getReportData(req, res) {
  var query = getQueryForReport(req.params.reportName);
  client.search(query)
    .then(function(response) {
      res.json(parseResponse(response));
    });
}
So the same API /data/:reportName is called twice in succession with different report names. I would like to send back only the second report's data and cancel the first request.
If you're only concerned about the UX, rather than about stressing your ES cluster, then aborting the AJAX request is what you want.
Since you didn't post your client-side code, here is a generic example:
var xhr = $.ajax({
  type: "GET",
  url: "searching_route",
  data: "name=John&location=Boston",
  success: function(msg) {
    alert("Data Saved: " + msg);
  }
});

// kill the request
xhr.abort();
Remember that aborting the request may not prevent the elasticsearch query from being processed, but will prevent the client from receiving the data.
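If you also want to drop stale responses on the Node side, one approach (a sketch, not from the answer above; the counter handling is an assumption) is to tag each search with a sequence number and only answer with the latest one:
var latestRequestId = 0;

function getReportData(req, res) {
  var requestId = ++latestRequestId;
  var query = getQueryForReport(req.params.reportName);

  client.search(query)
    .then(function(response) {
      // a newer request has started since this one; discard the stale result
      if (requestId !== latestRequestId) {
        return res.status(409).end();
      }
      res.json(parseResponse(response));
    });
}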

How to check if ElasticSearch client is connected?

I'm working with elasticsearch-js (NodeJS) and everything works just fine as long as ElasticSearch is running. However, I'd like to know that my connection is alive before trying to invoke one of the client's methods. I'm doing things in a somewhat synchronous fashion, but only for the purpose of performance testing (e.g., check that I have an empty index to work in, ingest some data, query the data). Looking at a snippet like this:
var elasticClient = new elasticsearch.Client({
  host: ((options.host || 'localhost') + ':' + (options.port || '9200'))
});

// Note, I already have promise handling implemented, omitting it for brevity though
var promise = elasticClient.indices.delete({index: "_all"});
// ...
Is there some mechanism I can pass in the client config to fail fast, or some test I can perform on the client to make sure it's open before invoking delete?
Update: 2015-05-22
I'm not sure if this is correct, but perhaps attempting to get client stats is reasonable?
var getStats = elasticClient.nodes.stats();
getStats.then(function(o) {
  console.log(o);
})
.catch(function(e) {
  console.log(e);
  throw e;
});
Via node-debug, I see the promise rejected with "Error: No Living connections" when ElasticSearch is down or inaccessible. When it does connect, o in my then handler contains details about the connection state. Is this approach correct, or is there a preferred way to check connection viability?
Getting stats can be a heavy call just to ensure your client is connected. You should use ping instead; see the second example at https://github.com/elastic/elasticsearch-js#examples
We use ping too, right after instantiating the elasticsearch-js client on startup.
// example from above link
var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({
  host: 'localhost:9200',
  log: 'trace'
});

client.ping({
  // ping usually has a 3000ms timeout
  requestTimeout: Infinity,
  // undocumented params are appended to the query string
  hello: "elasticsearch!"
}, function (error) {
  if (error) {
    console.trace('elasticsearch cluster is down!');
  } else {
    console.log('All is well');
  }
});
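To fail fast before the delete in the question, a ping with a short timeout can be chained in front of it using the client's promise interface, which the question already relies on (a sketch; the 1000ms timeout is an arbitrary choice):
// abort early if the cluster is unreachable, then run the real call
elasticClient.ping({ requestTimeout: 1000 })
  .then(function () {
    return elasticClient.indices.delete({index: "_all"});
  })
  .catch(function (e) {
    console.trace('elasticsearch cluster is down or unreachable!');
    throw e;
  });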

Node.js server side connection to Socket.io

I have a Node.js application with a frontend app and a backend app. The backend manages the list and "pushes" an update to the frontend app; that call triggers a list update so that all clients receive the correct list data.
The problem is on the backend side. When I press the button, I perform an AJAX call, and that AJAX call runs the following code (some operations trimmed out):
Lists.findOne({_id: active_settings.active_id}, function(error, lists_result) {
  var song_list = new Array();
  for (i = 0; i < lists_result.songs.length; i++) {
    song_list.push(lists_result.songs[i].ref);
  }

  Song.find({
    '_id': {$in: song_list}
  }, function(error, songs) {
    // DO STUFF WITH THE SONGS
    // UPDATE SETTINGS (code trimmed)
    active_settings.save(function(error, updated_settings) {
      list = {
        settings: updated_settings,
      };

      var io = require('socket.io-client');
      var socket = io.connect(config.app_url);
      socket.on('connect', function () {
        socket.emit('update_list', {key: config.socket_key});
      });

      response.json({
        status: true,
        list: list
      });
      response.end();
    });
  });
});
However, response.end never seems to work and the call keeps hanging. Furthermore, the list doesn't always get refreshed, so there is an issue with the socket.emit code as well. And the socket connection stays open, I assume because the response isn't ended?
I have never done this server side before, so any help would be much appreciated. (active_settings etc. do exist.)
I see some issues that might or might not be causing your problems:
list isn't properly scoped, since you don't prefix it with var; you're essentially creating a global variable which might get overwritten when multiple requests are handled concurrently;
response.json() calls .end() itself; it doesn't hurt to call response.end() again yourself, but it isn't necessary;
since you're not closing the socket(.io) connection anywhere, it will probably stay open forever;
it sounds more appropriate not to set up a new socket.io connection for each request, but to do it just once at app startup and re-use that connection (see the sketch below).
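A minimal sketch of that last point, assuming the config.app_url and config.socket_key values from the question; the notifyListUpdate helper name is made up for illustration:
// set up one socket.io-client connection at application startup
var io = require('socket.io-client');
var socket = io.connect(config.app_url);

socket.on('connect', function () {
  console.log('connected to the frontend app');
});

// re-use that single connection from every request handler
function notifyListUpdate() {
  socket.emit('update_list', {key: config.socket_key});
}

// inside the route, after active_settings.save() succeeds:
//   notifyListUpdate();
//   response.json({status: true, list: list});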

Socket.io session without express.js?

I want to do session handling over websockets via node.js and socket.io, without necessarily using cookies and while avoiding express.js, because some clients will not run in a browser environment. Has somebody done this already or got some experience with a proof of concept?
Before a socket.io connection is established, there is a handshake mechanism; by default, all properly formed incoming requests shake hands successfully. However, there is a method to inspect the socket data during the handshake and return true or false, accepting or denying the incoming connection request. Here is an example from the socket.io docs:
Because the handshakeData is stored after the authorization, you can actually add or remove data from this object.
var io = require('socket.io').listen(80);

io.configure(function () {
  io.set('authorization', function (handshakeData, callback) {
    // findDatabyIP is an async example function
    findDatabyIP(handshakeData.address.address, function (err, data) {
      if (err) return callback(err);

      if (data.authorized) {
        handshakeData.foo = 'bar';
        for (var prop in data) handshakeData[prop] = data[prop];
        callback(null, true);
      } else {
        callback(null, false);
      }
    });
  });
});
The first argument of the callback function is an error; you can provide a string here, which will automatically refuse the client if it is not set to null. The second argument is a boolean: whether you want to accept the incoming request or not.
This should be helpful, https://github.com/LearnBoost/socket.io/wiki/Authorizing
You could keep track of all session variables and uniquely identify users using a combination of the following fields, which are available in handshakeData:
{
  headers: req.headers          // <Object> the headers of the request
  , time: (new Date) + ''       // <String> date time of the connection
  , address: socket.address()   // <Object> remoteAddress and remotePort object
  , xdomain: !!headers.origin   // <Boolean> was it a cross domain request?
  , secure: socket.secure       // <Boolean> https connection
  , issued: +date               // <Number> EPOCH of when the handshake was created
  , url: request.url            // <String> the entrance path of the request
  , query: data.query           // <Object> the result of url.parse().query or an empty object
}
This example may help as well; just have your non-browser clients supply the information in a different way:
SocketIO + MySQL Authentication
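For non-browser clients, one way to supply the session information without cookies is through the connection query string, which ends up in handshakeData.query as listed above (a sketch; the token name is made up for illustration):
// non-browser client: pass a session token in the query string instead of a cookie
var clientio = require('socket.io-client');
var socket = clientio.connect('http://localhost:80', { query: 'token=SOME_SESSION_TOKEN' });

// server side: read the token back inside the authorization handler shown earlier
io.set('authorization', function (handshakeData, callback) {
  var token = handshakeData.query.token;
  // look the token up in your own session store here
  callback(null, !!token);
});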
