How can I test through a mocked promise with Mocha? - node.js

My node.js code is:
function getPatientNotificationNumbers(patientId) {
  patientId = patientId && patientId.toString();
  var sql = "SELECT * FROM [notification_phone_number] ";
  sql += "WHERE patient_id = " + escapeSql(patientId);
  return sqlServer.query(sql).then(function(results) {
    var phoneNumbers = _.map(results[0], function (result) {
      var obj = {
        id: result.id,
        phoneNumber: result.phone_number,
        isPrimary: result.is_primary,
        isVerified: result.is_verified,
        patientId: result.patient_id
      };
      return obj;
    });
    return phoneNumbers;
  });
}
Pretty simple and straightforward. What I want to test is that the return of this function, properly resolved, is an array of phoneNumbers that match that format.
sqlServer is require'd above and I have a ton of things require'd in this file. To stub them out, I am using mockery, which seems to be pretty great.
Here is my test, so far:
before(function() {
  deferred = Q.defer();
  mockery.enable();
  moduleConfig.init();
  mockery.registerSubstitute('moment', moment);
  mockery.registerAllowable('../../util');
  mockStubs.sqlServer = {
    query: sinon.stub().returns(deferred.promise)
  };
  mockery.registerMock('../../db/sqlserver', mockStubs.sqlServer);
  methods = require('../../../rpc/patient/methods');
});

beforeEach(function() {
  deferred = Q.defer();
});

it('should get the patient notification numbers', function(done) {
  // sinon.spy(sqlServer, 'query').and.returnValue(deferred.promise);
  deferred.resolve('here we go');
  methods.getPatientNotificationNumbers(1).then(function(result) {
    console.log(result);
    done();
  });
});
However, it never gets past sqlServer.query in my code. So the results are pointless. I also tried something like:
response = methods.getPatientNotificationNumbers(1)
but when I console.log(response), it's basically {state: 'pending'}, which I guess is an unresolved promise.
So I'm all over the place and I'm open to using whatever libraries make things easy. I am not married to mockery, sinon or whatever else. Any suggestions would help.
Thanks!

So another approach is to use a combination of rewire and deride.
var should = require('should-promised');
var rewire = require('rewire');
var Somefile = rewire('./path/to/file');
var deride = require('deride');
var sut, mockSql;
before(function() {
  mockSql = deride.stub(['query']);
  Somefile.__set__('sqlServer', mockSql);
  sut = new Somefile();
});

describe('getting patient notification numbers', function() {
  beforeEach(function() {
    mockSql.setup.query.toResolveWith(fakeSqlResponse);
  });

  it('resolves the promise', function() {
    return sut.getPatientNotificationNumbers(id).should.be.fulfilledWith(expectedPhoneNumbers);
  });

  it('queries Sql', function(done) {
    sut.getPatientNotificationNumbers(id).then(function() {
      mockSql.expect.query.called.once();
      done();
    });
  });
});
This will mean that you do not need to change your production code and you can easily start testing the un-happy paths using something like this:
it('handles Sql errors', function(done) {
  mockSql.setup.query.toRejectWith(new Error('Booom'));
  sut.getPatientNotificationNumbers(id)
    .then(function() {
      done('should not have resolved');
    })
    .catch(function(e) {
      e.should.be.an.Error;
      e.message.should.eql('Booom');
      done();
    });
});
Or even more succinctly:
it('handles Sql errors', function() {
  mockSql.setup.query.toRejectWith(new Error('Booom'));
  return sut.getPatientNotificationNumbers(id).should.be.rejectedWith(/Booom/);
});

I would zoom out a little bit and think of the test strategy. Assume that the goal is to test your model layer (methods) on top of the SQL server. It can be done without stubbing out the SQL server. Your test suite can have a set of util methods to create, initialize and drop the db, which can be called from before, beforeEach etc.
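A rough sketch of what those hooks could look like (the dbUtils helper module and its method names here are hypothetical placeholders, not part of your code; the point is just seeding and tearing down a real test database):

var dbUtils = require('./helpers/dbUtils'); // hypothetical helper module

before(function(done) {
  // create the schema once for the whole suite
  dbUtils.createTestDb(done);
});

beforeEach(function(done) {
  // start every test from a known state
  dbUtils.seedNotificationNumbers([
    { id: 1, phone_number: '555-0100', is_primary: true, is_verified: true, patient_id: 1 }
  ], done);
});

after(function(done) {
  dbUtils.dropTestDb(done);
});

it('should get the patient notification numbers', function() {
  return methods.getPatientNotificationNumbers(1).then(function(numbers) {
    assert.equal(numbers.length, 1);
    assert.equal(numbers[0].phoneNumber, '555-0100');
  });
});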
Pros of doing this:
Testing the real product code path is better than a stubbed code path.
Stubbing is better if you suspect bugs in the underlying layer will generate noise; the sqlserver layer is likely stable.
The model layer seems simple enough that it does not require testing in isolation.
Stubbing would make sense if you are trying to test sqlserver failure handling in your model layer. Then the stub layer can fake such errors, to exercise error paths in model code.
This is based on a limited view of your problem. If there is more to it, please do share and we can take it from there.

It seems to me your intent is to test the function you're passing to the .then() method, not the sqlserver library nor the actual promise it returns. We can assume those have already been tested.
If this is the case, you can simply factor out that function (the one that extracts the phone numbers from the SQL result) and test it independently.
Your code would become something along the lines of:
var getNumbers = require('./getNumbers');

function getPatientNotificationNumbers(patientId) {
  patientId = patientId && patientId.toString();
  var sql = "SELECT * FROM [notification_phone_number] ";
  sql += "WHERE patient_id = " + escapeSql(patientId);
  return sqlServer.query(sql).then(getNumbers);
}
... and your ./getNumbers.js file would look something like:
function getNumbers(results) {
  var phoneNumbers = _.map(results[0], function (result) {
    var obj = {
      id: result.id,
      phoneNumber: result.phone_number,
      isPrimary: result.is_primary,
      isVerified: result.is_verified,
      patientId: result.patient_id
    };
    return obj;
  });
  return phoneNumbers;
}

module.exports = getNumbers;
Now you can test it independently:
var getNumbers = require('../path/to/getNumbers');
...
it('should get the patient notification numbers', function() {
  var sampleResults = [ ... ]; // sample results from sqlServer
  var phoneNumbers = getNumbers(sampleResults);
  assert(...); // make whatever assertions you want
});

Related

Using nock to mimic multiple couch db requests

In my original function I need to make 2 requests to 2 different db's within the same couch login.
var cloudant = require('cloudant')('https://cloudant_url');
var userdb = cloudant.db.use('user');
var addrdb = cloudant.db.use('address');

function onChange(username) {
  userdb.get(username, function(err, resp) {
    var user_id = resp.id;
    addrdb.get(user_id, function(err1, resp1) {
      var addr = resp1.address;
    });
  });
};

var nockVar = function() {
  nock(testCloudantDBURL)
    .get('/user/jack')
    .reply(200, {'id': 123});
  nock(testCloudantDBURL)
    .get('/address/123')
    .reply(200, {'address': '123'});
};
describe('Test Cloudant Listener code', function() {
  nockVar();
  it('test get scenario', function() {
    onChange('jack');
  });
});
With this, only the first call works and I can get the id: 123. The second call, on the address db, is not getting intercepted by nock. Any pointers?
This happens because your code is executed asynchronously and your test doesn't wait for the userdb.get and addrdb.get to finish. Easiest (not best) way to handle this is to add a done callback to your test scenario and call it as soon as your onChange function is finished. Roughly something like:
function onChange(username, done) {
  userdb.get(username, function(err, resp) {
    var user_id = resp.id;
    addrdb.get(user_id, function(err1, resp1) {
      var addr = resp1.address;
      done();
    });
  });
}

it('test get scenario', function(done) {
  onChange('jack', done);
});
You might also consider working with Promise-based code.
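For illustration, here is a minimal sketch of what that could look like, wrapping the existing callback API in a hand-rolled promise (nothing below is a cloudant-specific API; assert is Node's built-in module):

var assert = require('assert');

// Hand-rolled Promise wrapper around the callback-style get calls.
function getDoc(db, id) {
  return new Promise(function(resolve, reject) {
    db.get(id, function(err, resp) {
      if (err) { return reject(err); }
      resolve(resp);
    });
  });
}

function onChange(username) {
  return getDoc(userdb, username)
    .then(function(user) { return getDoc(addrdb, user.id); })
    .then(function(addrDoc) { return addrDoc.address; });
}

it('test get scenario', function() {
  nockVar();
  // returning the promise lets Mocha wait for both intercepted calls
  return onChange('jack').then(function(address) {
    assert.equal(address, '123');
  });
});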

Loopback testing with supertest, mocha and models

On the Google groups post on deprecating loopback-testing there is a question that asks about providing a proper example of how testing can be achieved without loopback-testing. That thread talks about using supertest instead.
Below is an attempt that I made to combine Mocha, supertest along with models (from app.js). The result works really well when I run the file by itself. But if I had another test file (say test-teacher.js) then the first test file (call it test-student.js) starts to fail in weird ways I can't describe.
Am I missing something or can models not be used like I am using them below?
describe('/Student', function () {
  var server = require('../server/server')
  var loopback = require('loopback')
  var supertest = require('supertest')
  var request = require('supertest')(server)
  var dataSource = server.dataSource('db', {adapter: 'memory'})
  var Student = dataSource.define('Student', {
    'id': Number,
    'points': Number
  });

  beforeEach(function () {
    Student.updateOrCreate({id: 1, points: 5000});
  })

  it('Post a new student', function (done) {
    request.post('/api/Students').send({points: 5000}).expect(200, done)
  })
})
Based on feedback from jakerella on the previous answer, I changed the code above so that I don't have to redefine the models from scratch in the code (thanks jakerella!)
With the code below, I am able to run all the tests from multiple different models as a single suite using npm test without any failures.
Since I am interested in only individual orders ... listen and close were not necessary. I suspect that if I was testing overall instances of models that were created it would become required.
describe('/Student', function () {
  var server = require('../server/server')
  var request = require('supertest')(server)
  var expect = require('expect.js')
  var Student

  before(function() {
    Student = server.models.Student
  })

  beforeEach(function (done) {
    Student.upsert({id: 1, points: 5000}, function() { done() })
  })

  it('Post a new student', function (done) {
    request.post('/api/Students').send({points: 5000}).expect(200, done)
  })
})
Wanted to toss this into an answer... the first issue was an undefined dataSource var, but then you also had redefined Student in your two tests. My recommendation instead is to use the LoopBack app and models already defined (usually in common/models/). Then the basic implementation for testing (that I use) is something like the code below (using mocha and chai). Note the beforeEach and afterEach to start and stop the server.
var assert = require('chai').assert,
    superagent = require('superagent'),
    app = require('../server/server');

describe('Person model', function() {
  var server;

  beforeEach(function(done) {
    server = app.listen(done);
  });

  afterEach(function(done) {
    server.close(done);
  });

  it('should log in and log out with live server', function(done) {
    superagent
      .post('http://localhost:3000/api/People/login')
      .send({ email: 'john@doe.com', password: 'foobar' })
      .set('Accept', 'application/json')
      .set('Content-Type', 'application/json')
      .end(function(err, loginRes) {
        if (err) { return done(err); }
        assert.equal(loginRes.status, 200);
        assert.ok(loginRes.body);
        assert.equal(loginRes.body.userId, 1);
        done();
      });
  });
});

provide data for protractor test

I'm new to Node.js and am now working on automating functional tests, doing BDD with cucumber and protractor.
For some of my steps, I need to send a query to an Oracle database and then use the result to make a search in the tested website.
1. I'm trying with oracledb, which returns the expected result that I put in a variable, but this variable is not available in my steps. The sendKeys method of webdriver puts "undefined" in the input.
2. I'm also wondering if there's another way, because it was hard to install oracledb and the next task is to set up Jenkins builds.
Here is my code:
var dbQueryContract = function() {
  var oracledb = require('oracledb');
  // Database communication
  oracledb.getConnection(
    {
      user: "xxx",
      password: "xxx",
      connectString: "xxx"
    },
    function(err, connection) {
      if (err) {
        console.error(err.message);
        return;
      }
      connection.execute(
        "select xxx, xxx " +
        "FROM xxxxx " +
        "where xxx is not null and rownum < 5",
        {
          resultSet: true
        },
        // bind value for :id [110],
        function(err, result) {
          if (err) { console.error(err.message); return; }
          console.log(result.rows[0][0]);
        });
      criteria = result.rows[0][0];
      // the connection is ok and i can log result.rows[0][0] i want to use for search
    });
};
module.exports = new dbQueryContract();
************************************************************************************************
// Use the external Chai As Promised to deal with resolving promises in
// expectations.
var chai = require('chai');
var chaiAsPromised = require('chai-as-promised');
chai.use(chaiAsPromised);
var expect = chai.expect;
var page1 = require('../page1.js');
var page2 = require('../page2.js');
var dbQueryContract = require('../dbQueryContract.js');
var EC = protractor.ExpectedConditions;

// Chai expect().to.exist syntax makes default jshint unhappy.
// jshint expr:true
module.exports = function() {
  this.Given(/^thanx for help$/, function(next) {
    browser.ignoreSynchronization = true;
    browser.get('toto.com');
    page1.login.sendKeys('login');
    page1.password.sendKeys('P#ssword');
    page1.validateButton.click();
    browser.ignoreSynchronization = false;
    page2.searchLink.click();
    browser.waitForAngular();
    browser.sleep(5000);
    // console.log(dbQueryContract.numabo.result.rows[0][0]);
    dbQueryContract().then(function(criteria) {
      page2.searchInput.sendKeys(criteria, protractor.Key.ENTER);
    });
    next();
  });

  this.When(/^i learn more$/, function(next) {
    browser.sleep(5000);
    next();
  });
};
This code doesn't look right. Your module is exporting the result of invoking new dbQueryContract();. However, that function isn't really a constructor function. You're not setting any properties on this or adding any properties to the prototype. Also, you're not returning anything, which is probably why you're getting undefined later.
The next problem is that you're chaining a then call, assuming that the driver uses promises - it doesn't (at least not yet). You'd need to have your function return a deferred which you'd need to resolve at the right time yourself. There's some discussion here about adding a more robust JavaScript layer:
https://github.com/oracle/node-oracledb/pull/321
If we go that route I'd like to see all async methods return a promise by default which would make things easier...
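To illustrate the suggestion, here is a rough sketch of dbQueryContract rewritten to return a promise that it resolves itself. The credentials and query text are placeholders, exactly as in the question, and this is a sketch rather than a tested drop-in fix:

var oracledb = require('oracledb');

// Export a plain function that returns a promise, resolved with the value
// pulled out of the query result, instead of exporting "new dbQueryContract()".
module.exports = function dbQueryContract() {
  return new Promise(function(resolve, reject) {
    oracledb.getConnection(
      { user: "xxx", password: "xxx", connectString: "xxx" },
      function(err, connection) {
        if (err) { return reject(err); }
        connection.execute(
          "select xxx, xxx FROM xxxxx where xxx is not null and rownum < 5",
          function(err, result) {
            if (err) { return reject(err); }
            resolve(result.rows[0][0]);
          });
      });
  });
};

// The step definition can then do:
// dbQueryContract().then(function(criteria) {
//   page2.searchInput.sendKeys(criteria, protractor.Key.ENTER);
// });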

Unit testing node.js model code accessing mongodb, without actually accessing the database

For example, I have the code below:
var db = require('./_mongo.js');

module.exports = {
  check: function (cb) {
    var content = {};
    content.collection = 'counters';
    content.query = {_id: 'ping'};
    content.columns = {};
    db.read(content, function(err, result) {
      if (err) {
        cb(-1);
      }
      else {
        cb(0);
      }
    });
  }
};
How do I write a unit test for the 'check' function, without actually accessing the database, while at the same time checking that the correct 'content' variable is being passed to the read method?
You can mock an entire module with a mock framework, like sinon.js:
var db = sinon.mock(require('_mongo.js'))
I would not recommend mocking database access; it could require you to code all possible responses...
It would be best if you would hide the database access behind an abstracted service layer and mock that layer.
For example, you can create a database access layer in this way:
var db = require('./_mongo.js');

module.exports = {
  // this is a mockable method
  getCounter: function (id, callback) {
    var content = {};
    content.collection = 'counters';
    content.query = {_id: id};
    content.columns = {};
    db.read(content, callback);
  }
};

// and then using it
module.exports = {
  check: function (cb) {
    // access the actual method or the mock
    da.getCounter('ping', function(err, result) {
      if (err) {
        cb(-1);
      }
      else {
        cb(0);
      }
    });
  }
};
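With the access layer split out like this, the test for check can stub getCounter rather than the database driver. A minimal sketch with sinon follows; the paths './dataAccess.js' and './checker.js' are illustrative names for the two modules above:

var sinon = require('sinon');
var assert = require('assert');
// Node's require cache means this is the same object the checker module uses,
// so stubbing its method here affects the code under test.
var da = require('./dataAccess.js');
var checker = require('./checker.js');

describe('check', function() {
  afterEach(function() {
    da.getCounter.restore();
  });

  it('calls back with 0 when the counter is found', function(done) {
    // stub the abstraction, not the database driver
    sinon.stub(da, 'getCounter').yields(null, { _id: 'ping' });
    checker.check(function(code) {
      assert.equal(code, 0);
      assert(da.getCounter.calledWith('ping'));
      done();
    });
  });

  it('calls back with -1 on errors', function(done) {
    sinon.stub(da, 'getCounter').yields(new Error('boom'));
    checker.check(function(code) {
      assert.equal(code, -1);
      done();
    });
  });
});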
test-studio provides mechanisms for stubbing out module dependencies. It also supports things like executing individual or groups of tests and stepping node-inspector into individual tests.
Read more about it here.

Making Mocha test work for my db module

I am new to Mocha, and have only a little experience with Node/Express. My DbProvider module (mongodb) works perfectly when I access it through my Express app. And now I want to test it. I have read the Mocha site and some tutorials I could find, but I have had trouble finding a real-world example out there that I could follow (any links much appreciated!).
Here is my unsuccessful attempt to write a testfile:
var DbProvider = require('../db').DbProvider;
var assert = require('assert');
var dbProvider = new DbProvider('localhost', 27017, 'mydb');
var util = require('util');
console.log(util.inspect(dbProvider));

describe('DbProvider', function() {
  describe('findAllNotes', function() {
    it('should return some notes', function() {
      dbProvider.findAllNotes({}, function (err, result) {
        assert(result.length > 0);
      });
    })
  })
})
The output I get is this:
$ mocha
{}
✖ 1 of 1 test failed:
1) DbProvider findAllNotes should return some notes:
TypeError: Cannot call method 'collection' of undefined
at DbProvider.doOperation (/Users/frode/Node/json/db.js:46:11)
at DbProvider.findAllNotes (/Users/frode/Node/json/db.js:56:8)
at Context.<anonymous> (/Users/frode/Node/json/test/test.js:15:18)
(cutting out the rest)
It seems that I am unsuccessful in creating the dbProvider. This works perfectly in my app... How can I make this work? (And perhaps also: Is the way I have set it up in general OK?)
Edit: Here is the db.js file:
// Database related
'use strict';

var MongoClient = require('mongodb').MongoClient;
var BSON = require('mongodb').BSONPure;
var ObjectID = require('mongodb').ObjectID;
var checkForHexRegExp = new RegExp("^[0-9a-fA-F]{24}$");
var Validator = require('validator').Validator;
var fieldMaxLength = 1024;
//var util = require('util');

var DbProvider = function(host, port, database) {
  var dbUrl = "mongodb://" + host + ":" + port + "/" + database;
  var self = this;
  MongoClient.connect(dbUrl, function(err, db) {
    self.db = db;
  });
};

// Do some basic validation on the data we get from the client/user
var validateParams = function(params, callback) {
  // Let's do a quick general sanity check on the length of all fields
  for (var key in params) {
    if (params[key].length > fieldMaxLength) callback(new Error('Field ' + key + ' is too long.'));
  }
  // and then let us check some specific fields better
  if (params._id) {
    if (checkForHexRegExp.test(params._id)) {
      // In case of '_id' we also need to convert it to BSON so that mongodb can use it.
      params._id = new BSON.ObjectID(params._id);
    } else {
      var err = {error: 'Wrong ID format'};
    }
  }
  if (err) callback(err);
}

// Generalized function for operations on the database
// Todo: Generalize even more when user authentication is implemented
DbProvider.prototype.doOperation = function(collection, operation, params, callback) {
  validateParams(params, callback);
  var operationCallback = function(err, result) {
    callback(err, result);
  };
  this.db.collection(collection, function(err, collection) {
    if (operation === 'find') {
      collection.find().toArray(operationCallback);
    } else {
      collection[operation](params, operationCallback);
    }
  });
}

DbProvider.prototype.findAllNotes = function(params, callback) {
  this.doOperation('notes', 'find', params, callback);
};

DbProvider.prototype.findNoteById = function(params, callback) {
  this.doOperation('notes', 'findOne', params, callback);
};

DbProvider.prototype.saveNote = function(params, callback) {
  params.created_at = new Date();
  this.doOperation('notes', 'save', params, callback);
};

DbProvider.prototype.deleteNote = function(params, callback) {
  this.doOperation('notes', 'remove', params, callback);
};

DbProvider.prototype.findUser = function(params, callback) {
  this.doOperation('users', 'findOne', params, callback);
};

exports.DbProvider = DbProvider;
SOLUTION:
After Benjamin told me to handle the async nature of mongodb connecting to the database, and inspired by his suggestion on how to adapt the code, I split the constructor function DbProvider into two parts. The first part, the constructor DbProvider now just saves the db-parameters into a variable. The second part, a new function, DbProvider.connect does the actual async connection. See below.
var DbProvider = function(host, port, database) {
  this.dbUrl = "mongodb://" + host + ":" + port + "/" + database;
};

DbProvider.prototype.connect = function(callback) {
  var self = this;
  MongoClient.connect(this.dbUrl, function(err, db) {
    self.db = db;
    callback();
  });
};
So I can now make a Mocha test like this (note that async tests also need the done callback included, as you see in the code below):
var assert = require('assert');
var DbProvider = require('../db').DbProvider;
var dbProvider = new DbProvider('localhost', 27017, 'nki');

describe('DbProvider', function() {
  describe('findAllNotes', function() {
    it('should return some notes', function(done) {
      dbProvider.connect(function() {
        dbProvider.findAllNotes({}, function (err, result) {
          assert(result.length > 0);
          done();
        });
      });
    })
  })
})
Note that the actual test ("should return some notes") is nothing to be proud of. What I wanted here was to get set up so I am able to test something. Now that I finally can do that, I need to write good tests (something along the lines of having a test database, clearing it, test-inserting a document, test-searching for a document, and so on...), as sketched below.
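For what it's worth, a sketch of what such a test could look like against a dedicated test database. The 'nki-test' database name is made up here, and this assumes the connect/saveNote/findAllNotes/deleteNote methods from db.js above:

var assert = require('assert');
var DbProvider = require('../db').DbProvider;
// 'nki-test' is a made-up name for a throwaway test database
var dbProvider = new DbProvider('localhost', 27017, 'nki-test');

describe('saveNote', function() {
  before(function(done) {
    dbProvider.connect(done);
  });

  beforeEach(function(done) {
    // start from an empty notes collection
    dbProvider.deleteNote({}, function() { done(); });
  });

  it('stores a note that findAllNotes can read back', function(done) {
    dbProvider.saveNote({ title: 'hello' }, function(err) {
      assert.ifError(err);
      dbProvider.findAllNotes({}, function(err, result) {
        assert.ifError(err);
        assert.equal(result.length, 1);
        assert.equal(result[0].title, 'hello');
        done();
      });
    });
  });
});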
And in my Express app, I used to set up the database like this:
var DbProvider = require('./db').DbProvider;

// Setup db instance
var dbProvider = new DbProvider(
  process.env.mongo_host || 'localhost',
  process.env.mongo_port || 27017,
  process.env.mongo_db || 'nki'
);
Now I do the same, but in addition, I call the new connect-function:
// Connect to db. I use (for now) 1 connection for the lifetime of this app.
// And I do not use a callback when connecting here (we do in the testing)
dbProvider.connect(function(){});
Benjamin actually pointed out that it may be OK, but not best practice, to have the database set up like this in an Express app. But until I figure out what the best practice really is, I will leave this code as it is. Here are a couple of links regarding the subject that I found (but I have still not concluded how I will solve it myself):
What's the best practice for MongoDB connections on Node.js? and
[node-mongodb-native] MongoDB Best practices for beginner
If you like, you are very welcome to follow/fork/whatever this project on github. My goal is to get it as production ready I can. The link is
https://github.com/frodefi/node-mongodb-json-server
MongoClient.connect is asynchronous.
From the docs:
callback (function) – this will be called after executing this method. The first parameter will contain the Error object if an error occured, or null otherwise. While the second parameter will contain the initialized db object or null if an error occured.
That means DbProvider.db isn't set yet in the test which is why you're getting undefined.
In here:
MongoClient.connect(dbUrl, function(err, db) {
  self.db = db;
});
You're telling it "update self.db after the connection happened", which is at least one event loop tick after this one (but may be more). In your mocha code you're executing your .describe and .it methods right after creating your DbProvider instance which means it was not initialized yet.
I suggest that you re-factor DbProvider to return a callback instead of being a constructor function. Maybe something along the lines of:
var getDbProvider = function(host, port, database, callback) {
  var dbUrl = "mongodb://" + host + ":" + port + "/" + database;
  MongoClient.connect(dbUrl, function(err, db) {
    callback(db);
  });
};
Which also means moving all the DBProvider methods to an object (maybe the callback will return a dbprovider object and not just a db?).
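One possible shape for that idea (just a sketch, not tested against the rest of db.js):

var MongoClient = require('mongodb').MongoClient;

// Sketch: the callback receives an object exposing the query methods,
// so nothing outside ever touches an uninitialized db handle.
var getDbProvider = function(host, port, database, callback) {
  var dbUrl = "mongodb://" + host + ":" + port + "/" + database;
  MongoClient.connect(dbUrl, function(err, db) {
    if (err) { return callback(err); }
    callback(null, {
      findAllNotes: function(params, cb) {
        db.collection('notes', function(err, collection) {
          if (err) { return cb(err); }
          collection.find().toArray(cb);
        });
      }
      // ...other operations would be added here in the same style
    });
  });
};

// usage in a test:
// getDbProvider('localhost', 27017, 'nki', function(err, provider) {
//   provider.findAllNotes({}, function(err, notes) { ... });
// });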
Another bug solved by using Unit Tests :)
This is what I used: https://github.com/arunoda/mocha-mongo
It has set of testing helpers for mongodb
