How do I unit test keystonejs models? - node.js

Is there any way to run tests for KeystoneJS that also hit a test or real MongoDB instance?
It would be nice if it worked similarly to the way Django does it.

There aren't any official examples of implementing unit testing for KeystoneJS sites yet, but there is nothing stopping you from writing tests with a framework like mocha, the way you would in any other node.js app.
You'd want to initialise Keystone, register your models, then connect to the database and execute your tests without starting the web server. Something like this:
./tests.js
var keystone = require('keystone');

keystone.init({
    'name': 'Your Project'
});

keystone.import('models');
keystone.mongoose.connect('localhost', 'your-database');

keystone.mongoose.connection.on('open', function() {
    // Run tests here
    // Use keystone.list('Key') to access Lists and execute queries
    // as you would in your main application
});
then run tests.js, or make it an npm / grunt / etc. script.
Keep an eye on issue #216 for an integrated testing framework.
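If you drive that file with mocha instead of plain node, a minimal sketch could look like the following; the 'User' list and its count check are hypothetical stand-ins for whatever ./models actually registers:
// tests.js - a sketch, run with mocha; 'User' is a hypothetical list
var keystone = require('keystone');
var assert = require('assert');

describe('User list', function() {

    before(function(done) {
        keystone.init({ 'name': 'Your Project' });
        keystone.import('models');
        keystone.mongoose.connect('localhost', 'your-test-database');
        keystone.mongoose.connection.on('open', done);
    });

    it('starts with no users', function(done) {
        // keystone.list('User').model is the underlying Mongoose model
        keystone.list('User').model.count({}, function(err, count) {
            if (err) return done(err);
            assert.equal(count, 0);
            done();
        });
    });
});
Run it with mocha tests.js, or wire it into your npm test script.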

Related

Why is the app generated by fastify-cli so different from the example in the documentation?

Where is the start.js in a project generated by fastify-cli?
Isn't there a big difference between the getting-started example and the app generated by fastify-cli?
Should I write the start function like this in a project created by fastify-cli?
const start = async () => {
    try {
        await sequelize.sync({})
        app.log.info('database sync correctly')
        await app.listen(PORT, '0.0.0.0')
        app.swagger()
    } catch (err) {
        app.log.error(err)
        process.exit(1)
    }
}
start()
There is only an app.js in the project generated by fastify-cli. What a difference!
Where is the start.js in a project generated by fastify-cli?
There isn't one; it is replaced by the CLI command fastify your-file.js in package.json (much like mocha, jest, etc. are used to run tests).
The starter file is usually always the same, so it has been integrated into the CLI, and you can use its arguments to set the port or to reload the server automatically when you edit a file.
Isn't there a big difference between the getting-started example and the app generated by fastify-cli?
The docs teach you everything you need to know about the framework itself; the plugins and utilities around it aim to ease the developer experience. Managing a MongoDB connection, for example, is one line with the official plugin.
Should I write the start function like this in a project created by fastify-cli?
If you use fastify my-file.js you don't need it.
After some experience you will understand when you need fastify-cli and when you don't.
I think the CLI is useful in most use cases, and it suggests good ways to implement configuration loading and encapsulation.
You won't need it for special use cases that need to run some async operation before the server is created.
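For illustration, this is roughly the plugin-style app.js that fastify-cli expects; it's a sketch with an assumed example route rather than the poster's actual project:
// app.js - exported as a plugin; fastify-cli creates the server and calls listen for you
'use strict'

module.exports = async function (fastify, opts) {
    // register plugins and routes here instead of writing a start() function
    fastify.get('/', async function (request, reply) {
        return { hello: 'world' }
    })
}
In package.json the start script would then be something like "start": "fastify start -l info app.js", so the port, logging level, and auto-reload are handled by CLI arguments.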

Check in nightwatchjs if E2E test correctly saved to database?

For a Meteor project with TypeScript and React I use Nightwatch testing, which works great:
https://github.com/arichter83/meteor-react-typescript-nightwatch
a.) Checking database results via Client
Now I want to check in the database whether the end-to-end test successfully added the data, and that turned out to be surprisingly difficult. I can go via the client and look in the Mongo.Collection (on GitHub):
browser
    .execute(function() {
        return (Meteor as any).connection._stores['links']._getCollection()
            .insert({ title: "new link" })
    }, [], (result) => {
        const newid = result.value
        browser
            .assert.containsText('#' + newid, 'new link')
            .execute(function(newid) {
                return (Meteor as any).connection._stores['links']._getCollection()
                    .remove({ _id: newid })
            }, [newid], () => {
                browser
                    .assert.elementNotPresent('#' + newid)
            })
    })
With this approach it is quite difficult to use my existing models and to interact with Nightwatch.
b.) Checking database results in test
But I would instead use Nightwatch's unit test capability in between; from the docs, however, it seems that E2E and unit tests can't be mixed.
Furthermore, when importing my models in the test on the server:
import { Links } from '../../imports/api/links'
console.log(Links.findOne())
TypeScript throws an error that it can't resolve the Atmosphere package meteor/mongo - so @types/meteor seems not to be loaded (probably Meteor-specific):
Cannot find module 'meteor/mongo'
Questions
Is it generally advisable to check database results for E2E tests?
What is the most elegant way to do this with nightwatch (+ meteor)? (I also created a Feature Request there)
How to use meteor libraries in nightwatch tests?

Unit testing for loopback model

I have a Loopback API with a model Student.
How do I write unit tests for the node API methods of the Student model without calling the REST API? I can't find any documentation or examples for testing the model through the node API itself.
Can anyone please help?
Example with testing the count method
// With this test file located in ./test/thistest.js
var assert = require('assert');
var app = require('../server');

describe('Student node api', function() {
    it('counts initially 0 student', function(cb) {
        app.models.Student.count({}, function(err, count) {
            if (err) return cb(err);
            assert.deepEqual(count, 0);
            cb();
        });
    });
});
This way you can test the node API, without calling the REST API.
However, for built-in methods, this behaviour is already tested by StrongLoop, so it is pretty useless to test it through the node API. But for remote (= custom) methods it can still be interesting.
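For instance, a test for a custom remote method could look like the sketch below; Student.enroll and its signature are assumptions made up for the example, not something LoopBack provides:
// ./test/custom-method.js - a sketch, assuming Student.enroll(data, cb) is a custom method on the model
var assert = require('assert');
var app = require('../server');

describe('Student.enroll (custom method)', function() {
    it('creates and returns the enrolled student', function(cb) {
        app.models.Student.enroll({ name: 'Jane' }, function(err, student) {
            if (err) return cb(err);
            assert.equal(student.name, 'Jane');
            cb();
        });
    });
});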
EDIT:
The reason why this way of doing things is not documented is that ultimately you will need to test your complete REST API anyway, to ensure not only that the node API works as expected, but also that ACLs are properly configured, return codes are correct, etc. So in the end you end up writing two different tests for the same thing, which is a waste of time. (Unless you like writing tests :)
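That REST-level test can be written with something like supertest; the sketch below assumes the default REST root /api and the pluralised Students endpoint, so adjust it to your own configuration:
// ./test/rest.js - a sketch using supertest (npm install --save-dev supertest)
var assert = require('assert');
var request = require('supertest');
var app = require('../server');

describe('GET /api/Students/count', function() {
    it('returns the student count over REST', function(cb) {
        request(app)
            .get('/api/Students/count')
            .expect(200)
            .expect(function(res) {
                // custom assertion on the response body
                assert.equal(typeof res.body.count, 'number');
            })
            .end(cb);
    });
});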

Use test database with grunt and mocha

I am building a web app in Node.js, Express, and MongoDB using Mongoose. I want to have a dedicated database for when I run my Mocha tests with Grunt so that I do not mess up the database I am using for development. How would I do this?
I currently have my development database configuration in a file at /config/db.js, which is loaded in my app.js file at startup and connects to my development database. How would I make my Mocha tests, which are run in a Grunt task, dynamically use a test database when I run Grunt? I have tried to disconnect from the development database in the before() hook of my Mocha test files and then connect to the test database, but it keeps using the development database. An example is the following:
before(function(done) {
    if (mongoose.connection.db) mongoose.connection.close();
    mongoose.connect(<test_db_uri>, done);
});
Your question is close to the following one: Test environment in Node.js / Express application.
Basically, what you should do is use an environment variable ('NODE_ENV' for example), access it with process.env.NODE_ENV, and based on its value load the right configuration file. You should take a look at grunt-express-server, which helps a lot with the environment setup.
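For example, a minimal sketch of an environment-driven /config/db.js could look like this; the database names are placeholders, not values from the question:
// config/db.js - pick the connection string based on NODE_ENV (sketch)
var env = process.env.NODE_ENV || 'development';

var uris = {
    development: 'mongodb://localhost/myapp_dev',
    test: 'mongodb://localhost/myapp_test'
};

module.exports = {
    uri: uris[env]
};
Then start the Grunt test task with NODE_ENV=test so that both app.js and the Mocha tests connect to the test database.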
I hope this will help!

Test environment in Node.js / Express application

I've just started working with Node, and I've been following along with various tutorials.
I've created an Express app, and setup Mongoose and Jasmine.
How can I configure my specs so that I can:
create models, automatically clean them up after each spec
use a different database for creating test objects (say myapp_test)
do this in a way that is as DRY as possible, i.e. not creating a before / after block with the teardown for each describe block
?
I'll try to answer you.
Create models, automatically clean them up after each spec.
To do that, I'll assume you use Mocha as the testing framework; you can simply use the beforeEach hook, like this:
describe('POST /api/users', function() {
    beforeEach(function(done) {
        User.remove({}, function(err) {
            if (err) throw err;
            done();
        });
    });
});
Basically, what I'm doing here is cleaning up my database before each it block, but you can make it do anything you want.
Use a different database for creating test objects
Here, you should use node's process.env to set up your environment. Here is an article to understand a little how it works. Take a look at Grunt projects; they help a lot with your workflow and the configuration stuff.
do this in a way that is as DRY as possible, i.e. not creating a
before / after block with the teardown for each describe block
I'm not sure I got what you want, but take a look at the docs for the hooks before, after, beforeEach, and afterEach. I think you will find what you want there.
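On the DRY point, one option is a root-level Mocha hook shared by every spec; the sketch below assumes a Mongoose User model and a helper file that each spec requires (or that is loaded with mocha's --require flag):
// test/helper.js - a root-level hook (defined outside any describe block)
// runs before every test in every file, so no per-describe teardown is needed
var User = require('../models/user'); // assumed model path

beforeEach(function(done) {
    User.remove({}, done);
});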
