I checked some NPM libraries for testing webpages or web services, but all of them expect that the server is already running. Since I want to automate functional testing, how can I set up an NPM package in such a way that
It can start the server
Test the application
Stop the server
So that I can test it locally as well as on online CI tools like Travis CI or CircleCI.
Case 1: Webservice
I wrote an NPM package which starts a Node.js HTTP(S) server. It can be started from the command line with $ stubmatic. Currently, I use two approaches to test it:
Manual: I start it from the command line, then run the tests.
Automatic: I use the exec module to run a Unix command which starts the application, and a pkill command to kill it. But for this automation, my application needs to be installed on the testing machine.
Case 2: Website
I have created an NPM package, fast-xml-parser, and a demo page within the repo so that it can be tested in the browser. To test the demo page, I currently start a local server manually using the http-server npm package, then test the application.
What is a better way to write automated functional tests for Node.js applications?
Note:
I have never used task runners like gulp or grunt, so I'm not sure if they can help in this case.
In case 1, my application starts Node.js's native HTTP server. I'm not currently using any 3rd-party framework like Express.
This question mentions a new Docker container system for Travis that could be duplicated locally. It might be a way: How to run travis-ci locally
Did you look at supertest (a SuperAgent-driven library for testing HTTP servers) and expect (an assertion library, documented here) together with mocha (a test framework)?
I use them and have never had any problems with the tests I've written so far.
The documentation in the links contains all the information you need to build up your tests.
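For instance, here's a minimal sketch of a mocha test using supertest and expect against an exported app (the require path, route, and response body are assumptions for illustration):
const request = require('supertest');
const expect = require('expect');
const app = require('../lib/app'); // assumed path to the exported app/server

describe('GET /health', function () {
  it('responds with 200 and JSON', function () {
    // supertest binds the app to an ephemeral port for the duration of the request,
    // so no separately started server is needed.
    return request(app)
      .get('/health')
      .expect('Content-Type', /json/)
      .expect(200)
      .then(function (res) {
        expect(res.body).toEqual({ status: 'ok' });
      });
  });
});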
Case 1: Webservice
Problem 1
Since Node.js's server.close() was not working for me, I copy-pasted this snippet into every test file that starts my web service.
try {
  server.setup(options);
  server.start();
} catch (err) {
  console.log(err);
}
Once all the tests are completed, server stops.
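For reference, here's a minimal sketch of the same start/stop idea using mocha hooks and Node's native HTTP server (not my package's API). Note that close() only stops new connections, and its callback fires once existing (e.g. keep-alive) connections end, which can make it look as if it isn't working:
const http = require('http');

// A plain Node HTTP server, used only to illustrate hook-based start/stop.
const server = http.createServer(function (req, res) {
  res.end('ok');
});

before(function (done) {
  // Start listening once, before any test in this file runs.
  server.listen(9999, done);
});

after(function (done) {
  // Stop accepting new connections; done fires when open sockets have closed.
  server.close(done);
});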
Problem 2
I was using chai-http incorrectly. Here is the complete working solution.
// Needs to be placed before importing chai and chai-http
if (!global.Promise) {
  global.Promise = require('q');
}

var server = require('.././lib/server');
var chai = require('chai')
  , chaiHttp = require('chai-http');
var expect = chai.expect;

chai.use(chaiHttp);

// Start the web service before the tests run.
try {
  server.setup(someoptions);
  server.start();
} catch (err) {
  console.log(err);
}

describe('FT', function () {
  describe('scenario::', function () {
    it('responds to POST', function (done) {
      chai.request("http://localhost:9999")
        .post('/someurl')
        .then(res => {
          expect(res).to.have.status(200);
          //console.log(res.text);
          done();
        }).catch(err => {
          done(err);
        });
    });
  });
});
Case 2: Website
This was quite simple.
I used http-server to start the server so my html files can be accessed.
I used zombie js for browser testing. (There are many other options available for browser testing)
Here is the code
process.env.NODE_ENV = 'test';

const Browser = require('zombie');
const httpServer = require('http-server');

describe("DemoApp", function() {
  var browser = new Browser({site: 'http://localhost:8080'});

  // Serve the static demo page so the browser can reach it.
  var server = httpServer.createServer();
  server.listen(8080);

  beforeEach(function(done) {
    browser.visit('/', done);
  });

  describe("Parse XML", function() {
    it("should parse xml to json", function(done) {
      // Wait for the button press (and the events it triggers) to finish
      // before asserting on the result.
      browser.pressButton('#submit').then(function() {
        browser.assert.text('#result', 'some result text');
        done();
      }).catch(done);
    });
  });

  after(function() {
    // Stop the static server once all tests have run.
    server.close();
  });
});
Related
I was trying to create a simple room-based web chat app using socket.io, but I've been facing issues with serving the Node server file with node and nodemon. I've tried different ports (8000, 8080, 80), and I've also tried reinstalling node and nodemon both locally and globally, but nothing seems to work. I guess the code is alright because I was following a tutorial (that didn't include a link to the source code, though). I'm on Windows, so I've also used the command
Set-ExecutionPolicy Unrestricted
Here are the screenshots of Windows PowerShell and Command Prompt.
PS: I am a newbie, so please consider that I might have missed something basic.
const io = require('socket.io')(8000);

const users = {};

io.on('connection', socket => {
  socket.on('new-user-joined', name => {
    users[socket.id] = name;
    socket.broadcast.emit('user-joined', name);
  });
  socket.on('send', message => {
    // Note: this must be `users`, not `user`, to match the object defined above.
    socket.broadcast.emit('receive', {message: message, name: users[socket.id]});
  });
});
I am currently trying to deploy some tests to Azure Pipelines (but this should apply to any CI/CD framework).
I have an express project, and I am using supertest with mocha to run tests locally and everything is fine.
Now I want to deploy to production, and I want to run the tests in the pipeline before deployment happens.
The thing is that since these tests are running against http, they need a server running.
So, in my pipeline I have
npm install
npm start
npm test
But the problem is that npm test never runs: the pipeline just hangs on npm start while the server keeps running.
Is there a way to start the tests once the server starts, and then stop the server when the tests are finished?
Or is there a better way to achieve all this?
If you're using supertest, I would suggest exporting the Express app for the tests instead of running it, as shown in their docs; then you would only need to run npm install and npm test.
For example:
app.js
const express = require('express');
const app = express();
// Add middlewares
module.exports = app;
test.spec.js
const request = require('supertest');
const app = require('../app.js');

request(app)
  .get('/user')
  .set('Accept', 'application/json')
  .expect('Content-Type', /json/)
  .expect(200)
  ...
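To keep npm start working with an exported app, the listening code can live in a separate entry point that requires it (the file name server.js and the port handling below are assumptions; point your start script at whatever you choose):
// server.js - the only file that actually binds to a port
const app = require('./app.js');

const port = process.env.PORT || 3000;
app.listen(port, function () {
  console.log('Listening on port ' + port);
});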
I'm working on my app.js in Node.js, trying to deploy a server-side script.
Many fine Node.js modules need a require('something');
I use NPM locally, which works for require, as modules are nicely visible in the local node_modules folder structure. But now I'm ready to upload or bundle to a host, and I can't run npm on this hosted server.
const Hapi = require('hapi');
will result in
Error: Cannot find module 'hapi'
because I don't know how to copy/install/bundle/ftp files to my host.
Hapi is just an example. Most anything that has a require will need something on the host.
I used webpack to create a server-side bundle.js, but just sticking bundle.js under /node_modules doesn't do anything.
Most modules have a complex folder structure underneath, and I'm trying to avoid copying a ton of folders and files under /node_modules. Ideally, I want to combine the modules into a bundle.js and have those modules visible to app.js,
but I am open to other ideas.
I have not yet tried using webpack to bundle app.js TOGETHER with the various modules. Have you had luck with that approach?
thanks.
I've tried to upload hapi files a folder-ful at a time, reaching a new require('something') error at every step.
'use strict';

const Hapi = require('hapi'); // <-- how can I deploy hapi on my node.js server?

// Create a server with a host and port
const server = Hapi.server({
  host: 'localhost',
  port: 8000
});

// Add the route
server.route({
  method: 'GET',
  path: '/hello',
  handler: function (request, h) {
    return 'hello world';
  }
});

// Start the server
async function start() {
  try {
    await server.start();
  }
  catch (err) {
    console.log(err);
    process.exit(1);
  }
  console.log('Server running at:', server.info.uri);
}

start();
One approach that worked: using webpack to bundle the back-end JS.
Thanks to
https://medium.com/code-oil/webpack-javascript-bundling-for-both-front-end-and-back-end-b95f1b429810
The aha moment: run webpack to create bundle-back.js, then tie bundle-back.js to my Node server.
You start your backend server with bundle-back.js using:
node bundle-back.js
In other words, include app.js in the bundle with the modules.
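If it helps, here's a minimal sketch of a webpack config for bundling the back end, assuming webpack 4+ with app.js as the entry point (the file names are illustrative):
// webpack.config.js
const path = require('path');

module.exports = {
  mode: 'production',
  target: 'node',              // keep Node built-ins (fs, http, ...) resolvable at runtime
  entry: './app.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'bundle-back.js' // deploy and run this file with: node bundle-back.js
  }
};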
I want to use Circle CI to integrate a git project.
I'm using mocha for my tests .
What I want to do:
When running npm test I want:
my node server to start
my test file to run
How can I run a single npm test command to run both Node and my mocha tests, which are already wrapped in a single index.js file?
I have tried this in my package.json:
"scripts": {
"test": "node server/app.js & mocha server/tests/index.js",
"start": "node server/app.js",
"postinstall": "bower install"
}
The problem with the above:
My server takes some time to start, and the tests fail since they run before the server is ready.
Is there a standard way to run a server and the tests with a single command that I'm missing?
If it is possible at all in your case, I'd suggest using something like supertest to do the testing. This way, you can avoid having to start a server before starting the tests.
I understand that there are scenarios where using supertest is not possible. In such cases, you could poll your server in a before hook, before all tests, to wait until it is ready:
before(function (done) {
  // Set a reasonable timeout for this hook.
  this.timeout(5000);

  function check() {
    if (serverIsReady()) {
      done();
      return;
    }
    // The server is not ready; check again in 1/10th of a second.
    setTimeout(check, 100);
  }

  check(); // Start checking.
});
I'm not sure what serverIsReady should be precisely in your case. It could be an attempt at getting a trivial path from your server like issuing a GET on the path /.
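For example, here's a minimal sketch of such a check, assuming the server listens on localhost:3000 and that any HTTP response (even a 404) means it is ready. Because the HTTP check is asynchronous, this version polls with callbacks instead of a boolean serverIsReady():
const http = require('http');

before(function (done) {
  this.timeout(5000);

  function check() {
    http.get('http://localhost:3000/', function (res) {
      res.resume(); // drain the response so the socket is released
      done();       // any response at all means the server is up
    }).on('error', function () {
      // Connection refused: not ready yet, retry in 1/10th of a second.
      setTimeout(check, 100);
    });
  }

  check(); // Start checking.
});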
I think the key is to run your node server in your test, rather than trying to initialise it in another process.
Your mocha test should start with a require of your app; then each of your tests can interact with it.
For example:
var http = require('http');
var expect = require('chai').expect;

// Example app: a trivial HTTP server started directly in the test file.
var server = http.createServer(function (req, res) {
  res.end('Hello World\n');
});
server.listen(8888);

describe('http', function () {
  it('should provide an example', function (done) {
    http.get({ path: '/', port: 8888 }, function (res) {
      expect(res).to.have.property('statusCode', 200);
      done();
    });
  });
});
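One small addition worth making to the example above: close the server in an after hook so the mocha process can exit cleanly once the tests finish.
after(function () {
  // Stop listening so Node's event loop can drain and mocha can exit.
  server.close();
});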
What I do when running a test that needs certain prerequisites is use mocha's beforeEach() functionality.
From the documentation
You may also pick any file and add “root”-level hooks. For example, add beforeEach() outside of all describe() blocks. This will cause the callback to beforeEach() to run before any test case, regardless of the file it lives in (this is because Mocha has an implied describe() block, called the “root suite”).
beforeEach(function() {
  console.log('before every test in every file');
});
In the beforeEach code block you can run your command to start the server, using for example the exec library from npm:
https://www.npmjs.com/package/exec
This will ensure your server is running before your tests are run, allowing you to simply run npm test.
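As a rough sketch of that idea using Node's built-in child_process module instead of the exec package (the start command and the fixed readiness delay are assumptions; a polling check like the one in the other answer is more robust), with a once-per-run before/after pair rather than beforeEach, since the server only needs to start once:
const { spawn } = require('child_process');

let serverProcess;

// Root-level hook: starts the server once before any test runs.
before(function (done) {
  this.timeout(10000);
  serverProcess = spawn('node', ['server/app.js'], { stdio: 'inherit' }); // assumed start command
  setTimeout(done, 2000); // crude wait for the server to come up
});

// Root-level hook: stops the server after all tests have finished.
after(function () {
  if (serverProcess) {
    serverProcess.kill();
  }
});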
I am working off of Yeoman's gulp-webapp generator. I have modified my gulp serve task to use my Express server, rather than the default Connect server it ships with. My issue is with the livereload functionality. I am trying to simply port connect-livereload to work with my Express server rather than having to install new dependencies. It's my understanding that most Connect middleware should work fine with Express, so I am assuming connect-livereload is compatible with Express 4.
Here are the contents of the relevant tasks in my gulpfile:
gulp.task('express', function() {
  var serveStatic = require('serve-static');
  var app = require('./server/app');

  app.use(require('connect-livereload')({port: 35729}))
     .use(serveStatic('.tmp'));
  app.listen(3000);
});

gulp.task('watch', ['express'], function () {
  $.livereload.listen();

  // watch for changes
  gulp.watch([
    'app/*.ejs',
    '.tmp/styles/**/*.css',
    'app/scripts/**/*.js',
    'app/images/**/*'
  ]).on('change', $.livereload.changed);

  gulp.watch('app/styles/**/*.css', ['styles']);
  gulp.watch('bower.json', ['wiredep']);
});

gulp.task('styles', function () {
  return gulp.src('app/styles/main.css')
    .pipe($.autoprefixer({browsers: ['last 1 version']}))
    .pipe(gulp.dest('.tmp/styles'));
});

gulp.task('serve', ['express', 'watch'], function () {
  require('opn')('http://localhost:3000');
});
With this simple setup, when I run gulp serve in my cmd everything spins up fine and I can accept requests at http://localhost:3000.
Now if I go and change the body's background color from #fafafa to #f00 in main.css and hit save, my gulp output responds with "main.css was reloaded", as seen at the bottom of this screenshot.
However, my webpage does not update. The background color is still light grey instead of red.
Is there perhaps a conflict between my express server config and the way gulp handles its files? Is my Express server forcing the use of app/styles/main.css rather than the use of .tmp/styles/main.css? Shouldn't the livereload script handle the injection of the new temporary file?
Thanks for any help.
EDIT:
I was able to move forward a bit by adding livereload.js to the script block of my index file, like so:
<script src="http://localhost:35729/livereload.js"></script>
I am now able to get live changes pushed to the client. Why was this file not getting injected before? How can I ensure this is getting used programmatically, as opposed to pasting it into my files?
I was able to get past this issue by removing the app.use(require('connect-livereload')({port: 35729})) line from my gulpfile, along with a couple of other lines, and instantiating it in my Express server's app.js file instead.
My gulpfile's express task now looks like this:
gulp.task('express', function() {
  var app = require('./server/app');
  app.listen(3000);
});
I added in the connect-livereload just above where I specify my static directory in Express:
if (app.get('env') === 'development') {
  app.use(require('connect-livereload')());
}
app.use(express.static(path.join(__dirname, '../app')));
Once I started using this setup, I was getting the livereload.js script injected into my document, and client-side changes are now auto-refreshed just how I wanted.
Hope this helps someone!