I have been trying and failing to get basic breakpoints working in a fresh Sails.js app.
I created a new Sails project called test and added a couple of lines of code so I could hit a route and trigger a breakpoint in my terminal:
// app/controllers/TestController.js
module.exports = {
  test: function (req, res) {
    console.log('test');
    debugger;
    return res.json({
      text: 'hello',
    });
  }
};
In my routes file:
// /config/routes.js
'get /test': 'TestController.test',
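(For context, and not shown in the original post: in a stock Sails app that entry sits inside the exported routes object of config/routes.js, roughly like this.)
module.exports.routes = {
  'get /test': 'TestController.test'
};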
Then I go to my terminal and use nodemon to launch the Sails server with:
$ nodemon debug -w . app.js
My one caveat is that the solution should work with something like nodemon, forever, or supervisor, so that I still get live reload on the Node server.
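(Not part of the original post, but for context: on more recent Node.js versions the usual way to combine nodemon with breakpoints is to pass the inspector flag through, for example
$ nodemon --inspect app.js
and then attach Chrome DevTools via chrome://inspect or an editor's Node debugger; the debugger; statement above only pauses execution once such a client is attached.)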
Related
I am trying to make my processes (webpack build, nodemon restart) work with a single gulp command. This works well enough. However, webpack builds only once if its task is tied to gulp's default task (together with nodemon), or embedded within nodemon's gulp task.
Then I decided to tie both the webpack build task and the nodemon restart task to gulp's watch task, and this works just the way I wanted, except that if I make changes and save them more than twice, nodemon crashes and prints this error in the console:
"/home/nnanyielugo/Workspace/activity-calendar/node_modules/nodemon/lib/monitor/match.js:132
var rules = monitor.sort(function (a, b) {
^
TypeError: Cannot read property 'sort' of undefined"
As a solution, I tried to tie the webpack build task to the nodemon restart using the .on() method, and instead got an infinite loop of restarting and rebuilding (nodemon restarts first, webpack builds, nodemon restarts again, webpack rebuilds, and on and on).
Does anyone have a solution, please?
Here is a sample of my code:
var gulp = require('gulp'),
    nodemon = require('gulp-nodemon'),
    webpack = require('webpack-stream');

gulp.task('default', ['watch']);

gulp.task('webpack', function () {
  return gulp.src('src/entry.js')
    .pipe(webpack(require('./webpack.config.js')))
    .pipe(gulp.dest('./public'));
});

gulp.task('nodemon', function () {
  return nodemon({
    script: 'app.js'
    , ext: 'js html'
    , env: { 'NODE_ENV': 'development' }
  })
})

gulp.task('watch', function () {
  gulp.watch(['./api/**/*.js', './server/**/*.js', './*.js'], ['webpack', 'nodemon']);
})
I guess your nodemon task and gulp's watch task collide with each other. Either get rid of nodemon and rely on gulp alone to start your application,
or get rid of gulp's watch task and hook the relevant work into nodemon's restart event, like this:
nodemon({
  // script goes here.
}).on('restart', your_reload_logic)
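For example, here is a rough sketch of letting nodemon drive the webpack rebuild (not the original answerer's code; it swaps the .on('restart') hook for gulp-nodemon's documented tasks option and uses ignore so the rebuilt bundle doesn't retrigger a restart; the paths are assumptions based on the gulpfile above):
var gulp = require('gulp'),
    nodemon = require('gulp-nodemon'),
    webpack = require('webpack-stream');

gulp.task('webpack', function () {
  return gulp.src('src/entry.js')
    .pipe(webpack(require('./webpack.config.js')))
    .pipe(gulp.dest('./public'));
});

gulp.task('default', ['webpack'], function () {
  return nodemon({
    script: 'app.js',
    ext: 'js html',
    watch: ['api', 'server', 'app.js'], // watch source files only
    ignore: ['public/'],                // don't watch the webpack output, which avoids restart loops
    tasks: ['webpack'],                 // rebuild the bundle on each restart
    env: { NODE_ENV: 'development' }
  });
});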
Hope this helps!
I checked some NPM libraries for testing web pages or web services, but all of them expect that the server is already running. Since I want to automate functional testing, how can I set up an NPM package in such a way that
It can start the server
Test the application
Stop the server
so that I can test it locally as well as on online CI tools like Travis CI or CircleCI.
Case 1: Webservice
I wrote an NPM package which starts a Node.js HTTP(S) server. It can be started from the command line with $ stubmatic. Currently, I use two approaches to test it:
Manual: I manually start it from the command line, then run the tests.
Automatic: I use the exec module to run a Unix command which starts the application, and a pkill command to kill it. But for this automation, my application needs to be installed on the testing machine.
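For illustration (not the original poster's code), that automatic approach can be sketched with Node's built-in child_process module inside mocha-style before/after hooks, assuming the stubmatic command from above is installed and on the PATH:
var exec = require('child_process').exec;

// start the installed application before the test suite runs
before(function (done) {
  exec('stubmatic > /dev/null 2>&1 &'); // fire-and-forget; assumes the binary is on PATH
  setTimeout(done, 1000);               // crude wait for the server to come up
});

// kill it again once every test has finished
after(function (done) {
  exec('pkill -f stubmatic', function () { done(); });
});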
Case 2: Website
I have created an NPM package, fast-xml-parser, and created a demo page within the repo so that it can be tested in the browser. To test the demo page, I currently start a local server manually using the http-server npm package, then test the application.
What is a better way to write automated functional tests for Node.js applications?
Note:
I have never used task runners like gulp or grunt, so I'm not sure if they can help in this case.
In case 1, my application starts a native Node.js HTTP server. I'm not currently using any 3rd-party framework like Express.
This question mentions a new Docker container system for Travis that could be duplicated locally. It might be a way: How to run travis-ci locally
Did you look at supertest (a SuperAgent-driven library for testing HTTP servers) and expect (an assertion library, documented here) together with mocha (a test framework)?
I use them and have never had any kind of problem with the tests I have written so far.
The documentation in the links contains all the information you need to build up your tests.
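For what it's worth, a minimal sketch of that combination might look like this (it assumes the app under test exports its request handler or http.Server from ./app, which is an assumption, not something stated in the question):
const request = require('supertest');
const expect = require('expect');
const app = require('./app'); // the handler/server under test (assumed export)

describe('GET /test', function () {
  it('responds with 200', function () {
    // supertest binds the app to an ephemeral port for the request,
    // so no separately started server is needed
    return request(app)
      .get('/test')
      .then(function (res) {
        expect(res.status).toBe(200);
      });
  });
});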
Case 1: Webservice
Problem 1
As Node.js's server.close() was not working, I copy-pasted this snippet into every test file which starts my web service.
try {
  server.setup(options);
  server.start();
} catch (err) {
  console.log(err);
}
Once all the tests are completed, the server stops.
Problem 2
I was using chai-http incorrectly. Here is the complete working solution.
//Need to be placed before importing chai and chai-http
if (!global.Promise) {
  global.Promise = require('q');
}
var server = require('.././lib/server');
var chai = require('chai')
  , chaiHttp = require('chai-http')
  , expect = chai.expect;
chai.use(chaiHttp);

try {
  server.setup(someoptions);
  server.start();
} catch (err) {
  console.log(err);
}

describe('FT', function () {
  describe('scenario::', function () {
    it('responds to POST', function (done) {
      chai.request("http://localhost:9999")
        .post('/someurl')
        .then(res => {
          expect(res.status).to.equal(200);
          //console.log(res.text);
          done();
        }).catch(err => {
          console.log(err);
          done();
        });
    });
  });
});
Case 2: Website
This was quite simple.
I used http-server to start the server so my HTML files could be accessed.
I used Zombie.js for browser testing. (There are many other options available for browser testing.)
Here is the code:
process.env.NODE_ENV = 'test';
const Browser = require('zombie');
const httpServer = require('http-server');

describe("DemoApp", function() {
  var browser = new Browser({site: 'http://localhost:8080'});
  var server = httpServer.createServer();
  server.listen(8080);

  beforeEach(function(done){
    browser.visit('/', done);
  });

  describe("Parse XML", function() {
    it("should parse xml to json", function(done) {
      browser.pressButton('#submit');
      browser.assert.text('#result', 'some result text');
      done();
    });
  });

  afterEach(function(){
    server.close();
  });
});
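Both of the suites above are plain mocha specs, so (assuming mocha is installed) they can be run locally or on a CI machine with something like:
$ mocha test/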
I couldn't find an answer to this question, so here is an example.
app.js
// express require
app.get("/test", function (req, res) {
  res.send("hello world");
});
app.listen(3000);
When we run it via "node app.js" and request http://ip:3000/test in the browser, we see "hello world".
But when we change the file, for example:
// express require
app.get("/test", function (req, res) {
  res.send("hello world 2");
});
app.listen(3000);
When we refresh the browser, we still see "hello world", because we did not run the "node app.js" command again.
Well, but why?
When we run the "node app.js" command in the console, what exactly does Node do?
Node reads your script into memory once at startup, so it doesn't pick up file changes until you restart the server. So just restart the server after changing the content.
You can use --watch to watch for changes and automatically restart the server whenever the content changes.
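For example, assuming nodemon is installed (or Node.js 18.11+, which ships the built-in --watch flag), either of the following restarts the server whenever a file changes:
$ nodemon app.js
$ node --watch app.js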
I have a route that calls my controller and the controller does nothing but respond with a view. Nothing dynamic happening here. I am using Swig for my views. For some reason, my view is not getting refreshed with the new changes. It still shows me the old view. I have tried the following to no avail:
app.set('view cache', false);
// To disable Swig's cache, do the following:
swig.setDefaults({ cache: false });
app.disable('view cache');
app.set('etag', 'strong');
I have a console.log statement in my controller and it never seems to log anything out, leading me to believe that the controller is never called. I do have a simple middleware that logs the path before the routes are set up, and I see the request for my route, but the controller never gets called and an older version of the template view is served.
Am I missing something?
Update: Server controller code:
exports.index = function (req, res) {
  res.render('myapp', {});
};
In order to see your changes without manually restarting Node.js, consider using pm2 or nodemon.
After installing pm2:
pm2 start app.js --watch
PM2 automatically restarts your app when a file changes in the current directory or its subdirectories.
But my favorite choice is nodemon:
nodemon [your node app]
I am working off of Yeoman's gulp-webapp generator. I have modified my gulp serve task to use my Express server rather than the default Connect server it ships with. My issue is with livereload functionality. I am trying to simply port connect-livereload to work with my Express server rather than having to install new dependencies. It's my understanding that most Connect middleware should work fine with Express, so I am assuming connect-livereload is compatible with Express 4.
Here are the contents of the relevant tasks in my gulpfile:
gulp.task('express', function () {
  var serveStatic = require('serve-static');
  var app = require('./server/app');
  app.use(require('connect-livereload')({port: 35729}))
    .use(serveStatic('.tmp'));
  app.listen(3000);
});

gulp.task('watch', ['express'], function () {
  $.livereload.listen();

  // watch for changes
  gulp.watch([
    'app/*.ejs',
    '.tmp/styles/**/*.css',
    'app/scripts/**/*.js',
    'app/images/**/*'
  ]).on('change', $.livereload.changed);

  gulp.watch('app/styles/**/*.css', ['styles']);
  gulp.watch('bower.json', ['wiredep']);
});

gulp.task('styles', function () {
  return gulp.src('app/styles/main.css')
    .pipe($.autoprefixer({browsers: ['last 1 version']}))
    .pipe(gulp.dest('.tmp/styles'));
});

gulp.task('serve', ['express', 'watch'], function () {
  require('opn')('http://localhost:3000');
});
With this simple setup, when I run gulp serve in my cmd everything spins up fine and I can accept requests at http://localhost:3000.
Now if I go and change the body's background color from #fafafa to #f00 in main.css and hit save, my gulp output will respond with main.css was reloaded, as seen in the bottom of this screenshot.
However, my webpage does not update. The background color is still light-grey instead of red.
Is there perhaps a conflict between my express server config and the way gulp handles its files? Is my Express server forcing the use of app/styles/main.css rather than the use of .tmp/styles/main.css? Shouldn't the livereload script handle the injection of the new temporary file?
Thanks for any help.
EDIT:
I was able to move forward a bit by adding livereload.js to the script block of my index file, like so:
<script src="http://localhost:35729/livereload.js"></script>
I am now able to get live changes pushed to the client. Why was this file not getting injected before? How can I ensure this is getting used programmatically as opposed to pasting it into my files?
I was able to get past this issue by removing the app.use(require('connect-livereload')({port: 35729})) line from my gulpfile, along with a couple of other lines, and instantiating it in my Express server's app.js file instead.
My gulpfile's express task now looks like this:
gulp.task('express', function () {
  var app = require('./server/app');
  app.listen(3000);
});
I added connect-livereload just above where I specify my static directory in Express:
if (app.get('env') === 'development') {
  app.use(require('connect-livereload')());
}
app.use(express.static(path.join(__dirname, '../app')));
Once I started using this setup, I was getting the livereload.js script injected into my document, and client-side changes are now auto-refreshed just as I wanted.
Hope this helps someone!
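For reference, the relevant part of server/app.js then looks roughly like this (a sketch; the surrounding Express boilerplate is assumed rather than taken from the original answer):
var express = require('express');
var path = require('path');
var app = express();

// inject the livereload client script only in development,
// and register it before the static middleware so served pages pick it up
if (app.get('env') === 'development') {
  app.use(require('connect-livereload')());
}
app.use(express.static(path.join(__dirname, '../app')));

module.exports = app;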