Karma: Catch errors using jasmine & grunt - node.js

I have some problems with my somewhat special grunt-karma-requirejs-jasmine setup. Basically everything is up and running, and as long as the tests are defined properly in the spec files everything works perfectly, resulting in a JUnit XML file that shows me test successes and test failures.
Now the problem:
If there is a problem with the scripts the tests run against (e.g. undefined variables, unavailable modules: anything that would make a normal browser throw an error), Karma stops and throws an error, which is the default behaviour. What I really want Karma to do is not stop, but log those errors to my JUnit XML file, so that I end up with three kinds of results (success, failure, error) in the XML instead of on my console.
How can I:
- get Karma to handle errors in a custom way?
- get Karma to tell karma-junit-reporter what to log?
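For reference, the relevant parts of a karma.conf.js for this kind of setup look roughly like this (file patterns, output path and browser are placeholders, not my exact config):

// karma.conf.js (sketch - patterns, paths and browser are placeholders)
module.exports = function (config) {
  config.set({
    frameworks: ['jasmine', 'requirejs'],
    files: [
      'test-main.js',                                 // require.js bootstrap
      { pattern: 'src/**/*.js', included: false },
      { pattern: 'test/**/*.spec.js', included: false }
    ],
    reporters: ['progress', 'junit'],
    junitReporter: {
      outputFile: 'results/test-results.xml'          // the JUnit XML mentioned above
    },
    browsers: ['Chrome'],
    singleRun: true
  });
};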
Any help is appreciated! Thanks a lot guys!

Related

How to prevent Jest from running tests when an error occurred

I wonder if there is any way to prevent tests from running when we have an error.
For example, an error in the beforeAll() function. I have tried returning or throwing an error, but after that Jest still runs all of my tests.
So when my code in the beforeAll() function hits an error that can affect the other test results, I would like to be able to stop Jest from running the remaining tests.
Right now Jest tries to run all the tests even though we already know they would all fail.
You can try using bail in your config:
bail: 2 // stop after 2 failed tests
or
bail: true // stop after the first failure
https://jestjs.io/docs/cli#--bail
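In config form, that is just one key in jest.config.js (a minimal sketch; pick the value that fits your suite):

// jest.config.js (sketch)
module.exports = {
  bail: 1 // abort the run after the first failure instead of running everything
};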
To fail your test use:
fail('something wrong');

Error: write EPIPE when running Jest tests on Gitlab CI's personal VPS

What I'm trying to do is simple: create a job in GitLab CI, running on my own personal VPS, that executes the tests I've written. I am using NestJS as my backend. The problem is that, for some reason, one or more of the tests return a write EPIPE error. The only pattern I can see is that the error occurs only for tests that upload an image using a multipart form, and it doesn't occur consistently: when I ran the suite 3 times, write EPIPE sometimes occurred once, sometimes twice, sometimes not at all.
Here is my code snippet for uploading an image using a multipart form in the test:
it('should be able to upload image', () => {
  return request(myHost)
    .post('/upload-image')
    .expect(200)
    .attach('image', './testimage.jpg')
    .then((res): any => {
      expect(res.body).toEqual({});
    });
});
For additional information, testimage.jpg is only 13.9kB, so it's not a big file.
My Node version: 14.16.0, Jest version: 26.6.3, NestJS version: 7.6.15, and Ubuntu version: 20.04.
What I've tried: installing the libpng-dev and libfontconfig packages, and running the tests with the -- --silent flag; none of it worked.
I finally got it working by setting the Connection header to keep-alive on the tests that had the write EPIPE error. The best explanation I can give is that while the image is uploading, the connection somehow gets closed, so the upload process stops. That would explain why there is no error when I add console.log(err) to the test, and also no error on my backend side.
The explanation of the EPIPE error points in the same direction; see https://nodejs.org/api/errors.html, in the Common System Errors section.
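Applied to the snippet above, the fix amounts to one extra .set() call (a sketch of the adjusted test):

it('should be able to upload image', () => {
  return request(myHost)
    .post('/upload-image')
    // keep the connection open while the multipart body is streamed
    .set('Connection', 'keep-alive')
    .expect(200)
    .attach('image', './testimage.jpg')
    .then((res): any => {
      expect(res.body).toEqual({});
    });
});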
The part of the issue that I still don't understand is why the other image-upload tests did not encounter this EPIPE error. I have more than 20 tests that include the image-upload part, but fewer than 10 of them encountered it. On top of that, which of those tests hit it is random, meaning not every test in that group encounters it on every run.
For example, say I have 20 tests for uploading an image, named test1, test2, and so on. Only some of them encounter the EPIPE error, say test1 to test8 (in the real case, which tests get it is random). Then within test1 to test8, sometimes a given test gets it and sometimes it doesn't, but the range stays the same: on the first run the tests with the EPIPE error might be test1, test3, test5, and test6; on the second run they could be different, say test1, test3, test5, test7, and test8, but it never goes beyond test8.
Perhaps someone can explain this.

Handling process.exit(1) with Jest

I am writing my unit tests for NodeJS using Jest.
Part of my code exits using process.exit(1), so when I try to test it with Jest, the test run terminates when it reaches that line with the error Command failed with exit code 1, which is the default behaviour of process.exit(1).
Can anyone please tell me how to handle this scenario with Jest and continue with the other tests?
I think throwing an error instead gives you more maintainable and testable code. For example:
if (err) {
  throw new Error("Some relevant error msg");
}
If you insist on using process.exit(1), you can test for process.exitCode, which should be 1 in your case.
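If the code under test must keep calling process.exit(1), one way to assert on it without killing the Jest process is to stub process.exit and check the code it was called with. A sketch (doSomething() is a hypothetical function that calls process.exit(1) on error, not something from the question):

// Stubbing process.exit keeps the Jest worker alive so the remaining tests still run.
const exitSpy = jest.spyOn(process, 'exit').mockImplementation(() => {});

afterAll(() => {
  exitSpy.mockRestore(); // give process.exit its real behaviour back
});

test('exits with code 1 when the input is invalid', () => {
  doSomething('bad input'); // hypothetical code under test
  expect(exitSpy).toHaveBeenCalledWith(1);
});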

Is there a good way to print the time after each run with `mocha -w`?

I like letting mocha -w run in a terminal while I work on tests so I get immediate feedback, but when the status doesn't change I can't always tell at a glance whether anything happened - did it run, or did it get stuck (it's happened)?
I'd like to have a way to append a timestamp to the end of each test run, but ideally only when run in 'watch' mode - if I'm running it manually, of course I know if it ran or not.
For now, I'm appending an asynchronous console log to the last test that runs:
it('description', function () {
  // real test parts.should.test.things();
  // Trick - schedule the time to be printed to the log - so I can see when it was run last
  setTimeout(() => console.log(new Date().toDateString() + " # " + new Date().toTimeString()), 5);
});
Obviously this is ugly and bad for several reasons:
- It's manually added to the last test - I have to know which test that is.
- It's added every time that test runs, but never for the others - so if I run a different file or test, there's no log; if I run only that test manually, there is a log.
- It's just kind of an affront to the purpose of the tests - subverting them to serve my will.
I have seen some references to mocha adding a global.it object with the command line args, which could be searched for the '-w' flag, but that is even uglier, and still doesn't solve most of the problems.
Is there some other mocha add-in module which provides this? Or perhaps I've overlooked something in the options? Or perhaps I really shouldn't need this and I'm doing it all wrong to begin with?
Mocha supports root-level hooks. If you place an after hook (for example) outside any describe block, it should run at the end of all tests. It won't be limited to watch mode, of course, but should otherwise be fit for purpose.
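A minimal sketch of that idea, placed at the top level of one of the test files (outside any describe block):

// Root-level hook: defined outside any describe(), so Mocha attaches it to the
// root suite and runs it once at the end of every test run (watch reruns included).
after(function () {
  console.log('Test run finished at ' + new Date().toISOString());
});

new Date().toISOString() is just one formatting choice; the toDateString()/toTimeString() pair from the question works just as well.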

Inconsistently getting Error: [$injector:modulerr] Failed to instantiate module

I'm getting inconsistent results from Angular/Karma/Jasmine. When I run 'npm test', I get:
INFO [karma]: Karma v0.10.10 server started at http://localhost:9876/
INFO [launcher]: Starting browser Chrome
INFO [Chrome 35.0.1916 (Linux)]: Connected on socket aW0Inld7aRhONC2vo04k
Chrome 35.0.1916 (Linux): Executed 1 of 1 SUCCESS (0.345 secs / 0.016 secs)
Then, if I just save the code or test file (with no changes), it will sometimes give the same results and sometimes give errors:
INFO [watcher]: Removed file "/home/www/raffler/entries.js".
Chrome 35.0.1916 (Linux) Raffler controllers RafflerCtrl should start with an empty masterList FAILED
Error: [$injector:modulerr] Failed to instantiate module raffler due to:
Error: [$injector:nomod] Module 'raffler' is not available! You either misspelled the module name or forgot to load it. If registering a module ensure that you specify the dependencies as the second argument.
This is WITHOUT MAKING CHANGES. If I stop Karma and restart it, it works again, but once it fails it keeps failing. What gives? Buggy Angular/Jasmine/Karma? The code and test are trivial. Here is the code:
var myApp = angular.module('raffler', []);

myApp.controller('RafflerCtrl', function($scope, $log) {
  $scope.$log = $log;
  $scope.masterList = [];
});
And here is the test:
'use strict';

describe('Raffler controllers', function() {

  describe('RafflerCtrl', function() {
    var scope, ctrl;

    beforeEach(module('raffler'));

    beforeEach(inject(function($controller) {
      scope = {};
      ctrl = $controller('RafflerCtrl', {$scope: scope});
    }));

    it('should start with an empty masterList', function() {
      expect(scope.masterList.length).toBe(0);
    });

  });
});
Am I doing something dumb? Seems like it should give me consistent results, regardless of my stupidity level... Thanks.
You were asking if there was a bug. There is. The authors of Karma know that there are problems with file watching. See this issue: https://github.com/karma-runner/karma/issues/974
Simply saving the file without changes can trigger this behavior. There are two main ways that editors save files. The first is to delete the original (or rename it to .bak or similar) and then write out the new content. The second writes the new content to a temporary file, deletes the original, and then moves the temporary file to where the original used to be. In both cases the file system monitoring can fire an event saying that some files/directories changed, and Node is quick enough to detect that the file was removed and tell Karma to stop using it in its tests. A less common third way is to open the file and overwrite its contents in place, which keeps Karma happy.
Back to that bug: in the above scenario, the globs are not re-evaluated when file system changes are detected. So Karma thinks your file was removed; it never sees that a new file was added, and thus the file drops out of the test suite.
This bug is bothering me too. If you've got an idea or a pull request then I'd suggest providing it to the Karma team. There's already a patch being reviewed that should address these problems - see https://github.com/karma-runner/karma/issues/1123.
As a workaround, you can use "set backupcopy=yes" for vim. There may be settings in other editors to change the behavior so that the file is overwritten instead of replaced.
