GitLab pipeline fails when merging a YAML file

I'm trying to merge a YAML file containing 7000 lines of Swagger documentation into GitLab, but the pipeline keeps failing. The failure happens because there are no unit tests, and I'm not aware of a way to unit test a Swagger file. I've included the script that's causing the failure. Thanks for any and all help.
script:
"testForbidOnly": "npm run apolloEnv && TEST=true nyc mocha --recursive --exit --forbid-only"

You will need to configure the code coverage tool Istanbul (CLI: nyc) to ignore files with the yml or yaml extension. See: Excluding files from coverage when using Mocha and Istanbul.
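A minimal sketch of that exclusion, assuming an .nycrc file at the project root (the glob patterns here are illustrative, not taken from the question):

```json
{
  "exclude": [
    "**/*.yml",
    "**/*.yaml",
    "node_modules/**"
  ]
}
```

With this in place, nyc should skip the Swagger file entirely when computing coverage.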

Related

Why does nyc not gather test coverage for my project?

I am developing an extension for VS Code using TypeScript. I set up a GitHub Action which executes my test cases. To gather the test coverage I am using nyc.
These are my node scripts:
"compile": "tsc -p ./",
"test": "npm run compile && node ./out/test/runTest.js",
"coverage": "nyc npm run test"
When I run npm run coverage I see all my test cases getting executed (they are not stored in the file runTest.js), but coverage is gathered only for the test file runTest.js. The relevant classes lib.ts and extension.ts are not shown in the coverage.
Why is this so? What am I doing wrong?
See my package.json and .nycrc.json for configuration.
I could fix this problem by following the great blog post which I found posted on a similar question.
The problem is that nyc is a command-line tool. When executing the tests I focused on my configuration in the .nycrc file. In fact, the Visual Studio Code test runner does not use this configuration; coverage has to be configured inside the test runner itself.
I fixed the broken test coverage generation inside this commit.

How would I set up Bamboo unit testing that runs both karma and jest?

I have been trying to set up server-side testing for my application but have run into an issue with Bamboo recognizing more than one test type. In the past it worked just fine using only karma for unit tests, but I need to add server-side testing, which works much better with jest.
The main problem I am running into is that when both are run as shown below, the coverage report is only created for the karma (unit) tests.
test-unit=npm run test:unit && npm run test:server
I have also tried running jest under Bamboo's test-e2e and test-contract-test, but when I do this nothing is reported. So is there a way to set up server-side testing separately in the .bamboorc file?
So, I found an answer!
In the .bamboorc file, move test-unit=npm run test:unit to the end of the file, and for the first test use
test-contract-consumer=npm run test:server -- --collect-coverage
This should then collect the coverage for the server tests in Bamboo under the contract tests consumer artifacts, and the karma unit tests should still show up under the unit test coverage artifacts.
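A sketch of the resulting .bamboorc, assuming the two npm scripts named above (the exact task keys depend on your Bamboo setup):

```
test-contract-consumer=npm run test:server -- --collect-coverage
test-unit=npm run test:unit
```

The ordering matters here: the server tests run under the contract-consumer task first, and the karma unit tests last.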

Mocha istanbul not covering one file

I am using grunt-mocha-istanbul to run my test cases and check code coverage. When I run the command npm test, all my test cases in every file get executed. However, when it comes to code coverage, one of the files is not checked. Interestingly, all the other files in that folder have been tested for code coverage.
I am not able to find any error logs stating where the problem is, either. Could someone guide me?

How to make Istanbul generate coverage for all of my source code?

Currently Istanbul is only generating coverage for files that are used in my tests, which is okay, but seems to defeat the purpose of having coverage somewhat.
I have no Istanbul configuration, and am invoking it via npm test with the following script string:
$ istanbul cover _mocha -- -R dot --check-leaks --recursive test/
Is there a way to generate coverage for all of my source code?
Found the answer. I think I'm partly lucky that the directory structure I have chosen allows me to use this option, but my test command is now:
$ istanbul --include-all-sources cover _mocha -- -R dot --recursive test/
The --include-all-sources is the important part.
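As an aside, istanbul cover is the older CLI; the newer nyc wrapper exposes the same behavior via an --all flag, so a rough equivalent (a sketch, assuming nyc and mocha are installed in the project) would be:

```
nyc --all mocha -R dot --recursive test/
```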
Istanbul recommends using nyc in order to check code coverage. It suggests an approach like this:
nyc mocha
After running this command, we'll get the coverage report. But there are a couple of pitfalls.
First of all, by default mocha looks for tests in the test folder. To override this, we have to set our own path in a mocha.opts file, like this:
nyc mocha --opts ./mocha.opts
And mocha.opts contains, for example:
spec/unit/back-end/**/*.spec.js
Another problem is that by default nyc checks coverage only for required files, which is what your question is about. The solution is to set two options for nyc (I run tests as an npm script, so I set the options in package.json). Here is the code:
"nyc": {
"all": true,
"include": [
"routes/*.js",
"routes/**/*.js",
"models/*.js"
]
},
"scripts": {
"mocha": "nyc mocha --opts ./mocha.opts",
}
Another way to achieve this is to set the exclude option instead of include, in order to exclude inappropriate files from coverage checking. Strangely, the all option alone doesn't work; it requires include or exclude to be set as well. You can get more info about nyc options via nyc --help.
P.S. I don't know nyc and mocha deeply; this is only based on my own experience.
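For example, an exclude-based variant of the same package.json block might look like this (a sketch; the spec/ and coverage/ paths are assumptions, not from the question):

```json
"nyc": {
    "all": true,
    "exclude": [
        "spec/**",
        "coverage/**"
    ]
}
```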
For generating coverage for all files, have the following in your package.json
"istanbulCoverage": "nyc --reporter=lcov --reporter=text-lcov --all -x
\"./node_modules/\" -x \"./coverage/\" check-coverage --functions 90 npm run test"
Here the --all flag fetches all the files in your project. If you want to exclude specific files or folders, you can use the -x option.
Apart from this, if you want to enforce a coverage rate for your application, use the check-coverage option to specify the threshold. In my case, I've specified that functions must have a coverage threshold of 90%; otherwise, the coverage report generation fails. (I am running my karma tests after coverage report generation.)
Hope this helped :)
In my case, --include-all-sources did not work. Files that were not require-d were still excluded from the final coverage report.
Eventually, I came across this issue on the istanbul GitHub where the maintainer stated:
Yes, that is expected behavior. istanbul works by hooking require so if a file is never require-d it is as if it doesn't exist.
The only fool-proof solution that I found was to manually require all files that I wanted to include in my coverage report. I created the file include-all.test.js alongside my other test scripts and added the following bit of code:
var glob = require( 'glob' );
var path = require( 'path' );

glob.sync( './path/to/js/code/*.js' ).forEach( function( file ) {
    // we don't care about errors here, we just want to require the file
    try {
        require( path.resolve( file ) );
    } catch ( e ) {}
});
This will absolutely ensure that your untested files are included in the istanbul coverage report.

SonarQube not detecting LCOV report generated using mocha

I have created a simple project using Node.js and mocha, and generated the reports for code coverage and unit testing as follows:
mocha -R lcov --ui tdd > coverage/coverage.lcov
mocha -R xunit --ui tdd > coverage/TEST-all.xml
The reports generated using the sonar runner do not reflect the coverage in SonarQube. The sample JavaScript test project using LCOV that ships with sonar-examples-master likewise shows 0% code coverage in SonarQube.
The sonar properties set are as follows:
sonar.language=js
sonar.sourceEncoding=UTF-8
sonar.tests=test
sonar.javascript.jstestdriver.reportsPath=coverage
sonar.javascript.lcov.reportPath=coverage/coverage.lcov
sonar.dynamicAnalysis=reuseReports
Looking forward to any input on how to resolve this issue and enable SonarQube to report coverage from an existing LCOV report.
Thanks,
Neo
JS Test Driver support was removed as part of the SonarQube JavaScript plugin 1.5 release: http://jira.codehaus.org/browse/SONARPLUGINS-3408
So I switched back to the 1.4 plugin.
Regarding the LCOV report, I had to make the paths in the LCOV file match the sonar.sources path.
So
sonar.sources=webapp/app
LCOV was like
SF:webapp/app/path/to/js.js
Hope that helps, I can correct anything I might have gotten wrong tomorrow when I'm at work again.
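If your coverage tool emits absolute SF: paths, one way to make them match sonar.sources is to rewrite them in place. A sketch (the /build/agent prefix and file name are made-up examples, and this assumes GNU sed):

```shell
# sample LCOV record with an absolute path (hypothetical)
printf 'SF:/build/agent/webapp/app/main.js\nend_of_record\n' > coverage.lcov

# strip everything before webapp/app/ so the SF: paths become
# relative to the project root, matching sonar.sources=webapp/app
sed -i 's|^SF:.*/webapp/app/|SF:webapp/app/|' coverage.lcov
```

After this, the SF: entries line up with the paths SonarQube resolves under sonar.sources.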
