SonarQube not detecting LCOV report generated using mocha - node.js

I have created a simple project using Node.js, mocha and generated the report for code coverage and unit testing as follows:
mocha -R lcov --ui tdd > coverage/coverage.lcov
mocha -R xunit --ui tdd > coverage/TEST-all.xml
The reports generated using the Sonar Runner do not reflect the coverage in SonarQube. The sample JavaScript test project using LCOV that ships with sonar-examples-master also shows 0% code coverage in SonarQube.
The sonar properties set are as follows:
sonar.language=js
sonar.sourceEncoding=UTF-8
sonar.tests=test
sonar.javascript.jstestdriver.reportsPath=coverage
sonar.javascript.lcov.reportPath=coverage/coverage.lcov
sonar.dynamicAnalysis=reuseReports
Looking forward to input on how to resolve this issue and enable SonarQube to report coverage from an existing LCOV report.
Thanks,
Neo

JS Test Driver support was removed as part of the Sonar JavaScript 1.5 release (http://jira.codehaus.org/browse/SONARPLUGINS-3408), so I switched back to the 1.4 plugin.
Regarding the LCOV report, I had to make the paths in the LCOV file match the sonar.sources path. So with
sonar.sources=webapp/app
the SF entries in the LCOV file looked like
SF:webapp/app/path/to/js.js
Hope that helps, I can correct anything I might have gotten wrong tomorrow when I'm at work again.
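If an existing report was generated with absolute paths, one quick way to make the SF: entries relative is a sed rewrite over the report. This is a sketch; the prefix below is a made-up example path, so substitute the checkout path your build actually uses:

```shell
# Hypothetical absolute checkout prefix; adjust to your environment.
PREFIX=/home/build/myproject

# Strip the prefix from every SF: line so the paths become relative to
# the project root and line up with sonar.sources=webapp/app.
sed -i "s|^SF:$PREFIX/|SF:|" coverage/coverage.lcov
```

After this, an entry like SF:/home/build/myproject/webapp/app/foo.js becomes SF:webapp/app/foo.js, which SonarQube can match against the sources.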

Related

GitLab failed pipeline when merging YAML file

I'm trying to merge a YAML file containing 7000 lines of Swagger documentation into GitLab, but the pipeline keeps failing. The failure happens because there are no unit tests, and I'm not aware of a way to unit test a Swagger file. I've included the script that's causing the failure. Thanks for any and all help.
script:
"testForbidOnly": "npm run apolloEnv && TEST=true nyc mocha --recursive --exit --forbid-only"
You will need to configure the code coverage tool Istanbul (CLI: nyc) to ignore files with the yml|yaml extension. See: Excluding files from coverage when using Mocha and Istanbul.
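One place to put that exclusion is a .nycrc file at the project root; a minimal sketch (the glob patterns are an assumption, so widen or narrow them to match where your YAML files actually live):

```json
{
  "exclude": [
    "**/*.yml",
    "**/*.yaml"
  ]
}
```

The same "exclude" array can alternatively live under an "nyc" key in package.json.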

Why does nyc not gather test coverage for my project?

I am developing an extension for VS Code using TypeScript. I set up a GitHub Action that executes my test cases. To gather test coverage I am using nyc.
These are my node scripts:
"compile": "tsc -p ./",
"test": "npm run compile && node ./out/test/runTest.js",
"coverage": "nyc npm run test"
When I run npm run coverage I see that all my test cases are executed (they are not stored in the file runTest.js), but coverage is gathered only for the test file runTest.js. The relevant classes lib.ts and extension.ts are not shown in the coverage.
Why is this so? What am I doing wrong?
See my package.json and .nycrc.json for configuration.
I fixed this problem by following the great blog post I found posted on a similar question.
The problem is that NYC is a command-line tool. When executing the tests I focused on my configuration in the .nycrc file. The fact is that the Visual Studio Code test runner does not use this configuration; coverage has to be configured inside the test runner itself.
I fixed the broken test coverage generation in this commit.

How would I setup bamboo unit testing that runs both karma and jest

I have been trying to set up server-side testing for my application but have run into an issue with Bamboo recognizing more than one test type. In the past, it worked just fine using only Karma for the unit tests, but I need to add server-side testing, which works much better with Jest.
The main problem I am running into is that when both are run as shown below, the coverage report is only created for the Karma (unit) tests.
test-unit=npm run test:unit && npm run test:server
I have also tried running Jest under Bamboo's test-e2e and test-contract-test, but when I do this nothing is reported. So is there a way to set up server-side testing separately in the .bamboorc file?
So, I found an answer!
In the .bamboorc file, move test-unit=npm run test:unit to the end of the file, and for the first test use
test-contract-consumer=npm run test:server -- --collect-coverage
This should then collect the coverage for the server tests in Bamboo under the contract tests consumer artifacts, and the Karma unit tests should still show up under the unit test coverage artifacts.
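Putting those two lines together, the relevant part of the .bamboorc would look roughly like this (assuming the npm scripts test:server and test:unit from the question; ordering matters, with the unit test entry last):

```ini
test-contract-consumer=npm run test:server -- --collect-coverage
test-unit=npm run test:unit
```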

Run Jest unit test with TFS 2015

Has anyone attempted to integrate Jest unit tests with TFS 2015? I tried to use the Chutzpah Test Adapter (https://visualstudiogallery.msdn.microsoft.com/f8741f04-bae4-4900-81c7-7c9bfb9ed1fe?SRC=VSIDE), however it's not able to recognize Jest. I receive the error below:
Can't find variable Jest
When I run the unit tests through npm test I get the results. However, to integrate with TFS 2015 I need a test runner that can run Jest unit tests in conjunction with vstest.console.exe, which TFS 2015 provides so it can manage build results and publish them in the build summary report.
Any help would be appreciated!!
Any test runner which can run tests using below command should work (considering VS 2015 installed on the system):
"C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe" "\test.js" /UseVsixExtensions:true
Extending Merlin's answer, here is how I've implemented publishing Jest test results AND code coverage to TFS 2015 vNext builds (I am using the create-react-app boilerplate):
First install the required packages on the server you are running your agent on:
npm install -g jest-json-to-tap
npm install -g tap-xunit
configure Jest to output JSON by changing the "test" task in package.json to:
"test": "react-scripts test --env=jsdom --json",
configure Jest options in package.json:
"jest": { "coverageReporters": ["cobertura"] }
create a vNext build (TFS2015v4) with the following tasks:
a. "npm" task, command=run, arguments=test -- --coverage | jest-json-to-tap | tap-xunit > TEST-result.xml
b. "publish test results" task, format=JUnit
c. "publish code coverage results" task, code coverage tool=Cobertura, Summary file=$(Build.Repository.LocalPath)\coverage\cobertura-coverage.xml
make sure your build's "Variables" include setting the environment variable "CI"="true"
NOTES:
- test results will not include times nor assemblies - something to extend for the future...
Voilà! Running this build will correctly publish the test results and code coverage stats, as well as report artifacts.
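For reference, the package.json pieces from the steps above combine into roughly this fragment (script name and reporter exactly as in the steps; the surrounding structure is the usual package.json layout):

```json
{
  "scripts": {
    "test": "react-scripts test --env=jsdom --json"
  },
  "jest": {
    "coverageReporters": ["cobertura"]
  }
}
```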
I'm not sure about Jest, but there's a neat npm package that can convert TAP-based results to xUnit XML format, and then you can publish that to TFS.
Take a look at tap-xunit.
I had a build environment where JavaScript testing was done by various tools and frameworks (AVA, Mocha, Jasmine, etc.). We decided to export them all to TAP format, run them through tap-xunit, and then publish to TFS.
Basically, you need something like this:
npm test | tap-xunit > results.xml
You pipe the results through tap-xunit and save them to an XML file. This gives you an xUnit-formatted XML that you can publish to TFS. If you're running TFS 2015, I strongly recommend going with vNext builds; they are a lot easier to get running. Check the "Publish Test Results" build step.
If you are running with XAML build, this link will help you: Javascript Unit Tests on Team Foundation Service with Chutzpah
If you are running with a vNext build, please try the detailed steps for a Jasmine.JS test (also a kind of JavaScript test) in this blog.

Branch coverage% mismatch between Istanbul and Sonar

My Jenkins job reads the LCOV file, generated by Istanbul, via Sonar Runner. The numbers/misses in the lcov-report generated by Istanbul do not match those displayed in Sonar. There is a 0-7% difference, with Istanbul being stricter and finding more misses.
Is it expected? Why the difference?
Environment:
SonarQube 3.5 and 3.7.4
SonarRunner 2.3
Sonar JavaScript plugin 1.6
Node.js code
Coverage % for a single file shouldn't differ, except for rounding. As for the project's overall coverage %, you'll need to experiment with sonar.exclusions. This is what we're using for a specific Node project:
sonar.sources=.
sonar.exclusions=src/**/*,test/**/*,node_modules/**/*,public/**/*,coverage/**/*,html-report/**/*,views/**/*,Gruntfile.js,*.html
sonar.tests=test
