I have a node.js project running mocha tests, and I'm generating a coverage report using blanket. I've managed to get the coverage report generated, but I'm not sure how to generate a report that can be consumed by and viewed in Jenkins. Any suggestions? I'm looking for a result similar to the Cobertura plugin (https://wiki.jenkins-ci.org/display/JENKINS/Cobertura+Plugin).
Edit: Sorry, I misread your question. I don't know whether the coverage report gets published along with the xUnit report, so the following might not help you.
The xUnit reporter should create a report that Jenkins can parse.
Check out this blog post.
Also, have a look at the xUnit Plugin; it lets you specify a parser for various kinds of report formats.
Source: https://blog.dylants.com/2013/06/21/jenkins-and-node/
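As a rough sketch of the commands involved (this assumes mocha and blanket are installed as dev dependencies; the `mocha-cobertura-reporter` package name is an assumption you would need to verify on npm before relying on it):

```shell
# Produce an xUnit-format test report that the Jenkins xUnit plugin can parse
mocha -R xunit test/ > reports/xunit.xml

# Hypothetical: with blanket required for instrumentation, a Cobertura-style
# reporter could emit coverage XML for the Jenkins Cobertura plugin
mocha -r blanket -R mocha-cobertura-reporter test/ > reports/coverage.xml
```

In Jenkins you would then point the xUnit (or JUnit) plugin at `reports/xunit.xml` and the Cobertura plugin at `reports/coverage.xml`.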
I have installed SonarQube and integrated it with a Node.js application.
Right now my code coverage is reported as zero, so I created a sonar-project.properties file as below:
sonar.projectKey=MyApp
sonar.login=[token]
sonar.sources=.
#sonar.exclusions=app/node_modules/*,app/coverage/lcov-report/*
# coverage reporting
sonar.javascript.coveragePlugin=lcov
sonar.javascript.lcov.reportPaths=app/coverage/lcov.info
but my code coverage still comes out as 0%.
Any idea how I can solve this?
There are some information gaps we should fill in so I can help you better: at what point are you trying to collect coverage? Are you using some kind of CI/CD pipeline?
That said, I had a similar problem a while back, and I believe this might help: it turns out that when you use SonarCloud in a pipeline, the code Sonar analyzes for "new code" metrics is the diff between the base source code and yours.
So if you add or change code that is not covered by the test setup, it will show coverage as "no information", and if you add or change covered code, it will show a percentage that refers only to the changed code.
I will also share my Sonar setup from a personal project; I hope it helps.
sonar.projectKey=some_organization_algorithms-data-structures
sonar.organization=some_organization
sonar.projectName=Algorithms and Data Structures
sonar.sourceEncoding=UTF-8
sonar.sources=src
sonar.exclusions=**/*.spec.ts
sonar.tests=__tests__
sonar.test.inclusions=**/*.spec.ts
sonar.javascript.lcov.reportPaths=coverage/lcov.info
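One common cause of a 0% reading is that the path in `sonar.javascript.lcov.reportPaths` doesn't match where `lcov.info` actually lives, or the file is empty. As a quick sanity check, you can compute the line coverage from the lcov file yourself with plain Node (the sample lcov text below is illustrative; in practice you would load the real file with `fs.readFileSync('app/coverage/lcov.info', 'utf8')`):

```javascript
// Minimal lcov sanity check: sums the LF (lines found) and LH (lines hit)
// records across all source files and prints the overall line coverage.
const lcovText = [
  'SF:src/index.js',
  'DA:1,1',
  'DA:2,0',
  'LF:2',
  'LH:1',
  'end_of_record',
].join('\n');

function lineCoverage(text) {
  let found = 0;
  let hit = 0;
  for (const line of text.split('\n')) {
    if (line.startsWith('LF:')) found += Number(line.slice(3));
    if (line.startsWith('LH:')) hit += Number(line.slice(3));
  }
  return found === 0 ? 0 : (100 * hit) / found;
}

console.log(`line coverage: ${lineCoverage(lcovText)}%`);
```

If this prints 0% (or the file can't be read at all), the problem is in the report generation or the path, not in Sonar.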
I know there are lots of tools available for measuring code coverage of Mocha/Jasmine tests, but my automation framework is the protractor-cucumber framework. Is there any code coverage process that works for this? Any tools or examples would be appreciated.
As far as I know, code coverage measures how much of the application code is exercised by unit tests, since unit tests have access to almost every line of application code.
Protractor is an end-to-end automation testing tool, so you should be thinking about functional/scenario coverage rather than code coverage.
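That said, if you do still want code coverage from end-to-end runs, one hedged approach (assuming your backend is Node and nyc, Istanbul's CLI, is installed; the file names are illustrative) is to start the application under the instrumenter while Protractor exercises it:

```shell
# Start the server under nyc, which instruments the Node code it loads and
# records which application lines the Protractor scenarios actually execute
nyc --reporter=lcov node server.js &

# Run the protractor-cucumber suite against the instrumented server
protractor protractor.conf.js

# Shut the server down cleanly; nyc writes the lcov report to ./coverage
# when the instrumented process exits
```

This only covers server-side code; coverage of browser-side JavaScript driven by Protractor needs separate in-browser instrumentation.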
Let me start by saying I'm new to both ReSharper and dotCover and that I'm using v10.0.2 of both.
The attached screenshot shows solution explorer in VS and the coverage tree for a set of tests.
Whenever I run coverage, it always shows the same subset of assemblies in the coverage tree. Importantly, all of the tests shown are for code in either the Services or Infrastructure assemblies, neither of which show in the coverage tree.
Clearly, the product is not doing something right or I'm not.
Why are only some of the assemblies shown in the coverage tree?
Why aren't any of the assemblies covered by the tests I'm running shown in the coverage tree?
How do I make it work properly?
EDIT
If it makes any difference, I'm using xUnit and have the xUnit runner extension installed in ReSharper, and the tests themselves run just fine.
This is due to shadow copying - when enabled, dotCover expects .pdb files to be copied too, and the standard shadow copy that xunit performs doesn't do this. If you disable shadow copy in the Unit Testing options page, it'll work fine. I think the xunit runner can be updated to fix this.
The YouTrack issue that describes what's going on is here: DCVR-7976
In my case, the *.pdb files were deleted by a post-build event. After changing that, coverage analysis worked again.
This post from the JetBrains support forum helped me.
I'm confused: I see people using both, yet they're both code coverage reporting tools. Is it just that people use Istanbul's functionality but prefer the Coveralls UI over Istanbul's HTML output files as a nicer coverage viewer? Is that the reason to use both?
Istanbul generates coverage information and coveralls provides historical coverage reporting. Istanbul provides a snapshot of where you are; coveralls tells you where you have been.
Typically, you use coveralls as part of a CI/CD pipeline: local build, push to Git, Travis build, push results to coveralls, ...
When you build your project, you will look at your lcov html report to review coverage. How do you know if your coverage has increased or reduced? Look at coveralls for the history.
Shields.io provides badges for Coveralls coverage that you can wear on your GitHub README.md which also shows on npmjs.com if you publish there. It is a nice quality indicator for people using your product and equally nice as a note-to-self that your coverage is slipping (badges are colored and show a % coverage).
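As a concrete sketch of that pipeline step (assuming a mocha test suite and the `coveralls` npm package as a dev dependency; exact paths vary by project):

```shell
# Generate lcov data with Istanbul (the underscored _mocha binary runs mocha
# in-process, so Istanbul can instrument the code under test directly)
istanbul cover ./node_modules/.bin/_mocha -- -R spec test/

# Send the lcov report to Coveralls, which records it as one data point
# in the project's coverage history
cat ./coverage/lcov.info | ./node_modules/.bin/coveralls
```

Run locally, the first command alone gives you the Istanbul snapshot (including the HTML report); the second only makes sense in CI, where Coveralls accumulates the history.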
We would like to be able to deploy our code to azure and then run integration/acceptance tests on the deployed instances to validate the functionality, as using the emulator does not always give realistic results.
We would also like to have these tests generate code coverage reports which we could then merge in with the code coverage from our unit tests. We are using TeamCity as our build server with the built in dotcover as our code coverage tool.
Can we do this? Does anyone have any pointers on where to start?
Check out this video
Kudu can be extended to run unit tests and much more using custom deployment scripts.
http://www.windowsazure.com/en-us/documentation/videos/custom-web-site-deployment-scripts-with-kudu/
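As a minimal sketch, a Kudu custom deployment script (generated with the `azure site deploymentscript` command and then edited; the test step shown is an assumption about your project's npm scripts) can run tests as part of the deployment:

```shell
#!/bin/bash
# Fragment of a Kudu custom deployment script (deploy.sh): install
# dependencies, then run the test suite and abort the deployment if it fails
npm install
npm test || { echo "Tests failed; aborting deployment"; exit 1; }
```

Collecting dotCover data from code running inside Azure is a separate problem; merging those results with your TeamCity unit-test coverage would happen back on the build server.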