SonarQube showing 0% code coverage in a Node.js application

I have installed SonarQube and integrated it with a Node.js application.
Right now the code coverage is reported as zero, so I created a sonar-project.properties file as below:
sonar.projectKey=MyApp
sonar.login=[token]
sonar.sources=.
#sonar.exclusions=app/node_modules/*,app/coverage/lcov-report/*
# coverage reporting
sonar.javascript.coveragePlugin=lcov
sonar.javascript.lcov.reportPaths=app/coverage/lcov.info
but my code coverage is still coming out as 0%.
Any idea how I can solve this?

There are some information gaps we should fill in so that I can help you better: at what point are you trying to get coverage? Are you using some kind of CI/CD pipeline?
That said, I had a similar problem a while back and I believe this might help: it turns out that when you use SonarCloud in a pipeline, the code that Sonar reports on by default is only the difference between the existing source code and yours, i.e. the new code.
So if you add or change code that is not covered by the test setup, it will show coverage as having no information, and if you add or change covered code, it will show a percentage that refers only to the changed code.
I will also share my Sonar setup from a personal project; I hope it helps.
sonar.projectKey=some_organization_algorithms-data-structures
sonar.organization=some_organization
sonar.projectName=Algorithms and Data Structures
sonar.sourceEncoding=UTF-8
sonar.sources=src
sonar.exclusions=**/*.spec.ts
sonar.tests=__tests__
sonar.test.inclusions=**/*.spec.ts
sonar.javascript.lcov.reportPaths=coverage/lcov.info
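One more thing worth checking: the lcov.info file that sonar.javascript.lcov.reportPaths points at has to exist before the scanner runs, i.e. your test runner must write it first. As a rough sketch, assuming Jest is the test runner (nyc/Istanbul works similarly), a config like this produces coverage/lcov.info in the expected place:

// jest.config.js (hypothetical minimal example)
module.exports = {
  collectCoverage: true,
  // the 'lcov' reporter writes coverage/lcov.info,
  // which matches sonar.javascript.lcov.reportPaths above
  coverageReporters: ['lcov', 'text-summary'],
  coverageDirectory: 'coverage',
};

Run the tests with coverage (e.g. jest --coverage) before invoking the scanner, so the file already exists when the analysis starts; the report path in the properties file is resolved relative to the project base directory.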

Related

TFS Build Definition in VS 2013

Can anyone explain the use of parameters like Automated Tests, Run Settings, Analyze Test Impact, etc. (each of them) that appear under Test in the Process section of a Build Definition in TFS for Visual Studio 2013? I am a fresher and new to this technology, so could someone explain this in detail?
This is a very general and basic question, and most of it is about concepts.
Take run settings, for example:
Unit tests in Visual Studio can be configured by using a *.runsettings file.
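A minimal sketch of such a file, using the standard Visual Studio elements (the values are placeholders you would adjust for your own project):

<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
  <!-- general options for the test run -->
  <RunConfiguration>
    <ResultsDirectory>.\TestResults</ResultsDirectory>
    <TargetPlatform>x86</TargetPlatform>
  </RunConfiguration>
  <!-- enable the built-in code coverage data collector -->
  <DataCollectionRunSettings>
    <DataCollectors>
      <DataCollector friendlyName="Code Coverage" />
    </DataCollectors>
  </DataCollectionRunSettings>
</RunSettings>

In the TFS 2013 build definition you would then point the run settings option under the Test section at this file.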
Analyze test impact is used to determine which tests should be run since a previous build; it applies only to manual testing on TFS 2013. For more details, please refer to this tutorial.
Because of length limitations, I won't repeat more here. If you are completely new to testing in TFS, you could take some time to go through all the topics in this link: Testing the application. It includes all the answers you want and can help you quickly and comprehensively understand this area.

Consume blanket.js coverage reports in Jenkins

I have a node.js project running mocha tests, and I'm generating a coverage report using blanket. I've managed to get the coverage report generated, but I'm not sure how to generate a report that can be consumed by and viewed in Jenkins. Any suggestions? I'm looking for a result similar to the Cobertura plugin (https://wiki.jenkins-ci.org/display/JENKINS/Cobertura+Plugin).
Edit: sorry, I misread your question. I don't know whether the coverage report gets published along with the xUnit report, so the following might not help you.
The xUnit reporter should create a report that can be parsed by Jenkins.
Check out this blog post.
Also, have a look at the xUnit Plugin; it lets you specify the parser for various kinds of report formats.
Quote for persistence:
Source https://blog.dylants.com/2013/06/21/jenkins-and-node/
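To make that concrete, a hedged sketch of what it could look like with a mocha version of that era (the paths and the blanket pattern are assumptions; adjust them to your layout, and make sure the reports/ directory exists first):

# package.json needs blanket configured, e.g.
#   "config": { "blanket": { "pattern": "lib" } }

# test results in xUnit format for the Jenkins xUnit/JUnit parsers
mocha -R xunit test/ > reports/xunit.xml

# human-readable coverage from blanket, via mocha's (old) html-cov reporter
mocha -r blanket -R html-cov test/ > reports/coverage.html

The HTML file can be published with the Jenkins HTML Publisher plugin; getting Cobertura-style numbers would need a reporter that emits Cobertura XML, which, as far as I know, blanket does not provide out of the box.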

dotCover not showing all of the projects in a solution

Let me start by saying I'm new to both ReSharper and dotCover and that I'm using v10.0.2 of both.
The attached screenshot shows solution explorer in VS and the coverage tree for a set of tests.
Whenever I run coverage, it always shows the same subset of assemblies in the coverage tree. Importantly, all of the tests shown are for code in either the Services or Infrastructure assemblies, neither of which show in the coverage tree.
Clearly, the product is not doing something right or I'm not.
Why are only some of the assemblies shown in the coverage tree?
Why aren't any of the assemblies covered by the tests I'm running shown in the coverage tree?
How do I make it work properly?
EDIT
If it makes any difference, I'm using xUnit and have the xUnit runner extension installed in ReSharper, and the tests themselves run just fine.
This is due to shadow copying: when it is enabled, dotCover expects the .pdb files to be copied too, and the standard shadow copy that xUnit performs doesn't do this. If you disable shadow copy on the Unit Testing options page, it will work fine. I think the xUnit runner could be updated to fix this.
The YouTrack issue that describes what's going on is here: DCVR-7976
In my case the *.pdb files were deleted by a post-build event. After changing that, coverage analysis worked again.
This post from the JetBrains support forum helped me.

Is there any replacement for SONAR for .NET code quality that can generate a report from it?

I have a Visual Studio solution which is written in C# 4.0.
I want to check the code quality for my solution and generate report out of it.
I tried FxCop and got a report, but I need a report something like this (from the image),
where rules compliance is 85%; FxCop only showed me critical, error, etc.
I was not even able to get my project into SONAR, because I ran into a timeout issue
with one of the projects in the solution.
Please, can someone help me?
Thanks in advance.
Regards,
Roopini
I don't know whether there is an equivalent of SonarQube for .NET projects, but if you really want such reporting (which I can understand, obviously!), you should instead ask how to resolve your SonarQube installation issue rather than search for something else. There are plenty of organizations where big .NET solutions are successfully analyzed with SonarQube and the C# plugins, so there's no reason why it can't work for you!
You can find useful material on the net to help you on this. For instance, a blog post written by John M Wright about "setting up SonarQube for C# projects". John periodically updates his post, so the information should still be very relevant.
Have you tried the tool NDepend? It generates interactive reports about .NET code quality and code rules compliance. Here are some sample reports.
NDepend also integrates into Visual Studio (2017, 2015, 2013, 2012, 2010) and offers a range of interactive features (graph, dependency matrix, code metrics visualization, code diff...). Another point about NDepend is that code rules are actually C# LINQ queries, so it is pretty easy to customize a default code rule or create your own rules.
NDepend also integrates with VS Team Services, so you get all code quality data in your VSTS UI instead of being redirected to a server.
I read that you have timeout problems analyzing your code base; maybe that's because your code base is pretty large. NDepend is optimized and can analyze a very large code base and create a report in a few dozen seconds (it takes around a minute to analyze the whole .NET Framework).
A 14-day, full-featured trial is available.
Disclaimer: I work in the NDepend team
If you haven't already, I would suggest taking a look at my blog post on setting up SonarQube for C# projects: http://www.wrightfully.com/setting-up-sonar-analysis-for-c-projects/
The key to fixing your issue will be determining what the system is doing when the timeout occurs. Take a look at your log files and see what the last lines were before it timed out. It could be that your code is complex and just needs more time, in which case you can adjust the timeout values for whichever tool is running at the time.
Otherwise, I would suggest running whichever analysis tool (FxCop, Gendarme, StyleCop, etc.) was active when the timeout occurred outside of SonarQube. That is, run the tool directly from the command line to see if it still times out or provides any additional information on the console.
Also, assuming you're using the sonar-runner tool to execute the SonarQube analysis, you can add the -X argument to the commandline, which will run it with debug-level logging enabled. This will create a LOT more log messages which may shed some additional light on the issue.
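For completeness, a minimal sonar-project.properties for a C# solution in that setup could look roughly like this (property keys as used by the sonar-runner of that generation; treat the values as placeholders):

sonar.projectKey=my_company:my_solution
sonar.projectName=My Solution
sonar.projectVersion=1.0
sonar.sourceEncoding=UTF-8
sonar.sources=.
# older C# plugin versions expect the language to be set explicitly
sonar.language=cs

Then run the analysis from the solution folder with sonar-runner -X, so the debug log shows exactly which step (FxCop, StyleCop, Gendarme, ...) is executing when the timeout hits.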

Can I generate code coverage from deployments on Azure?

We would like to be able to deploy our code to Azure and then run integration/acceptance tests on the deployed instances to validate the functionality, as using the emulator does not always give realistic results.
We would also like these tests to generate code coverage reports, which we could then merge with the code coverage from our unit tests. We are using TeamCity as our build server with its built-in dotCover as our code coverage tool.
Can we do this? Does anyone have any pointers on where to start?
Check out this video.
Kudu can be extended to run unit tests and much more using custom deployment scripts.
http://www.windowsazure.com/en-us/documentation/videos/custom-web-site-deployment-scripts-with-kudu/
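As a rough, hypothetical sketch of the idea for a Node.js site (the script generated by the old Azure CLI's "azure site deploymentscript" command is longer and also handles KuduSync; this only shows where a test step would go):

.deployment
[config]
command = deploy.cmd

deploy.cmd (excerpt)
:: install dependencies, then run the tests; a non-zero exit code fails the deployment
call npm install
IF %ERRORLEVEL% NEQ 0 exit /b 1
call npm test
IF %ERRORLEVEL% NEQ 0 exit /b 1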
