Need to merge coverage reports in Azure DevOps

I am using Cobertura to generate coverage.xml for the Azure pipeline. I need to merge two coverage reports, but the pipeline is picking up only the latest one.
Can anyone let me know how I can generate a single coverage.xml for the pipeline?
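The pipeline publishes a single coverage summary file, so the reports need to be merged before the publish step. One common approach (not from the original thread, so treat the tool choice, glob pattern, and paths as assumptions) is the ReportGenerator .NET global tool, which can merge several Cobertura files into one:

```yaml
# Sketch: merge multiple Cobertura files into a single report, then publish it.
# The glob pattern and output paths are assumptions; adjust them to your layout.
steps:
- script: |
    dotnet tool install --global dotnet-reportgenerator-globaltool
    reportgenerator \
      "-reports:$(Build.SourcesDirectory)/**/coverage.xml" \
      "-targetdir:$(Build.ArtifactStagingDirectory)/coverage" \
      "-reporttypes:Cobertura"
  displayName: 'Merge Cobertura coverage reports'

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: '$(Build.ArtifactStagingDirectory)/coverage/Cobertura.xml'
```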


Getting a blank page in the Allure report tab under the Azure DevOps build.
According to the log, the report has been generated. Can someone please tell me what the issue is here?
This is the information I am using for the Allure task, and the task I have added (screenshots not included).
How to publish the TestNG reports in an Azure DevOps pipeline? Is there any extension for the same?
Azure DevOps does not support the TestNG test results format.
However, TestNG also produces JUnit test results in a separate junitreports folder, so you can publish the TestNG results in JUnit format instead.
You could use the Publish Test Results task to publish the reports.
Select "JUnit" as the test result format and change the test results files pattern to **/junitreports/TEST-*.xml.
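In YAML, the Publish Test Results step described above might look like this (a minimal sketch; the task version is the current one at the time of writing):

```yaml
# Publish TestNG's JUnit-format output with the Publish Test Results task.
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/junitreports/TEST-*.xml'
```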
You could check my previous thread for some more details:
How to integrate TestNG or Extent reports with Azure devops?
But for the extension Allure Test Reports, many users have reported the same issue on the Q&A page for this task:
https://marketplace.visualstudio.com/items?itemName=Molecula.allure-test-reports&ssr=false#qna
This issue is more likely related to the extension itself, so you could check that Q&A page for feedback.
Referring to the documentation for Allure Test Reports:
To enable the "Open Allure Report" option, you will need to add an additional build step to publish the report somewhere. For instance, you can create a simple Azure Web App and upload the reports there using a build task extension like FTP Upload. Your website should support HTTPS.
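As a rough sketch of that suggestion, the built-in FTP Upload task could push the generated report to the web app. The server URL, credentials, and directories below are placeholders, not from the original answer:

```yaml
# Upload the generated Allure report to a web host over FTPS.
# All values below are hypothetical placeholders.
- task: FtpUpload@2
  inputs:
    credentialsOption: 'inputs'
    serverUrl: 'ftps://my-allure-host.example.com'
    username: '$(ftpUser)'
    password: '$(ftpPassword)'
    rootDirectory: '$(Build.SourcesDirectory)/allure-report'
    remoteDirectory: '/site/wwwroot/allure'
```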

Is it possible to forward an azure artifact from one pipeline to another?

I'm working on a project where I want an intermediate pipeline to consume an artifact from another pipeline and later forward it to a third pipeline, which then starts running. I have not managed to find a solution online for this, and I'm starting to think that this functionality is not (as of today) supported in Azure DevOps. Can someone confirm whether this is possible with pipelines? If not, how should one approach this scenario? I'm relatively new to Azure DevOps, so it's possible that I have missed or misunderstood some information online.
I know that a pipeline can download an artifact which was published by another pipeline. And I know about pipeline triggers. Currently I have managed to consume artifacts in the intermediate pipeline, but now I have to find a way to send those artifacts to a specific pipeline, from the intermediate pipeline.
I appreciate all the help I can get.
but now I have to find a way to send those artifacts to a specific pipeline, from the intermediate pipeline.
You can publish a build artifact from the intermediate pipeline with the Publish Build Artifacts or Publish Pipeline Artifacts task, and then use the Download Build Artifacts or Download Pipeline Artifacts task in the specific pipeline to download the artifact published by the intermediate pipeline.
You could check the document Publish and download artifacts in Azure Pipelines for some more details.
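A minimal sketch of the two halves, assuming the pipeline-artifact tasks and using placeholder names for the project, pipeline id, and artifact:

```yaml
# In the intermediate pipeline: publish the artifact it received.
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Pipeline.Workspace)/drop'   # hypothetical path
    artifact: 'forwarded-artifact'

# In the specific (downstream) pipeline: download that artifact.
- task: DownloadPipelineArtifact@2
  inputs:
    source: 'specific'
    project: 'MyProject'    # hypothetical project name
    pipeline: 42            # hypothetical definition id of the intermediate pipeline
    artifact: 'forwarded-artifact'
    path: '$(Pipeline.Workspace)/incoming'
```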
Update:
Are you aware if this functionality is planned for Azure DevOps? If we put it this way: is it possible for my intermediate pipeline to, depending on some variables, trigger a specific pipeline (i.e. it can trigger two different pipelines, but which one to trigger depends on some variables)?
The answer is yes. There is an extension, Trigger Build Task, which can be used to trigger a new build, so build chaining is possible.
You just need to add the variable values as a condition for this task; then you can trigger a specific pipeline depending on those variables.
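A conditional pair of steps using that extension might look roughly like this. The task name and input are taken from the extension's marketplace listing and should be verified against the version you install; the variable and pipeline names are placeholders:

```yaml
# Trigger one of two downstream pipelines depending on a variable.
# Task/input names come from the Trigger Build Task extension; verify
# them against the installed version. Names are hypothetical.
- task: TriggerBuild@4
  condition: eq(variables['deployTarget'], 'staging')
  inputs:
    buildDefinition: 'Staging-Pipeline'

- task: TriggerBuild@4
  condition: eq(variables['deployTarget'], 'production')
  inputs:
    buildDefinition: 'Production-Pipeline'
```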
Where this specific pipeline is in another project: is it also possible for this pipeline to access the variables which I created in my intermediate pipeline?
I think I understand what you want to achieve: you want to create some variables in the intermediate pipeline and then use those variables in the specific pipeline.
The answer is yes, but there is no direct way to access the variables created in the intermediate pipeline. The solution is to set default values for those variables in the specific pipeline, and then use the REST API Definitions - Update to update them from the intermediate pipeline.
Please check my previous thread for detailed REST API scripts.
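As a hedged sketch of that REST call, a PowerShell step in the intermediate pipeline could fetch the target definition, change a variable's value, and PUT the definition back. The organization, project, definition id, and variable names are all placeholders:

```yaml
# Update a variable on the target pipeline's definition via the
# Definitions - Update REST API. All identifiers below are hypothetical.
- powershell: |
    $url = "https://dev.azure.com/{organization}/{project}/_apis/build/definitions/$(targetDefinitionId)?api-version=6.0"
    $headers = @{ Authorization = "Bearer $(System.AccessToken)" }
    # Fetch the current definition, change the variable, and send it back.
    $definition = Invoke-RestMethod -Uri $url -Headers $headers -Method Get
    $definition.variables.myVariable.value = "$(valueToForward)"
    $body = $definition | ConvertTo-Json -Depth 100
    Invoke-RestMethod -Uri $url -Headers $headers -Method Put `
      -Body $body -ContentType "application/json"
  displayName: 'Update variable on the specific pipeline'
```

Note that the build service identity needs permission to edit the target definition for this to work.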
As far as I know, the solution would be to add Azure Artifacts to store the output from Pipeline 1, and then download the artifact in Pipeline 2.
https://learn.microsoft.com/en-us/azure/devops/artifacts/overview?view=azure-devops

How to Run Karate API tests on Azure pipelines

New to Karate and Azure. I just created a few API tests using Karate, and it was easy and simple. I want to take this further by adding it to an Azure pipeline.
I found a few links from Microsoft that just point to adding a pom.xml and Maven. I also found a Jenkins integration, but none for Azure.
Unknowns / how-to:
What files need to be moved to the pipeline (e.g. jar, war, xml)?
How do I create them? I use IntelliJ.
Are any step-by-step tutorials available? Any help is appreciated.
To run Karate tests in an Azure DevOps pipeline, you can follow the general steps below for building/testing a Java project.
1. First, create your Karate test project with Maven. Add the related dependencies and plugins in the pom.xml. See an example here.
2. Push your local source code (e.g. .feature/.java/pom.xml files) to GitHub or an Azure DevOps Git repository. There is no need to push .jar dependencies, because the dependencies can be downloaded by the Maven task in the pipeline.
3. Create an Azure pipeline. Follow this example to create a YAML pipeline. If you want to create a classic UI pipeline, follow the example here.
4. Add a Maven task to your pipeline to run the Karate tests. See the YAML example below.
steps:
- task: Maven@3
  displayName: 'Maven Test'
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'test'
    publishJUnitResults: false
If you use Microsoft-hosted cloud agents to run your pipeline, you need to make sure the API tested by Karate can be accessed from those agents (i.e. the API is publicly accessible).
If the API server is hosted locally, you need to create a self-hosted agent and run your pipeline on that self-hosted agent.
#Devanathan
A better way is to publish artifacts for both:
a) your Cucumber report
b) your Karate HTML report
Anyone can then download both artifacts (i.e. the reports from the pipeline) and view them.
I tried to publish cucumber-html-reports (this is the exact name of the folder that Karate generates) and was successful.
I intentionally kept the report output directory in src/test/java for some reason [by using .reportDir("src/test/java/reports") in the testParallel() method]; you can try the same by keeping the reports in the 'target' folder instead.
I was not able to publish the Karate HTML report, as somehow it is not being picked up in Azure.
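Publishing the report folder as a build artifact, as described above, could be sketched like this; the paths assume the default Maven target directory and the folder name mentioned in the answer:

```yaml
# Publish the Cucumber HTML report folder Karate generates as a
# downloadable build artifact. The artifact name is a placeholder.
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: 'target/cucumber-html-reports'
    artifactName: 'karate-report'
```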

Generic selenium testing in Azure DevOps pipelines

I am currently working on generic Selenium tests to add to all my pipelines.
Because all my projects are similar (webshops) but have different solutions, I created an independent project to execute unit tests for all these webshops. I am wondering if it is possible to execute the tests in this independent project from an Azure DevOps pipeline.
What are my possibilities?
Thank you,
Thomas

Continuous integration and Continuous deployment in Azure Data factory

I want to do continuous integration and deployment in Azure Data factory. I'm not able to find any specific document explaining this.
How can I do it or where can I read about it?
To build your project, you can use MSBuild, just like it's done in Visual Studio. It will validate syntax, check references between JSON configurations, and check all dependencies. If you are using Visual Studio Team Services as your CI server, you can use the Visual Studio Build step in the build configuration to do this. However, it requires installing the ADF tools for Visual Studio on the build agent machine.
To deploy, you can try:
PowerShell. For example, you can use Set-AzureRmDataFactoryV2Dataset to deploy datasets. There are similar commands for all other configuration types, and for version 1 of Azure Data Factory as well.
If you are using VSTS, you can try this 3rd-party extension. It allows you to deploy JSON configurations and start/pause pipelines. I'm not sure if it works with ADF v2.
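For the PowerShell option above, a deployment step might look roughly like this. The service connection, resource group, factory, and dataset names are placeholders, and the AzureRM cmdlets shown have since been superseded by the Az module:

```yaml
# Deploy a dataset JSON definition to an ADF v2 instance with AzureRM
# cmdlets. All names below are hypothetical.
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'my-service-connection'
    ScriptType: 'InlineScript'
    Inline: |
      Set-AzureRmDataFactoryV2Dataset -ResourceGroupName "my-rg" `
        -DataFactoryName "my-data-factory" `
        -Name "MyDataset" `
        -DefinitionFile "datasets/MyDataset.json" -Force
    azurePowerShellVersion: 'LatestVersion'
```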
You can use the VSTS Git integration with the ADF v2 UX to do continuous deployment and continuous integration. The VSTS Git integration allows you to choose a feature/development branch, or create a new one, in your VSTS Git repo. You can work in your feature/development branch and create a PR in VSTS Git to merge your changes to the master branch. You can then publish to your data factory using the ADF v2 UX. Please try this and let us know if it doesn't work for you or you face any issues.
