New to Karate and Azure. I just created a few API tests using Karate, which was quick and simple. Now I want to take it further by adding them to an Azure pipeline.
I found a few links from Microsoft that just point to adding a pom.xml and Maven. I also found Jenkins integration guides, but none for Azure.
Unknown / how to:
What files need to be moved to the pipeline, e.g. jar, war, xml, etc.?
How to create them? I use IntelliJ.
Are any step-by-step tutorials available? Any help appreciated.
To run Karate tests in an Azure DevOps pipeline, you can follow the general steps below for building and testing a Java project.
1. Create your Karate test project with Maven. Add the related dependencies and plugins in the pom.xml. See the example here.
2. Push your local source code (e.g. .feature/.java/pom.xml files) to GitHub or an Azure DevOps Git repository. There is no need to push .jar dependencies, since they can be downloaded by the Maven task in the pipeline.
3. Create an Azure pipeline. Follow this example to create a YAML pipeline; if you want to create a classic UI pipeline, follow the example here.
4. Add a Maven task in your pipeline to run the Karate tests. See the YAML example below.
steps:
- task: Maven@3
  displayName: 'Maven Test'
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'test'
    publishJUnitResults: false
If you use Microsoft-hosted agents to run your pipeline, you need to make sure the API tested by Karate can be reached from those agents (i.e. the API is publicly accessible).
If the API server is hosted locally, you need to create a self-hosted agent and run your Azure pipeline on it.
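As a rough sketch (the pool name below is only a placeholder for whatever you register your local agent under), switching the earlier YAML to a self-hosted agent is mostly a matter of the pool section:

pool:
  name: 'MySelfHostedPool'   # assumption: the agent pool containing your self-hosted agent

steps:
- task: Maven@3
  displayName: 'Maven Test'
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'test'
    publishJUnitResults: false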
@Devanathan
A better way is to publish artifacts for both:
a) your Cucumber report
b) your Karate HTML report.
Anyone can then download both artifacts (i.e. the reports) from the pipeline and view them.
I tried publishing cucumber-html-reports (this is the exact name of the folder that Karate generates) and was successful.
I intentionally kept the report output directory in src/test/java for a reason [by using .reportDir("src/test/java/reports") in the testParallel() method]; you can try the same while keeping the reports in the 'target' folder.
I was not able to publish the Karate HTML report, as somehow it is not being picked up in Azure.
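If the build runs as a YAML pipeline, a minimal sketch of publishing the Karate/Cucumber report folder could look like the following; the path and artifact name are assumptions and depend on where your Karate runner writes the reports:

steps:
- task: PublishBuildArtifacts@1
  displayName: 'Publish Karate / Cucumber reports'
  condition: always()   # keep the reports even when tests fail
  inputs:
    # assumption: the reporting step writes the HTML report to target/cucumber-html-reports
    PathtoPublish: 'target/cucumber-html-reports'
    ArtifactName: 'karate-reports'

The published folder then shows up as a downloadable artifact on the pipeline run.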
Related
I am currently using Jenkins for my pipelines, along with shared libraries written in Groovy. Now I am planning to move to Azure DevOps and use Azure Pipelines. Is there any way to use the same Groovy pipelines and shared libraries in Azure Pipelines, or do I need to convert them all from Groovy to YAML? And is there an automated way to convert them, or does everything have to be converted manually?
If you want to move the CI/CD pipelines for your project to Azure DevOps, you need to set up the pipelines on Azure DevOps following the syntax supported by Azure Pipelines. If you want to set up YAML pipelines, you can refer to the documentation on the "YAML schema".
For the libraries required in your project:
If the libraries are built and maintained by yourself, you can build and publish them to a Maven feed on Azure Artifacts. For more details, see the documentation on "Maven packages".
If the libraries are shared by others, you can publish the library files to a universal feed. For more details, see the documentation on "Universal packages".
After the above, the CI/CD pipelines on Azure DevOps can restore the packages from the Artifacts feed into your project.
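For example, if your shared library is published to a Maven feed on Azure Artifacts, restoring it in a YAML pipeline could be sketched roughly as below; the feed name is a placeholder:

steps:
- task: MavenAuthenticate@0
  displayName: 'Authenticate to the Azure Artifacts feed'
  inputs:
    artifactsFeeds: 'my-shared-libraries'   # assumption: the name of your Maven feed
- task: Maven@3
  displayName: 'Build with restored dependencies'
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'

Note that your pom.xml (or settings.xml) still needs the feed's repository URL so Maven knows where to resolve the packages from.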
I've been trying to find a way to run integration tests on a remote server as part of an Azure pipeline. In my situation the pipeline runs in Azure and deploys to a local (on-premises) server. Is there a way to also deploy the integration tests to that same server, run them there, and report the results back to Azure in the same process?
You can use a self-hosted agent to run your pipeline. Since Microsoft-hosted agents cannot communicate with your local DB, you can set up a self-hosted agent on your local machine; the local DB is then accessible to that agent.
To run integration tests in your release pipeline, include your test projects or test assembly DLL files in the artifacts published by your build pipeline, so that the integration test projects are accessible to the test tasks in the release pipeline.
To include the test files in the artifacts, you can add a second Publish Build Artifacts task to your build pipeline and set its "Path to publish" to the location of your test files.
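In a YAML build, that extra publish step might look roughly like this; the path and artifact name are assumptions for illustration:

steps:
- task: PublishBuildArtifacts@1
  displayName: 'Publish integration test binaries'
  inputs:
    # assumption: the test project output lands in this folder after the build
    PathtoPublish: '$(Build.SourcesDirectory)/tests/IntegrationTests/bin/Release'
    ArtifactName: 'integration-tests'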
Run the tests in the release pipeline by adding the VsTest task or another test task. The release pipeline downloads your artifacts to the folder $(System.DefaultWorkingDirectory).
The Visual Studio Test and .NET Core CLI tasks automatically publish test results to the pipeline, while tasks such as Ant, Maven, Gulp, Grunt and Xcode provide publishing results as an option within the task. Alternatively, you can use the standalone Publish Test Results task.
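As a hedged sketch of the Publish Test Results task (the results path depends on your test framework and runner):

steps:
- task: PublishTestResults@2
  displayName: 'Publish integration test results'
  condition: succeededOrFailed()   # publish results even if some tests failed
  inputs:
    testResultsFormat: 'JUnit'            # or 'VSTest' / 'NUnit' / 'XUnit', depending on the runner
    testResultsFiles: '**/TEST-*.xml'     # assumption: Surefire-style result files
    searchFolder: '$(System.DefaultWorkingDirectory)'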
Here are some articles you can refer to:
Integration tests in ASP.NET Core
Running UAT and Integration Tests During a VSTS Build
Run automated tests from test plans
Good question. This comes up if your integration infrastructure is behind a corporate firewall, for example.
One solution is to use a self-hosted agent on that very integration infrastructure.
Another straightforward approach is to scp your integration tests to the integration infrastructure, run them there over ssh, and scp the test results back. There are pipeline tasks for both scp and ssh.
Note that the direction of communication is reversed between these alternatives: with a self-hosted agent the infrastructure calls out to Azure DevOps, whereas with scp/ssh the pipeline reaches into the infrastructure. Your corporate security team may prefer one over the other.
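As a rough sketch of the scp/ssh variant (the SSH service connection name, paths and command are placeholders):

steps:
# copy the packaged integration tests to the target machine
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'integration-server'   # assumption: an SSH service connection you configured
    sourceFolder: '$(Build.ArtifactStagingDirectory)/integration-tests'
    targetFolder: '/opt/integration-tests'
# run them remotely; a non-zero exit code fails the step
- task: SSH@0
  inputs:
    sshEndpoint: 'integration-server'
    runOptions: 'commands'
    commands: 'cd /opt/integration-tests && ./run-tests.sh'

Copying the result files back to the agent can then be done with a plain scp call in a script step, after which they can be published with the Publish Test Results task.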
I have a Java application and am trying to use Azure DevOps to build and deploy it. I am able to build and publish the artifact in the build pipeline. In the release pipeline I have stages (dev/train/prod); in each stage I have a Maven task to detokenize the build for that environment, which works, but I also want to publish the result as an artifact, similar to the build pipeline. Is there any task to do that, or any alternative approach?
Can we publish artifacts in a release pipeline in Azure DevOps?
Sorry for any inconvenience.
This behavior is by design, and Microsoft has replied that they have no plans to support uploading folders/artifacts from a release in the near future.
When you check the documentation for the Publish Pipeline Artifacts task, it states:
Use this task in a pipeline to publish artifacts for the Azure Pipeline (note that publishing is NOT supported in release pipelines. It is supported in multi stage pipelines, build pipelines, and YAML pipelines).
And if you look directly at the code that is executed, you can see the Publish Pipeline Artifact task works only for build pipelines.
You can check this ticket on GitHub for more details; many community members are waiting for the ability to publish artifacts from a release pipeline.
So I have added a request for this feature on your behalf on the UserVoice site, which is the main forum for product suggestions:
https://developercommunity.visualstudio.com/idea/823829/support-publish-artifact-from-release-pipeline.html
You can vote and comment on this feedback. When enough community members vote and comment, the product team will take the feedback seriously.
Hope this helps.
I was facing the same problem: I wanted to upload artifacts in a release pipeline and download them again in a later agent phase.
Based on some answers from a related SO post, I created an extension that offers the ability to:
Upload a file or a folder to the release logs
Automatically download an artifact from the logs that was previously uploaded
The upload task makes use of the built-in logging command to add files to the release logs. The download task then queries the Azure DevOps REST API to download all logs collected so far, finds the specified artifact, and copies it to a specific place.
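The underlying mechanism on the upload side is the ##vso[task.uploadfile] logging command, which attaches a file to the current task's log. A minimal, hand-rolled sketch from a script step (the zip path is a placeholder) would be roughly:

steps:
- script: |
    # assumption: an earlier step zipped the files to keep into this path
    echo "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)/release-artifacts.zip"
  displayName: 'Attach file to the release logs'

The same command works from a Bash or PowerShell task in a classic release stage.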
If anyone is interested, it can be found on the Marketplace
I have built an automation framework using Java, Selenium WebDriver, Maven and TestNG. Currently I am using Jenkins for the pipeline and CI.
Now a new requirement has been assigned to me: use Azure DevOps as the CI tool and execute all tests from there instead of Jenkins.
After some research, I am getting the following:
Upload the code to GitHub or another Azure-supported repo and create a pipeline.
Write your Java code using Visual Studio Code, and then it will be far easier to execute from Azure DevOps.
Is there any better way to do this?
You need to follow the steps below. The main effort is integrating your tools if they are not already available in the Azure DevOps portal:
1. I am not sure which code repository you are using; if it is not one supported by Azure DevOps, you first need to integrate it with the Azure DevOps portal.
2. Create an agent pool in Azure DevOps with the same configuration as your Jenkins agent.
3. Create a build pipeline in Azure DevOps. It will ask for your repository name; provide the same one.
4. While creating the pipeline, it will ask whether to create an Azure Pipelines YAML file. Say "Yes" and it will create a sample YAML file in the code repository.
5. Open the YAML file, set your agent pool name where it is referenced, and under the "steps" section list everything needed to run your test cases, the same things you would put in a Jenkins pipeline under stages --> steps as shell ''' ''' blocks (a rough sketch follows the note below).
6. Save the YAML and run it; you are done.
NOTE: The main thing is the configuration of the agent pool; make sure it has all the required software tools (except the Jenkins agent jar :)).
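As a rough sketch of what the edited YAML might end up looking like (the pool name and commands are placeholders; adjust them to your project):

trigger:
- master

pool:
  name: 'my-selenium-agents'   # assumption: the agent pool you created, with JDK, Maven and browsers installed

steps:
# roughly the equivalent of a sh ''' ... ''' step inside a Jenkins stage
- script: mvn clean test
  displayName: 'Run TestNG suite'
- task: PublishTestResults@2
  condition: succeededOrFailed()
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/surefire-reports/TEST-*.xml'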
I want to do continuous integration and deployment for Azure Data Factory. I'm not able to find any specific documentation explaining this.
How can I do it, or where can I read about it?
To build your project, you can use MSBuild, just like it's done in Visual Studio. It will validate syntax, check references between the JSON configurations and check all dependencies. If you are using Visual Studio Team Services as your CI server, you can use the Visual Studio Build step in the build configuration to do it. However, it requires the ADF tools for Visual Studio to be installed on the build agent machine.
To deploy, you can try:
PowerShell. For example, you can use Set-AzureRmDataFactoryV2Dataset to deploy datasets. There are similar cmdlets for all other configuration types, and for version 1 of Azure Data Factory as well (a sketch of such a deployment step follows this list).
If you are using VSTS, you can try this 3rd party extension. It allows you to deploy JSON configurations and start/pause pipelines. I'm not sure if it works with ADF v2.
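For the PowerShell option above, a hedged sketch of a deployment step in a VSTS/Azure DevOps pipeline could look like the following; the service connection, resource names and file path are placeholders, and it assumes the AzureRM.DataFactoryV2 module is available to the task:

steps:
- task: AzurePowerShell@3
  displayName: 'Deploy ADF v2 dataset'
  inputs:
    azureSubscription: 'my-azure-service-connection'   # assumption: an ARM service connection
    ScriptType: 'InlineScript'
    Inline: |
      Set-AzureRmDataFactoryV2Dataset -ResourceGroupName 'my-rg' -DataFactoryName 'my-data-factory' -Name 'MyDataset' -DefinitionFile '$(System.DefaultWorkingDirectory)/datasets/MyDataset.json' -Force
    azurePowerShellVersion: 'LatestVersion'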
You can use the VSTS Git integration with the ADF v2 UX to do continuous deployment and continuous integration. The VSTS Git integration allows you to choose a feature/development branch or create a new one in your VSTS Git repo. You work in your feature/development branch, create a PR in VSTS Git to merge your changes into the master branch, and then publish to your data factory using the ADF v2 UX. Please try this and let us know if it doesn't work for you or you face any issues.