I've been trying to find a way to run integration tests on a remote server during an Azure pipeline process. In my situation we have the pipeline running in Azure and deploying to a local server. I am wondering if there is a way to also deploy integration tests to the same server and run them and report back to Azure in the same process?
You can use a self-hosted agent to run your pipeline. Since Microsoft-hosted agents cannot communicate with your local server, you can set up a self-hosted agent on that machine; your local database is then accessible to the agent.
To run integration tests in your release pipeline, include your test projects or test assembly DLL files in the artifacts published by your build pipeline, so that the integration test projects are accessible to the test tasks in the release pipeline.
To include your test files in the artifacts, you can add a second Publish Build Artifacts task to your build pipeline and set its "Path to publish" to the location of your test files.
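A rough YAML sketch of that second publish step (the path and artifact name below are placeholders for your own layout):

```yaml
# Hypothetical second publish step in the build pipeline:
# publishes the built integration-test binaries as their own artifact.
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.SourcesDirectory)/tests/IntegrationTests/bin/$(BuildConfiguration)'
    ArtifactName: 'integration-tests'
    publishLocation: 'Container'
```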
Run the tests in the release pipeline by adding the VsTest task or another test task. The release pipeline downloads your artifacts to the folder $(System.DefaultWorkingDirectory).
The Visual Studio Test and .NET Core CLI tasks automatically publish test results to the pipeline, while tasks such as Ant, Maven, Gulp, Grunt, and Xcode offer result publishing as an option within the task. You can also use the standalone Publish Test Results task.
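As a hedged YAML sketch of the test stage, assuming the test artifact has already been downloaded to $(System.DefaultWorkingDirectory) and using the standard VsTest and Publish Test Results tasks (the file patterns are assumptions about your assembly names):

```yaml
# Run the integration tests that were published as build artifacts.
- task: VSTest@2
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\*IntegrationTests*.dll
      !**\*TestAdapter.dll
      !**\obj\**
    searchFolder: '$(System.DefaultWorkingDirectory)'

# Only needed for tools that do not publish results to the pipeline themselves.
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'VSTest'
    testResultsFiles: '**/*.trx'
```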
Here are some articles you can refer to:
Integration tests in ASP.NET Core
Running UAT and Integration Tests During a VSTS Build
Run automated tests from test plans
Good question. This comes up if your integration infrastructure is behind a corporate firewall, for example.
One solution is to use a self-hosted agent on that very integration infrastructure.
Another straightforward approach is to scp your integration tests to the integration infrastructure, run them over ssh, and scp the test results back. There are pipeline tasks for both scp and ssh.
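A minimal sketch of that second approach, assuming an SSH service connection named integration-server and placeholder paths and commands (the copy back of the results is done with a plain scp call, since the built-in task only copies toward the remote machine):

```yaml
# Copy the test binaries to the integration machine.
- task: CopyFilesOverSSH@0
  inputs:
    sshEndpoint: 'integration-server'    # assumed service connection name
    sourceFolder: '$(Build.ArtifactStagingDirectory)/tests'
    targetFolder: '/opt/ci/integration-tests'

# Run the tests remotely (commands are placeholders for your own runner).
- task: SSH@0
  inputs:
    sshEndpoint: 'integration-server'
    runOptions: 'commands'
    commands: 'cd /opt/ci/integration-tests && dotnet vstest IntegrationTests.dll --logger:trx --ResultsDirectory:results'

# Copy the results back and publish them (plain scp; key handling omitted).
- script: scp -r user@integration-server:/opt/ci/integration-tests/results $(Build.ArtifactStagingDirectory)/results
  displayName: 'Fetch test results'

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'VSTest'
    testResultsFiles: '$(Build.ArtifactStagingDirectory)/results/**/*.trx'
```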
Note that the direction of communication is reversed between these alternatives: with a self-hosted agent the infrastructure calls out to the pipeline, whereas with scp/ssh the pipeline calls into the infrastructure. Your corporate security team may prefer one over the other.
Is it possible to pull / clone bitbucket repository within Azure Logic Apps?
I am curious whether it is possible to set up some backend tests within Azure Logic Apps, i.e. pull the repo with the tests first and then execute them from a CLI. I see that there is a Bitbucket connector in Logic Apps, but there is no option to pull the repo. Or should I look at some custom connector to run commands such as "git clone"? If yes, which one?
Azure Logic Apps is a cloud platform where you can create and run automated workflows with little to no code. By using the visual designer and selecting from prebuilt operations, you can quickly build a workflow that integrates and manages your apps, data, services, and systems.
From: What is Azure Logic Apps?
The key concepts here are "little to no code" and "prebuilt operations". Building your code with a CLI and running its tests is not something Logic Apps is targeting. It would also make executing Logic Apps a LOT more complex on the Azure end, since it would mean installing any and all frameworks, tools, etc. that are needed for building the code/running the tests.
If you look at Bitbucket actions that are supported, you can kind of make out that they're all API calls.
What you're looking for is available for free with GitHub Actions workflows
A workflow is a configurable automated process that will run one or more jobs. Workflows are defined by a YAML file checked in to your repository and will run when triggered by an event in your repository, or they can be triggered manually, or at a defined schedule.
or Azure Pipelines.
Azure Pipelines automatically builds and tests code projects. It supports all major languages and project types and combines continuous integration, continuous delivery, and continuous testing to build, test, and deliver your code to any destination.
Potentially interesting read: Build Bitbucket Cloud repositories.
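As an illustration only, assuming a .NET test project and a pipeline created against your Bitbucket Cloud repository (project path and branch name are placeholders), a minimal azure-pipelines.yml could look like this:

```yaml
# Minimal sketch: the pipeline is bound to the Bitbucket Cloud repository,
# so checkout handles the pull; no manual "git clone" step is needed.
trigger:
  branches:
    include:
      - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - checkout: self    # pulls the Bitbucket repo the pipeline is bound to
  - script: dotnet test ./tests/Backend.Tests.csproj --logger trx
    displayName: 'Run backend tests'
  - task: PublishTestResults@2
    inputs:
      testResultsFormat: 'VSTest'
      testResultsFiles: '**/*.trx'
```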
I have a simple executable that I have built and unit tested in a build pipeline.
Now I want to install it on some test machines and run some tests on the app before releasing it to production. (Eventually I hope to automate the tests with SpecFlow, but that is the next step.)
So basically I have a helloworld.exe build that I want to install from a pipeline onto a test agent computer.
I think ClickOnce is the optimal option, but I am unsure how to set it up on Azure DevOps. (We use an on-premises server.)
MSBuild has a Publish target to build and publish the ClickOnce application; it will generate the setup.exe you want. See the document Create and build a basic ClickOnce application with MSBuild. In the release pipeline you can use the MSBuild task or the Visual Studio Build task with the Publish target (/t:Publish as an argument).
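A hedged sketch of that step (the project path and configuration below are placeholders for your own solution):

```yaml
# Build and publish the ClickOnce application via the MSBuild Publish target.
- task: MSBuild@1
  inputs:
    solution: 'src/HelloWorld/HelloWorld.csproj'   # assumed project path
    configuration: 'Release'
    msbuildArguments: '/t:Publish'
```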
Then you'll get an app.publish folder containing the files you need.
This folder can be used for your further deployment.
Here is a ticket you can refer to.
In addition, the Azure DevOps Marketplace provides some extensions: ClickOnce Packager and ClickOnceMore DevOps Pipeline Task.
My website is already deployed and is accessible from a server I've already provisioned.
I created an automated Selenium WebDriver test in Visual Studio to test my website. Now I want this to run on a nightly basis automatically; would it be possible to use Azure DevOps for this? I've been checking tutorials on running automated tests in Azure DevOps and they always include deploying the website in Azure DevOps too, which is not applicable for my site.
Yes, it's possible. If your source code is in an Azure DevOps repo, you can add a Visual Studio Build task to generate the test assembly and then use the Visual Studio Test task to run the tests.
Or, if you already have the test assembly at a known path, just specify the search folder and the test file pattern in the task.
You don't need any deploy-related tasks since your website is already deployed. Just make sure the website is running, install a self-hosted agent on the server you've already provisioned, and run the pipeline with that self-hosted agent from the Default agent pool; then you can easily run tests against your website.
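For the nightly part, a scheduled trigger plus the test task is enough. A rough YAML sketch, where the pool name, cron time, and assembly pattern are assumptions:

```yaml
# Run every night at 02:00 UTC against the already-deployed website.
schedules:
  - cron: '0 2 * * *'
    displayName: 'Nightly Selenium run'
    branches:
      include:
        - main
    always: true            # run even if there were no code changes

pool:
  name: 'Default'           # self-hosted agent on the provisioned server

steps:
  - task: VSTest@2
    inputs:
      testSelector: 'testAssemblies'
      testAssemblyVer2: '**\*UiTests*.dll'
      searchFolder: '$(System.DefaultWorkingDirectory)'
```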
We are currently testing this setup, where I'm just running the tests.
This is what my Azure pipeline looks like:
Basically we have a console app running the tests (with NUnit) on the server.
We deploy the app that contains all the tests. Pipeline variables for the URL, the login user, etc. populate our .runsettings file.
We then create a virtual VS session and run the .dll containing the Selenium tests.
Export the .trx report and mail it out.
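To illustrate how pipeline variables can feed the runsettings without editing the file, here is a minimal sketch using the Visual Studio Test task's override option; the variable and parameter names are hypothetical:

```yaml
# Run the NUnit/Selenium assembly, overriding TestRunParameters from the
# runsettings with pipeline variables; the task produces the .trx report
# that can then be mailed out.
- task: VSTest@2
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: '**\SeleniumTests.dll'
    runSettingsFile: 'SeleniumTests/test.runsettings'
    overrideTestrunParameters: '-siteUrl $(SiteUrl) -loginUser $(LoginUser) -loginPassword $(LoginPassword)'
```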
Hope this helps a bit
We have already configured an Azure build pipeline for the project and need to configure continuous automated unit and integration (with database) tests.
Project specification
Built with .NET Core.
Unit and integration tests.
To run the integration tests, a database is required. A database project (SSDT) is part of the repository, so the .dacpac can be used for deployment.
What we are trying to achieve
We expect a report to be generated on each build with the following data:
Total Unit tests with pass/fail status.
Total integration tests with pass/fail status.
We are using the Azure build service, where we are able to run the unit tests successfully, but we cannot run the integration tests because they need a database.
The integration database must be reseeded for each build.
We need an approach to update the database whenever schema changes are made to the database project in the repository.
Use Service Containers
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/service-containers?view=azure-devops&tabs=yaml
Add one to your build pipeline and use the SQL base image of your choice.
Build your SSDT project
Create a new DB on the container-hosted SQL Server
Deploy the dacpac to the container-hosted SQL Server
Run your integration tests
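Sketching those steps in YAML, with the image tag, password variable, and project paths as placeholders, and assuming an SDK-style database project that can be built with dotnet build and sqlpackage installed as a dotnet tool (a classic SSDT project would instead need MSBuild on a Windows agent):

```yaml
# SQL Server runs as a service container alongside the build job.
resources:
  containers:
    - container: sql
      image: mcr.microsoft.com/mssql/server:2019-latest
      env:
        ACCEPT_EULA: 'Y'
        SA_PASSWORD: '$(SqlSaPassword)'
      ports:
        - 1433:1433

pool:
  vmImage: 'ubuntu-latest'

services:
  sql: sql

steps:
  # Build the database project to produce the .dacpac.
  - script: dotnet build Database/Database.sqlproj -c Release
    displayName: 'Build database project'

  # Deploy the .dacpac to the container-hosted SQL Server. Because the
  # container (and database) is created fresh on every run, this also
  # covers the "reseed per build" and schema-change requirements.
  - script: |
      dotnet tool install -g microsoft.sqlpackage
      sqlpackage /Action:Publish \
        /SourceFile:Database/bin/Release/Database.dacpac \
        /TargetServerName:localhost /TargetDatabaseName:IntegrationDb \
        /TargetUser:sa /TargetPassword:$(SqlSaPassword)
    displayName: 'Deploy dacpac'

  # Run unit and integration tests; the connection string points at the container.
  - script: dotnet test --logger trx
    displayName: 'Run tests'
    env:
      ConnectionStrings__IntegrationDb: 'Server=localhost;Database=IntegrationDb;User Id=sa;Password=$(SqlSaPassword);TrustServerCertificate=True'
```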
In my build definition I deploy to multiple Azure cloud services and would like to deploy in parallel; however, the 2015 build definition doesn't allow steps to be run this way.
Is there a way I can have three groups of steps (each with a Visual Studio Build and then an Azure Cloud Service Deployment step) running in parallel?
This will save me huge amounts of time in our CI/CD builds allowing for faster feedback from the builds.
Instead of deploying from a build, deploy using the Release hub.
You can define multiple release environments, then use the "Deployment Conditions" option to set multiple environments to deploy at once. However, you would need one agent per environment so that the agents can run in parallel.
Currently, there are no parallel tasks in the VSTS build and release process. However, there is a UserVoice item for it. Please vote for it!
https://visualstudio.uservoice.com/forums/330519-team-services/suggestions/13481574-add-ability-to-run-build-steps-in-parallel