Is there any possibility to test created Azure Pipelines, whether defined from the UI or from a YAML definition?
I mean that I have some YAML pipelines, or pipelines defined from the UI, and I want to ensure by some tests (unit tests, for example) that each pipeline has defined variables, build, test, and package parts, or whatever else is required.
I also want to verify the pipeline configurations after they are changed, or after new repos/branches are added, if that is required.
Thanks...
Is there any possibility to test created Azure Pipelines, whether defined from the UI or from a YAML definition?
If you are looking for an out-of-the-box feature to achieve this, then sorry to say, no, there isn't one.
But as a workaround, you can use the API to check them.
Client API.
You could write a simple script to get the build definitions with the Client API.
In this script, you first get the whole list of definitions:
// assumes an authenticated VssConnection to your organization; "MyProject" is a placeholder project name
List<BuildDefinitionReference> buildDefinitions = (await connection.GetClient<BuildHttpClient>().GetDefinitionsAsync(project: "MyProject")).ToList();
Then you can apply your customized checks/tests to these definitions in the script. In short, write some test classes/methods. After the script is complete, you can import it into VSTS and use a task to run the test part. Only if these tests succeed will your builds be executed.
This means adding two agent jobs to your pipeline: the first one runs your test script (call it the test agent job), and the second agent job is the one you actually want to check. On the second agent job, set a dependency on the test job and a condition so that it runs only when the test job has succeeded; a minimal sketch follows below.
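A minimal YAML sketch of that layout, assuming the checks live in a PowerShell script called check-definitions.ps1 in the repository (the script name and the echo step are placeholders):

jobs:
- job: TestDefinitions
  steps:
  - powershell: ./check-definitions.ps1     # hypothetical script that fetches and validates the build definitions
    displayName: Validate pipeline definitions

- job: Build
  dependsOn: TestDefinitions
  condition: succeeded('TestDefinitions')   # the real job runs only when the checks passed
  steps:
  - script: echo "restore, build, test and package steps go here"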
Or, if you don't want the builds you are checking to be broken by the test, consider using a build completion trigger instead: set up a separate pipeline to run the test, and configure the pipeline you want to check so that it runs only after the test pipeline finishes.
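In YAML, that build completion trigger looks roughly like this in the pipeline you want to check, assuming the separate test pipeline is named Definition-Checks (the name is a placeholder):

resources:
  pipelines:
  - pipeline: checks              # local alias for the resource
    source: Definition-Checks     # name of the separate test pipeline
    trigger: true                 # run this pipeline whenever Definition-Checks completes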
REST API
You could use the REST API with PowerShell, which is very similar to the description above: use the API to get the build definitions, and then write a PowerShell script to run your checks.
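A rough sketch of such a check as an inline pipeline step, assuming the job's access token is allowed to read build definitions; the actual validation is only a placeholder:

steps:
- powershell: |
    # list all build definitions in the current project via the REST API
    $url = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/definitions?api-version=6.0"
    $defs = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
    if ($defs.count -eq 0) { Write-Error "No build definitions found" }
    # placeholder: replace with your own checks on $defs.value
    $defs.value | ForEach-Object { Write-Host "Found definition: $($_.name)" }
  displayName: Check build definitions
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)   # make the OAuth token available to the script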
I would rather recommend putting the test in a separate pipeline. Then the API only gets the part you want to check, not the test part itself.
Related
I'm working on a solution that has a YAML build pipeline that does the following:
restore
build
test
publish test
publish test coverage
publish source code
I want to implement a branch policy that does the following: whenever a developer creates a pull request to the develop branch, that action triggers a build to ensure that the code the developer is trying to merge into develop builds and passes all tests.
My question is: as a best practice, should I reuse the build pipeline that I already have, or should I create a new pipeline for that specific job?
Hi Rodrigo, you can use the same pipeline as long as you don't have specific requirements that prevent it. You can make use of conditions or stages in the pipeline YAML to make it reusable; a sketch follows after the link.
For more information:
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema
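For example, a minimal sketch where the build and tests always run but the publish step is skipped for pull request validation builds (step contents are illustrative; for Azure Repos the PR build itself is enabled through a build validation branch policy on develop):

trigger:
  branches:
    include: [ develop ]

steps:
- script: dotnet restore && dotnet build && dotnet test
  displayName: Restore, build and test
- script: dotnet publish -c Release
  displayName: Publish
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))   # skip publishing on PR builds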
So to give you a bit of context: we have a service which has been split into two different services, one for the read side and one for the write side operations. The read side is called ProductStore and the write side is called ProductCatalog. The issue we're facing is on the write side: the load tests create 100 products in the write-side resource web app, which are then transferred to the read side for the load test to read x number of times. If a build is launched in ProductCatalog because something new was merged to master, this causes issues in the ProductStore pipeline if the two run concurrently.
The question I want to ask: is there a way, in the ProductStore YAML file, to directly query (via a specific Azure task or an AzurePowerShell script) whether a build is currently running in the ProductCatalog pipeline?
The second part would be to loop/wait until that pipeline has successfully finished before resuming the ProductStore pipeline.
I hope this is clear; I'm not sure how best to ask this question as I'm very new to the DevOps pipelines flow, but it would help massively if there were a good way of checking this sort of thing.
As a workaround, you can set a pipeline completion trigger in the ProductStore pipeline.
To trigger a pipeline upon the completion of another, specify the triggering pipeline as a pipeline resource.
Or configure build completion triggers in the UI: choose Triggers from the settings menu of the YAML pipeline editor.
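In the ProductStore YAML, that looks roughly like this (the pipeline name and branch are taken from the question and may need adjusting):

resources:
  pipelines:
  - pipeline: productCatalog        # local alias for the resource
    source: ProductCatalog          # the pipeline that must finish first
    trigger:
      branches:
        include: [ master ]         # queue ProductStore when ProductCatalog completes on master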
Is it possible to loop over running tasks in the agent in build or release pipelines?
For example, a for-each over a JSON file: we have a list of blocks in a JSON file, and for each block I want to start the agent task again.
It's not possible to use a for-each control to run tasks multiple times in your pipeline.
If you just want to run the tasks multiple times, you can simply add the task multiple times in a Classic UI pipeline, or use a template in a YAML pipeline.
For how to use a template, refer to the answer to this question: Azure DevOps YAML pipeline - how to repeat a task.
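As a sketch of the template approach, here is a steps template that is expanded once per item in a parameter list; the file name run-block.yml, the parameter and the echo step are placeholders:

# run-block.yml - a steps template repeated once per block
parameters:
- name: blocks
  type: object
  default: []

steps:
- ${{ each block in parameters.blocks }}:
  - script: echo "Processing ${{ block }}"    # replace with the task you want to repeat
    displayName: Process ${{ block }}

# azure-pipelines.yml - pass the list of blocks to the template
steps:
- template: run-block.yml
  parameters:
    blocks: [ blockA, blockB, blockC ]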
If you want to use the loop to re-run/retry a failed task/step, this is also not supported at the moment.
There is a related user voice suggestion:
Rerun failed build task/step
https://developercommunity.visualstudio.com/idea/365697/rerun-failed-build-taskstep.html
Multiple people have commented and echoed it. You could monitor the status of the above user voice item.
Is it possible for a pipeline to have multiple triggers in one YAML file that execute different jobs per trigger?
In our pipeline, we pack each project in the solution and push it as a NuGet package to our own Azure DevOps Artifacts feed, and we want to do the packing and pushing depending on the project. I saw that it is possible to specify the branch and path in the trigger, but according to this you can only have one trigger. However, that was only indicated in the question, and the documentation doesn't explicitly state it.
Right now my option is to configure different pipelines with a YAML file per project, but I want to ask here to confirm whether this is possible or not.
I agree with Jessehouwing: you can add multiple triggers, and you can use conditions on tasks, jobs, stages and environments so that they only run in specific cases. See the sketch after the links.
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema#triggers
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?tabs=yaml&view=azure-devops
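As a sketch of combining triggers with conditions, the pipeline below runs a different job for CI pushes than for the scheduled trigger (branch names and steps are illustrative); note that deciding what to pack per changed path, as you eventually needed, still requires a custom script that sets variables:

trigger:
  branches:
    include: [ main, release/* ]

schedules:
- cron: "0 3 * * *"                  # nightly, UTC
  displayName: Nightly packaging
  branches:
    include: [ main ]

jobs:
- job: CI
  condition: in(variables['Build.Reason'], 'IndividualCI', 'BatchedCI')   # push-triggered builds
  steps:
  - script: echo "pack and push the projects touched by this change"

- job: Nightly
  condition: eq(variables['Build.Reason'], 'Schedule')                    # scheduled builds
  steps:
  - script: echo "full pack and push of all projects"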
Thanks for the input. I studied the docs, but it's not possible to achieve what I wanted with just the built-in tasks for Azure DevOps. I had to make a script that does it and assigns true or false values to the conditionals.
The exact answer I was looking for was in this post
I would like to move the existing Azure DevOps pipelines to YAML for the obvious advantages. The problem is that there are many of them, and each one has many jobs.
When I click around in Azure DevOps, the "View YAML" link only appears for one job at a time, so it's going to be a lot of manual work to view the YAML for each pipeline x job and move it to code.
But for each pipeline there seems to be a way to "export" the entire pipeline as JSON. I was wondering if there is a similar way to at least dump an entire pipeline as YAML, if not an entire folder.
If there is an API which exports the same, even better.
Until now, what we support is what you see: use View YAML to copy/paste the definition of each agent job. Another workaround to get the entire definition of one pipeline is to use the API to get the JSON of the build definition, convert it to YAML, tweak the syntax, and then, if needed, update the tasks which are referenced.
First, use the Get Build Definition API to get the entire definition of one pipeline.
Then invoke a JSON-to-YAML converter and copy/paste the JSON of the definition into it.
Copy the YAML into the YAML editor in Azure DevOps. The most important step is then to tweak the syntax.
Replace the refName key values with task names and versions. For this, you can check our tasks' source code, which is open on GitHub; the built-in tasks can be found there (note: see the task.json file of the corresponding task).
Note: this method has the disadvantage that you need to be very familiar with YAML syntax so that you can successfully tweak the content converted from JSON.
This is now done, and there is a blog post about exporting pipelines as YAML on devblogs.
It's worth mentioning that the new system knows how to handle every feature listed here:
Single and multiple jobs
Checkout options
Execution plan parallelism
Timeout and name metadata
Demands
Schedules and other triggers
Pool selection, including jobs which differ from the default
All tasks and all inputs, including optimizing for default inputs
Job and step conditions
Task group unrolling
In fact, there are only two areas which we know aren’t covered.
Variables
Variables defined in YAML "shadow" (hide) variables set in the UI. Therefore, we didn't want to export them into the YAML file in case you need the ability to alter them at runtime. If you have UI variables in your Classic pipeline, we mention them by name in the comments to remind you that you need to configure them in your new YAML pipeline definition.
Timezone translation
Cron schedules in YAML are in UTC, while Classic schedules are in the organization's timezone. Timezone handling has a lot of sharp edges, so we opted not to try to be clever. We export the schedule without doing any translation, so your scheduled builds might be off by a certain number of hours unless you manually modify them. Here again, we make a note in the comments to remind you.
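To illustrate the point: a Classic schedule of 03:00 in a UTC+2 organization corresponds to 01:00 UTC in the YAML (the values here are illustrative):

schedules:
- cron: "0 1 * * *"      # 01:00 UTC, i.e. 03:00 for an organization in UTC+2
  displayName: Nightly build
  branches:
    include: [ main ]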
But there won't be support for release pipelines:
No plans to do so. Classic RM pipelines are different enough in their execution that I can’t make the same strong guarantees about correctness as I can with classic Build. Also, a number of concepts were re-thought between RM and unified YAML pipelines. In some cases, there isn’t a direct translation for an RM feature. A human is required to think about what the pipeline is designed to accomplish and re-implement it using new constructs.
I tried yamlizr https://github.com/f2calv/yamlizr
It works pretty well for exporting Release Pipelines, except that it doesn't export Pre/Post deployment conditions. We use these for approval gates, so hopefully that will be supported in a future release.
But per Microsoft, it sounds like they won't support export to YAML for Release Pipelines:
https://devblogs.microsoft.com/devops/replacing-view-yaml/#comment-2043