Azure DevOps and YAML pipeline creation

How to create an Azure DevOps YAML pipeline. I'm currently trying to create multiple build pipelines for my Angular app in Azure DevOps using the new YAML way. … As far as I can tell from the docs, it is not possible to define multiple pipelines in a single .yml file either. Is this scenario currently not supported in Azure DevOps?

To create a pipeline, the simplified steps are ...
Go to the project you want to create the pipeline in
Go to the 'Pipelines' menu
Click the blue 'New pipeline' button in the top right corner
Follow the wizard that will help you set up your YAML pipeline
You can also read Create your first pipeline
As for multiple pipelines in one .yml file: no, you define one pipeline in one YAML file. But that doesn't mean you cannot have multiple stages in one pipeline.
A stage is a logical boundary in the pipeline. It can be used to mark separation of concerns (for example, Build, QA, and production). Each stage contains one or more jobs. When you define multiple stages in a pipeline, by default, they run one after the other. You can specify the conditions for when a stage runs. When you are thinking about whether you need a stage, ask yourself:
Do separate groups manage different parts of this pipeline? For example, you could have a test manager that manages the jobs that relate to testing and a different manager that manages jobs related to production deployment. In this case, it makes sense to have separate stages for testing and production.
Is there a set of approvals that are connected to a specific job or set of jobs? If so, you can use stages to break your jobs into logical groups that require approvals.
Are there jobs that need to run for a long time? If part of your pipeline has an extended run time, it makes sense to divide it into its own stage.
and
You can organize pipeline jobs into stages. Stages are the major divisions in a pipeline: "build this app", "run these tests", and "deploy to pre-production" are good examples of stages. They are logical boundaries in your pipeline where you can pause the pipeline and perform various checks.
Source for the last snippet and an interesting read: Add stages, dependencies, & conditions.
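To illustrate, a minimal multi-stage pipeline in a single azure-pipelines.yml could look like the sketch below (the stage and job names and the script contents are placeholder assumptions):

    # Stages run one after the other by default; dependsOn makes the order explicit.
    trigger:
    - main

    stages:
    - stage: Build
      jobs:
      - job: BuildApp
        pool:
          vmImage: ubuntu-latest
        steps:
        - script: npm ci && npm run build
          displayName: Build the Angular app

    - stage: Test
      dependsOn: Build
      jobs:
      - job: RunTests
        pool:
          vmImage: ubuntu-latest
        steps:
        - script: npm test
          displayName: Run unit tests

    - stage: Deploy
      dependsOn: Test
      condition: succeeded()
      jobs:
      - job: DeployApp
        pool:
          vmImage: ubuntu-latest
        steps:
        - script: echo "deploy the build output"
          displayName: Deploy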

Related

How to Specifically Schedule Pipelines with Azure DevOps

How to schedule a one-time, non-repeating pipeline run in Azure DevOps. I want to create this pipeline for our UAT environment, but I don't want to run it manually, so I was wondering whether there is a way to specify multiple specific dates on which to run the pipeline.
In short, we can't schedule a non-repeating pipeline in Azure DevOps, because schedules are defined using cron syntax, which is inherently recurring.
Each Azure Pipelines scheduled-trigger cron expression is a space-delimited expression with five fields (minutes, hours, days, months, days of week); there is no year field.
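For reference, a YAML scheduled trigger looks like the sketch below; because there is no year field, the closest you can get to a "specific date" is an expression that matches it once per year (the branch name is an assumption):

    schedules:
    - cron: "0 3 15 6 *"    # minutes hours day month day-of-week: 03:00 UTC on 15 June
      displayName: UAT run on 15 June
      branches:
        include:
        - main
      always: true          # run even if nothing changed since the last run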
If you need to run the pipeline on specific days, as a workaround, schedule it on your end and call the REST API to run your pipeline.
The detailed steps are here: https://blog.geralexgr.com/cloud/trigger-azure-devops-build-pipelines-using-rest-api.
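A minimal sketch of that workaround in Python, queuing a run via the Runs REST API with a Personal Access Token (the organization, project, and pipeline ID are placeholders):

    import requests

    ORG = "my-org"          # placeholder organization
    PROJECT = "my-project"  # placeholder project
    PIPELINE_ID = 42        # placeholder pipeline ID
    PAT = "<personal-access-token>"

    # Queue one run of the pipeline. Schedule this script externally
    # (cron, Task Scheduler, etc.) for each specific date you need.
    url = (f"https://dev.azure.com/{ORG}/{PROJECT}"
           f"/_apis/pipelines/{PIPELINE_ID}/runs?api-version=6.0")
    resp = requests.post(url, json={}, auth=("", PAT))
    resp.raise_for_status()
    print("Queued run:", resp.json()["id"])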

Configure pipeline to trigger multiple pipelines

I have around 30 pipelines (each doing its own build, deploy, and tests), all in the same project.
Instead of having to manually trigger all 30 pipelines each time, I wanted to create a separate pipeline YAML which, when run, triggers all 30 individual pipelines.
Is there a way to achieve this?
I understand from the documentation that there is a concept of pipeline triggers. However, I was not able to work out whether a single YAML can trigger the individual pipelines, and if so, whether each is triggered at the completion of the triggering pipeline or at its start.
The flow I was looking for is this. There are 30 individual pipelines, each containing the complete flow for its services:
    stages:
    - stageA
    - stageB
    - stageC
Now, I was trying to create a pipeline YAML, all_apps.yml, which triggers all 30 individual pipelines at once.
There are several ways to accomplish this; choose the one that suits you.
First, we could set up a build-completion trigger for those 30 pipelines:
Go to the edit page of the triggered YAML pipeline (the deploy pipeline), click the three dots, and choose Triggers.
Go to Triggers --> Build completion, click Add, and select your triggering pipeline (the all_apps.yml pipeline).
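The YAML equivalent of a build-completion trigger is a pipeline resource. Each of the 30 individual pipelines would declare the triggering pipeline as a resource, roughly like this sketch (the source name assumes the triggering pipeline is called all_apps):

    # In each of the 30 individual pipeline YAML files:
    resources:
      pipelines:
      - pipeline: allApps    # local alias for the resource
        source: all_apps     # name of the triggering pipeline in Azure DevOps
        trigger: true        # run this pipeline whenever all_apps completes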
Second, there is an extension, Trigger Azure DevOps Pipeline; we could use its task to trigger those 30 pipelines.
Third, you could do it with either the Runs API or the Build Queue API; both work with Personal Access Tokens. You can also use loops to make the REST API calls cleaner. Check this thread for some more details.
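As a rough sketch of the REST approach, a script could loop over the 30 build definition IDs and queue each one via the Build Queue API (all IDs and names below are placeholders):

    import requests

    ORG = "my-org"
    PROJECT = "my-project"
    PAT = "<personal-access-token>"
    DEFINITION_IDS = [101, 102, 103]  # ...the 30 build definition IDs

    url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds?api-version=6.0"
    for definition_id in DEFINITION_IDS:
        # Queuing returns immediately, so all the pipelines run in parallel.
        resp = requests.post(url, json={"definition": {"id": definition_id}},
                             auth=("", PAT))
        resp.raise_for_status()
        print(f"Queued build {resp.json()['id']} for definition {definition_id}")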

Azure DevOps REST API - Get agent pools for jobs?

I'm using the Azure DevOps REST API to retrieve pipeline runs (aka "builds"). The build response has a bunch of good data, but the pool it reports seems to apply only if the overall pipeline has a top-level pool defined.
For example, I have a pipeline that runs several parallel jobs, each one in a different self-hosted agent pool. But when I retrieve a build of this pipeline using the REST API, the only data available is for the pipeline's pool, which is the normal Hosted Ubuntu 1604 response you get for Microsoft-hosted builds - there's no mention of any of the self-hosted agent pools that did all the work.
I've tried drilling down into different sections (including the stage and task queries). The task level will eventually show the name of the agent used, but it's just a string, so it's not easy to infer the agent pool used unless you happen to name your agents in a specific way.
Is there any way to drill down into the individual "jobs" that ran as part of a pipeline and see what agent pools they were run on, using the REST API?
The build response contains a timeline link (under the _links field).
You can form it like this: https://dev.azure.com/<org>/<project>/_apis/build/builds/<buildId>/Timeline
The JSON response has useful info for each job/task record, including the queueId.
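A small Python sketch of reading the timeline; the queueId and workerName fields are read defensively, since not every record type carries them (the org, project, and build ID are placeholders):

    import requests

    ORG, PROJECT, BUILD_ID = "my-org", "my-project", 1234  # placeholders
    PAT = "<personal-access-token>"

    url = (f"https://dev.azure.com/{ORG}/{PROJECT}"
           f"/_apis/build/builds/{BUILD_ID}/Timeline?api-version=6.0")
    timeline = requests.get(url, auth=("", PAT)).json()

    # Each record represents a stage, phase, job, or task in the run.
    for record in timeline["records"]:
        if record.get("type") == "Job":
            print(record.get("name"),
                  "agent:", record.get("workerName"),
                  "queueId:", record.get("queueId"))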
At present, the REST API cannot drill down into the individual jobs that ran as part of a pipeline and show which agent pools they were run on; the API above only reports the pipeline's default settings.
As a workaround, we can use the Builds - Get Build Log API and find the agent's name in the log output.
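For example, a hypothetical Python sketch that lists a build's logs and scans them for the agent-name line (the exact log wording may vary by agent version):

    import requests

    ORG, PROJECT, BUILD_ID = "my-org", "my-project", 1234  # placeholders
    PAT = "<personal-access-token>"
    BASE = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds/{BUILD_ID}"

    # List all logs for the build, then fetch each one as plain text.
    logs = requests.get(f"{BASE}/logs?api-version=6.0", auth=("", PAT)).json()
    for log in logs["value"]:
        text = requests.get(f"{BASE}/logs/{log['id']}?api-version=6.0",
                            auth=("", PAT)).text
        for line in text.splitlines():
            if "Agent name" in line:  # agents typically log their own name early on
                print(f"log {log['id']}: {line.strip()}")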

While deploying to specific machines using Azure DevOps, can I target only pilot-run machines first, and the other machines after that?

I currently have the below scenario to crack:
I have a deployment group (XYZ) in Azure DevOps. This group holds 20 targets. While deploying to production, I want the deployment to happen on only 2-3 machines first. After successful checks, or after 2-3 days, I can trigger it on the remaining machines.
There is no fixed set of machines that will always be my pilot machines; it may differ every time.
As of now I have the approach below, but I want to know the best practice for fulfilling the above requirement:
For now, before deployment I will identify the machines where the pilot run will happen and add an extra tag to them in the deployment group. The same tag will also be added to the task in the pipeline.
With this approach my deployment team will have to modify the pipeline and the deployment group every time; is there a better way to do this?
As indicated in the ticket you mentioned in the comment, the current workarounds are to filter by custom conditions or by tags.
Filter by custom condition:
Create a pipeline variable that contains the server names as its value.
Add a custom condition to the step in question:
and(succeeded(),contains(variables['IncludedServers'],variables['Agent.MachineName']))
Modify the variable as needed when creating the release.
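Put together, a sketch of how that can look in the pipeline YAML (the variable value and the step are placeholder assumptions):

    variables:
      IncludedServers: 'PILOT-VM-01,PILOT-VM-02'  # set per release

    steps:
    - script: echo "running pilot deployment on $(Agent.MachineName)"
      displayName: Pilot deployment step
      # Runs only on agents whose machine name appears in IncludedServers.
      condition: and(succeeded(), contains(variables['IncludedServers'], variables['Agent.MachineName']))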
Filter by tags:
We can use machine tags to limit deployment to specific sets of target servers: the tags you assign allow you to restrict deployment to particular servers when the deployment group is used in a Deployment group job.
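If you use YAML pipelines with environments instead of classic deployment groups, the equivalent tag filter looks roughly like this sketch (the environment name and tag are assumptions):

    jobs:
    - deployment: DeployPilot
      environment:
        name: Production          # environment with VM resources registered
        resourceType: VirtualMachine
        tags: pilot               # only VMs tagged 'pilot' are targeted
      strategy:
        runOnce:
          deploy:
            steps:
            - script: echo "deploying to pilot machine"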

Azure Data Factory V2 - Multiple Instances of the Same Pipeline Triggered in Parallel

We've created a re-usable Azure Data Factory V2 pipeline. We're thinking of invoking this pipeline from different master pipelines, and these master pipelines may run in parallel. My concern is: will the re-usable pipeline run as multiple instances, or will the runs deadlock?
Do I need to make any settings to run the re-usable pipeline with multiple instances (in case multiple instantiation is not supported by default)?
Thanks.
Do I need to make any settings to run the re-usable pipeline with multiple instances (in case multiple instantiation is not supported by default)?
As far as I know, there are no specific settings you need to configure. However, per azure-data-factory-limits, Azure Data Factory V2 has many limitations.
For example, concurrent pipeline runs per pipeline are limited to 100, and write API calls to 2,500/hr. You need to design your workloads within these limits. In addition, you could contact support about your custom requirements.
