Calling the Azure Pipelines API from within a pipeline

I would like to call an Azure Pipelines API from within a stage of a pipeline, specifically to get the status of one pipeline from another, so that a job can be forced to wait until the other pipeline is no longer busy.
I can call the API with a PAT locally; I am just not sure of the best way of passing auth from within the pipeline. Does the agent have some kind of built-in auth mechanism for the DevOps APIs? Does the agent itself need a PAT, and if so, what's the best way of providing it one?

The System.AccessToken variable, as detailed here https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml, was what I required to call the pipeline APIs.
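A minimal sketch of using that token to wait on another pipeline via the Runs API (the organization, project, and pipeline id are placeholder assumptions, and the token must be mapped into the step's environment in the YAML):

```python
# Minimal sketch: poll another pipeline's latest runs from inside a job and
# wait until none is in progress. Assumes the YAML step maps the token in:
#   env: { SYSTEM_ACCESSTOKEN: $(System.AccessToken) }
import os
import time

import requests

ORG = "my-org"          # placeholder
PROJECT = "my-project"  # placeholder
PIPELINE_ID = 42        # placeholder: the pipeline to wait on

url = (f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/"
       f"{PIPELINE_ID}/runs?api-version=6.0-preview.1")
# System.AccessToken goes in the password slot of basic auth; username is empty.
auth = ("", os.environ["SYSTEM_ACCESSTOKEN"])

while True:
    runs = requests.get(url, auth=auth).json()["value"]
    if not any(run["state"] == "inProgress" for run in runs):
        break  # the other pipeline is idle; this job can proceed
    time.sleep(30)
```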

Related

Azure: How to build *resilient* resource deployment pipelines?

I am looking for some best practices and suggestions:
My team is creating an Azure DevOps pipeline to deploy a complex infrastructure of VNets, VMs, Azure ML workspaces, SQL databases, etc.
The pipeline uses Terraform where possible, and PowerShell or the Azure CLI where needed.
The pipeline works, it is version controlled, it has proper unit tests and integration tests (or at least decent ones).
However, due to the occasional instability of Azure resource provisioning, the pipeline will sometimes fail because, for instance:
SQL server provisioning fails
AD join of VMs fails
or other activities fail for reasons that are not bad infrastructure-as-code, but rather the stochastic nature of the task: provisioning resources is inherently flaky, much like networking.
I am not complaining about Azure. I am just asking:
How can I adjust the IaC pipeline so that when such Azure failures occur, some sort of retry is automatically triggered?
As a concrete example, is there an Azure or Terraform equivalent to Python's tenacity package or Java's Spring Retry?
You could try using the Trigger Azure DevOps Pipeline task to re-trigger the current build pipeline automatically when the build fails.
To be able to use the extension, an Azure DevOps service connection needs to be created.
For the service connection to work as it should you need to configure the following parameters:
Organization Url: The URL of the organization.
(https://dev.azure.com/[organization])
Personal Access Token: The personal access token.
Besides, we need to set a condition on this task so that it triggers the current pipeline only when a previous task has failed.
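For retrying a single flaky step in place rather than re-running the whole pipeline, a retry wrapper in the spirit of the tenacity package the question mentions can also work from a script task. A minimal Python sketch, where the az command and all resource names are placeholder assumptions:

```python
# Minimal sketch: tenacity-style retry around a flaky provisioning step,
# run from a script task in the pipeline. The az command and all resource
# names are placeholders; adapt them to whichever step keeps failing.
import subprocess

from tenacity import retry, stop_after_attempt, wait_exponential

@retry(stop=stop_after_attempt(5),
       wait=wait_exponential(multiplier=10, max=300))
def provision_sql_server() -> None:
    # check=True raises CalledProcessError on a non-zero exit,
    # which makes tenacity schedule another attempt.
    subprocess.run(
        ["az", "sql", "server", "create",
         "--name", "my-sql-server",        # placeholder
         "--resource-group", "my-rg",      # placeholder
         "--location", "westeurope",
         "--admin-user", "sqladmin",
         "--admin-password", "<secret>"],  # inject from a secret variable
        check=True,
    )

provision_sql_server()
```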

Send Azure DevOps Pipeline Details to SIEM

I want to add pipeline details (status, id, who triggered it) to my SIEM solution. Can you suggest how to get that information?
Some rough ideas:
Invoke a Lambda from the pipeline and supply pipeline-related information through SNS.
If the above is possible, can you tell me how? I couldn't find a way to do it dynamically for all pipelines; I don't want to hardcode project and organization details.
I am not very familiar with SIEM, but for Azure DevOps pipelines, you can get detailed information programmatically via the REST APIs.
You can use Pipelines - Get to get information about a pipeline's status, id, who triggered it, and so on.
GET https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}?api-version=6.0-preview.1
You can follow this link for more Azure DevOps Pipelines REST API resources.
The API accepts a PAT or OAuth2 for authentication and returns the information as JSON.
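A minimal Python sketch of that Get call with a PAT (the organization, project, pipeline id, and token are placeholders):

```python
# Minimal sketch: fetch one pipeline's details with a PAT.
# Organization, project, pipeline id, and token are placeholders.
import requests

ORG, PROJECT, PIPELINE_ID = "my-org", "my-project", 42
PAT = "<personal-access-token>"

resp = requests.get(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/{PIPELINE_ID}",
    params={"api-version": "6.0-preview.1"},
    auth=("", PAT),  # the PAT goes in the password slot of basic auth
)
resp.raise_for_status()
pipeline = resp.json()
print(pipeline["id"], pipeline["name"])  # fields to forward to the SIEM
```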
Update:
Here is a REST API, Pipelines - List, that can get a list of pipelines in a project.
GET https://dev.azure.com/{organization}/{project}/_apis/pipelines?api-version=6.0-preview.1
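And a matching sketch for the List call, so nothing has to be hardcoded per pipeline (same placeholder organization, project, and token as above):

```python
# Minimal sketch: enumerate every pipeline in a project, then feed each id
# to the Get call above. ORG, PROJECT, and PAT are the same placeholders.
import requests

ORG, PROJECT = "my-org", "my-project"
PAT = "<personal-access-token>"

resp = requests.get(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines",
    params={"api-version": "6.0-preview.1"},
    auth=("", PAT),
)
resp.raise_for_status()
for p in resp.json()["value"]:
    print(p["id"], p["name"])
```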

How to get Azure DevOps pipeline manual cancellation call back

I am using an Azure DevOps pipeline to run some jobs. This pipeline has been created using YAML.
My DevOps pipeline calls an Azure Data Factory pipeline, so if a user manually cancels the DevOps pipeline, the Azure Data Factory pipeline keeps running, which ideally should not happen.
Is there a way to stop my Azure Data Factory pipeline automatically whenever the Azure DevOps pipeline is cancelled from the UI?
As a workaround, we could add a PowerShell task and set the custom condition canceled(); this task will only run if you cancel the build.
Then add a PowerShell script to call the API to cancel the Azure Data Factory pipeline run:
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.DataFactory/factories/{factoryName}/pipelineruns/{runId}/cancel?api-version=2018-06-01
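A minimal sketch of that call (all identifiers are placeholders; it assumes the ADF run id was captured when the run was started, and that a management-plane bearer token is available, e.g. from a service principal or `az account get-access-token`):

```python
# Minimal sketch: cancel the ADF pipeline run from the canceled() task.
# All identifiers are placeholders; assumes the ADF run id was captured when
# the run was started and a management-plane bearer token is available.
import requests

SUBSCRIPTION = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY = "<factory-name>"
RUN_ID = "<pipeline-run-id>"
TOKEN = "<bearer-token>"

resp = requests.post(
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.DataFactory"
    f"/factories/{FACTORY}/pipelineruns/{RUN_ID}/cancel",
    params={"api-version": "2018-06-01"},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
resp.raise_for_status()  # 200 means the cancel request was accepted
```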
Also, we could do this via a webhook; you could also check this blog and update the JSON accordingly.

Machine learning in Azure: How do I publish a pipeline to the workspace once I've already built it in Python using the SDK?

I don't know where else to ask this question, so I would appreciate any help or feedback. I've been reading the SDK documentation for Azure Machine Learning service (in particular azureml.core). There's a class called Pipeline that has methods validate() and publish(). Here are the docs for this:
https://learn.microsoft.com/en-us/python/api/azureml-pipeline-core/azureml.pipeline.core.pipeline.pipeline?view=azure-ml-py
When I call validate(), everything validates. When I call publish(), it seems to only create an API endpoint in the workspace; it doesn't register my pipeline under Pipelines, and there's obviously nothing in the designer.
My question: I want to publish my pipeline so I just have to launch it from the workspace with one click. I've already built it using the SDK (Python code), and I don't want to work with an API. Is there any way to do this, or would I have to rebuild the entire pipeline using the designer (drag and drop)?
I totally empathize with your confusion. Our team has been working with Azure ML pipelines for quite some time, but PublishedPipeline still confused me initially because:
what the SDK calls a PublishedPipeline is called a Pipeline Endpoint in the Studio UI, and
it is semi-related to Dataset's and Model's .register() methods, but fundamentally different.
TL;DR: all Pipeline.publish() does is create an endpoint that you can use to:
schedule and version Pipelines, and
re-run the pipeline from other services via a REST API call (e.g. via Azure Data Factory).
You can see PublishedPipelines in the Studio UI in two places:
Pipelines page :: Pipeline Endpoints tab
Endpoints page :: Pipeline Endpoints tab
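A minimal sketch of that end-to-end flow with the v1 SDK, assuming `pipeline` is an already-built azureml.pipeline.core.Pipeline object and all names are placeholders:

```python
# Minimal sketch: publish an already-built pipeline, then re-run it via the
# REST endpoint that publish() creates. Assumes `pipeline` is an existing
# azureml.pipeline.core.Pipeline object; all names are placeholders.
import requests
from azureml.core.authentication import InteractiveLoginAuthentication

published = pipeline.publish(
    name="my-pipeline",                        # placeholder
    description="Published for scheduled and REST-triggered runs",
    version="1.0",
)
print(published.endpoint)  # shows up under the Pipeline Endpoints tabs

# Re-running it later from another service is just a POST to that endpoint:
auth_header = InteractiveLoginAuthentication().get_authentication_header()
resp = requests.post(
    published.endpoint,
    headers=auth_header,
    json={"ExperimentName": "my-experiment"},  # placeholder
)
resp.raise_for_status()  # the response JSON contains the submitted run's id
```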

Azure DevOps invoked from an API to launch a Terraform file

Based on the following design, I want to launch a Terraform file and application code on Azure DevOps when an Azure Function triggers the pipeline.
Do you know of any example of making a normal REST API call to Azure DevOps (using Python or JS)? See the sketch after this question.
Do you think it is a good strategy to generate an orchestration pipeline that runs the infrastructure workflow first and the application code afterwards?
In order to deploy the application code, how can I modify it using the arguments coming from the API?
(architecture design diagram)
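On the first question, a minimal Python sketch of a plain REST call that queues a pipeline run via the Runs API, with a PAT for auth; all identifiers are placeholders, and template parameters are one way to pass the API's arguments through to the application deployment:

```python
# Minimal sketch: queue an Azure DevOps pipeline run from outside (e.g. from
# an Azure Function) and pass arguments in as template parameters.
# All identifiers are placeholders; the PAT could come from Key Vault.
import requests

ORG, PROJECT, PIPELINE_ID = "my-org", "my-project", 42
PAT = "<personal-access-token>"

resp = requests.post(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/pipelines/{PIPELINE_ID}/runs",
    params={"api-version": "6.0-preview.1"},
    auth=("", PAT),
    json={
        # Assumes the pipeline YAML declares a matching `parameters:` block,
        # which is one way to feed API-supplied arguments into the deployment.
        "templateParameters": {"appVersion": "1.2.3"},  # placeholder
    },
)
resp.raise_for_status()
print(resp.json()["_links"]["web"]["href"])  # URL of the queued run
```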
