Azure DevOps YAML Pipelines - While loop?

Is there such a thing in Azure Release Pipelines (YAML) as a while loop? My use case is that I am using the runOnce strategy to deploy artifacts to a clean environment and deploy a client's data, and before I move on to the next client I need to run a query to ensure all the processing is finished and health checks are done. All checks can be done via SQL scripts against an Azure SQL Database, and eventually I need to compare the results and task timings against an expected set.
i.e. does processing the client data across branches yield the expected results and timings?
This might be a square peg in a round hole, so I'm happy to use a different approach if there is an easier way.
- deployment: Install
  pool:
    vmImage: ubuntu-latest
  environment:
    name: 'Test_Env'
    resourceType: VirtualMachine
  strategy:
    runOnce:
      deploy:
        steps:
        # Remove and re-create blank database on the Elastic Pool.
        - task: SqlAzureDacpacDeployment@1
          displayName: Drop DB
          inputs:
            azureSubscription: 'Azure'
            AuthenticationType: 'server'
            ServerName: '$(DB_SERVER)'
            DatabaseName: 'master'
            SqlUsername: '$(DB_USERNAME)'
            SqlPassword: '$(DB_PASSWORD)'
            deployType: 'InlineSqlTask'
            SqlInline: |
              IF EXISTS (SELECT name FROM master.sys.databases WHERE name = N'$(DB_DATABASE)') DROP DATABASE [$(DB_DATABASE)]
            IpDetectionMethod: 'AutoDetect'
        - task: SqlAzureDacpacDeployment@1
          displayName: Create DB
          inputs:
            azureSubscription: 'Azure'
            AuthenticationType: 'server'
            ServerName: '$(DB_SERVER)'
            DatabaseName: 'master'
            SqlUsername: '$(DB_USERNAME)'
            SqlPassword: '$(DB_PASSWORD)'
            deployType: 'InlineSqlTask'
            SqlInline: |
              CREATE DATABASE $(DB_DATABASE) ( SERVICE_OBJECTIVE = ELASTIC_POOL (name = [SQL_ElasticPool] ));
            IpDetectionMethod: 'AutoDetect'
        - task: CmdLine@2
          displayName: Install Product
          inputs:
            script: |
              start /wait msiexec.exe /i "$(System.ArtifactsDirectory)\installer.msi" client_data= $(client_data) DB_USERNAME=$(DB_USERNAME) DB_PASSWORD=$(DB_PASSWORD)
            workingDirectory: $(System.ArtifactsDirectory)
        - task: CmdLine@2
          displayName: Start Service
          inputs:
            script: |
              sc start $(WIN_SERVICE)
        # This is where I would want a while-loop
        - task: SqlAzureDacpacDeployment@1
          displayName: Check if processing finished
          inputs:
            azureSubscription: 'Azure'
            AuthenticationType: 'server'
            ServerName: '$(DB_SERVER)'
            DatabaseName: '$(DB_DATABASE)'
            SqlUsername: '$(DB_USERNAME)'
            SqlPassword: '$(DB_PASSWORD)'
            deployType: 'InlineSqlTask'
            SqlInline: |
              select 1 from eventlog if complete = 0
            IpDetectionMethod: 'AutoDetect'

You could create a template that holds a set of "Check if processing finished" tasks and pass the loop times in as a parameter from your build, like:
- template: CheckProcessingFinished.yaml
  parameters:
    param: ["1","2","3"]
CheckProcessingFinished.yaml:
parameters:
  param: []

steps:
- ${{each Looptimes in parameters.param}}:
  - task: SqlAzureDacpacDeployment@1
    displayName: Check if processing finished
    inputs:
      azureSubscription: 'Azure'
      AuthenticationType: 'server'
      ServerName: '$(DB_SERVER)'
      DatabaseName: '$(DB_DATABASE)'
      SqlUsername: '$(DB_USERNAME)'
      SqlPassword: '$(DB_PASSWORD)'
      deployType: 'InlineSqlTask'
      SqlInline: |
        select 1 from eventlog if complete = 0
      IpDetectionMethod: 'AutoDetect'
  - task: PowerShell@2
    displayName: Sleep 30 seconds
    inputs:
      targetType: 'inline'
      script: |
        Start-Sleep 30
You could check the document Solving the looping problem in Azure DevOps Pipelines for some more details.

In YAML we use the each keyword, which is the equivalent of a for loop.
You cannot run a task repeatedly until a condition is met; you can only run multiple
tasks, either by writing them out one by one or by using the each
statement, which is equivalent to a for loop.
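For illustration, here is a minimal, purely illustrative sketch of how each unrolls at compile time (the parameter and step names are made up): each entry in the list yields one copy of the nested steps.
parameters:
- name: retries
  type: object
  default: ["1", "2", "3"]

steps:
- ${{ each retry in parameters.retries }}:
  - script: echo "Check attempt ${{ retry }}"
    displayName: Check attempt ${{ retry }}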
But you can use a gate, which is a way to control a deployment. Gates are predominantly used for health checks of infrastructure, external approvals for deployment, etc.
The gates can either be at the start of the pipeline or at the end of
the pipeline.
Gates come in many types: some invoke Azure Functions, some use Azure Monitor, and some use REST APIs. You can also create your own custom gates.
Gates will continue to check for a user-specified condition until it is met.
So you can use an Azure Function gate to run the required scripts and return the results to the gate for validation.
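As a rough sketch of such a function (the query, environment variable names and connection details are assumptions, and it relies on the SqlServer PowerShell module being available in the Function App), an HTTP-triggered PowerShell function could run the health-check query and return a status field that the gate's success criteria, for example eq(root['status'], 'successful'), can evaluate:
using namespace System.Net

param($Request, $TriggerMetadata)

# Hypothetical check: count unfinished rows in the event log.
$pending = (Invoke-Sqlcmd -ServerInstance $env:DB_SERVER -Database $env:DB_DATABASE `
    -Username $env:DB_USERNAME -Password $env:DB_PASSWORD `
    -Query "SELECT COUNT(*) AS cnt FROM eventlog WHERE complete = 0").cnt

# Return a status the gate can keep polling until it reads 'successful'.
$status = if ($pending -eq 0) { 'successful' } else { 'pending' }

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = (@{ status = $status } | ConvertTo-Json)
})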
Reference:
Azure Pipeline Gate

Related

Azure DevOps pipelines - how to execute a whole template stage in a docker container by a self-hosted agent

I have a multi-stage Azure DevOps pipeline in YAML. Each stage includes multiple jobs. The pipeline runs properly using Azure-hosted agents. I need to run it with our company's self-hosted agent. The self-hosted agent is a virtual machine scale set (VMSS) which is defined as an agent pool in DevOps; it has been tested and is working properly. Because the self-hosted agent does not have all the capabilities we need, our company's policy is to run the jobs inside Docker containers. The Docker images reside in our company's private Azure Container Registry.
My question is how can I run a whole template stage in a docker container? Please note that my question is not about executing a job but all the jobs in a stage template. This link explains how to execute jobs inside a container. It doesn't give any examples of how to run all the jobs in a stage in a container, especially when the stage is defined as a template.
My pipeline in simplified form is as follows:
## Name of self-hosted agent
pool: Platform-VMSS

resources:
  containers:
  - container: base
    image: myazurecontainerregistryname.azurecr.io/platform/myimg:1.0.0
    endpoint: 'serviceconnection-acr'

trigger:
  branches:
    include:
    - "feature/ORGUS-4810"
    exclude:
    - "main"
    - "release"
  paths:
    include:
    - "usecase-poc/**"

variables:
  pythonVersion: 3.8
  whlPackageName: py_core
  srcDirectory: usecase-poc/$(whlPackageName)
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
    BuildFrom: main
    PackageName: $(whlPackageName)
    versionOption: custom
    deployToDevPipeline: True
    deployToQAPipeline: True
    deployToProdPipeline: True
  ${{ else }}:
    BuildFrom: branch_${{ lower(variables['Build.SourceBranchName'] ) }}
    PackageName: $(whlPackageName)_$(BuildFrom)
    versionOption: patch
    deployToDevPipeline: True
    deployToQAPipeline: False
    deployToProdPipeline: False
  stageName: 'deployToDev'

stages:
- stage: DownloadArtifact
  displayName: "Download python whl from artifactory"
  jobs:
  - job: DownloadArtifactJob
    steps:
    - checkout: self
    - task: UniversalPackages@0
      displayName: 'Download Artifact with Universal Packages'
      inputs:
        vstsFeed: 'some-value/00000000-0000-0009-0000-d24300000000'
        vstsFeedPackage: '00000000-0000-0000-0000-0000000'
        vstsPackageVersion: 0.8.4
        downloadDirectory: $(Build.ArtifactStagingDirectory)
    - task: Bash@3
      name: GetWheelName_Task
      inputs:
        targetType: 'inline'
        script: |
          echo $(Build.ArtifactStagingDirectory)
          find $(Build.ArtifactStagingDirectory) -name '*.whl'
          ArtifactName=$(find $(Build.ArtifactStagingDirectory) -name '*.whl')
          echo "Artifact name value is " $ArtifactName
          echo "##vso[task.setvariable variable=ArtifactName;isOutput=true]$ArtifactName"
      displayName: 'Get downloaded artifact in source directory'
    - task: PublishBuildArtifacts@1
      displayName: "Publish downloaded artifact to pipeline's output"
      inputs:
        pathToPublish: $(Build.ArtifactStagingDirectory)
        artifactName: whl

- stage: ConstructSharedVariablesForAllStages
  displayName: Construct Shared Variables For All Stages
  dependsOn: DownloadArtifact
  variables:
  - group: proj-shared-vg
  - name: ArtifactName
    value: $[stageDependencies.DownloadArtifact.DownloadArtifactJob.outputs['GetWheelName_Task.ArtifactName']]
  jobs:
  - job: DownloadArtifact
    container: base
    steps:
    - task: Bash@3
      displayName: "Print variable value"
      inputs:
        targetType: 'inline'
        script: |
          echo $(ArtifactName)
    - task: Bash@3
      name: extractWheelName
      displayName: Extract Wheel Name
      inputs:
        targetType: inline
        script: |
          echo $(ArtifactName) | awk -F"/" '{print $NF}'
          WhlName="py_core-0.8.4-py3-none-any.whl"
          echo "##vso[task.setvariable variable=WhlName;isOutput=true]$WhlName"
    - task: DownloadPipelineArtifact@2
      displayName: "Download artifact from previous stage"
      inputs:
        buildType: 'current'
        project: 'Project Name'
        buildVersionToDownload: 'latest'
        artifactName: whl
        targetPath: '$(System.ArtifactsDirectory)'
    - pwsh: |
        $whlFile = Get-ChildItem -Filter *.whl -Path "$(System.ArtifactsDirectory)" | ForEach-Object { $_.fullname } | Select-Object -First 1
        Write-Host "##vso[task.setvariable variable=whlFile]$whlFile"
      name: SetVars
      displayName: Get wheel name

## This is the section where my question is about. How can I make sure each stage runs in the self-hosted agent pool. The stage contains multiple jobs.
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToDevPipeline}}
    variableGroup: databricks-sp-vg-dev
    stageName: DeployToDev
    environment: DBRKS_Dev_WHL
    conditionParameter: deployToDev
    dependsOnStage: ConstructSharedVariablesForAllStages
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToQAPipeline}}
    variableGroup: databricks-sp-vg-qa
    stageName: DeployToQA
    environment: DBRKS_QA_WHL
    conditionParameter: deployToQA
    dependsOnStage: DeployToDev
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToProdPipeline}}
    variableGroup: databricks-sp-vg-prod
    stageName: DeployToProd
    environment: DBRKS_Prod_WHL
    conditionParameter: deployToProd
    dependsOnStage: DeployToQA
In the code above, in resources, the container resides in our Azure Container Registry (ACR); the endpoint is our DevOps service connection of type container registry, used to pull and push images to and from ACR. In the code above I have commented where the issue is. In the templates section, where I am referring to stage templates, I would like to run them all inside the container I have defined as a resource at the beginning of the pipeline. The stage template has multiple jobs. Here is just a sample of the stage template invocation to emphasize that it has multiple jobs:
The highlighted stage is the one created by template:
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToDevPipeline}}
    variableGroup: databricks-sp-vg-dev
    stageName: DeployToDev
    environment: DBRKS_Dev_WHL
    conditionParameter: deployToDev
    dependsOnStage: ConstructSharedVariablesForAllStages
The question is how to enforce that all the jobs in the above template run in the Docker container defined as a resource in our pipeline. Thank you very much for your valuable input.
Add a container field at the job level as shown below. Then all the jobs in the template will run on the specified container.
pool:
  vmImage: ubuntu-latest
resources:
  containers:
    - container: testcontainer
      image: ubuntu
stages:
  - stage: template01
    displayName: template test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template01.yaml
  - stage: template02
    displayName: template test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template02.yaml
Else, add a step target field to all the required tasks in a template, as described in Build and Release Tasks - Azure Pipelines | Microsoft Learn (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/tasks?view=azure-devops&tabs=yaml#step-target):
resources:
  containers:
  - container: pycontainer
    image: python:3.11

steps:
- task: AnotherTask@1
  target: pycontainer

Run a command in container after deployment in Azure

I'm using Azure for hosting and Azure Pipelines for CI/CD operations.
I have the image build and deploy operations defined like this:
- stage: Package
  displayName: 'Package app'
  jobs:
  - job:
    steps:
    - task: Docker@2
      displayName: 'Build image'
      inputs:
        containerRegistry: '$(containerRegistry)'
        repository: '$(containerRepository)'
        command: 'build'
        Dockerfile: './Dockerfile'
        buildContext: '.'
        tags: |
          $(Build.BuildId)
    - task: Docker@2
      displayName: 'Push image'
      inputs:
        command: push
        containerRegistry: '$(containerRegistry)'
        repository: '$(containerRepository)'
        tags: |
          $(Build.BuildId)
- stage: Deploy
  displayName: 'Deploy'
  jobs:
  - job:
    steps:
    - task: AzureWebAppContainer@1
      inputs:
        azureSubscription: $(subscription)
        appName: $(appName)
What should I do to execute some operations in my container after the AzureWebAppContainer task is finished? I have to make some database updates after the deploy operation.
I've tried to find documentation for Azure and searched some SO topics, but didn't find any solution yet, except the use of entrypoint/cmd for the database updates, which is not working for me.
I think there should be some Azure Pipelines mechanism to perform such actions.
You can use the startup command in AzureWebAppContainer@1 or the AzureAppServiceSettings task to manage the follow-up operation.
By the way, you could also refer to this doc for Azure Web App for Containers to get more details.
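As a minimal sketch (the script path is hypothetical, and it assumes the startupCommand input of AzureWebAppContainer@1 fits your scenario), the deploy task could look like:
- task: AzureWebAppContainer@1
  inputs:
    azureSubscription: $(subscription)
    appName: $(appName)
    # Hypothetical script baked into the image: run the database updates, then start the app.
    startupCommand: '/app/run-db-updates-and-start.sh'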

Run two stages in Azure DevOps Pipeline "partially" parallel

I have two stages in my Azure DevOps pipeline: one with Pulumi Preview (let's call it Preview) and one with Pulumi Up (Up) in order to run my infrastructure as code.
Both run from the same container, and it takes a while to pull it. I want to manually approve the Preview before the implementation.
Can I pull and run the container for both stages simultaneously but make the last job of the Up stage wait until the Preview stage is approved?
Currently they depend on each other as follows:
trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: Pulumi_Preview
  jobs:
  - job: Preview
    container:
      image: REGISTRY.azurecr.io/REPOSITORY:latest
      endpoint: ServiceConnection
    steps:
    - task: Pulumi@1
      displayName: pulumi preview
      inputs:
        azureSubscription: 'Something'
        command: 'preview'
        args: '--diff --show-config --show-reads --show-replacement-steps'
        stack: $(pulumiStackShort)
        cwd: "./"

- stage: Pulumi_Up
  displayName: "Pulumi (Up)"
  dependsOn: Pulumi_Preview
  jobs:
  - job: Up
    container:
      image: REGISTRY.azurecr.io/REPOSITORY:latest
      endpoint: ServiceConnection
    steps:
    - task: Pulumi@1
      inputs:
        azureSubscription: 'Something'
        command: 'up'
        args: "--yes --skip-preview"
        stack: $(pulumiStackShort)
        cwd: "./"
You could use the Manual Validation Task.
Use this task in a YAML pipeline to pause a run within a stage, typically to perform some manual actions or validations, and then resume/reject the run.
jobs:
- job: waitForValidation
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320 # job times out in 3 days
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440 # task times out in 1 day
    inputs:
      notifyUsers: |
        test@test.com
        example@example.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'resume'
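One way to wire this into the existing pipeline (a sketch, assuming the approval should sit between Preview and Up; ManualValidation only runs in an agentless server job) is to put the validation job inside the Pulumi_Up stage and make the Up job depend on it:
- stage: Pulumi_Up
  displayName: "Pulumi (Up)"
  dependsOn: Pulumi_Preview
  jobs:
  - job: waitForValidation
    displayName: Wait for approval of the preview
    pool: server
    steps:
    - task: ManualValidation@0
      inputs:
        notifyUsers: 'test@test.com'
        instructions: 'Review the Pulumi preview output and resume'
  - job: Up
    dependsOn: waitForValidation
    container:
      image: REGISTRY.azurecr.io/REPOSITORY:latest
      endpoint: ServiceConnection
    steps:
    - task: Pulumi@1
      inputs:
        azureSubscription: 'Something'
        command: 'up'
        args: "--yes --skip-preview"
        stack: $(pulumiStackShort)
        cwd: "./"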

Create a pull request environment - Azure

I would like to create a pull request environment in Azure which gets deployed when a pull request is opened. Users can play around in that environment and find bugs. Once the bugs are fixed and the PR is closed, I would like to delete that environment.
I am following along with Sam Learns Azure and I was able to get through most steps, but it's all .NET related.
Does anyone have an idea how to do the same for a React app?
I am also in favor of creating a new slot inside the App Service.
This is my modified code:
jobs:
- deployment: DeployWebServiceApp
  displayName: "Deploy webservice app"
  environment: ${{parameters.environment}}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - task: DownloadPipelineArtifact@2
          displayName: 'Download Pipeline Artifacts'
          inputs:
            artifactName: "$(Build.BuildId)"
            buildType: 'current'
        - task: AzureCLI@2
          displayName: 'Deploy infrastructure with ARM templates'
          inputs:
            azureSubscription: "${{parameters.serviceConnection}}"
            scriptType: bash
            scriptLocation: inlineScript
            inlineScript: >
              az webapp deployment slot create --name ui-dev-$(prLC)
              --resource-group rg-dev
              --slot $(prLC)
              --configuration-source app-dev
        - task: AzureRmWebAppDeployment@3
          displayName: 'Azure App Service Deploy: web service'
          inputs:
            azureSubscription: "${{parameters.serviceConnection}}"
            appName: "${{parameters.appServiceName}}"
            DeployToSlotFlag: true
            ResourceGroupName: '${{parameters.resourceGroupName}}'
            package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
            RemoveAdditionalFilesFlag: true
            TakeAppOfflineFlag: true
            RenameFilesFlag: true
Why not? You can do that with the steps below:
1. Create the build pipe which builds your React app.
2. In the build pipe, set a VSTS variable only when it is a PR, e.g. varPR = x (see the sketch below).
3. Create a release pipe which deploys the app.
4. Add one more task at the end of the build pipe to call the release via a webhook, based on the condition varPR = x.
Ref - Trigger azure pipeline via webhook?
The link you have shared is also cool; it does the same job in a single build pipe.
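A minimal sketch of step 2 (the variable name varPR comes from the steps above; the script step itself is illustrative): Build.Reason equals 'PullRequest' for PR-triggered runs, so the flag can be set conditionally.
steps:
- pwsh: Write-Host "##vso[task.setvariable variable=varPR]x"
  displayName: Flag PR builds
  condition: eq(variables['Build.Reason'], 'PullRequest')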

Referencing Azure Key Vault secrets from CI/CD YAML

We have a multi-stage YAML pipeline that does CI/CD to an existing set of Azure Resources
The stages are
Build
Deploy to Development and Run Tests
If Previous succeeded - Deploy to Production and Run Tests
We use the AzureRmWebAppDeployment task during the deployment stages and we use the AppSettings argument to that task to specify environment-specific settings. For example
- task: AzureRmWebAppDeployment@4
  displayName: 'Deploy Azure App Service'
  inputs:
    azureSubscription: '$(azureSubscriptionEndpoint)'
    appType: '$(webAppKind)'
    WebAppName: 'EXISTING__AZURE_RESOURCENAME-DEV'
    Package: '$(Pipeline.Workspace)/**/*.zip'
    AppSettings: >
      -AzureAd:CallbackPath /signin-oidc
      -AzureAd:ClientId [GUID was here]
      -AzureAd:Domain [domain was here]
      -AzureAd:Instance https://login.microsoftonline.com/
      -AzureAd:TenantId [Id was here]
      -EmailServer:SMTPPassword SECRETPASSWORD
      -EmailServer:SMTPUsername SECRETUSERNAME
There are two settings in that set, EmailServer:SMTPUsername and EmailServer:SMTPPassword, that I want to pull from an Azure Key Vault. I know how to reference a KV secret from the Azure Portal using the syntax
@Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)
but how do I reference the value from the YAML pipeline so it is set in Azure?
As pointed out by Thomas in this comment (Referencing Azure Key Vault secrets from CI/CD YAML),
I can explicitly set the value in the YAML file like this:
-EmailServer:SMTPPassword @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)
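In the context of the AppSettings block from the question, that looks roughly like this (the secret names and URIs are placeholders):
AppSettings: >
  -EmailServer:SMTPUsername @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridUsername/ReferenceGuidHere)
  -EmailServer:SMTPPassword @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)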
You need to set an AzureKeyVault@1 task with RunAsPreJob set to true; this will make your Key Vault values available as environment variables for the CI/CD job, so you can use them as $(KEY-OF-SECRET-VALUE) in the rest of the steps of the job.
The following piece of YAML is a working example.
For python unittest, we set a group of environment variables provided from Azure Key Vault:
trigger:
  batch: true # disable concurrent build for pipeline
  branches:
    include:
    - '*' # CI start for all branches

pool:
  vmImage: ubuntu-16.04

stages:
- stage: Test
  jobs:
  - job: sample_test_stage
    steps:
    - task: AzureKeyVault@1
      inputs:
        azureSubscription: 'YOUR SUBSCRIPTION HERE'
        KeyVaultName: 'THE-KEY-VAULT-NAME'
        SecretsFilter: '*'
        RunAsPreJob: true
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.7'
    - script: python -m unittest discover -v -s tests
      displayName: 'Execute python unittest'
      env: { MY-ENV-VAL-1: $(SECRET-VALUE-1), MY-ENV-VAL-2: $(SECRET-VALUE-2) }
Note that sometimes you need to approve the connection between Azure DevOps and another Azure service like Key Vault.
