How to use a nested each loop in Azure DevOps YAML

Below is a piece of code I am trying to iterate over with an each loop in a YAML pipeline.
Can anyone help?
parameters:
- name: parameter1
  type: object
  default:
    IN:
      Test1:
        folderPath: abc/myFolder1
      Test2:
        folderPath: abc/myFolder2
      Test3:
        folderPath: abc/myFolder3
    US:
      Test4:
        folderPath: xyz/myFolder4
    CA:
      Test5:
        folderPath: lmn/myFolder5
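As a sketch of one way this could be iterated (the script step below is only a placeholder), two nested ${{ each }} loops walk the regions and then the tests inside each region:

steps:
- ${{ each region in parameters.parameter1 }}:
  - ${{ each test in region.value }}:
    # region.key is e.g. 'IN', test.key is e.g. 'Test1'
    - script: echo "Region ${{ region.key }}, test ${{ test.key }}, folder ${{ test.value.folderPath }}"
      displayName: 'Process ${{ region.key }}/${{ test.key }}'

At template-expansion time, region.key and test.key hold the mapping keys, and test.value.folderPath resolves to the configured path.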

Related

Azure Function App infra redeployment makes existing functions in the app fail because of missing requirements and also deletes previous invocation data

I'm facing quite a big problem. I have a function app that I deploy with Azure Bicep in the following fashion:
param environmentType string
param location string
param storageAccountSku string
param vnetIntegrationSubnetId string
param kvName string
/*
This module contains the IaC for deploying the Premium function app
*/
/// Just a single minimum instance to start with and max scaling of 3 for dev, 5 for prd ///
var minimumElasticSize = 1
var maximumElasticSize = ((environmentType == 'prd') ? 5 : 3)
var name = 'nlp'
var functionAppName = 'function-app-${name}-${environmentType}'
/// Storage account for service ///
resource functionAppStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
  name: 'st4functionapp${name}${environmentType}'
  location: location
  kind: 'StorageV2'
  sku: {
    name: storageAccountSku
  }
  properties: {
    allowBlobPublicAccess: false
    accessTier: 'Hot'
    supportsHttpsTrafficOnly: true
    minimumTlsVersion: 'TLS1_2'
  }
}
/// Premium app plan for the service ///
resource servicePlanfunctionApp 'Microsoft.Web/serverfarms@2021-03-01' = {
  name: 'plan-${name}-function-app-${environmentType}'
  location: location
  kind: 'linux'
  sku: {
    name: 'EP1'
    tier: 'ElasticPremium'
    family: 'EP'
  }
  properties: {
    reserved: true
    targetWorkerCount: minimumElasticSize
    maximumElasticWorkerCount: maximumElasticSize
    elasticScaleEnabled: true
    isSpot: false
    zoneRedundant: ((environmentType == 'prd') ? true : false)
  }
}
// Create log analytics workspace
resource logAnalyticsWorkspacefunctionApp 'Microsoft.OperationalInsights/workspaces@2021-06-01' = {
  name: '${name}-functionapp-loganalytics-workspace-${environmentType}'
  location: location
  properties: {
    sku: {
      name: 'PerGB2018' // Standard
    }
  }
}
/// Log analytics workspace insights ///
resource applicationInsightsfunctionApp 'Microsoft.Insights/components@2020-02-02' = {
  name: 'application-insights-${name}-function-${environmentType}'
  location: location
  kind: 'web'
  properties: {
    Application_Type: 'web'
    Flow_Type: 'Bluefield'
    publicNetworkAccessForIngestion: 'Enabled'
    publicNetworkAccessForQuery: 'Enabled'
    Request_Source: 'rest'
    RetentionInDays: 30
    WorkspaceResourceId: logAnalyticsWorkspacefunctionApp.id
  }
}
// App service containing the workflow runtime ///
resource sitefunctionApp 'Microsoft.Web/sites@2021-03-01' = {
  name: functionAppName
  location: location
  kind: 'functionapp,linux'
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    clientAffinityEnabled: false
    httpsOnly: true
    serverFarmId: servicePlanfunctionApp.id
    siteConfig: {
      linuxFxVersion: 'python|3.9'
      minTlsVersion: '1.2'
      pythonVersion: '3.9'
      use32BitWorkerProcess: true
      appSettings: [
        {
          name: 'FUNCTIONS_EXTENSION_VERSION'
          value: '~4'
        }
        {
          name: 'FUNCTIONS_WORKER_RUNTIME'
          value: 'python'
        }
        {
          name: 'AzureWebJobsStorage'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTAZUREFILECONNECTIONSTRING'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTSHARE'
          value: 'app-${toLower(name)}-functionservice-${toLower(environmentType)}a6e9'
        }
        {
          name: 'APPINSIGHTS_INSTRUMENTATIONKEY'
          value: applicationInsightsfunctionApp.properties.InstrumentationKey
        }
        {
          name: 'ApplicationInsightsAgent_EXTENSION_VERSION'
          value: '~2'
        }
        {
          name: 'APPLICATIONINSIGHTS_CONNECTION_STRING'
          value: applicationInsightsfunctionApp.properties.ConnectionString
        }
        {
          name: 'ENV'
          value: toUpper(environmentType)
        }
      ]
    }
  }
  /// VNET integration so flows can access storage and queue accounts ///
  resource vnetIntegration 'networkConfig@2022-03-01' = {
    name: 'virtualNetwork'
    properties: {
      subnetResourceId: vnetIntegrationSubnetId
      swiftSupported: true
    }
  }
}
/// Outputs for creating access policies ///
output functionAppName string = sitefunctionApp.name
output functionAppManagedIdentityId string = sitefunctionApp.identity.principalId
The outputs are used for granting permissions to blob/queue storage and some Key Vault access. This code is a single module called from a main.bicep file and deployed via an Azure DevOps pipeline.
I have a second repository in which I have some functions and which I also deploy via Azure Pipelines. This one contains three YAML files for deployment: two templates (CI and CD) and one main pipeline, azure-pipelines.yml, that pulls it all together:
functions-ci.yml:
parameters:
- name: environment
  type: string
jobs:
- job:
  displayName: 'Publish the function as .zip'
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(pythonVersion)'
    displayName: 'Use Python $(pythonVersion)'
  - task: CopyFiles@2
    displayName: 'Create project folder'
    inputs:
      SourceFolder: '$(System.DefaultWorkingDirectory)'
      Contents: |
        **
      TargetFolder: '$(Build.ArtifactStagingDirectory)'
  - task: Bash@3
    displayName: 'Install requirements for running function'
    inputs:
      targetType: 'inline'
      script: |
        python3 -m pip install --upgrade pip
        pip install setup
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: '$(Build.ArtifactStagingDirectory)'
  - task: ArchiveFiles@2
    displayName: 'Create project zip'
    inputs:
      rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
      includeRootFolder: false
      archiveType: 'zip'
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
      replaceExistingArchive: true
  - task: PublishPipelineArtifact@1
    displayName: 'Publish project zip artifact'
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'functions$(environment)'
      publishLocation: 'pipeline'
functions-cd.yml:
parameters:
- name: environment
  type: string
- name: azureServiceConnection
  type: string
jobs:
- job: worfklowsDeploy
  displayName: 'Deploy the functions'
  steps:
  # Download created artifacts, containing the zipped function codes
  - task: DownloadPipelineArtifact@2
    inputs:
      buildType: 'current'
      artifactName: 'functions$(environment)'
      targetPath: '$(Build.ArtifactStagingDirectory)'
  # Zip deploy the functions code
  - task: AzureFunctionApp@1
    inputs:
      azureSubscription: $(azureServiceConnection)
      appType: functionAppLinux
      appName: function-app-nlp-$(environment)
      package: $(Build.ArtifactStagingDirectory)/**/*.zip
      deploymentMethod: 'zipDeploy'
They are pulled together in azure-pipelines.yml:
trigger:
  branches:
    include:
    - develop
    - main
pool:
  name: "Hosted Ubuntu 1804"
variables:
  ${{ if notIn(variables['Build.SourceBranchName'], 'main') }}:
    environment: dev
    azureServiceConnection: SC-NLPDT
  ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    environment: prd
    azureServiceConnection: SC-NLPPRD
  pythonVersion: '3.9'
stages:
# Builds the functions as .zip
- stage: functions_ci
  displayName: 'Functions CI'
  jobs:
  - template: ./templates/functions-ci.yml
    parameters:
      environment: $(environment)
# Deploys .zip workflows
- stage: functions_cd
  displayName: 'Functions CD'
  jobs:
  - template: ./templates/functions-cd.yml
    parameters:
      environment: $(environment)
      azureServiceConnection: $(azureServiceConnection)
So this successfully deploys my function app the first time around when I have also deployed the infra code. The imports are done well, the right function app is deployed, and the code runs when I trigger it.
But, when I go and redeploy the infra (Bicep) code, all of a sudden the newest version of the functions is gone and is replaced by a previous version.
Also, running this previous version doesn't work anymore since all my requirements that were installed in the pipeline (CI part) via pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt suddenly cannot be found anymore, giving import errors (i.e. Result: Failure Exception: ModuleNotFoundError: No module named 'azure.identity'). Mind you, this version did work previously just fine.
This is a big problem for me since I need to be able to update some infra stuff (like adding an APP_SETTING) without this breaking the current deployment of functions.
I had thought about just redeploying the function automatically after an infra update, but then I still miss the previous invocations which I need to be able to see.
Am I missing something in the above code? I cannot figure out what is going wrong here that causes my functions to change on infra deployment...
Looking at the documentation:
To enable your function app to run from a package, add a WEBSITE_RUN_FROM_PACKAGE setting to your function app settings.
1 Indicates that the function app runs from a local package file deployed in the d:\home\data\SitePackages (Windows) or /home/data/SitePackages (Linux) folder of your function app.
In your case, when you deploy your function app code using AzureFunctionApp@1 and zipDeploy, this automatically adds this app setting to your function app. When you redeploy your infrastructure, this setting is removed and the function app host does not know where to find the code.
If you add this app setting in your bicep file this should work:
{
  name: 'WEBSITE_RUN_FROM_PACKAGE'
  value: '1'
}

Azure Pipelines Automatic retries for a task

This is part of my YAML file.
I need to re-call this template if it fails; it should be rerun again, after a few seconds ideally. I am new to YAML files.
I tried to use retryCountOnTaskFailure, but it has to sit under a task, and the template call is at a different level of the hierarchy.
https://learn.microsoft.com/en-us/azure/devops/release-notes/2021/sprint-195-update#automatic-retries-for-a-task
- template: ${{variables['System.DefaultWorkingDirectory']}}
  parameters:
    Test: ${{ parameters.isTestRelease }}
    Environment: ${{ parameters.deploymentTarget }}
    Component: '${{ parameters.component }}'
The retryCountOnTaskFailure feature is applied to individual tasks within a pipeline. Templates aren't tasks; they act more like an include that expands the contents of the template into your pipeline.
# pipeline.yml
trigger: none
parameters:
- name: isTestRelease
  type: boolean
  default: false
- name: deploymentTarget
  type: string
  default: DEV
  values:
  - DEV
  - QA
  - UAT
  - PROD
- name: component
  type: string
  default: 'x'
stages:
- template: my-template.yml
  parameters:
    Test: ${{ parameters.isTestRelease }}
    Environment: ${{ parameters.deploymentTarget }}
    Component: ${{ parameters.component }}
And within the template, you'd want to add the retry logic to specific tasks:
# my-template.yml
parameters:
- name: isTestRelease
  type: boolean
- name: component
  type: string
stages:
- stage: Test
  jobs:
  - job: 1
    steps:
    - task: ...
    - task: ...
    - task: ...
      retryCountOnTaskFailure: 2
    - task: ...
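As a concrete illustration (a hypothetical step, not from the original answer) of where the property sits on an individual task:

steps:
- task: Bash@3
  displayName: 'Flaky step with automatic retries'
  retryCountOnTaskFailure: 2   # re-run this task up to 2 more times if it fails
  inputs:
    targetType: 'inline'
    script: |
      ./run-something-flaky.sh   # hypothetical script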
It's also located under "Control Options" when the "Command Line" task is added in the classic editor.

How to provide expression as value inside an ssm document?

I would like to add a server to an autoscaling group using an SSM document: if the group has n instances running, I want to have (n+1).
Since this stack is managed by CloudFormation, I just need to increase the 'DesiredCapacity' variable and update the stack, so I created a document with 2 steps:
Get the current value of 'DesiredCapacity'
Update the stack with the value of 'DesiredCapacity' + 1
I didn't find a way to express this simple operation; I guess I'm doing something wrong...
SSM Document:
schemaVersion: '0.3'
parameters:
  cfnStack:
    description: 'The cloudformation stack to be updated'
    type: String
mainSteps:
- name: GetDesiredCount
  action: 'aws:executeAwsApi'
  inputs:
    Service: cloudformation
    Api: DescribeStacks
    StackName: '{{ cfnStack }}'
  outputs:
  - Selector: '$.Stacks[0].Outputs.DesiredCapacity'
    Type: String
    Name: DesiredCapacity
- name: UpdateCloudFormationStack
  action: 'aws:executeAwsApi'
  inputs:
    Service: cloudformation
    Api: UpdateStack
    StackName: '{{ cfnStack }}'
    UsePreviousTemplate: true
    Parameters:
    - ParameterKey: WebServerCapacity
      ParameterValue: 'GetDesiredCount.DesiredCapacity' + 1 ### ERROR
      # ParameterValue: '{{GetDesiredCount.DesiredCapacity}}' + 1 ### ERROR (trying to concat STR to INT)
      # ParameterValue: '{{ GetDesiredCount.DesiredCapacity + 1}}' ### ERROR
There is a way to do calculations inside an SSM document using the Python runtime.
The additional Python step does the following:
The Python runtime receives its variables via the 'InputPayload' property
The 'current' (str) key is added to the event object
The Python function script_handler is called
The 'current' value is extracted using event['current']
The string is converted to an int and 1 is added
A dictionary is returned with the 'desired_capacity' key and its value as a string
The output is exposed ($.Payload.desired_capacity refers to the 'desired_capacity' key of the returned dictionary)
schemaVersion: '0.3'
parameters:
  cfnStack:
    description: 'The cloudformation stack to be updated'
    type: String
mainSteps:
- name: GetDesiredCount
  action: 'aws:executeAwsApi'
  inputs:
    Service: cloudformation
    Api: DescribeStacks
    StackName: '{{ cfnStack }}'
  outputs:
  - Selector: '$.Stacks[0].Outputs.DesiredCapacity'
    Type: String
    Name: DesiredCapacity
- name: Calculate
  action: 'aws:executeScript'
  inputs:
    Runtime: python3.6
    Handler: script_handler
    Script: |-
      def script_handler(events, context):
          desired_capacity = int(events['current']) + 1
          return {'desired_capacity': str(desired_capacity)}
    InputPayload:
      current: '{{ GetDesiredCount.DesiredCapacity }}'
  outputs:
  - Selector: $.Payload.desired_capacity
    Type: String
    Name: NewDesiredCapacity
- name: UpdateCloudFormationStack
  action: 'aws:executeAwsApi'
  inputs:
    Service: cloudformation
    Api: UpdateStack
    StackName: '{{ cfnStack }}'
    UsePreviousTemplate: true
    Parameters:
    - ParameterKey: WebServerCapacity
      ParameterValue: '{{ Calculate.NewDesiredCapacity }}'
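As a usage note (the document and stack names here are hypothetical), the automation could then be started with the AWS CLI:

# Start the automation, passing the CloudFormation stack name as a parameter
aws ssm start-automation-execution \
  --document-name "IncreaseDesiredCapacity" \
  --parameters "cfnStack=my-web-stack"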

Azure devops pipeline access Json file inputs and perform for(each) loop

I am using a Linux agent, and I need to iterate over JSON input objects for each project.
I have the JSON file below, and it may have more than 200 charts. I need to perform build, lint, and template steps and push to a repository. I could do this using shell/Python, but I thought I would use Azure Pipelines YAML code.
{
  "helm": {
    "charts": [
      {
        "project1": {
          "secretName": "mysecret1",
          "setValues": ["a", "b", "c"]
        }
      },
      {
        "project2": {
          "secretName": "mysecret2",
          "setValues": ["x", "y", "z"]
        }
      }
    ]
  }
}
azure-pipelines.yaml:
trigger:
- '*'
variables:
  buildConfiguration: 'Release'
  releaseBranchName: 'dev'
stages:
- stage: 'Build'
  pool:
    name: 'linux'
  displayName: 'Build helm Projects'
  jobs:
  - job: 'buildProjects'
    displayName: 'Building all the helm projects'
    steps:
    - task: HelmInstaller@0
      displayName: install helm
      inputs:
        helmVersion: 'latest'
        installKubectl: false
    - script: chmod -R 755 $(Build.SourcesDirectory)/
      displayName: 'Set Directory permissions'
    - task: PythonScript@0
      inputs:
        scriptSource: inline
        script: |
          import argparse, json, sys
          parser = argparse.ArgumentParser()
          parser.add_argument("--filePath", help="Provide the json file path")
          args = parser.parse_args()
          with open(args.filePath, 'r') as f:
              data = json.load(f)
          data = json.dumps(data)
          print('##vso[task.setvariable variable=helmConfigs;]%s' % (data))
        arguments: --filePath $(Build.SourcesDirectory)/build/helm/helmConfig.json
        failOnStderr: true
      displayName: setting up helm configs
    - template: helmBuild.yml
      parameters:
        helmCharts: $(HELMCONFIGS)
The JSON input is saved to the HELMCONFIGS variable in Azure Pipelines. As per the Azure documentation, it is a string type and we cannot convert it to any other type, like an array.
helmBuild.yml file:
parameters:
- name: helmCharts
  type: object
  default: {}
steps:
- ${{ each helm in parameters.helmCharts }}:
  - ${{ each charts in helm }}:
    - ${{ each chart in charts }}:
      - task: AzureKeyVault@1
        inputs:
          azureSubscription: 'A-B-C'
          KeyVaultName: chart.secretName
          SecretsFilter: '*'
          RunAsPreJob: true
I am not able to access chart.secretName. How do I access the secretName input?

Is it possible to mix map arguments in Azure Pipelines template yaml? How?

I wonder if it is possible to mix map arguments in Azure Pipelines template yaml, and how to do it.
The two scenarios shown below do the same thing: place a template parameter as the env argument of a task, but in the second I'm trying to do it through two maps instead of a single one. That could be useful when I have different purposes for those values (in the eyes of someone who is extending the template), but both are going to be used as 'env' under the hood.
This works fine:
Main Pipeline:
...
extends:
  template: templates/deploy/v1/deployment.job.yaml@infrastructure-templates
  parameters:
    name: dev
    variableGroup: 'AzureDevopsVariableGroupName'
    secretEnvVariables:
      SECRET1: ${SECRET1}
      SECRET2: ${SECRET2}
Target Template:
parameters:
- name: secretEnvVariables
  type: object
jobs:
...
  steps:
  - bash: |
      #!/bin/bash
      echo "SECRET1 = ${SECRET1}"
      ...
    displayName: Substitute Env VARS on files
    enabled: true
    env:
      ${{ parameters.secretEnvVariables }}
This doesn't work (and I wonder if it is possible to make it work):
Main Pipeline:
...
extends:
template: templates/deploy/v1/deployment.job.yaml#infrastructure-templates
parameters:
name: dev
variableGroup: 'AzureDevopsVariableGroupName'
secretEnvVariables:
SECRET1: ${SECRET1}
SECRET2: ${SECRET2}
moreVariables:
VAR1: ${VAR1}
Target Template:
parameters:
- name: secretEnvVariables
type: object
- name: moreVariables
type: object
jobs:
...
steps:
- bash: |
#!/bin/bash
echo "SECRET1 = ${SECRET1}"
echo "VAR = ${VAR1}"
...
displayName: Substitute Env VARS on files
enabled: true
env:
${{ parameters.secretEnvVariables }}
${{ parameters.moreVariables }}
Can it be done? How to do it?
I am doing something similar, and while this isn't well documented, you can use objects to accommodate it.
Here is the combo of environment and region deployment:
- name: environmentObjects
  type: object
  default:
  - environmentName: 'dev'
    regionAbrvs: ['eus']
  - environmentName: 'uat'
    regionAbrvs: ['eus', 'cus']
From there, you would use a nested loop to access each one, like:
- ${{ each environmentObject in parameters.environmentObjects }}:
  - ${{ each regionAbrv in environmentObject.regionAbrvs }}:
This should work for your scenario as well.
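As a minimal sketch of how that nested loop could expand into concrete work (the stage, job, and script below are placeholders, not from the original answer):

stages:
- ${{ each environmentObject in parameters.environmentObjects }}:
  - ${{ each regionAbrv in environmentObject.regionAbrvs }}:
    # One stage is generated per environment/region pair, e.g. deploy_dev_eus
    - stage: deploy_${{ environmentObject.environmentName }}_${{ regionAbrv }}
      jobs:
      - job: deploy
        steps:
        - script: echo "Deploying to ${{ environmentObject.environmentName }} in ${{ regionAbrv }}"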
