Currently, I'm working on a pipeline that should call an Azure Function in a certain way, depending on the outcome/result of a previous job in that pipeline.
The Azure Function should be called when the result of the previous job is Succeeded, SucceededWithIssues or Failed. We want to ignore Skipped and Cancelled.
The body sent to the Azure Function differs based on the result (Succeeded/SucceededWithIssues vs. Failed), but only by a single boolean in the payload called DeploymentFailed.
The current implementation uses two separate tasks to call the Azure Function. This was necessary because I couldn't find a way to convert the outcome of the previous job to a boolean.
The current pipeline as is:
trigger:
- master

parameters:
- name: jobA
  default: 'A'
- name: correlationId
  default: '90c7e477-2141-45db-812a-019a9f88bdc8'

pool:
  vmImage: ubuntu-latest

jobs:
- job: job_${{parameters.jobA}}
  steps:
  - script: echo "This job could potentially fail."

- job: job_B
  dependsOn: job_${{parameters.jobA}}
  variables:
    failed: $[dependencies.job_${{parameters.jobA}}.result]
  condition: in(variables['failed'], 'Succeeded', 'SucceededWithIssues', 'Failed')
  pool: server
  steps:
  - task: AzureFunction@1
    displayName: Call function succeeded
    condition: in(variables['failed'], 'Succeeded', 'SucceededWithIssues')
    inputs:
      function: "<azure-function-url>"
      key: "<azure-function-key>"
      method: 'POST'
      waitForCompletion: false
      headers: |
        {
          "Content-Type": "application/json"
        }
      body: |
        {
          "CorrelationId": "${{parameters.correlationId}}",
          "DeploymentFailed": false # I would like to use the outcome of `variable.failed` here and cast it to a JSON bool.
        }
  - task: AzureFunction@1
    displayName: Call function failed
    condition: in(variables['failed'], 'Failed')
    inputs:
      function: "<azure-function-url>"
      key: "<azure-function-key>"
      waitForCompletion: false
      method: 'POST'
      headers: |
        {
          "Content-Type": "application/json"
        }
      body: |
        {
          "CorrelationId": "${{parameters.correlationId}}",
          "DeploymentFailed": true # I would like to use the outcome of `variable.failed` here and cast it to a JSON bool.
        }
My question: how can I use the outcome of the previous job so that I only need a single Azure Function invoke task?
You can map the condition directly to a variable:
variables:
  failed: $[dependencies.job_${{parameters.jobA}}.result]
  result: $[lower(notIn(dependencies.job_${{parameters.jobA}}.result, 'Succeeded', 'SucceededWithIssues'))]
and then:
- task: AzureFunction@1
  displayName: Call function
  condition: in(variables['failed'], 'Succeeded', 'SucceededWithIssues', 'Failed')
  inputs:
    function: "<azure-function-url>"
    key: "<azure-function-key>"
    method: 'POST'
    waitForCompletion: false
    headers: |
      {
        "Content-Type": "application/json"
      }
    body: |
      {
        "CorrelationId": "${{parameters.correlationId}}",
        "DeploymentFailed": $(result)
      }
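At runtime the boolean returned by `notIn()` is converted to the string 'True' or 'False', and `lower()` makes it lowercase, so the `$(result)` macro expands to a valid JSON boolean in the request body. The job-level condition from the question (in(variables['failed'], 'Succeeded', 'SucceededWithIssues', 'Failed')) still keeps the job from running on Skipped or Cancelled, so this single task covers both the succeeded and the failed cases.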
Related
I am continuously struggling to connect to GCP from the Azure DevOps InvokeRestAPI task.
I have created a service connection with empty credentials, and created an API task in the YAML file as below.
When I add 'Authorization' to the headers, DevOps fails to recognize it.
When I add the token, with or without 'Bearer', in 'AuthToken', it fails with a 401 error saying authentication failed.
This is the error I face every time, no matter what I do:
"message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
Here is the yaml code:
- job: planing_df1
  pool: server
  steps:
  - task: InvokeRESTAPI@1
    inputs:
      connectionType: 'connectedServiceName'
      serviceConnection: 'GCPServiceConnectionBasic'
      method: 'GET'
      headers: |
        {
          "PlanUrl": "$(system.CollectionUri)",
          "ProjectId": "$(system.TeamProjectId)",
          "HubName": "$(system.HostType)",
          "PlanId": "$(system.PlanId)",
          "JobId": "$(system.JobId)",
          "TimelineId": "$(system.TimelineId)",
          "TaskInstanceId": "$(system.TaskInstanceId)",
          "AuthToken": "ya29.a0AeTM1ie8PKbCNb3nnTJ9XFnoVlBUlgiM48XAENJIFAl-dp4gHblablabla"
        }
      urlSuffix: '/myproj/locations/europe-west4/repositories/Dataform'
      waitForCompletion: 'true'
When you call the API from the Invoke REST API task, you need to make sure that the same token works fine in your local environment.
Then you can use the following format to set the headers of the Invoke REST API task:
"Authorization": "Bearer <bearer-token>"
For example:
- job: planing_df1
  pool: server
  steps:
  - task: InvokeRESTAPI@1
    inputs:
      connectionType: 'connectedServiceName'
      serviceConnection: 'GCPServiceConnectionBasic'
      method: 'GET'
      headers: |
        {
          "PlanUrl": "$(system.CollectionUri)",
          "ProjectId": "$(system.TeamProjectId)",
          "HubName": "$(system.HostType)",
          "PlanId": "$(system.PlanId)",
          "JobId": "$(system.JobId)",
          "TimelineId": "$(system.TimelineId)",
          "TaskInstanceId": "$(system.TaskInstanceId)",
          "Authorization": "Bearer <bearer-token>"
        }
      urlSuffix: '/myproj/locations/europe-west4/repositories/Dataform'
      waitForCompletion: 'true'
I'm facing quite a big problem. I have a function app that I deploy with Azure Bicep in the following fashion:
param environmentType string
param location string
param storageAccountSku string
param vnetIntegrationSubnetId string
param kvName string

/*
This module contains the IaC for deploying the Premium function app
*/

/// Just a single minimum instance to start with and max scaling of 3 for dev, 5 for prd ///
var minimumElasticSize = 1
var maximumElasticSize = ((environmentType == 'prd') ? 5 : 3)
var name = 'nlp'
var functionAppName = 'function-app-${name}-${environmentType}'

/// Storage account for service ///
resource functionAppStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
  name: 'st4functionapp${name}${environmentType}'
  location: location
  kind: 'StorageV2'
  sku: {
    name: storageAccountSku
  }
  properties: {
    allowBlobPublicAccess: false
    accessTier: 'Hot'
    supportsHttpsTrafficOnly: true
    minimumTlsVersion: 'TLS1_2'
  }
}
/// Premium app plan for the service ///
resource servicePlanfunctionApp 'Microsoft.Web/serverfarms@2021-03-01' = {
  name: 'plan-${name}-function-app-${environmentType}'
  location: location
  kind: 'linux'
  sku: {
    name: 'EP1'
    tier: 'ElasticPremium'
    family: 'EP'
  }
  properties: {
    reserved: true
    targetWorkerCount: minimumElasticSize
    maximumElasticWorkerCount: maximumElasticSize
    elasticScaleEnabled: true
    isSpot: false
    zoneRedundant: ((environmentType == 'prd') ? true : false)
  }
}

// Create log analytics workspace
resource logAnalyticsWorkspacefunctionApp 'Microsoft.OperationalInsights/workspaces@2021-06-01' = {
  name: '${name}-functionapp-loganalytics-workspace-${environmentType}'
  location: location
  properties: {
    sku: {
      name: 'PerGB2018' // Standard
    }
  }
}

/// Log analytics workspace insights ///
resource applicationInsightsfunctionApp 'Microsoft.Insights/components@2020-02-02' = {
  name: 'application-insights-${name}-function-${environmentType}'
  location: location
  kind: 'web'
  properties: {
    Application_Type: 'web'
    Flow_Type: 'Bluefield'
    publicNetworkAccessForIngestion: 'Enabled'
    publicNetworkAccessForQuery: 'Enabled'
    Request_Source: 'rest'
    RetentionInDays: 30
    WorkspaceResourceId: logAnalyticsWorkspacefunctionApp.id
  }
}
/// App service containing the workflow runtime ///
resource sitefunctionApp 'Microsoft.Web/sites@2021-03-01' = {
  name: functionAppName
  location: location
  kind: 'functionapp,linux'
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    clientAffinityEnabled: false
    httpsOnly: true
    serverFarmId: servicePlanfunctionApp.id
    siteConfig: {
      linuxFxVersion: 'python|3.9'
      minTlsVersion: '1.2'
      pythonVersion: '3.9'
      use32BitWorkerProcess: true
      appSettings: [
        {
          name: 'FUNCTIONS_EXTENSION_VERSION'
          value: '~4'
        }
        {
          name: 'FUNCTIONS_WORKER_RUNTIME'
          value: 'python'
        }
        {
          name: 'AzureWebJobsStorage'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTAZUREFILECONNECTIONSTRING'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTSHARE'
          value: 'app-${toLower(name)}-functionservice-${toLower(environmentType)}a6e9'
        }
        {
          name: 'APPINSIGHTS_INSTRUMENTATIONKEY'
          value: applicationInsightsfunctionApp.properties.InstrumentationKey
        }
        {
          name: 'ApplicationInsightsAgent_EXTENSION_VERSION'
          value: '~2'
        }
        {
          name: 'APPLICATIONINSIGHTS_CONNECTION_STRING'
          value: applicationInsightsfunctionApp.properties.ConnectionString
        }
        {
          name: 'ENV'
          value: toUpper(environmentType)
        }
      ]
    }
  }

  /// VNET integration so flows can access storage and queue accounts ///
  resource vnetIntegration 'networkConfig@2022-03-01' = {
    name: 'virtualNetwork'
    properties: {
      subnetResourceId: vnetIntegrationSubnetId
      swiftSupported: true
    }
  }
}

/// Outputs for creating access policies ///
output functionAppName string = sitefunctionApp.name
output functionAppManagedIdentityId string = sitefunctionApp.identity.principalId
The outputs are used for granting permissions to blob/queue storage and some Key Vault stuff. This code is a single module, called from a main.bicep file and deployed via an Azure DevOps pipeline.
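For reference, the module is consumed roughly like this from main.bicep (a minimal sketch; the module path, parameter values and output usage are assumptions for illustration, not the actual main.bicep):

// main.bicep (hypothetical excerpt)
param location string = resourceGroup().location
param environmentType string = 'dev'

module functionApp 'modules/function-app.bicep' = {
  name: 'functionAppDeployment'
  params: {
    environmentType: environmentType
    location: location
    storageAccountSku: 'Standard_LRS'
    vnetIntegrationSubnetId: '<subnet-resource-id>'
    kvName: '<key-vault-name>'
  }
}

// The outputs can then feed access policies / role assignments in the same file
output deployedFunctionAppName string = functionApp.outputs.functionAppName
output deployedFunctionAppPrincipalId string = functionApp.outputs.functionAppManagedIdentityId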
I have a second repository containing some functions, which I also deploy via Azure Pipelines. It holds three .yaml files for deployment: two templates (CI and CD) and one main pipeline called azure-pipelines.yml that pulls it all together:
functions-ci.yml:
parameters:
- name: environment
  type: string

jobs:
- job:
  displayName: 'Publish the function as .zip'
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(pythonVersion)'
    displayName: 'Use Python $(pythonVersion)'
  - task: CopyFiles@2
    displayName: 'Create project folder'
    inputs:
      SourceFolder: '$(System.DefaultWorkingDirectory)'
      Contents: |
        **
      TargetFolder: '$(Build.ArtifactStagingDirectory)'
  - task: Bash@3
    displayName: 'Install requirements for running function'
    inputs:
      targetType: 'inline'
      script: |
        python3 -m pip install --upgrade pip
        pip install setup
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: '$(Build.ArtifactStagingDirectory)'
  - task: ArchiveFiles@2
    displayName: 'Create project zip'
    inputs:
      rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
      includeRootFolder: false
      archiveType: 'zip'
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
      replaceExistingArchive: true
  - task: PublishPipelineArtifact@1
    displayName: 'Publish project zip artifact'
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'functions$(environment)'
      publishLocation: 'pipeline'
functions-cd.yml:
parameters:
- name: environment
  type: string
- name: azureServiceConnection
  type: string

jobs:
- job: worfklowsDeploy
  displayName: 'Deploy the functions'
  steps:
  # Download created artifacts, containing the zipped function codes
  - task: DownloadPipelineArtifact@2
    inputs:
      buildType: 'current'
      artifactName: 'functions$(environment)'
      targetPath: '$(Build.ArtifactStagingDirectory)'
  # Zip deploy the functions code
  - task: AzureFunctionApp@1
    inputs:
      azureSubscription: $(azureServiceConnection)
      appType: functionAppLinux
      appName: function-app-nlp-$(environment)
      package: $(Build.ArtifactStagingDirectory)/**/*.zip
      deploymentMethod: 'zipDeploy'
They are pulled together in azure-pipelines.yml:
trigger:
  branches:
    include:
    - develop
    - main

pool:
  name: "Hosted Ubuntu 1804"

variables:
  ${{ if notIn(variables['Build.SourceBranchName'], 'main') }}:
    environment: dev
    azureServiceConnection: SC-NLPDT
  ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    environment: prd
    azureServiceConnection: SC-NLPPRD
  pythonVersion: '3.9'

stages:
# Builds the functions as .zip
- stage: functions_ci
  displayName: 'Functions CI'
  jobs:
  - template: ./templates/functions-ci.yml
    parameters:
      environment: $(environment)

# Deploys .zip workflows
- stage: functions_cd
  displayName: 'Functions CD'
  jobs:
  - template: ./templates/functions-cd.yml
    parameters:
      environment: $(environment)
      azureServiceConnection: $(azureServiceConnection)
So this successfully deploys my function app the first time around when I have also deployed the infra code. The imports are done well, the right function app is deployed, and the code runs when I trigger it.
But when I redeploy the infra (Bicep) code, all of a sudden the newest version of the functions is gone and has been replaced by a previous version.
Also, running this previous version doesn't work anymore, since all the requirements that were installed in the pipeline (CI part) via pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt suddenly cannot be found, giving import errors (e.g. Result: Failure Exception: ModuleNotFoundError: No module named 'azure.identity'). Mind you, this version previously worked just fine.
This is a big problem for me, since I need to be able to update some infra stuff (like adding an app setting) without breaking the current deployment of the functions.
I had thought about just redeploying the functions automatically after an infra update, but then I would still lose the previous invocations, which I need to be able to see.
Am I missing something in the above code? I cannot figure out what is going wrong here that causes my functions to change on an infra deployment.
Looking at the documentation:
To enable your function app to run from a package, add a WEBSITE_RUN_FROM_PACKAGE setting to your function app settings.
1 Indicates that the function app runs from a local package file deployed in the d:\home\data\SitePackages (Windows) or /home/data/SitePackages (Linux) folder of your function app.
In your case, when you deploy your function app code using AzureFunctionApp@1 and zipDeploy, this automatically adds that app setting to your function app. When you redeploy your infrastructure, the setting is removed and the function app host no longer knows where to find the code.
If you add this app setting in your Bicep file, this should work:
{
  name: 'WEBSITE_RUN_FROM_PACKAGE'
  value: '1'
}
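In the module above, that means appending one more entry to the existing appSettings array. A minimal sketch (only the new entry is added; the existing settings stay as they are):

appSettings: [
  // ... existing settings from the module above ...
  {
    name: 'WEBSITE_RUN_FROM_PACKAGE'
    value: '1'
  }
]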
A stage is getting skipped even though all the dependencies and conditions seem to be in place.
Here are the requirements for the pipeline:
stage 'NextTests' should run after stage 'FakeTests' has completed
stage 'NextTests' should not depend on the 'FakeTests' stage result (even if 'FakeTests' fails, 'NextTests' should run)
stage 'NextTests' should not run if the 'ToSkip' stage fails
stage 'NextTests' should run if the 'ToSkip' stage is skipped.
Here is the YAML:
stages:
- stage: ToSkip
  jobs:
  - job: Skip
    steps:
    - powershell: |
        Write-Host "Something"

- stage: FakeTests
  jobs:
  - job: PassTest
    steps:
    - powershell: |
        Write-Host "Test Passed"
  - job: FailTest
    steps:
    - powershell: |
        Write-Host "Test Failed"
        exit 1

- stage: NextTests
  dependsOn:
  - FakeTests
  - ToSkip
  condition: |
    and(
      in(stageDependencies.FakeTests.result, 'Succeeded', 'SucceededWithIssues', 'Skipped', 'Failed'),
      in(stageDependencies.ToSkip.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
    )
  jobs:
  - job: NextPassTest
    steps:
    - powershell: |
        Write-Host "Test Passed"
This YAML is used as an Azure DevOps pipeline.
However, if 'ToSkip' is skipped and 'FakeTests' fails, 'NextTests' is skipped as well.
Why is that? And how can I correct it?
Looking at the stageDependencies JSON structure, there is no key called 'result' directly under <STAGE_NAME>. You should reference results through:
stageDependencies.<STAGE_NAME>.<JOB_NAME>.result
"stageDependencies": {
"<STAGE_NAME>": {
"<JOB_NAME>": {
"result": "Succeeded|SucceededWithIssues|Skipped|Failed|Canceled",
"outputs": {
"stepName.variableName": "value"
}
},
"...": {
//another job
}
},
"...": {
//anotherstage
}
}
Try this with your code
condition: |
  and(
    in(stageDependencies.FakeTests.PassTest.result, 'Succeeded', 'SucceededWithIssues', 'Skipped', 'Failed'),
    in(stageDependencies.FakeTests.FailTest.result, 'Succeeded', 'SucceededWithIssues', 'Skipped', 'Failed'),
    in(stageDependencies.ToSkip.Skip.result, 'Succeeded', 'SucceededWithIssues', 'Skipped')
  )
For more information, you might want to check out this page.
I am using a Linux agent, and I need to iterate over JSON input objects for each project.
I have the JSON file below, and it may contain more than 200 charts. I need to run build, lint and template for each chart and push it to a repository. I could do this using shell/Python, but I thought I would use Azure Pipelines YAML code.
{
  "helm": {
    "charts": [
      {
        "project1": {
          "secretName": "mysecret1",
          "setValues": ["a", "b", "c"]
        }
      },
      {
        "project2": {
          "secretName": "mysecret2",
          "setValues": ["x", "y", "z"]
        }
      }
    ]
  }
}
azure-pipelines.yaml:
trigger:
- '*'

variables:
  buildConfiguration: 'Release'
  releaseBranchName: 'dev'

stages:
- stage: 'Build'
  pool:
    name: 'linux'
  displayName: 'Build helm Projects'
  jobs:
  - job: 'buildProjects'
    displayName: 'Building all the helm projects'
    steps:
    - task: HelmInstaller@0
      displayName: install helm
      inputs:
        helmVersion: 'latest'
        installKubectl: false
    - script: chmod -R 755 $(Build.SourcesDirectory)/
      displayName: 'Set Directory permissions'
    - task: PythonScript@0
      inputs:
        scriptSource: inline
        script: |
          import argparse, json, sys
          parser = argparse.ArgumentParser()
          parser.add_argument("--filePath", help="Provide the json file path")
          args = parser.parse_args()
          with open(args.filePath, 'r') as f:
              data = json.load(f)
          data = json.dumps(data)
          print('##vso[task.setvariable variable=helmConfigs;]%s' % (data))
        arguments: --filePath $(Build.SourcesDirectory)/build/helm/helmConfig.json
        failOnStderr: true
      displayName: setting up helm configs
    - template: helmBuild.yml
      parameters:
        helmCharts: $(HELMCONFIGS)
The JSON input is saved to the HELMCONFIGS variable in Azure Pipelines. As per the Azure documentation, it is a string and cannot be converted to any other type, such as an array.
helmBuild.yml file:
parameters:
- name: helmCharts
  type: object
  default: {}

steps:
- ${{ each helm in parameters.helmCharts }}:
  - ${{ each charts in helm }}:
    - ${{ each chart in charts }}:
      - task: AzureKeyVault@1
        inputs:
          azureSubscription: 'A-B-C'
          KeyVaultName: chart.secretName
          SecretsFilter: '*'
          RunAsPreJob: true
I am not able to access chart.secretName. How can I access the secretName input?
I'm using Azure Pipelines as a part of Azure DevOps.
I'm trying to define variables in my template file, because I need to use the same value multiple times.
This is my stage-template.yml:
parameters:
- name: param1
  type: string
- name: param2
  type: string

variables:
  var1: path/${{ parameters.param2 }}/to-my-file.yml

stages:
- stage: Deploy${{ parameters.param2 }}
  displayName: Deploy ${{ parameters.param1 }}
  jobs:
    ...
When trying to use this pipeline, I get an error message:
/stage-template.yml (Line: 7, Col: 1): Unexpected value 'variables'
Why is this not working? What did I do wrong?
You can't have variables in a template that is included as a stage, job or step template (i.e. included below a stages, jobs or steps element in a pipeline). You can only use variables in a variable template.
The documentation sadly is not really clear about that.
Including a stage template
# pipeline-using-stage-template.yml
stages:
- stage: stage1
  [...]

# stage template reference, no 'variables' element allowed in stage-template.yml
- template: stage-template.yml
Including a variable template
# pipeline-using-var-template.yml
variables:
# variable template reference, only variables allowed inside the template
- template: variables.yml

steps:
- script: echo A step.
If you are using a template to include variables in a pipeline, the included template can only be used to define variables.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#variable-reuse
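For completeness, the referenced variables.yml then contains only variable definitions, roughly like this (the names and values here are made up for illustration):

# variables.yml (hypothetical contents)
variables:
- name: var1
  value: path/some-dir/to-my-file.yml
- name: var2
  value: another-value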
You can't have parameters in the pipeline, only in the template references:
name: string # build numbering format
resources:
  pipelines: [ pipelineResource ]
  containers: [ containerResource ]
  repositories: [ repositoryResource ]
variables: # several syntaxes, see specific section
trigger: trigger
pr: pr
stages: [ stage | templateReference ]
If you want to use variables in templates, you have to use the proper syntax:
parameters:
- name: param1
  type: string
- name: param2
  type: string

stages:
- stage: Deploy${{ parameters.param2 }}
  displayName: Deploy ${{ parameters.param1 }}
  variables:
    var1: path/${{ parameters.param2 }}/to-my-file.yml
  jobs:
    ...
This works for me:
In your parent yaml:
stages:
- stage: stage1
  displayName: 'stage from parent'
  jobs:
  - template: template1.yml
    parameters:
      param1: 'somevalueforparam1'
inside template1:
parameters:
  param1: ''
  param2: ''

jobs:
- job: job1
  workspace:
    clean: all
  displayName: 'Install job'
  pool:
    name: 'yourpool'
  variables:
  - name: var1
    value: 'value1'
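A step in that job can then use the variable as usual; for illustration (this step is not part of the original answer):

  steps:
  - script: echo $(var1)
    displayName: 'Print var1'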