Changing Azure YAML Pipeline causes authorization to be lost to Resources - azure

I've had this happen to me a number of times where I have a working Azure Pipeline written in YAML. Then I change the pipeline and get the error that "There was a resource authorization issue." Typically I delete the pipeline, re-create it, and then it works. However, now it is not working and I continuously get the following error:
So, I click the little button, and it pops up saying, Resources have been Authorized. I attempt to run the pipeline again, and I get the same error.
I am an Account/Collection/Organization Administrator, and I originally created the Library Group, which is set to have "Access to all Pipelines" enabled. I've tried renaming the pipeline and re-creating it a few times, with the same error. Short of reverting the pipeline back to its original state, what should I do?
--EDIT--
Simply resetting the branch to an earlier version of the pipeline worked. Still no clue on why moving the steps into Stages and Jobs failed though.
--EDIT--
Below is the YAML I used originally, followed by the updated version. When the updated version gave resource authorization issues, I ran git log, took the commit ID of the previous commit that worked, and did a git reset $commitId. I pushed the reset branch back up to Azure DevOps, and then it just magically worked.
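Roughly the commands used for the reset (a minimal sketch, assuming the working branch is checked out locally and a history rewrite is allowed on that branch):

git log                      # find the commit id of the last working version of the pipeline
git reset $commitId          # move the branch pointer back to that commit (add --hard to also discard the edited YAML)
git push --force             # push the reset branch back up to Azure DevOps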
Original Azure Pipeline YAML:
---
trigger: none
variables:
- name: ProjectFolder
  value: tf-datafactory
- group: 'Deploy_Terraform_Library_Group'
pool:
  vmImage: 'ubuntu-latest'
steps:
- task: replacetokens@3
  displayName: Replace tokens
  inputs:
    targetFiles: '$(System.DefaultWorkingDirectory)/$(ProjectFolder)/variables.tf'
    encoding: 'auto'
    writeBOM: true
    verbosity: 'detailed'
    actionOnMissing: 'warn'
    keepToken: false
    tokenPrefix: '#{{'
    tokenSuffix: '}}#'
- task: AzureCLI@2
  displayName: Get the storage account key
  inputs:
    azureSubscription: '$(ARM.SubscriptionEndpoint)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      export ContainerAccessKeyExport=$(az storage account keys list \
        --resource-group $(StorageResourceGroupName) \
        --account-name $(StorageAccountName) \
        --query "[0].value")
      echo "##vso[task.setvariable variable=ContainerAccessKey]$ContainerAccessKeyExport"
...
I then moved these steps into stages and jobs.
---
parameters:
- name: TerraformAction
  displayName: 'Will Terraform Create or Destroy?'
  type: 'string'
  default: 'create'
  values:
  - 'create'
  - 'destroy'
trigger: none
pool:
  vmImage: 'ubuntu-latest'
stages:
- stage: 'Terraform'
  displayName: 'Terraform Stage'
  variables:
  - name: 'TerraformAction'
    value: '${{ parameters.TerraformAction }}'
  - name: ProjectFolder
    value: tf-datafactory
  jobs:
  - job: 'DeployTerraform'
    displayName: 'Terraform Deploy Data Factory'
    condition: eq(variables['TerraformAction'], 'create')
    variables:
    - group: 'Deploy_Terraform_Library_Group'
    steps:
    - task: replacetokens@3
      displayName: Replace tokens
      inputs:
        targetFiles: '$(System.DefaultWorkingDirectory)/$(ProjectFolder)/variables.tf'
        encoding: 'auto'
        writeBOM: true
        verbosity: 'detailed'
        actionOnMissing: 'warn'
        keepToken: false
        tokenPrefix: '#{{'
        tokenSuffix: '}}#'
    - task: AzureCLI@2
      displayName: Get the storage account key
      inputs:
        azureSubscription: '$(ARM.SubscriptionEndpoint)'
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          export ContainerAccessKeyExport=$(az storage account keys list \
            --resource-group $(StorageResourceGroupName) \
            --account-name $(StorageAccountName) \
            --query "[0].value")
          echo "##vso[task.setvariable variable=ContainerAccessKey]$ContainerAccessKeyExport"
...
--EDIT--
What I take from https://aka.ms/yamlauthz is that you either need to start with stages and jobs from the get-go, or you have to stick with the structure of the pipeline as it was originally created. Most people who let Azure DevOps create their initial pipeline won't know to use stages and jobs, because the pipeline generator doesn't do that for them and only starts with steps.

Related

Azure DevOps pipelines - how to execute a whole template stage in a docker container by a self-hosted agent

I have a multi-stage Azure DevOps pipeline in YAML. Each stage includes multiple jobs. The pipeline runs properly using Azure-hosted agents. I need to run it on our company's self-hosted agent. The self-hosted agent is a virtual machine scale set (VMSS) which is defined as an agent pool in DevOps; it has been tested and is working properly. Because the self-hosted agent does not have all the capabilities we need, our company's policy is to run the jobs inside Docker containers. The Docker images reside in our company's private Azure Container Registry.
My question is how can I run a whole template stage in a docker container? Please note that my question is not about executing a job but all the jobs in a stage template. This link explains how to execute jobs inside a container. It doesn't give any examples of how to run all the jobs in a stage in a container, especially when the stage is defined as a template.
My pipeline in simplified form is as follows:
## Name of self-hosted agent
pool: Platform-VMSS
resources:
  containers:
  - container: base
    image: myazurecontainerregistryname.azurecr.io/platform/myimg:1.0.0
    endpoint: 'serviceconnection-acr'
trigger:
  branches:
    include:
    - "feature/ORGUS-4810"
    exclude:
    - "main"
    - "release"
  paths:
    include:
    - "usecase-poc/**"
variables:
  pythonVersion: 3.8
  whlPackageName: py_core
  srcDirectory: usecase-poc/$(whlPackageName)
  ${{ if eq(variables['Build.SourceBranch'], 'refs/heads/main') }}:
    BuildFrom: main
    PackageName: $(whlPackageName)
    versionOption: custom
    deployToDevPipeline: True
    deployToQAPipeline: True
    deployToProdPipeline: True
  ${{ else }}:
    BuildFrom: branch_${{ lower(variables['Build.SourceBranchName']) }}
    PackageName: $(whlPackageName)_$(BuildFrom)
    versionOption: patch
    deployToDevPipeline: True
    deployToQAPipeline: False
    deployToProdPipeline: False
  stageName: 'deployToDev'
stages:
- stage: DownloadArtifact
  displayName: "Download python whl from artifactory"
  jobs:
  - job: DownloadArtifactJob
    steps:
    - checkout: self
    - task: UniversalPackages@0
      displayName: 'Download Artifact with Universal Packages'
      inputs:
        vstsFeed: 'some-value/00000000-0000-0009-0000-d24300000000'
        vstsFeedPackage: '00000000-0000-0000-0000-0000000'
        vstsPackageVersion: 0.8.4
        downloadDirectory: $(Build.ArtifactStagingDirectory)
    - task: Bash@3
      name: GetWheelName_Task
      inputs:
        targetType: 'inline'
        script: |
          echo $(Build.ArtifactStagingDirectory)
          find $(Build.ArtifactStagingDirectory) -name '*.whl'
          ArtifactName=$(find $(Build.ArtifactStagingDirectory) -name '*.whl')
          echo "Artifact name value is " $ArtifactName
          echo "##vso[task.setvariable variable=ArtifactName;isOutput=true]$ArtifactName"
      displayName: 'Get downloaded artifact in source directory'
    - task: PublishBuildArtifacts@1
      displayName: "Publish downloaded artifact to pipeline's output"
      inputs:
        pathToPublish: $(Build.ArtifactStagingDirectory)
        artifactName: whl
- stage: ConstructSharedVariablesForAllStages
  displayName: Construct Shared Variables For All Stages
  dependsOn: DownloadArtifact
  variables:
  - group: proj-shared-vg
  - name: ArtifactName
    value: $[stageDependencies.DownloadArtifact.DownloadArtifactJob.outputs['GetWheelName_Task.ArtifactName']]
  jobs:
  - job: DownloadArtifact
    container: base
    steps:
    - task: Bash@3
      displayName: "Print variable value"
      inputs:
        targetType: 'inline'
        script: |
          echo $(ArtifactName)
    - task: Bash@3
      name: extractWheelName
      displayName: Extract Wheel Name
      inputs:
        targetType: inline
        script: |
          echo $(ArtifactName) | awk -F"/" '{print $NF}'
          WhlName="py_core-0.8.4-py3-none-any.whl"
          echo "##vso[task.setvariable variable=WhlName;isOutput=true]$WhlName"
    - task: DownloadPipelineArtifact@2
      displayName: "Download artifact from previous stage"
      inputs:
        buildType: 'current'
        project: 'Project Name'
        buildVersionToDownload: 'latest'
        artifactName: whl
        targetPath: '$(System.ArtifactsDirectory)'
    - pwsh: |
        $whlFile = Get-ChildItem -Filter *.whl -Path "$(System.ArtifactsDirectory)" | ForEach-Object { $_.fullname } | Select-Object -First 1
        Write-Host "##vso[task.setvariable variable=whlFile]$whlFile"
      name: SetVars
      displayName: Get wheel name
## This is the section where my question is about. How can I make sure each stage runs in the self-hosted agent pool. The stage contains multiple jobs.
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToDevPipeline}}
    variableGroup: databricks-sp-vg-dev
    stageName: DeployToDev
    environment: DBRKS_Dev_WHL
    conditionParameter: deployToDev
    dependsOnStage: ConstructSharedVariablesForAllStages
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToQAPipeline}}
    variableGroup: databricks-sp-vg-qa
    stageName: DeployToQA
    environment: DBRKS_QA_WHL
    conditionParameter: deployToQA
    dependsOnStage: DeployToDev
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToProdPipeline}}
    variableGroup: databricks-sp-vg-prod
    stageName: DeployToProd
    environment: DBRKS_Prod_WHL
    conditionParameter: deployToProd
    dependsOnStage: DeployToQA
In the code above, under resources, the container resides in our Azure Container Registry (ACR); the endpoint is our DevOps service connection of type container registry, used to pull and push images to and from ACR. I have commented in the code above where the issue is. For the stage templates I am referring to, I would like to run them entirely inside the container I have defined as a resource at the beginning of the pipeline. The stage template has multiple jobs. Here is just a sample of the stage template when running, to emphasize that it has multiple jobs:
The highlighted stage is the one created by the template:
- template: ../templates/azure-pipeline-stage-template.yaml
  parameters:
    deploy: ${{variables.deployToDevPipeline}}
    variableGroup: databricks-sp-vg-dev
    stageName: DeployToDev
    environment: DBRKS_Dev_WHL
    conditionParameter: deployToDev
    dependsOnStage: ConstructSharedVariablesForAllStages
The question is how to enforce that all the jobs in the above template run in the Docker container defined as a resource in our pipeline. Thank you very much for your valuable input.
Add a container field at the job level as shown below. Then all the jobs in the template will run in the specified container.
pool:
  vmImage: ubuntu-latest
resources:
  containers:
    - container: testcontainer
      image: ubuntu
stages:
  - stage: template01
    displayName: tempate test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template01.yaml
  - stage: template02
    displayName: template test
    jobs:
      - job: template
        container: testcontainer
        steps:
          - template: templates/template02.yaml
Else, add a step target field to all the required tasks in a template, as described in Build and Release Tasks - Azure Pipelines | Microsoft Learn: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/tasks?view=azure-devops&tabs=yaml#step-target
resources:
  containers:
  - container: pycontainer
    image: python:3.11
steps:
- task: AnotherTask@1
  target: pycontainer
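If the jobs live inside the stage template itself rather than the root pipeline, another option is to pass the container resource name into the template as a parameter and set it on each job there. A minimal sketch, assuming a hypothetical containerResource parameter and placeholder job names (not taken from the original template):

# templates/azure-pipeline-stage-template.yaml (hypothetical shape)
parameters:
- name: stageName
  type: string
- name: containerResource   # name of a container resource defined in the root pipeline, e.g. 'base'
  type: string
  default: base

stages:
- stage: ${{ parameters.stageName }}
  jobs:
  - job: JobOne
    container: ${{ parameters.containerResource }}   # every job in the template opts into the container
    steps:
    - script: echo "runs inside the container"
  - job: JobTwo
    container: ${{ parameters.containerResource }}
    steps:
    - script: echo "also runs inside the container"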

Trying to add PSRule testing to Azure Devops pipeline. Pipeline is succeeding, but no tests are running

I've been working on integrating PSRule into an Azure DevOps pipeline. The pipeline is succeeding, but the following error is displayed for every file in my repository at the PSRule stage of the pipeline: Target object <FileName> has not been processed because no matching rules were found.
Below, I've included the code for my pipeline. Can anyone help me understand where I've gone wrong with implementing PSRule? Custom rules can be defined, but from my understanding, even without defining any, PSRule should run the 270 or so default rules associated with the module.
trigger:
  batch: true
  branches:
    include:
    - main
pool:
  vmImage: ubuntu-latest
stages:
- stage: Lint
  jobs:
  - job: LintCode
    displayName: Lint code
    steps:
    - script: |
        az bicep build --file $(MainBicepFile)
      name: LintBicepCode
      displayName: Run Bicep linter
- stage: PSRule
  jobs:
  - job: PSRuleRun
    displayName: PSRule Run
    steps:
    - task: ps-rule-assert@1
      displayName: Analyze Azure template files
      inputs:
        modules: 'PSRule.Rules.Azure'
- stage: Validate
  jobs:
  - job: ValidateBicepCode
    displayName: Validate Bicep code
    steps:
    - task: AzureCLI@2
      name: RunPreflightValidation
      displayName: Run preflight validation
      inputs:
        azureSubscription: $(ServiceConnectionName)
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          az deployment group validate \
            --resource-group $(ResourceGroupName) \
            --template-file $(MainBicepFile)
- stage: Preview
  jobs:
  - job: PreviewAzureChanges
    displayName: Preview Azure changes
    steps:
    - task: AzureCLI@2
      name: RunWhatIf
      displayName: Run what-if
      inputs:
        azureSubscription: $(ServiceConnectionName)
        scriptType: 'bash'
        scriptLocation: 'inlineScript'
        inlineScript: |
          az deployment group what-if \
            --resource-group $(ResourceGroupName) \
            --template-file $(MainBicepFile)
- stage: Deploy
  jobs:
  - deployment: DeployBicep
    displayName: Deploy Bicep
    environment: 'Test'
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: AzureCLI@2
            name: DeployBicepFile
            displayName: Deploy Bicep file
            inputs:
              azureSubscription: $(ServiceConnectionName)
              scriptType: 'bash'
              scriptLocation: 'inlineScript'
              inlineScript: |
                set -e
                deploymentOutput=$(az deployment group create \
                  --name $(Build.BuildNumber) \
                  --resource-group $(ResourceGroupName) \
                  --template-file $(MainBicepFile))
The warning is on by default but is not critical to the process once Bicep code has been expanded.
You can disable the warning, which shows by default, by setting the execution.notProcessedWarning option to false. You can do this in a number of ways; the easiest is to configure it in the ps-rule.yaml options file. See the docs for additional ways to set this.
Your pipeline configuration is fine; however, you need to enable expansion for Bicep or parameter files to be processed by the PSRule.Rules.Azure module.
Configuring expansion for Bicep code is done by setting the configuration.AZURE_BICEP_FILE_EXPANSION option to true. This is covered in Using Bicep source.
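Putting both options together, a minimal ps-rule.yaml at the repository root could look like the sketch below (only the two options named above; adjust to your setup):

# ps-rule.yaml
configuration:
  # expand Bicep files so PSRule.Rules.Azure can process the resulting resources
  AZURE_BICEP_FILE_EXPANSION: true
execution:
  # optionally silence the "has not been processed" warning
  notProcessedWarning: false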
Hope that helps.

Deploy code from Azure repo to Function app slot

I have set up an Azure DevOps repo and pipeline to deploy my code to an Azure Function App slot. I was able to deploy the app to my slot, but after making some code changes I tried to redeploy my app/code and now the pipeline fails. Probably my pipeline is somehow misconfigured (so it can only deploy if the slot is empty but cannot update or redeploy on top of existing code).
This is my pipeline
trigger:
- master
variables:
  # Azure Resource Manager connection created during pipeline creation
  azureSubscription: 'My Service Connector'
  # Function app name
  functionAppName: 'My Fuction app name'
  # Agent VM image name
  vmImageName: 'vs2017-win2016'
  # Working Directory
  workingDirectory: '$(System.DefaultWorkingDirectory)/'
stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - powershell: |
        if (Test-Path "extensions.csproj") {
          dotnet build extensions.csproj --output ./$(workingDirectory)/bin
        }
      displayName: 'Build extensions'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: $(workingDirectory)
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop
- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: $(functionAppName)
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureFunctionApp@1
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionApp
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
              deployToSlotOrASE: true
              resourceGroupName: 'my app resource group'
              slotName: testing
Error from pipeline:
FYI, I'm not a developer nor that familiar with tools like git, repos, pipelines, etc. Any tips on what I could do differently so that I can update my app code from Azure Repos + pipelines?
Try to set up a new ARM connection (Azure Resource Manager service connection) for use in the pipeline to see if it works.
If the issue still exists, you can enable debugging by setting the pipeline variable System.Debug to true, then queue a new run. From the debug logs, you may get more helpful information about the issue.
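For a YAML pipeline, System.Debug can also be set in the variables block instead of at queue time, for example:

variables:
  # verbose diagnostic logging for every step in the run
  system.debug: true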
Had to change vmImageName: 'vs2017-win2016' to vmImageName: 'windows-latest' to get this working.
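That is, in the variables block of the pipeline above:

variables:
  # Agent VM image name (the old vs2017-win2016 image has been retired)
  vmImageName: 'windows-latest'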

Create a pull request environment - Azure

I would want to create a pull request environment in Azure which gets deployed when a pull request is opened. Users can play around in that environment and find bugs. Once the bugs are fixed, and the PR is closed, I would want to delete that environment.
I am following along with Sam Learns Azure and I was able to get through most of the steps, but it's all .NET related.
Does anyone have an idea to do the same for a react-app?
I am also in favor of creating a new slot inside the app-service.
This is my modified code:
jobs:
- deployment: DeployWebServiceApp
  displayName: "Deploy webservice app"
  environment: ${{parameters.environment}}
  pool:
    vmImage: ubuntu-latest
  strategy:
    runOnce:
      deploy:
        steps:
        - task: DownloadPipelineArtifact@2
          displayName: 'Download Pipeline Artifacts'
          inputs:
            artifactName: "$(Build.BuildId)"
            buildType: 'current'
        - task: AzureCLI@2
          displayName: 'Deploy infrastructure with ARM templates'
          inputs:
            azureSubscription: "${{parameters.serviceConnection}}"
            scriptType: bash
            scriptLocation: inlineScript
            inlineScript: az webapp deployment slot create --name ui-dev-$(prLC)
              --resource-group rg-dev
              --slot $(prLC)
              --configuration-source app-dev
        - task: AzureRmWebAppDeployment@3
          displayName: 'Azure App Service Deploy: web service'
          inputs:
            azureSubscription: "${{parameters.serviceConnection}}"
            appName: "${{parameters.appServiceName}}"
            DeployToSlotFlag: true
            ResourceGroupName: '${{parameters.resourceGroupName}}'
            package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
            RemoveAdditionalFilesFlag: true
            TakeAppOfflineFlag: true
            RenameFilesFlag: true
Why not? You can do that with the steps below:
Create the build pipe which builds your react app.
In the build pipe, set a VSTS variable only when the run is a PR, e.g. varPR = x (a sketch of this step follows the answer).
Create a release pipe which deploys the app.
Add one more task at the end of the build pipe that calls the release via webhook, based on the condition varPR = x.
Ref - Trigger azure pipeline via webhook?
The link you have shared is also cool; it does the same job in a single build pipe.
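For the PR-only variable, here is a minimal sketch of a build-pipe step, assuming the built-in Build.Reason variable is used to detect a pull request run (varPR and its value x are just the example names from above):

steps:
- bash: |
    # mark this run as a PR build so a later task can decide whether to trigger the release
    echo "##vso[task.setvariable variable=varPR]x"
  displayName: Set varPR for pull request builds
  condition: eq(variables['Build.Reason'], 'PullRequest')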

unable to pass parameter from azure Devops yaml to PowerShell

parameters:
- name: AzureSubscription
  default: 'abc'
- name: BlobName
  type: string
  default: ""
stages:
- stage: MyStage
  displayName: 'My Stage'
  variables:
  - name: sas
  jobs:
  - job: ABC
    displayName: ABC
    steps:
    - task: AzureCLI@2
      displayName: 'XYZ'
      inputs:
        azureSubscription: ${{ parameters.AzureSubscription }}
        scriptType: pscore
        arguments:
        scriptLocation: inlineScript
        inlineScript: |
          $sas=az storage account generate-sas --account-key "mykey" --account-name "abc" --expiry (Get-Date).AddHours(100).ToString("yyyy-MM-dTH:mZ") --https-only --permissions rw --resource-types sco --services b
          Write-Host "My Token: " $sas
    - task: PowerShell@2
      inputs:
        targetType: 'filepath'
        filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
        arguments: >
          -Token "????"
          -BlobName "${{parameters.BlobName}}"
      displayName: 'some work'
In this Azure DevOps YAML, I have created two tasks: AzureCLI@2 and PowerShell@2.
In AzureCLI@2 I get a value in the $sas variable; Write-Host confirms that, but $sas does not get passed as a parameter to the PowerShell@2 script file.
"${{parameters.BlobName}}" is working fine; in PowerShell I am able to read that value.
How do I pass the sas variable's value?
Tried:
-Token $sas # did not work
-Token "${{sas}}" # did not work
Different tasks in Azure Pipeline don't share a common runspace that would allow them to preserve or pass on variables.
For this reason, Azure Pipelines offers special logging commands that let you take string output from a task and use it to update an Azure Pipelines variable that can be consumed by subsequent tasks: Set variables in scripts (Microsoft Docs).
In your case you would use a logging command like this to make your sas token available to the next task:
Write-Host "##vso[task.setvariable variable=sas]$sas"
In the argument of your subsequent task (within the same job) use the variable syntax of Azure Pipelines:
-Token '$(sas)'
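Putting it together, a trimmed sketch of the two tasks from the question with the logging command and the macro syntax applied (task names and inputs are unchanged from the question):

- task: AzureCLI@2
  displayName: 'XYZ'
  inputs:
    azureSubscription: ${{ parameters.AzureSubscription }}
    scriptType: pscore
    scriptLocation: inlineScript
    inlineScript: |
      $sas=az storage account generate-sas --account-key "mykey" --account-name "abc" --expiry (Get-Date).AddHours(100).ToString("yyyy-MM-dTH:mZ") --https-only --permissions rw --resource-types sco --services b
      # logging command: makes the token available to later tasks in the same job
      Write-Host "##vso[task.setvariable variable=sas]$sas"
- task: PowerShell@2
  displayName: 'some work'
  inputs:
    targetType: 'filepath'
    filePath: $(System.DefaultWorkingDirectory)/psscript.ps1
    arguments: >
      -Token '$(sas)'
      -BlobName "${{parameters.BlobName}}"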
