How can an Azure DevOps task use the output of a previous Azure DevOps task?

When executing a DevOps Release Pipeline:
While currently executing Task 2, I would like to reference the output from Task 1. (Task 1 retrieves JSON from a REST API call.)
They are both in the same Agentless Job (they are both REST API calls). Basically, I'm trying to "chain" them together and/or pass the JSON output from Task 1 to Task 2.
I have seen documentation on variables, but I am not sure if I can set variables via Task 1's output, or, even better, whether there is something built-in, like ${PreviousStep.Output}.
Is this even possible?

We could create a temporary file in the pipeline and save the response body to the file, then read the file in the next PowerShell task.
In my sample, I get the release pipeline definition in the first PowerShell task and save the output to a temporary file, then read it in the next PowerShell task.
Sample pipeline definition:
trigger:
- none

pool:
  vmImage: 'windows-latest'

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      New-Item $(Build.SourcesDirectory)\test1.txt
      $outfile = "$(Build.SourcesDirectory)\test1.txt"
      Write-Host $outfile
      $url = "https://vsrm.dev.azure.com/{Org name}/{Project name}/_apis/release/definitions/{Definition ID}?api-version=6.0-preview.4"
      Write-Host "URL: $url"
      $connectionToken = "{PAT}"
      $base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
      $pipelineInfo = Invoke-RestMethod -Uri $url -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get
      # Convert the parsed response back to JSON so the file contains real JSON
      $pipelineInfo | ConvertTo-Json -Depth 100 | Out-File -FilePath $outfile
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $json = Get-Content -Path "$(Build.SourcesDirectory)\test1.txt"
      Write-Host $json
Result: the second task reads the file and prints the saved JSON to the pipeline log.
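If the payload is small enough to fit in a variable, an alternative to the temporary file is an output variable set with the ##vso[task.setvariable] logging command. Below is a minimal sketch, assuming agent-based PowerShell tasks; the step name produceJson, the variable name restResponse, and the URL are illustrative:

steps:
- task: PowerShell@2
  name: produceJson
  inputs:
    targetType: 'inline'
    script: |
      # Illustrative REST call; substitute your own URL and auth
      $response = Invoke-RestMethod -Uri "https://example.com/api/items" -Method Get
      # Compress to one line: a logging command value cannot contain newlines
      $json = $response | ConvertTo-Json -Depth 100 -Compress
      Write-Host "##vso[task.setvariable variable=restResponse;isOutput=true]$json"
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Reference the previous step's output variable via its step name
      $json = '$(produceJson.restResponse)'
      Write-Host $json

Logging commands need an agent to process them, so this variant requires an agent job rather than an agentless one.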

Related

How to write a foreach loop on a build variable in an Azure DevOps PowerShell@2 task?

I have an Azure pipeline that maintains a variable holding project names - let's assume parameters.projects holds projectA, projectB, projectC.
I wish to execute a foreach loop and perform an operation on every project.
I currently use:
- ${{ each Project in parameters.projects }}:
  - task: PowerShell@2
    displayName: "operation on [${{ Project }}]."
    inputs:
      targetType: 'inline'
      workingDirectory: $(System.DefaultWorkingDirectory)
      script: |
        New-Item -Path '$(Build.ArtifactStagingDirectory)\${{ Project }}' -ItemType Directory
        ....
In the above example the foreach argument (iteration value) is Azure's compile-time expansion, which means it will spawn a separate task for each project in the pipeline. This works, but it is rather slow.
I wish to run something like:
- task: PowerShell@2
  displayName: "operation on [${{ Project }}]."
  inputs:
    targetType: 'inline'
    workingDirectory: $(System.DefaultWorkingDirectory)
    script: |
      foreach ($Project in ${{ parameters.projects }})
      {
        New-Item -Path '$(Build.ArtifactStagingDirectory)\${{ Project }}' -ItemType Directory
        ....
      }
But I'm not sure about the syntax, and couldn't find a useful explanation or examples.
What is the right syntax? A link to a page describing it would also be useful.
Runtime parameters let you have more control over what values can be passed to a pipeline. With runtime parameters you can:
Supply different values to scripts and tasks at runtime
Control parameter types, ranges allowed, and defaults
Dynamically select jobs and stages with template expressions
So your current script uses the right syntax for template expressions.
I tested object-type parameters; they cannot be expanded directly inside the PowerShell task.
You can instead use a string-type parameter and split the string for the loop.
YAML like:
parameters:
- name: projects
  type: string
  default: project1;project2

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Split the semicolon-separated parameter and loop over each project
      $prjs = "${{ parameters.projects }}"
      $projects = $prjs -Split ";"
      foreach ($project in $projects) {
        # Double quotes so PowerShell expands $project
        New-Item -Path "$(Build.ArtifactStagingDirectory)\$project" -ItemType Directory
      }
For more information, you can refer to the documentation on runtime parameters.
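If you would rather keep a real list, an object-type parameter can still reach the script by serializing it with the convertToJson expression and parsing it back in PowerShell. A sketch of that round-trip, added here as an assumption rather than part of the original answer:

parameters:
- name: projects
  type: object
  default:
  - projectA
  - projectB
  - projectC

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # convertToJson renders the object parameter as a JSON array literal
      $projects = '${{ convertToJson(parameters.projects) }}' | ConvertFrom-Json
      foreach ($project in $projects) {
        New-Item -Path "$(Build.ArtifactStagingDirectory)\$project" -ItemType Directory
      }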

Persist changes to AssemblyInfo.Version.cs between Gitlab CI/CD jobs

I am trying to append the GitLab pipeline's ID to my assembly version before a build, like so:
set_version_job:
  variables:
    GIT_CLEAN_FLAGS: none
  stage: build
  script:
    - '$content = Get-Content -Path "$env:SRC_FOLDER\Properties\AssemblyInfo.Version.cs" -Raw'
    - '$oldVersion = if($content -match "\d+\.\d+") { $matches[0] }'
    - '$newVersion = $oldVersion + ".$env:CI_PIPELINE_ID"'
    - '$content -replace $oldVersion,$newVersion | Set-Content -Path "$env:SRC_FOLDER\Properties\AssemblyInfo.Version.cs"'
This works great. The only problem is that I have separate build and publish jobs, and the changes I make to the AssemblyInfo.Version.cs file are gone once the pipeline reaches those jobs. It seems that GitLab is cleaning any changes between them.
Is there any way to persist those changes until the entire pipeline is done?
I tried using:
variables:
  GIT_CLEAN_FLAGS: none
but the result is the same.
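Each GitLab job starts from a fresh checkout, so local file edits do not survive into later jobs by themselves. One way to carry the modified file forward, sketched here as an assumption rather than a verified fix, is to declare it as a job artifact so that jobs in later stages download it automatically:

set_version_job:
  variables:
    GIT_CLEAN_FLAGS: none
  stage: build
  script:
    - '...'   # the version-rewriting script shown above
  artifacts:
    # Jobs in later stages download artifacts from earlier stages by
    # default, so the modified file overlays their fresh checkout.
    paths:
      - '$SRC_FOLDER/Properties/AssemblyInfo.Version.cs'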

Azure Pipelines, iterate through files

In an Azure build pipeline, I'm trying to iterate through all the files that have the extension .hex. The aim is to process every file the same way, without knowing in advance how many there are.
I receive an error from Azure:
Encountered error(s) while parsing pipeline YAML:
/azure-pipelines.yml (Line: 11, Col: 3): Unexpected symbol: '*'. Located at position 26 within expression: $(Build.SourcesDirectory)*.hex.
The pipeline has been reduced to the minimum possible:
trigger: none

pool:
  name: "LabelManagementPool"

steps:
- ${{ each filename in $(Build.SourcesDirectory)\*.hex }}:
  - script: echo ${{ filename }}
What am I missing here? Is what I want to achieve possible?
Unfortunately it is not possible to do it this way: when the pipeline kicks off, it is compiled into a structure with all stages, jobs, and steps defined up front. Since the file list does not exist yet at compile time, the expression cannot be evaluated.
The workaround is to use a scripting language, such as Python or PowerShell, to obtain the list of files that match the description, and then loop over each file and perform whatever action you want to run.
As an example, you could use the following in PowerShell to find all .hex files, and then write additional code to loop over them and perform your desired task.
# Obtain list of all files that are hex
$files = Get-ChildItem "C:\repos\python" -Filter *.hex -Recurse | % { $_.FullName }
# Loop over every file
foreach ($file in $files) {
  echo $file
}
The YAML would be this:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Obtain list of all files that are hex
      $files = Get-ChildItem "$(Build.SourcesDirectory)" -Filter *.hex -Recurse | % { $_.FullName }
      # Loop over every file
      foreach ($file in $files) {
        echo $file
      }
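If you need the loop to do real work rather than echo, the same pattern extends naturally. Here is a sketch where each file is copied to the staging directory; the per-file action is illustrative, not from the original answer:

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Find all hex files under the sources directory
      $files = Get-ChildItem "$(Build.SourcesDirectory)" -Filter *.hex -Recurse | % { $_.FullName }
      foreach ($file in $files) {
        # Illustrative per-file action: copy each file to the staging directory
        Copy-Item -Path $file -Destination "$(Build.ArtifactStagingDirectory)"
      }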

Referencing yaml variables in AzurePowerShellV5 inline script

I have an Azure DevOps YAML pipeline that looks (a bit) like this:
variables:
  MyVar: Test

steps:
- task: AzurePowerShell@5
  displayName: 'Test variables from yml file'
  inputs:
    azureSubscription: MyServiceConnection
    ScriptType: InLineScript
    InLine: |
      Write-Host "I really want to see the values from the variables in the yml file here"
      Write-Host "parameters from the yml file would be great too"
      Write-Host "But what would I write to do that?"
      Write-Host "$(MyVar) <- Nothing here"
      Write-Host "$(variables.MyVar) <- Nothing here"
      Write-Host "$DoesThisWork <- Nothing here"
      Write-Host "$OrThis <- Nothing here"
  env:
    DoesThisWork: $(MyVar)
    OrThis: $(variables.MyVar)
How do I use MyVar in the InLine script?
I stripped it down to the simplest possible case and it works just fine:
pool:
  vmImage: 'windows-latest'

variables:
  myVar: test value

steps:
- powershell: Write-Host "$(myVar)"
which prints test value.
I modified your example to remove the env block which did not compile, and remove the incorrect variable references:
pool:
  vmImage: 'windows-latest'

variables:
  myVar: test value

steps:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'My Service Connection Name'
    ScriptType: 'InlineScript'
    Inline: |
      Write-Host "I really want to see the values from the variables in the yml file here"
      Write-Host "parameters from the yml file would be great too"
      Write-Host "But what would I write to do that?"
      Write-Host "$(MyVar) <- Nothing here"
    azurePowerShellVersion: 'LatestVersion'
and got the variable's value printed as expected.
Your original script didn't get past the first reference because the syntax $(variables.MyVar) is invalid. The syntax works as follows (a side-by-side sketch follows the list):
Compile time (usable only in the file in which the variable is declared, not in nested files like templates): ${{ variables.MyVar }}
Runtime (before task execution): $(MyVar) - left as the literal "$(MyVar)" if the variable is not set
Runtime (designed for conditions, or where the default behaviour causes problems): $[variables.MyVar] - expands to an empty string if the variable is not set
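To see the working forms next to each other, here is a compact sketch; the CopyOfMyVar variable is added purely for illustration:

variables:
  MyVar: Test
  # Runtime expression: evaluated just before the job runs
  CopyOfMyVar: $[ variables.MyVar ]

steps:
- powershell: |
    # Template expression, resolved at compile time
    Write-Host "compile time: ${{ variables.MyVar }}"
    # Macro syntax, substituted just before the task executes
    Write-Host "runtime macro: $(MyVar)"
    # The runtime-expression variable defined above
    Write-Host "runtime expression: $(CopyOfMyVar)"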
I'm wondering if the lack of an azurePowerShellVersion input was part of your problem.
When you specify an env block, it creates the variables as environment variables.
In PowerShell, you reference environment variables with $env:VariableName. So in your case, $env:DoesThisWork.
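Applied to the question's own example, that would look something like this (a sketch reusing the question's names, not a tested pipeline):

variables:
  MyVar: Test

steps:
- task: AzurePowerShell@5
  inputs:
    azureSubscription: MyServiceConnection
    ScriptType: 'InlineScript'
    Inline: |
      # DoesThisWork is mapped in via the env block below
      Write-Host "$env:DoesThisWork <- the value of MyVar"
    azurePowerShellVersion: 'LatestVersion'
  env:
    DoesThisWork: $(MyVar)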

Release pipeline - share Artifacts between docker stages

I have the following Azure release pipeline:
Stage 1 runs a docker image that produces some results, say results1.json.
Stage 2 runs a docker image that produces some results, say results2.json.
Now, stage 3 (also a docker image) will wait for the first two stages to complete, and use both results1.json and results2.json to do something else.
Any suggestions would be appreciated.
You can add a PowerShell task that runs the ##vso[task.uploadfile] logging command in both stage 1 and stage 2 to upload the JSON files to the task log, which makes them available to download along with the task log.
You may first need to save the JSON files to a place on the agent, for example the folder $(System.DefaultWorkingDirectory).
On stage 1, run the script below in a PowerShell task:
echo "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)\results1.json"
On stage 2:
echo "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)\results2.json"
On stage 3, add a PowerShell task that calls the release logs REST API to get the logs and save them to a place on the agent (e.g. $(System.DefaultWorkingDirectory)):
GET https://{instance}/{collection}/{project}/_apis/release/releases/{releaseId}/logs?api-version=4.1-preview.2
Then use PowerShell to extract the results1.json and results2.json files from the downloaded logs.
Please refer to the full PowerShell script below:
$url = "https://vsrm.dev.azure.com/<Org>/<Proj>/_apis/release/releases/$(Release.ReleaseId)/logs?api-version=5.1-preview.2"
$filename = "$(System.DefaultWorkingDirectory)\filefinal.zip"
Invoke-RestMethod -Uri $url -Headers @{Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"} -Method Get -OutFile $filename

# extract results1.json and results2.json
$sourceFile = "$filename"
$file1 = "$(System.DefaultWorkingDirectory)\results1.json"
$file2 = "$(System.DefaultWorkingDirectory)\results2.json"
Add-Type -Assembly System.IO.Compression.FileSystem
$zip = [IO.Compression.ZipFile]::OpenRead($sourceFile)
$zip.Entries | where {$_.Name -match 'results1.json'} | foreach {[System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, $file1, $true)}
$zip.Entries | where {$_.Name -match 'results2.json'} | foreach {[System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, $file2, $true)}
$zip.Dispose()
If you encounter a not-authorized error in the PowerShell task on stage 3, check Allow scripts to access the OAuth token in the agent job options.
