Release pipeline - share Artifacts between docker stages - azure

I have the following Azure release pipeline:
Stage 1 runs a docker image that produces some results, say results1.json.
Stage 2 runs a docker image that produces some results, say results2.json.
Stage 3 (also a docker image) waits for the first two stages to complete, then uses both results1.json and results2.json to do something else.
Any suggestions would be appreciated.

You can add a PowerShell task in both stage 1 and stage 2 that runs the ##vso[task.uploadfile] logging command shown below to attach the JSON files to the task log, which can then be downloaded along with the log.
You may first need to save the JSON files somewhere on the agent, for example in the folder $(System.DefaultWorkingDirectory).
On stage 1, run the script below in the PowerShell task:
echo "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)\results1.json"
On stage 2:
echo "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)\results2.json"
On stage 3, add a PowerShell task that calls the release logs REST API to download the logs and save them somewhere on the agent (e.g. $(System.DefaultWorkingDirectory)):
GET https://{instance}/{collection}/{project}/_apis/release/releases/{releaseId}/logs?api-version=4.1-preview.2
Then use PowerShell to extract the results1.json and results2.json files from the downloaded logs. The full PowerShell script:
$url = "https://vsrm.dev.azure.com/<Org>/<Proj>/_apis/release/releases/$(Release.ReleaseId)/logs?api-version=5.1-preview.2"
$filename="$(System.DefaultWorkingDirectory)\filefinal.zip"
Invoke-RestMethod -Uri $url -Headers @{Authorization="Bearer $env:SYSTEM_ACCESSTOKEN"} -Method Get -OutFile $filename
# extract results1.json and results2.json
$sourceFile="$filename"
$file1= "$(System.DefaultWorkingDirectory)\results1.json"
$file2 = "$(System.DefaultWorkingDirectory)\results2.json"
Add-Type -Assembly System.IO.Compression.FileSystem
$zip = [IO.Compression.ZipFile]::OpenRead($sourceFile)
$zip.Entries | where {$_.Name -match 'results1.json'} | foreach {[System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, $file1, $true)}
$zip.Entries | where {$_.Name -match 'results2.json'} | foreach {[System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, $file2, $true)}
$zip.Dispose()
If you encounter a not-authorized error in the PowerShell task on stage 3, check Allow scripts to access the OAuth token in the agent job options.

Related

Persist changes to AssemblyInfo.Version.cs between Gitlab CI/CD jobs

I am trying to append my assembly version with the Gitlab pipeline's ID before a build like so:
set_version_job:
  variables:
    GIT_CLEAN_FLAGS: none
  stage: build
  script:
    - '$content = Get-Content -Path "$env:SRC_FOLDER\Properties\AssemblyInfo.Version.cs" -Raw'
    - '$oldVersion = if($content -match "\d+\.\d+") { $matches[0] }'
    - '$newVersion = $oldVersion + ".$env:CI_PIPELINE_ID"'
    - '$content -replace $oldVersion,$newVersion | Set-Content -Path "$env:SRC_FOLDER\Properties\AssemblyInfo.Version.cs"'
This works great. The only problem is that I have separate build and publish jobs, and the changes I make to the AssemblyInfo.Version.cs file are gone once the pipeline reaches those jobs. It seems that GitLab cleans any changes between them.
Is there any way to persist those changes until the entire pipeline is done?
Tried using
variables:
  GIT_CLEAN_FLAGS: none
but the result is the same.
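A common fix (an assumption on my part, not confirmed in this thread) is to declare the patched file as a job artifact in the job that modifies it; jobs in later stages then receive the artifact automatically instead of a freshly checked-out copy. The path below is hypothetical and should be adjusted to wherever AssemblyInfo.Version.cs actually lives:

```yaml
set_version_job:
  stage: build
  script:
    - '# version-patching script from above goes here'
  artifacts:
    paths:
      # Hypothetical path; adjust to the real location of the file
      - src/Properties/AssemblyInfo.Version.cs
```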

Azure Pipelines, iterate through files

In an Azure build pipeline, I'm trying to iterate through all the files that have the extension .hex. The aim is to use every file in the same way, without knowing how many there are.
I receive an error from Azure:
Encountered error(s) while parsing pipeline YAML:
/azure-pipelines.yml (Line: 11, Col: 3): Unexpected symbol: '*'. Located at position 26 within expression: $(Build.SourcesDirectory)*.hex.
The Pipeline has been reduced to the minimum possible :
trigger: none
pool:
  name: "LabelManagementPool"
steps:
- ${{ each filename in $(Build.SourcesDirectory)\*.hex}}:
  - script: echo ${{ filename }}
What am I missing here ? Is what I want to achieve possible ?
Unfortunately it is not possible to do it this way: when the pipeline kicks off, it is compiled into a structure with all stages, jobs, etc. defined. Since the files at this path don't exist yet at compile time, the expression can't be expanded.
The workaround is to use a scripting language, such as Python or PowerShell, to obtain the list of files that match the description, and then loop over each file and perform whatever action you want it to run.
As an example, you could use the following PowerShell to find all .hex files and loop over them, replacing the echo with your desired task.
# Obtain the list of all .hex files
$files = Get-ChildItem "C:\repos\python" -Filter *.hex -Recurse | ForEach-Object { $_.FullName }
# Loop over every file
foreach ($file in $files) {
    echo $file
}
The YAML would be:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Obtain the list of all .hex files
      $files = Get-ChildItem "$(Build.SourcesDirectory)" -Filter *.hex -Recurse | ForEach-Object { $_.FullName }
      # Loop over every file
      foreach ($file in $files) {
          echo $file
      }

Azure Pipeline Extract Task 7zip

Unable to extract a zip to the destination; default usage of the Extract Files task fails with this error:
##[error]Unable to locate executable file: 'C:\azagent\A5\_work\_tasks\ExtractFiles_5e1e3830-fbfb-11e5-aab1-090c92bc4988\1.200.0\7zip\7z.exe'. Please verify either the file path exists or the file can be found within a directory specified by the PATH environment variable. Also verify the file has a valid extension for an executable file.
It states that it fails to locate the default 7zip path. I tried a custom PATH setting, but it also fails with the same error.
UPDATE
The issue seems to be caused by the agent's permissions. I still haven't been able to execute the release with admin privileges in service mode; when run in interactive mode as admin, the release executes successfully.
The task fails whenever admin permission is required.
From the error message, 7zip does not seem to be installed on your self-hosted agent. Try installing 7zip before you use the Extract Files task.
On macOS, take the Bash task as an example:
brew install p7zip
For Windows, use the below PowerShell script to install:
$dlurl = 'https://7-zip.org/' + (Invoke-WebRequest -UseBasicParsing -Uri 'https://7-zip.org/' | Select-Object -ExpandProperty Links | Where-Object {($_.outerHTML -match 'Download')-and ($_.href -like "a/*") -and ($_.href -like "*-x64.exe")} | Select-Object -First 1 | Select-Object -ExpandProperty href)
# modified to work without IE
# above code from: https://perplexity.nl/windows-powershell/installing-or-updating-7-zip-using-powershell/
$installerPath = Join-Path $env:TEMP (Split-Path $dlurl -Leaf)
Invoke-WebRequest $dlurl -OutFile $installerPath
Start-Process -FilePath $installerPath -Args "/S" -Verb RunAs -Wait
Remove-Item $installerPath
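If 7-Zip really is missing from the agent, the install script above could run as a pipeline step ahead of the Extract Files task. A sketch (the guard path and task inputs are assumptions; verify them against your pipeline):

```yaml
steps:
- task: PowerShell@2
  displayName: 'Install 7-Zip if missing'
  inputs:
    targetType: 'inline'
    script: |
      if (-not (Test-Path 'C:\Program Files\7-Zip\7z.exe')) {
        # download/install script from above goes here
      }
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '**/*.zip'
    destinationFolder: '$(Build.ArtifactStagingDirectory)'
```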

How can an Azure DevOps task use the Output of a previous Azure DevOps task

When executing a DevOps Release Pipeline:
While executing Task 2, I would like to reference the output of Task 1. (Task 1 retrieves JSON from a REST API call.)
They are both in the same agentless job (they are both REST API calls). Basically, I'm trying to "chain" them together and/or pass the JSON output from Task 1 to Task 2.
I have seen documentation on variables, but I am not sure if I can set variables via Task 1's output, or better yet whether there is something built-in, like ${PreviousStep.Output}.
Is this even possible?
You could create a temporary file in the pipeline and save the response body to it, then read the file in the next PowerShell task.
In my sample, I get the release pipeline definition in the first PowerShell task and save the output to the temporary file, then read it in the next PowerShell task.
Sample pipeline definition
trigger:
- none

pool:
  vmImage: 'windows-latest'

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      New-Item $(Build.SourcesDirectory)\test1.txt
      $outfile = "$(Build.SourcesDirectory)\test1.txt"
      Write-Host $outfile
      $url = "https://vsrm.dev.azure.com/{Org name}/{Project name}/_apis/release/definitions/{Definition ID}?api-version=6.0-preview.4"
      Write-Host "URL: $url"
      $connectionToken = "{PAT}"
      $base64AuthInfo = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
      $pipelineInfo = Invoke-RestMethod -Uri $url -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get
      $pipelineInfo | Out-File -FilePath $outfile
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $json = Get-Content -Path $(Build.SourcesDirectory)\test1.txt
      Write-Host $json
Result: the second task prints the JSON that the first task saved.

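For small payloads, an alternative to the temporary file (a sketch of my own, not from the original answer; the JSON literal is a placeholder for the real REST response) is to pass the value between tasks with the task.setvariable logging command:

```yaml
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Placeholder for the real Invoke-RestMethod call
      $json = '{"example":"value"}'
      # Keep the value on one line so the logging command is not split
      Write-Host "##vso[task.setvariable variable=apiResponse]$json"
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # $(apiResponse) is expanded from the variable set by the previous task
      Write-Host '$(apiResponse)'
```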
Automatic deployment of solutions with PowerShell

I have a folder, containing several solutions for a SharePoint application, which I want to add and install. I want to iterate over the elements in the folder, then use the Add-SPSolution. After that I want to do a check if the solutions are done deploying, before using the Install-SPSolution. Here is a snippet that I am currently working on:
# Get the location of the folder you are currently in
$dir = $(gl)
# Create a list with the .wsp solutions
$list = Get-ChildItem $dir | where {$_.extension -eq ".wsp"}
Write-Host 'DEPLOYING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Add-SPSolution -LiteralPath $my_file.FullName}
Write-Host 'SLEEP FOR 30 SECONDS'
Start-Sleep -s 30
Write-Host 'INSTALLING SOLUTIONS...'
foreach($my_file in Get-ChildItem $list){Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment}
Is there a way to check if the deployment is finished, and it is ready to start installing the solutions?
You need to check the SPSolution.Deployed property value in a loop - basic solution looks like this:
do { Start-Sleep 2 } while (!((Get-SPSolution $name).Deployed))
The article Deploying SharePoint 2010 Solution Packages Using PowerShell contains more details, and a comment on it discusses a potential caveat.
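Putting the pieces together, a per-solution wait might look like this (an untested sketch assembled from the snippets above; Install-SPSolution starts an asynchronous timer job, so Deployed only flips to true once that job finishes):

```powershell
# Add every .wsp to the solution store first
foreach ($my_file in $list) {
    Add-SPSolution -LiteralPath $my_file.FullName
}
foreach ($my_file in $list) {
    Install-SPSolution -Identity $my_file.Name -AllWebApplications -GACDeployment
    # Block until the farm reports this solution as deployed
    do { Start-Sleep -Seconds 2 } while (-not ((Get-SPSolution $my_file.Name).Deployed))
}
```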
