Persist changes to AssemblyInfo.Version.cs between GitLab CI/CD jobs

I am trying to append the GitLab pipeline ID to my assembly version before a build, like so:
set_version_job:
  variables:
    GIT_CLEAN_FLAGS: none
  stage: build
  script:
    - '$content = Get-Content -Path "$env:SRC_FOLDER\Properties\AssemblyInfo.Version.cs" -Raw'
    - '$oldVersion = if($content -match "\d+\.\d+") { $matches[0] }'
    - '$newVersion = $oldVersion + ".$env:CI_PIPELINE_ID"'
    - '$content -replace $oldVersion,$newVersion | Set-Content -Path "$env:SRC_FOLDER\Properties\AssemblyInfo.Version.cs"'
This works great. The only problem is that I have separate build and publish jobs, and the changes I make to the AssemblyInfo.Version.cs file are gone once the pipeline reaches those jobs. It seems that GitLab cleans out any changes between jobs.
Is there any way to persist those changes until the entire pipeline is done?
Tried using
variables:
  GIT_CLEAN_FLAGS: none
but the result is the same.
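One way this is commonly handled in GitLab CI (not shown in the question) is to declare the rewritten file as a job artifact; jobs in later stages download artifacts by default, so they receive the patched copy rather than a clean checkout. A minimal sketch under that assumption (the publish_job name and publish stage are illustrative, and SRC_FOLDER must be a path relative to the repository root for artifacts:paths to work):

set_version_job:
  stage: build
  variables:
    GIT_CLEAN_FLAGS: none
  script:
    - echo "rewrite AssemblyInfo.Version.cs here (the PowerShell from above)"
  artifacts:
    paths:
      # assumes SRC_FOLDER is relative to the repository root
      - $SRC_FOLDER/Properties/AssemblyInfo.Version.cs

publish_job:
  stage: publish
  # jobs in later stages download artifacts from earlier stages by default,
  # so this job sees the patched AssemblyInfo.Version.cs instead of a clean checkout
  script:
    - echo "build/publish using the patched version file"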

Related

Suppress gitlab CI stage if push only changes README.md

I have a CI build stage that runs whenever someone pushes to the repo, and it takes a long time to run. So I want to configure a .gitlab-ci.yml rule that says if the user is only updating the documentation in README.md, it doesn't need to execute the build stage.
Following the tips in How to exclude gitlab-ci.yml changes from triggering a job, it seems like the way to do this would be to add something like:
stages:
  - A

Stage A:
  stage: A
  rules:
    - changes:
        - "README.md"
      when: never
However, based on the documentation in https://docs.gitlab.com/ee/ci/yaml/#ruleschanges, I think this will also suppress the stage if the push contains multiple files, if one of them is README.md.
In that situation, I want the stage to run, not be suppressed. Is there a way to handle this distinction?
I use this syntax to not trigger pipelines when modifying *.md files
workflow:
  rules:
    - if: '$CI_COMMIT_BRANCH && $CI_COMMIT_BEFORE_SHA !~ /0{40}/'
      changes:
        - "{*[^.]md*,*.[^m]*,*.m,*.m[^d]*,*.md?*,*[^d]}"
    - if: '$CI_PIPELINE_SOURCE == "web"'
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
    - if: '$CI_PIPELINE_SOURCE == "pipeline"'
    - if: '$CI_COMMIT_TAG'
Following the idea from #j_b, I was able to solve this (with the caveat that the pipeline ends up marked as failed, as described in the code below). Here is the relevant code I added to my .gitlab-ci.yml.
stages:
  - Check Only Docs Changed
  - mybuild

# Stage that checks whether only documentation changed. If so, then we won't build a new release.
check-only-docs-changed:
  stage: Check Only Docs Changed
  script:
    - BRANCH=origin/$CI_COMMIT_BRANCH
    - echo "Checking commit to see if only docs have changed. "
    # The line below is from
    # https://stackoverflow.com/questions/424071/how-do-i-list-all-the-files-in-a-commit
    - GET_CMD="git diff-tree --no-commit-id --name-only -r $CI_COMMIT_SHA"
    - FILES_CHANGED=`$GET_CMD`
    - echo "Files in this commit:" $FILES_CHANGED
    - NUM_FILES_CHANGED=`$GET_CMD | wc -l`
    # We consider any file that ends in .md to be a doc file
    # The "|| true" trick on the line below is to deal with the fact that grep will exit with non-zero if it doesn't find any matches.
    # See https://stackoverflow.com/questions/42251386/the-return-code-from-grep-is-not-as-expected-on-linux
    - NUM_DOC_FILES_CHANGED=`$GET_CMD | grep -c ".*\.md$" || true`
    - echo $NUM_FILES_CHANGED "files changed," $NUM_DOC_FILES_CHANGED "of which were documentation."
    - |
      # We have to test whether NUM_FILES_CHANGED is > 0 because when one branch gets merged into another
      # it will be 0, as will NUM_DOC_FILES_CHANGED.
      if [[ $NUM_FILES_CHANGED -gt 0 && $NUM_FILES_CHANGED -eq $NUM_DOC_FILES_CHANGED ]]
      then
        DID_ONLY_DOCS_CHANGE="1"
        # Write out the env file before we exit. Otherwise, gitlab will complain that the doccheck.env artifact
        # didn't get generated.
        echo "DID_ONLY_DOCS_CHANGE=$DID_ONLY_DOCS_CHANGE" >> doccheck.env
        echo "Only documentation files have changed. Exiting in order to skip further stages of the pipeline."
        # Ideally it would be great to not have to exit with a non-zero code, because this will make the gitlab pipeline
        # look like it failed. However, there is currently no easy way to do this, as discussed in
        # https://stackoverflow.com/questions/67269109/how-do-i-exit-a-gitlab-pipeline-early-without-failure
        # The only way would be to use child pipelines, which is more effort than it's worth for this.
        # See https://stackoverflow.com/questions/67169660/dynamically-including-excluding-jobs-in-gitlab-pipeline and
        # https://stackoverflow.com/questions/71017961/add-gitlab-ci-job-to-pipeline-based-on-script-command-result
        exit 1
      else
        DID_ONLY_DOCS_CHANGE="0"
        echo "DID_ONLY_DOCS_CHANGE=$DID_ONLY_DOCS_CHANGE" >> doccheck.env
      fi
  # The section below makes the environment variable available to other jobs, but those jobs
  # unfortunately cannot access this environment variable in their "rules:" section to control
  # whether they execute or not.
  artifacts:
    reports:
      dotenv: doccheck.env
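For completeness, a job in a later stage can read the DID_ONLY_DOCS_CHANGE variable exported via the dotenv report in its script section (though, as the comment above notes, not in its rules:). A minimal sketch, reusing the mybuild stage name from above (the build-release job name and its body are illustrative):

build-release:
  stage: mybuild
  script:
    # DID_ONLY_DOCS_CHANGE is injected from the doccheck.env dotenv artifact
    - |
      if [ "$DID_ONLY_DOCS_CHANGE" = "1" ]; then
        echo "Only documentation changed; nothing to build."
      else
        echo "Run the full build here..."
      fi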

Azure Pipelines, iterate through files

In an Azure build pipeline, I'm trying to iterate through all the files that have the extension .hex. The aim is to process every file the same way, without knowing in advance how many there are.
I receive an error from Azure:
Encountered error(s) while parsing pipeline YAML:
/azure-pipelines.yml (Line: 11, Col: 3): Unexpected symbol: '*'. Located at position 26 within expression: $(Build.SourcesDirectory)*.hex.
The pipeline has been reduced to the minimum possible:
trigger: none
pool:
  name: "LabelManagementPool"
steps:
  - ${{ each filename in $(Build.SourcesDirectory)\*.hex}}:
    - script: echo ${{ filename }}
What am I missing here? Is what I want to achieve possible?
Unfortunately it is not possible to do it this way: when the pipeline kicks off, the YAML is compiled into a structure with all stages, jobs, etc. defined. Since the files at this path don't exist at compile time, the expression can't be evaluated.
The workaround is to use a scripting language, such as Python or PowerShell, to obtain the list of files that match the description, and then loop over each file and perform whatever action you want to run.
As an example, you could use the following PowerShell to find all .hex files; you would then add your own code inside the loop to perform the desired task on each file.
# Obtain list of all files that are hex
$files = Get-ChildItem "C:\repos\python" -Filter *.hex -Recurse | % { $_.FullName }
# Loop over every file
Foreach ($file in $files)
{echo $file}
The YAML would be:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Obtain list of all files that are hex
      $files = Get-ChildItem "$(Build.SourcesDirectory)" -Filter *.hex -Recurse | % { $_.FullName }
      # Loop over every file
      Foreach ($file in $files)
      {echo $file}

How can an Azure DevOps task use the Output of a previous Azure DevOps task

When executing a DevOps release pipeline:
While currently executing Task 2, I would like to reference the output from Task 1. (Task 1 retrieves JSON from a REST API call.)
They are both in the same agentless job (they are both REST API calls). Basically I'm trying to "chain" them together and/or pass the JSON output from Task 1 to Task 2.
I have seen documentation on variables, but I am not sure whether I can set variables via Task 1's output, or, even better, whether there is something built-in, like ${PreviousStep.Output}.
Is this even possible?
You could create a temporary file in the pipeline, save the response body to that file, and then read the file in the next PowerShell task.
In my sample, I get the release pipeline definition in the first PowerShell task and save the output to a temporary file, then read it in the next PowerShell task.
Sample pipeline definition:
trigger:
- none

pool:
  vmImage: 'windows-latest'

steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      New-Item $(Build.SourcesDirectory)\test1.txt
      $outfile = "$(Build.SourcesDirectory)\test1.txt"
      Write-Host $outfile
      $url = "https://vsrm.dev.azure.com/{Org name}/{Project name}/_apis/release/definitions/{Definition ID}?api-version=6.0-preview.4"
      Write-Host "URL: $url"
      $connectionToken="{PAT}"
      $base64AuthInfo= [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($connectionToken)"))
      $pipelineInfo = Invoke-RestMethod -Uri $url -Headers @{authorization = "Basic $base64AuthInfo"} -Method Get
      $pipelineInfo | Out-File -FilePath $outfile
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $json = Get-Content -Path $(Build.SourcesDirectory)\test1.txt
      Write-Host $json
Result: (screenshot omitted; the second task prints the JSON saved by the first task)
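As an aside, when the value to pass is small and single-line, the temporary file can be avoided by publishing it as an output variable and reading it in a later task of the same job as $(<stepName>.<variableName>). A minimal sketch (this requires an agent job rather than an agentless one, and the step and variable names here are made up for illustration):

steps:
- task: PowerShell@2
  name: callApi          # later steps reference outputs as $(callApi.<name>)
  inputs:
    targetType: 'inline'
    script: |
      # stand-in for the Invoke-RestMethod call; kept short for the sketch
      $response = '{"status":"ok"}'
      # publish the (single-line) value as an output variable
      Write-Host "##vso[task.setvariable variable=apiResponse;isOutput=true]$response"
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # read the output variable set by the previous step
      Write-Host "$(callApi.apiResponse)"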

Azure Pipeline File-based Trigger and Tags

Is it possible to make a build Pipeline with a file-based trigger?
Let's say I have the following Directory structure.
Microservices/
|_Service A
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
|_Service B
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
I want to have just one single YAML build pipeline file.
Based on the variables $(Project) and $(Stage), different builds are created.
Is it possible to check which directory/file initiated the trigger and set the variables accordingly?
Additionally, it would be great if it's possible to use those variables to tag the artifact after the run.
Thanks
KR
Is it possible to check what directory/file initiated the Trigger and
set the variables accordingly?
Of course yes, but there is no direct way, since there are no pre-defined variables that store this information, so you need some additional work to get it.
#1:
Although there is no variable that directly stores which folder and file were modified, you can get this by querying the changes of the commit Build.SourceVersion via the REST API:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.1
From its response body, you can read the path of each changed file.
Since the response body is JSON, you can use a JSON parser to extract the path values. See this similar script as a reference.
Then use a PowerShell script to set these values as pipeline variables that the next jobs/tasks can use.
Also, in your scenario, all of this should finish before the next jobs start, so you could consider creating a simple extension with a pipeline decorator. Define all of the above steps in the decorator so that they run in the pre-job of every pipeline.
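For illustration, a rough sketch of approach #1 as a single agent-based step (the Bearer-token auth via System.AccessToken, the use of only the first changed path, and the path-segment indexes are assumptions to be adapted to the repository layout above):

steps:
- task: PowerShell@2
  name: detectChange
  inputs:
    targetType: 'inline'
    script: |
      # Ask the Git REST API which files the triggering commit changed
      $url = "$(System.TeamFoundationCollectionUri)$(System.TeamProject)/_apis/git/repositories/$(Build.Repository.ID)/commits/$(Build.SourceVersion)/changes?api-version=5.1"
      $resp = Invoke-RestMethod -Uri $url -Headers @{ Authorization = "Bearer $(System.AccessToken)" } -Method Get
      # Take the first changed path, e.g. /Microservices/ServiceA/Test_Stage/... (assumes one relevant change)
      $parts = $resp.changes[0].item.path.TrimStart('/') -split '/'
      # Segment 1 is the service folder, segment 2 the stage folder in the layout above
      Write-Host "##vso[task.setvariable variable=Project;isOutput=true]$($parts[1])"
      Write-Host "##vso[task.setvariable variable=Stage;isOutput=true]$($parts[2])"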
#2
You may feel the method above is a little complex, so I would rather suggest making use of the commit message. For example, specify the project name and file name in the commit message, and read them with the variable Build.SourceVersionMessage.
Then use the PowerShell script (mentioned above) to set them as variables.
This is more convenient than calling the API and parsing the commit's changes.
Hope one of these helps.
Thanks for your reply.
I tried a different approach with a Bash script, because I only use Ubuntu images.
I run git log, filtering for the last commit that touched the Microservices directory.
With some awk (not a very satisfying solution) I extract the Project and Stage and write them into pipeline variables.
The pipeline only gets triggered when there is a change under the Microservices/ path:
trigger:
  batch: true
  branches:
    include:
      - master
  paths:
    include:
      - Microservices/*
The first job that runs when the trigger fires is the Dynamic_Variables job.
I only use this job to set the variables $(Project) and $(Stage). The build tags are also set from those variables, so I am able to differentiate the artifacts in the releases.
jobs:
  - job: Dynamic_Variables
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - checkout: self

      - task: Bash@3
        name: Dynamic_Var
        inputs:
          filePath: './scripts/multi-usage.sh'
          arguments: '$(Build.SourcesDirectory)'
        displayName: "Set Dynamic Variables Project"

      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: |
            set +e
            if [ -z $(Dynamic_Var.Dynamic_Project) ]; then
              echo "target Project not specified";
              exit 1;
            fi
            echo "Project is:" $(Dynamic_Var.Dynamic_Project)
        displayName: 'Verify that the Project parameter has been supplied to pipeline'

      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: |
            set +e
            if [ -z $(Dynamic_Var.Dynamic_Stage) ]; then
              echo "target Stage not specified";
              exit 1;
            fi
            echo "Stage is:" $(Dynamic_Var.Dynamic_Stage)
        displayName: 'Verify that the Stage parameter has been supplied to pipeline'
The Bash script I run in this job looks like this:
#!/usr/bin/env bash
set -euo pipefail
WORKING_DIRECTORY=${1}
cd ${WORKING_DIRECTORY}
CHANGEPATH="$(git log -1 --name-only --pretty='format:' -- Microservices/)"
Project=$(echo $CHANGEPATH | awk -F[/] '{print $2}')
CHANGEFILE=$(echo $CHANGEPATH | awk -F[/] '{print $4}')
Stage=$(echo $CHANGEFILE | awk -F[-] '{print $1}')
echo "##vso[task.setvariable variable=Dynamic_Project;isOutput=true]${Project}"
echo "##vso[task.setvariable variable=Dynamic_Stage;isOutput=true]${Stage}"
echo "##vso[build.addbuildtag]${Project}"
echo "##vso[build.addbuildtag]${Stage}"
If someone has a better solution than the awk commands, please let me know.
Thanks a lot.
KR
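Regarding the request above for something nicer than the chained awk calls, one possible bash-only variant is sketched below; like the original script, it assumes the last commit only touched a single path of the form Microservices/<Project>/.../<file>:

# Split the path on "/" in one read: field 2 -> Project, field 4 -> CHANGEFILE,
# then strip everything from the first "-" onward to get the Stage,
# mirroring the awk logic in the script above.
IFS=/ read -r _ Project _ CHANGEFILE _ <<< "$CHANGEPATH"
Stage=${CHANGEFILE%%-*}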

Release pipeline - share Artifacts between docker stages

I have the following Azure release pipeline:
Stage 1 runs a docker image, that produces some results, say results1.json
Stage 2 runs a docker image, that produces some results, say results2.json
Now, stage 3 (also docker image) will wait for first two stages to complete, and use both results1.json and results2.json files to do something else.
Any suggestions would be appreciated.
You can add a PowerShell task that runs the ##vso[task.uploadfile] command below in both stage 1 and stage 2 to upload the JSON files to the task log, which is available to download along with the log.
You may first need to save the JSON files to a location on the agent, for example the folder $(System.DefaultWorkingDirectory).
In stage 1, run the script below in the PowerShell task:
echo "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)\results1.json"
In stage 2:
echo "##vso[task.uploadfile]$(System.DefaultWorkingDirectory)\results2.json"
In stage 3:
Add a PowerShell task that calls the release logs REST API to get the logs and saves them to a location on the agent (e.g. $(System.DefaultWorkingDirectory)).
GET https://{instance}/{collection}/{project}/_apis/release/releases/{releaseId}/logs?api-version=4.1-preview.2
Then use PowerShell to extract the results1.json and results2.json files from the downloaded logs.
The full PowerShell script is below:
$url = "https://vsrm.dev.azure.com/<Org>/<Proj>/_apis/release/releases/$(Release.ReleaseId)/logs?api-version=5.1-preview.2"
$filename="$(System.DefaultWorkingDirectory)\filefinal.zip"
Invoke-RestMethod -Uri $url -Headers @{Authorization="Bearer $env:SYSTEM_ACCESSTOKEN"} -Method get -OutFile $filename
# extract results1.json and results2.json
$sourceFile="$filename"
$file1= "$(System.DefaultWorkingDirectory)\results1.json"
$file2 = "$(System.DefaultWorkingDirectory)\results2.json"
Add-Type -Assembly System.IO.Compression.FileSystem
$zip = [IO.Compression.ZipFile]::OpenRead($sourceFile)
$zip.Entries | where {$_.Name -match 'results1.json'} | foreach {[System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, $file1, $true)}
$zip.Entries | where {$_.Name -match 'results2.json'} | foreach {[System.IO.Compression.ZipFileExtensions]::ExtractToFile($_, $file2, $true)}
$zip.Dispose()
If you encounter a "not authorized" error in the PowerShell task in stage 3, check the "Allow scripts to access the OAuth token" option on the agent job.
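If the stages were defined in YAML rather than a classic release pipeline, the equivalent of that checkbox is mapping the token into the task explicitly, roughly like this (a sketch; only the env mapping is the point):

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # the script can now read the token that the REST call above relies on
      Write-Host "Token available: $([bool]$env:SYSTEM_ACCESSTOKEN)"
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)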
