VSTS SDK store and retrieve value that persists between phases - azure-pipelines-build-task

I want to write a task in VSTS that persists a value between phases. The purpose is so I could store a value that points to a record in an external system, and then retrieve and update that record in a later phase that always executes during a release failure. I tried writing an environment variable, but that does not persist:
write-output("##vso[task.setvariable variable=CRQID;]$changeid")
I see you can write an attachment (below), but I can't find any reference to a "get-attachment" cmdlet in the SDK:
write-output "##vso[task.addattachment type=Distributedtask.Core.Summary;name=Change Request;]$fileName"
I was referencing this document.
I thought I might be able to write the file to the file system, but then if the agents were pooled and my second phase executed on another agent the path would be worthless.

VSTS itself cannot persist values between phases, but you can achieve this by developing your own task.
And as you found, if you store the value on the build agent in the first phase, it cannot be found when the second phase runs on a different agent.
You just need to store the value somewhere that both phases (on different agents) can reach. For example, you can store the value in your GitHub repo with the commands below:
git clone https://github.com/username/reponame
# copy the file into the reponame folder (overwrite it if it already exists)
cd reponame
git add .
git commit -m 'store values in the file'
git push https://username:password@github.com/username/reponame master
If you want to use the value in another phase, then clone the github repo and get the value from the filename.
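The store-then-fetch round trip above can be sketched end to end. This is a minimal Python sketch, not the task author's actual code: the file name `crq_id.txt` and the value format are hypothetical, and a local bare repository stands in for the GitHub remote so the example is self-contained.

```python
import os
import subprocess
import tempfile

def run(*args, cwd=None):
    """Run a git command, raising on failure."""
    subprocess.run(args, cwd=cwd, check=True, capture_output=True)

def store_value(repo_url, filename, value, workdir):
    """Phase 1: clone the shared repo, write the value, commit and push."""
    clone = os.path.join(workdir, "store")
    run("git", "clone", repo_url, clone)
    with open(os.path.join(clone, filename), "w") as f:
        f.write(value)
    run("git", "add", ".", cwd=clone)
    run("git", "-c", "user.email=ci@example.com", "-c", "user.name=ci",
        "commit", "-m", "store value", cwd=clone)
    run("git", "push", "origin", "HEAD", cwd=clone)

def fetch_value(repo_url, filename, workdir):
    """Phase 2 (possibly on a different agent): clone again and read the value."""
    clone = os.path.join(workdir, "fetch")
    run("git", "clone", repo_url, clone)
    with open(os.path.join(clone, filename)) as f:
        return f.read()

# Demo: a local bare repo stands in for the GitHub remote.
with tempfile.TemporaryDirectory() as tmp:
    remote = os.path.join(tmp, "remote.git")
    run("git", "init", "--bare", remote)
    store_value(remote, "crq_id.txt", "CRQ000123", tmp)
    print(fetch_value(remote, "crq_id.txt", tmp))  # CRQ000123
```

Against a real GitHub remote you would pass the HTTPS clone URL (ideally with a token rather than an embedded password) as `repo_url`.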

Related

use output from previous pipeline run in current pipeline

I want to get some output from the same pipeline that was run previously.
Suppose I run my-pipeline, which outputs some hash. The next time I run this pipeline, I want to get the hash from the previous run of my-pipeline.
The reason is essentially conditional cache invalidation: sometimes I need to reuse the same generated hash, while at other times I want to generate a new hash that is then passed down through each pipeline run until it changes again.
To transfer output between different pipeline runs, the typical options are artifacts or storing the output somewhere that will not be deleted.
Based on your description, you can write the hash value to a file and push it to your repo from the pipeline; when you check the repo out in the next run, you can read the value back from the file. You can also add a tag to indicate whether the value has changed.
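The read-compare-update step inside the checked-out repo could look like the sketch below. The file name `build.hash` and the choice of which inputs to hash are assumptions for illustration; committing and pushing the file afterwards is what carries it to the next run.

```python
import hashlib
import os

HASH_FILE = "build.hash"  # hypothetical name; lives in the repo so it survives between runs

def current_inputs_hash(paths):
    """Hash the inputs whose changes should invalidate the cache."""
    h = hashlib.sha256()
    for p in sorted(paths):
        with open(p, "rb") as f:
            h.update(f.read())
    return h.hexdigest()

def reuse_or_update(paths):
    """Return (hash, changed). If the inputs changed, write the new hash
    so the next pipeline run (after commit/push) sees it."""
    new = current_inputs_hash(paths)
    old = None
    if os.path.exists(HASH_FILE):
        with open(HASH_FILE) as f:
            old = f.read().strip()
    if new != old:
        with open(HASH_FILE, "w") as f:
            f.write(new)
        return new, True
    return old, False
```

The pipeline job would then commit `build.hash` back to the repo only when `changed` is true, so unchanged runs reuse the previous value.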

How to access previous GitLab CI artifacts, process them and save file in "main" repository?

My application for this is to visualize the performance of my software. Therefore I briefly describe what I'm doing and where I'm stuck.
I have my source code in GitLab
Compile and run some tests in the CI for each commit
Measure the time it took for the test run and save it to a file
Upload the file with the time as an artifact
--------- From here on I don't know how to achieve it.
Run some new job that reads all timing files of the previous artifacts
Plot the times, probably with Python and save the image as SVG in the "main" repository
Show the image on the GitLab start page (README.md should probably include it)
Now I see which commits had which impact on my software's performance
No idea whether I'm asking for the impossible or not. I hope someone can help me, as I'm not a CI expert. Maybe a single expression is already enough to google the solution, but I don't even know how to formulate this.
Thanks everyone :)
Committing Images to Main
You can't just save an SVG image to the main repo from a pipeline job. You would need to make a commit. Not only would that pollute your git history and bulk up your repo, but it could also trigger a new pipeline, resulting in an endless loop.
There are ways around the endless loop, e.g. by controlling which sources/branches trigger pipelines or by prefixing the commit message with [skip ci], but it can get complicated and is probably not worth it. The truth is GitLab cannot do exactly what you want, so you will have to compromise somewhere.
Generate Metrics Graphs From Artifacts
You can collect metrics from past pipelines in a CSV file and save it as an artifact.
Add this to a reusable script called add_metrics.sh:
#!/bin/bash
HTTP_HEADER="PRIVATE-TOKEN: $YOUR_ACCESS_TOKEN"
URL_START="https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/jobs/artifacts"
URL_END="raw/<path/to/artifact>/metrics.csv?job=$CI_JOB_NAME"
COLUMN_HEADERS="Date,Time,Branch,Commit SHA,Test Time(s),Code Coverage (%)"
# download the latest artifact (--fail makes curl error on HTTP 404 instead of saving the error page)
if curl --fail --silent --location --header "$HTTP_HEADER" --output metrics.csv "$URL_START/$CI_COMMIT_BRANCH/$URL_END"
then echo "Feature branch artifact downloaded."
elif curl --fail --silent --location --header "$HTTP_HEADER" --output metrics.csv "$URL_START/master/$URL_END"
then echo "Master branch artifact downloaded."
else echo "$COLUMN_HEADERS" > metrics.csv
fi
# add data sample row to CSV
NOW_DATE=$(date +"%F")
NOW_TIME=$(date +"%T")
echo "$NOW_DATE,$NOW_TIME,$CI_COMMIT_BRANCH,$CI_COMMIT_SHA,$TEST_TIME,$CODE_COVERAGE" >> metrics.csv
# keep the header plus the last 49 data rows
echo "$(head -1 metrics.csv; tail -n +2 metrics.csv | tail -49)" > metrics.csv
Then call it from your pipeline in gitlab-ci.yml:
job_name:
  script:
    - TEST_TIME=10
    - CODE_COVERAGE=85
    - chmod +x add_metrics.sh
    - bash add_metrics.sh
  artifacts:
    paths:
      - metrics.csv
    expire_in: 1 month
Note: You will have to create a personal token and add it to a masked variable. I will also leave it up to you to populate the data metrics, like test time, code coverage, etc.
Explanation of Code
Download the latest artifact for the current branch.
The first commit of a feature branch will not find a "latest" artifact. If that happens, download the latest artifact from master.
The first time the script runs, master won't even have a "latest" artifact, so create a new CSV file.
APPEND the current sample to the end of the CSV file. Delete old samples to keep a fixed number of data points. You can add date, pipeline ID and other metrics.
Store the updated artifact.
To view the graph, download the artifact from the GitLab UI and view it in a spreadsheet app.
Publish to Pages
Using Python (pandas, matplotlib), you can generate an image of the plot and publish it to Gitlab Pages from your master branch pipeline. You can have a static HTML page in your repository referencing the same image filename, and keep replacing the same image from your pipeline. You can also add more useful metrics, such as code coverage.
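If you would rather avoid the pandas/matplotlib dependency, the same idea works with the standard library alone: parse the `metrics.csv` produced above and emit a bare-bones SVG polyline. This is a sketch, not production plotting code; the column name matches the CSV headers from the script above.

```python
import csv

def csv_to_svg(csv_path, svg_path, value_column="Test Time(s)", width=600, height=200):
    """Render one numeric CSV column as a minimal SVG line chart.
    Stdlib only; pandas/matplotlib would give axes, labels, and nicer output."""
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))
    values = [float(r[value_column]) for r in rows]
    top = max(values) or 1.0                      # avoid dividing by zero
    step = width / max(len(values) - 1, 1)        # x spacing between samples
    points = " ".join(
        f"{i * step:.1f},{height - v / top * height:.1f}"
        for i, v in enumerate(values)
    )
    svg = (
        f'<svg xmlns="http://www.w3.org/2000/svg" width="{width}" height="{height}">'
        f'<polyline fill="none" stroke="black" points="{points}"/></svg>'
    )
    with open(svg_path, "w") as f:
        f.write(svg)
```

A Pages job could run this after `add_metrics.sh` and copy the SVG into the `public/` directory next to a static HTML page that references it.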

How to get all commits in a repository along with the corresponding branch name?

I am faced with the problem of getting all commits in a repository along with the branch name for each commit ID. There is an endpoint that lists all commits (https://developer.atlassian.com/bitbucket/api/2/reference/resource/repositories/%7Bworkspace%7D/%7Brepo_slug%7D/commits), but it does not return the branch name with each commit ID. If I call the branches endpoint /2.0/repositories/{workspace}/{repo_slug}/refs/branches/{name}, I get only the latest commit, not all commits on the branch. To do any kind of mapping I would need to call each branch and then loop over each commit within that branch, which makes my code fail because I exceed the allowed number of requests. I need some solutions to tackle this problem.
I am writing a Python script that calls these two API endpoints in two loops and generates a list of lists from the results.
You can use the file history option provided by Bitbucket.
The branch name defaults to master unless you change it in your properties file.
https://developer.atlassian.com/bitbucket/api/2/reference/resource/repositories/%7Bworkspace%7D/%7Brepo_slug%7D/filehistory/%7Bcommit%7D/%7Bpath%7D
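Since no single endpoint returns a commit-to-branch mapping, one client-side approach is to fetch each branch's commit list once (using large pages to reduce request counts) and then build the mapping locally. The sketch below shows only that merge step over already-fetched data; the dict shapes are assumptions for illustration, not the actual Bitbucket response payload.

```python
from collections import defaultdict

def map_commits_to_branches(commits_by_branch):
    """commits_by_branch: {branch_name: [commit_hash, ...]}, already fetched
    (e.g. one paginated commits call per branch).
    Returns {commit_hash: [branch_name, ...]} so each commit lists the
    branches whose history contains it."""
    mapping = defaultdict(list)
    for branch, commits in commits_by_branch.items():
        for sha in commits:
            mapping[sha].append(branch)
    return dict(mapping)

# Example with made-up hashes: commits shared between branches map to both.
data = {
    "master": ["a1", "b2", "c3"],
    "feature/x": ["d4", "b2", "c3"],  # shares history with master
}
print(map_commits_to_branches(data)["b2"])  # ['master', 'feature/x']
```

Doing the join locally means the number of API calls scales with the number of branches and pages, not with branches times commits, which helps stay under the rate limit.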

Unable to use Custom Pipeline Variable for Release Name

I've created a powershell script that updates a Pipeline Variable during a Release Pipeline. It takes the custom variable and updates it with a new version using semantic versioning with every run.
I've tried to use this custom variable as the release name format, but it keeps giving me the error "Names of releases from this pipeline will not be unique. Use pre-defined variables to generate unique release names."
I've tried setting the variable to "Settable at Release" and putting the scope to "Release"
Does anybody perhaps know if there is a way to let the release pipeline know this is a dynamic variable that changes?
The only other option is to add the revision number to it $(versionnumber)$(rev:.r)
use Custom Pipeline Variable for Release Name
I don't think it is feasible to achieve this. The release name must be unique,
and the $(rev:r) token ensures that every completed build/release has a unique name by adding an incrementing number to each release. When a build/release completes, if nothing else in the number has changed, the Rev integer value is incremented by one. So, basically, we cannot achieve this without using $(rev:r), unless you can define a token with the same function as $(rev:r).
In addition, you can also use $(Build.BuildNumber) or $(Release.ReleaseId), which are also unique.
For a similar issue, please refer to this case.

TFS Build - Can I create a new branch and add the new mapping to my workspace dynamically?

As part of our CI, after each release we create a new branch and manually change the version number in our AssemblyInfo and Config files. This is prone to human error and we have decided to automate this process. So far I have a script that creates a new branch from our Main branch which I run before our build; the XAML has been modified with a number of activities that checks out all the AssemblyInfo and config files, updates the version numbers and checks the changes in.
What I want to do is make this two-stage process into a single process. The idea at the moment is to add the "TF Branch" script as an Invoke Process activity at the beginning of my template, before the "Initialize Workspace" sequence; this will create the new branch (say Branches\1.2.3.4). After the branch has been created I use a "TfWorkfold" activity placed just under "Create Workspace", in which I add the new mapping (ServerPath = $/TeamProject/Branches/1.2.3.4, LocalPath = SourcesDirectory). But when the process hits the "Get Workspace" activity, none of the source files from the new branch are added to the workspace.
When I run this on other builds the new mapping is successful, but only if the branch was created before the build was initialized.
Is there a step I'm missing? In my create-branch script I have a ping timer of 600 seconds to give the branching enough time to complete before the new workspace mapping is added.
Figured it out.
After the new branch is created, the build process needs to get the latest source files. In the build definition I need to tell it to get the latest version: in the "Process" section, under "Advanced", I set the "Get Version" value to "T" for "Get Latest". Done :)
