GitLab CI: use an exported variable in another job

Is there any way to use an exported variable, defined in the generic before_script:

before_script:
  - export UPPERHASH=$(echo $CI_COMMIT_REF_SLUG | md5sum | tr '[a-z]' '[A-Z]')

in another job as a variable? I am going to use trigger, but a trigger job does not allow a script, e.g.:

test variables:
  stage: test-variables
  variables:
    UPPERHASH_TEST1: $UPPERHASH
  trigger:
    project: "...\..."

I have tried multiple options, but none of them works.

It will not work this way, because the variables section of "test variables" is processed before before_script runs.
You can only refer to this variable in a script:

test variables:
  stage: test-variables
  script:
    - UPPERHASH_TEST1=$UPPERHASH
    - ... trigger the other project from the command line ...

Read here on how to trigger another project from the command line:
https://docs.gitlab.com/ee/ci/triggers/README.html
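For illustration, a minimal sketch of that command-line trigger using GitLab's pipeline triggers API; TRIGGER_TOKEN (a trigger token stored as a CI/CD variable) and the target project ID are placeholders you would replace with your own values:

test variables:
  stage: test-variables
  script:
    - UPPERHASH_TEST1=$UPPERHASH
    # POST to the downstream project's trigger endpoint, passing the
    # computed value along as a pipeline variable
    - >
      curl --request POST
      --form "token=$TRIGGER_TOKEN"
      --form "ref=main"
      --form "variables[UPPERHASH_TEST1]=$UPPERHASH_TEST1"
      "https://gitlab.com/api/v4/projects/<target-project-id>/trigger/pipeline"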

Related

whitelist some inherited variables (but not all) in GitLab multi-project pipeline

I'm following the GitLab docs for multi-project pipelines. I'm running on gitlab.com (not Enterprise/self-hosted).
I have successfully set up a multi-project pipeline. My question is: is there a way to pass some, but not all, variables between stages?
Here's a very simple build script for two projects:
Main project:

variables:
  THIS_PROJECT_NAME: trigger-source
  SHARED_ARGUMENT: "hello world!"

stages:
  - build
  - downstream

build-code-job:
  stage: build
  script:
    - echo "${THIS_PROJECT_NAME}"
    - echo "${SHARED_ARGUMENT}"

run-trigger-job:
  stage: downstream
  inherit:
    variables: false
  variables:
    SHARED_ARGUMENT: $SHARED_ARGUMENT
  trigger: my-org/triggers_dest
Triggered project:

variables:
  THIS_PROJECT_NAME: trigger-dest
  SHARED_ARGUMENT: "overwrite me"

stages:
  - test

triggered-job:
  stage: test
  script:
    - echo "${THIS_PROJECT_NAME}"
    - echo "${SHARED_ARGUMENT}"
  only:
    - pipelines
When I run this with inherit: variables: false, the output in the triggered project's builds just shows the default values (no variables are passed):
$ echo "${THIS_PROJECT_NAME}"
trigger-dest
$ echo "${SHARED_ARGUMENT}"
overwrite me
However, when I use inherit: variables: true, all variables are passed, except that the value of SHARED_ARGUMENT is actually written as the literal "$SHARED_ARGUMENT", which then gets expanded to "overwrite me":
$ echo "${THIS_PROJECT_NAME}"
trigger-source
$ echo "${SHARED_ARGUMENT}"
overwrite me
This is the opposite of what I want! Essentially I want to whitelist variables to pass through, rather than blacklisting them as above. Any way to do this?
Found the answer buried in the docs on the inherit: variables keyword. In addition to true/false, you can specify a list of variables to inherit.
Changing the source project's .gitlab-ci.yml to the following:
variables:
  THIS_PROJECT_NAME: trigger-source
  SHARED_ARGUMENT: "hello world!"

stages:
  - build
  - downstream

build-code-job:
  stage: build
  script:
    - echo "${THIS_PROJECT_NAME}"
    - echo "${SHARED_ARGUMENT}"

run-trigger-job:
  stage: downstream
  inherit:
    variables:
      - SHARED_ARGUMENT
  trigger: my-org/triggers_dest
results in the desired output:
$ echo "${THIS_PROJECT_NAME}"
trigger-dest
$ echo "${SHARED_ARGUMENT}"
hello world!
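As an aside, GitLab 15.1 and later also provide a trigger:forward keyword that controls which broad categories of variables flow into the downstream pipeline. It is not a per-variable whitelist like inherit:variables, but a rough sketch for comparison (same two projects assumed):

run-trigger-job:
  stage: downstream
  trigger:
    project: my-org/triggers_dest
    forward:
      # don't forward variables defined in this .gitlab-ci.yml
      yaml_variables: false
      # don't forward manually-set pipeline variables either
      pipeline_variables: false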

Why is GitLab CI/CD workflow:rules:if:variables failing to set variables?

stages:
  - test

# Default vars
variables:
  DEPLOY_VARIABLE: "dev-deploy"

workflow:
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
      variables:
        DEPLOY_VARIABLE: "master-deploy" # Override globally-defined DEPLOY_VARIABLE

my_project_test:
  stage: test
  script:
    - env | grep CI
    - echo $DEPLOY_VARIABLE # this always outputs dev-deploy
Running with gitlab-runner 14.10.1.
No matter whether I try that locally or on GitLab, that variable is never set.
Locally I run it with gitlab-runner exec shell my_project_test.
The output of env | grep CI is:
CI_SERVER_VERSION=
CI_RUNNER_EXECUTABLE_ARCH=darwin/amd64
CI_COMMIT_REF_NAME=master
CI_JOB_TOKEN=
CI_PROJECT_ID=0
CI_RUNNER_REVISION=f761588f
... etc
As per their documentation:
If a rule matches, when: always is the default, and when: never is the default if nothing matches.
I even tried if: '1 == 1' and so on.
gitlab-runner exec has several limitations and does not implement/consider many features of YAML definitions, including workflow:rules:[]variables in this case.
However, when run through gitlab.com or a self-hosted instance of GitLab, workflow:rules: will evaluate properly.
Keep in mind, there are a few cases where variables set elsewhere will take precedence over variables defined in the YAML, such as when variables are set in project, group, or instance settings.
The assignment works if you put the condition in your job:
my_project_test:
  stage: test
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
      variables:
        DEPLOY_VARIABLE: "master-deploy" # Override globally-defined DEPLOY_VARIABLE
  script:
    - env | grep CI
    - echo $DEPLOY_VARIABLE # outputs master-deploy when the rule matches
However, the variable is only in scope for the job whose rule matched; it won't overwrite the global value seen by other jobs.
If you really need to pass variables between jobs, see:
set up global variables dynamically in gitlab-ci
(the dotenv approach in the "how to pass variables between gitlab-ci jobs?" question below covers the same ground)

In Azure Pipelines, how to post a dynamic multi-line comment generated in a previous step using the GitHubComment task?

In an Azure Pipeline, the following will post a multi-line comment to a GitHub PR:
stages:
  - stage: MyStage
    jobs:
      - job: CommentOnPR
        steps:
          - task: GitHubComment@0
            displayName: Post comment to PR
            inputs:
              gitHubConnection: MyGitHubConnection
              repositoryName: $(build.repository.name)
              comment: |
                Here is a comment
                with multiple lines
The following will also post a multi-line comment:
variables:
  myComment: "Here is a comment\nwith multiple lines"

stages:
  - stage: MyStage
    jobs:
      - job: CommentOnPR
        steps:
          - task: GitHubComment@0
            displayName: Post comment to PR
            inputs:
              gitHubConnection: MyGitHubConnection
              repositoryName: $(build.repository.name)
              comment: $(myComment)
The multi-line comment I want to post is dynamically generated in a script, something like the following, located at scripts/my-script:

#!/bin/bash
output="# Changes
The following packages were updated:
"
for package_name in $(git diff --name-only origin/main packages/ | cut -d'/' -f2 | sort -u)
do
  output+="\n- $package_name"
done
export output
The output of this script then looks something like the following:
# Changes
The following packages were updated:
- some-package
- another-package
- etc.
(Note, my actual script is different, and is doing a few other things in addition to generating a multi-line string. I'm not asking about the contents of this script specifically.)
Given that setup, I would like my pipeline to run scripts/my-script and use its output in the GitHub PR comment. However, everything I've tried ends up with either just the first line as the comment, or a single-line comment where all the \ns are shown literally.
I tried this:
stages:
  - stage: MyStage
    jobs:
      - job: CommentOnPR
        steps:
          - bash: |
              source scripts/my-script
              echo "##vso[task.setvariable variable=myComment]$output"
          - task: GitHubComment@0
            displayName: Post comment to PR
            inputs:
              gitHubConnection: MyGitHubConnection
              repositoryName: $(build.repository.name)
              comment: $(myComment)
But the comment in GitHub was just the first line:
# Changes
I then tried changing scripts/my-script to only use \ns:

#!/bin/bash
output="# Changes\n\nThe following packages were updated:\n"
for package_name in $(git diff --name-only origin/main packages/ | cut -d'/' -f2 | sort -u)
do
  output+="\n- $package_name"
done
export output
The comment in GitHub was all a single line:
# Changes\n\nThe following packages were updated:\n\n- some-package\n- another-package \n- etc.
I can't find the magic combination where I can dynamically generate a multi-line string in a step and then have the subsequent GitHubComment task display it properly. I'm fairly new to bash scripting and Pipelines. Any ideas? Thank you.
I had a colleague point me to https://developercommunity.visualstudio.com/t/multiple-lines-variable-in-build-and-release/365667 which linked out to VSTS Release multi-line variable. For one workaround, we can add a string substitution in the pipeline that replaces \ns with %0D%0As:
stages:
  - stage: MyStage
    jobs:
      - job: CommentOnPR
        steps:
          - bash: |
              source scripts/my-script
              formatted_output=${output//\\n/%0D%0A}
              echo "##vso[task.setvariable variable=myComment]$formatted_output"
          - task: GitHubComment@0
            displayName: Post comment to PR
            inputs:
              gitHubConnection: MyGitHubConnection
              repositoryName: $(build.repository.name)
              comment: $(myComment)
I'm not going to mark this answer as accepted, because I'm still hoping someone has a better way...
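If the script keeps real newlines (as in the first version above) instead of literal \n sequences, the same substitution trick should still apply; a minimal sketch, assuming $output contains actual newline characters:

- bash: |
    source scripts/my-script
    # Replace real newline characters with %0D%0A, which Azure DevOps
    # turns back into newlines when the variable is expanded
    formatted_output="${output//$'\n'/%0D%0A}"
    echo "##vso[task.setvariable variable=myComment]$formatted_output"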

how to pass variables between gitlab-ci jobs?

I have a .gitlab-ci.yml like this:

stages:
  - calculation
  - execution

calculation-job:
  stage: calculation
  script: ./calculate_something_and_output_results.sh
  tags:
    - my-runner

execution-job:
  stage: execution
  script: ./execute_something_with_calculation_results.sh foo
  tags:
    - my-runner
The foo argument in execution-job is based on the results of calculation-job. I want to pass the results from one job to the other via variables. How can I do that?
If you're looking to get the results without storing a file anywhere, you can use artifacts: reports: dotenv. This is taken entirely from the DarwinJS shared-variables-across-jobs repo.
stages:
  - calculation
  - execution

calculation-job:
  stage: calculation
  script:
    - |
      # stores new or updates existing env variables, e.g. $OUTPUT_VAR1
      ./calculate_something_and_output_results.sh >> deploy.env
  tags:
    - my-runner
  artifacts:
    reports:
      # propagates variables into the pipeline level, but never stores the actual file
      dotenv: deploy.env

execution-job:
  stage: execution
  script:
    - |
      echo "OUTPUT_VAR1: $OUTPUT_VAR1"
      ./execute_something_with_calculation_results.sh foo
  tags:
    - my-runner
AFAIK it is not possible to pass a variable directly from one job to another. Instead, you have to write the values to a file and pass that as an artifact to the receiving job. To make parsing the file easy, I recommend creating it with bash export statements and sourcing it in the receiving job's script:
calculation-job:
  stage: calculation
  script:
    - ./calculate_something_and_output_results.sh
    - echo "export RESULT1=$calculation_result1" > results
    - echo "export RESULT2=$calculation_result2" >> results
  tags:
    - my-runner
  artifacts:
    name: "Calculation results"
    paths:
      - results

execution-job:
  stage: execution
  script:
    - source ./results
    # You can access $RESULT1 and $RESULT2 now
    - ./execute_something_with_calculation_results.sh $RESULT1 $RESULT2 foo
  tags:
    - my-runner
  needs:
    - calculation-job
Note: the ./ when sourcing results may be necessary with a POSIX-compliant shell that does not source files from the current directory directly, for example a bash started as sh.
As a simpler version of what @bjhend answered (no need for export or source statements): since GitLab 13.1, the docs recommend using a dotenv artifact.
stages:
  - calculation
  - execution

calculation-job:
  stage: calculation
  script:
    # Output format must feature one "VARIABLE=value" statement per line (see docs.)
    - ./calculate_something_and_output_results.sh >> calculation.env
  tags:
    - my-runner
  artifacts:
    reports:
      dotenv: calculation.env

execution-job:
  stage: execution
  script:
    # Any variables created by the above are now in the environment
    - ./execute_something_with_calculation_results.sh
  tags:
    - my-runner
  # The following is technically not needed, but serves as good documentation
  needs:
    - job: calculation-job
      artifacts: true
If you have a job after the calculation stage in which you don't want to use the variables, you can add the following to it:

needs:
  - job: calculation-job
    artifacts: false
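For reference, the dotenv report file itself is just plain KEY=value lines, one variable per line, so the calculation script's output might look like this (variable names invented for illustration):

OUTPUT_VAR1=42
RESULT_LABEL=nightly-build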

Azure Pipeline File-based Trigger and Tags

Is it possible to make a build Pipeline with a file-based trigger?
Let's say I have the following Directory structure.
Microservices/
|_Service A
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
|_Service B
  |_Test_Stage
    |_Testing_Config
  |_QA_Stage
    |_QA_Config
  |_Prod_stage
    |_Prod_Config
I want to have just one single YAML build pipeline file.
Based on the variables $(Project) & $(Stage), different builds are created.
Is it possible to check which directory/file initiated the trigger and set the variables accordingly?
Additionally, it would be great if it were possible to use those variables to set tags on the artifact after the run.
Thanks
KR
Is it possible to check what directory/file initiated the Trigger and set the variables accordingly?

Of course, yes. But there's no direct way, since no predefined variables store that information, so you need an additional, more complex workaround to get it.
#1:
Though no variable directly stores which folder and file were modified, you can get that by looking up the commit Build.SourceVersion via the API:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.1
From its response body, you can directly read the path and file.
Since the response body is JSON, you can use JSON functions to parse out the path value. See this similar script as a reference.
Then use a PowerShell script to set these values as pipeline variables, which the subsequent jobs/tasks can use.
Also, in your scenario, all of this should finish before the next jobs start. So you could consider creating a simple extension with a pipeline decorator. Define all the above steps in the decorator, so that they run in the pre-job of every pipeline.
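As a rough sketch of that first approach in a single bash step (rather than PowerShell); the jq paths and cut field positions are assumptions based on the directory layout above, and System.AccessToken must be available to the script:

- bash: |
    # Query the changed paths of the triggering commit via the Azure DevOps REST API
    changes=$(curl -s -H "Authorization: Bearer $(System.AccessToken)" \
      "$(System.CollectionUri)$(System.TeamProject)/_apis/git/repositories/$(Build.Repository.ID)/commits/$(Build.SourceVersion)/changes?api-version=5.1")
    # First changed path, e.g. "/Microservices/Service A/Test_Stage/Testing_Config"
    first_path=$(echo "$changes" | jq -r '.changes[0].item.path')
    # Field 1 is empty (leading slash), 2 is "Microservices", 3 the project, 4 the stage
    project=$(echo "$first_path" | cut -d'/' -f3)
    stage=$(echo "$first_path" | cut -d'/' -f4)
    echo "##vso[task.setvariable variable=Project]$project"
    echo "##vso[task.setvariable variable=Stage]$stage"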
#2
If the above method feels too complex, I'd rather suggest you make use of the commit message. For example, specify the project name and file name in the commit message, and read them from the variable Build.SourceVersionMessage.
Then use the PowerShell script (mentioned above) to set them as variables.
This is more convenient than using the API to parse the commit body.
Hope one of them helps.
Thanks for your reply.
I tried a different approach with a bash script, because I only use Ubuntu images.
I run git log filtered to the last commit that touched the Microservices directory.
With some awk (not such a satisfying solution) I extract the Project & Stage and write them into pipeline variables.
The pipeline only gets triggered when there is a change under the Microservices/* path.
trigger:
  batch: true
  branches:
    include:
      - master
  paths:
    include:
      - Microservices/*
The first job when the trigger fires is the Dynamic_Variables job.
I use this job only to set the variables $(Project) & $(Stage). The build tags are also set from those variables, so I'm able to differentiate the artifacts in the releases.
jobs:
  - job: Dynamic_Variables
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - checkout: self
      - task: Bash@3
        name: Dynamic_Var
        inputs:
          filePath: './scripts/multi-usage.sh'
          arguments: '$(Build.SourcesDirectory)'
        displayName: "Set Dynamic Variables Project"
      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: |
            set +e
            if [ -z $(Dynamic_Var.Dynamic_Project) ]; then
              echo "target Project not specified";
              exit 1;
            fi
            echo "Project is:" $(Dynamic_Var.Dynamic_Project)
        displayName: 'Verify that the Project parameter has been supplied to pipeline'
      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: |
            set +e
            if [ -z $(Dynamic_Var.Dynamic_Stage) ]; then
              echo "target Stage not specified";
              exit 1;
            fi
            echo "Stage is:" $(Dynamic_Var.Dynamic_Stage)
        displayName: 'Verify that the Stage parameter has been supplied to pipeline'
The Bash Script I run in this Job looks like this:
#!/usr/bin/env bash
set -euo pipefail
WORKING_DIRECTORY=${1}
cd ${WORKING_DIRECTORY}
CHANGEPATH="$(git log -1 --name-only --pretty='format:' -- Microservices/)"
Project=$(echo $CHANGEPATH | awk -F[/] '{print $2}')
CHANGEFILE=$(echo $CHANGEPATH | awk -F[/] '{print $4}')
Stage=$(echo $CHANGEFILE | awk -F[-] '{print $1}')
echo "##vso[task.setvariable variable=Dynamic_Project;isOutput=true]${Project}"
echo "##vso[task.setvariable variable=Dynamic_Stage;isOutput=true]${Stage}"
echo "##vso[build.addbuildtag]${Project}"
echo "##vso[build.addbuildtag]${Stage}"
If someone has a better solution than the awk commands, please let me know; one possible variant is sketched below.
Thanks a lot.
KR
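For what it's worth, a possible awk-free variant of the extraction, assuming the same Microservices/<Project>/<Stage>/<Config> layout and a single changed file; the variable names mirror the script above:

# Take the first non-empty line of the git log output
firstline=$(echo "$CHANGEPATH" | sed '/^$/d' | head -n1)
# Split on "/": skip "Microservices", take the project, skip the stage dir, take the file
IFS='/' read -r _ Project _ CHANGEFILE <<< "$firstline"
# Keep everything before the first "-" as the stage
Stage=${CHANGEFILE%%-*}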
