Can I define a variable from gitlab-ci as a value in a variable in settings (or schedules)?

Here's what I am trying to do.
In .gitlab-ci:
Check schedules pass:
  stage: check
  image: ${myimage}
  script:
    - MY_CI_VAR=aVeryLongVariable
    - echo "$MY_SCHEDULE_VAR"
In schedules:
Which is not working.
The reason I want to do this is to pick a different variable (out of many in the job) on each schedule.

Yes, it is possible to use variables within other variables. This feature was released in GitLab 14.3.
However, since you are using GitLab 13.x, this feature won't be available to you.
You may be able to get around this limitation by using a static value and altering your job script accordingly.
myjob:
  before_script: |
    if [[ "$SCHEDULE_VAR" == "abc" ]]; then
      export FOO="$MY_CI_VAR"
    fi
  # ...
In versions of GitLab < 14.3 you can still reference other variables within variables, but by using $$ to prevent a variable from being evaluated by GitLab, leaving its expansion to the job's shell.
Example from the docs:
variables:
  FLAGS: '-al'
  LS_CMD: 'ls "$FLAGS" $$TMP_DIR'
script:
  - 'eval "$LS_CMD"' # Executes 'ls -al $TMP_DIR'
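As a plain-shell sketch of what the job's shell sees (values hard-coded here for illustration), the $$ in the YAML arrives as a single literal $, and eval then expands both variables:

```shell
FLAGS='-al'
TMP_DIR='/tmp'
LS_CMD='ls "$FLAGS" $TMP_DIR'   # the $$ from the YAML is now a single literal $
eval "$LS_CMD"                  # the shell expands both variables: runs ls -al /tmp
```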

Related

Read variable from file for usage in GitLab pipeline

Given the following very simple .gitlab-ci.yml pipeline:
---
variables:
  KEYCLOAK_VERSION: 20.0.1 # this should be populated from reading a file from the repo...
stages:
  - test
build:
  stage: test
  script:
    - echo "$KEYCLOAK_VERSION"
As you might see, this simply outputs the value of KEYCLOAK_VERSION defined in the variables section.
Now, the Git repository contains a env.properties file with KEYCLOAK_VERSION=20.0.1 as content. How would I read the variable from that file and use it in the GitLab pipeline?
The documentation mentions import but this seems to be using YAML files.
To read variables from a file you can use the source or . command.
script:
  - source env.properties
  - echo $KEYCLOAK_VERSION
Attention:
One reason why you might not want to do it this way is that whatever is in env.properties will be executed in your shell, including something dangerous like rm -rf /.
Maybe you can take a look here for some other solutions.
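If you only need specific keys, a safer alternative is to parse the file rather than execute it. A minimal sketch (the printf line stands in for the env.properties file committed to the repo):

```shell
# Stand-in for the env.properties file in the repository
printf 'KEYCLOAK_VERSION=20.0.1\n' > env.properties

# Extract just the key we need instead of sourcing the whole file
KEYCLOAK_VERSION=$(grep '^KEYCLOAK_VERSION=' env.properties | cut -d= -f2)
echo "$KEYCLOAK_VERSION"   # prints 20.0.1
```

Nothing from the file is executed this way; an injected command in env.properties would just be an unmatched line.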

Why is Gitlab CICD workflow:rules:if:variables failing to set variables?

stages:
  - test

# Default vars
variables:
  DEPLOY_VARIABLE: "dev-deploy"

workflow:
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
      variables:
        DEPLOY_VARIABLE: "master-deploy" # Override globally-defined DEPLOY_VARIABLE

my_project_test:
  stage: test
  script:
    - env | grep CI
    - echo $DEPLOY_VARIABLE # this always outputs dev-deploy
Running with gitlab-runner 14.10.1.
No matter whether I run it locally or on GitLab, that variable is never set.
On local I run it with gitlab-runner exec shell my_project_test.
env | grep CI is:
CI_SERVER_VERSION=
CI_RUNNER_EXECUTABLE_ARCH=darwin/amd64
CI_COMMIT_REF_NAME=master
CI_JOB_TOKEN=
CI_PROJECT_ID=0
CI_RUNNER_REVISION=f761588f
... etc
As per their documentation:
If a rule matches, when: always is the default, and when: never is the default if nothing matches.
I even tried if: '1 == 1' and so on.
gitlab-runner exec has several limitations and does not implement/consider many features of YAML definitions, including workflow:rules:[]variables in this case.
However, when run through gitlab.com or a self-hosted instance of GitLab, workflow:rules: will evaluate properly.
Keep in mind, there are a few cases where variables set elsewhere will take precedence over variables defined in the YAML, such as when variables are set in project, group, or instance settings.
The assignment should work if you put the condition in your job:
my_project_test:
  stage: test
  rules:
    - if: '$CI_COMMIT_REF_NAME == "master"'
      variables:
        DEPLOY_VARIABLE: "master-deploy" # Override globally-defined DEPLOY_VARIABLE
  script:
    - env | grep CI
    - echo $DEPLOY_VARIABLE # outputs master-deploy when the rule matches
However, the variable is only in scope for the job containing the rule; it won't override the global value in other jobs. If what you really need is to pass a variable between jobs, see:
set up global variables dynamically in gitlab-ci

Evaluate subtraction inside GitLab CI VARIABLES keywords

I am trying to parallelize a flutter build on GitLab using GitLab's parallel keyword and flutter's --total-shards and --shard-index.
Something like below
test_job:
  stage: test
  parallel: 3
  script:
    - flutter test --total-shards $CI_NODE_TOTAL --shard-index $CI_NODE_INDEX
However, this script fails in the last job because of an off-by-one error: flutter expects a zero-based shard index, but in the last job $CI_NODE_INDEX equals $CI_NODE_TOTAL. It seems to be undocumented that $CI_NODE_INDEX starts from 1 instead of 0.
I wanted to subtract one from the variable using variables: to define $CI_NODE_INDEX_ZERO, because the value is used multiple times throughout this long job (the script in the example above is shortened).
I tried this.
test_job:
  stage: test
  parallel: 3
  variables:
    CI_NODE_INDEX_ZERO: $( expr $CI_NODE_INDEX - 1 )
  script:
    - flutter test --total-shards $CI_NODE_TOTAL --shard-index $CI_NODE_INDEX_ZERO
The script still fails, since the value of $CI_NODE_INDEX_ZERO is the literal string $( expr $CI_NODE_INDEX - 1 ) instead of 0 (or whatever integer value is needed).
This actually works in my local terminal.
petrabarus@Petras-Air % CI_NODE_INDEX_ZERO=5
petrabarus@Petras-Air % CI_NODE_INDEX=5
petrabarus@Petras-Air % echo $CI_NODE_INDEX
5
petrabarus@Petras-Air % CI_NODE_INDEX_ZERO=$( expr $CI_NODE_INDEX - 1 )
petrabarus@Petras-Air % echo $CI_NODE_INDEX_ZERO
4
How do I fix this?
Variables can only be literal values -- they are not evaluated in any way, like what happens in your bash shell.
If you want to use bash to evaluate and set variables for jobs, you can do that using a dotenv artifact.
make_variables:
  stage: .pre # run before all jobs
  script:
    # evaluate the value of a variable
    - DYNAMIC_VARIABLE=$(my-script)
    # Add the value to a dotenv file.
    - echo "DYNAMIC_VARIABLE=$DYNAMIC_VARIABLE" >> myvariables.env
  artifacts:
    reports: # set the variables for subsequent jobs
      dotenv: myvariables.env

my_job:
  script:
    - echo "$DYNAMIC_VARIABLE"
Though the easier thing to do would be just to evaluate it directly in your script:
script:
  - SHARD_INDEX=$( expr $CI_NODE_INDEX - 1 )
  - flutter test --total-shards $CI_NODE_TOTAL --shard-index $SHARD_INDEX
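As a quick plain-shell check (the index value is hard-coded here for illustration; GitLab sets it in a real job), POSIX arithmetic expansion also works and avoids spawning an external expr process:

```shell
CI_NODE_INDEX=3                      # illustrative value; set by GitLab in a real job
SHARD_INDEX=$((CI_NODE_INDEX - 1))   # arithmetic expansion, no external command needed
echo "$SHARD_INDEX"                  # prints 2
```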
I think it is because values under variables: are not evaluated by GitLab CI.
Please check https://docs.gitlab.com/ee/ci/variables/
Example:
test_variable:
  stage: test
  script:
    - echo "$CI_JOB_STAGE"
Please try:
script:
  - flutter test --total-shards "$CI_NODE_TOTAL" --shard-index "$CI_NODE_INDEX"

How to use one variable inside another in gitlab ci

I have a GitLab YAML file for running certain jobs. In the variables section, I have declared certain variables with values, and when I try to use them to build another variable, the combined value is generated but is not available in the later part of the job execution.
Code tried is as below:
variables:
  env: "prod"
  user: "test"
  region: "us-east"
  var1: '$env-$user-$region'
As suggested in one forum, I tried moving the var1 assignment into the before_script part, but that also did not return the var1 value correctly.
Any help will be appreciated.
At the bottom of this section of the official documentation, they describe using variables within variables:
You can use variables to help define other variables. Use $$ to ignore a variable name inside another variable:
variables:
  FLAGS: '-al'
  LS_CMD: 'ls $FLAGS $$TMP_DIR'
script:
  - 'eval $LS_CMD' # Executes 'ls -al $TMP_DIR'
I was able to follow this pattern, and additionally I combined variables in the script: step with a command such as:
script:
  - APP_NAME=$APP_NAME-$VERSION
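A plain-shell sketch of the same composition (values hard-coded for illustration): assignments made in script run in the job's shell, so expansion works exactly as it does locally.

```shell
env=prod
user=test
region=us-east
var1="$env-$user-$region"   # composed in the shell, not by GitLab's variables: section
echo "$var1"                # prints prod-test-us-east
```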

Azure Pipeline File-based Trigger and Tags

Is it possible to make a build Pipeline with a file-based trigger?
Let's say I have the following Directory structure.
Microservices/
|_ Service A
   |_ Test_Stage
      |_ Testing_Config
   |_ QA_Stage
      |_ QA_Config
   |_ Prod_stage
      |_ Prod_Config
|_ Service B
   |_ Test_Stage
      |_ Testing_Config
   |_ QA_Stage
      |_ QA_Config
   |_ Prod_stage
      |_ Prod_Config
I want to have just one single YAML Build Pipeline File.
Based on the Variables $(Project) & $(Stage) different builds are created.
Is it possible to check what directory/file initiated the Trigger and set the variables accordingly?
Additionally it would be great if its possible to use those variables to set the tags to the artifact after the run.
Thanks
KR
Is it possible to check what directory/file initiated the Trigger and
set the variables accordingly?
Of course, yes. But there is no direct way, since there are no predefined variables that store this information, so you need an additional, somewhat complex workaround to get it.
#1:
Though there is no variable that directly stores which folder and file were modified, you can get this by looking up the commit (Build.SourceVersion) via the API:
GET https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repositoryId}/commits/{commitId}/changes?api-version=5.1
From its response body, you can directly see the path and file. Since the response body is JSON, you can use a JSON-parsing function to extract the path value. See this similar script as a reference.
Then use a PowerShell script to set these values as pipeline variables that the next jobs/tasks can use.
Also, in your scenario, all of this should finish before any subsequent job starts. So you could consider creating a simple extension with a pipeline decorator: define all the above steps in the decorator, so that they run in the pre-job of every pipeline.
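A minimal shell sketch of the last step, turning a parsed change path into a pipeline variable via an Azure Pipelines logging command (the path value is hard-coded here; in practice it would come from the commits API response):

```shell
# Illustrative changed path, as parsed from the commits API response
CHANGED_PATH="Microservices/Service A/Test_Stage/Testing_Config"

# The second path segment is the service/project name
PROJECT=$(echo "$CHANGED_PATH" | cut -d/ -f2)
echo "$PROJECT"   # prints: Service A

# Azure Pipelines logging command to expose the value as a pipeline variable
echo "##vso[task.setvariable variable=Project]$PROJECT"
```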
#2
If the above method feels too complex, I would rather suggest making use of the commit message. For example, specify the project name and file name in the commit message, and read them from the Build.SourceVersionMessage variable.
Then use the PowerShell script (mentioned above) to set them as variables.
This is more convenient than using the API to parse the commit body.
Hope one of these could help.
Thanks for your reply.
I tried a different approach with a Bash script, because I only use Ubuntu images.
I run git log filtered to the last commit touching the Microservices directory.
With some awk (not a very satisfying solution) I extract the Project & Stage and write them into pipeline variables.
The pipeline is only triggered when there is a change to the Microservices/* path.
trigger:
  batch: true
  branches:
    include:
      - master
  paths:
    include:
      - Microservices/*
The first job run when the trigger fires is the Dynamic_Variables job.
I use this job only to set the variables $(Project) & $(Stage). The build tags are also set from those variables, so I am able to differentiate the artifacts in the releases.
jobs:
  - job: Dynamic_Variables
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - checkout: self
      - task: Bash@3
        name: Dynamic_Var
        inputs:
          filePath: './scripts/multi-usage.sh'
          arguments: '$(Build.SourcesDirectory)'
        displayName: "Set Dynamic Variables Project"
      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: |
            set +e
            if [ -z $(Dynamic_Var.Dynamic_Project) ]; then
              echo "target Project not specified";
              exit 1;
            fi
            echo "Project is:" $(Dynamic_Var.Dynamic_Project)
        displayName: 'Verify that the Project parameter has been supplied to pipeline'
      - task: Bash@3
        inputs:
          targetType: 'inline'
          script: |
            set +e
            if [ -z $(Dynamic_Var.Dynamic_Stage) ]; then
              echo "target Stage not specified";
              exit 1;
            fi
            echo "Stage is:" $(Dynamic_Var.Dynamic_Stage)
        displayName: 'Verify that the Stage parameter has been supplied to pipeline'
The Bash Script I run in this Job looks like this:
#!/usr/bin/env bash
set -euo pipefail
WORKING_DIRECTORY=${1}
cd ${WORKING_DIRECTORY}
CHANGEPATH="$(git log -1 --name-only --pretty='format:' -- Microservices/)"
Project=$(echo $CHANGEPATH | awk -F[/] '{print $2}')
CHANGEFILE=$(echo $CHANGEPATH | awk -F[/] '{print $4}')
Stage=$(echo $CHANGEFILE | awk -F[-] '{print $1}')
echo "##vso[task.setvariable variable=Dynamic_Project;isOutput=true]${Project}"
echo "##vso[task.setvariable variable=Dynamic_Stage;isOutput=true]${Stage}"
echo "##vso[build.addbuildtag]${Project}"
echo "##vso[build.addbuildtag]${Stage}"
If someone has a better solution than the awk commands, please let me know.
Thanks a lot.
KR
