How to generate a random number in a GitLab YAML file

I have a YAML file in my GitLab branch. As per the requirements, I want to create a random merge ID in the YAML file.

Another simple solution, without using a script, is to use the GitLab environment variable CI_JOB_ID.
It's not a random number, but it is unique.
variables:
  MERGE_ID: $CI_JOB_ID
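As a minimal sketch (the job name is hypothetical), the value can then be referenced in any job's script like a normal variable:
show-merge-id:
  script:
    - echo "Merge ID is $MERGE_ID"   # resolves to the numeric ID of the running job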

If you're referring to the .gitlab-ci.yml file, this is possible. Each GitLab job defined in the YAML file provides a script section where you can define bash commands that are executed when the job runs. If you need more complex operations, you can call Python scripts from the script section.
Fortunately, in this case, bash has a built-in $RANDOM variable, described in How to generate random number in Bash?.
.gitlab-ci.yml:
my-job:
  stage: my-stage
  script:
    # Generate a random number from 1 to 10:
    - let MR_ID=$((1 + RANDOM % 10))
More info on GitLab scripts: https://docs.gitlab.com/ee/ci/yaml/#script
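If the generated value also needs to be visible to later jobs in the pipeline (not just inside this job's shell), a hedged sketch using a dotenv artifact (the file name generated.env is an assumption):
my-job:
  stage: my-stage
  script:
    # Generate a random number from 1 to 10 and persist it for downstream jobs
    - let MR_ID=$((1 + RANDOM % 10))
    - echo "MR_ID=$MR_ID" >> generated.env
  artifacts:
    reports:
      dotenv: generated.env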

Related

How to read labels in Gitlab CI script

I have a few use cases in my Gitlab setup I would like to be able to support:
If a certain label (let's call it “skip_build”) is set, the deployment steps should not be run when I merge an MR to a main branch. This would be useful when we have multiple MRs being merged right after another and only need the last one built.
If another label (we'll call it “skip_tests”) is set, I should be able to read it as an env var from within the script and alter the flow within the script accordingly (using normal bash syntax), e.g. to alter the package command parameters used a bit. This is useful for small changes where it might not make sense to run a lengthy test suite.
Is this possible with Gitlab, and if so, how?
I’ve tried experimenting with CI_MERGE_REQUEST_LABELS, but I don't seem to be able to read it as an env var from within the script.
You have to use merge request pipelines for the CI_MERGE_REQUEST_LABELS variable (and other MR-related variables) to be present as documented in predefined variables.
You could use a rules: clause to skip jobs. Something like
build:
  rules: # only run this job if the regex pattern does not match
    - if: $CI_MERGE_REQUEST_LABELS !~ /skip_build/
You can also do this on any other kind of predefined (or user-defined) variable, like branch name, commit messages, MR titles, etc. Whatever works for you.
For example, a built-in feature of GitLab is that if your commit message contains [ci skip], the pipeline will not run. You could implement similar functionality for your jobs and/or pipelines through rules: or workflow:rules:.
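A hedged sketch that combines merge request pipelines (via workflow:rules) with reading the label inside a job script (job names and test commands are placeholders):
workflow:
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"

test:
  script:
    # CI_MERGE_REQUEST_LABELS is a comma-separated list, e.g. "skip_tests,docs"
    - |
      if echo "$CI_MERGE_REQUEST_LABELS" | grep -q "skip_tests"; then
        echo "skip_tests label set, running the short suite"
        make test-quick          # placeholder command
      else
        make test                # placeholder command
      fi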

Add GitLab CI job to pipeline based on script command result

I have a GitLab CI pipeline with a 'migration' job which I want to be added only if certain files changed between the current commit and the master branch, but in my current project I'm forced to use GitLab CI pipelines for push events, which complicates things.
The docs on rules:changes clearly state that it will glitch and not work properly without an MR (my case of push events), so that's out of the question.
The docs on rules:if state that it only works with env variables. But the docs on passing CI/CD variables to another job clearly state that
These variables cannot be used as CI/CD variables to configure a pipeline, but they can be used in job scripts.
So now I'm stuck. I can skip running the job in question by overriding the script and checking for file changes, but what I want is to not add the job to the pipeline in the first place.
While you can't add a single job to a pipeline based on the output of a script, you can add a child pipeline dynamically based on script output. The method of using rules: with dynamic variables won't work because rules: are evaluated at the time the pipeline is created, as you found in the docs.
However, you can achieve the same effect using the dynamic child pipelines feature. The idea is that you dynamically create the YAML for the desired pipeline in a job. The YAML created by your job is then used to create a child pipeline, which your pipeline can depend on.
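A hedged sketch of that approach (the generator script and file names are assumptions):
generate-config:
  stage: build
  script:
    # Hypothetical script that inspects changed files and writes the child
    # pipeline YAML, including or omitting the migration job as needed
    - ./generate-child-pipeline.sh > child-pipeline.yml
  artifacts:
    paths:
      - child-pipeline.yml

run-child-pipeline:
  stage: deploy
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-config
    strategy: depend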
Sadly, adding or removing a GitLab job based on variables created by a previous job is not possible within a single pipeline.
A way to achieve this is to break your current pipeline into an upstream and a downstream pipeline.
The upstream will have 2 jobs:
The first one uses your script to define a variable.
The second one triggers the downstream pipeline, passing this variable along.
Upstream
check_val:
  ...
  script:
    # Script imposes the logic with the needed checks
    # If true:
    - echo "MY_CONDITIONAL_VAR=true" >> var.env
    # If false:
    - echo "MY_CONDITIONAL_VAR=false" >> var.env
  artifacts:
    reports:
      dotenv: var.env

trigger_your_original_pipeline:
  ...
  variables:
    MY_CONDITIONAL_VAR: "$MY_CONDITIONAL_VAR"
  trigger:
    project: "project_namespace/project"
The downstream would be your original pipeline:
Downstream
...
migration:
  ...
  rules:
    - if: '$MY_CONDITIONAL_VAR == "true"'
Now MY_CONDITIONAL_VAR will be available from the start of the downstream pipeline, so you can use rules: to decide whether or not the migration job is added.

How to pass values to gitlab pipeline variable sourced from a file

For example, the file that I have is test.env.
test.env has the content:
export SAMPLE="true"
I want the variable SAMPLE to be set as a pipeline variable before running the pipeline.
I am trying the solution below, but it is not really helping:
before_script:
  - git clone test.env
  - source test.env

stages:
  - publish

test:
  stage: publish
  trigger:
    project: test_pipeline
    branch: master
    strategy: depend
  only:
    variables:
      - $SAMPLE == 'True'
Is there any way to source the variables beforehand and then set the pipeline variables, so that execution can happen based on those pipeline variables?
Currently with GitLab CI there's no way to provide a file to use as environment variables, at least not in the way you stated. There are a couple of other options, however.
The first is to take all the individual variables you would have in your test.env file and store them as separate Secret Variables. You can set these by going to your project's settings -> CI/CD -> Variables. Environment variables defined here will automatically be available in every pipeline job for this project (although you can select the Protect Variable checkbox, which will make the variable available only for pipelines on protected branches).
The next option is to copy the entire test.env file contents, go back to your project's Secret Variables (as described above), but this time change the Variable Type to "File" and paste the file contents as the value. When you use a "File" type variable, GitLab creates a temporary file in each of your pipeline jobs (again, unless you check the Protect Variable option), and the path to that file is stored in the env variable with the key you selected. This allows you to do things like cat $my_file_variable, which evaluates to cat /path/to/temporary/file and prints the file's contents.
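A minimal sketch of using such a "File" type variable, assuming a variable with the key my_file_variable was created in the project settings and holds the contents of test.env (the job name is a placeholder):
read-env:
  script:
    # GitLab writes the variable's contents to a temp file and stores its path in $my_file_variable
    - cat $my_file_variable
    - source $my_file_variable   # loads `export SAMPLE="true"` into this job's shell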
A final option, which is closest to your original request, is to add a job before all your other jobs that require the test.env file. It would look like this:
prepare-env:         # job name is arbitrary
  stage: env_setup   # or whatever
  script:
    - ':'            # the bash null command, which does nothing and always succeeds
  artifacts:
    reports:
      dotenv: test.env
This job's only purpose is to turn your test.env file into environment variables. We don't need to do anything else with it, so we use the null command for the script section (a job without at least a script section will fail). The artifacts part is the important piece here. GitLab supports a special report type called dotenv that takes a single argument: a path to a file. The file gets uploaded as an artifact like any other, but for subsequent jobs (or those that use the dependencies keyword with this job's name), instead of pulling down the artifact as a file, each item in test.env is turned into an environment variable, so you can use it like $SAMPLE, etc.
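As a hedged illustration (the job name is a placeholder), a later job can then read the variable directly:
publish:
  stage: publish
  script:
    - echo "SAMPLE is $SAMPLE"   # populated from test.env via the dotenv report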
Personally I prefer the first two options over the third, and of those two, the second is the easiest as you just have to copy and paste the file you have now into a variable. The downside of the third option is that it still lets you keep sensitive values (like passwords) in your git repository, which isn't great from a security standpoint. Either of the first two options eliminates that problem.

How do you reuse a before_script from a shared yml file in Gitlab CI?

I know that you can reuse blocks of code in a before script using yaml anchors:
.something_before: &something_before
  - echo 'something before'

before_script:
  - *something_before
  - echo "Another script step"
but this doesn't seem to work when .something_before is declared in a shared .yml file included via include:file. It also does not seem that extends works for before_script. Does anyone know a way of reusing some steps in a before_script from a shared .yml file?
EDIT: My use case is that I have two GitLab projects with almost identical before_script steps. I don't want to have to change both projects whenever there's a change, so I have a third, separate GitLab project with a .yml template that I include via include:file in both projects. I want to put all the common code in that shared template and have just a two-line before_script in the project that needs the two extra steps.
YAML anchors don't work with included files. You need to use the extends keyword. But what you want to achieve won't work with before_script, because the before_script in your template will be overwritten if the job that uses the template also defines one.
Do you really need a before_script in your specific job, or can you achieve the same with a normal script? If so, you can do something like this:
Template File:
.something_before:
  before_script:
    - echo 'something before'
    - echo 'something more before'
Project Pipeline:
include:
  - project: 'my-group/my-project'
    file: '/something_before.yml'

stages:
  - something

something:
  stage: something
  extends: .something_before
  script:
    - echo 'additional stuff to do'
And your before_script section will be merged into the something job and executed before the script part.
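In effect (a hedged illustration of how extends merges the keys, not literal GitLab output), the something job behaves as if it were written:
something:
  stage: something
  before_script:
    - echo 'something before'
    - echo 'something more before'
  script:
    - echo 'additional stuff to do'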
See if GitLab 13.6 (November 2020) makes this easier:
Include multiple CI/CD configuration files as a list
Previously, when adding multiple files to your CI/CD configuration using the include:file syntax, you had to specify the project and ref for each file. In this release, you can specify the project and ref once and provide a list of files all at once. This prevents you from having to repeat yourself and makes your pipeline configuration less verbose.
See the documentation and issue.
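A hedged sketch of the list syntax (the project path, ref, and file names are placeholders):
include:
  - project: 'my-group/my-templates'
    ref: main
    file:
      - '/templates/before_scripts.yml'
      - '/templates/deploy.yml'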
And it gets even easier with GitLab 14.9 (March 2022):
Include the same CI/CD template multiple times
Previously, trying to have standard CI/CD templates that you reuse in many places was complicated because each template could only be included in a pipeline once.
We dropped this limitation in this release, so you can include the same configuration file as many times as you like.
This makes your CI/CD configuration more flexible as you can define identical includes in multiple nested configurations, and rest assured that there will be no conflicts or duplication.
See Documentation and Issue.
You can use extends without any problem, but you will need to overwrite the entire before_script block.
If you want to change just a piece of your before_script, use a shell script to do it.
Set the if condition inside your template:
before_script:
  - |
    if [ condition ]
    then
      commands here
    fi
AFTER EDIT: You can use variables to achieve it.
Project 1: VAR = command 1
Project 2: VAR = command 2
You can set the content of the variable in the .gitlab-ci.yml file or in the CI/CD settings of each project!
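A hedged sketch of that idea (the variable name EXTRA_BEFORE_STEP and the echo commands are placeholders): the shared template runs whatever command each project sets in the variable.
# shared template
.common_before:
  before_script:
    - echo 'common step'
    - eval "$EXTRA_BEFORE_STEP"   # project-specific command, set per project

# project 1 .gitlab-ci.yml
variables:
  EXTRA_BEFORE_STEP: "echo 'project 1 extra step'"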

Pass windows variable to gitlab-ci

Is it possible to create a variable via a batch file and pass it to a GitLab CI variable?
The background is that I want to declare the URL of an environment:
environment:
  name: staging
  url: https://staging.example.com
Part of the URL is derived dynamically from the current build date. How can I pass the variable declared in a batch file to the gitlab-ci "url" variable?
The URL would look like this in the batch file:
https://testme.com/Tool_%date:~-2%%date:~-7,2%%date:~-10,2%.zip
The outcome is:
https://testme.com/Tool_180410.zip
and I want to write that value into the environment URL variable.
I don't think creating a variable in a batch file and passing it to a GitLab CI variable is possible, but from what I've gathered from your scenario you could:
set the current build date before you call the batch script
pass in the variable to the batch script
use the variable for your URL generation
For example (running on a Windows runner):
$ set testDate=%date:~-2%%date:~-7,2%%date:~-10,2%
$ echo %testDate%
180410
# Use %testDate% wherever else you need it now for the rest of your build.
With regards to the Environment URL, I don't have any experience using it - but this open issue could be of interest.
I usually define any variables needed by GitLab's .gitlab-ci.yml in a separate YAML file. Let's assume that the name of this second file is parameters.yml, just to give an example.
You can create the parameters.yml file from your batch script and define all your variables there.
Example of what parameters.yml contains:
variables:
  name: staging
  url: https://staging.example.com
  [other variables]
Then, all you have to do is include this YAML file in the "main" one (.gitlab-ci.yml), for example like this:
stages:
  - build
  - test
  - release
  - deploy

include: parameters.yml
And that's it; now you will "see" all the variables defined in parameters.yml.
