Pass a Windows variable to GitLab CI

Is it possible to create a variable in a batch file and pass it to a GitLab CI variable?
The background: I want to declare the URL of an environment:
environment:
  name: staging
  url: https://staging.example.com
Part of the URL is derived dynamically from the current build date. How can I pass a variable declared in a batch file to the gitlab-ci "url" value?
The URL would look like this in the batch file:
https://testme.com/Tool_%date:~-2%%date:~-7,2%%date:~-10,2%.zip
The outcome is:
https://testme.com/Tool_180410.zip
and I want to write that value into the environment URL.

I don't think creating a variable in a batch file and passing it to a GitLab CI variable is possible, but from what I've gathered from your scenario you could:
set the current build date before you call the batch script
pass in the variable to the batch script
use the variable for your URL generation
For example (running on a Windows runner):
$ set testDate=%date:~-2%%date:~-7,2%%date:~-10,2%
$ echo %testDate%
180410
# Use %testDate% wherever else you need it now for the rest of your build.
Regarding the environment URL, I don't have any experience using it, but this open issue could be of interest.
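For completeness, here is a minimal sketch of how the pieces could fit together in .gitlab-ci.yml, assuming the runner uses the Windows cmd shell; the job name, stage, and build.bat call are made up for illustration. Note that this only builds the URL inside the job script; feeding a runtime value into the environment: url: keyword is what the linked issue is about.

build_tool:
  stage: build
  script:
    # compute YYMMDD from the Windows date, as in the batch file
    - set testDate=%date:~-2%%date:~-7,2%%date:~-10,2%
    # build the download URL from it
    - set TOOL_URL=https://testme.com/Tool_%testDate%.zip
    - echo %TOOL_URL%
    # pass the pre-computed date into the batch script as an argument
    - call build.bat %testDate%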

I usually define any variables needed by GitLab's .gitlab-ci.yml in a separate YAML file. Let's assume the name of this second file is "parameters.yml", just to give an example.
You can create the parameters.yml file from your batch and define all your variables there.
Example of what parameters.yml contains:
variables:
  name: staging
  url: https://staging.example.com
  [other variables]
Then, all you have to do is to include this yml into the "main" one (.gitlab-ci.yml), for example, something like this:
stages:
  - build
  - test
  - release
  - deploy

include: parameters.yml
And that's it; now you will "see" all the variables defined in "parameters.yml".
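As a rough sketch, the batch file could write parameters.yml like this (hypothetical snippet; note that a local include: is read from the repository when the pipeline is created, so the generated file still has to be committed or otherwise present there):

rem writes parameters.yml with a date-stamped URL, mirroring the example above
(
  echo variables:
  echo   name: staging
  echo   url: https://testme.com/Tool_%date:~-2%%date:~-7,2%%date:~-10,2%.zip
) > parameters.yml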

Related

How to inherit GitLab variables across projects?

How can I have GitLab fill a global variable from a CI/CD secret, and then inherit this global variable in other projects?
templates/commons.yml:
variables:
  TEST_VAR: $FILLED_FROM_SECRETS
project/.gitlab-ci.yml:
include:
  - project: '/templates'
    ref: master
    file:
      - 'commons.yml'

test:
  stage: test
  script:
    - echo $TEST_VAR
Result: the variable is never set. Why?
(of course the FILLED_FROM_SECRETS variable is set in the commons project)
The problem you have is that include: only brings in the contents of the YAML file, not the project-level settings or variables.
As possible alternatives, you can:
Set the variable in the template directly (not recommended for sensitive values; see the sketch after this list)
Set variables on your own self-hosted runners (note: variables cannot be masked this way)
Set instance CI/CD variables
Set a required CI configuration to forcibly include a template in all projects; that template can include the variables you need (note: variables cannot be masked this way)
Set group CI/CD variables (where all your projects live under a common group)
Retrieve your secrets using the Vault integration or as part of your job script
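A minimal sketch of the first alternative, hard-coding the value directly in the shared template (acceptable only for non-sensitive values, since anyone with read access to the template repository can see it):

# templates/commons.yml
variables:
  TEST_VAR: "some-non-secret-value"  # placeholder value, visible in the repo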
With the include keyword, the included files are merged into the .gitlab-ci.yml, and then your .gitlab-ci.yml is executed in the repository where the pipeline is triggered. Therefore, only global variables in this repository, or variables inherited from any parent groups, are known. That's why TEST_VAR is not substituted with the value from the secret: the variable is defined in another repository.
According to the docs, the syntax you used requires you to provide the full path of your project (everything after gitlab.com/, i.e. group/project).
Assuming your project path is gitlab.com/group/my_project, then choose one of the following:
include:
  - project: 'group/my_project'
    ref: master
    file:
      - 'templates/commons.yml'
  # or simply, if the 'templates' folder lives in the same project as your gitlab-ci.yml file:
  - '/templates/commons.yml'

test:
  stage: test
  script:
    - echo $TEST_VAR
I have personally used both approaches in my work, but the docs show other ways to implement this that you can have a look at.

How to pass values to gitlab pipeline variable sourced from a file

For example, the file that I have is test.env.
test.env has the content:
export SAMPLE="true"
I want the variable SAMPLE to be set as a pipeline variable before running the pipeline
I am trying the solution below, but it is not really helping:
before_script:
  - git clone test.env
  - source test.env

stages:
  - publish

test:
  stage: publish
  trigger:
    project: test_pipeline
    branch: master
    strategy: depend
  only:
    variables:
      - $SAMPLE == 'True'
Is there any way to source the variables beforehand and then set the pipeline variables, so that execution can happen based on those pipeline variables?
Currently, GitLab CI has no way to provide a file to use as environment variables, at least not in the way you stated. There are a couple of other options, however.
The first is to take all the individual variables you would have in your test.env file and store them as separate Secret Variables. You can set these by going to your project's Settings -> CI/CD -> Variables. Environment variables defined here will automatically be available in every pipeline job for this project (although you can select the Protect Variable checkbox, which will make the variable available only for pipelines on protected branches).
The next option is to copy the entire test.env file contents, go back to your project’s Secret Variables (as described above), but this time change the Variable Type to "File", and paste the file contents as the value. When you use a "File" type variable, Gitlab will create a temporary file in each of your pipeline jobs (again, unless you check the Protect Variable option). Then the path to that file will be stored as the env variable with the key you selected. This would allow you to do things like cat $my_file_variable, which would evaluate as cat /path/to/temporary/file, then cat the contents.
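For example, a minimal sketch assuming a File-type CI/CD variable named TEST_ENV_FILE (hypothetical name) was created in the project settings with the contents of test.env:

show-env:
  stage: publish
  script:
    # TEST_ENV_FILE holds the path of the temporary file GitLab wrote for this job
    - cat "$TEST_ENV_FILE"
    # sourcing it exports SAMPLE for the rest of this job
    - source "$TEST_ENV_FILE"
    - echo "$SAMPLE"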
A final option, which is closest to your original request, is to add a job, before all your other jobs that require the test.env file, that looks like this:
setup_env:            # hypothetical job name; use any name you like
  stage: env_setup    # or whatever
  script:
    - ':'             # this is the bash Null Command that does nothing and always succeeds
  artifacts:
    reports:
      dotenv: test.env
For this job, the only purpose is to turn your test.env file into environment variables. We don't need to do anything else with it, so we use the Null Command for the script section (since a job without at least the script section will fail). The artifacts part is the important stuff here. Gitlab supports a special Report type called dotenv that takes a single argument: a path to a file. The file will get uploaded as an artifact like any other, but for subsequent jobs (or those that use the dependencies keyword with this job name) instead of pulling down the artifact as a file, each item in test.env will be turned into an environment variable, so you can use it like $SAMPLE, etc.
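A short sketch of a downstream job picking the variable up, assuming the setup job above is named setup_env (the name is only illustrative) and test.env contains plain KEY=VALUE lines such as SAMPLE=true:

use_env:
  stage: publish
  dependencies:
    - setup_env
  script:
    # SAMPLE is now a regular environment variable, injected by the dotenv report
    - echo "$SAMPLE"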
Personally I prefer the first two options over the third, and of the first two, the second is the easiest since you just have to copy and paste the file you have now into a variable. The reason the third option isn't ideal is that it still allows you to keep sensitive variables (like passwords) in your git repository, which isn't great from a security standpoint. Either of the first two options eliminates that problem.

How to generate random number in GitLab YAML file

I have a YAML file in my GitLab branch. As per the requirement, I want to create a random merge ID in the YAML file.
Another simple solution without using a script is to use the GitLab env variable CI_JOB_ID.
It's not a random number, but it's unique.
variables:
  MERGE_ID: $CI_JOB_ID
If you're referring to the gitlab-ci.yml file, this is possible. Each GitLab job defined in the YAML file provides a script section where you can define bash commands that will get executed by GitLab when the job is run. If you want more complex operations, you can call Python scripts from the script section.
Fortunately, in this case, bash has a built-in $RANDOM variable, described in How to generate random number in Bash?.
gitlab-ci.yml:
my-job:
  stage: my-stage
  script:
    # Generate random number from 1 to 10:
    - let MR_ID=$((1 + RANDOM % 10))
More info on GitLab scripts: https://docs.gitlab.com/ee/ci/yaml/#script

How do you reuse a before_script from a shared yml file in Gitlab CI?

I know that you can reuse blocks of code in a before script using yaml anchors:
.something_before: &something_before
  - echo 'something before'

before_script:
  - *something_before
  - echo "Another script step"
but this doesn't seem to work when the .something_before is declared in a shared .yml file via the include:file. It also does not seem that extends works for before_script. Does anyone know a way of reusing some steps in a before_script from a shared .yml file?
EDIT: My use case is that I have two GitLab projects with almost identical before_script steps. I don't want to have to change both projects whenever there's a change, so I have a third, separate GitLab project with a .yml template that I include via include:file in both projects. I want to put all the common code in that shared template, and have just a couple of before_script lines in the one project that needs the two extra steps.
YAML anchors don't work with included files. You need to use the extends keyword. But what you want to achieve won't work with before_script, as the code in your template will be overwritten in the job that uses the template if that job defines a before_script as well.
Do you really need a before_script in your specific job, or can you achieve the same with a normal script? If so, you can do something like this:
Template File:
.something_before:
  before_script:
    - echo 'something before'
    - echo 'something more before'
Project Pipeline:
include:
  - project: 'my-group/my-project'
    file: '/something_before.yml'

stages:
  - something

something:
  stage: something
  extends: .something_before
  script:
    - echo 'additional stuff to do'
And your before_script section will be merged into the something job and executed before the script part.
See if GitLab 13.6 (November 2020) makes it easier:
Include multiple CI/CD configuration files as a list
Previously, when adding multiple files to your CI/CD configuration using the include:file syntax, you had to specify the project and ref for each file. In this release, you now have the ability to specify the project, ref, and provide a list of files all at once. This prevents you from having to repeat yourself and makes your pipeline configuration less verbose.
See Documentation and Issue.
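For example, a sketch of the list form introduced there (project and file names are placeholders):

include:
  - project: 'my-group/my-templates'
    ref: master
    file:
      - '/templates/before-script.yml'
      - '/templates/variables.yml'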
And even, with GitLab 14.9 (March 2022):
Include the same CI/CD template multiple times
Previously, trying to have standard CI/CD templates that you reuse in many places was complicated because each template could only be included in a pipeline once.
We dropped this limitation in this release, so you can include the same configuration file as many times as you like.
This makes your CI/CD configuration more flexible as you can define identical includes in multiple nested configurations, and rest assured that there will be no conflicts or duplication.
See Documentation and Issue.
You can use extends without any problem, but you will need to overwrite the entire before_script block.
If you want to change just a piece of your before_script, use a shell script to do it.
Set the if condition inside your template:
before_script:
  - |
    if [ condition ]
    then
      commands here
    fi
AFTER EDIT: You can use variables to achieve it:
Project 1: VAR = command 1
Project 2: VAR = command 2
You can set the content of the env var in the .gitlab-ci.yml file or in the CI/CD settings of each project!
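A rough sketch of that idea, assuming a hypothetical variable named EXTRA_BEFORE that each project fills with its own command:

# shared template
.something_before:
  before_script:
    - echo 'something before'
    - echo 'something more before'
    # run whatever extra command the including project configured (empty by default)
    - eval "$EXTRA_BEFORE"

# project 1 .gitlab-ci.yml (or its CI/CD settings)
variables:
  EXTRA_BEFORE: "echo 'project 1 extra step'"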

Declaration and usage of an output variable in Azure DevOps

I'm creating a Continuous Integration pipeline that uses Bash script tasks in order to create the initial variables for runtime.
I have a variable I call datebuild, which is formatted as $(date +%Y%m%d_%H%M%S).
Currently I'm declaring it as a pipeline variable.
When I use the datebuild variable in the Bash#3 task, it is formatted successfully.
Afterwards I want to take the formatted output and use it in different tasks inside one agent job.
In the second task I need to copy a file to the Artifact Staging Directory: 20200423_141808 is the file and the Artifact Staging Directory is the destination directory, for example.
I've been reading that this can be done with a feature called output variables.
I created the reference name ref1, and in the task where I want to consume the output variable I'm using ref1.datebuild to access it.
I followed the documentation on output variables, but it doesn't seem to work.
I'm trying to understand what I'm missing.
You can take the formatted date and set it as a variable for the next steps in the job.
For example, in YAML pipeline:
variables:
  datebuild: '$(date +%Y%m%d_%H%M%S)'

steps:
  - bash: |
      formated="$(datebuild)"
      echo "##vso[task.setvariable variable=formatedDate]$formated"
  - bash: |
      echo $(formatedDate)
The second bash task's output is the formatted date, e.g. 20200423_141808.
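Building on that, a sketch of a follow-up step that uses the formatted date when copying to the Artifact Staging Directory (the source folder and file pattern are made up for illustration):

  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(Build.SourcesDirectory)/output'   # wherever the dated file is produced
      Contents: '$(formatedDate)*'                        # e.g. 20200423_141808.zip
      TargetFolder: '$(Build.ArtifactStagingDirectory)'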
