GitLab CI: Passing dynamic variables

I am looking to pass variable values dynamically, as shown below, to the terraform image mentioned in the link:
image:
  name: hashicorp/terraform:light
  entrypoint:
    - /usr/bin/env
    - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
    - 'ACCESS_KEY_ID=${ENV}_AWS_ACCESS_KEY_ID'
    - 'SECRET_ACCESS_KEY=${ENV}_AWS_SECRET_ACCESS_KEY'
    - 'DEFAULT_REGION=${ENV}_AWS_DEFAULT_REGION'
    - 'export AWS_ACCESS_KEY_ID=${!ACCESS_KEY_ID}'
    - 'export AWS_SECRET_ACCESS_KEY=${!SECRET_ACCESS_KEY}'
    - 'export AWS_DEFAULT_REGION=${!DEFAULT_REGION}'
However, I am getting empty values. How can I pass dynamic values to the variables?

The confusion arises from the subtle fact that the GitLab runner executes the commands in the script section using sh rather than bash.
The core issue is that the following syntax
'export AWS_ACCESS_KEY_ID=${!ACCESS_KEY_ID}'
is understood only by bash, not by sh.
Therefore, we need to work around it with syntax that sh understands.
For your case, something like the following should do it:
image:
  name: hashicorp/terraform:light
  entrypoint:
    - /usr/bin/env
    - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'

job:
  before_script:
    - ACCESS_KEY_ID=${ENV}_AWS_ACCESS_KEY_ID
    - export AWS_ACCESS_KEY_ID=$(eval echo \$$ACCESS_KEY_ID)
    - SECRET_ACCESS_KEY=${ENV}_AWS_SECRET_ACCESS_KEY
    - export AWS_SECRET_ACCESS_KEY=$(eval echo \$$SECRET_ACCESS_KEY)
    - DEFAULT_REGION=${ENV}_AWS_DEFAULT_REGION
    - export AWS_DEFAULT_REGION=$(eval echo \$$DEFAULT_REGION)
  script:
    - echo $AWS_ACCESS_KEY_ID
    - echo $AWS_SECRET_ACCESS_KEY
    - echo $AWS_DEFAULT_REGION
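For reference, here is the same indirection trick as a standalone POSIX sh sketch (the variable name and value are made up for illustration):

#!/bin/sh
# Hypothetical per-environment variable, as it would exist in the CI environment.
DEV_AWS_ACCESS_KEY_ID="AKIAEXAMPLE"
ENV=DEV
# Build the *name* of the target variable, then dereference it with eval,
# since ${!NAME} indirection is a bash feature that sh lacks.
ACCESS_KEY_ID=${ENV}_AWS_ACCESS_KEY_ID
export AWS_ACCESS_KEY_ID=$(eval echo \$$ACCESS_KEY_ID)
echo "$AWS_ACCESS_KEY_ID"   # prints AKIAEXAMPLE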

Related

GitLab CI/CD conditional 'when: manual'?

Is it possible for a GitLab CI/CD job to be triggered manually only under certain conditions that are evaluated based on the output of jobs earlier in the pipeline? I would like my 'terraform apply' job to run automatically if my infrastructure hasn't changed, or ideally to be skipped entirely, but to be triggered manually if it has.
My .gitlab-ci.yml file is below. I'm using OPA to set an environment variable to true or false when my infrastructure changes, but as far as I can tell, I can only include or exclude jobs when the pipeline is set up, based on e.g. git branch information, not at pipeline run time.
Thanks!
default:
  image:
    name: hashicorp/terraform:light
    entrypoint:
      - '/usr/bin/env'
  before_script:
    - echo ${AWS_PROFILE}
    - echo ${TF_ROOT}

plan:
  script:
    - cd ${TF_ROOT}
    - terraform init
    - terraform plan -var "profile=${AWS_PROFILE}" -out tfplan.binary
    - terraform show -json tfplan.binary > tfplan.json
  artifacts:
    paths:
      - ${TF_ROOT}/.terraform
      - ${TF_ROOT}/.terraform.lock.hcl
      - ${TF_ROOT}/tfplan.binary
      - ${TF_ROOT}/tfplan.json

validate:
  image:
    name: openpolicyagent/opa:latest-debug
    entrypoint: [""]
  script:
    - cd ${TF_ROOT}
    - /opa eval --format pretty --data ../../policy/terraform.rego --input tfplan.json "data.policy.denied"
    - AUTHORISED=`/opa eval --format raw --data ../../policy/terraform.rego --input tfplan.json "data.policy.authorised"`
    - echo INFRASTRUCTURE_CHANGED=`/opa eval --format raw --data ../../policy/terraform_infrastructure_changed.rego --input tfplan.json "data.policy.changed"` >> validate.env
    - cat validate.env
    - if [ $AUTHORISED == 'false' ]; then exit 1; else exit 0; fi
  artifacts:
    paths:
      - ${TF_ROOT}/.terraform
      - ${TF_ROOT}/.terraform.lock.hcl
      - ${TF_ROOT}/tfplan.binary
    reports:
      dotenv: ${TF_ROOT}/validate.env
  needs: ["plan"]

apply:
  script:
    - echo ${INFRASTRUCTURE_CHANGED}
    - cd ${TF_ROOT}
    - terraform apply tfplan.binary
  artifacts:
    paths:
      - ${TF_ROOT}/.terraform
      - ${TF_ROOT}/.terraform.lock.hcl
      - ${TF_ROOT}/tfplan.binary
  needs:
    - job: validate
      artifacts: true
  when: manual
  rules:
    - allow_failure: false

Import an external yaml file and use its values as environment variables in GitLab

I have a rest_config.yaml file which looks like this:
host: abcd
apiKey: abcd
secretKey: abcd
I want to import these into my .gitlab-ci.yaml file and use them as environment variables. How do I do so?
If your yaml file is part of the checked-out repository on which the gitlab-ci.yaml pipeline operates, said pipeline can read the file in a script: section, as I illustrated here.
That script: section can set environment variables.
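For illustration, a minimal sketch of that idea, assuming rest_config.yaml sits at the repository root and contains only flat key: value lines (the REST_* names are arbitrary; a YAML-aware tool like yq would be more robust than sed):

read_config:
  script:
    # Naive parsing of the flat rest_config.yaml from the checked-out repo.
    - export REST_HOST=$(sed -n 's/^host:[[:space:]]*//p' rest_config.yaml)
    - export REST_API_KEY=$(sed -n 's/^apiKey:[[:space:]]*//p' rest_config.yaml)
    - export REST_SECRET_KEY=$(sed -n 's/^secretKey:[[:space:]]*//p' rest_config.yaml)
    - echo "Host is $REST_HOST"

The exported values are then visible to later commands in the same job.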
And you can pass variables explicitly between jobs:
build:
  stage: build
  script:
    - VAR1=foo
    - VAR2=bar
    - echo export VAR1="${VAR1}" > $CI_PROJECT_DIR/variables
    - echo export VAR2="${VAR2}" >> $CI_PROJECT_DIR/variables
  artifacts:
    paths:
      - variables

test:
  stage: test
  script:
    - source $CI_PROJECT_DIR/variables
    - echo VAR1 is $VAR1
    - echo VAR2 is $VAR2

How to pass a variable to rules in a GitLab CI pipeline?

I want to use rules in my GitLab CI pipeline to check whether the commit came from the desired branch and whether the image I pushed to the Harbor registry has any fixable issues.
I push the image to the registry and scan it on Harbor, then fetch the scan results in earlier stages. Now I want to check whether the image has any fixable issues: if it does, I want that job to be manual but still leave the possibility to continue with the rest of the pipeline. If it doesn't (the issue count is absent from Harbor's API output), I just set that variable to 0 and want the pipeline to continue normally. The variable for fixable issues is called FIXABLE. I tried many ways to assign a value to this variable so that rules could read it, but none of them worked. I will post my latest attempt below so that anyone with an idea or advice can look at it. Any help would mean a lot to me. I know that rules are evaluated as soon as the pipeline itself is created, so at this moment I am not really sure how to deal with this.
Thanks in advance!
I have assigned the value 60 to the variable FINAL_FIXABLE to check whether the job would run manually.
The issue is that only the job "procession results (dev branch, case one)" runs, even though FINAL_FIXABLE is set to 60.
After I build and push the image, these are the pipeline stages related to this problem:
get results (dev branch):
  stage: Results of scanning image
  image: alpine
  variables:
    RESULTS: ""
    STATUS: ""
    SEVERITY: ""
    FIXABLE: ""
  before_script:
    - apk update && apk upgrade
    - apk --no-cache add curl
    - apk add jq
    - chmod +x ./scan-script.sh
  script:
    - 'RESULTS=$(curl -H "Authorization: Basic `echo -n ${HARBOR_USER}:${HARBOR_PASSWORD} | base64`" -X GET "https://myregistry/projects/myproject/artifacts/latest?page=1&page_size=10&with_tag=true&with_label=true&with_scan_overview=true&with_signature=true&with_immutable_status=true")'
    - STATUS=$(./scan-script.sh "STATUS" "$RESULTS")
    - SEVERITY=$(./scan-script.sh "SEVERITY" "$RESULTS")
    - FIXABLE=$(./scan-script.sh "FIXABLE" "$RESULTS")
    # - echo "$FIXABLE" > fixableValue.txt
    - echo "Printing the results of the image scanning process on Harbor registry:"
    - echo "status of scan:$STATUS"
    - echo "severity of scan:$SEVERITY"
    - echo "number of fixable issues:$FIXABLE"
    - echo "For more information of scan results please visit Harbor registry!"
    - FINAL_FIXABLE=$FIXABLE
    - echo $FINAL_FIXABLE
    - FINAL_FIXABLE="60"
    - echo $FINAL_FIXABLE
    - echo "$FINAL_FIXABLE" > fixableValue.txt
  only:
    refs:
      - dev
      - some-test-branch
  artifacts:
    paths:
      - fixableValue.txt

get results (other branches):
  stage: Results of scanning image
  dependencies:
    - prep for build (other branches)
  image: alpine
  variables:
    RESULTS: ""
    STATUS: ""
    SEVERITY: ""
    FIXABLE: ""
  before_script:
    - apk update && apk upgrade
    - apk --no-cache add curl
    - apk add jq
    - chmod +x ./scan-script.sh
  script:
    - LATEST_TAG=$(cat tags.txt)
    - echo "Latest tag is $LATEST_TAG"
    - 'RESULTS=$(curl -H "Authorization: Basic `echo -n ${HARBOR_USER}:${HARBOR_PASSWORD} | base64`" -X GET "https://myregistry/myprojects/artifacts/"${LATEST_TAG}"?page=1&page_size=10&with_tag=true&with_label=true&with_scan_overview=true&with_signature=true&with_immutable_status=true")'
    - STATUS=$(./scan-script.sh "STATUS" "$RESULTS")
    - SEVERITY=$(./scan-script.sh "SEVERITY" "$RESULTS")
    - FIXABLE=$(./scan-script.sh "FIXABLE" "$RESULTS")
    # - echo "$FIXABLE" > fixableValue.txt
    - echo "Printing the results of the image scanning process on Harbor registry:"
    - echo "status of scan:$STATUS"
    - echo "severity of scan:$SEVERITY"
    - echo "number of fixable issues:$FIXABLE"
    - echo "For more information of scan results please visit Harbor registry!"
    - FINAL_FIXABLE=$FIXABLE
    - echo $FINAL_FIXABLE
    - FINAL_FIXABLE="60"
    - echo $FINAL_FIXABLE
    - echo "$FINAL_FIXABLE" > fixableValue.txt
  only:
    refs:
      - master
      - /^(([0-9]+)\.)?([0-9]+)\.x/
      - rc
  artifacts:
    paths:
      - fixableValue.txt

procession results (dev branch, case one):
  stage: Scan results processing
  dependencies:
    - get results (dev branch)
  image: alpine
  script:
    - FINAL_FIXABLE=$(cat fixableValue.txt)
    - echo $CI_COMMIT_BRANCH
    - echo $FINAL_FIXABLE
  rules:
    - if: ($CI_COMMIT_BRANCH == "dev" || $CI_COMMIT_BRANCH == "some-test-branch") && ($FINAL_FIXABLE=="0")
      when: always

procession results (dev branch, case two):
  stage: Scan results processing
  dependencies:
    - get results (dev branch)
  image: alpine
  script:
    - FINAL_FIXABLE=$(cat fixableValue.txt)
    - echo $CI_COMMIT_BRANCH
    - echo $FINAL_FIXABLE
  rules:
    - if: ($CI_COMMIT_BRANCH == "dev" || $CI_COMMIT_BRANCH == "some-test-branch") && ($FINAL_FIXABLE!="0")
      when: manual
      allow_failure: true

procession results (other branch, case one):
  stage: Scan results processing
  dependencies:
    - get results (other branches)
  image: alpine
  script:
    - FINAL_FIXABLE=$(cat fixableValue.txt)
    - echo $CI_COMMIT_BRANCH
    - echo $FINAL_FIXABLE
  rules:
    - if: ($CI_COMMIT_BRANCH == "master" || $CI_COMMIT_BRANCH == "rc" || $CI_COMMIT_BRANCH =~ "/^(([0-9]+)\.)?([0-9]+)\.x/") && ($FINAL_FIXABLE=="0")
      when: always

procession results (other branch, case two):
  stage: Scan results processing
  dependencies:
    - get results (other branches)
  image: alpine
  script:
    - FINAL_FIXABLE=$(cat fixableValue.txt)
    - echo $CI_COMMIT_BRANCH
    - echo $FINAL_FIXABLE
  rules:
    - if: ($CI_COMMIT_BRANCH == "master" || $CI_COMMIT_BRANCH == "rc" || $CI_COMMIT_BRANCH =~ "/^(([0-9]+)\.)?([0-9]+)\.x/") && ($FINAL_FIXABLE!="0")
      when: manual
      allow_failure: true
You cannot use these methods to control whether jobs run with rules:, because rules are evaluated at pipeline creation time and cannot be changed once the pipeline is created.
Your best option to dynamically control pipeline configuration like this would probably be dynamic child pipelines.
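A minimal sketch of that approach (job names, stage names, and the child file name are illustrative; fixableValue.txt is assumed to come from the earlier stage's artifacts): a generator job writes a child pipeline definition whose when: value depends on the computed count, and a trigger job runs it.

generate-pipeline:
  stage: build
  script:
    # Decide at runtime whether the child job should be manual.
    - if [ "$(cat fixableValue.txt)" != "0" ]; then WHEN=manual; else WHEN=on_success; fi
    # Bake the decision into a generated child pipeline YAML.
    - |
      cat > child-pipeline.yml <<EOF
      process-results:
        script:
          - echo "fixable issues: $(cat fixableValue.txt)"
        when: $WHEN
      EOF
  artifacts:
    paths:
      - child-pipeline.yml

run-child:
  stage: deploy
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-pipeline
    strategy: depend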
As a side note, to set environment variables for subsequent jobs, you can use artifacts:reports:dotenv. When this special artifact is passed to subsequent stages/jobs, the variables in the dotenv file will be available in the job, as if they were set in the environment:
stages:
  - one
  - two

first:
  stage: one
  script: # create dotenv file with variables to pass
    - echo "VAR_NAME=foo" >> "myvariables.env"
  artifacts:
    reports: # create report to pass variables to subsequent jobs
      dotenv: "myvariables.env"

second:
  stage: two
  script: # variables from dotenv artifact will be in environment automatically
    - echo "${VAR_NAME}" # foo
You are doing basically the same thing with your .txt artifact, which works in effectively the same way, but the dotenv report needs fewer script steps. One key difference is that the dotenv approach allows somewhat more dynamic control, and it also applies to some other job configuration keys that use environment variables. For example, you can set environment:url dynamically this way:
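A sketch of that pattern (the variable name and URL are made up for illustration):

deploy:
  stage: deploy
  script:
    # Compute the URL at runtime and expose it via the dotenv report.
    - echo "DYNAMIC_ENVIRONMENT_URL=https://review-app.example.com" >> deploy.env
  artifacts:
    reports:
      dotenv: deploy.env
  environment:
    name: review
    url: $DYNAMIC_ENVIRONMENT_URL   # resolved from the dotenv report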

How to dump GitLab CI environment variables to a file

The question
How do I dump all GitLab CI environment variables (including variables set in the project or group CI/CD settings) to a file, but only them, without the environment variables of the host on which the GitLab runner executes?
Background
We are using GitLab CI/CD to deploy our projects to a Docker server. Each project contains a docker-compose.yml file which uses various environment variables, e.g. DB passwords. We are using a .env file to store these variables, so one can start/restart the containers after deployment from the command line, without accessing GitLab.
Our deployment script looks something like this:
deploy:
  script:
    #...
    - cp docker-compose.development.yml ${DEPLOY_TO_PATH}/docker-compose.yml
    - env > variables.env
    - docker-compose up -d
    #...
And the docker-compose.yml file looks like this:
version: "3"
services:
project:
image: some/image
env_file:
- variables.env
...
The problem is that the resulting .env file contains both the GitLab variables and the host system's environment variables, and as a result the PATH variable inside the container is overwritten.
I have developed a workaround with grep:
env | grep -Pv "^PATH" > variables.env
It has kept this working for now, but I think the problem might hit us again with other variables that are set to different values inside a container and on the host system.
I know I can list all the variables in docker-compose and similar files, but we already have quite a few of them across several projects, so that is not a solution.
You need to add the following command to your script:
script:
  ...
  # Read certificate stored in $KUBE_CA_PEM variable and save it in a new file
  - echo "$KUBE_CA_PEM" > variables.env
  ...
This might be late, but I did something like this:
script:
  - env | grep -v "CI" | grep -v "FF" | grep -v "GITLAB" | grep -v "PWD" | grep -v "PATH" | grep -v "HOME" | grep -v "HOST" | grep -v "SH" > application.properties
  - cat application.properties
It's not the best, but it works.
The one problem with this is that a variable can be dropped just because its line contains one of the exclusion strings, i.e. "CI", "FF", "GITLAB", "PWD", "PATH", "HOME", "HOST", "SH".
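One way to reduce those false positives is to anchor the exclusions to the start of the variable name with a single extended regex (a sketch; extend the alternation as needed):

script:
  # Exclude only variables whose *names* begin with the listed prefixes,
  # instead of dropping any line that merely contains the substring.
  - env | grep -Ev '^(CI|FF|GITLAB|PWD|PATH|HOME|HOST|SH)[A-Za-z0-9_]*=' > application.properties
  - cat application.properties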
My reusable solution /tools/gitlab/script-gitlab-variables.yml:
variables:
  # Default values
  GITLAB_EXPORT_ENV_FILENAME: '.env.gitlab.cicd'

.script-gitlab-variables:
  debug:
    # section_start
    - echo -e "\e[0Ksection_start:`date +%s`:gitlab_variables_debug[collapsed=true]\r\e[0K[GITLAB VARIABLES DEBUG]"
    # command
    - env
    # section_end
    - echo -e "\e[0Ksection_end:`date +%s`:gitlab_variables_debug\r\e[0K"
  export-to-env:
    # section_start
    - echo -e "\e[0Ksection_start:`date +%s`:gitlab_variables_export_to_env[collapsed=true]\r\e[0K[GITLAB VARIABLES EXPORT]"
    # verify mandatory variables
    - test ! -z "$GITLAB_EXPORT_VARS" && echo "$GITLAB_EXPORT_VARS" || exit $?
    # display variables
    - echo "$GITLAB_EXPORT_ENV_FILENAME"
    # command
    - env | grep -E "^($GITLAB_EXPORT_VARS)=" > $GITLAB_EXPORT_ENV_FILENAME
    # section_end
    - echo -e "\e[0Ksection_end:`date +%s`:gitlab_variables_export_to_env\r\e[0K"
  cat-env:
    # section_start
    - echo -e "\e[0Ksection_start:`date +%s`:gitlab_variables_cat-env[collapsed=true]\r\e[0K[GITLAB VARIABLES CAT ENV]"
    # command
    - cat $GITLAB_EXPORT_ENV_FILENAME
    # section_end
    - echo -e "\e[0Ksection_end:`date +%s`:gitlab_variables_cat-env\r\e[0K"
How to use it in .gitlab-ci.yml:
include:
  - local: '/tools/gitlab/script-gitlab-variables.yml'
Your Job:
variables:
  GITLAB_EXPORT_VARS: 'CI_BUILD_NAME|GITLAB_USER_NAME'
script:
  - !reference [.script-gitlab-variables, debug]
  - !reference [.script-gitlab-variables, export-to-env]
  - !reference [.script-gitlab-variables, cat-env]
Result of cat .env.gitlab.cicd:
CI_BUILD_NAME=Demo
GITLAB_USER_NAME=Benjamin
If you need to dump everything:
# /tools/gitlab/script-gitlab-variables.yml
dump-all:
  - env > $GITLAB_EXPORT_ENV_FILENAME

# .gitlab-ci.yml
script:
  - !reference [.script-gitlab-variables, dump-all]
I hope I could help.

GitLab: send current date in output

I have a gitlab-ci.yml file. After each step, I'd like to send an output via REST containing the current date. Just sending output via REST works, but I have difficulties passing in the current date. I'm currently solving it as below (by exporting a variable):
image:
  name: hashicorp/terraform:light
  entrypoint:
    - '/usr/bin/env'
    - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'

before_script:
  - apk add curl
  - export mydate = $(date -I)

stages:
  - validate
  - plan
  - apply

validate:
  stage: validate
  script:
    - terraform validate
    - <curl request>
  variables:
    msg: "$mydate => Validation complete, moving on"

plan:
  stage: plan
  script:
    - terraform plan -out "planfile"
  variables:
    msg: "$mydate => Planning complete, moving on"
  dependencies:
    - validate
$ export mydate = $(date -I)
/bin/sh: export: line 97: : bad variable name
Whatever variable name I choose, I always get this error message.
That's because you have a space in your variable name.
Instead of writing export mydate = $(date -I), write export mydate=$(date -I).
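As a sketch of the corrected pattern, with the message assembled inside script:, where the exported shell variable is visible (the curl request itself is elided as in the question). Note that values under variables: are resolved from CI variables when the job starts, so a variable exported in before_script may not be visible there; building the message in the script avoids that:

validate:
  stage: validate
  script:
    - export mydate=$(date -I)
    - terraform validate
    # Build the message here so the shell expands $mydate at run time.
    - msg="$mydate => Validation complete, moving on"
    - echo "$msg"   # send via <curl request>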
