I'm trying to pass about ten custom variables to other pipeline jobs through dotenv without writing ten "- echo Var=$Var" lines. The custom variables also have different values depending on the environment (Dev, Test, Prod, etc.; about eight environments in total).
From my understanding I can use child jobs to differentiate between environments, and use the extends functionality as a config file to declare the variables for each environment. The problem is that the logic below becomes very cumbersome once I add all the variables for all the environments.
Example code:
# Separate file
.environment:Test:
  variables:
    CustomVar_ServerFQDN: 'TestHost'
    CustomVar_Datacenter: 'TestDatacenter'

.environment:Dev:
  variables:
    CustomVar_ServerFQDN: 'DevHost'
    CustomVar_Datacenter: 'DevDatacenter'
# Prep file
EnvironmentPrep:Dev:
  stage: build
  extends: .environment:Dev
  script:
    - echo "ServerFQDN=$CustomVar_ServerFQDN" >> build.env
    - echo "Datacenter=$CustomVar_Datacenter" >> build.env
    - echo "GitLab CI/CD | Print all environment variables"
    - env
  artifacts:
    reports:
      dotenv: build.env
  rules:
    - if: '$Environment == "Dev"'
EnvironmentPrep:Test:
  stage: build
  extends: .environment:Test
  script:
    - echo "ServerFQDN=$CustomVar_ServerFQDN" >> build.env
    - echo "Datacenter=$CustomVar_Datacenter" >> build.env
    - echo "GitLab CI/CD | Print all environment variables"
    - env
  artifacts:
    reports:
      dotenv: build.env
  rules:
    - if: '$Environment == "Test"'
# Main job file
MainJob:
  stage: deploy
  script:
    - echo "Main Environment variable is [$Environment] [$Server]"
    - echo "GitLab CI/CD | Print all environment variables"
    - env
  variables:
    Environment: 'Test'
With this logic, each environment job needs ten "- echo Var=$Var" lines, for a total of 80 in the merged YAML.
Is there a more effective way to make sure MainJob gets the custom variables produced by a build-stage job?
I'm looking for something like the example below, but I'm not sure how to build the logic, or how to filter on only the custom variables matching "CustomVar_*", for the variable $scriptBlock:
# Separate file
.environment:Test:
  variables:
    CustomVar_ServerFQDN: 'TestHost'
    CustomVar_Datacenter: 'TestDatacenter'

.environment:Dev:
  variables:
    CustomVar_ServerFQDN: 'DevHost'
    CustomVar_Datacenter: 'DevDatacenter'
EnvironmentPrep:Dev:
  stage: build
  extends: .environment:Dev
  script:
    - $scriptBlock >> build.env
  artifacts:
    reports:
      dotenv: build.env
  rules:
    - if: '$Environment == "Dev"'

EnvironmentPrep:Test:
  stage: build
  extends: .environment:Test
  script:
    - $scriptBlock >> build.env
  artifacts:
    reports:
      dotenv: build.env
  rules:
    - if: '$Environment == "Test"'
MainJob:
  stage: deploy
  script:
    - echo "Main Environment variable is [$Environment] [$Server]"
    - echo "GitLab CI/CD | Print all environment variables"
    - env
  variables:
    Environment: 'Test'
    scriptBlock: <SOME FILTER LOGIC ON "CustomVar_*">
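One possible shape for that filter (my sketch, not a confirmed solution): since the CustomVar_* values from the extends block are exported into the job's environment, a single shell line can dump every matching variable into build.env. The prefix-stripping sed is optional and assumes no value itself begins a line with "CustomVar_":

```shell
#!/bin/sh
# Stand-ins for the variables GitLab would inject from the .environment:* block
export CustomVar_ServerFQDN='TestHost'
export CustomVar_Datacenter='TestDatacenter'

# Collect every CustomVar_* variable into build.env,
# dropping the CustomVar_ prefix so the dotenv keys stay short
env | grep '^CustomVar_' | sed 's/^CustomVar_//' > build.env
cat build.env
```

Inside the job this collapses to one script line, e.g. `- env | grep '^CustomVar_' | sed 's/^CustomVar_//' >> build.env`, which replaces the ten echo lines per environment.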
GitLab version: 14.3.3
I cannot get all of the jobs to be created and run in the pipeline; the running-pipelines page just loads indefinitely.
I'm expecting:
On an MR:
build:merge-request-pipeline-dynamic_script
build:shared-pipeline-dump-variables
test:shared-pipeline-test-job-artifacting-always
test:shared-pipeline-test-job-artifacting-always-multi
On main:
build:main-pipeline-dynamic_script
build:shared-pipeline-dump-variables
test:shared-pipeline-test-job-artifacting-always
test:shared-pipeline-test-job-artifacting-always-multi
# .gitlab-ci.yml
image: ruby:2.3

# Global rules on all pipelines
workflow:
  rules:
    # Mask and prevent builds on noci commit SHAs
    - if: $CI_COMMIT_TITLE =~ /^noci/
      when: never

# Specific pipelines with specific rules
include:
  - local: main-pipeline.yml
    rules:
      # Any commits to `main`, i.e. we have merged something in
      - if: $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == $CI_DEFAULT_BRANCH
  - local: merge-request-pipeline.yml
    rules:
      # Merge requests going in will build
      - if: $CI_PIPELINE_SOURCE == "merge_request_event"
  # Catch-all pipeline that will just build everything that we want all the time
  # NB: This will obviously only be included if we pass the global workflow check
  - local: shared-pipeline.yml
# main-pipeline.yml
variables:
  BUILD_TYPE: commit to main

main-pipeline-dynamic_script:
  stage: build
  script:
    - echo "We're running directly in main!"
    - echo "USER VARIABLE LIST (SHOULD BE PRESENT) - BUILD_TYPE - $BUILD_TYPE"
# merge-request-pipeline.yml
variables:
  BUILD_TYPE: merge-request

merge-request-pipeline-dynamic_script:
  stage: build
  script:
    - echo "We're running on a merge request!"
    - echo "USER VARIABLE LIST (SHOULD BE PRESENT) - BUILD_TYPE - $BUILD_TYPE"
# shared-pipeline.yml
variables:
  NO_DEFAULT:
    description: This has no default. It should interpolate as blank.
  WITH_DEFAULT:
    description: This has a default. It should interpolate as default.
    value: default

shared-pipeline-dump-variables:
  stage: build
  script:
    # ... removed, not important

shared-pipeline-test-job-artifacting-always:
  stage: test
  script:
    - # redundant

shared-pipeline-test-job-artifacting-always-multi:
  stage: test
  script:
    - echo "This job will artifact logs"
    - echo "log1" > log1.log
    - echo "log2" > log2.log
    - # removed redundant
  artifacts:
    paths:
      - log1.log
      - log2.log
      - log3.log
      - log/log4.log
      - log/log5.log
      - log/log6doesnotexist.log
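One likely cause worth noting (my observation, not from the original post): when `workflow: rules` contains only a `when: never` rule, any pipeline that does not match that rule matches no rule at all, and GitLab then does not create the pipeline. A trailing catch-all rule lets every non-noci commit through; a sketch:

```yaml
workflow:
  rules:
    # Mask and prevent builds on noci commit SHAs
    - if: $CI_COMMIT_TITLE =~ /^noci/
      when: never
    # Catch-all: create the pipeline for everything else
    - when: always
```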
I have a variable stored in an env file:
stages:
  - build
  - execute

build:
  stage: build
  script:
    - Add-Content properties.env -Value PROD="TRUE"
  artifacts:
    reports:
      dotenv: properties.env
  tags:
    - windows
Now I want to read this value in a later stage and decide whether to execute that stage or not.
I did the following, but it isn't working:
execute:
  stage: execute
  rules:
    - if: $PROD == "TRUE"
      when: always
  script:
    - echo "happy"
  tags:
    - windows
  dependencies:
    - build
Any help?
Thanks.
You can try and use the approach described in "Pass an environment variable to another job":
build:
  stage: build
  script:
    - echo "BUILD_VERSION=hello" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy_one:
  stage: deploy
  script:
    - echo "$BUILD_VERSION"  # Output is: 'hello'
  dependencies:
    - build
  environment:
    name: customer1
    deployment_tier: production
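One caveat worth adding (my note, not part of the quoted answer): `rules:` are evaluated when the pipeline is created, before the build job has run, so a dotenv variable such as $PROD cannot gate job creation. It is available inside later jobs' scripts, so the check can move into the script itself. A sketch, assuming a PowerShell-based Windows runner:

```yaml
execute:
  stage: execute
  script:
    # $env:PROD comes from the dotenv artifact of the build job at runtime
    - if ($env:PROD -ne "TRUE") { echo "not prod, skipping"; exit 0 }
    - echo "happy"
  tags:
    - windows
  dependencies:
    - build
```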
I'm developing a pipeline in GitLab CI; in the first job I use gittools/gitversion to obtain the semantic version of my software.
Here is a small piece of /gitversion-ci-cd-plugin-extension.gitlab-ci.yml (full documentation here: https://gitversion.net/docs/reference/build-servers/gitlab):
.gitversion_function:
  image:
    name: gittools/gitversion
    entrypoint: ['']
  stage: .pre
  .....
  .....
  artifacts:
    reports:
      # propagates variables into the pipeline level
      dotenv: thisversion.env
Then a simplified version of my pipeline is as follows
stages:
  - .pre
  - install_dependencies
  - build
  - deploy

include:
  - local: '/gitversion-ci-cd-plugin-extension.gitlab-ci.yml'

determineversion:
  extends: .gitversion_function

install_dependencies:
  image: node:16.14
  stage: install_dependencies
  script:
    - echo ${PACKAGE_VERSION}

build:
  image: node:16.14
  stage: build
  script:
    - echo $PACKAGE_VERSION

deploy:
  image: bitnami/kubectl
  stage: deploy
  needs: ['build']
  script:
    - echo $PACKAGE_VERSION
The problem is that the environment variable $PACKAGE_VERSION works in the first two jobs, install_dependencies and build:
echo $PACKAGE_VERSION  # 0.0.1
But when the deploy job executes, the variable is not expanded by the pipeline and I literally obtain this:
echo $PACKAGE_VERSION  # $PACKAGE_VERSION
I found the problem.
In the last job of my pipeline I use needs (https://docs.gitlab.com/ee/ci/yaml/#needs) to establish dependencies between jobs.
The artifact is not passed automatically because there is no dependency between determineversion and deploy. To fix it I do this:
...
deploy:
  image: bitnami/kubectl
  stage: deploy
  needs: ['determineversion', 'build']  # <------
  script:
    - echo $PACKAGE_VERSION
...
I added determineversion to deploy's needs; this way $PACKAGE_VERSION is printed correctly.
I currently have two jobs in my CI file which are nearly identical.
The first is for manually compiling a release build from any git branch.
deploy_internal:
  stage: deploy
  script: ....<deploy code>
  when: manual
The second is to be used by the scheduler to release a daily build from develop branch.
scheduled_deploy_internal:
  stage: deploy
  script: ....<deploy code from deploy_internal copy/pasted>
  only:
    variables:
      - $MY_DEPLOY_INTERNAL != null
It feels wrong to have all that deploy code repeated in two places, and it gets worse: there are also deploy_external, deploy_release, and scheduled variants.
My question:
Is there a way that I can combine deploy_internal and scheduled_deploy_internal such that the manual/scheduled behaviour is retained (DRY basically)?
Alternatively: Is there is a better way that I should structure my jobs?
Edit:
Original title: Deploy job. Execute manually except when scheduled
You can use YAML anchors and aliases to reuse the script.
deploy_internal:
  stage: deploy
  script:
    - &deployment_scripts |
      echo "Deployment Started"
      bash command 1
      bash command 2
  when: manual

scheduled_deploy_internal:
  stage: deploy
  script:
    - *deployment_scripts
  only:
    variables:
      - $MY_DEPLOY_INTERNAL != null
Or you can use the extends keyword.

.deployment_script:
  script:
    - echo "Deployment started"
    - bash command 1
    - bash command 2

deploy_internal:
  extends: .deployment_script
  stage: deploy
  when: manual

scheduled_deploy_internal:
  extends: .deployment_script
  stage: deploy
  only:
    variables:
      - $MY_DEPLOY_INTERNAL != null
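On GitLab 13.9 and later, the `!reference` custom YAML tag is a further option (my addition, not part of the original answers): it pulls the script lines from a hidden job into another job's script, and unlike extends it can be combined with extra job-specific lines. A sketch:

```yaml
.deployment_script:
  script:
    - echo "Deployment started"
    - bash command 1

deploy_internal:
  stage: deploy
  script:
    # reuse the shared lines, then append a job-specific step
    - !reference [.deployment_script, script]
    - echo "manual-only extra step"
  when: manual
```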
Use GitLab's default section containing a before_script:

default:
  before_script:
    - ....<deploy code>

job1:
  stage: deploy
  script: ....<code after the deploy>

job2:
  stage: deploy
  script: ....<code after the deploy>
Note: the default section fails to function as such if you try to execute a job locally with the gitlab-runner exec command - use YAML anchors instead.
I need some direction here. I'm reading whatever documentation I can find online, but it's not hitting the right synapses, or I haven't found the right link yet. On a merge request to a deployable environment, I want to kick off a build on two separate machines. Both machines are IBM i systems running different versions of the OS. I'd like these builds and their subsequent deploys to happen independently of each other.
My .yml file has entries for the builds on the two machines (QQDEV and BNADEV), but the builds occur sequentially, not in parallel. The picture below is what GitLab draws.
From that picture, it looks like both build_BNADEV and build_QQDEV gate the deploy jobs DEV_BNADEV and DEV_QQDEV. I want build_BNADEV to feed DEV_BNADEV only, and so on; that is a separate issue aside from the parallel builds.
What do I need here? Another runner? Another pipeline? I'm just looking for general pointers and direction.
Here is my YAML.
stages:
  - build
  - deploy

build_QQDEV:
  variables:
    THING: "This is a THING for build for QQDEV"
  script:
    - "bash ./GitLabCI/GitLabCI.Build.sh qqdev"
  stage: build
  only:
    - DEV
    - QA
    - UAT
    - PROD
build_BNADEV:
  variables:
    THING: "This is a THING for build for BNADEV"
  script:
    - "bash ./GitLabCI/GitLabCI.Build.sh bnadev"
  stage: build
  only:
    - DEV
    - QA
DEV_QQDEV:
  variables:
    THING: "This is a THING for deploy_DEV_QQDEV"
    ASPGRP: "*NONE"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh QQDEV EPDEV1_5 /home/quikq/1.5/dev"
  stage: deploy
  environment:
    name: DEV
  only:
    - DEV

DEV_BNADEV:
  variables:
    THING: "This is a THING for deploy_DEV_BNADEV"
    REBUILD_DEPLOYMENT: "0"
    ASPGRP: "DATADEV"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh BNADEV EPDEV1_5 /home/quikq/1.5/dev"
  stage: deploy
  environment:
    name: DEV
  only:
    - DEV
QA_QQDEV:
  variables:
    THING: "This is a THING for deploy_QA_QQDEV"
    ASPGRP: "*NONE"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh QQDEV EPQA1_5 /home/quikq/1.5/qa"
  stage: deploy
  environment:
    name: QA
  only:
    - QA

QA_BNADEV:
  variables:
    THING: "This is a THING for deploy_QA_BNADEV"
    REBUILD_DEPLOYMENT: "0"
    ASPGRP: "DATADEV"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh BNADEV EPQA1_5 /home/quikq/1.5/qa"
  stage: deploy
  environment:
    name: QA
  only:
    - QA
UAT_QQ:
  variables:
    THING: "This is a THING for deploy_UAT_QQ"
    ASPGRP: "*NONE"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh QQ EPUAT1_5 /home/quikq/1.5/uat"
  stage: deploy
  environment:
    name: UAT
  only:
    - UAT

UAT_QQBNA:
  variables:
    THING: "This is a THING for deploy_UAT_QQBNA"
    ASPGRP: "*NONE"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh QQBNA EPUAT1_5 /home/quikq/1.5/uat"
  stage: deploy
  environment:
    name: UAT
  only:
    - UAT
PROD_QQ:
  variables:
    THING: "This is a THING for deploy_PROD_QQ"
    ASPGRP: "*NONE"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh QQ EPPROD1_5 /home/quikq/1.5/prod"
  stage: deploy
  environment:
    name: PROD
  only:
    - PROD

PROD_QQBNA:
  variables:
    THING: "This is a THING for deploy_PROD_QQBNA"
    ASPGRP: "*NONE"
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh QQBNA EPPROD1_5 /home/quikq/1.5/prod"
  stage: deploy
  environment:
    name: PROD
  only:
    - PROD
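One direction worth sketching (my addition, not from the original post): `needs:` builds a job DAG, so each deploy is tied to its own build and starts as soon as that specific build finishes, decoupling the two machine tracks. Note that if the builds themselves run sequentially, the likely cause is a single runner with a concurrency limit of 1; `needs:` fixes the dependency wiring but cannot add runner capacity. A sketch for the DEV tier:

```yaml
build_QQDEV:
  stage: build
  script:
    - "bash ./GitLabCI/GitLabCI.Build.sh qqdev"

build_BNADEV:
  stage: build
  script:
    - "bash ./GitLabCI/GitLabCI.Build.sh bnadev"

DEV_QQDEV:
  stage: deploy
  needs: ["build_QQDEV"]   # waits only for its own machine's build
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh QQDEV EPDEV1_5 /home/quikq/1.5/dev"

DEV_BNADEV:
  stage: deploy
  needs: ["build_BNADEV"]  # independent of the QQDEV track
  script:
    - "bash ./GitLabCI/GitLabCI.Deploy.sh BNADEV EPDEV1_5 /home/quikq/1.5/dev"
```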