I am new to GitLab CI/CD settings. I don't want to put sensitive credentials (like API keys, passwords, ...) into my repository. For this purpose, GitLab (like other CI/CD services) lets you set environment variables.
What I have done so far:
Via the UI, go to Project ⇒ Settings ⇒ CI/CD ⇒ Variables and add them like this:
[screenshot: variables added under Project ⇒ Settings ⇒ CI/CD ⇒ Variables]
Then, in the application, try to read the file with all your config values (e.g. with dotenv):
require("dotenv");
module.exports = process.env.NODE_ENV.trim() === "production" ? _config.production : _config.development;
The current .gitlab-ci.yml file is:
image: node:8.9.0

cache:
  paths:
    - node_modules/

stages:
  - ver
  - init
  - test
  - build
  - deploy

ver:
  stage: ver
  script:
    - node -v

init:
  stage: init
  script:
    - npm install
  tags:
    - dev_server
  only:
    - dev
  variables:
    ENV_PRODUCTION: "/builds/AkdiD/8/abcde/projectName/ENV_PRODUCTION"

test:
  stage: test
  script:
    - npm test

build:
  stage: build
  script:
    - echo "BUILD_VERSION=production" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - npm run killcurrent
    - echo $ENV_PRODUCTION
    - echo $BUILD_VERSION
    - npm run staging
  tags:
    - dev_server
  only:
    - dev
Question: where do I need to put this ENV_PRODUCTION file name (in the YAML file or somewhere else) so that the server picks up that value?
I edited the variable like this, but the server is still not fetching these variables. Should I change or add something in the .gitlab-ci.yml file?
Setting up a custom environment variable of type File (GitLab 11.11+) does not seem to be the way to reference/set a list of variables, including ones with sensitive information.
A variable of type file is generally there to represent, for instance, a certificate.
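For illustration, a minimal sketch of how a file-type variable behaves (reusing the ENV_PRODUCTION name from the question; the runner writes the variable's content to a temporary file and puts that file's path into the variable):

deploy:
  stage: deploy
  script:
    # $ENV_PRODUCTION is a *path* to a temp file holding the content,
    # so copy it to wherever the app expects its .env file
    - cp "$ENV_PRODUCTION" .env
    - npm run staging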
You should instead define regular variables, possibly group-level environment variables:
You can define per-project or per-group variables that are set in the pipeline environment.
Group-level variables are stored out of the repository (not in .gitlab-ci.yml) and are securely passed to GitLab Runner, which makes them available during a pipeline run.
For Premium users who do not use an external key store or who use GitLab’s integration with HashiCorp Vault, we recommend using group environment variables to store secrets like passwords, SSH keys, and credentials.
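As a minimal sketch (assuming a variable named DB_PASSWORD defined in the UI at project or group level; the deploy.sh script is hypothetical), the variable is simply present in the job's environment and nothing has to be declared in .gitlab-ci.yml:

deploy:
  stage: deploy
  script:
    # DB_PASSWORD is injected by the runner from Settings ⇒ CI/CD ⇒ Variables
    # and is never stored in the repository
    - ./deploy.sh --password "$DB_PASSWORD"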
Related
I am new to GitLab CI/CD and I am looking for a way to pass environment variables to my NestJS application deployed to Heroku.
This is my .gitlab-ci.yml file:
...
image: node:latest

before_script:
  - apt-get update -qy
  - apt-get install -y ruby-dev
  - gem install dpl

stages:
  - testing
  - staging

testing:
  stage: testing
  image: salesforce/salesforcedx:latest-slim
  script:
    - accessToken=accessToken
    - echo TEST_ACCESS_TOKEN=${accessToken} > .env.test
    - echo dummmy=test >> .env
    - echo dummmyWithQuotes=test >> ".env"
  only:
    - staging
    - main

staging:
  stage: staging
  image: ruby:latest
  script:
    - dpl --provider=heroku --app=$HEROKU_APP_STAGING --api-key=$HEROKU_API_KEY
  only:
    - staging
This is my app controller, used to test the deployed result:
@Get()
getHello() {
  const temoin = this.configService.get('PROD_LOGIN_URL');
  const tested = this.configService.get('TEST_ACCESS_TOKEN');
  const tested2 = this.configService.get('dummmy');
  const tested3 = this.configService.get('dummmyWithQuotes');
  return {
    temoin,
    tested,
    tested2,
    tested3,
  };
}
Of course I have the following in the app module
ConfigModule.forRoot({
  isGlobal: true,
  envFilePath: ['.env', '.env.test'],
}),
What I get in the response from the deployed application is the following:
{"temoin":"https://login.salesforce.com"}
I think this has to do with the Docker image. My guess is that the files are created in the Docker container and stay in the GitLab job context (but I'm not sure).
Edit:
I added an ls -a to the staging job's script, and there is no .env.test file.
Edit 2:
I added the .env.test and .env files to the job's artifacts and they became available to the staging job. But when deploying the application with the dpl command, .env.test is not present in the Heroku application files.
PS: I forgot to mention that .env.test isn't present in the Git project; it's created in the pipeline.
It is not completely clear to me what you want to accomplish, but I think you would like the .env file to be available in the staging job.
Pass it by adding it as a job artifact
testing:
  stage: testing
  image: salesforce/salesforcedx:latest-slim
  script:
    - accessToken=accessToken
    - echo TEST_ACCESS_TOKEN=${accessToken} > .env.test
    - echo dummmy=test >> .env
    - echo dummmyWithQuotes=test >> ".env"
  artifacts:
    paths:
      - .env
  only:
    - staging
    - main
Another option, if you would like dynamically generated variables to be available to subsequent jobs, would be to use the built-in dotenv report.
artifacts:
  reports:
    dotenv: .env
This will make any vars in the .env file available in the environment of subsequent jobs as the dotenv report gets loaded into the environment automatically.
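For example, a later job could then read those values directly (a minimal sketch; the job name is made up, the variable comes from the snippets above):

use_vars:
  stage: staging
  script:
    # dummmy was written to .env by the testing job and is injected here
    # automatically via the dotenv report
    - echo "$dummmy"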
After adding the artifact to make the .env and .env.test files available to the staging job, as Benjamin said, I found that those files are not getting deployed with the application.
And the reason is that dpl performs a cleanup (stash all) before deploying.
I found this issue that adds the --skip_cleanup flag to dpl to avoid the cleanup.
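For reference, the deploy line then becomes (the same command as above, plus the flag):

- dpl --provider=heroku --app=$HEROKU_APP_STAGING --api-key=$HEROKU_API_KEY --skip_cleanup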
Works like a charm.
I need to pass a file path to a trigger job where the file path is found within a specified json file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install -y jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround to do this, as using extends won't work since GitLab won't allow a script section in a trigger job.
I thought about trying to use the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which from my understanding is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv for setting variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME) # in the script, extract the file path from the JSON file
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # add the value to a dotenv file
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be present for jobs in subsequent stages (or that declare needs: for the job).
You can also pull artifacts into child pipelines, in general.
But be warned: you probably don't want to override the PATH variable, since that's a special variable used to help you find built-in binaries.
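If the downstream pipeline needs the value, a sketch under the same assumptions (TARGET_FILE_PATH is a made-up name chosen to avoid clobbering PATH, and this relies on your GitLab version making dotenv variables available to trigger jobs):

another_job:
  stage: two
  variables:
    TARGET_FILE_PATH: $FILE_PATH # forwarded to the downstream pipeline
  trigger:
    project: my/project
    branch: staging
    strategy: depend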
We have multiple environments (staging, production, ...) and I don't want to put sensitive information like database passwords inside the codebase. For this, I want to use environment variables provided by GitLab CI/CD.
However I don't know how to tell GitLab to run a different set of variables depending on my environment.
What I've done so far:
1. Create environments: via the UI (Project => Operations => Environments), I created two environments, STAGING and PRODUCTION.
2. Create variables: via the UI (Project => Settings => CI/CD => Variables), I created the variable DB_PASSWORD twice (with, of course, different values assigned), one with environment scope set to STAGING, the other to PRODUCTION.
Now what I want to do is run my project's pipeline. So I go to Project => CI/CD => Pipelines => Run Pipeline, and here I expect GitLab CI to ask me whether I would like to run my pipeline with the set of variables for STAGING or for PRODUCTION, but it doesn't.
How am I supposed to tell GitLab that I want to run my pipeline using the DB_PASSWORD variable with the value corresponding to the environment I want to target?
You need to specify the environment in your .gitlab-ci.yml file.
Example from the official GitLab docs:
stages:
  - test
  - build
  - deploy

test:
  stage: test
  script: echo "Running tests"

build:
  stage: build
  script: echo "Building the app"

deploy_staging:
  stage: deploy
  script:
    - echo "Deploy to staging server"
  environment:
    name: staging
    url: https://staging.example.com
  only:
    - master
In this example, when running deploy_staging, the environment is set to staging, so you can access the variables defined for that environment, like so:
deploy_staging:
  stage: deploy
  script:
    - echo "Deploy to staging server"
  environment:
    name: staging
    url: https://staging.example.com
  variables:
    DB_PASS: ${DB_PASSWORD} # which is your defined variable within GitLab CI
  only:
    - master
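A production counterpart would look the same, just tied to the other environment (a sketch; the URL is a placeholder):

deploy_production:
  stage: deploy
  script:
    - echo "Deploy to production server"
  environment:
    name: production # picks up the DB_PASSWORD scoped to PRODUCTION
    url: https://production.example.com
  only:
    - master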
I am importing some secrets from Azure Key Vault into a variable group for a CI/CD pipeline.
I am able to map the required secrets into the variable group from Key Vault using the Azure DevOps UI.
In my pipeline YAML, I am able to read and print those variable-group variables that are Azure Key Vault secrets.
trigger:
  - dev

# define the VM image
pool:
  vmImage: "Ubuntu 16.04"

# define variables to use during the build
variables:
  - group: SecretVarGroup # it has the key vault variable 'KV_API_KEY'
  - group: PublicVarGroup # it has a variable 'API_CLIENTID'

# define the step to export the key to an env variable
steps:
  - script: echo $MYSECRETAPIKEY
    env:
      MYSECRETAPIKEY: $(KV_API_KEY)

  ## Run the npm build
  - script: |
      npm run build
    displayName: "npm build"
I am able to see the value of the 'KV_API_KEY' secret printed as *** in the build output log, which I assume means it is being consumed. I also see the value of API_CLIENTID printed in the build log as well as in the Node.js process.env object.
I was assuming the variable "MYSECRETAPIKEY" would be available in my Node.js process.env object, but it's not.
The way I tested it: in my Node.js project's build config, I have a print statement that prints the process.env object. It printed all the environment variables of the pipeline build agent, including my PublicVarGroup variable 'API_CLIENTID', but I don't see my secret variable 'MYSECRETAPIKEY' in the process.env object.
env:
  MYSECRETAPIKEY: $(KV_API_KEY)
I thought the lines above would export the variable to the language-specific process environment, but they do not. How can I fix this?
# define the step to export the key to an env variable
steps:
  ## Run the npm build
  - script: |
      npm run build
    displayName: "npm build"
    env:
      MYSECRETAPIKEY: $(KV_API_KEY)
It looks like secrets are scoped on the agent to the individual tasks and scripts that use them. The issue was that I had the env: declaration on a separate ad-hoc task. Moving it onto the script step itself, as in the code above, fixed the issue.
I run an end-to-end test in GitLab CI; see https://docs.cypress.io/guides/guides/continuous-integration.html.
I run it after I deploy my app.
It works well, but I want to change the base URL in order to run it against my prod or my staging environment. This is possible via an environment variable passed to the test.
I don't want to write one test job per environment, so I would like to get the environment URL via an env var, but $CI_ENVIRONMENT_URL is only available in the deploy job, not in the next one.
deploy-prod:
  stage: deploy
  script:
    - some commands
  environment:
    name: prod
    url: http://myprod.com
  only:
    - master

deploy-staging:
  stage: deploy
  script:
    - some other commands
  environment:
    name: staging
    url: http://mystaging.com
  only:
    - staging

test:
  stage: after-deploy
  script:
    - CYPRESS_baseUrl=$CI_ENVIRONMENT_URL cypress run
I expect $CI_ENVIRONMENT_URL to equal http://mystaging.com or http://myprod.com depending on which deploy job ran before, but it is empty; it seems $CI_ENVIRONMENT_URL is only available in the deploy job.
Is it possible to pass a variable from one job to the next?
You can use the artifacts feature: write $CI_ENVIRONMENT_URL to a file:
echo $CI_ENVIRONMENT_URL > environmentUrl.txt
save it as an artifact, and then read it in the next job:
CI_ENVIRONMENT_URL=$(cat environmentUrl.txt)
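Put together, a minimal sketch of that approach (job names and placeholder commands follow the question's config):

deploy-staging:
  stage: deploy
  script:
    - some other commands
    - echo $CI_ENVIRONMENT_URL > environmentUrl.txt # save the URL for later jobs
  environment:
    name: staging
    url: http://mystaging.com
  artifacts:
    paths:
      - environmentUrl.txt
  only:
    - staging

test:
  stage: after-deploy
  script:
    - CI_ENVIRONMENT_URL=$(cat environmentUrl.txt) # read back the deploy job's URL
    - CYPRESS_baseUrl=$CI_ENVIRONMENT_URL cypress run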