How to pass environment variables to NestJS application from gitlab pipeline (gitlab.yml file)? - node.js

I am new to GitLab CI/CD and I am looking for a way to pass environment variables to my NestJS application deployed to Heroku.
This is my .gitlab.yml file
...
image: node:latest
before_script:
  - apt-get update -qy
  - apt-get install -y ruby-dev
  - gem install dpl
stages:
  - testing
  - staging
testing:
  stage: testing
  image: salesforce/salesforcedx:latest-slim
  script:
    - accessToken=accessToken
    - echo TEST_ACCESS_TOKEN=${accessToken} > .env.test
    - echo dummmy=test >> .env
    - echo dummmyWithQuotes=test >> ".env"
  only:
    - staging
    - main
staging:
  stage: staging
  image: ruby:latest
  script:
    - dpl --provider=heroku --app=$HEROKU_APP_STAGING --api-key=$HEROKU_API_KEY
  only:
    - staging
This is my app controller to test the deployed result
@Get()
getHello() {
  const temoin = this.configService.get('PROD_LOGIN_URL');
  const tested = this.configService.get('TEST_ACCESS_TOKEN');
  const tested2 = this.configService.get('dummmy');
  const tested3 = this.configService.get('dummmyWithQuotes');
  return {
    temoin,
    tested,
    tested2,
    tested3,
  };
}
Of course I have the following in the app module
ConfigModule.forRoot({
  isGlobal: true,
  envFilePath: ['.env', '.env.test'],
}),
What I get in the response from the deployed application is the following:
{"temoin":"https://login.salesforce.com"}
I think this has to do with the Docker image. My guess is that the files are created inside the Docker container and stay in the GitLab job context (but I'm not sure).
Edit:
I added an ls -a in the staging job's scripts and there is no .env.test file
Edit 2:
I added the .env.test and .env files to the job's artifacts and they became available to the staging job. But when deploying the application with the dpl command, the .env.test file is not present in the Heroku application's files.
PS: I forgot to mention that .env.test isn't present in the Git project; it's created in the pipeline.

It is not completely clear to me what you want to accomplish, but I think you would like the .env file to be available in the staging job.
Pass it by adding it as a job artifact
testing:
  stage: testing
  image: salesforce/salesforcedx:latest-slim
  script:
    - accessToken=accessToken
    - echo TEST_ACCESS_TOKEN=${accessToken} > .env.test
    - echo dummmy=test >> .env
    - echo dummmyWithQuotes=test >> ".env"
  artifacts:
    paths:
      - .env
  only:
    - staging
    - main
Another option, if you would like dynamically generated variables to be available to subsequent jobs, would be to use the built-in dotenv report.
artifacts:
  reports:
    dotenv: .env
This will make any vars in the .env file available in the environment of subsequent jobs as the dotenv report gets loaded into the environment automatically.
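For example, the staging job from the pipeline above could then read one of those values directly from its environment. This is only a sketch that reuses the job names and variables from the question:
staging:
  stage: staging
  image: ruby:latest
  dependencies:
    - testing
  script:
    - echo "$dummmy"   # exported automatically from the testing job's dotenv report
    - dpl --provider=heroku --app=$HEROKU_APP_STAGING --api-key=$HEROKU_API_KEY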

After adding the artifacts to make the .env and .env.test files available to the staging job, as Benjamin said, I found that those files were still not getting deployed with the application.
The reason is that dpl performs a cleanup (a stash of all files) before deploying.
I found an issue that suggests adding the --skip_cleanup flag to dpl to avoid that cleanup.
Works like a charm.
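For reference, a sketch of what the staging job can look like with that flag, based on the job from the question (only the dependencies and the --skip_cleanup flag are new):
staging:
  stage: staging
  image: ruby:latest
  dependencies:
    - testing   # pulls the .env/.env.test artifacts from the testing job
  script:
    - dpl --provider=heroku --app=$HEROKU_APP_STAGING --api-key=$HEROKU_API_KEY --skip_cleanup
  only:
    - staging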

Related

How to pass global variable value to next stage in GitLab CI/CD

Based on the GitLab documentation, you can use the variables keyword to pass CI/CD variables to a downstream pipeline.
I have a global variable DATABASE_URL.
The init stage retrieves the connection string from AWS Secrets Manager and sets it to DATABASE_URL.
Then in the deploy stage I want to use that variable to deploy the database. However, in the deploy stage the variable's value is empty.
variables:
  DATABASE_URL: ""
default:
  tags:
    - myrunner
stages:
  - init
  - deploy
init-job:
  image: docker.xxxx/awscli
  stage: init
  script:
    - SECRET_VALUE="$(aws secretsmanager get-secret-value --secret-id my_secret --region us-west-2 --output text --query SecretString)"
    - DATABASE_URL="$(jq -r .DATABASE_URL <<< $SECRET_VALUE)"
    - echo "$DATABASE_URL"
deploy-dev-database:
  image: node:14
  stage: deploy
  environment:
    name: development
  script:
    - echo "$DATABASE_URL"
    - npm install
    - npx sequelize-cli db:migrate
  rules:
    - if: $CI_COMMIT_REF_NAME == "dev"
The init job echoes the DATABASE_URL correctly. However, DATABASE_URL is empty in the deploy stage.
Questions:
1. How do I pass the global variable across the stages?
2. The Node.js database deployment process will use this variable as process.env.DATABASE_URL; will it be available to the Node.js environment?
Variables are set by precedence, and when you print a variable inside of a job, it will look for the variable inside itself (the same job), and then start moving up to what's defined in the CI YAML file (variables: section), then the project, group, and instance. The job will never look at other jobs.
If you want to pass a variable from one job to another, you would want to make sure you don't set the variable at all and instead pass the variable from one job to another following the documentation on passing environment variables to another job.
Basically,
Make sure to remove DATABASE_URL: "" from the variables section.
Make the last line of your init-job script echo "DATABASE_URL=$DATABASE_URL" >> init.env, so the dotenv report gets a KEY=value line. You can call your .env file whatever you want, of course.
Add an artifacts: section to your init-job.
Add a dependencies: or needs: section to your deploy-dev-database job to pull the variable.
You should end up with something like this:
stages:
  - init
  - deploy
init-job:
  image: docker.xxxx/awscli
  stage: init
  script:
    - SECRET_VALUE="$(aws secretsmanager get-secret-value --secret-id my_secret --region us-west-2 --output text --query SecretString)"
    - DATABASE_URL="$(jq -r .DATABASE_URL <<< $SECRET_VALUE)"
    - echo "DATABASE_URL=$DATABASE_URL" >> init.env
  artifacts:
    reports:
      dotenv: init.env
deploy-dev-database:
  image: node:14
  stage: deploy
  dependencies:
    - init-job
  environment:
    name: development
  script:
    - echo "$DATABASE_URL"
    - npm install
    - npx sequelize-cli db:migrate
  rules:
    - if: $CI_COMMIT_REF_NAME == "dev"

How to - Gitlab custom environment variables to be exported to a .env file before build

I have a GitLab project with custom environment variables defined in the UI. Before the build step in my pipeline, I want to place a .env file at the root of the project with ALL the custom variables exported to it. I understand I can export them one by one with something like echo $var_name >> .env, but I really need to do all of them at once (only the custom-defined ones).
stage: Creating Artifacts
cache:
  key: $CI_COMMIT_REF_SLUG-$CI_PROJECT_DIR
  paths:
    - node_modules/
script:
  - env >> ".env" # HERE IS THE PROBLEM - IT EXPORTS ALL DEFAULTS INSTEAD OF CUSTOM VARS
  - npm run build
artifacts:
  name: 'staging_api'
  untracked: false
  expire_in: 30 days
  paths:
    - ./dist
    - ./node_modules
    - .env
I'm not sure how to make the .env file work with npm.
Try to use env >> ".env" instead of env > ".env"; the single > overwrites the whole .env file with the current environment variables.
If you need specific default values in .env, I suggest defining those defaults as variables in .gitlab-ci.yml; the defaults are then overridden by the UI variables.
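A minimal sketch of that idea, assuming two custom UI variables named API_URL and API_KEY (the names and the build job are only illustrative): the defaults live in the variables: section, the UI values override them at pipeline time, and the script writes only those keys to .env instead of dumping the whole environment.
variables:
  API_URL: "http://localhost:3000"   # default, overridden by the UI variable of the same name
  API_KEY: ""                        # default, overridden by the UI variable of the same name
build:
  stage: Creating Artifacts
  script:
    - echo "API_URL=$API_URL" >> .env
    - echo "API_KEY=$API_KEY" >> .env
    - npm run build
  artifacts:
    paths:
      - ./dist
      - .env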

Access env variables in node js app when using gitlab ci cd pipeline

I am using a GitLab CI/CD pipeline to deploy my application to an Ubuntu server. I have different .env files for the local and dev environments, and they are not part of the Git repo (they are in .gitignore). How do I get the environment variables into my app when it is deployed to the Ubuntu server?
my gitlab-ci.yml
stages:
  - deploy
cache:
  paths:
    - node_modules/
deploy:
  stage: deploy
  script:
    - npm install
    - sudo pm2 delete lknodeapi || true
    - sudo pm2 start server.js --name lknodeapi
I guess you are looking for this: Create Variables in GitLab. You can create your environment variables in the UI and then change your gitlab-ci.yml like below:
stages:
  - deploy
cache:
  paths:
    - node_modules/
deploy:
  stage: deploy
  script:
    - echo "NGINX_REPO_KEY=$NGINX_REPO_KEY" >> .env
    - npm install
    - sudo pm2 delete lknodeapi || true
    - sudo pm2 start server.js --name lknodeapi
This will create a .env file in the root folder and put your variables in it.
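If you need several UI variables, the same pattern simply repeats, one echo per variable (a sketch; DB_HOST and DB_PASSWORD are placeholder names, not variables from the question):
deploy:
  stage: deploy
  script:
    - echo "NGINX_REPO_KEY=$NGINX_REPO_KEY" >> .env
    - echo "DB_HOST=$DB_HOST" >> .env
    - echo "DB_PASSWORD=$DB_PASSWORD" >> .env
    - npm install
    - sudo pm2 delete lknodeapi || true
    - sudo pm2 start server.js --name lknodeapi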

Setup Gitlab CI/CD environment variables via UI

I am new to GitLab CI/CD settings. I don't want to put sensitive credentials (like API keys, passwords, ...) into my branch. For this, GitLab (and other CI/CD services) can set environment variables.
What I have done so far:
Via UI (Project ⇒ Settings ⇒ CI/CD ⇒ Variables)
First go to Project ⇒ Settings ⇒ CI/CD ⇒ Variables and add them like this:
Now I am trying to get the file with all my config values (e.g. with dotenv).
require("dotenv");
module.exports = process.env.NODE_ENV.trim() === "production" ? _config.production : _config.development;
Current .gitlab-ci.yaml file is:
image: node:8.9.0
cache:
  paths:
    - node_modules/
stages:
  - ver
  - init
  - test
  - build
  - deploy
ver:
  stage: ver
  script:
    - node -v
init:
  stage: init-dev
  script:
    - npm install
  tags:
    - dev_server
  only:
    - dev
  variables:
    ENV_PRODUCTION: "/builds/AkdiD/8/abcde/projectName/ENV_PRODUCTION"
test:
  stage: test
  script:
    - npm test
build:
  stage: build
  script:
    - echo "BUILD_VERSION=production" >> build.env
  artifacts:
    reports:
      dotenv: build.env
deploy:
  stage: deploy-dev
  script:
    - npm run killcurrent
    - echo $ENV_PRODUCTION
    - echo $BUILD_VERSION
    - npm run staging
  tags:
    - dev_server
  only:
    - dev
Question: where do I need to keep this ENV_PRODUCTION file name (in the YAML file or somewhere else) so that the server picks up that value?
Edit: I edited the variable like this, but the server is still not fetching these variables. Should I change or add something in the .gitlab-ci.yml file?
Setting up a custom environment variable of type File (GitLab 11.11+) does not seem to be the way to reference/set a list of variables, including ones with sensitive information.
A variable of type File is generally there to represent, for instance, a certificate.
You should instead define regular variables, possibly group-level environment variables.
You can define per-project or per-group variables that are set in the pipeline environment.
Group-level variables are stored out of the repository (not in .gitlab-ci.yml) and are securely passed to GitLab Runner, which makes them available during a pipeline run.
For Premium users who do not use an external key store or who use GitLab’s integration with HashiCorp Vault, we recommend using group environment variables to store secrets like passwords, SSH keys, and credentials.
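Once such a project- or group-level variable is defined in the UI, it is available as an environment variable in every job without any variables: entry in .gitlab-ci.yml. A sketch based loosely on the deploy job from the question, assuming a regular (non-File) UI variable named ENV_PRODUCTION holding the production config values:
deploy:
  stage: deploy-dev
  script:
    - npm run killcurrent
    - echo "$ENV_PRODUCTION" > .env.production   # write the UI value into a file the app (dotenv) can read
    - npm run staging
  tags:
    - dev_server
  only:
    - dev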

How do I deploy a sapper/svelte site to Gitlab Pages?

I am trying to use gitlab pages to host my static site generated by Sapper and Svelte.
I used the sapper starter app from the getting started docs:
npx degit "sveltejs/sapper-template#rollup" my-app
I added the .gitlab-ci.yml file as the GitLab docs instructed:
# This file is a template, and might need editing before it works on your project.
image: node:latest

# This folder is cached between builds
# http://docs.gitlab.com/ce/ci/yaml/README.html#cache
cache:
  paths:
    - node_modules/

pages:
  stage: deploy
  script:
    - npm run export
    - mkdir public
    - mv __sapper__/export public
  artifacts:
    paths:
      - public
  only:
    - master
When the pipeline runs, it says it passes, but I still get a 404 error even after a day of waiting.
Has anyone successfully done this with Sapper?
You're moving the export folder, rather than its contents. Change your move command to
mv __sapper__/export/* public/
so that your config would be
# This file is a template, and might need editing before it works on your project.
image: node:latest

# This folder is cached between builds
# http://docs.gitlab.com/ce/ci/yaml/README.html#cache
cache:
  paths:
    - node_modules/

pages:
  stage: deploy
  script:
    - npm run export
    - mkdir public
    - mv __sapper__/export/* public/
  artifacts:
    paths:
      - public
  only:
    - master
