How to back up and restore GitLab CI environment variables? - gitlab

I would like to manage GitLab CI variables from different projects via local files: export a project's CI variables to a YAML or JSON file, change the values, and import the file back with the updated values.
I tried glab-cli and the GitLab API, but they are too basic: you must process variables one by one manually. I would like to find a better solution, capable of processing all variables at once.
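For reference, the bulk export/import the question asks for can be sketched against the GitLab REST API's project variables endpoints (GET /projects/:id/variables and PUT /projects/:id/variables/:key). A minimal Python sketch; the instance URL, project ID, and token are placeholders, and pagination beyond 100 variables is not handled:

```python
import json
import urllib.parse
import urllib.request

# Placeholders -- adjust for your GitLab instance and project.
GITLAB_URL = "https://gitlab.example.com"
PROJECT_ID = 1234
TOKEN = "<your-access-token>"

def to_mapping(variables):
    """Flatten the API's list of variable objects into a {key: value} dict."""
    return {v["key"]: v["value"] for v in variables}

def api_request(url, data=None, method="GET"):
    """Perform an authenticated API call and decode the JSON response."""
    req = urllib.request.Request(url, data=data, method=method,
                                 headers={"PRIVATE-TOKEN": TOKEN})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def export_variables(path):
    """Download all CI variables of the project and write them to a JSON file."""
    url = f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/variables?per_page=100"
    with open(path, "w") as f:
        json.dump(to_mapping(api_request(url)), f, indent=2)

def import_variables(path):
    """Read the (edited) JSON file and update each variable via the API."""
    with open(path) as f:
        mapping = json.load(f)
    for key, value in mapping.items():
        url = f"{GITLAB_URL}/api/v4/projects/{PROJECT_ID}/variables/{urllib.parse.quote(key)}"
        api_request(url, data=urllib.parse.urlencode({"value": value}).encode(),
                    method="PUT")
```

Note that this only round-trips the value of each variable; attributes like protected, masked, or environment_scope would need to be carried through the JSON file in the same way.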

Related

Including Terraform Configuration from Another Gitlab Project

I have a couple of apps that use the same GCP project. There are dev, stage, and prod projects, but they're basically the same, apart from project IDs and project numbers. I would like to have a repo in GitLab, such as config, where I keep these IDs in dev.tfvars, stage.tfvars, and prod.tfvars files. Currently each app's repo has its own config/{env}.tfvars files, which is really repetitive.
Googling for importing or including Terraform resources just gets me results about Terraform state, so it hasn't been fruitful.
I've considered:
Using a group-level GitLab variable as a key=val env file, having my gitlab-ci.yml source the correct environment's file, and then including what I need with -var="key=value" in my plan and apply commands.
Creating a Terraform module that uses either TF_WORKSPACE or an input variable to return the correct variables. I think this may be possible, but I'm new to TF, so I'm not sure how to return data from a module, or whether this kind of "side effects only" solution is an abusive workaround for something that has a better way to be achieved.
Is there any way to include terraform variables from another Gitlab project?

How to get variables from my .env file to the Azure Pipeline

When I build my React app locally, all the environment variables get read from my local .env file, which is inside the project's root folder. We use gulp for the build process.
Now I want the same variables available in my Azure pipeline, which also builds the app via gulp and deploys it to my Azure Static Web App.
I already tried pushing my .env file to the repo, and I also tried to set these variables in the pipeline's YAML file via
env:
HOSTNAME: 'google.com'
And I also tried putting the values in my pipeline variables and accessing them in the YAML like
env:
HOSTNAME: $(HOSTNAME)
Lastly I tried uploading my .env to DevOps Pipelines secure files, then adding tasks to my pipeline to access this file and copy it to my repo's root folder.
All of these approaches ended up with just a random string like "8a8878aa1317" in these variables once the app is deployed and running. The random string changes each time I run the pipeline. Does anyone know how to get the right values into the variables?
HOSTNAME is a tricky name for a variable, because it might be used by the underlying OS or the DevOps agent (a random string value suggests exactly that).
Try changing it to something like MY_HOSTNAME (in the .env file, in the pipeline, and in your app).

managing api keys in gitlab project

I have a GitLab project that is mirroring (pull) a private GitHub repo. Because of its origins, the repo has a "config/private.js" file with all the API keys and server config that it needs. Or rather, that file isn't in the repo; it's in .gitignore.
How do I populate my gitlab environment with this file? It would be ideal if I could reserve a special file that is not in the repo and does not update with commits, and is used to populate the dist environment with a build command like:
- cat secrets.file > src/config/private.js
But I'm having no luck finding that in the documentation. I do see project and group secrets, but 1. adding them one by one would be tedious, and 2. I would need to rewrite the code, or else create another just-as-tedious script to echo each one to the file.
This was a tad complicated.
GitLab does not deploy the repo; it deploys the build results. So you can inject API key files in GitLab's CI/CD, but you would have to change them and rebuild for each env. (You couldn't test the results and then redeploy known-working results to prod.) In my case, I was building once, and committed to only applying the relevant keys to stage and prod.
What I do is keep the secrets as variables on the destination. I inject a key file that refers to the env during CI/CD. For example, it might set a key to __MY_API_KEY__. I use a postinstall script in deployment to apply these env values to the built scripts that are installed (this is just a tr command over a set of env variables and the /build files).
This way, I can use a hard-coded, gitignored private file locally, and still inject private keys specific to each env separately.
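The placeholder-substitution step described above (a tr command over env variables and the /build files) can be sketched in Python as well. The __NAME__ placeholder convention, the .js extension, and the directory layout are just examples, not the answerer's exact setup:

```python
import os
import pathlib

def apply_env_keys(build_dir, names):
    """Substitute __NAME__ placeholders in every .js file under build_dir
    with the value of the NAME environment variable (empty string if unset)."""
    for path in pathlib.Path(build_dir).rglob("*.js"):
        text = path.read_text()
        for name in names:
            text = text.replace(f"__{name}__", os.environ.get(name, ""))
        path.write_text(text)
```

Run as a postinstall step on the destination, e.g. apply_env_keys("build", ["MY_API_KEY"]), after the environment-specific secrets have been set there.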

How to get my Bitbucket Pipeline Repo Variables to my local Docker Build?

My goal is to be able to develop/add features locally, then create a local Docker build and create a container using the Bitbucket Pipeline repo variables. I don't want to hard-code any secrets on the host machine or inside the code. I'm trying to access some API keys hosted in the Bitbucket Pipelines repo variables.
Does anyone know how to do this? I'm thinking of some script inside the Dockerfile that will create environment variables inside the container.
You can pass these variables to your container as environment variables when you run the container with the -e flag (see this question); you could use the Bitbucket variables at this point. When you do this, the variables are available inside your Docker container, but you will still need to read them in your Python script.
You can easily do that like this:
import os
variable = os.environ['ENV_VARIABLE_NAME']
If you do not want to pass the variables in plain text to commands like this, you could also set up a MySQL container linked to your Python container that provides your application with the variables. That way everything is secured, dynamic, and not visible to anyone except users with access to your database, and it can still be modified easily. It takes a bit more time to set up, but is less of a hassle than a .env file.
I hope this helps you
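Putting the pieces together: pass the Bitbucket repo variable's value into the container with something like docker run -e API_KEY="$API_KEY" my-image (the variable and image names here are examples), and read it in Python with a small helper that fails loudly when the variable is missing. A minimal sketch:

```python
import os

def get_secret(name, default=None):
    """Read a secret passed into the container via `docker run -e NAME=...`.
    Raises a clear error when the variable was not provided and no
    default was given."""
    value = os.environ.get(name, default)
    if value is None:
        raise RuntimeError(f"environment variable {name} is not set")
    return value
```

In a pipeline step, $API_KEY would already hold the repo variable's value, so the same image works unchanged both locally and in Bitbucket Pipelines.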

During a VSTS build, using an npm script, how can I determine the branch that triggered the build?

I am trying to create a JS utility to version stamp a VSTS build with details about the branch and commit id.
I have been using git-rev-sync, which works fine locally. However, when the code is checked out by a VSTS build definition, the repo is in a detached HEAD state, and I am no longer able to determine from the git repo itself which branch the current code belongs to.
git-rev-sync reports something along the lines of:
Detached: 705a3e89206882aceed8c9ea9b2f412cf26b5e3f
instead of "develop" or "master".
I may look at the VSTS Node SDK, which might be able to pick up VSTS environment variables like you can with PowerShell scripts.
Has anyone done this or solved this problem in a neater way?
The build variables are added to the current process's environment variables, so you can access the Build.SourceBranchName built-in variable through an environment variable:
PowerShell:
$env:BUILD_SOURCEBRANCHNAME
NodeJS:
process.env.BUILD_SOURCEBRANCHNAME
Shell script:
$BUILD_SOURCEBRANCHNAME
Batch script:
%BUILD_SOURCEBRANCHNAME%
You can also pass it through a task argument ($(Build.SourceBranchName)); for example, use the Replace Tokens task to substitute the variable's value into a file, and then read the value from that file (it replaces %{BUILD_SOURCEBRANCHNAME}% tokens).
