How do you develop a custom plugin for GitLab CI/CD?

I need to integrate a GitLab CI/CD pipeline with a custom Delivery Manager tool. My pipeline will need to invoke the delivery manager API with some info.
In Jenkins we developed a plugin that provided a pipeline step, void deployEnv(String app, String environment), that can be used in the different stages, e.g.:
deployEnv("applicationx", "production")
Is there a way to develop a similar plugin in GitLab CI/CD?
Is it possible to invoke a remote URL from a GitLab CI/CD pipeline, passing some credentials?

The closest analog to this kind of "plugin" in GitLab CI is probably a templated job definition. There are a number of ways to formulate this, and a number of methods for abstracting and including job definitions provided by others. Some of the basic tools are: include:, extends:, !reference, "hidden job" keys, and YAML anchors.
Providing reusable templates
If you just need to provide an abstraction for a series of steps, a "hidden key" definition would be the closest to what you want.
Consider the following template YAML. It might be embedded directly in your .gitlab-ci.yml file(s), or you might choose to include it in any number of configurations from a remote location using the include: keyword.
In this fictional example, we provide a script step that expects two environment variables to be present: APPLICATION_NAME and DEPLOYMENT_ENV. These variables are used to call a remote API, passing those values as path parameters. Here, the definitions are provided in a "hidden job" key:
.deploy_to_env:
  image: curlimages/curl  # or otherwise have `curl` in your environment
  script:
    - |
      if [[ -z "$APPLICATION_NAME" || -z "$DEPLOYMENT_ENV" ]]; then
        echo "FATAL: you must set APPLICATION_NAME and DEPLOYMENT_ENV variables"
        exit 1
      fi
    - curl -XPOST "https://my-deployment-api.example.com/${APPLICATION_NAME}/${DEPLOYMENT_ENV}"
Let's assume this YAML exists in a file named deploy.yml in a project whose path is my-org/templates.
Using templates
Now let's say a pipeline configuration wants to leverage the above definition to deploy an application named applicationx to production.
First, in any case, the project should include: the remote definition (unless you choose to embed it directly, e.g., by copy/paste).
include:
  - project: my-org/templates
    file: deploy.yml
    ref: main  # or any git ref, or omit to use the default branch
Then you can use the extends: keyword to form a concrete job from the hidden key.
deploy_production:
  stage: deploy
  extends: .deploy_to_env
  variables:
    APPLICATION_NAME: "applicationx"
    DEPLOYMENT_ENV: "production"
Or, if you want to embed the deployment steps in the middle of other script steps, !reference is useful here:
deploy_production:
  stage: deploy
  script:
    - export APPLICATION_NAME="applicationx"
    - export DEPLOYMENT_ENV="production"
    # these could also be set in `variables:`
    - echo "calling deployment API to deploy ${APPLICATION_NAME} to ${DEPLOYMENT_ENV}"
    - !reference [.deploy_to_env, script]
    - echo "done"
There are many ways to handle this; these are just two examples.
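As for the second question (calling a remote URL with credentials): yes. A sketch, assuming you store a token in a masked CI/CD variable (hypothetically named DEPLOY_API_TOKEN, set under Settings > CI/CD > Variables rather than in the YAML) and pass it as a request header:

```yaml
# DEPLOY_API_TOKEN is a hypothetical masked CI/CD variable,
# configured in Settings > CI/CD > Variables, never committed to the repo.
.deploy_to_env_authenticated:
  image: curlimages/curl
  script:
    - curl --fail -X POST
        -H "Authorization: Bearer ${DEPLOY_API_TOKEN}"
        "https://my-deployment-api.example.com/${APPLICATION_NAME}/${DEPLOYMENT_ENV}"
```

Because the variable is masked, GitLab will redact its value from job logs if it is accidentally printed.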

Related

Can I pass a variable from .env file into .gitlab-ci.yml

I'm quite new to CI/CD and basically I'm trying to add this job to Gitlab CI/CD that will run through the repo looking for secret leaks. It requires some API key to be passed there. I was able to directly insert this key into .gitlab-ci.yml and it worked as it was supposed to - failing the job and showing that this happened due to this key being in that file.
But I would like to have this API key stored in a .env file that won't be pushed to the remote repo, and to pull it somehow into the .gitlab-ci.yml file from there.
Here's mine
stages:
  - scanning

gitguardian scan:
  variables:
    GITGUARDIAN_API_KEY: ${process.env.GITGUARDIAN_API_KEY}
  image: gitguardian/ggshield:latest
  stage: scanning
  script: ggshield scan ci
The pipeline fails with this message: Error: Invalid API key. So I assume that the way I'm passing it into variables is wrong.
CI variables should be available in the gitlab-runner (machine or container) as environment variables. They are either predefined and populated by GitLab, like the list of predefined variables here, or added by you in the settings of the repository or the GitLab group under Settings > CI/CD > Add Variable.
After adding a variable you can use the following syntax; you can check that the variable has the correct value by echoing it:
variables:
  GITGUARDIAN_API_KEY: "$GITGUARDIAN_API_KEY"
script:
  - echo "$GITGUARDIAN_API_KEY"
  - ggshield scan ci

How to refer to a variable in a .common tags section in GitLab CI/CD

I have a GitLab CI YAML file with a variables section and a .common tags section, as below:
variables:
  name: app
  env: prod

.common:
  tags:
    - &env_tag prod
My question is: can we pass the env variable to the .common tags section? When I tried to refer to the variable, it failed. I have tried the following:
.common:
  tags:
    - &env_tag $env
Unfortunately this is not possible right now, as GitLab does not support variable expansion in tags.
There are currently multiple open issues regarding this feature:
https://gitlab.com/gitlab-org/gitlab/-/issues/35742
https://gitlab.com/gitlab-org/gitlab-foss/-/issues/24207
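Until tag expansion is supported, one possible workaround (a sketch, not from the linked issues) is to define one hidden job per environment with the tag hardcoded, and select between them with extends::

```yaml
# Hypothetical workaround: one hidden job per environment,
# each with its tag hardcoded, selected via `extends:`.
.common_prod:
  tags:
    - prod

.common_dev:
  tags:
    - dev

build:
  extends: .common_prod  # switch to .common_dev to target a dev runner
  script:
    - echo "running on a runner tagged 'prod'"
```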

GitLab Heroku API key securing

I am doing CI/CD with Heroku and GitLab, but I found it's not secure to place the API key in the .gitlab-ci.yml file.
My .gitlab-ci.yml looks like:
- dpl --provider=heroku --app=myproject-development --api-key=myapigoesthere
I found another way to do it, like this:
- dpl --provider=heroku --app=myproject-development --api-key=$HEROKU_API_KEY
I found we can pass a variable this way, but where can I set the value of $HEROKU_API_KEY?
Does anyone know?
There are multiple ways to set CI/CD variables, but you'll specifically want to set it within the project settings as a "masked" variable so it doesn't get printed in job logs.
So basically, go to your project's Settings > CI/CD, expand the Variables section, and set up a variable with:
Key: HEROKU_API_KEY
Value: (insert your API key)
Type: Variable
Mask variable: on
Save.

Gitlab CI: Trigger different child pipelines from parent pipeline based on directory of changes

I would like to use parent/child pipelines (https://docs.gitlab.com/ee/ci/parent_child_pipelines.html) in this way.
I have this source structure:
- backend
  - .gitlab-ci.yml
  - src
- frontend
  - .gitlab-ci.yml
  - src
- .gitlab-ci.yml
I want to trigger the backend or frontend .gitlab-ci.yml based on the path where a new commit happens: if it happened under frontend, only frontend/.gitlab-ci.yml should be used for build/publish.
Is it possible?
You can specify to execute different pipelines based on where the changes in the code occurred, using the only: changes configuration documented here.
You can therefore specify to execute a frontend pipeline only if changes happen within the frontend folder (and analogously for backend).
You can use the include: local feature (documented here) to include the frontend/.gitlab-ci.yml file within the pipeline for the frontend that is defined in the root .gitlab-ci.yml.
For examples of how to configure the pipeline so that it triggers a configuration provided in a local file, please see here.
Parent-child pipelines also support the only: changes configuration as documented here.
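Putting those pieces together, the root .gitlab-ci.yml could look roughly like this (a sketch based on the linked docs: trigger: with include: starts each child pipeline from its local file, and only: changes limits it to commits touching that folder):

```yaml
# root .gitlab-ci.yml (sketch)
frontend:
  trigger:
    include: frontend/.gitlab-ci.yml
    strategy: depend  # parent pipeline waits for, and mirrors, the child's status
  only:
    changes:
      - frontend/**/*

backend:
  trigger:
    include: backend/.gitlab-ci.yml
    strategy: depend
  only:
    changes:
      - backend/**/*
```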

How to use a pipeline template for multiple pipelines (in multiple projects) in Azure devops

I am new to working with Azure DevOps and I am trying to set up build pipelines for multiple projects and share a yml template between them. I will demonstrate more clearly what I want to achieve but first let me show you our projects' structure:
proj0-common/
|----src/
|----azure-pipelines.yml
|----pipeline-templates/
|    |----build-project.yml
|    |----install-net-core
proj1/
|----src/
|----azure-pipelines.yml
proj2/
|----src/
|----azure-pipelines.yml
proj3/
|----src/
|----azure-pipelines.yml
The first folder is our Common project in which we want to put our common scripts and packages and use them in the projects. The rest of the folders (proj1-proj3) are .net core projects and act as microservice projects. As you can see, each project has its own azure-pipelines.yml pipeline file and each project resides in its own repository in Github. Then there are the template pipeline files (build-project.yml and install-net-core) which reside in the common project.
All the projects have the same build steps, therefore I would like to use the build-project.yml template for all the three projects (instead of hardcoding every step in every file).
My problem is that since they reside in distinct projects, I cannot access the template files simply, let's say from project3, by just addressing it like this:
.
.
.
- template: ../proj0-common/pipeline-templates/build-project.yml
.
.
.
And [I believe] the reason is that each project will have its own isolated build pool (please do correct me on this if I am wrong).
I was thinking if Azure DevOps had similar functionality to the variable groups but for pipeline templates, that could solve my problem, however, I cannot find such a feature. Could someone suggest a solution to this problem?
I managed to reproduce this use case after experimenting a bit with the docs, which had some gaps, like most of Microsoft's other docs around Azure DevOps.
Say you have an azdevops-settings.yml that specifies the pipeline in one of your service branches. In the example below it has two steps that run an external template in another repository; in one of them I supply a parameter that is otherwise set to some default in the template.
Notice I had to use the endpoint tag, otherwise it will complain; something that could be specified more clearly in the docs.
# In ThisProject
# ./azdevops-settings.yml
resources:
  repositories:
    - repository: templates
      type: bitbucket
      name: mygitdomain/otherRepo
      endpoint: MyNameOfTheGitServiceConnection

steps:
  - template: sometemplate.yml@templates
    parameters:
      Param1: 'Changed Param1'
  - template: sometemplate.yml@templates
In the template I first declare the parameters that I want to pipe into it. I tried referencing values without piping them, like the build id and other predefined variables, and they worked fine.
I also tried using an inline script as well as a script path reference. The test.ps1 script just prints a string, as in the output below.
# otherRepo/sometemplate.yml
parameters:
  Param1: 'hello there'

steps:
  - powershell: |
      Write-Host "Your parameter is now: $env:Param"
      Write-Host "When outputting standard variable build id: $(Build.BuildId)"
      Write-Host "When outputting standard variable build id via env: $env:BuildNumber"
      Write-Host "The repo name is: $(Build.Repository.Name)"
      Write-Host "The build definition name is: $(Build.DefinitionName)"
    env:
      Param: ${{ parameters.Param1 }}
      BuildNumber: $(Build.BuildId)
  - powershell: './test.ps1'
And the separate powershell script:
# otherRepo/test.ps1
Write-Host "Running script from powershell specification"
Output:
========================== Starting Command Output ===========================
Your parameter is now: Changed Param1
When outputting standard variable build id: 23
When outputting standard variable build id via env: 23
The repo name is: mygitdomain/thisRepo
The build definition name is: ThisProject
Finishing: PowerShell
========================== Starting Command Output ===========================
Running script from powershell specification
Finishing: PowerShell
..and so on..
I found only one solution to actually do that. You can reference the parent directory by using an absolute path. The key was to populate the root path using a system variable. The solution for your example:
- template: ${{variables['System.DefaultWorkingDirectory']}}/proj0-common/pipeline-templates/build-project.yml
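For comparison, applying the repository-resource approach from the answer above to the structure in the question would look roughly like this in each service project (a sketch; the alias common and the service connection name are assumptions):

```yaml
# proj1/azure-pipelines.yml (sketch)
resources:
  repositories:
    - repository: common                   # hypothetical alias
      type: github
      name: my-org/proj0-common            # assumed GitHub path
      endpoint: MyGitHubServiceConnection  # assumed service connection

steps:
  - template: pipeline-templates/build-project.yml@common
```

The @common suffix tells Azure DevOps to resolve the template path inside the declared repository resource rather than the current repository.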

Resources