Gitlab CI: Trigger different child pipelines from parent pipeline based on directory of changes

I would like to use parent/child pipelines (https://docs.gitlab.com/ee/ci/parent_child_pipelines.html) in this way.
I have this source structure:
- backend
  - .gitlab-ci.yml
  - src
- frontend
  - .gitlab-ci.yml
  - src
- .gitlab-ci.yml
I want to trigger the backend or frontend .gitlab-ci.yml based on the path where a new commit happens: if it happened under frontend, only frontend/.gitlab-ci.yml should be used for build/publish.
Is it possible?

You can choose which pipeline to execute based on where in the code the changes occurred, using the only: changes configuration documented here.
You can therefore specify that the frontend pipeline is executed only if changes happen within the frontend folder (and analogously for backend).
You can use the include: local feature (documented here) to include the frontend/.gitlab-ci.yml file within the frontend pipeline that is defined in the root .gitlab-ci.yml.
For examples on how to exactly configure the pipeline so that it triggers a configuration provided in a local file, please see here.
Parent-child pipelines also support the only: changes configuration as documented here.
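Putting these pieces together, a root .gitlab-ci.yml along the following lines should work. This is a minimal sketch: the job names are illustrative, and strategy: depend (which makes the parent job wait for and mirror the child pipeline's status) is optional.

trigger-frontend:
  trigger:
    include: frontend/.gitlab-ci.yml
    strategy: depend
  only:
    changes:
      - frontend/**/*

trigger-backend:
  trigger:
    include: backend/.gitlab-ci.yml
    strategy: depend
  only:
    changes:
      - backend/**/*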


How do you develop a custom plugin for Gitlab CICD?

I need to integrate a Gitlab CICD pipeline with a custom Delivery Manager Tool. My pipeline will need to invoke the delivery manager API with some info.
In Jenkins we developed a plugin that provided a pipeline step, void deployEnv(String app, String environment), that can be used in the different stages, e.g.:
deployEnv("applicationx", "production")
Is there a way to develop a similar plugin in Gitlab CICD?
Is it possible to invoke a remote URL from a Gitlab CICD pipeline passing some credentials?
The closest analog for this kind of "plugin" in GitLab CI is probably a templated CI job definition. There are a few ways to formulate this, since GitLab provides a number of methods for abstracting and including job definitions provided by others. Some of the basic tools are: include:, extends:, !reference, "hidden job" keys, and YAML anchors.
Providing reusable templates
If you just need to provide an abstraction for a series of steps, a "hidden key" definition would be the closest to what you want.
Consider the following template YAML. This might be embedded directly in your .gitlab-ci.yml file(s), or you might choose to include it in any number of configurations from a remote location using the include: keyword.
In this fictional example, we provide a script step that expects two environment variables to be present: APPLICATION_NAME and DEPLOYMENT_ENV. These variables are used to call a remote API, passing those values as path parameters. Here, the definition is provided in a "hidden job" key:
.deploy_to_env:
  image: curlimages/curl  # or otherwise have `curl` in your environment
  script:
    - |
      if [ -z "$APPLICATION_NAME" ] || [ -z "$DEPLOYMENT_ENV" ]; then
        echo "FATAL: you must set APPLICATION_NAME and DEPLOYMENT_ENV variables"
        exit 1
      fi
    - curl -XPOST "https://my-deployment-api.example.com/${APPLICATION_NAME}/${DEPLOYMENT_ENV}"
Let's assume this YAML is saved in a file named deploy.yml in a project whose path is my-org/templates.
Using templates
Now let's say a pipeline configuration wants to leverage the above definition to deploy an application named applicationx to production.
First, in any case, the project should include: the remote definition (unless you choose to embed it directly, e.g., by copy/paste).
include:
  - project: my-org/templates
    file: deploy.yml
    ref: main  # or any git ref, or omit to use the default branch
Then you can use the extends: keyword to form a concrete job from the hidden key.
deploy_production:
  stage: deploy
  extends: .deploy_to_env
  variables:
    APPLICATION_NAME: "applicationx"
    DEPLOYMENT_ENV: "production"
Or, if you want to embed the deployment steps in the middle of other script steps, !reference is useful here:
deploy_production:
  stage: deploy
  script:
    - export APPLICATION_NAME="applicationx"
    - export DEPLOYMENT_ENV="production"
    # these could also be set in `variables:`
    - echo "calling deployment API to deploy ${APPLICATION_NAME} to ${DEPLOYMENT_ENV}"
    - !reference [.deploy_to_env, script]
    - echo "done"
There are many ways to handle this; the above are just a few examples.

Can I pass a variable from .env file into .gitlab-ci.yml

I'm quite new to CI/CD and basically I'm trying to add this job to Gitlab CI/CD that will run through the repo looking for secret leaks. It requires some API key to be passed there. I was able to directly insert this key into .gitlab-ci.yml and it worked as it was supposed to - failing the job and showing that this happened due to this key being in that file.
But I would like to have this API key to be stored in .env file that won't be pushed to a remote repo and to pull it somehow into .gitlab-ci.yml file from there.
Here's mine:
stages:
  - scanning

gitguardian scan:
  variables:
    GITGUARDIAN_API_KEY: ${process.env.GITGUARDIAN_API_KEY}
  image: gitguardian/ggshield:latest
  stage: scanning
  script: ggshield scan ci
The pipeline fails with this message: Error: Invalid API key. So I assume that the way I'm passing it into variables is wrong.
CI variables should be available in the GitLab runner (machine or container) as environment variables. They are either predefined and populated by GitLab (like the list of predefined variables here) or added by you in the settings of the repository or the GitLab group: Settings > CI/CD > Add Variable.
After adding the variable, you can use the following syntax. You can test whether the variable has the correct value by echoing it:
variables:
  GITGUARDIAN_API_KEY: "$GITGUARDIAN_API_KEY"
script:
  - echo "$GITGUARDIAN_API_KEY"
  - ggshield scan ci
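Putting it together, the whole job might look like this (a sketch, assuming GITGUARDIAN_API_KEY has been added under Settings > CI/CD > Variables; variables defined there are exported to the job environment automatically, so the explicit variables: mapping above is not strictly required):

stages:
  - scanning

gitguardian scan:
  image: gitguardian/ggshield:latest
  stage: scanning
  script:
    # ggshield reads GITGUARDIAN_API_KEY from the environment
    - ggshield scan ci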

Can't determine pipeline which triggered a build

I'm using Azure DevOps's multiple repository functionality, documented here:
https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops
I have my YAML file in one repo, and the pipeline points to that YAML. The YAML has a trigger set up for another repository resource, so that when that repo gets updated, the pipeline will be triggered:
resources:
  repositories:
    - repository: MyRepo
      endpoint: 'MyRepos'
      type: git
      name: RepoName
      trigger:
        - '*'
The documentation claims that the 'Build.SourceBranch' variable will allow me to determine which branch in MyRepo triggered the pipeline build: "When an update to one of the repositories triggers a pipeline, then the following variables are set based on triggering repository"
However, this doesn't seem to be the case. No matter what branch triggers the build, 'Build.SourceBranch' is always 'refs/heads/master', presumably because the repo that holds the YAML has 'master' as its default branch.
I can't find any environment variable that gets set to the name of the branch that triggered the build, either. So how can I get the name of the branch that triggered the build? If there's no possible way, I think this needs to be added!
The issue is:
According to the documentation, Build.SourceBranch should be set based on the triggering repository. In practice, however, its value is determined by the repo in which the YAML file resides.
I have done the following tests. There are two repos, 'RepoA' and 'RepoB'. Both repos have two branches, 'master' and 'bran', and the YAML file is in 'master' of 'RepoA'.
- Commit a change in 'bran' of 'RepoB': the value of Build.SourceBranch is refs/heads/master. This is not consistent with the documentation.
- Commit a change in 'bran' of 'RepoA': the value of Build.SourceBranch is refs/heads/bran. This is consistent with the documentation.
- Commit a change in 'master' of 'RepoB': the value of Build.SourceBranch is refs/heads/master. This is consistent with the documentation.
- Commit a change in 'master' of 'RepoA': the value of Build.SourceBranch is refs/heads/master. This is consistent with the documentation.
Thus, if the build is triggered by 'RepoA', Build.SourceBranch correctly reflects the triggering branch. However, if the build is triggered by 'RepoB', the value of Build.SourceBranch is always refs/heads/master.
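To observe this yourself, a minimal diagnostic step in the pipeline YAML (a sketch; it simply echoes the relevant predefined variables after each trigger) is enough:

steps:
  - script: |
      echo "Build.SourceBranch: $(Build.SourceBranch)"
      echo "Build.Repository.Name: $(Build.Repository.Name)"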
We have reported this issue to the product group.

Gitlab CI: Multiple projects

I have two projects on GitLab: a frontend (Angular) and a backend module (Spring). I would like to use a pipeline to run the frontend tests after the backend has been tested and built. For example, I'd like to run the backend tests and build the backend modules; when that succeeds, I'd like to run the frontend tests, which call the backend API, before deploying, as below:
Frontend pipeline .gitlab-ci.yml: stage back: test => build the backend, then stage front: run the tests against the backend API => build the frontend.
How can I do this, please?
You could use Gitlabs Multi-Project Pipelines Feature: https://docs.gitlab.com/ee/ci/multi_project_pipelines.html#multi-project-pipelines
For example, you can add a build-backend job to your frontend gitlab-ci.yml. This job starts the pipeline in the Start/backend repository and waits for it to end (configured with strategy: depend). In the gitlab-ci.yml of the backend project, you can build and test the backend modules; after that pipeline finishes, the next jobs in the frontend pipeline are executed.
build-backend:
  stage: build-backend
  trigger:
    project: Start/backend
    strategy: depend
You can use the GitLab Pipelines API to create a new pipeline in the frontend project.
This means you would have two .gitlab-ci.yml files: one in the backend project and one in the frontend project.
See also: https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html (you'll need an access token to authenticate with the GitLab API; you can do so via OAuth2 or by using a personal access token, which you might find easier to start with).
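For instance, a job at the end of the backend pipeline could create a pipeline in the frontend project through the API. This is a sketch only: the job name, the FRONTEND_PROJECT_ID and GITLAB_API_TOKEN variables, and the gitlab.example.com host are assumptions to replace with your own values.

trigger-frontend:
  stage: deploy
  image: curlimages/curl
  script:
    # POST /projects/:id/pipeline creates a new pipeline on the given ref
    - |
      curl --request POST \
        --header "PRIVATE-TOKEN: ${GITLAB_API_TOKEN}" \
        "https://gitlab.example.com/api/v4/projects/${FRONTEND_PROJECT_ID}/pipeline?ref=main"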

How to use a pipeline template for multiple pipelines (in multiple projects) in Azure devops

I am new to working with Azure DevOps and I am trying to set up build pipelines for multiple projects, sharing a YAML template between them. I will demonstrate more clearly what I want to achieve, but first let me show you our projects' structure:
proj0-common/
|----src/
|----azure-pipelines.yml
|----pipeline-templates/
|----build-project.yml
|----install-net-core
proj1/
|----src/
|----azure-pipelines.yml
proj2/
|----src/
|----azure-pipelines.yml
proj3/
|----src/
|----azure-pipelines.yml
The first folder is our common project, in which we want to put our common scripts and packages and use them in the other projects. The rest of the folders (proj1-proj3) are .NET Core projects that act as microservice projects. As you can see, each project has its own azure-pipelines.yml pipeline file, and each project resides in its own repository on GitHub. The template pipeline files (build-project.yml and install-net-core) reside in the common project.
All the projects have the same build steps, therefore I would like to use the build-project.yml template for all the three projects (instead of hardcoding every step in every file).
My problem is that since they reside in distinct projects, I cannot simply access the template files, let's say from proj3, by addressing them like this:
.
.
.
- template: ../proj0-common/pipeline-templates/build-project.yml
.
.
.
And [I believe] the reason is that each project will have its own isolated build pool (please do correct me on this if I am wrong).
I was thinking that if Azure DevOps had functionality similar to variable groups but for pipeline templates, that could solve my problem; however, I cannot find such a feature. Could someone suggest a solution to this problem?
I was able to get this use case working after experimenting a bit and checking out some of the docs (which had some gaps, like most of Microsoft's other docs around Azure DevOps).
Say you have azdevops-settings.yml that specifies the pipeline in one of your service branches. In the example below it has two steps that run an external template from another repository; in one of them I supply a parameter that is otherwise set to a default in the template.
Notice I had to use the endpoint tag, otherwise it will complain. That is something that could be spelled out more clearly in the docs.
# In ThisProject
# ./azdevops-settings.yml
resources:
  repositories:
    - repository: templates
      type: bitbucket
      name: mygitdomain/otherRepo
      endpoint: MyNameOfTheGitServiceConnection

steps:
  - template: sometemplate.yml@templates
    parameters:
      Param1: 'Changed Param1'
  - template: sometemplate.yml@templates
In the template I first declare the parameters that I want piped into the template. I also tried referencing values without piping them, like the build id and other predefined variables, and they worked fine.
I also tried using an inline script as well as a script path reference. The 'test.ps1' just prints a string, as in the output below.
# otherRepo/sometemplate.yml
parameters:
  Param1: 'hello there'

steps:
  - powershell: |
      Write-Host "Your parameter is now: $env:Param"
      Write-Host "When outputting standard variable build id: $(Build.BuildId)"
      Write-Host "When outputting standard variable build id via env: $env:BuildNumber"
      Write-Host "The repo name is: $(Build.Repository.Name)"
      Write-Host "The build definition name is: $(Build.DefinitionName)"
    env:
      Param: ${{parameters.Param1}}
      BuildNumber: $(Build.BuildId)
  - powershell: './test.ps1'
And the separate powershell script:
# otherRepo/test.ps1
Write-Host "Running script from powershell specification"
Output:
========================== Starting Command Output ===========================
Your parameter is now: Changed Param1
When outputting standard variable build id: 23
When outputting standard variable build id via env: 23
The repo name is: mygitdomain/thisRepo
The build definition name is: ThisProject
Finishing: PowerShell
========================== Starting Command Output ===========================
Running script from powershell specification
Finishing: PowerShell
..and so on..
I found only one solution to actually do that. You can reference the parent directory by using an absolute path. The key was to populate the root path using a system variable. The solution for your example:
- template: ${{variables['System.DefaultWorkingDirectory']}}/proj0-common/pipeline-templates/build-project.yml
