How to reference a variable in the .common tags section in GitLab CI/CD

I have a GitLab YAML file with a variables section and a .common section, as below:

variables:
  name: app
  env: prod

.common:
  tags:
    - &env_tag prod
My question is: can we pass the env variable into the .common tags section? When I try to reference the variable, it fails. I have tried the following:

.common:
  tags:
    - &env_tag $env

Unfortunately this is not possible right now, as GitLab does not support variable expansion in tags.
There are currently multiple open issues tracking this feature:
https://gitlab.com/gitlab-org/gitlab/-/issues/35742
https://gitlab.com/gitlab-org/gitlab-foss/-/issues/24207
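In the meantime, a YAML anchor can at least keep the tag value defined in one place. This is only a sketch of that workaround: it duplicates the literal value of the env variable rather than expanding it, because anchors cannot read CI/CD variables.

```yaml
variables:
  env: prod

.common:
  tags:
    - &env_tag prod   # value repeated here; anchors cannot reference CI variables

build_job:
  tags:
    - *env_tag        # reuses the anchored literal "prod"
  script:
    - echo "running on a prod-tagged runner"
```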

Related

How do you develop a custom plugin for GitLab CI/CD?

I need to integrate a GitLab CI/CD pipeline with a custom Delivery Manager Tool. My pipeline will need to invoke the delivery manager API with some info.
In Jenkins we developed a plugin that provided a pipeline step - void deployEnv(String app, String environment) - that can be used in the different stages, e.g.:

deployEnv("applicationx", "production")

Is there a way to develop a similar plugin in GitLab CI/CD? Is it possible to invoke a remote URL from a GitLab CI/CD pipeline, passing some credentials?
The closest analog to this kind of "plugin" in GitLab CI is probably a templated job definition. There are a few ways to formulate this, and a number of methods for abstracting and including job definitions provided by others. Some of the basic tools are: include:, extends:, !reference, "hidden job" keys, and YAML anchors.
Providing reusable templates
If you just need to provide an abstraction for a series of steps, a "hidden job" definition is the closest to what you want.
Consider the following template YAML. It might be embedded directly in your .gitlab-ci.yml file(s), or you might include it from a remote location using the include: keyword.
In this fictional example, we provide a script step that expects two environment variables to be present: APPLICATION_NAME and DEPLOYMENT_ENV. These variables are used to call a remote API, passing those values as path parameters. The definition is provided in a "hidden job" key:
.deploy_to_env:
  image: curlimages/curl  # or otherwise have `curl` in your environment
  script:
    - |
      if [[ -z "$APPLICATION_NAME" || -z "$DEPLOYMENT_ENV" ]]; then
        echo "FATAL: you must set APPLICATION_NAME and DEPLOYMENT_ENV variables"
        exit 1
      fi
    - curl -XPOST "https://my-deployment-api.example.com/${APPLICATION_NAME}/${DEPLOYMENT_ENV}"
Let's assume this YAML exists in a file named deploy.yml in a project whose path is my-org/templates.
Using templates
Now let's say a pipeline configuration wants to leverage the above definition to deploy an application named applicationx to production.
First, the project should include: the remote definition (unless you choose to embed it directly, e.g., copy/paste):

include:
  - project: my-org/templates
    file: deploy.yml
    ref: main  # or any git ref, or omit to use the default branch
Then you can use the extends: keyword to form a concrete job from the hidden key:

deploy_production:
  stage: deploy
  extends: .deploy_to_env
  variables:
    APPLICATION_NAME: "applicationx"
    DEPLOYMENT_ENV: "production"
Or, if you want to embed the deployment steps in the middle of other script steps, !reference is useful:

deploy_production:
  stage: deploy
  script:
    # these could also be set in `variables:`
    - export APPLICATION_NAME="applicationx"
    - export DEPLOYMENT_ENV="production"
    - echo "calling deployment API to deploy ${APPLICATION_NAME} to ${DEPLOYMENT_ENV}"
    - !reference [.deploy_to_env, script]
    - echo "done"
There are many ways to handle this; these are just two examples.
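As for invoking a remote URL with credentials: a common pattern is to store the secret as a masked CI/CD variable (Settings > CI/CD > Variables) and send it as a header. A sketch, assuming a hypothetical variable name DEPLOY_API_TOKEN and a hypothetical API endpoint:

```yaml
deploy_production:
  stage: deploy
  image: curlimages/curl
  script:
    # DEPLOY_API_TOKEN is a masked project variable (hypothetical name)
    - >
      curl --fail -XPOST
      -H "Authorization: Bearer ${DEPLOY_API_TOKEN}"
      "https://my-deployment-api.example.com/applicationx/production"
```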

Can I pass a variable from a .env file into .gitlab-ci.yml?

I'm quite new to CI/CD. Basically, I'm trying to add a job to GitLab CI/CD that runs through the repo looking for secret leaks. It requires an API key to be passed to it. I was able to insert this key directly into .gitlab-ci.yml and it worked as it was supposed to, failing the job and showing that this happened because the key was in that file.
But I would like this API key to be stored in a .env file that won't be pushed to the remote repo, and to pull it from there into the .gitlab-ci.yml file somehow.
Here's mine:

stages:
  - scanning

gitguardian scan:
  variables:
    GITGUARDIAN_API_KEY: ${process.env.GITGUARDIAN_API_KEY}
  image: gitguardian/ggshield:latest
  stage: scanning
  script: ggshield scan ci
The pipeline fails with this message: Error: Invalid API key. So I assume the way I'm passing it into variables: is wrong.
CI variables should be available in the gitlab-runner (machine or container) as environment variables. They are either predefined and populated by GitLab, like the list of predefined variables here, or added by you in the settings of the repository or the GitLab group: Settings > CI/CD > Add Variable.
After adding the variable you can use the following syntax. You can check that the variable has the correct value by echoing it:
gitguardian scan:
  image: gitguardian/ggshield:latest
  stage: scanning
  variables:
    GITGUARDIAN_API_KEY: "$GITGUARDIAN_API_KEY"
  script:
    - echo "$GITGUARDIAN_API_KEY"
    - ggshield scan ci
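Note that a .env file that is never pushed cannot reach the runner at all, which is why project-level CI/CD variables are the right place for secrets. GitLab's dotenv artifact report is related but solves a different problem: passing non-secret variables from one job to a later job. A sketch:

```yaml
build:
  stage: build
  script:
    - echo "BUILD_VERSION=1.2.3" >> build.env
  artifacts:
    reports:
      dotenv: build.env  # variables in this file become available to later jobs

test:
  stage: test
  script:
    - echo "Version is $BUILD_VERSION"
```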

How to find the deployment environment of an Azure Function App in code

The scenario is:
We have Azure DevOps, and we can run a pipeline into one of x number of named environments.
We make use of Azure App Configuration, with labels for the values for each environment. So each setting might have a different value depending on the label.
It occurs to me that if I match the labels to the names of the environments, then in code, when I get a config value, if I can somehow determine which environment I've been deployed to (from the code's point of view), I can just pass this value when getting the app config and I will have the correct config settings for my environment.
var environment = // HERE find the environment I was deployed to, as chosen in the pipeline
var credentials = new DefaultAzureCredential();
configurationBuild.AddAzureAppConfiguration(options =>
{
    options.Connect(settings.GetValue<string>("ConnectionStrings:AppConfig"))
        .Select(KeyFilter.Any, LabelFilter.Null)
        .Select(KeyFilter.Any, labelFilter: environment);
});
I was thinking the solution would be to set the environment in azure-pipelines.yaml, where the pipeline somehow knows the choice of environment, and then to read it back out of an environment variable in code. But I don't know how to do that, or whether there is a better way. Thanks in advance for any help offered.
You can use pipeline variables to pass the environment value to your code. The pipeline variables you define in azure-pipelines.yaml get injected as environment variables on your platform, which allows you to read their values in your code using Environment.GetEnvironmentVariable().
So you can define a pipeline variable in azure-pipelines.yaml like the example below (i.e. DeployEnv):
parameters:
- name: Environment
  displayName: Deploy to environment
  type: string
  values:
  - none
  - test
  - dev

variables:
  DeployEnv: ${{ parameters.Environment }}

trigger: none

pool:
  vmImage: 'windows-latest'
Then you can get the pipeline variable (i.e. DeployEnv) in your code like below:
using System;

var environment = Environment.GetEnvironmentVariable("DeployEnv");
var credentials = new DefaultAzureCredential();
// ....
Another workaround is to define an environment property in a config file (e.g. web.config) and read that property in your code. In the pipeline you then add tasks that replace the value of the environment property in the config file. See this thread for more information.
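One caveat worth noting: a pipeline variable exists as an environment variable only on the build agent, while deployed Function App code reads the app's application settings at runtime. So for the deployed code to see DeployEnv, a deployment task has to push it into the Function App's settings. A sketch, assuming a hypothetical service connection and app name, and that deployment goes through the AzureFunctionApp task:

```yaml
- task: AzureFunctionApp@1
  inputs:
    azureSubscription: 'my-service-connection'   # hypothetical
    appType: functionApp
    appName: 'my-function-app'                   # hypothetical
    package: '$(Build.ArtifactStagingDirectory)/**/*.zip'
    # surfaces the pipeline variable as a runtime app setting
    appSettings: '-DeployEnv $(DeployEnv)'
```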

Azure Pipelines "Require Template" check not working

I am trying to get the "Require template" check working on a protected resource (Agent Pool, Service Connection, etc.) in my Azure Pipelines.
I've got a shared template set up in a common repository (named "goldenimage-azure-pipelines-templates") that is defined as follows:
# /templates/pipelines/master.yml
parameters:
- name: templates
  type: object
  default: []

stages:
- ${{ each template in parameters.templates }}:
  - ${{ each pair in template }}:
      ${{ if eq(pair.key, 'template') }}:
        ${{ template }}
Then I have a set of shared templates in the same repository that are referenced by the consuming azure-pipelines.yml file.
# /templates/stages/main.yml
stages:
- stage: mainBuild
  jobs:
  - template: /templates/jobs/set-version.yml
  - template: /templates/jobs/build-image.yml
  - template: /templates/jobs/cleanup-build.yml
  - template: /templates/jobs/test-image.yml
  - template: /templates/jobs/cleanup-test.yml
  - template: /templates/jobs/update-configmap.yml
  - template: /templates/jobs/destroy-template.yml
  - template: /templates/jobs/cleanup.yml
Now, in my consuming repository, I have the azure-pipelines.yml file defined as follows:
# azure-pipelines.yml
name: $(GitVersion.NuGetVersionV2).$(Build.BuildId)

trigger:
  branches:
    include:
    - master
  paths:
    exclude:
    - 'README.md'

resources:
  repositories:
  - repository: templates
    type: git
    name: goldenimage-azure-pipelines-templates
    ref: feature/WI443-baseTest

variables:
- template: /templates/vars/main.yml@templates
- template: /azure-pipelines/vars.yml

extends:
  template: templates/pipelines/master.yml@templates
  parameters:
    templates:
    - template: /templates/stages/main.yml
Then on my protected resource (Agent Pool or Service Connection) I've configured the "Require template" check to point at this template.
But whenever the build runs, it ALWAYS reports that it has failed this check.
I've tried changing the syntax for the ref to several different options, such as:

feature/WI443-baseTest
refs/heads/feature/WI443-baseTest
refs/tags/extend (I made this tag just for this test)

I've also tried adding and removing the leading slash on the path to the template, as well as adding @templates to the end of it.
In addition, I have added and removed the check on both the Service Connection and the Agent Pool (in case it would work with one but not the other).
No matter what I do, it reports that the run is not extending the template. However, I can see in the pipeline the jobs from the template, so it's obviously pulling it.
What am I doing wrong?
"No matter what I do, it reports that the run is not extending the template. However, I can see in the pipeline the jobs from the template, so it's obviously pulling it."
The direct cause of the issue is that your pipeline doesn't pass the Require Template check; I think the jobs are canceled because of that.
In my tests, the check worked well when all my resources were in a branch with a simple name like feature, but the same issue occurred when I used a branch named like feature/xxx. So it seems branch names containing a slash are not supported well by the Require Template check.
According to my tests the check works well for DevBranch, but not for Feature/Test. I suggest you post a feature request here to report this issue. Thanks for your help to make our product better :)
resources:
  repositories:
  - repository: templates
    type: git
    name: goldenimage-azure-pipelines-templates
    ref: feature/WI443-baseTest
The ref in the pipeline should use the fully qualified form: refs/tags/* for a tag or, in your case, refs/heads/feature/WI443-baseTest.
The same applies in the security approval; you may refer to this article for more information.
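Put together, the resource declaration with the fully qualified ref would look like this (a sketch of the fix described above):

```yaml
resources:
  repositories:
  - repository: templates
    type: git
    name: goldenimage-azure-pipelines-templates
    ref: refs/heads/feature/WI443-baseTest  # fully qualified branch ref
```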

How to use a pipeline template for multiple pipelines (in multiple projects) in Azure DevOps

I am new to working with Azure DevOps and I am trying to set up build pipelines for multiple projects that share a YAML template. Let me first show our projects' structure:
proj0-common/
|----src/
|----azure-pipelines.yml
|----pipeline-templates/
|    |----build-project.yml
|    |----install-net-core
proj1/
|----src/
|----azure-pipelines.yml
proj2/
|----src/
|----azure-pipelines.yml
proj3/
|----src/
|----azure-pipelines.yml
The first folder is our common project, in which we want to put our common scripts and packages and use them in the other projects. The rest of the folders (proj1-proj3) are .NET Core projects acting as microservices. As you can see, each project has its own azure-pipelines.yml pipeline file, and each project resides in its own repository on GitHub. The template pipeline files (build-project.yml and install-net-core) reside in the common project.
All the projects have the same build steps, so I would like to use the build-project.yml template for all three projects (instead of hardcoding every step in every file).
My problem is that since they reside in distinct projects, I cannot simply access the template files from, say, project3, by addressing them like this:
.
.
.
- template: ../proj0-common/pipeline-templates/build-project.yml
.
.
.
And [I believe] the reason is that each project will have its own isolated build pool (please do correct me on this if I am wrong).
I was thinking that if Azure DevOps had functionality similar to variable groups but for pipeline templates, that could solve my problem; however, I cannot find such a feature. Could someone suggest a solution to this problem?
Does this cover your use case? I experimented a bit after checking out some of the docs, which had some gaps, like most of Microsoft's other docs around Azure DevOps.
Say you have azdevops-settings.yml that specifies the pipeline in one of your service branches. In the example below it has two task steps that run an external template in another repository, but in one of them I supply a parameter that otherwise falls back to a default set in the template.
Notice I had to use the endpoint tag, otherwise it will complain. That is something that could be further specified in the docs.
# In ThisProject
# ./azdevops-settings.yml
resources:
  repositories:
  - repository: templates
    type: bitbucket
    name: mygitdomain/otherRepo
    endpoint: MyNameOfTheGitServiceConnection

steps:
- template: sometemplate.yml@templates
  parameters:
    Param1: 'Changed Param1'
- template: sometemplate.yml@templates
In the template, I first declare the parameters I want to pipe into it. I also tried referencing predefined variables, like the build id, without piping them, and they worked fine.
I tried both an inline script and a script path reference. The test.ps1 script just prints a string, as in the output below.
# otherRepo/sometemplate.yml
parameters:
  Param1: 'hello there'

steps:
- powershell: |
    Write-Host "Your parameter is now: $env:Param"
    Write-Host "When outputting standard variable build id: $(Build.BuildId)"
    Write-Host "When outputting standard variable build id via env: $env:BuildNumber"
    Write-Host "The repo name is: $(Build.Repository.Name)"
    Write-Host "The build definition name is: $(Build.DefinitionName)"
  env:
    Param: ${{ parameters.Param1 }}
    BuildNumber: $(Build.BuildId)
- powershell: './test.ps1'
And the separate powershell script:
# otherRepo/test.ps1
Write-Host "Running script from powershell specification"
Output:
========================== Starting Command Output ===========================
Your parameter is now: Changed Param1
When outputting standard variable build id: 23
When outputting standard variable build id via env: 23
The repo name is: mygitdomain/thisRepo
The build definition name is: ThisProject
Finishing: PowerShell
========================== Starting Command Output ===========================
Running script from powershell specification
Finishing: PowerShell
..and so on..
I found only one way to actually do that: you can reference the parent directory by using an absolute path. The key is to populate the root path using a system variable. The solution for your example:

- template: ${{ variables['System.DefaultWorkingDirectory'] }}/proj0-common/pipeline-templates/build-project.yml
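Alternatively, since each project lives in its own repository, the repository-resource approach from the answer above also applies here. A sketch for proj1's azure-pipelines.yml, assuming a hypothetical GitHub org path and service connection name, and that build-project.yml defines jobs:

```yaml
# proj1/azure-pipelines.yml
resources:
  repositories:
  - repository: common
    type: github                          # the repos are hosted on GitHub
    name: my-org/proj0-common             # hypothetical org/repo path
    endpoint: MyGitHubServiceConnection   # hypothetical service connection

stages:
- stage: build
  jobs:
  - template: pipeline-templates/build-project.yml@common
```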
