Can GitLab Pages be used for review apps on a mkdocs project?

This answer by joki to a previous question suggests that it is possible to deploy each active branch in a GitLab repo to a dynamic environment, by giving browsable artifacts a public URL.
Trying this out with a mkdocs material project, I've found two issues.
Firstly, if the GitLab repo is within a group or a subgroup, the URLs in the .gitlab-ci.yml file need to be something more like this:
environment:
  name: review/$CI_COMMIT_REF_NAME
  url: "$CI_PAGES_URL/-/jobs/$CI_JOB_ID/artifacts/public/index.html"
  auto_stop_in: 1 week
variables:
  PUBLIC_URL: "$CI_PAGES_URL/-/jobs/$CI_JOB_ID/artifacts/public/"
Secondly, relative links within the site don't work well, leading to a lot of 404 errors, and the loss of things like style files. Possibly the URLs above are not right, or maybe the site_url in mkdocs.yml needs changing to something like:
site_url: !!python/object/apply:os.getenv ["CI_ENVIRONMENT_URL"]
however, neither of these quite worked for me.
A minimal MR with a very small deployment and review app can be found here.
Does anyone have a working recipe for mkdocs review apps?

You can see the URL you need in the »Browse« button of the build step in your pipeline.
Does this work?
develop:
  artifacts:
    paths:
      - public
  environment:
    name: Develop
    url: "https://$CI_PROJECT_NAMESPACE.gitlab.io/-/snim2-test-subgroup/$CI_PROJECT_NAME/-/jobs/$CI_JOB_ID/artifacts/public/index.html"
  script: |
    # whatever
  stage: deploy
  variables:
    PUBLIC_URL: "/-/snim2-test-subgroup/$CI_PROJECT_NAME/-/jobs/$CI_JOB_ID/artifacts/public"
You'll also need your change to mkdocs.yml to actually use the PUBLIC_URL, and make sure it's used everywhere that absolute internal links are generated:
site_url: !!python/object/apply:os.getenv ["PUBLIC_URL"]
use_directory_urls: false
…
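For reference, a hedged sketch of a complete review job combining the fragments above, assuming mkdocs.yml reads site_url from PUBLIC_URL as shown (the python image and the pip install line are illustrative assumptions, not taken from the original MR):

review:
  stage: deploy
  image: python:3.11
  variables:
    PUBLIC_URL: "$CI_PAGES_URL/-/jobs/$CI_JOB_ID/artifacts/public"
  script:
    - pip install mkdocs-material
    - mkdocs build --site-dir public
  artifacts:
    paths:
      - public
  environment:
    name: review/$CI_COMMIT_REF_NAME
    url: "$PUBLIC_URL/index.html"
    auto_stop_in: 1 week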

Related

How to configure a custom code quality check in Gitlab?

I'm trying to configure a .NET project code quality check in GitLab Enterprise Edition 15.8.1-ee (Premium tier), but the GitLab UI doesn't show any code issues.
Since I'm going to use a custom code inspection tool (the JetBrains InspectCode command line tool), I've written a converter that reformats the JetBrains report into the GitLab JSON format (https://docs.gitlab.com/ee/ci/testing/code_quality.html#implement-a-custom-tool). For testing purposes, I've prepared a GitLab code quality report, added it to the repository, and added an additional GitLab job to provide the file to the CI pipeline.
Prepared GitLab code quality report (gl-code-quality-report.json) part:
[
  {
    "description": "Using directive is not required by the code and can be safely removed",
    "fingerprint": "a3d5c2a9-1761-4a18-8e17-35df9e2bc3a6",
    "severity": "critical",
    "location": {
      "path": "src/folder/Class.cs",
      "lines": {
        "begin": 8
      }
    }
  }
  ...
]
.gitlab-ci.yml part (since the report is already pregenerated, the PowerShell script does nothing):
check-code-quality:
  stage: check-code-quality
  only: ['branches']
  dependencies:
    - build
  script: ['powershell.exe .\build\check-code-quality.ps1']
  artifacts:
    when: always
    expire_in: 4 days
    reports:
      codequality: gl-code-quality-report.json
Current result: the CI pipeline doesn't fail. The pipeline has a new job 'check-code-quality' and there is a new tab on the pipeline page, Code quality. Unfortunately, the tab shows the text "No code quality issues found." On the merge request page there is a new section with the text "Code Quality hasn't changed."
The check-code-quality job log contains:
gl-code-quality-report.json: found 1 matching files and directories
Uploading artifacts as "codequality" to coordinator... ok id=1684071 responseStatus=201 Created token=64_yasyB
Why can't I see any issues in the GitLab UI? Please tell me what I'm doing wrong.
I have multiple things in mind.
First of all, your JSON structure could be invalid. Make sure that the JSON file conforms to the GitLab JSON format as described in the docs.
Another problem could be that the location field is incorrect. It specifies the path to the file that contains the code quality issue. Make sure that the path is correct and accessible in your repository.
I would also check the artifact path. Please verify that the path to the JSON file is specified in the artifacts field of your .gitlab-ci.yml (see the sketch below).
In some cases it might also be related to a cache issue, so try clearing the cache.
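A hedged sketch of what that artifacts section could look like, with the file also listed under paths: so the uploaded artifact is browsable (file name as in the question; merge with the existing job):

check-code-quality:
  # ... rest of the job as above ...
  artifacts:
    when: always
    expire_in: 4 days
    paths:
      - gl-code-quality-report.json
    reports:
      codequality: gl-code-quality-report.json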
I've found that my pre-generated file's encoding was UTF-8 with BOM, and it seems GitLab doesn't recognize data in this encoding. When I changed the encoding to UTF-8 (without BOM), GitLab showed the code quality widget and all the issues described in the provided JSON file.
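If the report is produced by a tool whose output encoding you don't control, one hedged workaround is to rewrite the file without a BOM in an extra script line before upload; the .NET UTF8Encoding($false) constructor writes UTF-8 with no BOM (the file path is the report from above, and this extra step is an assumption, not part of the original fix):

check-code-quality:
  script:
    - powershell.exe .\build\check-code-quality.ps1
    # rewrite the report as UTF-8 without a BOM so GitLab can parse it
    - powershell.exe -Command "[IO.File]::WriteAllText((Join-Path $PWD 'gl-code-quality-report.json'), (Get-Content -Raw 'gl-code-quality-report.json'), [Text.UTF8Encoding]::new($false))"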

How do you develop a custom plugin for Gitlab CICD?

I need to integrate a Gitlab CICD pipeline with a custom Delivery Manager Tool. My pipeline will need to invoke the delivery manager API with some info.
In Jenkins we developed a plugin that provided a pipeline step, void deployEnv(String app, String environment), that can be used in the different stages, e.g.:
deployEnv("applicationx", "production")
Is there a way to develop a similar plugin in Gitlab CICD?
Is it possible to invoke a remote URL from a Gitlab CICD pipeline passing some credentials?
The closest analog to this kind of "plugin" in GitLab CI is probably a templated job definition, and there are a few ways to formulate one. The basic tools for abstracting and including job definitions provided by others are: include:, extends:, !reference, "hidden job" keys, and YAML anchors.
Providing reusable templates
If you just need to provide an abstraction for a series of steps, a "hidden key" definition would be the closest to what you want.
Consider the following template YAML. This might be embedded directly in your .gitlab-ci.yml file(s), or you might choose to include it in any number of configurations from a remote location using the include: keyword.
In this fictional example, we provide a script step that expects two environment variables to be present: APPLICATION_NAME and DEPLOYMENT_ENV. These variables are used to call a remote API, passing those values as path parameters. Here, the definitions are provided in a "hidden job" key:
.deploy_to_env:
  image: curlimages/curl  # or otherwise have `curl` in your environment
  script:
    # POSIX test syntax, since the curl image's shell is not bash
    - |
      if [ -z "$APPLICATION_NAME" ] || [ -z "$DEPLOYMENT_ENV" ]; then
        echo "FATAL: you must set APPLICATION_NAME and DEPLOYMENT_ENV variables"
        exit 1
      fi
    - curl -XPOST "https://my-deployment-api.example.com/${APPLICATION_NAME}/${DEPLOYMENT_ENV}"
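For the second part of the question (invoking a remote URL with credentials), one hedged option is to store a token as a masked CI/CD variable and send it as a header; DEPLOY_API_TOKEN is an assumed variable name, not part of the original:

# e.g., replacing the curl line in the hidden job
# (DEPLOY_API_TOKEN is a hypothetical masked CI/CD variable):
- 'curl -XPOST -H "Authorization: Bearer ${DEPLOY_API_TOKEN}" "https://my-deployment-api.example.com/${APPLICATION_NAME}/${DEPLOYMENT_ENV}"'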
Let's assume this template YAML is saved in a file named deploy.yml in a project whose path is my-org/templates.
Using templates
Now let's say a pipeline configuration wants to leverage the above definition to deploy an application named applicationx to production.
First, in any case, the project should include: the remote definition (unless you choose to embed it directly -- e.g., copy/paste).
include:
  - project: my-org/templates
    file: deploy.yml
    ref: main  # or any git ref, or omit to use the default branch
Then you can use the extends: keyword to form a concrete job from the hidden key.
deploy_production:
  stage: deploy
  extends: .deploy_to_env
  variables:
    APPLICATION_NAME: "applicationx"
    DEPLOYMENT_ENV: "production"
Or, if you want to embed the deployment steps in the middle of other script steps, !reference is useful:
deploy_production:
  stage: deploy
  script:
    - export APPLICATION_NAME="applicationx"
    - export DEPLOYMENT_ENV="production"
    # these could also be set in `variables:`
    - echo "calling deployment API to deploy ${APPLICATION_NAME} to ${DEPLOYMENT_ENV}"
    - !reference [.deploy_to_env, script]
    - echo "done"
There are a lot of ways to handle this; these are just two examples.

Can I pass a variable from .env file into .gitlab-ci.yml

I'm quite new to CI/CD and basically I'm trying to add a job to GitLab CI/CD that will run through the repo looking for secret leaks. It requires an API key to be passed to it. I was able to insert this key directly into .gitlab-ci.yml and it worked as it was supposed to: the job failed and showed that this happened due to this key being in that file.
But I would like to have this API key stored in a .env file that won't be pushed to the remote repo, and to pull it somehow into the .gitlab-ci.yml file from there.
Here's my current config:
stages:
  - scanning

gitguardian scan:
  variables:
    GITGUARDIAN_API_KEY: ${process.env.GITGUARDIAN_API_KEY}
  image: gitguardian/ggshield:latest
  stage: scanning
  script: ggshield scan ci
The pipeline fails with this message: Error: Invalid API key. So I assume that the way I'm passing it into variables is wrong.
CI variables should be available in the gitlab-runner (machine or container) as environment variables. They are either predefined and populated by GitLab, like the list of predefined variables here, or added by you in the settings of the repository or the GitLab group (Settings > CI/CD > Add Variable).
After adding the variable you can use the following syntax; you can test whether the variable has the correct value by echoing it:
variables:
  GITGUARDIAN_API_KEY: "$GITGUARDIAN_API_KEY"
script:
  - echo "$GITGUARDIAN_API_KEY"
  - ggshield scan ci
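Applied to the original job, a minimal sketch (this assumes GITGUARDIAN_API_KEY was added as a masked project CI/CD variable; settings variables are injected into the job environment automatically, so the job-level variables: mapping is not strictly required):

stages:
  - scanning

gitguardian scan:
  image: gitguardian/ggshield:latest
  stage: scanning
  script: ggshield scan ci  # reads GITGUARDIAN_API_KEY from the environment

If the variable is masked, echoing it prints [MASKED] in the job log rather than the actual key.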

Gitlab CI: Trigger different child pipelines from parent pipeline based on directory of changes

I would like to use parent/child pipelines (https://docs.gitlab.com/ee/ci/parent_child_pipelines.html) in this way.
I have this source structure:
- backend
--- .gitlab-ci.yml
--- src
- frontend
--- .gitlab-ci.yml
--- src
- .gitlab-ci.yml
I want to trigger the backend or frontend .gitlab-ci.yml based on the path where a new commit happens: if it happened under frontend, only frontend/.gitlab-ci.yml should be used for build/publish.
Is it possible?
You can specify to execute different pipelines based on where the changes in the code occurred using the only: changes configuration documented here.
You can therefore specify that the frontend pipeline is executed only if changes happen within the frontend folder (analogously for backend).
You can use the include: local feature (documented here) to include the frontend/.gitlab-ci.yml file within the frontend pipeline that is defined in the root .gitlab-ci.yml.
For examples on how to exactly configure the pipeline so that it triggers a configuration provided in a local file, please see here.
Parent-child pipelines also support the only: changes configuration as documented here.
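Putting those pieces together, a hedged sketch of the root .gitlab-ci.yml (the job names are illustrative; on newer GitLab versions the same condition is usually written with rules: changes):

frontend:
  trigger:
    include:
      - local: frontend/.gitlab-ci.yml
  only:
    changes:
      - frontend/**/*

backend:
  trigger:
    include:
      - local: backend/.gitlab-ci.yml
  only:
    changes:
      - backend/**/*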

How to use a pipeline template for multiple pipelines (in multiple projects) in Azure DevOps

I am new to working with Azure DevOps and I am trying to set up build pipelines for multiple projects and share a yml template between them. I will demonstrate more clearly what I want to achieve but first let me show you our projects' structure:
proj0-common/
|----src/
|----azure-pipelines.yml
|----pipeline-templates/
|----build-project.yml
|----install-net-core
proj1/
|----src/
|----azure-pipelines.yml
proj2/
|----src/
|----azure-pipelines.yml
proj3/
|----src/
|----azure-pipelines.yml
The first folder is our Common project in which we want to put our common scripts and packages and use them in the other projects. The rest of the folders (proj1-proj3) are .NET Core projects and act as microservice projects. As you can see, each project has its own azure-pipelines.yml pipeline file, and each project resides in its own repository in GitHub. Then there are the template pipeline files (build-project.yml and install-net-core) which reside in the common project.
All the projects have the same build steps, therefore I would like to use the build-project.yml template for all the three projects (instead of hardcoding every step in every file).
My problem is that since they reside in distinct projects, I cannot access the template files simply, let's say from project3, by just addressing it like this:
.
.
.
- template: ../proj0-common/pipeline-templates/build-project.yml
.
.
.
And [I believe] the reason is that each project will have its own isolated build pool (please do correct me on this if I am wrong).
I was thinking that if Azure DevOps had functionality similar to variable groups but for pipeline templates, that could solve my problem; however, I cannot find such a feature. Could someone suggest a solution to this problem?
You could copy this use case. I experimented a bit after checking out some of the docs, which had some gaps, like most of Microsoft's other docs around Azure DevOps.
Say you have an azdevops-settings.yml that specifies the pipeline in one of your service branches. In the example below it has two steps that run an external template from another repository, but in one of them I supply a parameter that is otherwise set to a default in the template.
Notice I had to use the endpoint tag, otherwise it complains; that is something that could be spelled out more clearly in the docs.
# In ThisProject
# ./azdevops-settings.yml
resources:
  repositories:
    - repository: templates
      type: bitbucket
      name: mygitdomain/otherRepo
      endpoint: MyNameOfTheGitServiceConnection

steps:
  - template: sometemplate.yml@templates
    parameters:
      Param1: 'Changed Param1'
  - template: sometemplate.yml@templates
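As an aside, if you need to pin the template repository to a particular branch or tag, the repository resource also accepts a ref (a hedged addition; refs/heads/main is illustrative):

resources:
  repositories:
    - repository: templates
      type: bitbucket
      name: mygitdomain/otherRepo
      endpoint: MyNameOfTheGitServiceConnection
      ref: refs/heads/main  # branch or tag to resolve the template from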
In the template I first have the available parameters that I want to pipe into the template. I tried referencing parameters without piping them, like the build id and other predefined variables, and they worked fine.
I also tried using an inline script as well as a script path reference. The test.ps1 just prints a string, like in the output below.
# otherRepo/sometemplate.yml
parameters:
  Param1: 'hello there'

steps:
  - powershell: |
      Write-Host "Your parameter is now: $env:Param"
      Write-Host "When outputting standard variable build id: $(Build.BuildId)"
      Write-Host "When outputting standard variable build id via env: $env:BuildNumber"
      Write-Host "The repo name is: $(Build.Repository.Name)"
      Write-Host "The build definition name is: $(Build.DefinitionName)"
    env:
      Param: ${{parameters.Param1}}
      BuildNumber: $(Build.BuildId)
  - powershell: './test.ps1'
And the separate powershell script:
# otherRepo/test.ps1
Write-Host "Running script from powershell specification"
Output:
========================== Starting Command Output ===========================
Your parameter is now: Changed Param1
When outputting standard variable build id: 23
When outputting standard variable build id via env: 23
The repo name is: mygitdomain/thisRepo
The build definition name is: ThisProject
Finishing: PowerShell
========================== Starting Command Output ===========================
Running script from powershell specification
Finishing: PowerShell
..and so on..
I found only one solution to actually do that. You can reference the parent directory by using an absolute path. The key was to populate the root path using a system variable. The solution for your example:
- template: ${{variables['System.DefaultWorkingDirectory']}}/proj0-common/pipeline-templates/build-project.yml
