How do I include a binary program in a GitLab CI pipeline?

I have a GitLab repository containing Protocol Buffers code (i.e., .proto files).
The specifics of what I wish to accomplish shouldn't matter to my main question, but the context may help, so here goes. I wish to add a Protocol Buffers source code formatter to the repository's GitLab CI pipeline that checks whether the proto files in any merge request are correctly formatted (say, by running the formatter on the source files and checking that no changes are produced). The formatter I've settled on for this task is clang-format.
Clang-format is distributed as a binary. What is the right approach to integrate the binary into the GitLab CI pipeline for this purpose? By right approach, I'm mainly looking for:
solutions that are idiomatic and follow best practices; and
solutions that are efficient (e.g., not re-downloading the binary for each run, if possible)

There are a number of Docker images for clang-format on Docker Hub. Take a look at them and choose one to use. maranov/clang-git says it was specifically created for use in GitLab CI pipelines, so that might be worth a try.
Update your .gitlab-ci.yml file to use the image:
my-job:
  stage: my-stage
  image: maranov/clang-git
  script:
    - echo "run what you need to"
Instead of using a pre-made image, you could create your own.
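Putting it together for the proto-formatting use case, a sketch of a complete check job might look like the following (the image choice is the one mentioned above and is assumed to ship both clang-format and git; the file layout is an assumption):

```yaml
format-check:
  stage: test
  image: maranov/clang-git  # assumed to contain both clang-format and git
  script:
    # Reformat all .proto files in place, then fail if anything changed.
    - find . -name '*.proto' -exec clang-format -i {} +
    - git diff --exit-code -- '*.proto'
```

Regarding efficiency: the runner caches pulled images on the host, so the clang-format binary is not re-downloaded for each run once the image is cached.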

Related

How to write automated test for Gitlab CI/CD configuration?

I have noticed that the complexity of my GitLab CI/CD configuration grows fairly fast. Normally, in either a programming language or infrastructure code, I could write automated tests for the code.
Is it possible to write automated tests for the .gitlab-ci.yml file itself?
Is there any library or testing framework for it?
Ideally I would like to set up the different environment variables, execute the .gitlab-ci.yml file, and make assertions on the output.
I am not aware of any tool that currently really tests behaviour, but I ended up with two approaches:
1. Testing in smaller chunks, aka unit testing my building blocks.
I extracted my builds into smaller chunks which I can include and test separately. This works fine for script blocks etc., but is not really sufficient for rules blocks. It still offers great value; e.g., I use the templates for some steps as verification for the generated Docker image, which will be used by those steps.
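As an illustration of the building-block approach, the reusable chunks can live in separate files that the main pipeline pulls in via include (paths and job names here are illustrative):

```yaml
# ci/build.yml - a reusable building block, testable on its own
.build-template:
  stage: build
  script:
    - make build

# .gitlab-ci.yml - the main pipeline, composed of the blocks
include:
  - local: ci/build.yml

build:
  extends: .build-template
```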
2. Verification via gitlab-ci-local.
https://github.com/firecow/gitlab-ci-local is a tool which allows you to run your pipeline locally and to provide environment variables. There are some minor issues with branch name resolution in PR pipelines, but besides that it works great. I use it to test GitLab CI files in GitHub Actions.
I hope that this helps somehow.
GitLab has a linter for .gitlab-ci.yml.
To access the CI Lint tool, navigate to CI/CD > Pipelines or CI/CD > Jobs in your project and click CI lint.
Docs

How to create a common pipeline in GitLab for many similar projects

We have hundreds of similar projects in GitLab which have the same structure inside.
To build these projects we use one common TeamCity build. We trigger it and pass the project's GitLab URL along with other parameters via the API, so the TeamCity build knows which exact project needs to be fetched/cloned. The TeamCity VCS root accepts the target URL via a parameter.
The question is how to replace existing TeamCity build with a GitLab pipeline.
I see the general approach is to have the CI/CD configuration file (.gitlab-ci.yml) directly in the project. Since the structure of the projects is the same, duplicating the same CI/CD config file across all projects is not an option.
I'm wondering: is it possible to create a common pipeline for several projects which can accept the target project URL via a parameter?
You can store the full CI/CD config in one repository and put a simple .gitlab-ci.yml, which includes the shared file, in all your projects.
With this approach there is no redundant definition of the jobs.
Still, you can add project-specific jobs (in the respective .gitlab-ci.yml files, or define variables in a project and use some jobs conditionally). You can also include multiple other definition files, e.g. if you have several sets of similar projects.
cf. https://docs.gitlab.com/ee/ci/yaml/#include
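A sketch of the per-project stub using a cross-project include (the project path, ref, and file name are placeholders):

```yaml
# .gitlab-ci.yml in each of the similar projects
include:
  - project: my-group/ci-templates
    ref: main
    file: /common-pipeline.yml

variables:
  # project-specific values consumed by the shared jobs
  ARTEFACT: my-app
```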
As of GitLab 13.9 there are even more referencing methods possible: https://docs.gitlab.com/ee/ci/yaml/README.html#reference-tags
As @MrTux already pointed out, you can use includes.
You can use them either to include a whole CI file or just certain steps. In Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location you can find a detailed explanation with examples of both usages.

How to change pipeline badge name

As the standard pipeline badges from GitLab all look the same, you can tell pretty well that they are not really distinguishable.
Is there a way to change the pipeline text manually or programmatically to something else for each badge?
Btw, the badges were added with those links
https://gitlab.com/my-group/my-repository/badges/master/pipeline.svg
https://gitlab.com/my-group/my-repository/badges/dev/pipeline.svg
Additional facts:
The pipeline runs locally on my computer
My repo is private
I know it is a bit of an old post, but I was looking for the same thing and found that it has been available since GitLab 13.1.
The text for a badge can be customized to differentiate between multiple coverage jobs that run in the same pipeline. Customize the badge text and width by adding the key_text=custom_text and key_width=custom_key_width parameters to the URL:
https://gitlab.com/gitlab-org/gitlab/badges/main/coverage.svg?job=karma&key_text=Frontend+Coverage&key_width=130
The example is for the Coverage badge but this also works for Pipelines, so in your case:
https://gitlab.com/my-group/my-repository/badges/master/pipeline.svg?key_text=master&key_width=50
https://gitlab.com/my-group/my-repository/badges/dev/pipeline.svg?key_text=dev&key_width=50
(Found this via https://microfluidics.utoronto.ca/gitlab/help/ci/pipelines/settings.md#custom-badge-text)
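For instance, such a customized badge URL can be assembled programmatically; here is a small sketch (the group/project names are placeholders):

```python
from urllib.parse import urlencode

def pipeline_badge_url(group, project, ref, key_text, key_width):
    """Build a GitLab pipeline badge URL with a custom label and width."""
    base = f"https://gitlab.com/{group}/{project}/badges/{ref}/pipeline.svg"
    query = urlencode({"key_text": key_text, "key_width": key_width})
    return f"{base}?{query}"

# Placeholder group/repository names, mirroring the URLs above.
print(pipeline_badge_url("my-group", "my-repository", "dev", "dev", 50))
# -> https://gitlab.com/my-group/my-repository/badges/dev/pipeline.svg?key_text=dev&key_width=50
```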
There are multiple ways how you can achieve custom pipeline badges in GitLab.
One way could be to use Shields.io, which provides a way to generate dynamic badges for your GitLab repository via a JSON file. But if your repository is private (only accessible from an internal network), then you will get an inaccessible message in your badges.
Otherwise, if your build uses Python Docker images or any other Python installation with dependencies, you can simply install the anybadge package and generate SVG badges to be used in the project directly from the artifacts.
It would be good if, in the future, GitLab offered us a cleaner way to customize the badges, but for now I think those are the workaround solutions.

CodeQuality using `include` statement for GitLab v11.11 and later versions?

Today, while trying to set up CodeQuality in my .gitlab-ci.yml (on gitlab-ee 11.10, gitlab-runner 11.10), I encountered the following issues:
The first thing the GitLab documentation tells you is that it is possible to set this up with the help of a Docker-in-Docker gitlab-runner and a single configuration line:
include:
  - template: Code-Quality.gitlab-ci.yml
There is also a disclaimer that this is supported in GitLab 11.11 or later, which is strange because as of April 23rd 2019, the latest version is 11.10. I'm not sure if this is a typo or if they publish documentation before the releases are actually available.
I tried following these instructions, but many things are unclear:
I realized that the include statement should be added after the stages definition for the syntax check to pass.
A code_quality job showed up and passed.
However, later I learned from this answer that I need to create a .codeclimate.yml file and somehow add it to .gitlab-ci.yml. That answer shares two links that can be used to understand how to work with CodeClimate, but I haven't figured out a way to add it to .gitlab-ci.yml. I found some examples in this GitLab-related page, BUT they do not use the include statement described in the docs.
I'm unable to find the report of the code_quality job that passed. In this answer someone pointed out that the report is only available for download on the merge request for gitlab-ee. However, this is not practical, since devs would then have to start spamming mock merge requests just to see if their code suffered downgrades.
The gitlab-ci.yml I use looks something like this:
image: docker:stable

variables:
  ARTEFACT: my_app
  VERSION: 0.1
  DOCKER_HOST: tcp://docker:2375/
  DOCKER_DRIVER: overlay2

services:
  - docker:dind

before_script:
  - docker info

stages:
  - build
  - test
  - deploy

build:
  stage: build
  script:
    - docker build -t $ARTEFACT:$VERSION-DEV .

test:
  stage: test
  script:
    - docker run --rm --env MODE=DEV $ARTEFACT:$VERSION-DEV ./my_test.sh

include:
  - template: Code-Quality.gitlab-ci.yml
Ideally, this should be as simple as figuring out how to use CodeClimate for Python applications and then adding its configuration to the repo and referencing it in .gitlab-ci.yml, correct? But how do I make that reference? Is there any clear documentation available somewhere?
EDIT: I now know that jobs are independent and that I should modify the build stage in the .gitlab-ci.yml above to push the built image somewhere other jobs can pull from. Still, this doesn't help to solve the CodeQuality problem, I think.
First, yes, GitLab often has docs for features of the coming release before the official release. The docs often apply to pre-release versions which are often running on GitLab.com.
The docs could probably be clearer, so I encourage you to file an issue with specifics on how the docs could be improved or even submit a MR if you feel comfortable with that.
Do note that code quality is a paid feature so requires a Starter license for a self-managed install (or Bronze on GitLab.com).
The code quality process stores information from the first run in a codeclimate.json file. This is merely a reference for future runs and is not the code quality report itself. On subsequent runs, the code climate report will be viewable in the merge request which kicked off the pipeline.
In order for the code climate report to appear in the Merge Request, the following conditions must be met:
You've defined a job in your .gitlab-ci.yml which produces the Code Quality report artifact
The job must not be the first run. If the Code Quality report doesn’t have anything to compare to, no information will be displayed in the merge request area. That is the case when you add the Code Quality job in your .gitlab-ci.yml for the very first time. Consecutive merge requests will have something to compare to and the Code Quality report will be shown properly.
A code error must be present. There has to be a degradation/error in your code as this is what will be displayed in the merge request page. If the code is correct there will be nothing displayed.
The .json file must not be too large. There is an existing issue which prevents the code report from displaying when the related .json file is very large.
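For reference, the artifact condition above corresponds to a job that declares a Code Quality report artifact, roughly like this (the script line is a placeholder for whatever produces the report):

```yaml
code_quality:
  stage: test
  script:
    - ./run-code-quality.sh  # placeholder for the actual analysis command
  artifacts:
    reports:
      codequality: gl-code-quality-report.json
```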

How to update repository with built project?

I'm trying to set up GitLab CI/CD for an old client-side project that makes use of Grunt (https://github.com/yeoman/generator-angular).
Up to now the deployment worked like this:
run 'grunt build' locally, which builds the project and creates files in a 'dist' folder in the root of the project
commit the changes
changes pulled onto the production server
After creating the .gitlab-ci.yml and making a commit, the GitLab CI/CD job passes, but the files in the 'dist' folder in the repository are not updated. If I define an artifact, I will get the changed files in the download. However, I would prefer the files in the 'dist' folder in the repository to be updated so we can carry on with the same workflow, which suits us. Is this achievable?
I don't think committing into your repo inside a pipeline is a good idea. Version control wouldn't be as clear, and some people have their pipeline triggered automatically when the repo is pushed; that would trigger a loop of pipelines.
Instead, you might reorganize your environment to use Docker; there are numerous reasons for using Docker in professional and development environments. To name just a few: it would enable you to save the freshly built project into a registry and reuse it whenever needed, at exactly the version you require and with the desired /dist inside, so you can easily run it in multiple places, scale it, manage it, etc.
If you changed to Docker, you wouldn't actually have to do a thing to keep dist persistent; just push the image to the registry after the build is done.
But to actually answer your question:
There is a feature request that has been open for a very long time for the same problem you asked about: here. Currently there is no safe and professional way to do it, as GitLab members state. You can, however, push back changes, as one of the GitLab members (Kamil Trzciński) suggested:
git push http://gitlab.com/group/project.git HEAD:my-branch
Just put it in the script section of your gitlab-ci file.
There are more hacky methods presented there, but be sure to acknowledge the risks that come with them (pipelines become more error prone and, if configured in a wrong way, they might for example publish confidential information or trigger an infinite pipeline loop, to name a few).
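A sketch of such a push-back job, assuming a masked CI variable CI_PUSH_TOKEN holds an access token with write access (the token name, branch, and project path are placeholders; the [skip ci] marker avoids the pipeline loop mentioned earlier):

```yaml
update-dist:
  stage: deploy
  script:
    - grunt build
    - git add dist
    # [skip ci] in the commit message keeps this push from triggering another pipeline.
    - git commit -m "Update dist [skip ci]" || echo "nothing to commit"
    - git push "https://oauth2:${CI_PUSH_TOKEN}@gitlab.com/group/project.git" HEAD:my-branch
```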
I hope you found this useful.