Create two versions of my software in GitLab CI

I am setting up Gitlab CI for the first time, and I want it to create two releases for each commit. My .gitlab-ci.yml file looks like this:
stages:
  - compile
  - test
  - build release

compile apps:
  stage: compile
  script:
    - scons
  artifacts:
    paths:
      - deliverables/

check version:
  stage: test
  script:
    - check_version.sh

build releasefile:
  stage: build release
  script:
    - build_release.sh
  artifacts:
    paths:
      - release/
For my second version, I want to run scons in compile apps with a flag (scons --special) and then run all subsequent jobs on those deliverables as well. My deliverables are named the same for both versions, so if I just create jobs for both the normal and the special version, my "check version" job will check the normal version twice. My options are:
1. Create a really long pipeline that runs everything for the normal version and then everything for the special version. I don't like this solution; it looks hideous and can make errors less visible when the pipeline is expanded later.
2. Change my scons and shell scripts.
3. Create two pipelines on each commit, one with a GitLab CI flag and one without (I don't know how to do this).
4. Create a "split" pipeline in which each job only uses artifacts from the job it is based on (I don't know how to do this).
For the last case, my pipeline would look something like this:
-----+----- Compile normal ------ Check version ----- Build releasefile
     |
     +----- Compile special ----- Check version ----- Build releasefile
I would prefer option 3 or 4 and I've been looking at Directed Acyclic Graph Pipelines, but I can't get those to work in the way I want. Is there a way to do either of these?

You can do this by creating a DAG pipeline with needs. If you don't use needs (or the older dependencies), all artifacts from previous stages will be downloaded, which in this case is problematic due to the overlap in the artifact folders / names.
You can also use extends to avoid duplication in your job declarations. The full .gitlab-ci.yml config could be something like this:
stages:
  - compile
  - test
  - build release

compile apps:
  stage: compile
  script:
    - scons
  artifacts:
    paths:
      - deliverables/

compile apps special:
  extends:
    - compile apps
  script:
    - scons --special

check version:
  stage: test
  script:
    - check_version.sh
  needs:
    - compile apps

check version special:
  extends:
    - check version
  needs:
    - compile apps special

build releasefile:
  stage: build release
  script:
    - build_release.sh
  artifacts:
    paths:
      - release/
  needs:
    - compile apps
    - check version

build releasefile special:
  extends:
    - build releasefile
  needs:
    - compile apps special
    - check version special
extends works well in this context because it doesn't merge YAML list items, it overwrites them (keys with different names would get merged, on the other hand). So in this case, the whole script and needs declarations get overwritten by the inheriting job.
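For example, after GitLab resolves extends, check version special ends up roughly equivalent to the following (stage and script inherited from check version, needs overwritten):

check version special:
  stage: test
  script:
    - check_version.sh
  needs:
    - compile apps special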

Related

run gitlab jobs sequentially

I have two simple stages (build and test), and I want the jobs in the pipeline to run sequentially.
Specifically, I want the test job to not run until the build job has passed completely.
My GitLab file:
stages:
  - build
  - test

build:
  stage: build
  script:
    - mvn clean package
  only:
    - merge_requests

test:
  stage: test
  services:
  script:
    - mvn verify
    - mvn jacoco:report
  artifacts:
    reports:
      junit:
        - access/target/surefire-reports/TEST-*.xml
    paths:
      - access/target/site/jacoco
    expire_in: 1 week
  only:
    - merge_requests
Can I add
needs:
  - build
in the test stage?
Given the simplicity of your build file, I do not think you actually need needs. Per the documentation, all stages are executed sequentially.
The pitfall you are in right now is the only reference. The build stage will run for any branch and, because of that, ignore merge requests. If you add an only directive to your build job, you might get the result you are looking for, like this:
build:
  stage: build
  script:
    - mvn clean package
  only:
    - merge_requests
    - master # might be main, develop, etc. - whatever your long-living branches are
This way it will not be triggered for each branch, but only for merge requests and the long-living branches; see the only documentation. Now the execution is not tied to the branch but to the merge request, and you should get your expected outcome (at least, that is what I assume).
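If you still want the dependency to be explicit, adding needs to the test job as you suggested is valid as well; here is a minimal sketch, assuming the job names from the question:

test:
  stage: test
  script:
    - mvn verify
    - mvn jacoco:report
  needs:
    - build   # wait specifically for the build job instead of the whole previous stage
  only:
    - merge_requests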

How to merge or add a new stage in gitlab-ci.yml which includes a common template yml with default list of stages and job definitions

Currently, our project's gitlab-ci.yml includes a common template YML with default stages and job definitions. I want to add a stage and a corresponding job specific to this project, which needs to run in between the stages defined in the template YML. Below are sample YML files which represent the scenario I am working on.
.mvn-template.yml
stages:
  - build
  - static-analysis
  - unit-test
  - package
  - integration-test

compile:
  stage: build
  tags:
    - docker
  artifacts:
    paths:
      - target/
    expire_in: 14 days
  script:
    - mvn $MAVEN_CLI_OPTS compile
  interruptible: true

# ... plus job definitions for static-analysis, unit-test, package and integration-test
gitlab-ci.yml
include:
  - project: 'xxx/common-pipeline'
    ref: x.x.x
    file: '/.mvn-template.yml'
...
Now I want to include a new stage specific to this project, say contract-testing, in between package and integration-test, so that contract-testing runs after package and before integration-test.
As this is specific to this project, I cannot include it in the common template; it needs to go into gitlab-ci.yml.
I could not figure out how to do this. I worked with GitHub earlier and only recently started on a project with GitLab.
Can I include the stage in the common template but not define the respective job there, and put the job definition in gitlab-ci.yml instead? Not sure if this is the correct way; there could well be better options to handle such a scenario.
Please help.
In my opinion, you can do it in two different ways (a sketch of option 1 follows the list):
1. Add the stage contract-testing in your template file and define the project-specific job in the .gitlab-ci.yml of your project. The configuration will be valid: GitLab concatenates and expands includes/anchors to build one single gitlab-ci file at the end. Note: you can define a stage even if there is no job using it (a sort of empty stage); it will not generate an error.
2. If you don't want this "empty stage" included in all the projects referring to the include, you can move the stages outside of your template and put them in the .gitlab-ci.yml of your project. As with solution 1, you need to define the job in the .gitlab-ci.yml of your project (not in the template).
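A minimal sketch of option 1, assuming the names from the question; the contract-test job name and its mvn command are made up for illustration:

# .mvn-template.yml (common template): declare the extra stage, even though no job here uses it
stages:
  - build
  - static-analysis
  - unit-test
  - package
  - contract-testing
  - integration-test

# .gitlab-ci.yml (project): include the template and add the project-specific job
include:
  - project: 'xxx/common-pipeline'
    ref: x.x.x
    file: '/.mvn-template.yml'

contract-test:
  stage: contract-testing
  script:
    - mvn $MAVEN_CLI_OPTS verify -Pcontract-tests   # hypothetical command for the contract tests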

GitLab CI - Run pipeline when the contents of a file changes

I have a mono-repo with several projects (not my design choice).
Each project has a .gitlab-ci.yml set up to run a pipeline when a "version" file is changed. This is nice because a user can check in to stage or master (for a hot-fix) and a build is created and deployed to a test environment.
The problem is when a user does a merge from master to stage and commits back to stage (to pull in any hot-fixes). This causes ALL the pipelines to run; even for projects that do not have actual content changes.
How do I allow the pipeline to run from master and/or stage but ONLY when the contents of the "version" file change? Like when a user changes the version number.
Here is an example of the .gitlab-ci.yml (I have 5 of these, 1 for each project in the mono-repo)
#
# BUILD-AND-TEST - initial build
#
my-project-build-and-test:
  stage: build-and-test
  script:
    - cd $MY_PROJECT_DIR
    - dotnet restore
    - dotnet build
  only:
    changes:
      - "MyProject/.gitlab-ci.VERSION.yml"
  # no needs: here because this is the first step

#
# PUBLISH
#
my-project-publish:
  stage: publish
  script:
    - cd $MY_PROJECT_DIR
    - dotnet publish --output $MY_PROJECT_OUTPUT_PATH --configuration Release
  only:
    changes:
      - "MyProject/.gitlab-ci.VERSION.yml"
  needs:
    - my-project-build-and-test
... and so on ...
I am still new to git, GitLab, and CI/pipelines. Any help would be appreciated! (I have little say in changing the mono-repo)
The following .gitlab-ci.yml will run test_job only if the version file changes.
test_job:
  script: echo hello world
  rules:
    - changes:
        - version
See https://docs.gitlab.com/ee/ci/yaml/#ruleschanges
See also
Run jobs only/except for modifications on a path or file
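Applied to the jobs from the question, a sketch could look like the following, keeping the job name and version-file path from the example above; the branch restriction via CI_COMMIT_BRANCH is an assumption about which branches should build:

my-project-build-and-test:
  stage: build-and-test
  script:
    - cd $MY_PROJECT_DIR
    - dotnet restore
    - dotnet build
  rules:
    # run only on master/stage, and only when this project's version file changed
    - if: '$CI_COMMIT_BRANCH == "master" || $CI_COMMIT_BRANCH == "stage"'
      changes:
        - "MyProject/.gitlab-ci.VERSION.yml"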

Gitlab CI script: exclude branches

I'm trying to improve the project build script described in a YML file. The improvement itself seems quite trivial, but the idea of accidentally ruining the auto-builds scares me a bit.
Right now there are several branches, version tags and other stuff in the project.
A development branch that is not built by the runners would be useful, because copying a huge project between virtual machines to test the build on different platforms is not convenient at all. So, I want to exclude some "prj-dev" branch from builds.
And there we have:
stages:
  - build
  - linuxbuild

job:
  tags:
    - win2008build
  stage: build
  script:
    # something complex

job1:
  tags:
    - linux
  stage: linuxbuild
  script:
    # something else complex
I googled and found a solution like:
stages:
  - build
  - linuxbuild

job:
  tags:
    - win2008build
  branches:
    except:
      - *dev-only
But it seems that our pipelines are quite different: the tags are not git tags, they are runner tags. So, I'm considering using a config like this instead:
stages:
  - build
  - linuxbuild

job:
  tags:
    - win2008build
  except:
    branches:
      - *dev-only
...which would mean "build as usual, but not my branch". There are complications in trying it both ways, and I'm pretty sure someone knows the recipe for sure.
So, if you please: how do I exclude my dev branch without changing the pipelines, using config only? Is it possible at all?
All you need to do is use except in the .gitlab-ci.yml file and add your branches directly below it, like this:
mybuild:
  stage: test
  image: somedockerimage
  script:
    - some script running
  except:
    - branch-name
This is working on my project without problems.
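Applied to the two jobs from the question, a sketch could look like the following, assuming the dev branch is literally named prj-dev and with placeholder script commands standing in for the real ones:

stages:
  - build
  - linuxbuild

job:
  tags:
    - win2008build
  stage: build
  script:
    - ./build.bat      # placeholder for the existing complex Windows build
  except:
    - prj-dev          # skip this job for the dev branch

job1:
  tags:
    - linux
  stage: linuxbuild
  script:
    - ./build.sh       # placeholder for the existing complex Linux build
  except:
    - prj-dev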

How to control whether a stage runs based on the previous stage's result, without using artifacts?

We have a project hosted on an internal Gitlab installation.
The Pipeline of the project has 3 stages:
Build
Tests
Deploy
The objective is to hide or disable the Deploy stage when the Tests stage fails.
The problem is that we can't use artifacts because they are lost each time our machines reboot.
My question: Is there an alternative solution to artifacts to achieve this task?
The .gitlab-ci.yml used looks like this:
stages:
  - build
  - tests
  - deploy

build_job:
  stage: build
  tags:
    # - ....
  before_script:
    # - ....
  script:
    # - ....
  when: manual
  only:
    - develop
    - master

all_tests:
  stage: tests
  tags:
    # - ....
  before_script:
    # - ....
  script:
    # - ....
  when: manual
  only:
    - develop
    - master

prod:
  stage: deploy
  tags:
    # - ....
  script:
    # - ....
  when: manual
  environment: prod
I think you might have misunderstood the purpose of the built-in CI. The goal is to have building and testing fully automated on each commit, or at least on every push. Having all tasks set to manual execution gives you almost no advantage over external CI tools like Jenkins or Bamboo. Your only advantage over local execution of the targets right now is having visibility in a central place.
That said, there is no way to conditionally show or hide CI jobs, because that goes against the basic idea. If you insist on your approach, you could look up the artifacts of the previous stages and abort the manual execution in case something is wrong.
The problem is that we can't use artifacts because they are lost each time our machines reboot
AFAIK artifacts are uploaded to the master and not saved on the runners. You should be fine having your artifacts passed from stage to stage.
By the way, the default for when is on_success, which means a job is executed only when all jobs from prior stages succeed.
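A minimal sketch of that idea, assuming the job names above, is simply to drop when: manual from the deploy job; the deploy command shown here is a placeholder:

prod:
  stage: deploy
  script:
    - ./deploy.sh        # placeholder for the real deployment commands
  environment: prod
  # note: no "when: manual" here; the default when: on_success means this job
  # runs only if all jobs in the prior stages (build, tests) succeeded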
