Create 2 Pipelines for a Node Project in GitLab

I'm trying to run 2 pipelines for a project in GitLab, but I can't find any way to do it.

In GitLab CI you can't explicitly create multiple pipelines for one project. There are cases where multiple pipelines will run simultaneously, such as when you have jobs that run only for merge requests and other jobs that do not.
That said, there are ways to obtain the effect of running multiple series of jobs independently from one another.
The hacky way, before GitLab CE 12.2
If you want to start 2 pipelines for the same project, you can use pipeline triggers. This method is limited to 2 pipelines, and GitLab CI is not really meant to be used this way; usually, triggers are used to start pipelines in another project.
All in your .gitlab-ci.yml:
stages:
  - start
  - build

###################################################
#                 First pipeline                  #
###################################################
start_other_pipeline:
  stage: start
  script:
    # Trigger a pipeline for the current project on the current branch
    - curl --request POST --form "token=$CI_JOB_TOKEN" --form ref=$CI_COMMIT_REF_NAME $CI_API_V4_URL/projects/$CI_PROJECT_ID/trigger/pipeline
  except:
    - pipelines

build_first_pipeline:
  stage: build
  script:
    - echo "Building first pipeline"
  except:
    - pipelines

###################################################
#                 Second pipeline                 #
###################################################
# Will run independently of the first pipeline.
build_second_pipeline:
  stage: build
  script:
    - echo "Building second pipeline"
  only:
    - pipelines
To clean up this mess of a .gitlab-ci.yml, you can use the include keyword:
# .gitlab-ci.yml
include:
  - '/first-pipeline.yml'
  - '/second-pipeline.yml'

stages:
  - start
  - build

# This starts the second pipeline. The first pipeline is already running.
start_other_pipeline:
  stage: start
  script:
    # Trigger a pipeline for the current project on the current branch
    - curl --request POST --form "token=$CI_JOB_TOKEN" --form ref=$CI_COMMIT_REF_NAME $CI_API_V4_URL/projects/$CI_PROJECT_ID/trigger/pipeline
  except:
    - pipelines

# first-pipeline.yml
build_first_pipeline:
  stage: build
  script:
    - echo "Building first pipeline"
  except:
    - pipelines

# second-pipeline.yml
build_second_pipeline:
  stage: build
  script:
    - echo "Building second pipeline"
  only:
    - pipelines
The reason this works is the use of only and except in the jobs. The jobs marked with
except:
  - pipelines
do not run when the pipeline was started by a trigger coming from another pipeline, so they don't run in the second pipeline. On the other hand,
only:
  - pipelines
does the exact opposite, so those jobs run only when the pipeline is triggered by another pipeline, i.e. only in the second pipeline.
The probably right way, depending on your needs ;)
In GitLab CE 12.2, it is possible to define a Directed Acyclic Graph to specify the order in which your jobs run. This way, a job can start as soon as the job it depends on (listed in needs) finishes, instead of waiting for the whole previous stage.
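A minimal sketch, assuming GitLab 12.2+ (job names are illustrative): test_a starts as soon as build_a finishes, without waiting for build_b.
stages:
  - build
  - test

build_a:
  stage: build
  script: echo "building a"

build_b:
  stage: build
  script: echo "building b"

test_a:
  stage: test
  needs: [build_a]   # start as soon as build_a finishes, ignoring build_b
  script: echo "testing a"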

As of GitLab 12.7 it is also possible to use parent-child pipelines for this:
# .gitlab-ci.yml
trigger_child:
  trigger:
    include: child.yml

do_some_stuff:
  script: echo "doing some stuff"

# child.yml
do_some_more_stuff:
  script: echo "doing even more stuff"
The trigger_child job completes successfully once the child pipeline has been created.
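If you want the calling pipeline to wait for the child pipeline and mirror its status instead, a sketch of the same bridge job would add strategy: depend to the trigger:
trigger_child:
  trigger:
    include: child.yml
    strategy: depend   # wait for the child pipeline and mirror its status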

Related

How to make a stage depend on another stage?

I have a YAML file as below. Let's say only a *.md file is committed: the build job does not run, but the test job does. How can I make test depend on build, so that if build doesn't run or doesn't succeed, test doesn't run either?
Thanks in advance.
build:
  stage: build
  script:
    - echo "Build is running"
  only:
    changes:
      - Dockerfile
      - requirements.txt
      - ./configs/*

test:
  stage: test
  script:
    - echo "Test is running"
    - echo "$CI_JOB_STAGE"
  dependencies:
    - build
That should be what stages defines:
Use stages to define stages that contain groups of jobs.
stages is defined globally for the pipeline.
Use stage in a job to define which stage the job is part of.
The order of the stages items defines the execution order for jobs:
Jobs in the same stage run in parallel.
Jobs in the next stage run after the jobs from the previous stage complete successfully.
For example:
stages:
  - build
  - test
  - deploy
All jobs in build execute in parallel.
If all jobs in build succeed, the test jobs execute in parallel.
If all jobs in test succeed, the deploy jobs execute in parallel.
If all jobs in deploy succeed, the pipeline is marked as passed.
If any job fails, the pipeline is marked as failed and jobs in later stages do not start.
Jobs in the current stage are not stopped and continue to run.
So, in your case:
stages:
  - build
  - test
test won't run if build fails.

fail gitlab multijob pipeline if test job fails in another repository

I have a GitLab CI pipeline in my application repo, A, which calls an end-to-end testing repo, T, to run its tests. The repo A pipeline successfully triggers the tests from repo T, but if the test job fails in T, the job in A that called it still passes. How do I get repo A to track the result of repo T's test job, and pass/fail its pipeline based on the test jobs in T?
.gitlab-ci.yml for testing Repo T:
stages:
  - test

test:
  stage: test
  image: markhobson/maven-chrome:jdk-11
  artifacts:
    paths:
      - target/surefire-reports
  script:
    - mvn test
  only:
    - triggers
.gitlab-ci.yml from application repo A:
job1:
  stage: unit-tests ...

job2:
  stage: build ...

...

trigger-e2e-repo:
  stage: e2e-testing
  image: markhobson/maven-chrome
  script:
    - "curl -X POST -F token=repo-T-token -F ref=repo-T-branch https://repo-A/api/v4/projects/repo-T-id/trigger/pipeline"
  only:
    - repo-A-branch
Since GitLab 11.8 you can trigger a pipeline via a bridge job.
In GitLab 11.8, GitLab provides a new CI/CD configuration syntax to make this task easier, and avoid needing GitLab Runner for triggering cross-project pipelines.
With bridge jobs it is possible to mirror the status of the triggered pipeline to the calling pipeline.
You can mirror the pipeline status from the triggered pipeline to the source bridge job by using strategy: depend.
Example in your case:
trigger-e2e-repo:
  stage: e2e-testing
  trigger:
    project: repo-T
    strategy: depend
If the triggered pipeline with the test jobs fails, the calling pipeline also fails.
If you only want to execute a particular job in your repository "Repo T" when it is started by a bridge job, you should use only: pipelines (only) or rules: - if: '$CI_PIPELINE_SOURCE == "pipeline"' (rules:if) instead of only: triggers.
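A sketch of the rules: variant for the test job in repo T (everything else unchanged):
test:
  stage: test
  image: markhobson/maven-chrome:jdk-11
  script:
    - mvn test
  rules:
    - if: '$CI_PIPELINE_SOURCE == "pipeline"'   # run only when started by another project's pipeline (bridge job)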
I wasn't able to use the bridge job's ability to mirror a downstream job result, as my GitLab version is older than 11.8. I did manage to get it to work by creating a trigger for repo A and making a call from repo T back to repo A with this new second trigger. The remaining jobs in repo A are only activated by triggers and the setting of variables (JOB in this case), as laid out below:
.gitlab-ci.yml for repo T:
test:
  stage: test
  script:
    - mvn test
    - "curl -X POST -F token=repo-A-token -F ref=$BRANCH -F variables[JOB]=build https://project.com/api/v4/projects/project_id/trigger/pipeline"
.gitlab-ci.yml in A
job1:
  ...
  except:
    - triggers

job2:
  ...
  except:
    - triggers

...

trigger-e2e-repo:
  stage: e2e-testing
  script:
    - "curl -X POST -F token=repo-B-token -F ref=repo-B-branch -F variables[BRANCH]=$CI_COMMIT_REF_NAME https://project-B/api/v4/projects/project-B-id/trigger/pipeline"
  except:
    - triggers

build_application_for_prod:
  stage: build_prod
  script:
    - "curl -X POST -F token=repo-A-token -F ref=$CI_COMMIT_REF_NAME -F variables[JOB]=deploy -F variables[SEND]=true https://foo/api/v4/projects/proj_A_id/trigger/pipeline"
  only:
    variables:
      - $JOB == "build"

deploy_production_environment:
  stage: deploy_prod
  script:
    - ...
  only:
    variables:
      - $JOB == "deploy"
Note that I also had to add the except statements to the jobs that run before the end-to-end tests in repo A, so that they don't rerun and loop when repo A's API trigger is called later on.

How to trigger only specific stage of pipeline with gitlab API?

I have a GitLab project with this CI file:
stages:
  - build
  - run

build:
  stage: build
  script:
    - (some stuff)
  tags:
    - my_runner_tag
  except:
    - triggers
  when: manual

run:
  stage: run
  script:
    - (some stuff)
  tags:
    - my_runner_tag
  except:
    - triggers
  when: manual
Jobs are created on every source code change, and they can only be run manually, using the GitLab interface.
Now I want to be able to trigger the run stage with the GitLab API. Trying:
curl -X POST \
  -F token=xxxxxxxxxxxxxxx \
  -F ref=master \
  https://gitlab.xxxxx.com/api/v4/projects/5/trigger/pipeline
Returns:
{"message":{"base":["No stages / jobs for this pipeline."]}}
It seems like I have to define which stage to trigger, but I can't find a way to pass it via the API call.
You are using the wrong endpoint. To do this, you need to follow the path below.
List all of your pipelines and get the newest one:
GET /projects/:id/pipelines
List the jobs from this pipeline:
GET /projects/:id/pipelines/:pipeline_id/jobs
After that you can trigger (play) your job:
POST /projects/:id/jobs/:job_id/play
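A sketch with curl (the project id, token and ids are placeholders; note that these jobs endpoints need a personal or project access token rather than a trigger token):
# 1. List pipelines, newest first
curl --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.xxxxx.com/api/v4/projects/5/pipelines"
# 2. List the jobs of the chosen pipeline
curl --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.xxxxx.com/api/v4/projects/5/pipelines/<pipeline_id>/jobs"
# 3. Play the manual job you want to run
curl --request POST --header "PRIVATE-TOKEN: <your_access_token>" "https://gitlab.xxxxx.com/api/v4/projects/5/jobs/<job_id>/play"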
You are telling your jobs to run at all times except when they are being triggered (an API call is also considered a trigger).
Change your job definition to the following:
run:
  stage: run
  script:
    - (some stuff)
  tags:
    - my_runner_tag
  when: manual

Gitlab-ci - Pipeline failing for no job

Here is my .gitlab-ci.yml file:
script1:
  only:
    refs:
      - merge_requests
      - master
    changes:
      - script1/**/*
  script: echo 'script1 done'

script2:
  only:
    refs:
      - merge_requests
      - master
    changes:
      - script2/**/*
  script: echo 'script2 done'
I want script1 to run whenever there is a change in script1 directory; likewise script2.
I tested these with a change in script1, a change in script2, change in both the directories, and no change in either of these directories.
The first 3 cases pass as expected, but the 4th case, the one with no change in either directory, fails.
In the overview, GitLab gives the message
Could not retrieve the pipeline status. For troubleshooting steps, read the documentation.
In the Pipelines tab, I have an option to Run pipeline. Clicking on that gives the error
An error occurred while trying to run a new pipeline for this Merge Request.
If there is no job, I want the pipeline to succeed.
GitLab pipelines do not have any independent existence outside of jobs. A pipeline, by definition, consists of one or more jobs. In your 4th case above, no jobs are created. The simplest hack you can add to your pipeline is a job which always runs:
dummyjob:
  script: exit 0

GitLab: Job artifacts in multi project pipelines

I have been trying to learn multi-project pipelines for a while now, and apart from the GitLab documentation, I have not found any study material. If I could see an example, it would really help. I have been using the following CI config for a multi-project pipeline in project A, but it's not working:
trigger_job:
  stage: trigger_release
  trigger:
    project: https://<gitlab-site>/api/v4/projects/<project-B-id>/trigger/pipeline
    branch: master
    strategy: depend
This leaves the pipeline in project A in a pending state forever. I used curl in the following way to finally get the config working:
trigger_job:
  stage: trigger_release
  script:
    - curl --request POST --form "token=$CI_JOB_TOKEN" --form ref=master https://<gitlab-site>/api/v4/projects/<project-B-id>/trigger/pipeline
However, what I really need is to collect and use the artifacts of project B pipeline in project A pipeline after the triggered job finishes. How do I do that?
Since GitLab 11.8 you don't need to use the API to trigger a pipeline; see the official documentation.
Example
Let's have group mygroup with 2 repos: myrepository1 and myrepository2.
Config in myrepository1:
trigger-job:
  trigger:
    project: mygroup/myrepository2
    branch: master
  variables:
    VARIABLE_TO_PASS: $CI_COMMIT_REF_NAME
Config in myrepository2:
job-waiting-for-trigger:
  stage: deploy
  script:
    - echo "${VARIABLE_TO_PASS} from another project pipeline"
  only:
    - pipelines
I haven't tried this code, but it should be correct.
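For the artifacts part of the question, one possible approach is cross-project artifact downloads with needs. This is a sketch, assuming that feature (GitLab Premium, since 12.7) is available on your instance; the job name build-artifacts-job is illustrative, and it fetches artifacts from the latest successful run of that job on the given ref, not necessarily the exact pipeline you just triggered:
# In project A
use_repo_b_artifacts:
  needs:
    - project: mygroup/myrepository2   # illustrative path to project B
      job: build-artifacts-job         # a job in B that defines artifacts:paths
      ref: master
      artifacts: true                  # download that job's artifacts into this job
  script:
    - ls   # the downloaded artifacts are extracted into the working directory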
