GitLab CI - pipeline which triggers a pipeline in another project does not finish

I have a GitLab CI pipeline which triggers a pipeline in another repository.
That pipeline does finish, but it does not cause the original pipeline to finish as well.
This worked up until a few days ago; we did not change anything in the GitLab CI configuration of either the main repository or the tests repository.
This is how the tests pipeline is triggered:
trigger_integration_test:
  extends:
    - .merge_request_rules
  stage: test_on_dev
  variables:
    TEST_TYPE: ci
  trigger:
    project: <path to>/backend-inregration-tests-repository
    strategy: depend
The GitLab version is GitLab Enterprise Edition 14.1.0-pre e230eeca384.
Please advise on how I can resolve this.

The manual step at the end of the pipeline was preventing the main pipeline from finishing.
Once I disabled it, everything returned to normal.
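For context, here is a minimal sketch (job name, stage, and script are placeholders, not from the original pipeline) of what such a blocking manual job looks like: with allow_failure: false, a manual job keeps its pipeline in a blocked state until someone runs it, so the pipeline never reports a finished status and a trigger job using strategy: depend keeps waiting. Setting allow_failure: true, or removing the job, lets the pipelines complete.

manual_gate:                 # hypothetical job name
  stage: .post
  when: manual
  allow_failure: false       # false = pipeline stays blocked until this job is run manually;
                             # true (or deleting the job) lets the pipeline finish on its own
  script:
    - echo "run this step manually"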

Related

Gitlab auto pull

I have a repository in GitLab with Test and Dev branches.
In the GitLab pipeline, I schedule a job to auto-run the Test branch pipeline every 24 hours.
In the Test branch's .gitlab-ci.yml I have:
deploy:
  stage: deploy
  script:
    - git pull --ff-only origin Dev
  only:
    - Test
After merging Dev into Test, this part of the configuration was removed, and the next time the Test branch pipeline ran it could not pull from the Dev branch.
How can I pull code from the Dev branch into the Test branch without losing the git pull --ff-only origin Dev step?
Or is it maybe possible to have two .gitlab-ci.yml files on a branch? (If yes, how would GitLab know which of them to run first?)
You would generally create your .gitlab-ci.yml in the main/master branch, which means it would not be impacted by merges between Test and Dev. That is what you need here.
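In that spirit, a minimal sketch of a single .gitlab-ci.yml shared by all branches, with the pull job restricted to the Test branch so that merging Dev into Test no longer removes it (branch names are taken from the question; the rest is an assumption):

deploy:
  stage: deploy
  rules:
    - if: '$CI_COMMIT_BRANCH == "Test"'   # only Test branch pipelines (including scheduled ones) run this
  script:
    - git pull --ff-only origin Dev       # same command as before, now kept in a shared file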

GitLab CI Pipeline not triggered for events on default branch

I have two branches in my GitLab repo (uat and production). Two deploy jobs are meant to deploy a branch to a specific environment. There are two gitlab-ci.yml files, one in each branch (with the configuration for that branch), and production is my default branch.
The jobs should run only if files in dir/ changed, and not for scheduled pipelines.
Problem: The deploy job for UAT works as expected: it runs if I push directly to the branch or if I accept a merge request. However, although there is no difference except the branch, the deploy job for production is not triggered on any event.
Question: Do you know if I misunderstood something, and what would fix this?
Thanks!
gitlab-ci.yml in production
deploy_to_production:
  only:
    refs:
      - production
    changes:
      - dir/*
  except:
    - schedules
  script:
    # upload to prod
gitlab-ci.yml in uat
deploy_to_uat:
  only:
    refs:
      - uat
    changes:
      - dir/*
  except:
    - schedules
  script:
    # upload to uat
Do you have those empty lines before script: in your file?
That would define script under default, because it is not tied to a job:
default:
  script:
    # upload to uat
The reason it is only running uat is that the first reference gets overwritten by the second one. You can check this on your GitLab project page under CI/CD > Editor, where you can view the final YAML after merging.
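If keeping one configuration per branch stays confusing, an alternative sketch (assuming both branches carry the same .gitlab-ci.yml; the script lines are placeholders) is to define both jobs in one file and let the branch and schedule conditions pick the right one:

deploy_to_production:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
      when: never
    - if: '$CI_COMMIT_BRANCH == "production"'
      changes:
        - dir/*
  script:
    - echo "upload to prod"   # placeholder

deploy_to_uat:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
      when: never
    - if: '$CI_COMMIT_BRANCH == "uat"'
      changes:
        - dir/*
  script:
    - echo "upload to uat"    # placeholder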

Gitlab ci issue with passing artifacts to Downstream pipeline with trigger and needs keywords

I am working on a multi-project pipeline and using the trigger keyword to trigger a downstream pipeline, but I'm not able to pass artifacts created in the upstream project. I am using needs to get the artifact, like so:
Downstream Pipeline block to get artifacts:
needs:
  - project: workspace/build
    job: build
    ref: master
    artifacts: true
Upstream Pipeline block to trigger:
build:
  stage: build
  artifacts:
    paths:
      - ./policies
    expire_in: 2h
  only:
    - master
  script:
    - echo 'Test'
  allow_failure: false

triggerUpstream:
  stage: deploy
  only:
    - master
  trigger:
    project: workspace/deploy
But I am getting the following error:
This job depends on other jobs with expired/erased artifacts:
I'm not sure what's wrong.
It looks like there is a problem with sharing artifacts between pipelines as well as between projects. It is a known bug and has been reported here:
https://gitlab.com/gitlab-org/gitlab/-/issues/228586
You can find a workaround there, but since it requires adding an access token to the project, it is not the best solution.
Your upstream pipeline job build is set to store its artifacts for only 2 hours (from the expire_in: 2h line). Your downstream pipeline must have run at least 2 hours after the artifacts were created, so the artifact expired and was erased, generating that error.
To solve it you can either update the expire_in field to however long you need the artifacts to stay available (for example, if you know the downstream pipeline will run up to 5 days later, set it to 5d), or rerun the build job to recreate the artifacts.
You can read more about the expire_in keyword and artifacts in general in the docs.
It isn't a problem with expired artifacts; the error is incorrect. In my case I am able to download the artifacts as a zip directly from the UI on the executed job. My expire_in is set to 1 week, yet I am still getting this message.
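For completeness, here is a sketch of the token-based workaround mentioned in the linked issue: fetch the upstream job's artifacts through the jobs API instead of relying on cross-project needs. The project path and job name are taken from the question; the UPSTREAM_ACCESS_TOKEN variable and the fetch_upstream_artifacts job name are assumptions you would adapt to your setup.

fetch_upstream_artifacts:
  stage: build
  script:
    # GET /projects/:id/jobs/artifacts/:ref/download?job=<name> returns the
    # artifacts archive of the latest successful job on that ref
    - >
      curl --fail --location --output artifacts.zip
      --header "PRIVATE-TOKEN: $UPSTREAM_ACCESS_TOKEN"
      "$CI_API_V4_URL/projects/workspace%2Fbuild/jobs/artifacts/master/download?job=build"
    - unzip artifacts.zip -d policies_from_upstream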

Azure Devops build pipeline: CI triggers not working on PR merge to master when PR set to none

I need to trigger the CI build job when a PR is merged to master (only when the change is inside the ./rel/* path), but not have the CI build triggered when the pull request (PR) is created. So I have the trigger config as below.
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - ./rel/*
pr: none # will disable PR builds (but not CI builds)
But it fails to trigger the CI build when pr: none is added. If pr: none is removed, the job is triggered for both a PR and a merge to master. I only need the job/CI build to run on a merge to master.
The paths filter in the YAML looks at paths in your repository's file structure, not the branch path. To have it trigger only on the rel branch, replace the master under the include branches with ./rel/* (or the correct value).
We have a more involved pipeline that runs unit tests on PRs and then only packages for release on merge into the master branch. We do this by having our trigger set to the master branch and using conditions for each stage in the multi-stage pipeline, as sketched below. Check those out as well.
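A minimal sketch of that stage-condition approach (stage and job names are illustrative, not from the original pipeline): the packaging stage runs only for non-PR builds of master, so the trigger block itself can stay simple.

trigger:
  branches:
    include:
      - master

stages:
  - stage: Test
    jobs:
      - job: UnitTests
        steps:
          - script: echo "run unit tests"

  - stage: Package
    # run only for real pushes/merges to master, never for PR validation builds
    condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'), ne(variables['Build.Reason'], 'PullRequest'))
    jobs:
      - job: PackageRelease
        steps:
          - script: echo "package for release"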
Solved! This works now.
trigger:
  branches:
    include:
      - master
  paths:
    include:
      - ./rel/*
pr: none # will disable PR builds (but not CI builds)
You can also configure or override this using the classic Azure DevOps UI under Triggers.
Ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?view=azure-devops&tabs=classic#ci-triggers

Troubles with .gitlab-ci.yml triggers configuration

How should I set up my .gitlab-ci.yml manifest to run builds ONLY on:
Merge request;
Push to a branch with an open merge request from it (I mean when a merge request from branch Y to branch X is already open and some new changes are pushed to branch Y);
Push to master.
I've tried to solve it with a setting like this:
job:
  only:
    - triggers
    - /merge-requests/
    - master
  except:
    - branches
I was referring to the documentation here: https://docs.gitlab.com/ce/ci/yaml/README.html#only-and-except-simplified
Suddenly this error occurred on my MR page:
Could not connect to the CI server. Please check your settings and try again.
When I removed the only/except restrictions from my manifest, the error was gone.
What am I doing wrong here?
My GitLab version is GitLab Community Edition 10.8.1.
You want to run a job only on:
Merge request: I don't understand what you want here.
Push to a branch with an open merge request from it: you have to set up a special job that calls the GitLab API to check whether the current branch has an open MR. A job executed only on pushed branches:
image: alpine:latest
script:
  - # <-- add here the script that calls the GitLab API
only:
  - branches
Push to master: a job executed only on master:
image: alpine:latest
script:
  - echo "Hello world!"
only:
  - master
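On GitLab versions more recent than the asker's 10.8.1, the usual way to express this intent is with rules: and merge request pipelines rather than API calls; a sketch:

job:
  script:
    - echo "build"
  rules:
    # merge request pipelines cover both newly opened MRs and pushes to a
    # branch that already has an open MR
    - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    # plus plain pushes to master
    - if: '$CI_COMMIT_BRANCH == "master"'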
