I have multiple projects in GitLab, and I want to start the pipelines of the other projects from one root project. It is important to me that the pipelines run in a certain order.
ProjectRoot -> ProjectA -> ProjectB -> ...
Each project has its own gitlab-ci.yml with corresponding content.
ProjectRoot
...
projecta:
  stage: projecta
  trigger:
    include:
      - project: 'app/projecta'
        ref: 'test'
        file: '/.gitlab-ci.yml'
    strategy: depend
  needs:
    - root
  when: on_success
...
ProjectA
test:
  stage: test
  script:
    - cat requirements.txt
  allow_failure: true
The pipeline is triggered and I can also see the job in GitLab, but it exits with an error:
cat: can't open 'requirements.txt': No such file or directory
Why doesn't the triggered pipeline have access to the content of the project, and how can I set this up so that the content is loaded from each project while keeping a certain order?
The projects should run one after the other, and each only if the previous job was successful.
With curl I can get the pipelines to run, but then the next job does not wait for the previous one to finish.
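What is likely happening here: trigger:include only fetches the YAML file from the other project, while the resulting child pipeline still runs against ProjectRoot's repository, so ProjectA's requirements.txt is never checked out. A multi-project trigger (trigger:project) instead runs the downstream pipeline inside ProjectA itself, with its own checkout. A minimal sketch, assuming the full project paths app/projecta and app/projectb and the branch test:

projecta:
  stage: projecta
  trigger:
    project: 'app/projecta'    # assumed full path to ProjectA
    branch: 'test'
    strategy: depend           # this job only succeeds once ProjectA's pipeline succeeds
  needs:
    - root

projectb:
  stage: projectb
  trigger:
    project: 'app/projectb'    # assumed path to the next project
    branch: 'test'
    strategy: depend
  needs:
    - projecta                 # enforces ProjectRoot -> ProjectA -> ProjectB

Because each trigger job uses strategy: depend and needs the previous one, every downstream pipeline starts only after the previous project's pipeline has finished successfully.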
I'm using GitLab and the CI config has the following stages:
stages:
  - test
  - child_pipeline1
  - child_pipeline2
  - promote-test-reports
At any time, only one of the child pipelines will run after the test stage, i.e. either child_pipeline1 or child_pipeline2, never both at once.
Now, I have added another stage called promote-test-reports, which I would like to run at the end, i.e. after the successful execution of either child pipeline.
But I'm completely blocked here. This promote-test-reports job comes from a template which I have included in the main CI config file like:
# Include the template file
include:
  - project: arcesium/internal/vteams/commons/commons-helper
    ref: promote-child-artifacts
    file: 'templates/promote-child-artifacts.yml'
I'm overriding the GitLab project token in this same main file like below:
test:
  stage: promote-test-reports
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
If you look at the above stage definition in the main CI config file, I'm trying to use strategy: depend to wait for the successful execution of child_pipeline1 and then run this stage, but it throws the error jobs:test config contains unknown keys: trigger. This approach cannot work because the main definition of this stage (promote-test-reports) in the template uses script, and as per the documentation script and trigger with strategy cannot go together.
Following is the definition of this stage in the template:
test:
  stage: promote-test-reports
  image: "495283289134.dkr.ecr.us-east-1.amazonaws.com/core/my-linux:centos7"
  before_script:
    - "yum install unzip -y"
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
  allow_failure: true
  script:
    - 'cd $CI_PROJECT_DIR'
    - 'echo running'
  when: always
  rules:
    - if: $CI_PIPELINE_SOURCE == 'web'
  artifacts:
    reports:
      junit: "**/testresult.xml"
      coverage_report:
        coverage_format: cobertura
        path: "**/**/coverage.xml"
  coverage: '/TOTAL\s+\d+\s+\d+\s+(\d+%)/'
The idea of using the strategy attribute failed, and I cannot remove the script logic from the template either. May I know an alternative way of running my job (promote-test-reports) at the end? Remember that it is an OR condition: it runs after either child_pipeline1 or child_pipeline2.
I would really appreciate your help.
Finally, I was able to do it by putting strategy: depend on the child pipeline trigger jobs. I was doing it incorrectly earlier by putting it on the promote-test-reports stage.
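A minimal sketch of that arrangement (the second child's file name and the rules conditions are assumptions; the rest follows the config above):

child_pipeline1:
  stage: child_pipeline1
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend                       # the trigger job waits for the child pipeline's result
  rules:
    - if: $RUN_PIPELINE1                   # hypothetical condition selecting child 1

child_pipeline2:
  stage: child_pipeline2
  trigger:
    include: adapter/child_pipeline2.yml   # assumed file name for the second child
    strategy: depend
  rules:
    - if: $RUN_PIPELINE2                   # hypothetical condition selecting child 2

The templated test job keeps its script untouched and stays in the promote-test-reports stage, which only starts once whichever trigger job ran has succeeded, so script and trigger never have to be combined in one job.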
I have two jobs in GitLab CI/CD: build-job (which takes ~10 minutes) and deploy-job (which takes ~2 minutes). In a pipeline, build-job succeeded while deploy-job failed because of a typo in the script. How can I fix this typo and run only deploy-job, instead of rerunning build-job?
My current .gitlab-ci.yml is as follows:
stages:
  - build
  - deploy

build-job:
  image: ...
  stage: build
  only:
    refs:
      - main
    changes:
      - .gitlab-ci.yml
      - pubspec.yaml
      - test/**/*
      - lib/**/*
  script:
    - ...
  artifacts:
    paths:
      - ...
    expire_in: 1 hour

deploy-job:
  image: ...
  stage: deploy
  dependencies:
    - build-job
  only:
    refs:
      - main
    changes:
      - .gitlab-ci.yml
      - pubspec.yaml
      - test/**/*
      - lib/**/*
  script:
    - ...
I imagine something like:
1. the triggerer (me) fixes the typo in deploy-job's script and pushes the changes
2. the pipeline is triggered
3. the runner looks at the files described by build-job/only/changes and detects a single change, in the .gitlab-ci.yml file
4. the runner looks at the .gitlab-ci.yml file and detects there IS a change, but since this change does not belong to the build-job section AND the job previously succeeded, this job is skipped
5. the runner looks at the files described by deploy-job/only/changes and detects a single change, in the .gitlab-ci.yml file
6. the runner looks at the .gitlab-ci.yml file and detects there is a change, and since this change belongs to the deploy-job section, this job is executed
This way, only deploy-job is executed. Can this be done, either using rules or only?
There is a straightforward way; you just have to have access to the GitLab CI interface.
1. Push and let the pipeline start normally.
2. Cancel the first job.
3. Run the desired one (deploy).
Another way is to create a new job, intended to be run only when needed, in which you define a deploy without dependencies. Just copy the current deploy code without the dependencies, and also add the manual mode with:
job:
  script: echo "Hello, Rules!"
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"
      when: manual
      allow_failure: true
As per the docs.
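A minimal sketch of that extra job, applied to the pipeline from the question (the branch condition is an assumption; image and script are elided as in the original):

manual-deploy-job:
  image: ...
  stage: deploy
  dependencies: []                        # fetch no artifacts, so build-job need not rerun
  rules:
    - if: $CI_COMMIT_BRANCH == "main"     # assumed branch condition
      when: manual                        # started by hand from the pipeline view
      allow_failure: true                 # the un-run manual job doesn't block the pipeline
  script:
    - ...                                 # same script as deploy-job, with the typo fixed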
I have a "parent" job defined in a shared (between projects) yaml "default-ci.yml":
.build_solution:
  stage: exe_builds
  # Declare variables to be overridden
  variables:
    LVSLN: null
    OUTPUT_DIR: null
    # optionally use CONFIGURATION to specify a configuration
  script:
    - . $SCRIPTS_DIR\build_solution.ps1
      -SolutionPath "$LVSLN"
      -OutputDir "$OUTPUT_DIR"
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week
In the specific project yaml I define values for the variables
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
With it set up as above, artifacts are only stored for successful jobs, despite the parent stating when: always. At the end of the job summary of a failed job, it goes straight from the after-script to Cleaning up project directory and file based variables: there is no Uploading artifacts for failed job (in other words, it is not even trying to find/upload artifacts).
But, when I move the artifacts section to the child yml, so it becomes
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week
I do get artifacts from failed jobs with Uploading artifacts for failed job appearing in the summary.
The artifacts section should be the same for all projects extending .build_solution, so I do not want to have to define it in each of the children; it should be defined in the parent.
It looks like the parent's artifacts section is not being applied, or is being overridden, but I can't find why or where that would happen when the child has no artifacts section. The rest of the .build_solution job works as defined in the parent: I see output from my build_solution.ps1 script, and it was passed the correct parameters.
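One way to narrow this down (a suggestion, not a confirmed fix): extends performs a deep merge, so the merged job should contain the parent's artifacts section, and GitLab's CI Lint / pipeline editor can show the fully resolved config ("View merged YAML"). The expected merged result of the child above would be:

Build My Project:
  stage: exe_builds
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
  script:
    - . $SCRIPTS_DIR\build_solution.ps1 -SolutionPath "$LVSLN" -OutputDir "$OUTPUT_DIR"
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week

If the artifacts section is missing from that merged view, the included default-ci.yml is not the version that defines it (for example, an older copy is being included from another ref or path).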
I have a long GitLab CI config in a monorepo kind of structure. It looks as follows:
Project A
--- .gitlab-ci.yaml (contains thousands of jobs across 3 stages (plan, test and apply) and also includes the gitlab-ci folder)
--- gitlab-ci/my-ci.yaml (the gitlab-ci folder contains specific CI files; my-ci.yaml has 2 stages, plan and apply, for my-new-code)
--- my-new-code (folder which I recently pushed)
--- other folders
I have added my-ci.yaml as a separate file in the gitlab-ci folder. But what happens is that those thousands of jobs from .gitlab-ci.yaml run along with the relevant job even when I push changes only to the my-new-code folder.
I understand that those jobs run in their stages because they don't have any rules/only/except statements, and it is impractical for me to add one to each of them.
Is there a way I can exclude these jobs from running in the 2 stages when changes are made to the my-new-code folder?
You can add a changes condition with folder or file names to a job's rules.
As you can see below, the job will run only if changes are made inside the src folder; you can use the same pattern for your jobs.
test_pylint:
  stage: test
  image: python:3.7
  allow_failure: true
  before_script:
    - pip install pylint pylint-junit
    - pip install -e .
    - python -V
  script:
    - pylint --output-format=pylint_junit.JUnitReporter src/ | tee rspec.xml
  artifacts:
    paths:
      - rspec.xml
    reports:
      junit: rspec.xml
    when: always
    expire_in: 1 week
  rules:
    - if: '($CI_PIPELINE_SOURCE == "merge_request_event")'
      when: always
      changes:
        - src/**/*
If you want to exclude jobs based on a folder, consider using except: changes, which prevents the job from running when the given files/folders change, as in the example below, which stops the job from running if any changes are made to md files:
build:
  script: npm run build
  except:
    changes:
      - "*.md"
PS: These examples use merge request pipelines; you can frame the if condition the same way for commit-based (branch) pipelines.
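For example, a sketch of the same changes guard for branch (commit-based) pipelines, assuming the default branch is main:

build:
  script: npm run build
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'   # assumed default branch
      changes:
        - src/**/*                        # job runs only when src/ changes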
My pipeline has 4 stages
build - Should only happen on merge requests
test - Should only happen on merge requests
report - Should only happen on merge into master
release - Should only happen on merge into master
BUILD: During the build phase I build my test container and upload it to the container registry.
TEST: During the test phase I run the tests within the container, copy out the coverage report from the container and artifact the entire report directory.
REPORT: During the reporting stage I want to copy the artifact from my reporting stage into a Gitlab page directory so we can view the report.
RELEASE: Runs terraform plan/apply and builds the production container.
Since my report and release stages are detached from the pipeline that created the artifact, I'm unable to use an artifact that was created in a different stage. My workaround is to upload the current coverage report to public/<commit-sha> and then move it to public/ when it successfully merges into master. It might not be the best solution, but I have limited knowledge of GitLab's pipelines.
The issue I'm having is pretty weird.
pages:
  stage: report
  dependencies:
    - unittest
  script:
    - if [ "$CI_COMMIT_REF_NAME" == "master" ]; then mv public/$CI_COMMIT_SHA public/; else mv coverage/ public/$CI_COMMIT_SHA; fi
  artifacts:
    paths:
      - public
    expire_in: 30 days
This complains that mv: can't rename 'coverage/': No such file or directory
However this works perfectly fine
pages:
  stage: report
  dependencies:
    - unittest
  script:
    - mv coverage/ public
  artifacts:
    paths:
      - public
    expire_in: 30 days
If there's an easier solution to pass artifacts between jobs that would be great, but I'm not sure if I'm missing something really obvious in my script.
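For reference, artifacts only reach later stages if the producing job declares them. A sketch of what the unittest job would need to declare for mv coverage/ ... to find the directory (job name from the question, script assumed):

unittest:
  stage: test
  script:
    - ./run_tests.sh        # hypothetical command that writes coverage/
  artifacts:
    paths:
      - coverage/           # later jobs listing unittest in dependencies receive this
    expire_in: 30 days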