Is there any way to extend and run only a specific job from another pipeline in my current pipeline without copy-pasting it?
For example I have two pipelines:
1. build -> code_check -> auto_test -> deploy
2. auto_test* -> report
I want to execute pipeline 2 where auto_test* runs on another runner, while keeping the job's keys exactly as they are in pipeline 1 (except for "tags", which I add to the job so it can use another runner).
A process restriction means I can't change anything in the pipeline 1 config, so I need a way to execute only that specific job.
I have tried to do that by including pipeline 1's .gitlab-ci.yaml plus extends:. It somewhat works, but pipeline 2 then contains all jobs from both pipelines, which is not what I would like to see.
The most straightforward way would be to copy the auto_test job specification from pipeline 1 into pipeline 2's .gitlab-ci.yml on each update and add tags: ["MyRunner"], but I hoped there is a built-in way to do this.
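For reference, a sketch of the include + extends attempt described above (the project path and the new job name are assumptions):

```yaml
include:
  - project: "group/pipeline-1-project"   # hypothetical path to pipeline 1's repo
    file: ".gitlab-ci.yml"

# Reuse auto_test's keys, overriding only the tags.
auto_test_other_runner:
  extends: auto_test
  tags: ["MyRunner"]
```

As noted above, include: pulls in every job from the referenced file, so pipeline 2 ends up with all jobs from both pipelines.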
I need to exclude a job from the pipeline in case my project version is a pre-release.
How do I know it's a pre-release?
I set the following in the version info file, which all project files and tools use:
version = "1.2.3-pre"
From the CI script, I parse the file, extract the version value, and thus know whether it's a pre-release or not, and I can set the result in an environment variable.
The only way I know to exclude a job from a pipeline is to use rules, but I also know from the GitLab docs that:
rules are evaluated before any jobs run
and before_script is documented to run together with script, i.e. after the rules are applied.
I can stop the job only after it starts, from the script itself, based on the version value; but what I need is to remove the job from the pipeline in the first place, so it's not displayed in the pipeline history. Any idea?
Thanks
How do you run (start) your pipeline, and is the information whether "it's a pre-release" already known at this point?
If yes, then you could add a flag like IS_PRERELEASE as a variable to the pipeline and use it in the rules: section of your job. The drawback is that this will not work for automatic pipelines (triggered by a commit or MR), but you can use this approach with manually triggered pipelines (https://docs.gitlab.com/ee/ci/variables/#override-a-variable-when-running-a-pipeline-manually) or with pipelines started via the API (https://docs.gitlab.com/ee/ci/triggers/#pass-cicd-variables-in-the-api-call).
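A sketch of what that rules: section could look like, assuming the job to exclude is named deploy and the flag arrives as IS_PRERELEASE (both names are assumptions):

```yaml
deploy:
  script:
    - ./deploy.sh   # hypothetical deploy step
  rules:
    # Drop the job from the pipeline entirely for pre-releases,
    # so it never shows up in the pipeline history.
    - if: '$IS_PRERELEASE == "true"'
      when: never
    - when: on_success
```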
I have a GitLab CI pipeline with a 'migration' job which I want to be added only if certain files changed between the current commit and the master branch, but in my current project I'm forced to use GitLab CI pipelines for the push event, which complicates things.
The docs on rules:changes clearly state that it will glitch and not work properly without an MR (my case of a push event), so that's out of the question.
The docs on rules:if state that it only works with environment variables. But the docs on passing CI/CD variables to another job clearly state that:
These variables cannot be used as CI/CD variables to configure a
pipeline, but they can be used in job scripts.
So now I'm stuck. I can skip running the job in question by overriding its script and checking for file changes there, but what I want is to not add the job to the pipeline in the first place.
While you can't add a job alone to a pipeline based on the output of a script, you can add a child pipeline dynamically based on the script output. The method of using rules: with dynamic variables won't work because rules: are evaluated at the time the pipeline is created, as you found in the docs.
However, you can achieve the same effect using the dynamic child pipelines feature. The idea is that a job dynamically generates the YAML for the desired pipeline; that YAML is then used to create a child pipeline, which your pipeline can depend on.
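A minimal sketch of this setup, assuming a hypothetical generate_pipeline.sh script that decides which jobs to emit:

```yaml
generate-pipeline:
  stage: build
  script:
    # Write the child pipeline YAML based on whatever checks you need
    # (e.g. whether the migration files changed on this push).
    - ./generate_pipeline.sh > child-pipeline.yml
  artifacts:
    paths:
      - child-pipeline.yml

run-generated-pipeline:
  stage: test
  trigger:
    include:
      - artifact: child-pipeline.yml
        job: generate-pipeline
    strategy: depend
```

With strategy: depend, the parent pipeline waits for the child pipeline and mirrors its status.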
Sadly, adding or removing a GitLab job based on variables created by a previous job is not possible within a single pipeline.
A way to achieve this is to break your current pipeline into an upstream and a downstream pipeline.
The upstream will have two jobs:
The first one uses your script to define a variable.
The second one triggers the downstream, passing this variable along.
Upstream

check_val:
  # ...
  script:
    # Script imposes the logic with the needed checks.
    # If true:
    - echo "MY_CONDITIONAL_VAR=true" >> var.env
    # If false:
    - echo "MY_CONDITIONAL_VAR=false" >> var.env
  artifacts:
    reports:
      dotenv: var.env

trigger_your_original_pipeline:
  # ...
  variables:
    MY_CONDITIONAL_VAR: "$MY_CONDITIONAL_VAR"
  trigger:
    project: "project_namespace/project"
The downstream would be your original pipeline
Downstream

# ...
migration:
  # ...
  rules:
    - if: '$MY_CONDITIONAL_VAR == "true"'
Now MY_CONDITIONAL_VAR is available at the start of the downstream pipeline, so you can use rules to add the migration job or leave it out.
Using the following CI pipeline running on GitLab:
stages:
  - build
  - website

default:
  retry: 1
  timeout: 15 minutes

build:website:
  stage: build
  ...

website:dev:
  stage: website
  ...
What exactly does the first colon in the job names build:website: and website:dev: mean?
Is it like we pass the second part after the stage name as a variable to the stage?
Naming of jobs does not really change the behavior of the pipeline in this case. It's just the job name.
However, if you use the same prefix before the : for multiple jobs, it will cause jobs to be grouped in the UI. It still doesn't affect the material function of the pipeline, but it will change how they show up in the UI:
It's a purely cosmetic feature.
Jobs can also be grouped using / as the separator or a space.
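For example, these two hypothetical jobs would appear grouped under build in the pipeline view, while still behaving as two ordinary, independent jobs:

```yaml
build:website:
  script: echo "building website"

build:docs:
  script: echo "building docs"
```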
I am currently facing an issue using GitLab CI. I am using GitLab 13.12 Community Edition.
I want to be able to override the value of a CI/CD variable from within a .gitlab-ci.yml job.
You might say that I could just pass the value from one job to another, but the goal is to change that value so that, on the next pipeline, all my jobs will use the updated one.
To be precise, I want to be able to change MY_CICD_VARIABLE_TO_CHANGE, which I've defined in Project > CI/CD > Variables, and update this value from a GitLab CI job.
Something like this, but not only changing it for the current pipeline; I want to actually update it:
change_variable_value:
  stage: "change_variable"
  image: myimage
  script:
    - $MY_CICD_VARIABLE_TO_CHANGE="value_changed"
I tried every single solution specified here:
https://docs.gitlab.com/13.12/ee/ci/variables/index.html#override-a-defined-cicd-variable
Nothing seems to work.
I also tried to use artifacts, but it seems I am not able to pass artifacts between two distinct pipelines; they have to be part of the same pipeline, even when curling the API (at least in the Community Edition).
Any idea is more than welcome :)
I am trying to run a CI stage in Azure DevOps in a self-hosted Linux Agent. The stages look like below:
CI - Build Job:
Task 1: Python script to check a TRUE OR FALSE condition
Task 2: Bash script to execute certain commands
Now, Task 2 should run only when the Task 1 Python script output contains "TRUE".
I have referred to a few docs which suggest going with custom conditions, from the following link:
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=classic
But I'm not sure how to write the custom condition as I am new to this.
NOTE: I want to try only in a custom mode, not in YAML
We can define a new variable, or update an existing one, from the Python script when the Task 1 check evaluates to "TRUE", then use that variable in the condition. With a sample condition like eq(variables['{variable name}'], '{variable value}'), Task 2 will run only if the condition evaluates to true; if it evaluates to false, Task 2 will be skipped.
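A minimal sketch of what Task 1's Python script could do, assuming a variable named RUN_TASK2 (the name and the check itself are assumptions). Azure DevOps picks up the ##vso[task.setvariable] logging command from the script's stdout and makes the variable available to subsequent tasks:

```python
def check_condition() -> bool:
    """Placeholder for the real TRUE/FALSE check from the question."""
    return True

# Expose the result to later tasks as the pipeline variable RUN_TASK2.
result = "TRUE" if check_condition() else "FALSE"
print(f"##vso[task.setvariable variable=RUN_TASK2]{result}")
```

Task 2's custom condition in the classic editor would then be eq(variables['RUN_TASK2'], 'TRUE').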