I have a question regarding GitLab pipeline triggering. We have multiple GitLab projects that each trigger one common project, and they currently do it separately. The idea is to trigger this common project only when the subprojects are finished. Is there any better way to do this than writing a script that checks pipeline status via the API? I didn't find any out-of-the-box solution for this.
You can use the trigger:strategy keyword. As per the docs:
Use trigger:strategy to force the trigger job to wait for the downstream pipeline to complete before it is marked as success.
So say you have build and test stages, and you want the trigger job in the build stage to succeed before moving on to the test stage, you could do something like this:
downstream-build:
  stage: build
  trigger:
    include: path/to/child-pipeline.yml
    strategy: depend
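Since the question is about a separate common project rather than a child pipeline file, the same strategy: depend keyword can also be used with a cross-project trigger. A minimal sketch, assuming the common project lives at group/common-project (a placeholder path):

downstream-common:
  stage: build
  trigger:
    project: group/common-project   # placeholder path to the common project
    strategy: depend                # wait for the downstream pipeline to finish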
I have a GitLab CI file like this:
stages:
  - build
  - deploy

build-job:
  stage: build
  script:
    - echo "Compiling the code..."
    - echo "Compile complete."
  when: manual

deploy-bridge:
  stage: deploy
  trigger:
    project: tests/ci-downstream
My understanding is that the deploy-bridge job should not run unless the manual build-job has run successfully, but that is not the case here. Is this normal?
Jobs in the same stage run in parallel. Jobs in the next stage run
after the jobs from the previous stage complete successfully.
You're not defining your deploy-bridge job as dependent on another job, or saying that it needs another job to finish first, so it can run as soon as its stage is reached. Since the previous stage contains only manual jobs, GitLab CI/CD treats that stage as 'done', at least enough that the next stage can start.
Since it doesn't look like you're uploading the compiled code from build-job as an artifact, we can't use the dependencies keyword here. That keyword only controls which jobs' artifacts this job needs; if a job needs the artifacts of a prior job, that job has to run and finish successfully for this one to start. By default, all available artifacts from all prior jobs are downloaded for every job in the pipeline, and dependencies can also be used to limit which artifacts a job actually downloads. However, if the job we "depend" on has no artifacts, it will throw an error. Luckily there's another keyword we can use.
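As an aside, here is a minimal sketch of what dependencies could look like if build-job did publish an artifact; build.log and deploy-job are hypothetical names, and a regular job is shown rather than the bridge job because trigger jobs do not run scripts:

build-job:
  stage: build
  script:
    - echo "Compiling the code..." > build.log   # hypothetical artifact for illustration
  artifacts:
    paths:
      - build.log

deploy-job:
  stage: deploy
  script:
    - cat build.log        # the artifact from build-job is available here
  dependencies:
    - build-job            # only download artifacts from build-job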
The needs keyword controls the "flow" of the pipeline, so much so that if a job anywhere in the pipeline (even in the last of, say, 1,000 stages) has needs: [], it will run as soon as the pipeline starts (and as soon as there is an available runner). We can use needs here to make the pipeline flow the way you need.
...
deploy-bridge:
  stage: deploy
  needs: ['build-job']
  trigger:
    project: tests/ci-downstream
Now, the deploy-bridge job won't run until the build-job has finished successfully. If build-job fails, deploy-bridge will be skipped.
One other use for needs is that it has the same functionality as dependencies, in that it can control what artifacts are downloaded in which jobs, but it won't fail if the "needed" jobs don't have artifacts at all.
Both dependencies and needs accept an empty array, which equates to 'don't download any artifacts'; for needs, it also means the job runs as soon as a runner is available.
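A short sketch of both empty-array forms, using hypothetical job names in the same pipeline:

lint-job:
  stage: deploy
  needs: []          # starts as soon as a runner is free and downloads no artifacts
  script:
    - echo "Linting..."

report-job:
  stage: deploy
  dependencies: []   # keeps normal stage ordering but downloads no artifacts
  script:
    - echo "Reporting..."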
My Azure DevOps pipeline has 3 jobs: one builds the project for production, another builds it for testing, and the last publishes artifacts. When I push to the release branch, it triggers all 3 jobs, and they can take 10-15 minutes to finish. What I'm trying to achieve is to exclude the testing job when a certain tag is present on the commit, or something like that.
For example: don't trigger the test job if the tag starts with "hotfix". I tried "Run this job with conditions" in the job's settings with the value "not(startsWith(variables['Build.SourceBranch'], 'refs/tags/hotfix'))", but if I push something to release with a hotfix tag it still runs.
Thanks
We recommend using the following condition:
condition: eq(variables['Build.SourceBranch'], 'refs/tags/test')
This means that if you want the test job to run, you need to push something to release with a test tag. We cannot use the value "not(startsWith(variables['Build.SourceBranch'], 'refs/tags/hotfix'))".
In my test, if I push the commit to release with a hotfix tag, the test job is skipped.
Update:
We can use Custom conditions under Additional options and set it to:
eq(variables['Build.SourceBranch'], 'refs/tags/test')
For more details, you can refer to this doc: Expressions
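If the pipeline were defined in YAML rather than the classic editor, the same condition could be set at the job level; a sketch with a hypothetical job name:

jobs:
- job: TestBuild
  condition: eq(variables['Build.SourceBranch'], 'refs/tags/test')   # runs only when the 'test' tag is pushed
  steps:
  - script: echo "Building for testing..."
    displayName: Test build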
I am using Azure DevOps and I have a single build pipeline with a number of steps including PublishBuildArtifacts defined in the azure-pipelines.yml file.
I have pointed the same pipeline at the Build Validation policy (Validate code by pre-merging and building pull request changes) in the master branch's build policies. However, for this PR build run, I don't want to run certain tasks like PublishBuildArtifacts.
How can I achieve this? I can think of one way, which is to create a separate pipeline for PRs with a separate azure-pipelines-pr.yml file that omits those tasks, but that feels redundant to me. Is there a better way to achieve this?
You can add a custom condition for the publish artifacts step:
and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))
Now the step will run only when the build reason is not a pull request.
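A sketch of how the condition could be attached to the publish step in azure-pipelines.yml; the publish path and artifact name are assumptions:

steps:
- task: PublishBuildArtifacts@1
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))   # skipped for PR builds
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'   # assumed publish path
    ArtifactName: 'drop'                                 # assumed artifact name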
I'm working on a solution that has a YAML build pipeline that does the following:
>restore
>build
>test
>publish test
>publish test coverage
>publish source code
And I want to implement a policy on a branch that does the following: whenever a developer creates a pull request to the develop branch, that action triggers a build to ensure that the code being merged into develop builds and passes all tests.
My question is: as a best practice, should I reuse the build pipeline that I already have, or should I create a new pipeline for that specific job?
Hi Rodrigo, you can use the same pipeline as long as you don't have specific requirements that prevent it. You can make use of conditions or stages in the pipeline YAML to make it robust to reuse.
For more information:
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema%2Cparameter-schema
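For instance, a sketch of reusing the same YAML with a stage-level condition so the publish work is skipped for PR validation builds (stage, job, and step contents are placeholders):

stages:
- stage: BuildAndTest
  jobs:
  - job: Build
    steps:
    - script: echo "restore, build, test"   # placeholder for the real restore/build/test steps
- stage: Publish
  condition: and(succeeded(), ne(variables['Build.Reason'], 'PullRequest'))   # skipped for PR validation builds
  jobs:
  - job: PublishArtifacts
    steps:
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: '$(Build.ArtifactStagingDirectory)'   # assumed path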
I'm trying to set up Azure Pipelines for CI and I'm using the YAML syntax to get started. However, I was wondering if it is possible to script the flow at "runtime", like you can do in a Jenkins scripted pipeline: spawn builds, etc.
Depending on the commit I want to have a vastly different flow.
This is because I currently have a mono-repo setup with Conan libraries and I want to rebuild the libraries that are necessary depending on the commit, thus the build-flow is not the same for each commit. I want to spawn jobs so I can take advantage of parallel building on several agents.
For your issue, do you mean triggering builds based on specific commits? If so, you can trigger builds by adding a tag trigger in the YAML. You can create tags on commits; if a created tag meets the trigger condition of the tag trigger in the YAML, the build will be triggered.
trigger:
  tags:
    include:
      - v2.*
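For example, creating a tag such as v2.1 on a commit and pushing it (git tag v2.1 && git push origin v2.1) would match the v2.* pattern and start the build; commits without a matching tag will not trigger it.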