I have a job in my .gitlab-ci.yml that triggers an external pipeline that generates artifacts (in this case, badges).
I want to be able to get those artifacts and add them as artifacts to the bridge job (or some other job) on my project so that I can reference them.
My triggered job looks like this:
myjob:
  stage: test
  trigger:
    project: other-group/other-repo
    strategy: depend
I'd like something like this:
myjob:
  stage: test
  trigger:
    project: other-group/other-repo
    strategy: depend
  artifacts:
    # how do I get artifacts from the job(s) on other-repo?
    paths:
      - badge.svg
GitLab has an endpoint that can be used as the badge URL, which downloads the artifact from the latest pipeline/job for a project:
https://gitlabserver/namespace/project/-/jobs/artifacts/master/raw/badge.svg?job=myjob
Is there a way to get the artifacts from the triggered job and add them to my project?
The artifacts block archives artifacts from the current job only. To get artifacts from a different job, handle the download in the script section; once you have the artifact, you can archive it in the artifacts block as usual.
You can use wget to download artifacts from a different project, as described here. For example:
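A minimal sketch of that approach, reusing the artifact endpoint from the question (the upstream job name badge-job and the $API_TOKEN CI variable are assumptions):

myjob:
  stage: test
  script:
    # Download badge.svg from the latest successful run of the assumed
    # badge-job on other-group/other-repo's master branch; private projects
    # need a token with read access.
    - 'wget --header "PRIVATE-TOKEN: $API_TOKEN" -O badge.svg "https://gitlabserver/other-group/other-repo/-/jobs/artifacts/master/raw/badge.svg?job=badge-job"'
  artifacts:
    paths:
      - badge.svg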
I know it's a bit late, but maybe this could help. Add this to your job; it tells the job that it needs the artifacts from a specific project (you need to be an owner of that project):
needs:
  - project: <FULL_PATH_TO_PROJECT> # without the hosting site
    job: <JOB_NAME>
    ref: <BRANCH>
    artifacts: true
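Applied to the badge question above, that might look like the following sketch (the upstream job name badge-job is an assumption, and cross-project artifact downloads via needs:project are a paid-tier GitLab feature):

myjob:
  stage: test
  needs:
    - project: other-group/other-repo
      job: badge-job   # assumed name of the job that produces the badge
      ref: master
      artifacts: true
  artifacts:
    paths:
      - badge.svg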
Related
I have a question regarding GitLab pipeline triggering. We have multiple GitLab projects that each trigger one common project, separately. The idea is to trigger this common project only when the subprojects are finished. Is there any better way to do it than writing a script that checks the pipeline status via the API? I didn't find any out-of-the-box solution for this.
You can use the trigger:strategy. As per the docs:
Use trigger:strategy to force the trigger job to wait for the downstream pipeline to complete before it is marked as success.
So say you have build and test stages, and you want the trigger job in the build stage to succeed before moving on to the test stage. You could do something like this:
downstream-build:
  stage: build
  trigger:
    include: path/to/child-pipeline.yml
    strategy: depend
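Since the question is about a separate common project rather than a child pipeline, the same strategy also works with a cross-project trigger; a sketch, where the project path is a placeholder:

trigger-common:
  stage: build
  trigger:
    project: group/common-project  # placeholder path to the common project
    strategy: depend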
I am trying to dynamically generate a section of a .yml file based on a variable set in the .gitlab-ci.yml file, and then include the generated file as an artifact to be evaluated in future stages such as build.
I am running into issues trying to include, in the build stage, the YAML generated in the .pre stage. It makes sense that this would not be possible within the same run, but maybe I could leverage some feature to load the generated file on subsequent runs, perhaps by caching it somehow.
Is it possible to include generated templates (artifacts) on subsequent stages, maybe through caching the artifact?
Can you set an artifact to be included as a .yml template for later stages?
I just discovered that parent-child pipelines might be a solution for this, and I will try it:
https://docs.gitlab.com/ee/ci/pipelines/parent_child_pipelines.html#dynamic-child-pipelines
From docs:
You can trigger a child pipeline from a dynamically generated configuration file:
generate-config:
  stage: build
  script: generate-ci-config > generated-config.yml
  artifacts:
    paths:
      - generated-config.yml

child-pipeline:
  stage: test
  trigger:
    include:
      - artifact: generated-config.yml
        job: generate-config
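To drive the generated configuration from a variable, as the question asks, the generate step could be as simple as this sketch (the TARGETS variable and its contents are assumptions):

generate-config:
  stage: build
  script:
    # Emit one job per name in the assumed TARGETS variable, e.g. "linux windows".
    - |
      for t in $TARGETS; do
        printf 'build-%s:\n  script: echo "building %s"\n' "$t" "$t" >> generated-config.yml
      done
  artifacts:
    paths:
      - generated-config.yml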
Is it possible to specify non-source files that should not trigger GitLab CI?
When I make changes to README.md, the pipeline triggers, even though that file is only documentation inside GitLab and is not packaged into any output artifact.
You can control when each of your jobs is added to a running pipeline using the only, except, or rules keywords. The easiest way to prevent jobs from running when only the README is changed is with except:
build_job:
  stage: build
  except:
    changes:
      - README.md
With this syntax, if the only file changed in a push is README.md, this job will not run. Unfortunately you can only set these rules at the job level, not the pipeline level, so you'd have to put this in each of your jobs to prevent them all from running.
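The newer rules keyword can express the same condition; a sketch of the equivalent job:

build_job:
  stage: build
  rules:
    - changes:
        - README.md
      when: never       # skip the job when the push touches README.md
    - when: on_success  # otherwise run as normal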
I have 2 pipelines in 2 different repositories:
lib.yml in repo.git produces a re-usable library artifact A (needs multiple stages on different nodes for that) and also has a stage that runs autotests after A has been produced.
app.yml in app.git builds and tests an application that needs A to build.
In app.yml I want to integrate A without duplicating lib.yml. I was told that templates are a solution, but I am not so sure. Multiple stages of lib.yml must run before I can consume the desired artifact. Using job templates would only complicate both pipelines and create a dependency on pipeline internals. app.yml should not know how lib.yml builds A.
After consulting the docs I think that a pipeline resource is closer to what I need. But I do not fully understand how it works. Let us assume we want artifact A from lib.yml on branch B.
Will app.yml use the latest available artifact A of branch B, or will it kick off lib.yml on B?
Is there a way to tell app.yml: use the latest A of lib.yml for branch B if available, otherwise run lib.yml on B and wait for A to become ready?
Yes, you are right - pipeline resources should do the job for you. To run app.yml you should have something like this:
resources:
  pipelines:
    - pipeline: hadar
      source: kmadof.hadar
      trigger:
        branches:
          - B
and then use just:
- download: hadar
to get the latest artifact from the pipeline of branch B. You can also run your app.yaml pipeline manually and select a specific run to take the artifact from; - download: hadar still gets the correct artifact.
So:
app.yml will use the latest available artifact A of branch B.
If you want app.yaml to trigger lib.yaml and then get the artifact from it, that is difficult and not possible with out-of-the-box functionality; there is no built-in way to have app.yaml trigger lib.yaml and wait for its artifacts. And if your app.yaml is triggered by anything other than the pipeline resource trigger, it will still use the latest available artifact A of branch B.
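Putting it together, a sketch of app.yml under the question's setup (the resource alias lib, the pipeline name lib-pipeline, and the artifact name A are illustrative):

resources:
  pipelines:
    - pipeline: lib          # local alias used by the download step
      source: lib-pipeline   # assumed name of the pipeline defined by lib.yml
      trigger:
        branches:
          - B

steps:
  - download: lib            # fetches the latest artifacts from branch B
    artifact: A              # restrict the download to artifact A
  - script: ls $(Pipeline.Workspace)/lib/A
    displayName: Inspect the downloaded artifact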
It used to be the case that Azure DevOps would run a new CI build when you pushed a new branch to origin. My company relied on this, as we would issue releases by creating a release/* branch off of develop, which would trigger a build and a subsequent release into our testing environment. If everything passed UAT/QA, we would deploy that same build into production.
Our build pipelines are both Classic UI and have two branch filters, develop and release/*. The one product in question has two build pipelines - one for the Webjob and one for the API - and as such each pipeline has a path filter (to include, or not include, the Webjob folder). Creating a release/* branch does not trigger either CI pipeline.
Our release pipeline looks like the following, where DEV is triggered on artifacts produced from develop and TEST->PROD is triggered on artifacts built from release/*:
This allows us to have a release environment where we iteratively make changes to get it ready, while we simultaneously make changes to develop for the next release. This is a common Gitflow model.
The Problem
Now that Azure no longer triggers a build on branch creation, I'm forced to manually run a build on release/* in order to trigger the build and the subsequent TEST deployment described above.
What can I do to have automated builds and deployments in Azure while maintaining Gitflow?
I was also stuck not knowing why my TEST build wouldn't trigger. This post helped explain why, and more searching found the way to resolve it: it is as simple as removing any path filters from your TEST build.
From the linked documentation:
If your pipeline has path filters, it will be triggered only if the new branch has changes to files that match that path filter.
If your pipeline does not have path filters, it will be triggered even if there are no changes in the new branch.
See:
Behavior of triggers when new branches are created
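In YAML terms, the quoted behavior means a trigger like the following sketch will not fire when a new release/* branch is created without changes under the filtered path (the src/ path is an assumption):

trigger:
  branches:
    include:
      - develop
      - release/*
  paths:
    include:
      - src/  # with this filter, a new branch with no src/ changes does not trigger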
Not sure if you are using YAML or not, but using the GUI you can add a 'path' filter for Continuous Integration and set the build to run any time a path contains 'release'.
YAML pipelines are configured by default with a CI trigger on all branches.
When defining pipelines using YAML with a repository resource (repo for short), a new pipeline run gets triggered when a commit occurs. You can further configure the trigger to act only on certain branches.
See:
CI triggers - MSDN
Repo triggers
Example (All branches by default):
resources:
- repo: self
  clean: true

pool:
  vmImage: 'ubuntu-20.04'

steps:
- task: CMake@1
  displayName: 'Generate Makefile'
  inputs:
    workingDirectory: $(Build.SourcesDirectory)
    cmakeArgs: '-B$(Agent.BuildDirectory) -H.'
- task: Bash@3
  displayName: 'Build project'
  inputs:
    workingDirectory: $(Agent.BuildDirectory)
    targetType: 'inline'
    script: 'make -j4'
Limit to release branches (add a top-level trigger):
trigger: # Optional; triggers are enabled by default
  branches: # Branch conditions to filter the events; defaults to all branches
    include:
      - release/*