GitLab - passing job ID of parent pipeline to child pipeline

Is there any way we can pass the job ID of the parent pipeline to the child pipeline as a variable?
package:
  stage: package
  script:
    - zip -r ./service.zip .  # zip the project contents
deploy:
  stage: deploy
  variables:
    # how do I pass the parent pipeline's job ID here?
  trigger:
    include:
      - project: '<namespace>/<project>'
        ref: '<branch>'
        file: '<path to yml file>'
    strategy: depend

Yes, it's supported by GitLab. You need to pass the predefined CI_PIPELINE_ID variable (and CI_JOB_ID for the job ID):
.trigger_deploy:
  stage: deploy
  variables:
    PARENT_PIPELINE_ID: $CI_PIPELINE_ID
    PARENT_JOB_ID: $CI_JOB_ID
  trigger:
    include:
      - project: '<namespace>/<project>'
        ref: '<branch>'
        file: '<path to yml file>'
    strategy: depend
You can find these and more variables in the GitLab predefined variables documentation.
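Variables defined on the trigger job are forwarded to the downstream pipeline and show up there as ordinary environment variables. A minimal sketch of a child-pipeline job reading them (the job name is made up):
child_job:
  script:
    # both variables were set on the trigger job in the parent pipeline
    - echo "Triggered by parent pipeline $PARENT_PIPELINE_ID, parent job $PARENT_JOB_ID"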

Related

ADO YAML failing: No repository found by name templates

I'm trying to edit an ADO YAML file down to the bare minimum in order to isolate another issue.
When I run Validate, it comes back with the following error:
No repository found by name templates
Here's the general gist of my YAML:
#resources:
#  repositories:
#    - repository: templates
#      type: git
#      name: TemplateProject/TemplateRepo
name: $(VersionName)
trigger:
  branches:
    include:
      - main
  batch: true
  paths:
    exclude: $(ListOfExclusions)
stages:
  - template: core/setVersion.yml@templates
  - stage: Build
    pool: linux
    jobs:
      - job: BuildDocker
        displayName: Build and Push Docker Image
        pool: linux
        steps:
          - task: Docker@2
            displayName: Build and push an image to container registry
            inputs:
              command: buildAndPush
              repository: $(RepoName)
              dockerfile: $(Build.SourcesDirectory)/Dockerfile
              containerRegistry: $(dockerRegistryServiceConnection)
              tags: |
                $(Tag)
What could be going wrong? The error message makes me think the YAML isn't clean.
It turns out I made a simple mistake when commenting out the resources section of the YAML: the stages section still contained a template reference pointing at that repository resource, and I neglected to comment it out as well.
Once I updated the code to read:
stages:
  # - template: core/setVersion.yml@templates
  - stage: Build
    pool: linux
    jobs:
      - job: BuildDocker
        # etc...
Now my YAML validates OK.
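For reference, the @templates suffix in a template reference resolves against a repository resource with that exact name, which is why the two declarations have to be commented out (or restored) together:
resources:
  repositories:
    - repository: templates       # this name is what @templates refers to
      type: git
      name: TemplateProject/TemplateRepo
stages:
  - template: core/setVersion.yml@templates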

Migrating azure-pipelines.yaml to separate repo, but running on code of other repo

Ultimately, I'm trying to do this:
Move azure-pipelines.yaml and associated templates out of the code repository (code-repo).
Move them into a separate dedicated repository (pipeline-repo).
Have the pipeline read its config from pipeline-repo, but run on the code in code-repo.
I'm referring to the following documentation:
Use other repositories: this one covers "templates in other repositories," but I'm trying to remove all pipeline config so that code-repo is purely application code... and the Dockerfile.
Define a repositories resource
For testing, I have this simple test.yaml:
# Triggers when PR is created due to branch policies
trigger: none
resources:
  repositories:
    - repository: code-repo
      type: git
      name: code-repo
pool:
  vmImage: 'ubuntu-latest'
stages:
  - stage: Testing
    displayName: Test stage
    jobs:
      - job: ParallelA
        steps:
          - bash: echo Hello from parallel job A
            displayName: 'Run a one-line script'
When I create a PR on code-repo, it triggers the pipeline, which is to say the branch policies are configured to refer to that pipeline, and I do get the "Hello from parallel job A" printout.
But nowhere in the run logs do I see it pulling code-repo.
My actual PR pipeline would look something like this:
trigger: none
resources:
  repositories:
    - repository: code-repo
      type: git
      name: code-repo
variables:
  - template: templates/variables.yaml
pool:
  vmImage: $(vmImageName)
stages:
  - template: templates/build/buildStage.yaml
...
Testing that confirms it isn't running against the code-repo PR but against pipeline-repo, so everything fails.
So it is unclear to me what I need to do from here to get the pipeline to run against the PR code from code-repo.
Suggestions?
OK, I think I have it sorted out; at least some of my stages are now succeeding.
I came across this documentation, which introduced me to the checkout step.
So in addition to doing something like:
resources:
  repositories:
    - repository: code-repo
      type: git
      name: code-repo
you need to add a checkout step, like the following:
# Triggers when PR is created due to branch policies
trigger: none
resources:
  repositories:
    - repository: code-repo
      type: git
      name: code-repo
pool:
  vmImage: 'ubuntu-latest'
stages:
  - stage: Testing
    displayName: Test stage
    jobs:
      - job: ParallelA
        steps:
          - checkout: code-repo
          - task: task1
          - task: task2
The checkout should set the context for the subsequent steps.
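One detail that matters here (my addition, based on how Azure Pipelines handles multiple checkouts, not part of the original answer): with a single checkout step the code lands directly in $(Build.SourcesDirectory), but as soon as a job has more than one checkout step, each repository is checked out into its own subfolder named after the repository. A minimal sketch:
steps:
  - checkout: self       # pipeline-repo itself, lands under $(Pipeline.Workspace)/s/<repo name>
  - checkout: code-repo  # the application code, lands under $(Pipeline.Workspace)/s/code-repo
  - bash: ls $(Pipeline.Workspace)/s  # confirm both working copies are present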

Trigger a pipeline when specific file is changed

I am creating a new CI pipeline that should be triggered anytime a .bicep file is changed, and then zip up all of the files.
# Pipeline is triggered anytime there is a change to .bicep files
trigger:
  branches:
    include:
      - "feature/*"
pool:
  vmImage: ubuntu-latest
steps:
  - script: echo Hello, world!
    displayName: 'Run a one-line script'
This pipeline works and is triggered anytime a change is made on a feature/* branch.
To target any .bicep files I am trying:
trigger:
  branches:
    include:
      - "feature/*"
  paths:
    include:
      - '**/*.bicep'
I also tried specifying the entire path that holds the files:
trigger:
  branches:
    include:
      - "feature/*"
  paths:
    include:
      - "src/Deployment/IaC/Bicep/*"
When I make a change to a .bicep file in the feature branch, the pipeline is never triggered, so I know my syntax is wrong.
Wildcards are no longer supported in Azure Pipelines path filters.
Instead, just set the relative path to your Bicep folder, like so:
paths:
  include:
    - src/Deployment/IaC/Bicep
See: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/azure-repos-git?tabs=yaml&view=azure-devops#paths
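Putting the answer together with the branch filter from the question, the complete trigger block would look like this:
trigger:
  branches:
    include:
      - "feature/*"
  paths:
    include:
      - src/Deployment/IaC/Bicep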

Passing variable in multi project git pipeline

The YML of my trigger project:
trigger:
  variables:
    CODE_SHA: ${SHA}
  stage: test
  trigger:
    project: dummy
    branch: dump
    strategy: depend
The YML of my triggered project:
build:docker3.7:
  stage: build
  retry: 2
  script:
    - echo $code_sha
Expected behaviour: it should print the ${SHA} value from the trigger project in the triggered project.
Actual behaviour: no value gets passed.
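One thing worth checking here (my own observation, not part of the original exchange): GitLab CI/CD variable names are case-sensitive, so a variable passed down as CODE_SHA will not be visible as $code_sha, and ${SHA} itself must be defined somewhere in the trigger project, e.g. as a project-level variable. The triggered job would need to reference the exact name:
build:docker3.7:
  stage: build
  retry: 2
  script:
    # variable names are case-sensitive; CODE_SHA matches what the trigger job passes
    - echo $CODE_SHA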

Azure DevOps - two dependent YAML pipelines in one repository

I have two .yml files in my repo, one for build and one for deployment. The main reason I would like to keep the build separate from the deployment is that I also want to store per-environment variables in the repo, i.e. in variables-dev.yml and variables-prod.yml files. So there is no need to create a new build every time (which includes running tests, building the Docker image, etc.).
The file build.yml:
trigger:
  paths:
    exclude:
      - build.yml
      - deploy.yml
stages:
  - stage: build
    jobs:
      ...
And the deploy.yml, which I want to be triggered only on completion of the build pipeline. That's why I first exclude all paths from its own trigger, but add one on the pipeline resource.
trigger:
  paths:
    exclude:
      - '*'
resources:
  pipelines:
    - pipeline: build
      source: build
      trigger:
        branches:
          include:
            - '*'
stages:
  - stage: dev
    variables:
      - template: variables-dev.yml
    jobs:
      - deployment: deploy_dev
        environment: 'dev'
        strategy:
          runOnce:
            deploy:
              steps:
                ...
  - stage: prod
    dependsOn: dev
    variables:
      - template: variables-prod.yml
    jobs:
      - deployment: deploy_prod
        environment: 'prod'
        strategy:
          runOnce:
            deploy:
              steps:
                ...
Unfortunately it does not seem to work. The top trigger blocks the lower one, and if I remove the top trigger, then the deploy pipeline is triggered at the same time as the build one.
You have to start your deploy.yml with trigger: none:
trigger: none
resources:
  pipelines:
    - pipeline: ci-pipeline
      source: my-build-pipeline
      trigger:
        enabled: true
        branches:
          include:
            - master
Set the triggers in your second .yml to none, then add the build-completion trigger in the "Triggers" section of the UI. It will stage your builds as you describe.
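Since deploy.yml declares the build as a pipeline resource, the deploy run can also download the artifacts that the triggering build published, via the resource alias. A minimal sketch (ci-pipeline is the alias from the snippet above; it assumes the build pipeline publishes at least one artifact):
steps:
  - download: ci-pipeline   # downloads the triggering run's artifacts to $(Pipeline.Workspace)/ci-pipeline
  - bash: ls $(Pipeline.Workspace)/ci-pipeline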
