Azure DevOps Pipeline - list all Build SourceBranches

This is my simple pipeline:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
pool:
  name: LinuxJavaCIBuildAgents #CheckmarxAgents #LinuxJavaCIBuildAgents

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main
    - release

steps:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: echo ....??? what to echo to list all repositories and their source branch from resources.repositories
How do I list all Build SourceBranches involved in the above build pipeline? We have two repos, repo_a and repo_b; I want to list them using Bash along with their source branches.
Thanks

How do I list all Build SourceBranches that are involved in the above
build pipeline? we have 2 repos: repo_a and repo_b, I want to list
them using bash and list their source branches.
Azure DevOps supports checking out multiple repositories as a built-in feature. Refer to the YAML code below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    ref: main
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    ref: main
    trigger:
    - main
    - release

steps:
- checkout: repo_a
- checkout: repo_b
- script: dir $(Build.SourcesDirectory)
While running the pipeline, it will ask for authorization to allow both repositories to be used. After granting the permission, you can list the build source branches of the above repos with the echo commands below as a Bash inline script in your YAML code:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main
    - release

steps:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      #!/bin/bash
      # Assumes $resources holds the resources definition as JSON
      # (it is not populated automatically; you must map it in yourself).
      for repo in $(echo "${resources}" | jq -r '.repositories[].name'); do
        echo "Repository: $repo"
        echo "Source branches:"
        # --arg passes the shell variable into jq; "$repo" inside
        # single quotes would never expand
        echo "${resources}" | jq -r --arg repo "$repo" \
          '.repositories[] | select(.name == $repo) | .trigger[]'
      done
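The filter logic in that step can be checked locally against a hand-made JSON fixture shaped like the resources payload the script assumes (the file path and repository data here are invented for illustration):

```shell
# Fixture mimicking the assumed resources JSON (hypothetical shape)
cat > /tmp/resources.json <<'EOF'
{"repositories":[
  {"name":"repo_a","trigger":["main","release"]},
  {"name":"repo_b","trigger":["main","release"]}
]}
EOF
# Same loop as the pipeline step, pointed at the fixture;
# --arg passes the shell variable into jq safely
for repo in $(jq -r '.repositories[].name' /tmp/resources.json); do
  branches=$(jq -r --arg repo "$repo" \
    '.repositories[] | select(.name == $repo) | .trigger | join(", ")' /tmp/resources.json)
  echo "Repository: $repo -> branches: $branches"
done
# prints e.g. "Repository: repo_a -> branches: main, release"
```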
If you want to make sure all repos run on the same branch, and want a warning when another repo runs on a different branch, use the YAML code below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

steps:
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      #!/bin/bash
      # Assumes $resources holds the resources definition as JSON
      # Get the source branches of the first repository
      source_branch=$(echo "${resources}" | jq -r '.repositories[0].trigger[]')
      # Loop through the other repositories and compare their branches
      # with the first repository's
      for repo in $(echo "${resources}" | jq -r '.repositories[].name' | tail -n +2); do
        repo_branch=$(echo "${resources}" | jq -r --arg repo "$repo" \
          '.repositories[] | select(.name == $repo) | .trigger[]')
        if [[ "$repo_branch" != "$source_branch" ]]; then
          echo "Error: Repository $repo is running on a different branch ($repo_branch) than the first repository ($source_branch)"
          exit 1
        fi
      done
      echo "All repositories are set to run on branch $source_branch"
Reference:
Check out multiple repositories in your pipeline - Azure Pipelines | Microsoft Learn
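An alternative that avoids parsing the resources JSON at all: after a multi-repo checkout, each repository sits in its own folder under $(Build.SourcesDirectory), so plain git can report the branch each one actually has checked out. A minimal local simulation (the temp directory stands in for the sources directory; requires git 2.28+ for `init -b`):

```shell
SRC=$(mktemp -d)   # stand-in for $(Build.SourcesDirectory)
git init -q -b main "$SRC/repo_a"
git -C "$SRC/repo_a" -c user.email=ci@example.com -c user.name=ci \
  commit -q --allow-empty -m init
# In the pipeline, the same loop would walk the real checkout folders
for d in "$SRC"/*/; do
  echo "$(basename "$d"): $(git -C "$d" branch --show-current)"
done
# prints: repo_a: main
```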

Related

Azure DevOps: how do I configure my pipeline script to trigger another pipeline with a different repo?

I have my own pipeline p-A with repository r-A, and pipeline p-B with another repo r-B.
I want to update the pipeline script for p-A only, to actively trigger p-B, without any modification to p-B.
Below is the YAML pipeline script for p-B, which is already set up to run on a schedule:
pool:
  name: 'workflow_test_pool'

schedules:
- cron: "0 19 * * *"
  displayName: run test every day at 8PM CET
  branches:
    include:
    - main
  always: true

trigger: none

jobs:
- job:
  timeoutInMinutes: 30
  steps:
  - script: |
      python -m pytest tests/ -s
    displayName: 'Run the test'
Below is the pipeline script main.yaml for p-A:
pool:
  name: 'workflow_test_pool'

stages:
#########################
- template: pipeline2/p1.yaml
############################
- template: pipeline2/p2.yaml
  parameters:
    dependsOn:
    - FirstPipeline
So the question is: how do I trigger pipeline p-B in pipeline2/p2.yaml (from p-A)?
You can create a PowerShell script task as the last step of the pipeline to trigger pipeline p-B through the REST API. You will have to maintain a Personal Access Token, ideally as a secret variable.
REST API call you will use:
https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run-pipeline?view=azure-devops-rest-7.1
Detailed step-by-step guide:
https://blog.geralexgr.com/cloud/trigger-azure-devops-build-pipelines-using-rest-api
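For reference, a bash sketch of what that REST call looks like; the organization, project, pipeline id, and branch below are placeholder assumptions, and $PAT must hold your Personal Access Token (the curl line is commented out so the sketch only composes the request):

```shell
ORG="myorg"; PROJECT="myproject"; PIPELINE_ID=42   # hypothetical values
URL="https://dev.azure.com/$ORG/$PROJECT/_apis/pipelines/$PIPELINE_ID/runs?api-version=7.1-preview.1"
# Run the target pipeline on a specific branch of its own repo
BODY='{"resources":{"repositories":{"self":{"refName":"refs/heads/main"}}}}'
echo "POST $URL"
echo "$BODY"
# curl -s -u ":$PAT" -H "Content-Type: application/json" -d "$BODY" "$URL"
```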
Azure DevOps supports multi-repository checkout; you can reference the resources section in your YAML script and check out another repository to work with it from the pipeline.
YAML code:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

steps:
- checkout: repo_a
- checkout: repo_b
- script: dir $(Build.SourcesDirectory)
I am running this pipeline from repo_a, and both repo_a and repo_b were checked out successfully.
You can directly run any task from a pipeline with multiple repositories, like below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

steps:
- checkout: repo_a
- checkout: repo_b
- task: AzureCLI@2
  inputs:
    azureSubscription: 'Subscription-name(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az resource list --location uksouth'
References:
Check out multiple repositories in your pipeline - Azure Pipelines | Microsoft Learn
Trigger azure Devops pipeline from another repository - GeralexGR
Multiple Repositories in a Single Azure Pipeline - DEV Community 👩‍💻👨‍💻

OWASP ZAP security testing using Azure Pipelines?

This is the reference doc I have followed to set up the Azure pipeline:
https://medium.com/adessoturkey/owasp-zap-security-tests-in-azure-devops-fe891f5402a4
Below I am sharing a screenshot of the failed pipeline:
Could you please help resolve the issue? I have followed the medium article exactly to implement the task.
Those who are aware of this, could you please share your thoughts?
This is the pipeline script I am using:
trigger: none

stages:
- stage: 'buildstage'
  jobs:
  - job: buildjob
    pool:
      vmImage: ubuntu-latest
    steps:
    - checkout: self
    - checkout: owasap-zap
    - bash: "docker run -d -p 80:80 nginx:1.14.2"
      displayName: "App Container"
    - bash: |
        chmod -R 777 ./
        docker run --rm -v $(pwd):/zap/wrk/:rw -t owasp/zap2docker-stable zap-full-scan.py -t http://$(ip -f inet -o addr show docker0 | awk '{print $4}' | cut -d '/' -f 1):80 -x xml_report.xml
        true
      displayName: "Owasp Container Scan"
    - powershell: |
        $XslPath = "owasp-zap/xml_to_nunit.xslt"
        $XmlInputPath = "xml_report.xml"
        $XmlOutputPath = "converted_report.xml"
        $XslTransform = New-Object System.Xml.Xsl.XslCompiledTransform
        $XslTransform.Load($XslPath)
        $XslTransform.Transform($XmlInputPath, $XmlOutputPath)
      displayName: "PowerShell Script"
    - task: PublishTestResults@2
      displayName: "Publish Test Results"
      inputs:
        testResultsFiles: converted_report.xml
        testResultsFormat: NUnit
      # task: PublishTestResults@2
      # stage: buildstage
According to the YAML file, you want to check out multiple repositories in your pipeline, but it seems you haven't defined a repository resource as mentioned in the document you shared.
resources:
  repositories:
  - repository: <repo_name>
    type: git
    name: <project_name>/<repo_name>
    ref: refs/heads/master
And according to the screenshot you shared, you only checked out one repo, which makes the location of the file xml_to_nunit.xslt different from owasp-zap/xml_to_nunit.xslt. If you only check out one repo, xml_to_nunit.xslt should be in the current directory; thus, just define $XslPath in the PowerShell script as "xml_to_nunit.xslt".
Edit
If the repository that contains the "xml_to_nunit.xslt" file is in the same organization as the repository your pipeline runs from, you need to check out that repository using inline checkout syntax like below, or define a repository resource.
- checkout: git://MyProject/MyRepo # Azure Repos Git repository in the same organization
You could also add an ls command before the PowerShell script to list the files in the current directory, to figure out where "xml_to_nunit.xslt" is.
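A find across the sources directory pinpoints the stylesheet just as well; simulated here with a scratch directory (paths are made up, with the temp directory standing in for $(Build.SourcesDirectory)):

```shell
SRC=$(mktemp -d)                      # stand-in for $(Build.SourcesDirectory)
mkdir -p "$SRC/owasp-zap"
touch "$SRC/owasp-zap/xml_to_nunit.xslt"
# ls only shows one level; find walks every checked-out repo folder
find "$SRC" -name 'xml_to_nunit.xslt'
```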

How do I build the right project when checking out multiple projects in Azure DevOps

I am reading this doc: https://learn.microsoft.com/en-us/azure/devops/pipelines/repos/multi-repo-checkout?view=azure-devops
According to the link above, we can have multiple projects in one pipeline. So I have something like this:
trigger:
- none

resources:
  repositories:
  - repository: repo1
    type: git
    name: org/repo1
    ref: test_multi_repo_pipeline
    trigger:
    - test_multi_repo_pipeline
  - repository: repo2
    type: git
    name: org/repo2
    ref: test_multi_repo_pipeline
    trigger:
    - test_multi_repo_pipeline

stages:
- stage: Build_and_Deploy
  pool:
    name: 'myAgent'
  jobs:
  - job: Build
    condition: always()
    steps:
    - checkout: repo1
      displayName: 'repo1'
    - checkout: repo2
      displayName: 'repo2'
    - script: dir $(Build.SourcesDirectory)
      displayName: 'Build.SourcesDirectory'
    - script: |
        npm run build:project_win_dev
      displayName: Building the project...
    - script: |
        npm run test:coverage
      displayName: Testing the skill...
    - script: dir $(Build.SourcesDirectory)
      displayName: 'Build.SourcesDirectory'
      name: Test
So when I execute this YAML, I get the following output from the "Build.SourcesDirectory" task:
Directory of E:\Builds_VNext\Agent2_Builds\3046\s
03/24/2022 11:24 AM <DIR> .
03/24/2022 11:24 AM <DIR> ..
03/24/2022 11:24 AM <DIR> pipelines
03/24/2022 11:24 AM <DIR> repo1
03/24/2022 11:24 AM <DIR> repo2
0 File(s) 0 bytes
5 Dir(s) 878,801,965,056 bytes free
So once the "Build" task gets executed, it fails because it runs in the root rather than in the specific project where I made the commit. I was wondering if there is a way to make the build happen under the repo1 folder if I committed to repo1, and inside the repo2 folder if I made the change in repo2.
Thanks in advance for your help.
Greetings
So, you could write a conditional to determine which repo you're editing, but I'd advise against it: you'd have to query the git history and detect changes.
The path of least resistance is to build both every time. You'd just need to update your code to cd into the correct folder before building:
trigger:
- none

resources:
  repositories:
  - repository: repo1
    type: git
    name: org/repo1
    ref: test_multi_repo_pipeline
    trigger:
    - test_multi_repo_pipeline
  - repository: repo2
    type: git
    name: org/repo2
    ref: test_multi_repo_pipeline
    trigger:
    - test_multi_repo_pipeline

stages:
- stage: Build_and_Deploy
  pool:
    name: 'myAgent'
  jobs:
  - job: Build
    condition: always()
    steps:
    - checkout: repo1
      displayName: 'repo1'
    - checkout: repo2
      displayName: 'repo2'
    - script: dir $(Build.SourcesDirectory)
      displayName: 'Build.SourcesDirectory'
    - script: |
        cd repo1
        ls
        echo '----'
        npm run build:project_win_dev
        echo 'Running Tests'
        npm run test:coverage
      displayName: 'Build and Test: Repo 1'
    - script: |
        cd repo2
        ls
        echo '----'
        npm run build:project_win_dev
        echo 'Running Tests'
        npm run test:coverage
      displayName: 'Build and Test: Repo 2'
    - script: dir $(Build.SourcesDirectory)
      displayName: 'Build.SourcesDirectory'
      name: Test
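If you do want the conditional route despite the caveat above, a hedged sketch: map the triggering repository's name into an env var (for example from $(Build.Repository.Name) — exactly which variable identifies the triggering repo in a multi-repo trigger is an assumption to verify against your setup) and branch on it in the script:

```shell
# TRIGGER_REPO would be mapped from a pipeline variable (assumption);
# the default here is only so the sketch runs standalone
TRIGGER_REPO="${TRIGGER_REPO:-repo1}"
case "$TRIGGER_REPO" in
  repo1|repo2) BUILD_DIR="$TRIGGER_REPO" ;;
  *)           BUILD_DIR="." ;;   # fall back to building from the root
esac
echo "building in: $BUILD_DIR"
# cd "$BUILD_DIR" && npm run build:project_win_dev
```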

Migrating azure-pipelines.yaml to separate repo, but running on code of other repo

Ultimately, I'm trying to do this:
Move azure-pipelines.yaml and associated templates out of the code repository (code-repo).
Move them into a separate dedicated repository (pipeline-repo).
Have the pipeline look at the config for the pipeline in pipeline-repo, but run the pipeline on the code in the code-repo.
I'm referring the following documentation:
Use other repositories: this one refers to "templates in other repositories," but I'm trying to remove any pipeline configs so the code-repo is just purely application code... and the Dockerfile.
Define a repositories resource
For testing, I have this simple test.yaml:
# Triggers when PR is created due to branch policies
trigger: none

resources:
  repositories:
  - repository: code-repo
    type: git
    name: code-repo

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: Testing
  displayName: Test stage
  jobs:
  - job: ParallelA
    steps:
    - bash: echo Hello from parallel job A
      displayName: 'Run a one-line script'
When I create a PR on code-repo, it triggers the pipeline, which is to say branch policies are configured to refer to that pipeline. I do get the "Hello from parallel job A" printout.
But I don't see it pulling code-repo anywhere in the run logs.
My actual PR pipeline would look something like this:
trigger: none

resources:
  repositories:
  - repository: code-repo
    type: git
    name: code-repo

variables:
- template: templates/variables.yaml

pool:
  vmImage: $(vmImageName)

stages:
- template: templates/build/buildStage.yaml
...
Testing that confirms it isn't running on the code-repo PR but on pipeline-repo, so everything fails.
So it is unclear to me what I need to do from here to get the pipeline to run on the PR code from code-repo.
Suggestions?
Ok, I think I have it sorted out, at least some of my stages are now succeeding.
I came across this documentation which informed me of checkout.
So in addition to doing something like:
resources:
  repositories:
  - repository: code-repo
    type: git
    name: code-repo
Then you need to add a step called checkout like the following:
# Triggers when PR is created due to branch policies
trigger: none

resources:
  repositories:
  - repository: code-repo
    type: git
    name: code-repo

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: Testing
  displayName: Test stage
  jobs:
  - job: ParallelA
    steps:
    - checkout: code-repo
    - task: task1
    - task: task2
The checkout should set the context for the subsequent steps.

How to read a copied file from GitHub with YAML and jq

I am using a DevOps pipeline to read the contents of a JSON file hosted in a private repo on GitHub. I can see the file in the pipeline output, but jq is not reading the file; it's giving this error: "jq: error: Could not open file /home/vsts/work/1/s/config.json: No such file or directory"
this is my yaml code:
---
resources:
  repositories:
  - repository: testrepo
    type: github
    endpoint: testendpoint
    name: test/test01

trigger:
- none

pool:
  name: Hosted Ubuntu 1604

steps:
- script: |
    sudo apt-get install jq
    echo 'installing jq'
  displayName: 'Update the build number in readme.txt'
  name: JQ
- checkout: testrepo
  path: mytest # will checkout at $(Pipeline.Workspace)/PutMyCodeHere
- script: |
    dir ../mytest/
    data=$(jq 'to_entries | map(select(.value.datavalue=="true")) | from_entries' $(Agent.BuildDirectory)/s/data.json)
    echo "$data"
How can I point jq at my JSON file?
I can get the same error message. I added the Copy Files and Publish Build Artifacts tasks to inspect the s folder, and there we can see the data.json file.
It seems to be a jq issue rather than an Azure DevOps pipeline issue; you can raise it with jq support.
As a workaround, we can read the JSON file via the command jq . {file name}.json
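The jq filter from the question can also be sanity-checked locally with a throwaway file (contents invented here to match the datavalue shape from the question):

```shell
cat > /tmp/data.json <<'EOF'
{"featureA":{"datavalue":"true"},"featureB":{"datavalue":"false"}}
EOF
# Keep only entries whose datavalue is the string "true"
jq -c 'to_entries | map(select(.value.datavalue=="true")) | from_entries' /tmp/data.json
# prints: {"featureA":{"datavalue":"true"}}
```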
Update1
My test code:
resources:
  repositories:
  - repository: vitol
    type: github
    endpoint: GitHubTest
    name: vitoliutest/vitol

trigger:
- none

pool:
  name: Hosted Ubuntu 1604

steps:
- script: |
    sudo apt-get install jq
    echo 'installing jq'
  displayName: 'Update the build number in readme.txt'
  name: JQ
- checkout: vitol
  path: mytest # will checkout at $(Pipeline.Workspace)/PutMyCodeHere
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Agent.BuildDirectory)'
    Contents: '**'
    TargetFolder: '$(build.artifactstagingdirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
- script: |
    dir ../mytest/
    jq . $(Agent.BuildDirectory)/*/data.json
