I am trying to improve the CI/CD process of an old, funky project whose code is not open to refactoring at the moment. Following the Azure documentation, I just can't get this to work, or even tell whether it is possible.
I have been able to improve the current state with an Azure Pipelines file that runs unit tests before merging into the releases/dev branch, but I want to go further.
Every PR into releases/dev will:
script: npm run test:unit
script: npm run build:dev
copy/publish the contents of the .div/ folder to an Azure Blob storage account configured for static site hosting
Any PR or merge into releases/staging will:
script: npm run build:staging
copy/publish the contents of the .div/ folder to an Azure Blob storage account configured for static site hosting
Any PR or merge into master will:
script: npm run test:unit
script: npm run build:production
copy/publish the contents of the .div/ folder to an Azure Blob storage account configured for static site hosting
I have 3 questions:
Is this possible within a single YAML file?
How do I run different tasks for different branches? I've defined jobs/stages but can't get them to be conditional.
Is there some magic anyone can direct me to that lets me copy the contents of a directory to a blob store? Or must it be zipped -> copied -> unzipped?
Thanks in advance from a new, sleep-deprived dad.
Is this possible within a single YAML file? How do I run different tasks for different branches? I've defined jobs/stages but can't get them to be conditional.
Of course. You can add these stages in a single YAML file. You then need to define the condition field for each stage or each job.
Here is an example for stages:
trigger:
- '*'

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: Test1
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage1!
- stage: Test2
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/dev'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/dev'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage2!
- stage: Test3
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/staging'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/staging'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage3!
You can set the required branches as triggers and then use Build.SourceBranch and System.PullRequest.TargetBranch to determine whether to run the current stage.
Build.SourceBranch -> for merges into a branch.
System.PullRequest.TargetBranch -> for pull requests.
Here are the docs about conditions and variables.
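Mapped onto the branches from the question, the releases/dev stage's condition would look something like this (a sketch; only the branch name changes):

- stage: DevBuild
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/releases/dev'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/releases/dev'))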
Is there some magic anyone can direct me to that lets me copy the contents of a directory to a blob store? Or must it be zipped -> copied -> unzipped?
Since you need to publish files to Azure Blob storage, you can use the Azure File Copy task directly.
Here is an example:
- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: xxx
    azureSubscription: xxx
    Destination: AzureBlob
    storage: xxx
    ContainerName: '$web'
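Put together for the releases/dev flow from the question, a stage could look roughly like this (a minimal sketch: the service connection and storage account names are placeholders, the SourcePath assumes the build emits to the .div/ folder mentioned in the question, and note that the Azure File Copy task runs on Windows agents only):

- stage: DeployDev
  pool:
    vmImage: 'windows-latest' # Azure File Copy requires a Windows agent
  jobs:
  - job: BuildAndPublish
    steps:
    - script: npm run test:unit
    - script: npm run build:dev
    - task: AzureFileCopy@4
      displayName: 'Publish build output to the static site container'
      inputs:
        SourcePath: '$(Build.SourcesDirectory)/.div' # assumed build output folder
        azureSubscription: 'my-arm-connection'       # placeholder service connection
        Destination: AzureBlob
        storage: 'mystaticsitestorage'               # placeholder storage account
        ContainerName: '$web'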
Hope this helps.
I'm using the below azure-pipeline.yml file to build a Docker image, push it to the Azure Docker registry, and restart the Azure Docker app service.
This YAML file uses variables set in the Azure pipeline; screenshot attached.
My issue is that I need to create 2-3 pipelines every week for different projects, and I have to add every variable manually for each project, copy-pasting from my config. Is there a way I can import a .env file or add multiple variables all at once while creating the pipeline?
Objectively, I need to cut down the single-variable copy-paste time and avoid errors that might occur.
1. You could use a variable group to reuse variables.
trigger:
- none

pool:
  vmImage: ubuntu-latest

variables:
- group: forTest

steps:
- script: |
    echo $(test1)
    echo $(test2)
  displayName: 'Run a multi-line script'
2. You could use a variable template.
trigger:
- none

pool:
  vmImage: ubuntu-latest

variables:
- template: vars.yml

steps:
- script: |
    echo $(test1)
    echo $(test2)
  displayName: 'Run a multi-line script'
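The referenced vars.yml would then define the shared values in one reusable file (a sketch; the variable names mirror the echoes above):

# vars.yml - shared variable template
variables:
  test1: 'value one'
  test2: 'value two'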
I need to pass a file path to a trigger job, where the file path is found within a specified JSON file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround to do this, as using extends won't work since GitLab won't allow a script section in a trigger job.
I thought about using the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which from my understanding is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv for setting variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME) # In the script, extract the file path.
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # Add the value to a dotenv file.
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be present for jobs in subsequent stages (or that declare needs: for the job).
You can also pull artifacts into child pipelines, in general.
But be warned: you probably don't want to override the PATH variable, since that's a special variable used to help find built-in binaries.
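If the trigger job should not wait on the whole previous stage, it can declare the dependency explicitly; in recent GitLab versions this same pattern also forwards the dotenv variables to the downstream pipeline (a sketch reusing the job names above):

another_job:
  stage: two
  needs:
    - job: my_job
      artifacts: true # pulls in the dotenv report, so FILE_PATH is available here
  trigger:
    project: my/project
    branch: staging
    strategy: depend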
The goal
I'm pretty new to Azure and pipelines, and I'm trying to trigger a pipeline from a PR in Azure. The repo lives in GitHub.
Here is the pipeline yaml: pipeline.yml
trigger: none # I turned this off to stop push triggers (they work fine)

pr:
  branches:
    include:
      - "*" # This does not trigger the pipeline

stages:
  - stage: EchoTriggerStage
    displayName: Echoing trigger reason
    jobs:
      - job: A
        steps:
          - script: echo 'Build reason::::' $(Build.Reason)
            displayName: The build reason

# ... some more stages here triggered by PullRequests....
# ... some more stages here triggered by push (CI)....
The PR on GitHub looks like this:
The problem
However, the pipeline is not triggered, while the push triggers work just fine.
I have read the docs but I can't see why this does not work.
The pipeline works perfectly fine when I trigger it through git push. However, when I try to trigger it with PRs from GitHub, nothing happens. In the code above, I tried turning off push triggers and allowing all PRs to trigger the pipeline. Still nothing.
I do not want to delete the pipeline yet and create a new one.
Update
I updated the YAML file as suggested below. Since the pipeline now actually runs through a push command, the rest of the details of the YAML file are not relevant and are left out.
Other things I have tried
Opening a new PR on GitHub
Closing/reopening the PR on GitHub
Making a change to an existing PR on GitHub
-> Still no triggering of pipeline.
You have a mistake in your pipeline. It should be like this:
trigger: none # turned off for push

pr:
  - feature/automated-testing

steps:
  - script: echo "PIPELINE IS TRIGGERED FROM PR"
Please change this
- stage:
  - script: echo "PIPELINE IS TRIGGERED FROM PR"
to
- stage:
  jobs:
  - job:
    steps:
    - script: echo "PIPELINE IS TRIGGERED FROM PR"
EDIT
I used your pipeline
trigger: none # I turned this off to stop push triggers (they work fine)

pr:
  branches:
    include:
      - "*" # This does not trigger the pipeline

stages:
  - stage: EchoTriggerStage
    displayName: Echoing trigger reason
    jobs:
      - job: A
        steps:
          - script: echo 'Build reason::::' $(Build.Reason)
            displayName: The build reason

# ... some more stages here triggered by PullRequests....
# ... some more stages here triggered by push (CI)....
and all seems to be working.
Here is the PR and here is the build for it.
I didn't do that myself, but you can try to enforce this via a branch policy. To do that, go to the repo settings and proceed as follows:
The solution was to go to Azure Pipelines -> Edit pipeline -> Triggers -> Enable PR validation.
You can follow the steps below to troubleshoot your pipeline.
1. First, make sure a pipeline was created from the YAML file on the Azure DevOps portal. See the example in this tutorial.
2. The below part of your YAML file is incorrect: the - script task should be under a steps section.
Change:
stages:
- stage:
  - script: echo "PIPELINE IS TRIGGERED FROM PR"
To:
stages:
- stage:
  jobs:
  - job:
    steps:
    - script: echo "PIPELINE IS TRIGGERED FROM PR"
3. I saw you used templates in your YAML file. Please make sure the template YAML files are in the correct format. For example:
your dockerbuild-dashboard-client.yml template is a step template. You need to make sure its contents look like below:
parameters:
  ...

steps:
- script: echo "task 1"
- script: echo "task 2"
And your webapprelease-dashboard-dev-client.yml is a job template. Its contents should look like below:
parameters:
  name: ''
  pool: ''
  sign: false

jobs:
- job: ${{ parameters.name }}
  pool: ${{ parameters.pool }}
  steps:
  - script: npm install
  - script: npm test
  - ${{ if eq(parameters.sign, 'true') }}:
    - script: sign
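For reference, a job template like this is consumed from the main pipeline roughly as follows (a sketch; the parameter values are placeholders):

jobs:
- template: webapprelease-dashboard-dev-client.yml
  parameters:
    name: Linux
    pool:
      vmImage: 'ubuntu-latest'
    sign: true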
4. After the pipeline is created on the Azure DevOps portal, you can run it manually to make sure there are no errors in the YAML file and the pipeline can be executed successfully.
5. If all of the above check out but the PR trigger is still not working, you can try deleting the pipeline created on the Azure DevOps portal in the first step and recreating a new pipeline from the YAML file.
I have weird behavior with copying files from a hosted agent and then downloading them back to the same agent.
It looks like it copies the files from agent A, but the same pipeline downloads them back to agent B, which is on another machine doing another, unrelated build job.
Upload from ios_docker_142_linux_slave_1.
Download back to a different agent, ios_docker_141_linux_slave_3. Why?
- task: CopyFiles@2
  inputs:
    CleanTargetFolder: 'true'
    SourceFolder: '$(Agent.HomeDirectory)/../${{parameters.Folderpath}}'
    Contents: '**'
    TargetFolder: '$(build.artifactstagingdirectory)'
This is expected behavior if you are using parallel jobs. According to your screenshot, there are multiple jobs: self-hosted connect, mac_agent, copy_back_files_to_self..
One agent runs one job at a time. If an agent is running a job, it is in busy status, and other jobs will look for idle agents to run on. Parallel jobs are for running multiple jobs on multiple agents at a time.
To achieve what you want, you need to pin the job to a specific agent in your YAML file. The pool name goes in the name field, and then you can add demands. You may try the following YAML code:
stages:
- stage: Deploy
  pool:
    name: AgentPoolName (e.g. alm-aws-pool)
    demands:
    - agent.name -equals AgentName (e.g. deploy-05-agent1)
  jobs:
  - job: BuildJob
    steps:
    - script: echo Building!
I'm setting up several pipelines in Azure DevOps. To make my team's life easier, I'm using job templates.
These job templates live in a proper repository, just for them.
For every pipeline I define the repository to get the templates from.
Some tasks in these templates run PowerShell code, and I want this code to be in a script file so it is reusable and stored in the same repo as the template.
When the pipeline runs and the template is embedded, it tries to locate the PowerShell script inside the project repo actually being built/deployed.
How can I achieve this?
The workaround is to have inline code, which I really don't want.
Any constructive answer will be very appreciated.
Thanks
After some digging, I couldn't find any way to specify a script file as the source for a PowerShell task in a template.
Inside pipeline definition:
resources:
  repositories:
    - repository: templates
      type: git
      name: deploy-templates

variables:
  artifactName: 'Trade Data ETL - $(Build.SourceBranchName)'

stages:
  - stage: Build
    displayName: Build
    variables:
      - group: DEV-Credential-Group
      - group: COMMON-Settings-Group
    jobs:
      - template: ssis/pipelines/stage-build.yml@templates # Template reference
        parameters:
          artifactName: '$(artifactName)'
Inside template file:
- task: PowerShell@2
  inputs:
    filePath: ssis/pipelines/scripts/build-ssis-project.ps1
    arguments: '-ProjectToBuild "tradedata-ldz-ssis/tradedata-ldz-ssis.dtproj"'
    pwsh: true
Update 2021
According to learn.microsoft.com, you can now also check out multiple repositories without custom scripting.
If you check out more than one repository, a separate folder containing the repository is created below $(Build.SourcesDirectory).
You can define multiple repositories like this:
resources:
  repositories:
    - repository: devops
      type: git
      name: DevOps
      ref: main
    - repository: infrastructure
      type: git
      name: Infrastructure
      ref: main
And in the steps simply check them out as follows:
steps:
  - checkout: self
  - checkout: devops
  - checkout: infrastructure
  # List all available repositories
  - script: ls
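Applied to the original question, the template's PowerShell step could then reference the script through the checked-out templates repository (a sketch; the folder name follows the checkout layout described above, using the deploy-templates repository from the pipeline definition):

steps:
  - checkout: self
  - checkout: templates
  - task: PowerShell@2
    inputs:
      filePath: '$(Build.SourcesDirectory)/deploy-templates/ssis/pipelines/scripts/build-ssis-project.ps1'
      pwsh: true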
Original Answer
Currently the resources section only supports YAML files in other repositories. However, you could simply check out the repository in a task and then run the desired PowerShell script.
steps:
  - task: PowerShell@2
    inputs:
      targetType: inline
      script: |
        git clone -b <your-desired-branch> https://azuredevops:$($env:token)@dev.azure.com/<your-organization>/<your-project>/_git/<your-repository> <target-folder-name>
        ./<target-folder-name>/foo.ps1
    env:
      token: $(System.AccessToken)
This script would check out an arbitrary branch and execute a script foo.ps1 in the root of the target repository.
Call - checkout: templates inside the template file. This might only work when you insert a template, but it successfully sees the repository resource and pulls it down.
You can copy the script files from the source directory. Currently, you have not mentioned the root folder in -
ssis/pipelines/scripts/build-ssis-project.ps1
Assuming you are building on a repo where the PowerShell script resides, try -
- task: PowerShell@1
  inputs:
    scriptName: '$(ScriptsDir)/ssis/pipelines/scripts/build-ssis-project.ps1'
Pass in the value of ScriptsDir, which could be the build sources directory or the build working directory.
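For instance, ScriptsDir could be declared as a pipeline variable (a sketch; the value assumes the script sits in the default sources directory):

variables:
  ScriptsDir: '$(Build.SourcesDirectory)'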