I'm attempting to create a scheduled Azure Pipeline where I clone a self-hosted Bitbucket git repository using a Service Connection and mirror it to an existing Azure git repository.
A client keeps a repository of code on their own Bitbucket server. I'd like to set up a pipeline that pulls any changes from that repo on a scheduled interval into my own Azure repository so I can set up automated deployments.
I keep getting hung up on the Service Connection part of things. The Service Connection is set up as "Other Git" and contains all of the credentials I need to access the remote Bitbucket server.
trigger: none

schedules:
- cron: "*/30 * * * *" # RUN EVERY 30 MINUTES
  displayName: Scheduled Build
  branches:
    include:
    - my-branch
  always: true # RUNS ALWAYS REGARDLESS OF CHANGES MADE

pool:
  name: Azure Pipelines

steps:
- task: AzureCLI@2
  name: setVariables
  displayName: Set Output Variables
  continueOnError: false
  inputs:
    azureSubscription: "Service Connection Name"
    scriptType: ps
    scriptLocation: inlineScript
    addSpnToEnvironment: true
    inlineScript: |
      Write-Host "##vso[task.setvariable variable=username;isOutput=true]$($env:username)"
      Write-Host "##vso[task.setvariable variable=password;isOutput=true]$($env:password)"
- powershell: |
    # Use the variables from above to pull latest from
    # BitBucket then change the remote origin and push
    # everything to my Azure repo
  displayName: 'PowerShell Script'
When I run this I end up getting an error stating:
The pipeline is not valid. Job: setVariables input connectedServiceNameARM expects a service connection of type AzureRM but the provided service connection is of type git.
How can I access variables from a git service connection in my YAML pipeline?
The AzureCLI task only accepts service connections of the Azure Resource Manager type, so the git service connection you are using doesn't work.
For your scenario, you can check out the repo first. There is a Bitbucket Cloud service connection type for Bitbucket repositories. You can use it to check out additional repositories in your pipeline if you keep the YAML files in the Azure repo.
Here is a sample YAML:
resources:
  repositories:
  - repository: MyBitbucketRepo
    type: bitbucket
    endpoint: MyBitbucketServiceConnection
    name: MyBitbucketOrgOrUser/MyBitbucketRepo

trigger: none

schedules:
- cron: "*/30 * * * *" # RUN EVERY 30 MINUTES
  displayName: Scheduled Build
  branches:
    include:
    - my-branch
  always: true # RUNS ALWAYS REGARDLESS OF CHANGES MADE

pool:
  name: Azure Pipelines

steps:
- checkout: MyBitbucketRepo
- powershell: |
    # Use the variables from above to pull latest from
    # BitBucket then change the remote origin and push
    # everything to my Azure repo
  displayName: 'PowerShell Script'
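If you still need the final step, the placeholder PowerShell can push the checked-out Bitbucket branch into your Azure repo. A minimal sketch, assuming the Azure Repos URL, organization, project, and branch are placeholders and that the pipeline's build service account has Contribute permission on the target repo (so $(System.AccessToken) can push):

- powershell: |
    # With a single checkout step, the Bitbucket repo lands in the default sources directory
    cd "$(Build.SourcesDirectory)"
    # Push the checked-out branch to the Azure Repos mirror, authenticating with the job access token
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push https://dev.azure.com/MyOrg/MyProject/_git/MyAzureRepo HEAD:my-branch
  displayName: 'Mirror Bitbucket branch to Azure repo'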
Related
I have my own pipeline p-A with repository r-A, and pipeline p-B with another repo r-B.
I want to update only the pipeline script for p-A so that it actively triggers p-B, without any modification to p-B.
Below is the YAML pipeline script for p-B, which is already set up to run on a schedule.
pool:
  name: 'workflow_test_pool'

schedules:
- cron: "0 19 * * *"
  displayName: run test every day at 8PM CET
  branches:
    include:
    - main
  always: true

trigger: none

jobs:
- job:
  timeoutInMinutes: 30
  steps:
  - script: |
      python -m pytest tests/ -s
    displayName: 'Run the test'
Below is the pipeline script main.yaml for p-A:
pool:
  name: 'workflow_test_pool'

stages:
#########################
- template: pipeline2/p1.yaml
############################
- template: pipeline2/p2.yaml
  parameters:
    dependsOn:
    - FirstPipeline
So the question is: how do I trigger pipeline p-B from pipeline2/p2.yaml (in p-A)?
You can add a PowerShell script task as the last step of the pipeline to trigger pipeline p-B through the REST API. You will have to maintain a Personal Access Token, ideally as a secret variable.
The REST API call you will use:
https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/runs/run-pipeline?view=azure-devops-rest-7.1
Detailed step-by-step guide:
https://blog.geralexgr.com/cloud/trigger-azure-devops-build-pipelines-using-rest-api
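As an illustration, a minimal sketch of such a step; the organization, project, and pipeline id of p-B are placeholders, and the PAT is assumed to be stored as a secret pipeline variable named myPat:

- task: PowerShell@2
  displayName: 'Trigger pipeline p-B'
  inputs:
    targetType: inline
    script: |
      # Runs - Run Pipeline REST API; organization, project and pipelineId are placeholders
      $url = "https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs?api-version=7.1-preview.1"
      # Build a Basic auth header from the PAT stored in the secret variable "myPat"
      $token = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$(myPat)"))
      $body = '{ "resources": { "repositories": { "self": { "refName": "refs/heads/main" } } } }'
      Invoke-RestMethod -Uri $url -Method Post -ContentType "application/json" -Body $body -Headers @{ Authorization = "Basic $token" }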
Azure DevOps supports multi-repository checkout: you can reference the resources section in your YAML script and add another repository so the pipeline can check it out and trigger on it.
YAML code:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

steps:
- checkout: repo_a
- checkout: repo_b
- script: dir $(Build.SourcesDirectory)
I ran this pipeline from repo_a, and both repo_a and repo_b were checked out successfully.
You can run any task directly in a pipeline with multiple repositories, like below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

steps:
- checkout: repo_a
- checkout: repo_b
- task: AzureCLI@2
  inputs:
    azureSubscription: 'Subscription-name(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az resource list --location uksouth'
References:
Check out multiple repositories in your pipeline - Azure Pipelines | Microsoft Learn
Trigger azure Devops pipeline from another repository - GeralexGR
Multiple Repositories in a Single Azure Pipeline - DEV Community 👩💻👨💻
How to authorize a variable in a YAML template in another repo to be used in a different repo. IOW, how to declare variables in a template once and use them in multiple repos in Azure DevOps.
I am trying to migrate from classic pipelines to YAML in Azure DevOps, so I am trying to set up a repo to host all YAML templates so they can be referenced and reused by multiple repos for builds, etc.
I wrote this YAML pipeline as a prototype:
name: FirstPL

trigger:
- my_test_branch

pool: my-agent

resources:
  repositories:
  - repository: blah
    type: git
    name: foo/bar
    ref: refs/heads/poc

variables:
- template: pipeline_vars.yml@blah

steps:
- script: echo $(variable_from_pipeline_vars)
However, when I run this I get the following error:
An error occurred while loading the YAML build pipeline. Variable group was not found or is not authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.
How can I declare my variables and variable groups once, in a template hosted in a repo dedicated to those templates, and then use them over and over in multiple repos using the resources syntax above? Also, I tried to find a way to authorize the variables template but couldn't find anything to enable this.
How to authorize a variable in a YAML template in another repo to be used in a different repo. IOW, how to declare variables in a template once and use them in multiple repos in Azure DevOps. However, when I run this I get the following error:
An error occurred while loading the YAML build pipeline. Variable group was not found or is not authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.
You can directly add the variable group in your Azure DevOps project in the Library tab and save all the variables from pipeline_vars.yml in that variable group.
Now you can access this variable group in the YAML pipelines of multiple repos, like below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

variables:
- group: SharedVariables

steps:
- checkout: repo_a
- checkout: repo_b
- script: |
    echo $(databaseServerName)
- task: AzureCLI@2
  inputs:
    azureSubscription: 'xxx subscription(xxxxxxxxx-f598-44d6-b4fd-xxxxxxxxxxxx)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az resource list --location uksouth'
When the pipeline runs, it asks for permission for the variable group to be used in the pipeline.
I tried the same with another repo, repo_b, in the project, and it asks to approve access for the repositories and the variable group as well.
If you want this variable group to be accessed in multiple stages/repos/pipelines within the project without an authorization prompt, you can click Security at the top of the variable group and allow it.
I also created a variables template and referenced it in the YAML pipeline, to use it across multiple repos by checking out the other repo, like below:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

pool:
  vmImage: ubuntu-latest

workspace:
  clean: all

resources:
  repositories:
  - repository: repo_a
    type: git
    name: InternalProjects/repo_a
    trigger:
    - main
    - release
  - repository: repo_b
    type: git
    name: InternalProjects/repo_b
    trigger:
    - main

variables:
- template: pipeline_vars.yml

steps:
- checkout: repo_a
- checkout: repo_b
- script: |
    echo $(environmentName)
- task: AzureCLI@2
  inputs:
    azureSubscription: 'xxx subscription(xxxxxxxx-f598-44d6-b4fd-e2b6e97xxxxxx)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: 'az resource list --location uksouth'
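For reference, pipeline_vars.yml here is just a plain variables template file. A minimal sketch (the variable names are only illustrative, matching the echo above):

# pipeline_vars.yml - variables template stored in the repo
variables:
  environmentName: dev
  databaseServerName: my-dev-sql-server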
When I tried to reference the same template from another repo where it does not exist, the pipeline could not read the pipeline_vars.yml file, because it does not exist in that repo.
You can make use of variable groups like the above to reference the variables in this pipeline instead.
One of the possible reasons for this is that the project that hosts the repository with the variables does not allow access to its repositories from YAML pipelines.
To verify, go to your project's settings -> Pipelines -> Settings and check "Protect access to repositories in YAML pipelines". This setting is enabled by default. You could turn it off, or add a checkout step for the template repository to your pipeline YAML, as in the sketch below. See here for more information.
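A minimal sketch of that checkout approach, reusing the blah repository alias declared under resources in the question (whether this alone is enough depends on your project settings, so treat it as a starting point):

steps:
- checkout: self
# explicitly check out the repo that hosts the templates so the pipeline is authorized to read it
- checkout: blah
- script: echo $(variable_from_pipeline_vars)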
I have a repository in GitHub and want to integrate it with Azure DevOps. I connected the repositories in both GitHub and Azure DevOps.
When I commit some code into GitHub, the changes are not updated automatically in Azure. Is there any way to automatically pull the changes whenever there are new changes in GitHub?
Any references/suggestions are much appreciated.
Update:
Azure DevOps currently doesn't have a built-in feature to sync a GitHub repo to a DevOps repo.
If you need the feature, you can upvote the feature request here:
https://developercommunity.visualstudio.com/t/automatically-sync-azure-devops-repository-with-gi/516057
When enough people request a new feature, Microsoft may include it in future product plans.
1. You need to use a script to sync the repo, and use the CI trigger of a YAML pipeline to capture the changes in the GitHub repo.
trigger:
- <branch name>

pool:
  vmImage: ubuntu-latest

variables:
- name: system.debug
  value: true

steps:
- script: |
    echo put the logic here
  displayName: 'Push changes to DevOps repo'
For the code, you can refer to this page:
https://dileepveldi.medium.com/sync-azure-devops-repo-with-github-repo-35a958d7784e
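A minimal sketch of that sync logic, assuming the pipeline is connected to the GitHub repo and mirrors the triggering branch into an Azure Repos repository (the organization/project/repo in the URL are placeholders, and the project's build service account needs Contribute permission on that repo):

steps:
- checkout: self
  persistCredentials: true
- script: |
    # Push the commit that triggered this run into the Azure Repos mirror of the same branch
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push https://dev.azure.com/MyOrg/MyProject/_git/my-mirror HEAD:refs/heads/$(Build.SourceBranchName)
  displayName: 'Push changes to DevOps repo'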
2. After the above pipeline pushes the changes, you capture them via the CI trigger of a YAML pipeline on the DevOps side.
trigger:
- <branch name>

pool:
  vmImage: ubuntu-latest

variables:
- name: system.debug
  value: true

steps:
- script: |
    echo xxx
  displayName: 'Run a multi-line script'
Original Answer:
If you mean integrating a GitHub repo with an Azure DevOps pipeline, for example you need continuous integration on the main branch of your repo.
Then follow the steps below.
1. For a classic pipeline, enable continuous integration in the pipeline's Triggers settings.
2. For a YAML pipeline:
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
If integrating a GitHub repo with an Azure DevOps pipeline is not what you mean, please clarify your requirements.
Hello guys, I'm currently working with Azure Synapse Studio. My situation can be described this way:
I have 3 environments: Dev, Test, and Prod. Each of them has an Azure Synapse workspace, but I can access only the Dev one. I need to make changes from Dev for the other 2 environments as well (SQL scripts, pipelines, etc.) and then publish them to those environments without touching them.
So I think Azure DevOps could be the solution.
From the Dev Synapse Studio workspace I created 3 branches, 1 per environment, all of them linked to an Azure DevOps repo. Test and Prod are also linked to the same repo.
The problem is that the code on the Test and Prod workspaces could be different from the code on Dev, so I can't use the same ARM template (generated by publishing on the publish branch of the workspace) for all 3 environments. A good approach could be to find a way to hit the Publish button on the other environments without using the portal, for example via a REST API. Is that possible?
For now I only set up the 3-branch solution so I can manage the 3 environments directly from the Dev environment, but I don't think this is the right solution. Are changes applied to the other environments? Can I run SQL scripts or pipelines manually from the other environments?
This is my current situation: on the other environments I asked to set the collaboration and publish branches to the same value as the environment branch name (test-test-test and prod-prod-prod).
With the new version (V2) of the Synapse workspace deployment task (in preview as of 2022-06), it is now possible to deploy from any branch using Azure DevOps, so there is no need for a workspace_publish branch or the Publish button.
Just make the object JSON files available as artifacts to the release pipeline, and select "Validate and deploy" as the operation type.
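For a YAML pipeline, that step might look roughly like the sketch below. The task and its operation input also appear in the larger example later on this page; the operation value and the remaining input names here are assumptions, so check them against the task's documentation:

- task: Synapse workspace deployment@2
  displayName: 'Validate and deploy Synapse artifacts'
  inputs:
    operation: 'validateDeploy'                      # assumed value for the "Validate and deploy" operation type
    ArtifactsFolder: '$(Pipeline.Workspace)/synapse' # folder containing the object JSON files
    TargetWorkspaceName: 'my-test-workspace'         # placeholder target workspace
    azureSubscription: 'my-service-connection'       # assumed input name for the service connection
    ResourceGroupName: 'my-synapse-rg'               # assumed input name for the target resource group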
I am working with Microsoft directly, building a Synapse warehouse myself for a large corporation. We have the same issue, in that the Publish button must be pressed manually for the ARM templates to be generated. Microsoft have confirmed that there is no automatic method for this available right now; we had hoped to receive a Preview AzDevOps deployment task this month, but it turns out that it simply allows us to validate the JSON assets - it still deploys using the ARM template.
We have also looked at using Azure Data Factory tools to deploy from the JSON component files, but we run into issues with the dedicated pool stored procedure tasks being unsupported. :(
The only standard option to achieve this is to create a GitHub repository, set up continuous integration, and create a self-hosted Azure DevOps VM agent or use an Azure DevOps hosted agent.
Then you can set up release pipelines in Azure DevOps to work with the different environments. But you still need to commit the changes in the GitHub repository for each environment; there is nothing like the Publish button available.
Refer Continuous integration and delivery for an Azure Synapse Analytics workspace for more details.
This was bothering me as well, so I put together the following, to be run once any PR is approved to merge into the Synapse collaboration branch (in our case, "main").
For your case, you can modify to target the relevant workspaces.
See below Azure DevOps pipeline code.
What it does is:
1. Runs the Synapse workspace validation task, which also generates the workspace template JSONs as an artifact that needs to be published to the workspace_publish branch.
2. Checks out your publish branch, then commits and pushes the templates generated by the previous task.
3. So that the workspace UI does not think there are any unpublished changes when you click the "Publish" button, updates the workspace configuration to reflect the latest commit ID from the workspace COLLABORATION branch (main in this example) that was used to generate what was pushed to the PUBLISH branch in the previous step.
Any suggestions/improvements welcome. Hope this helps.
name: $(TeamProject)_$(Build.DefinitionName)_$(SourceBranchName)_$(Date:yyyyMMdd)$(Rev:.r) # sets Build.BuildNumber

trigger:
  branches:
    include:
    - main
  paths:
    include:
    - synapse/*

resources:
  repositories:
  - repository: 'Synapse-Publish'
    type: git
    name: Synapse # update to the name of your repo
    ref: workspace_publish # update to the name of your synapse PUBLISH branch

variables:
  repoName: $(Build.Repository.Name)
  azureSubscription: your_subscription
  azureTenantId: your_tenant_guid
  adoOrg: your_azure_devops_org_name
  adoProject: your_azure_devops_project_name
  SourceWorkspaceName: your_synapse_workspace_name
  workspacePublishBranch: workspace_publish # should be the same for you but update if not

stages:
- stage: build_stage
  displayName: Build, Run Validations, Publish NonProd if merged to main
  jobs:
  # other jobs excluded from this snippet
  - job: publish_workspace_artifacts_job
    displayName: Publish for $(SourceWorkspaceName) $(workspacePublishBranch)
    # only kick off workspace publish job for non-PR builds
    condition: and(not(or(failed(), canceled())), ne(variables['Build.Reason'], 'PullRequest'))
    pool:
      name: 'linux-vmss' # update this for whatever you need
    steps:
    - checkout: self # main
      clean: true
      persistCredentials: true
    - task: Synapse workspace deployment@2
      displayName: Generate workspace artifact templates
      condition: true
      continueOnError: false
      inputs:
        operation: 'validate' # despite this name, it also generates the templates
        ArtifactsFolder: '$(Build.SourcesDirectory)/$(repoName)/synapse'
        TargetWorkspaceName: $(SourceWorkspaceName)
    - checkout: 'Synapse-Publish' # workspace_publish
      clean: true
      persistCredentials: true
    - task: CmdLine@2
      displayName: 'Set git user'
      inputs:
        workingDirectory: '$(System.DefaultWorkingDirectory)'
        failOnStderr: true
        script: |
          git config --global user.email "whatever.you.want@your_org.com"
          git config --global user.name "Whatever You Want"
    - task: AzurePowerShell@5
      displayName: Publish to $(SourceWorkspaceName) $(workspacePublishBranch)
      condition: true
      inputs:
        azureSubscription: '$(azureSubscription)'
        ScriptType: InlineScript
        Inline: |
          # the output from the workspace validate step above is saved here, also published as an artifact with name = the synapse workspace name
          # Get-ChildItem $(Build.SourcesDirectory)/ExportedArtifacts -Name
          cd $(Build.SourcesDirectory)/$(repoName)
          git pull origin $(workspacePublishBranch)
          git switch $(workspacePublishBranch)
          Move-Item -Path $(Build.SourcesDirectory)/ExportedArtifacts/*.json -Destination $(Build.SourcesDirectory)/$(repoName)/$(SourceWorkspaceName) -Force -Verbose
          git add $(Build.SourcesDirectory)/$(repoName)/$(SourceWorkspaceName)/*.json
          $diff = git diff --cached
          $status = git status
          if (!($status.ToLower() -like "*nothing to commit*"))
          {
            echo "##[section]git push changes to repo";
            git commit -m "Update $(workspacePublishBranch) for source workspace $(SourceWorkspaceName) [skip ci]";
            git pull --rebase;
            git push origin $(workspacePublishBranch);
          }
          else
          {
            echo "##[warning]No new changes to push for source workspace $(SourceWorkspaceName) templates";
            git reset --hard origin/$(workspacePublishBranch)
            git clean -fxd
          }
        azurePowerShellVersion: 'LatestVersion'
    - task: AzurePowerShell@5
      displayName: Update $(SourceWorkspaceName) Git Config # this is required so when you click "Publish" within the workspace it doesn't think there are any changes vs. what's already published
      inputs:
        azureSubscription: '$(azureSubscription)'
        ScriptType: InlineScript
        Inline: |
          # get the latest version of this module, which now has the LastCommitId parameter that we need
          Install-Module -Name Az.Synapse -Confirm:$false -RequiredVersion 1.5.0 -Force
          Import-Module -Name Az.Synapse -MinimumVersion 1.5.0
          cd $(Build.SourcesDirectory)/$(repoName)
          [String] $latestCommitHash = git log -n 1 origin/main --pretty=format:"%H" # format to get only the hash value of the latest commit
          $config = New-AzSynapseGitRepositoryConfig `
            -RepositoryType AzureDevOpsGit `
            -TenantId $(azureTenantId) `
            -AccountName $(adoOrg) `
            -ProjectName $(adoProject) `
            -RepositoryName $(repoName) `
            -CollaborationBranch main `
            -RootFolder "/synapse" `
            -LastCommitId $latestCommitHash
          echo "##[section] Updating $(SourceWorkspaceName) git configuration to point to the latest main branch commit ID"
          # see https://learn.microsoft.com/en-us/powershell/module/az.synapse/update-azsynapseworkspace?view=azps-8.0.0
          Update-AzSynapseWorkspace -Name $(SourceWorkspaceName) -GitRepository $config
        azurePowerShellVersion: 'LatestVersion'
We have a multi-stage YAML pipeline that does CI/CD to an existing set of Azure Resources
The stages are
Build
Deploy to Development and Run Tests
If Previous succeeded - Deploy to Production and Run Tests
We use the AzureRmWebAppDeployment task during the deployment stages and we use the AppSettings argument to that task to specify environment-specific settings. For example
- task: AzureRmWebAppDeployment@4
  displayName: 'Deploy Azure App Service'
  inputs:
    azureSubscription: '$(azureSubscriptionEndpoint)'
    appType: '$(webAppKind)'
    WebAppName: 'EXISTING__AZURE_RESOURCENAME-DEV'
    Package: '$(Pipeline.Workspace)/**/*.zip'
    AppSettings: >
      -AzureAd:CallbackPath /signin-oidc
      -AzureAd:ClientId [GUID was here]
      -AzureAd:Domain [domain was here]
      -AzureAd:Instance https://login.microsoftonline.com/
      -AzureAd:TenantId [Id was here]
      -EmailServer:SMTPPassword SECRETPASSWORD
      -EmailServer:SMTPUsername SECRETUSERNAME
There are two settings in that set, EmailServer:SMTPUsername and EmailServer:SMTPPassword, that I want to pull from an Azure Key Vault. I know how to reference the KV secret from the Azure Portal using the syntax
@Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)
but how do I reference the value from the YAML pipeline so it is set in Azure?
As pointed out by Thomas in this comment on Referencing Azure Key Vault secrets from CI/CD YAML,
I can explicitly set the value in the YAML file like this:
-EmailServer:SMTPPassword @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)
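In the context of the deployment step from the question, that would look something like the snippet below (the SendGridUsername secret URI is a placeholder, and the app's managed identity still needs access to the vault for the references to resolve at runtime):

- task: AzureRmWebAppDeployment@4
  displayName: 'Deploy Azure App Service'
  inputs:
    azureSubscription: '$(azureSubscriptionEndpoint)'
    appType: '$(webAppKind)'
    WebAppName: 'EXISTING__AZURE_RESOURCENAME-DEV'
    Package: '$(Pipeline.Workspace)/**/*.zip'
    AppSettings: >
      -EmailServer:SMTPUsername @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridUsername/ReferenceGuidHere)
      -EmailServer:SMTPPassword @Microsoft.KeyVault(SecretUri=https://our.vault.azure.net/secrets/SendGridPassword/ReferenceGuidHere)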
You need to add an AzureKeyVault@1 task with RunAsPreJob set to true. This makes your Key Vault values available as variables to the CI/CD job, so you can use them as $(KEY-OF-SECRET-VALUE) in the rest of the steps of the job.
The following piece of YAML is a working example.
We set a group of environment variables for Python unittest, provided from Azure Key Vault:
trigger:
  batch: true # disable concurrent build for pipeline
  branches:
    include:
    - '*' # CI start for all branches

pool:
  vmImage: ubuntu-16.04

stages:
- stage: Test
  jobs:
  - job: sample_test_stage
    steps:
    - task: AzureKeyVault@1
      inputs:
        azureSubscription: 'YOUR SUBSCRIPTION HERE'
        KeyVaultName: 'THE-KEY-VAULT-NAME'
        SecretsFilter: '*'
        RunAsPreJob: true
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.7'
    - script: python -m unittest discover -v -s tests
      displayName: 'Execute python unittest'
      env: { MY-ENV-VAL-1: $(SECRET-VALUE-1), MY-ENV-VAL-2: $(SECRET-VALUE-2) }
Note that sometimes you need to approve the connection between Azure DevOps and another Azure service, such as Key Vault.