I need the git checkout in my pipeline to run over HTTP/2. However, I cannot find any such option: Azure exclusively sets http.version to HTTP/1.1, and this is blocking my checkout because of firewalls. Any help or workaround for setting http.version to HTTP/2 is appreciated. Thank you.
Below is a snapshot of the pipeline logs.
This is my pipeline YAML:
pool:
  name: "some pool"

trigger:
- some branch

stages:
- stage: main
  jobs:
  - job: synchronize
    steps:
    - checkout: self
      clean: true
      displayName: Git checkout
      continueOnError: true
    - task: Bash@3
      inputs:
        targetType: filePath
        filePath: $(System.DefaultWorkingDirectory)/scripts/sync_git.sh
        workingDirectory: $(System.DefaultWorkingDirectory)
I am running the Azure Pipelines agents as containers in Kubernetes.
You can't configure http.version in the checkout step. Microsoft forces HTTP/1.1; you can see it in the Microsoft agent source code:
// Force Git to HTTP/1.1. Otherwise IIS will reject large pushes to Azure Repos due to the large content-length header
// This is caused by these header limits - https://learn.microsoft.com/en-us/iis/configuration/system.webserver/security/requestfiltering/requestlimits/headerlimits/
int exitCode_configHttp = await gitCommandManager.GitConfig(executionContext, targetPath, "http.version", "HTTP/1.1");
if (exitCode_configHttp != 0)
{
    executionContext.Warning($"Forcing Git to HTTP/1.1 failed with exit code: {exitCode_configHttp}");
}
As a workaround, you can disable the native checkout step with checkout: none and implement a custom checkout: just add a command-line task and run the git commands yourself:
git config --global http.version HTTP/2
git clone YOUR-REPO-URL
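For illustration, a minimal sketch of what that could look like in the pipeline YAML. YOUR-REPO-URL is a placeholder, and you still need to supply credentials yourself (for example a PAT or $(System.AccessToken)), depending on where the repo lives:

steps:
- checkout: none   # skip the native checkout that forces HTTP/1.1
- script: |
    git config --global http.version HTTP/2
    git clone YOUR-REPO-URL   # placeholder; add credentials as your setup requires
  displayName: Custom git checkout over HTTP/2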
How do I check out from a different repo? Below is the code I have in the resources/repositories section; I am currently getting the error below.
The repository qp-EAR-AA-8643 in project ea-quality-process-improvement could not be retrieved. Verify the name and credentials being used.
This is how I am trying to check out the code:
- checkout: QPExpressDestinationRepo
  persistCredentials: true
  clean: true
I have applied the answers given below, but I am still getting this error. I gave the link of the repo which I am trying to connect to (QPExpressDestinationRepo).
I can reproduce your issue:
You need to make sure of two things for the service connection to be usable in 'resources.repositories.repository':
1. Make sure the service connection type is 'Azure Repos/Team Foundation Server'.
2. Make sure the 'Authentication method' is 'Token Based Authentication'.
This is my pipeline YAML definition, and it works fine:
trigger:
- none

resources:
  repositories:
  - repository: QPExpressDestinationRepo
    type: git
    name: xxx/xxx
    endpoint: TestOrgBowman # Also need to check common git connection type.
    ref: refs/heads/main

pool:
  vmImage: ubuntu-latest

steps:
- checkout: QPExpressDestinationRepo
  persistCredentials: true
  clean: true
- script: ls
  displayName: 'Run a one-line script'
This is the structure of the repository in another organization:
This is the pipeline result from the original organization:
See the official documentation for the repository resource definition.
I have a repository in GitHub and want to integrate it with Azure DevOps. I have connected the repositories in both GitHub and Azure DevOps.
When I commit code to GitHub, the changes are not updated automatically in Azure. Is there any way to automatically pull the changes when there are new changes in GitHub?
Any references/suggestions are much appreciated.
Update:
Azure DevOps doesn't have a built-in feature to sync a GitHub repo to a DevOps repo right now.
If you need the feature, you can upvote the feature request here:
https://developercommunity.visualstudio.com/t/automatically-sync-azure-devops-repository-with-gi/516057
When enough people request a new feature, Microsoft will consider including it in future product plans.
1. You need to use a script to sync the repo, and use the CI trigger of a YAML pipeline to capture the changes in the GitHub repo.
trigger:
- <branch name>

pool:
  vmImage: ubuntu-latest

variables:
- name: system.debug
  value: true

steps:
- script: |
    echo put the logic here
  displayName: 'Push changes to DevOps repo'
For the sync code, you can refer to this page (a sketch of the logic is also shown below):
https://dileepveldi.medium.com/sync-azure-devops-repo-with-github-repo-35a958d7784e
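As an illustration only, the "put the logic here" part could be a mirror push along these lines. The repository URLs and the devopsPat secret variable are placeholders and assumptions, not values from the linked article:

steps:
- checkout: none
- script: |
    # Mirror-clone the GitHub repo (placeholder URL)
    git clone --mirror https://github.com/YOUR-USER/YOUR-REPO.git
    cd YOUR-REPO.git
    # Push everything to the Azure DevOps repo, authenticating with a PAT stored
    # in the assumed secret variable devopsPat (any username works with a PAT)
    git push --mirror https://pat:$(devopsPat)@dev.azure.com/ORG/PROJECT/_git/REPO
  displayName: 'Push changes to DevOps repo'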
2. Then, after the above pipeline pushes the changes, you need to capture them via the CI trigger of a YAML pipeline on the DevOps side.
trigger:
- <branch name>

pool:
  vmImage: ubuntu-latest

variables:
- name: system.debug
  value: true

steps:
- script: |
    echo xxx
  displayName: 'Run a multi-line script'
Original Answer:
If you mean integrating a GitHub repo with an Azure DevOps pipeline, for example you need continuous integration on the main branch of your repo, then follow the steps below.
1. For a classic pipeline:
2. For a YAML pipeline:
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
If what you mean is not integrating a GitHub repo with an Azure DevOps pipeline, please clarify your requirements.
I'm attempting to create a scheduled Azure Pipeline where I clone a self-hosted Bitbucket git repository using a service connection and mirror it to an existing Azure git repository.
A client keeps a repository of code on their own Bitbucket server. I'd like to set up a pipeline where I pull any changes from that repo on a scheduled interval into my own Azure repository so I can set up automated deployments.
I keep getting hung up on the service connection part of things. The service connection is set up as "Other Git" and contains all of the credentials I need to access the remote Bitbucket server.
trigger: none

schedules:
- cron: "*/30 * * * *" # RUN EVERY 30 MINUTES
  displayName: Scheduled Build
  branches:
    include:
    - my-branch
  always: true # RUNS ALWAYS REGARDLESS OF CHANGES MADE

pool:
  name: Azure Pipelines

steps:
- task: AzureCLI@2
  name: setVariables
  displayName: Set Output Variables
  continueOnError: false
  inputs:
    azureSubscription: "Service Connection Name"
    scriptType: ps
    scriptLocation: inlineScript
    addSpnToEnvironment: true
    inlineScript: |
      Write-Host "##vso[task.setvariable variable=username;isOutput=true]$($env:username)"
      Write-Host "##vso[task.setvariable variable=password;isOutput=true]$($env:password)"
- powershell: |
    # Use the variables from above to pull latest from
    # BitBucket then change the remote origin and push
    # everything to my Azure repo
  displayName: 'PowerShell Script'
When I run this I end up getting an error stating:
The pipeline is not valid. Job: setVariables input connectedServiceNameARM
expects a service connection of type AzureRM but the provided service connection is of type git.
How can I access variables from a git service connection in my YAML pipeline?
The AzureCLI task only accepts service connections of the Azure Resource Manager type, so the git connection you are using doesn't work.
For your scenario, you can check out the repo first. There is a Bitbucket Cloud service connection for Bitbucket repositories. You can use it to check out multiple repositories in your pipeline while keeping the YAML files in the Azure repo.
Here is the sample YAML and a screenshot:
resources:
  repositories:
  - repository: MyBitbucketRepo
    type: bitbucket
    endpoint: MyBitbucketServiceConnection
    name: MyBitbucketOrgOrUser/MyBitbucketRepo

trigger: none

schedules:
- cron: "*/30 * * * *" # RUN EVERY 30 MINUTES
  displayName: Scheduled Build
  branches:
    include:
    - my-branch
  always: true # RUNS ALWAYS REGARDLESS OF CHANGES MADE

pool:
  name: Azure Pipelines

steps:
- checkout: MyBitbucketRepo
- powershell: |
    # Use the variables from above to pull latest from
    # BitBucket then change the remote origin and push
    # everything to my Azure repo
  displayName: 'PowerShell Script'
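A rough sketch of what that placeholder PowerShell step could do, assuming the checked-out sources land in the default working directory and the build identity is allowed to push to the target Azure repo with $(System.AccessToken) (the remote URL and branch name are placeholders):

- powershell: |
    # The Bitbucket repo was already checked out by the checkout step above.
    # Add the Azure repo as a second remote (placeholder URL) and push to it,
    # authenticating with the job access token.
    git remote add azure https://anything:$(System.AccessToken)@dev.azure.com/ORG/PROJECT/_git/REPO
    git push azure HEAD:my-branch
  displayName: 'Push Bitbucket sources to Azure repo'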
How could I rewrite this Python script so that it can run within an Azure DevOps pipeline and export the dataframe as a CSV to the DevOps repository? I'm able to achieve this locally but would like to achieve it remotely.
Put differently, how can I export a pandas dataframe to a folder in the DevOps repo as a CSV file using an Azure DevOps pipeline task? Below is the Python script that needs to run as a pipeline task.
local_path in this case should be an Azure DevOps path.
from azureml.core import Workspace, Dataset

# 'dataframe' is a pandas DataFrame prepared earlier in the script
local_path = 'data/prepared.csv'
dataframe.to_csv(local_path)
⚠️ You really should not do this. Azure Pipelines are for building code, not for processing data (assuming that you meant Azure DevOps Pipelines, as opposed to Azure ML Pipelines).
Also, you should not commit data to your repository.
If you still want to proceed, here is an example of what you are trying to achieve. Note that for the last line, i.e. git push, you need to give the agent permission to write to the repository. See Run Git commands in a script for (approximate) documentation on how to do that for your account.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

steps:
- checkout: self
  persistCredentials: true
- task: UsePythonVersion@0
  inputs:
    versionSpec: '3.8'
    addToPath: true
    architecture: 'x64'
- script: |
    python your_data_generating_script.py
    git config --global user.email "you@example.com"
    git config --global user.name "Your Name"
    git add data/prepared.csv
    git commit -m 'test commit'
    git push origin HEAD:master
  displayName: 'push data to master'
So I'm trying to learn deployment with Azure DevOps. I have this Angular app sitting in GitLab, which already has a CI/CD pipeline with Jenkins to a Kubernetes cluster. I was thinking of doing the same with Azure DevOps via YAML, which according to the Azure docs is not possible directly from GitLab.
So what I'm trying to do is create a CI pipeline from GitHub which checks out the GitLab UI repo and builds it for deployment.
I have created a repository resource in my pipeline YAML file below. Azure gives me an error saying:
Repository JpiPipeline references endpoint https://gitlab.com/myusername/myUiRepo.git which does not exist or is not authorized for use
trigger:
- master

resources:
  repositories:
  - repository: UiPipeline # alias
    type: git
    name: repository_name
    # ref: refs/heads/master # ref name to use; defaults to 'refs/heads/master'
    endpoint: https://gitlab.com/myusername/myUiRepo.git # <-- Is this possible?

stages:
- stage: Checkout
  jobs:
  - job: Build
    pool:
      vmImage: 'Ubuntu-16.04'
    continueOnError: true
    steps:
    - checkout: JpiPipeline
    - script: echo "hello to my first Build"
The repository type gitlab is not supported in YAML pipelines yet. The currently supported types are git, github, and bitbucket; see the supported types.
The workaround to get the GitLab repo sources is to run git commands inside a script task.
In the example YAML pipeline below:
- checkout: none is used to avoid checking out the GitHub source.
- git clone https://username:password@gitlab.com/useraccount/reponame.git is used to clone the GitLab repo inside a script task.
stages:
- stage: Checkout
  jobs:
  - job: Build
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - checkout: none
    - script: |
        git clean -ffdx
        git clone https://username:password@gitlab.com/useraccount/reponame.git
        # if your password or username contains @, replace it with %40
Your GitLab repo will be cloned to the folder $(system.defaultworkingdirectory)/reponame.
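Embedding the password directly in the URL exposes it in the YAML. A variation of the same script step, assuming the GitLab credentials are stored in pipeline variables (the names gitlabUser and gitlabPat are assumptions, with gitlabPat marked as secret), could look like this:

steps:
- checkout: none
- script: |
    git clean -ffdx
    # gitlabUser / gitlabPat are assumed pipeline variables; the secret is passed
    # to the script through the env mapping below rather than written in the YAML.
    git clone https://$(gitlabUser):$GITLAB_PAT@gitlab.com/useraccount/reponame.git
  env:
    GITLAB_PAT: $(gitlabPat)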
Another workaround is to use a classic UI pipeline; the GitLab repo type is supported there.
You can choose 'Use the classic editor' to create a classic UI pipeline.
When you get to the 'Select a source' page, choose 'Other Git' and click 'Add connection' to add your GitLab repo URL. Then the pipeline will get the sources from your GitLab repo.