Error when deploying DACPAC via Azure DevOps Pipelines

I am fleshing out a CI/CD process for an Azure SQL DB deployed via Azure DevOps Pipelines. I am using the AdventureWorks database and set up a Visual Studio project importing the schema.
I have a pipeline configured to publish the dacpac and run a subsequent deployment using the SqlAzureDacpacDeployment@1 task, and am getting the below error:
2020-10-10T02:36:34.1421137Z ##[error]Unable to connect to target server 'server.database.windows.net'. Please verify the connection information such as the server name, login credentials, and firewall rules for the target server.
2020-10-10T02:36:34.1605855Z ##[error]Windows logins are not supported in this version of SQL Server.
2020-10-10T02:36:34.2143924Z ##[error]The Azure SQL DACPAC task failed. SqlPackage.exe exited with code 1.Check out how to troubleshoot failures at https://aka.ms/sqlazuredeployreadme#troubleshooting-
2020-10-10T02:36:34.2522414Z ##[section]Finishing: SqlAzureDacpacDeployment
I am using windows-latest, and here is my YAML pipeline:
trigger:
- master
pool:
  vmImage: 'windows-latest'
jobs:
- job: BuildDeploySQL
  variables:
  - group: SQLServerLogin
  steps:
  - task: VSBuild@1
    inputs:
      solution: '**\*.sln'
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(Pipeline.Workspace)'
      publishLocation: 'pipeline'
  - task: SqlAzureDacpacDeployment@1
    inputs:
      azureSubscription: 'Subscription Name here'
      AuthenticationType: 'server'
      ServerName: 'server.database.windows.net'
      DatabaseName: 'AdventureWorks'
      SqlUsername: 'sqladmin'
      SqlPassword: ${{ variables.Password }}
      deployType: 'DacpacTask'
      DeploymentAction: 'Publish'
      DacpacFile: '$(Pipeline.Workspace)\s\AdventureWorks\bin\Debug\*.dacpac'
      IpDetectionMethod: 'AutoDetect'
I have tried deploying from my local machine and it succeeds using the same SQL credentials. Additionally, I have confirmed that the SQL server has "Allow Azure services" enabled. I have also tried deploying the dacpac to a new, empty database and get the same error.
I do believe this could just be a generic error message, as my deployment logs do show a successful connection to the server:
2020-10-10T02:36:18.7912964Z Invoke-Sqlcmd -ServerInstance "server.database.windows.net" -Database "AdventureWorks" -Username "sqladmin" -Password ****** -Inputfile
....
2020-10-10T02:36:33.0554895Z Initializing deployment (Start)
** Update
Just to rule it out, I created a new SQL login with db_owner permissions and ran the deployment using that, and got the same error message.

The above error is probably because the build agent IP is not allow-listed in the firewall rules of your Azure SQL server. See this link about IP ranges for Microsoft-hosted agents.
You can check the firewall rules of your Azure SQL server and try allowing all IP ranges.
You can also add an Azure CLI task to get the agent IP and dynamically create a firewall rule allowing that IP in your pipeline. See this thread.
steps:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: 'azureSubscription'
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      $agentIp = (New-Object net.webclient).downloadstring("http://checkip.dyndns.com") -replace "[^\d\.]"
      az sql server firewall-rule create -g $(rg) -s $(server) -n test --start-ip-address $agentIp --end-ip-address $agentIp
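If you do not want to leave that rule in place, you could also remove it at the end of the pipeline. A minimal sketch, assuming the same service connection and the rule name test used in the create step above:

- task: AzureCLI@2
  displayName: 'Remove temporary firewall rule'
  condition: always()  # run the cleanup even if the deployment failed
  inputs:
    azureSubscription: 'azureSubscription'
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      # Delete the rule that was created for the hosted agent's IP
      az sql server firewall-rule delete -g $(rg) -s $(server) -n test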
You can also create a self-hosted agent on your local machine or an Azure VM and run your pipeline on that agent. Note that you still need to allow-list that machine's IP for the Azure SQL server.

The root issue was that the password secret contained characters which PowerShell treated as escape characters. Wrapping the secret in double quotes ("") resolved it.
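For illustration, a minimal sketch of that change against the task above (the only difference is the quoting around the secret; the variable name Password from the SQLServerLogin variable group is assumed):

- task: SqlAzureDacpacDeployment@1
  inputs:
    # ...same inputs as in the pipeline above...
    SqlUsername: 'sqladmin'
    # Wrapping the secret in double quotes keeps PowerShell from
    # interpreting special characters in the password.
    SqlPassword: '"${{ variables.Password }}"'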

Related

KubernetesManifest createSecret fails with illegal base64 data at input byte 4329 in Azure DevOps pipeline execution

I am trying to deploy an application into a private AKS cluster using Azure DevOps pipelines on a VMSS agent.
I have created a service connection with the kubeConfig option (saved without verification) for connectivity.
Task:
- task: KubernetesManifest@0
  displayName: create secret
  inputs:
    action: 'createSecret'
    kubernetesServiceConnection: PRIVATE_AKS_SC
    namespace: 'default'
    secretType: 'dockerRegistry'
    secretName: 'mysecret'
    dockerRegistryEndpoint: 'PRIVATE_ACR_SC'
Error:
The same task works for a public AKS cluster.

Deploy packages to multiple webapp service by azure pipeline single stage

I have more than 100 web app services in Azure. I want to deploy packages to those 100 web apps with one pipeline YAML file, but I couldn't find any documentation for this. I found one Microsoft document, and it suggests adding more pipeline steps; with 100 web app services I would have to add 100 steps, one per deployment. That is not efficient and is time consuming. I want something like this step:
- task: AzureWebApp@1
  displayName: 'Azure Web App Deploy'
  inputs:
    azureSubscription: '$(Parameters.connectedServiceName)'
    appType: webApp
    ResourceGroupName: $(group)
    appName: 'JustGoTestAgain, justgotesttwo, webapp123, webapp555, webapp777 and so on ........'
    package: '$(build.artifactstagingdirectory)/**/*.zip'
This YAML file shows an error. I couldn't find any extension to fix it, and I also couldn't find any Azure PowerShell deployment command for this. How can I solve it?
You will not be able to do it like this. However, you can use an Azure CLI task:
- task: AzureCLI@2
  displayName: Azure CLI
  inputs:
    azureSubscription: '$(Parameters.connectedServiceName)'
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      # List the target web apps (add the rest of your apps here)
      $apps = @('JustGoTestAgain', 'justgotesttwo', 'webapp123', 'webapp555', 'webapp777')
      foreach ($app in $apps) {
        az webapp deployment source config-zip -g $(group) -n $app --src '$(build.artifactstagingdirectory)/SOME_FOLDER/Artifact.zip'
      }
And here you have more details about the deployment itself.
Another approach, with multiple tasks but continuing if one fails, is:
parameters:
- name: apps
  type: object
  default:
  - JustGoTestAgain
  - justgotesttwo
  - and so on

steps:
- ${{ each app in parameters.apps }}:
  - task: AzureWebApp@1
    displayName: 'Azure Web App Deploy ${{ app }}'
    continueOnError: true
    inputs:
      azureSubscription: '$(Parameters.connectedServiceName)'
      appType: webApp
      ResourceGroupName: $(group)
      appName: ${{ app }}
      package: '$(build.artifactstagingdirectory)/**/*.zip'
There was an issue with a space. Now it is fine. Apart from that, there is only one issue, with connectedServiceName:
Job Job: Step input azureSubscription references service connection $(Parameters.connectedServiceName) which could not be found. The service connection does not exist or has not been authorized for use. For authorization details, refer to https://aka.ms/yamlauthz.
Which I skipped here, as you already have it in your solution.
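If that error persists, one workaround (a sketch only; the connection name MyAzureServiceConnection is a placeholder) is to pass the service connection as a compile-time parameter instead of a runtime variable, since azureSubscription has to be resolvable when the pipeline is authorized:

parameters:
- name: azureSubscription
  type: string
  default: 'MyAzureServiceConnection'  # placeholder: your service connection name
- name: apps
  type: object
  default:
  - JustGoTestAgain
  - justgotesttwo

steps:
- ${{ each app in parameters.apps }}:
  - task: AzureWebApp@1
    displayName: 'Azure Web App Deploy ${{ app }}'
    continueOnError: true
    inputs:
      # a literal value resolved at template expansion time avoids the
      # "service connection could not be found" authorization error
      azureSubscription: ${{ parameters.azureSubscription }}
      appType: webApp
      ResourceGroupName: $(group)
      appName: ${{ app }}
      package: '$(build.artifactstagingdirectory)/**/*.zip'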

Is it possible to use Azure CLI on DevOps (hosted/self-hosted) while connecting with managed identity

I have a small test project with a crude start of a pipeline at https://github.com/veikkoeeva/dockerservice/blob/main/azure-pipelines.yml. Currently it's just to check that it's possible to connect to Azure, so:
trigger:
- master
pool:
  vmImage: windows-latest
steps:
- task: AzureCLI@2
  displayName: Az --version
  inputs:
    azureSubscription: 'TestManagedIdentityConnection'
    scriptType: pscore
    scriptLocation: inlineScript
    inlineScript: |
      az --version
- task: AzureCLI@2
  inputs:
    azureSubscription: 'TestManagedIdentityConnection'
    scriptType: 'pscore'
    scriptLocation: 'scriptPath'
    scriptPath: '$(System.DefaultWorkingDirectory)\devops.ps1'
But this fails on the login step like so:
The service connection scope is at the subscription level. It appears the hosted image tries to connect to an internal Azure token endpoint. Is there a way to use a managed identity to sign the CLI in on hosted images? What could it look like using a self-hosted agent and a managed identity?
This seems to work with "the usual" service principal. But it appears developers are often forbidden from creating SPNs in the company AD, so creating a service connection fails. It appears it is often possible to create a service connection using a managed identity, but here we are with this problem. :)
<edit: Reading from https://learn.microsoft.com/en-us/cli/azure/authenticate-azure-cli?view=azure-cli-latest the options could be either az login --identity or a certificate. With az login --identity it appears there is still the same problem of calling the same endpoint as earlier, and it fails for the same reason.
<edit 2: Duh! In the image it's already called with the --identity switch!

Azure pipeline : SqlAzureDacpacDeployment fails to execute

My pipeline is stuck on the deployment job with the message: The agent request is not running because all potential agents are running other requests. Current position in queue: 1
This message is wrong. If I remove the SqlAzureDacpacDeployment task below and only keep the simple echo command, it executes just fine. So the problem has something to do with the SQL task. Only there is no log and I can't figure out what's wrong.
I have triple-checked all parameters and they are correct. I run the build on a dedicated VM and I have added the VM's IP address to the firewall rules of the SQL server.
What else can I do to troubleshoot this problem?
- script: echo "hello"
- task: SqlAzureDacpacDeployment@1
  displayName: 'Azure SQL InlineSqlTask'
  inputs:
    azureSubscription: 'mysubscription id'
    ServerName: my-server.database.windows.net
    DatabaseName: mydb
    SqlUsername: admin
    SqlPassword: my password
    deployType: sqlTask
    SqlFile: $(workFolder)/Scripts.sql

DevOps Pipeline Automation - Infrastructure and Application deployment

I am facing implementation issues with the below scenario using Azure DevOps pipelines:
Provision two resources in an Azure subscription using ARM templates:
- Azure Container Registry
- Azure Kubernetes Service
Deploy the containerized application code to the Kubernetes cluster.
I am able to perform both of these steps in individual pipelines. I need some help combining/integrating the two pipelines into one.
How can I automate both of the above steps without any manual intervention in the process? The process must be robust enough to handle the application deployment to the same ACR and AKS that were created in the previous step.
Both the infrastructure and the application code reside in the same Git/Azure repository.
First of all, I would recommend that you do not tie the two pipelines together.
While infrastructure as code is important and should be used, it is important to decouple the infrastructure provisioning from the application provisioning, for many good reasons.
Regarding your ask, you should have a task that creates the ACR/AKS, one task that gets the credentials, and another task which deploys to the previously created AKS and ACR.
The flow could be:
ACR — AKS (if you want to configure your AKS with ACR integration, you need to have them sequential) — docker build/push to ACR — deploy the container using a shell script with kubectl, leveraging the credentials from the second step.
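As a rough illustration of that flow (a sketch only; the registry, resource group, cluster and manifest paths are placeholders), the build/deploy step could push the image to the ACR created earlier, fetch the AKS credentials, and apply the manifests with kubectl:

- task: AzureCLI@2
  displayName: 'Build, push and deploy'
  inputs:
    azureSubscription: 'MyServiceConnection'  # placeholder
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      # Build and push the image to the ACR created in the previous step
      az acr build --registry myacr --image myapp:$(Build.BuildId) .
      # Pull the credentials for the AKS cluster created in the previous step
      az aks get-credentials -g myResourceGroup -n myAksCluster
      # Deploy the application manifests
      kubectl apply -f manifests/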
You can use multiple stages in one YAML pipeline. See the simple example below:
The YAML pipeline below has two stages. The first stage does the tasks to provision the infrastructure. The second stage dependsOn the first stage, defines the ACR and AKS in its variables, and then does the tasks to deploy to the Kubernetes cluster.
trigger:
- master

stages:
- stage: Infrastructure_Deployment
  displayName: Infrastructure Deployment
  pool:
    vmImage: windows-latest
  jobs:
  - job: Infrastructure
    steps:
    - task: AzureResourceManagerTemplateDeployment@3
      inputs:
        .....
- stage: Application_Deployment
  displayName: Application Deployment
  dependsOn: Infrastructure_Deployment
  pool:
    vmImage: windows-latest
  variables:
    ACRName: "ACRName"
    AKSName: "AKSName"
    azureSubscriptionEndpoint: ..
    azureContainerRegistry: ..
    azureResourceGroup: ..
  jobs:
  - job: Application
    steps:
    # - task: Docker@2
    #   inputs:
    #     containerRegistry: $(ACRName)
    #     ...
    - powershell: |
        # Log in to Docker with service principal credentials
        docker login $(ACRName).azurecr.io --username $SP_APP_ID --password $SP_PASSWD
        docker build
        docker push
    - task: Kubernetes@1
      displayName: kubectl apply
      inputs:
        connectionType: Azure Resource Manager
        azureSubscriptionEndpoint: $(azureSubscriptionEndpoint)
        azureResourceGroup: $(azureResourceGroup)
        kubernetesCluster: $(AKSName)
        ....
Update: dynamically capture the ACR and AKS names along with the ACR login credentials
You can use an Azure PowerShell task to get the above data. In order to use the Azure PowerShell task, you need to create an Azure Resource Manager service connection.
Then you can write custom inline scripts in the task. See the example below:
- task: AzurePowerShell@5
  name: AzurePowershell
  inputs:
    azureSubscription: 'Microsoft Azure Internal Consumption (daaeef3e-d7fe-49e8-baaa-b3df9d072825)'
    ScriptType: InlineScript
    Inline: |
      $json = (Get-Content "$(system.defaultworkingdirectory)\template.json" -Raw) | ConvertFrom-Json
      $AKS = $json.resources | where {$_.type -eq "Microsoft.ContainerService/managedClusters"} | select name
      $ACR = $json.resources | where {$_.type -eq "Microsoft.ContainerRegistry/registries"} | select name
      echo "##vso[task.setvariable variable=ACRName;isOutput=true]$($ACR.name)"
      echo "##vso[task.setvariable variable=AKSName;isOutput=true]$($AKS.name)"
      $ACRCred = Get-AzContainerRegistryCredential -ResourceGroupName "MyResourceGroup" -Name $($ACR.name)
      echo "##vso[task.setvariable variable=UserName;isOutput=true]$($ACRCred.Username)"
      echo "##vso[task.setvariable variable=Password;isOutput=true]$($ACRCred.Password)"
    azurePowerShellVersion: LatestVersion
You can then get these variables in the following stage by referring to stageDependencies.stageName.jobName.outputs['stepName.variableName'].
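For example, assuming the stage/job names Infrastructure_Deployment and Infrastructure from the pipeline above and the step name AzurePowershell, the second stage could map the outputs like this (a sketch):

- stage: Application_Deployment
  dependsOn: Infrastructure_Deployment
  variables:
    # map the variables set with ##vso[task.setvariable ...;isOutput=true]
    ACRName: $[ stageDependencies.Infrastructure_Deployment.Infrastructure.outputs['AzurePowershell.ACRName'] ]
    AKSName: $[ stageDependencies.Infrastructure_Deployment.Infrastructure.outputs['AzurePowershell.AKSName'] ]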
See here for more about Azure PowerShell.
