Azure DevOps Pipeline Azure Blob Storage upload file 403 Forbidden Exception - azure

Summary
I'm creating a CI/CD provisioning pipeline for a new Azure Storage Account within an Azure DevOps pipeline, and attempting to upload some files into Blob Storage using AzCopy running from an Azure PowerShell task in the pipeline.
The Error
The script runs successfully from my local machine but when running in the Azure DevOps pipeline I get the following error (ErrorDateTime is just an obfuscated ISO 8601 formatted datetime):
System.Management.Automation.RemoteException: [ErrorDateTime][ERROR] Error parsing destination location
"https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate
destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
[error][ErrorDateTime][ERROR] Error parsing destination location "https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
[debug]Processed: ##vso[task.logissue type=error][ErrorDateTime][ERROR] Error parsing destination location "https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
Error record:
This request is not authorized to perform this operation.
Assumptions
The storage account has been set up to only allow access from specific VNets and IP addresses.
It looks like the firewall or credentials are somehow configured wrongly, but the ServicePrincipal running the script has been used successfully in other sibling pipeline tasks. To rule these problems out, I've temporarily given the ServicePrincipal Subscription Owner permissions, and the Storage Account's Firewall Rules tab has "Allow trusted Microsoft Services to access this storage account" enabled.
What I've tried...
I've successfully run the script from my local machine with my IP Address being in the allowed list.
If I enable "Allow access from All networks" on the Storage Account Firewall rules then the script runs and the file is uploaded successfully.
It appears as if the Azure Pipeline agents, running in their own VNet, don't have access to my Storage Account, but I would have thought that requirement would be satisfied by setting "Allow trusted Microsoft Services to access this storage account" in the Firewall settings.
I'm using the following line within the Azure PowerShell task. I'm confident in the values because everything works when "All networks" is enabled, or when my IP address is allowed and I run it locally.
.\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
Any thoughts or guidance would be appreciated.
Thanks,
SJB

People seem to be getting mixed results in this GitHub issue, but the AzureFileCopy@4 task works (at least for us) after adding the "Storage Blob Data Contributor" role to the ARM connection's service principal (scoped to the storage account itself). The below is the only necessary step in a pipeline that deploys a repo as a static website in a blob container:
- task: AzureFileCopy@4
  displayName: 'Copy files to blob storage: $(storageName)'
  inputs:
    SourcePath: '$(build.sourcesDirectory)'
    Destination: AzureBlob
    storage: $(storageName)
    ContainerName: '$web'
    azureSubscription: 'ARM Connection goes here' # needs a role assignment before it'll work
(Of course, if you're using Azure CDN like we are, the next step is to clear the CDN endpoint's cache, but that has nothing to do with the blob storage error)
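For reference, the role assignment itself can be scripted. A minimal sketch with the Az PowerShell module, using placeholder names for the service principal and storage account (swap in your own values):
# Sketch: grant the ARM connection's service principal "Storage Blob Data Contributor"
# on the storage account. All names below are placeholders.
$sp      = Get-AzADServicePrincipal -DisplayName 'my-arm-connection-sp'
$account = Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'mystorageaccount'

New-AzRoleAssignment -ObjectId $sp.Id `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope $account.Id
Keep in mind that role assignments can take a few minutes to propagate before the copy task starts succeeding.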

After doing further research I noticed the following raised issue - that Azure DevOps isn't considered a trusted Microsoft Service from a Storage Account perspective.
https://github.com/MicrosoftDocs/azure-docs/issues/19456
My temporary workaround is to:
Set the DefaultAction to Allow before the copy, thereby allowing "All networks" access.
Set the DefaultAction back to Deny after the copy action, ensuring my VNet rules are enforced again.
Try
{
    # Temporarily open the storage firewall so the hosted agent can reach the account
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Allow
    .\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
}
Catch
{
    # Handle errors...
}
Finally
{
    # Re-lock the firewall even if the copy fails
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Deny
}
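(If you're on the newer Az module rather than AzureRM, the equivalent sketch, assuming the same variables, would look roughly like this, with the same caveat that the account is briefly open to all networks:)
Try
{
    # Az-module equivalent of the AzureRM cmdlets used above
    Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Allow
    .\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
}
Catch
{
    # Handle errors...
}
Finally
{
    Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Deny
}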
Thanks,
SJB

Have you considered using the Azure DevOps task "Azure File Copy" instead of a PowerShell script?
see: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-file-copy?view=azure-devops

Related

Azure blob storage - SAS - Data Factory

I was able to test the blob connection successfully, but when I attempt to browse the storage path it shows the error below (screenshot).
Full error:
Failed to load
Blob operation failed for: Blob Storage on container '' and path '/' get failed with 'The remote server returned an error: (403) Forbidden.'. Possible root causes: (1). Grant service principal or managed identity appropriate permissions to do copy. For source, at least the “Storage Blob Data Reader” role. For sink, at least the “Storage Blob Data Contributor” role. For more information, see https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#service-principal-authentication. (2). It's possible because some IP address ranges of Azure Data Factory are not allowed by your Azure Storage firewall settings. Azure Data Factory IP ranges please refer https://docs.microsoft.com/en-us/azure/data-factory/azure-integration-runtime-ip-addresses. If you allow trusted Microsoft services to access this storage account option in firewall, you must use https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#managed-identity. For more information on Azure Storage firewalls settings, see https://docs.microsoft.com/en-us/azure/storage/common/storage-network-security?tabs=azure-portal.. The remote server returned an error: (403) Forbidden.StorageExtendedMessage=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Context: I'm trying to copy data from a SQL DB to Snowflake and I am using Azure Data Factory for that. Since this doesn't publish directly, I enabled staged copy and connected Blob Storage as the staging store.
I already checked the networking settings and access is set to all networks. I'm not sure what I'm missing here, because I found a YouTube video where it works, but it doesn't show an issue related/similar to this one. https://www.youtube.com/watch?v=5rLbBpu1f6E
I also tried leaving the storage path empty, but the trigger for the copy data pipeline isn't successful either.
Full error from trigger:
Operation on target Copy Contacts failed: Failure happened on 'Sink' side. ErrorCode=FileForbidden,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error occurred when trying to upload a blob, detailed message: dbo.vw_Contacts.txt,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.WindowsAzure.Storage.StorageException,Message=The remote server returned an error: (403) Forbidden.,Source=Microsoft.WindowsAzure.Storage,StorageExtendedMessage=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
I created a Blob Storage account and generated a SAS token for it. I created a Blob Storage linked service using the SAS URI, and it was created successfully.
Image for reference:
When I tried to retrieve the path I got the error below.
I changed the networking settings of the storage account to "Enabled from all networks".
Image for reference:
I tried to retrieve the path again in Data Factory and it worked; I was able to retrieve the path.
Image for reference:
Another way to resolve this issue is by whitelisting the Data Factory IP addresses on the storage account firewall.
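A rough sketch of that whitelisting with the Az PowerShell module, using placeholder resource names (the relevant Azure Data Factory IP ranges are the ones referenced in the error message above):
# Keep the firewall's default action as Deny, but add an allow rule for a specific
# public IP or CIDR range. Resource names and the range below are placeholders.
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName 'my-rg' -Name 'mystorageaccount' -DefaultAction Deny
Add-AzStorageAccountNetworkRule -ResourceGroupName 'my-rg' -Name 'mystorageaccount' -IPAddressOrRange '203.0.113.0/24'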
From the error message:
'The remote server returned an error: (403) Forbidden.'
It's likely the authentication method you're using doesn't have enough permissions on the blob storage to list the paths. I would recommend using the Managed Identity of the Data Factory to do this data transfer.
1. Take the name of the Data Factory.
2. Assign the Storage Blob Data Contributor role, in the context of the container or the storage account, to the ADF Managed Identity from step 1 (a sketch follows below).
3. On your Blob linked service inside Data Factory, choose the Managed Identity authentication method.
Also, if you stage your data transfer on the blob storage, you have to make sure the identity can write to the blob storage, and also has bulk-load permissions on SQL Server.
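A minimal sketch of that role assignment (step 2) with the Az PowerShell module, assuming placeholder names for the Data Factory and the storage account:
# Grant the Data Factory's system-assigned managed identity data access to blobs.
# Resource names below are placeholders.
$adf     = Get-AzDataFactoryV2 -ResourceGroupName 'my-rg' -Name 'my-data-factory'
$account = Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'mystorageaccount'

New-AzRoleAssignment -ObjectId $adf.Identity.PrincipalId `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope $account.Id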

The gateway did not receive a response from 'Microsoft.Sql' within the specified time period

I am running Terraform via an Azure DevOps pipeline in order to create an Azure MSSQL server along with Blob Auditing Policies. However, when I run the pipeline, I get the following error after it runs for a while. Can someone please help me identify the root cause of this issue?
Error: failure in issuing create/update request for SQL Database "Identity" Blob Auditing Policies(SQL Server ""/ Resource Group ""): sql.ExtendedDatabaseBlobAuditingPoliciesClient#CreateOrUpdate: Failure responding to request: StatusCode=504 -- Original Error: autorest/azure: Service returned an error. Status=504 Code="GatewayTimeout" Message="The gateway did not receive a response from 'Microsoft.Sql' within the specified time period."
on azure-sql-server.tf line 92, in resource "azurerm_mssql_database" "sqlserver":
92: resource "azurerm_mssql_database" "sqlserver" {
To resolve the above error, please try the following:
Try removing the azurerm_mssql_database_extended_auditing_policy resource and replacing it with the old extended_auditing_policy block within azurerm_mssql_database.
Using a storage account for auditing requires enabling 'Allow trusted Microsoft services to access this storage account' on the storage account.
Make sure the Storage Blob Data Contributor role is assigned on the storage account created from Terraform.
Enable System Managed Identity on the existing SQL Server (a sketch of this and the role assignment follows below).
As a workaround, try editing the state file to remove the "status": "tainted", line from the "azurerm_mssql_server" resource.
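A rough sketch of the managed identity and role assignment points above, using the Az PowerShell module with placeholder names (in practice you would normally express this in Terraform itself):
# Enable the SQL server's system-assigned identity and grant it blob data access
# on the auditing storage account. Resource names below are placeholders.
$server  = Set-AzSqlServer -ResourceGroupName 'my-rg' -ServerName 'my-sql-server' -AssignIdentity
$account = Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'myauditstorage'

New-AzRoleAssignment -ObjectId $server.Identity.PrincipalId `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope $account.Id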
For more detail, please refer to the links below:
azure - Creating SQL Server vulnerability assessment resource using a private Storage Account fails - Stack Overflow.
mssql_server: breaking change in the azure api · Issue #8915 · hashicorp/terraform-provider-azurerm · GitHub.
Export database fails with "The gateway did not receive a response from 'Microsoft.Sql'" - Microsoft Q&A.

Azure ADF using Azure Batch throws Shared Access Signature generation error

I am working on a simple Azure Data Factory pipeline where I have added a Batch Service and specified the Batch Service account (which I created through a linked service and tested that the connection works). In the command I am just running a simple "ls" command, and when I do a debug run I get this error: "Cannot create Shared Access Signature unless Account Key credentials are used." I have the following linked services: "Azure Batch", "Azure Blob Storage" and Key Vault (where we store the access key). All linked service connections are working properly.
Any help on how to fix this error would be appreciated: "Cannot create Shared Access Signature unless Account Key credentials are used."
Azure Batch Linked service:
Azure Storage Linked service:
Azure Data factory pipeline:
The issue happens because you use "Managed Identity" to connect ADF to the Storage. It will say "successful" when doing a connection test on the linked service, but when this storage is used for Batch, the linked service needs to use the "Account Key" authentication type (see here).
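If it helps, a small sketch of fetching the account key you would plug into that storage linked service, using placeholder resource names (ideally the key goes into Key Vault rather than straight into the linked service definition):
# Retrieve the storage account keys; use one of them for "Account Key" authentication.
# Resource names are placeholders.
$keys = Get-AzStorageAccountKey -ResourceGroupName 'my-rg' -Name 'mystorageaccount'
$keys[0].Value   # store this in Key Vault and reference it from the linked service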

Azure DevOps: Service connection is not being recognized

I can't seem to authorize access to my Azure subscription in Azure DevOps to run a build whenever a commit is pushed to master. I keep getting the below error:
Also, when I click Authorize resources, it says the authorization was successful, but the next time I run the pipeline, I get the same exact error. I verified in Project settings -> Service connections that I have an active connection to the subscription.
How can I get around this issue? When I go to Deployment Center in Azure Functions and wire up the connection there, it creates a task-based pipeline, but I want to use yaml.
The above error indicates that the azureSubscription you specified in your Azure Function deployment task does not exist, or that you do not have permission to use it.
If the service connection is already correctly set up but you still encounter the above error, you can follow the steps below to troubleshoot the issue.
1, Check your yaml pipeline.
The azure subscription is validated at compile time. If you use variables to reference the azure subscription in your yaml pipeline, you need to make sure the variable can be resolved at compile time.
You can check out this thread.
2, Check the service connection security setting.
Go to project settings-->Service Connections under Pipelines--> Select your azure service connection --> More settings(3 dots)-->Security-->Try adding your pipeline to the Pipeline permissions list.
If the azure subscription service connection is not set up yet, you need to create a service connection of the Azure Resource Manager type to connect to your azure subscription. See the steps below:
1, Go to project settings-->Service Connections under Pipelines--> New Service connection-->Select Azure Resource Manager--> Next
2, Then select the Authentication method. If your azure devops is connected to AAD, you can select Service principal (automatic) as the authentication method. This will automatically create a service principal in your Azure AD.
3, If you want to create the service principal yourself, you can select Service principal (manual). See the documents below on creating a service principal in Azure:
Use the portal to create an Azure Active Directory application and a service principal that can access resources
Use Azure PowerShell to create an Azure service principal with a certificate
Then enter the related information in the service connection configuration page.
After your azure subscription service connection is created, you can use it in your yaml pipeline task by specifying the service connection name. See the example below:
- task: AzureFunctionApp@1
  displayName: Azure Function App Deploy
  inputs:
    azureSubscription: myAzureSubscription
Note: You need to add the correct role assignment for the above service principal to enable it to deploy to your azure resources (a sketch follows below).
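As an illustration of that note, a rough sketch of granting the service principal Contributor on a target resource group with the Az PowerShell module; the names are placeholders and the right role/scope depends on what your pipeline deploys:
# Grant the service connection's service principal "Contributor" on the resource group
# that holds the Function App. All names below are placeholders.
$sp = Get-AzADServicePrincipal -DisplayName 'my-devops-connection-sp'
$rg = Get-AzResourceGroup -Name 'my-function-rg'

New-AzRoleAssignment -ObjectId $sp.Id `
    -RoleDefinitionName 'Contributor' `
    -Scope $rg.ResourceId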
You must create a new connection from the task itself (you may need to use the advanced options to add an existing service principal).
under "Azure subscription" click the name of the subscription you wish to use
Click the drop down next to "Authorize" and open advanced options
Click " use the full version of the service connection dialog."
Enter all your credentials and hit save
I spent a while trying to figure out why I got the same problem. I compared my yaml to another yaml I had worked on previously and couldn't spot any problems, and also verified the service connections.
But as @Levi Lu-MSFT mentions, verifying the yaml led me to find what caused my issue, so I thought I'd share it here even though it's not 100% related:
My variables weren't indented correctly. I was a bit tired and thought DevOps was just goofing with me. So verify that your yaml is properly set up; sometimes it can be really small things that cause these issues.

Task to Deploy Artifact to a container Storage Outside of my account

I am currently creating a CI pipeline for the front end of one of our clients.
We need to copy the files coming from our repo to the storage container of the company that manages the operational part (we are only providing the code).
So, the company that will manage the infrastructure has given us the storage account name (testdeploy), the container name (artifact-deploy) and the key (securekey).
I have managed to connect to the storage via Azure Storage Explorer, but now I need to deploy the artifact to this container via the CI.
The problem is, I don't know how, and I can't find documentation on how to proceed; every doc talks about deploying to a container in the same subscription.
But I do not have access to this container; I only have its name and key.
Here is the YAML of what I have already set up, I do not know if it helps:
steps:
- task: AzureFileCopy@2
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_listes-Azure/buildtest'
    azureSubscription: 'Paiement à l''utilisation(my_subscription)'
    Destination: AzureBlob
    storage: testdeploy
    ContainerName: 'artifact-deploy/front'
    AdditionalArgumentsForBlobCopy: 'securekey'
    outputStorageUri: 'https://testdeply.blob.core.windows.net/'
    outputStorageContainerSasToken: 'securekey'
Of course, when I do this I get the following error message:
2019-10-25T10:45:51.1809999Z ##[error]Storage account: fprplistesdeploy not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
Since it's not in my subscription scope, it can't access it.
What am I doing wrong?
I am using the AzureFileCopy task; is it the right choice?
How can I set up the AzureFileCopy task to copy to a storage account that is not in my subscription scope, knowing that the only things I have are an account name and a key?
Thanks in advance!
What you basically have to do is create and use a Shared Access Signature (SAS) to deploy resources into this blob container. Since you have the storage account key, you can create a SAS token with Azure Storage Explorer.
Then use Azure Cloud Shell or the Azure CLI on your local machine for testing purposes. Try to copy a file into the blob container using a SAS token for authorization. If you have problems with authorization using a SAS token, you can also test access using Azure Storage Explorer. Such basic tasks are widely known and well documented.
Finally, find a way to run the file copy command you used while testing in an Azure Pipeline task. If the Azure File Copy task does not fit your use case, use a more generic task like an Azure CLI task; from reading over the docs it might be that it does not support your use case, although the task name indicates it should. Find out how to access the artifact provided by the build pipeline and copy the file resources into the storage account. If that basically works, find out how to improve it. Voila.
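To make that concrete, here is a rough PowerShell sketch of minting a container SAS from the key you were given and uploading one file with it (the account and container names come from the question, everything else is illustrative; the Azure CLI equivalent works just as well):
# Build a context from the account key, mint a short-lived container SAS,
# then upload a single file with the SAS context. Values are illustrative.
$keyCtx = New-AzStorageContext -StorageAccountName 'testdeploy' -StorageAccountKey $env:STORAGE_KEY
$sas    = New-AzStorageContainerSASToken -Name 'artifact-deploy' -Permission rwl `
            -ExpiryTime (Get-Date).AddHours(2) -Context $keyCtx

$sasCtx = New-AzStorageContext -StorageAccountName 'testdeploy' -SasToken $sas
Set-AzStorageBlobContent -File '.\front.zip' -Container 'artifact-deploy' -Blob 'front/front.zip' -Context $sasCtx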
So I managed to do it.
It turns out you can't do it via AzureFileCopy; this task can't upload to a container outside your subscription.
You must use an Azure CLI task; here is the script I used:
#!/bin/bash
# Upload the build artifact to the external container using the account key
# (the account name is supplied separately, e.g. via a pipeline variable or --account-name).
az storage blob upload --container-name artifact --file $(System.DefaultWorkingDirectory)/artifact_deply/buildtest/front.zip --name front --account-key securekey
I changed all the variables, but the idea is there (I declared the account name in the variable panel of Azure DevOps).
I used the account key because I had errors with the SAS URL, but I think you can easily use an Azure DevOps variable to pass the SAS token URL instead.
And I created a task before this one to zip the whole folder, so it's easier to manage.
