How to secure Azure Function's backend Storage account?

I have created an Azure storage account with private endpoints for blob, file, table and queue. I have set the 'Public network access' to 'Allow Azure services on the trusted services list to access this storage account' for security.
However, Azure Function creation (via the portal) fails unless 'Public network access' is turned on.
Is there a better way to do this other than enabling public network access on the storage account?
I have checked the link below, and nothing there suggests that the storage account needs to be publicly accessible.
https://learn.microsoft.com/en-us/azure/azure-functions/storage-considerations?tabs=azure-cli
Error:
"properties": {
"statusCode": "BadRequest",
"serviceRequestId": null,
"statusMessage": "{\"Code\":\"BadRequest\",\"Message\":\"Creation of storage file share failed with: 'The remote server returned an error: (403) Forbidden.'. Please check if the storage account is accessible.\",\"Target\":null,\"Details\":[{\"Message\":\"Creation of storage file share failed with: 'The remote server returned an error: (403) Forbidden.'. Please check if the storage account is accessible.\"},{\"Code\":\"BadRequest\"},{\"ErrorEntity\":{\"ExtendedCode\":\"99022\",\"MessageTemplate\":\"Creation of storage file share failed with: '{0}'. Please check if the storage account is accessible.\",\"Parameters\":[\"The remote server returned an error: (403) Forbidden.\"],\"Code\":\"BadRequest\",\"Message\":\"Creation of storage file share failed with: 'The remote server returned an error: (403) Forbidden.'. Please check if the storage account is accessible.\"}}],\"Innererror\":null}",
"eventCategory": "Administrative",
"entity": "/subscriptions/XXXXXXXXXXXX/resourcegroups/rg-xxxxxxx/providers/Microsoft.Web/sites/func-xxxxxxx",
"message": "Microsoft.Web/sites/write",
"hierarchy": "xxxxx/MG/MG/xxxxxx"
}

Find the function app's outbound IP addresses and grant them access on the storage account firewall as internet IP ranges. If your function app is VNet-integrated, grant access from the virtual network instead.
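As a rough Azure PowerShell sketch (all resource names below are placeholders), this could look like the following; it reads the function app's outbound IPs and adds them, or the integrated subnet, as storage network rules:

# Assumptions: Az.Websites and Az.Storage modules; placeholder resource names.
$rg      = 'rg-xxxxxxx'
$storage = 'mystorageaccount'
$funcApp = Get-AzWebApp -ResourceGroupName $rg -Name 'func-xxxxxxx'

# Option 1: allow each outbound IP of the function app on the storage firewall.
$funcApp.OutboundIpAddresses -split ',' | ForEach-Object {
    Add-AzStorageAccountNetworkRule -ResourceGroupName $rg -Name $storage -IPAddressOrRange $_
}

# Option 2 (preferred): if the app is VNet-integrated, allow its integration subnet instead.
# Add-AzStorageAccountNetworkRule -ResourceGroupName $rg -Name $storage `
#     -VirtualNetworkResourceId '<resource ID of the integrated subnet>'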

Related

Azure blob storage - SAS - Data Factory

I was able to test the blob connection successfully, but when I attempt to browse the storage path it shows this error (screenshot).
Full error:
Failed to load
Blob operation failed for: Blob Storage on container '' and path '/' get failed with 'The remote server returned an error: (403) Forbidden.'. Possible root causes: (1). Grant service principal or managed identity appropriate permissions to do copy. For source, at least the “Storage Blob Data Reader” role. For sink, at least the “Storage Blob Data Contributor” role. For more information, see https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#service-principal-authentication. (2). It's possible because some IP address ranges of Azure Data Factory are not allowed by your Azure Storage firewall settings. Azure Data Factory IP ranges please refer https://docs.microsoft.com/en-us/azure/data-factory/azure-integration-runtime-ip-addresses. If you allow trusted Microsoft services to access this storage account option in firewall, you must use https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#managed-identity. For more information on Azure Storage firewalls settings, see https://docs.microsoft.com/en-us/azure/storage/common/storage-network-security?tabs=azure-portal.. The remote server returned an error: (403) Forbidden.StorageExtendedMessage=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Context: I'm trying to copy data from a SQL database to Snowflake, and I am using Azure Data Factory for that. Since this doesn't publish without it, I enabled staged copy and connected blob storage.
I already checked the networking settings and they are set to allow all networks. I'm not sure what I'm missing here, because I found a YouTube video where this works, but it doesn't show an issue similar to this one: https://www.youtube.com/watch?v=5rLbBpu1f6E.
I also tried leaving the storage path empty, but triggering the copy data pipeline isn't successful either.
Full error from trigger:
Operation on target Copy Contacts failed: Failure happened on 'Sink' side. ErrorCode=FileForbidden,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error occurred when trying to upload a blob, detailed message: dbo.vw_Contacts.txt,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.WindowsAzure.Storage.StorageException,Message=The remote server returned an error: (403) Forbidden.,Source=Microsoft.WindowsAzure.Storage,StorageExtendedMessage=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
I created a blob storage account and generated a SAS token for it. Then I created a blob storage linked service using the SAS URI; it was created successfully.
Image for reference:
When I tried to retrieve the path, I got the error below.
I changed the networking settings of the storage account by selecting 'Enabled from all networks'.
Image for reference:
I tried to retrieve the path again in Data Factory, and this time it worked; I was able to retrieve the path.
Image for reference:
Another way to resolve this issue is by whitelisting the Data Factory IP addresses in the storage account firewall.
From the error message:
'The remote server returned an error: (403) Forbidden.'
It's likely the authentication method you're using doesn't have enough permissions on the blob storage to list the paths. I would recommend using the Managed Identity of the Data Factory to do this data transfer.
1. Take the name of the Data Factory (its system-assigned managed identity has the same name).
2. Assign the Storage Blob Data Contributor role, scoped to the container or the storage account, to the ADF managed identity from step 1.
3. On your blob linked service inside Data Factory, choose the managed identity authentication method.
Also, if you stage your data transfer through blob storage, you have to make sure that identity can write to the blob storage, and that it has the required bulk permissions on SQL Server. A sketch of the role assignment follows below.
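As a rough Azure PowerShell sketch (factory, resource group, and storage names are placeholders), the role assignment for the Data Factory's system-assigned managed identity could look like this:

# Assumptions: Az.DataFactory and Az.Resources modules; placeholder names.
$rg  = 'my-rg'
$adf = Get-AzDataFactoryV2 -ResourceGroupName $rg -Name 'my-data-factory'

# Scope can be the storage account or a single container under it.
$scope = '/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/mystorageaccount'

# Grant the factory's managed identity write access to blobs (the sink side needs Contributor).
New-AzRoleAssignment -ObjectId $adf.Identity.PrincipalId `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope $scope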

Trying to set up a linked service through a registered app to Azure Data Lake Storage and keep getting 24200 error

I am new to Azure. We have Azure Data Lake Storage set up, and I am trying to create a linked service from Data Factory to the Azure Data Lake Storage Gen2 account. It keeps failing when I test the linked service against the Data Lake Storage. As far as I can see, I have granted the "Storage Blob Data Contributor" role to the user on the Data Lake Storage account, but I still keep getting a permission denied error when I test the linked service.
ADLS Gen2 operation failed for: Storage operation '' on container 'testconnection' get failed with 'Operation returned an invalid status code 'Forbidden''. Possible root causes: (1). It's possible because the service principal or managed identity don't have enough permission to access the data. (2). It's possible because some IP address ranges of Azure Data Factory are not allowed by your Azure Storage firewall settings. Azure Data Factory IP ranges please refer https://learn.microsoft.com/en-us/azure/data-factory/azure-integration-runtime-ip-addresses.. Account: 'dlsisrdatapoc001'. ErrorCode: 'AuthorizationFailure'. Message: 'This request is not authorized to perform this operation.'.
What I observed is that when I open the network to all (public) on the Data Lake Storage account, it works; when I restrict the firewall with CIDR ranges, it fails. I couldn't narrow down the cause of the problem. I do have "Allow Azure services on the trusted services list to access this account" checked.
Completely lost
As mentioned in the error description, the error usually occurs if you don't have sufficient permissions to perform the action or if the required IPs are not added to the firewall settings of your storage account.
To resolve the error, check that you have added the Storage Blob Data Contributor role for your managed identity, along with the user, like below:
Go to Azure Portal -> Storage Accounts -> Your Storage Account -> Access Control (IAM) -> Add role assignment
Make sure to select the managed identity that matches the authentication method you selected while creating the linked service.
As mentioned in this Microsoft doc, make sure to add all the required IPs based on your resource's location and service tag.
Download the service tags JSON file to find the IP ranges for the service tag in your resource's location and add them to the firewall settings, like below:
Make sure to select the Resource type as Microsoft.DataFactory/factories while choosing the CIDR ranges. A scripted sketch of this follows below.
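If you prefer to script this, here is a rough Azure PowerShell sketch (region, resource group, and account names are placeholders) that pulls the regional DataFactory service tag prefixes and adds the IPv4 ranges to the storage firewall:

# Assumptions: Az.Network and Az.Storage modules; placeholder names.
$rg       = 'my-rg'
$account  = 'dlsisrdatapoc001'
$location = 'eastus'

# The service tag discovery API returns address prefixes per tag and region.
$tags   = Get-AzNetworkServiceTag -Location $location
$adfTag = $tags.Values | Where-Object { $_.Name -eq "DataFactory.$location" }

# Storage IP rules accept public IPv4 ranges only, and /32 must be passed as a bare IP.
# Note the firewall also has a limit on the number of IP rules per account.
$adfTag.Properties.AddressPrefixes |
    Where-Object { $_ -notmatch ':' } |
    ForEach-Object {
        $range = $_ -replace '/32$', ''
        Add-AzStorageAccountNetworkRule -ResourceGroupName $rg -Name $account -IPAddressOrRange $range
    }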
For more details, refer to the links below:
Error when I am trying to connect between Azure Data factory and Azure Data lake Gen2 by Anushree Garg
Storage Accoung V2 access with firewall, VNET to data factory V2 by Cindy Pau

The gateway did not receive a response from 'Microsoft.Sql' within the specified time period

I am running Terraform via an Azure DevOps pipeline in order to create an Azure SQL (MSSQL) database along with blob auditing policies. However, when I run the pipeline, I get the following error after it has been running for a while. Can someone please help me identify the root cause of this issue?
Error: failure in issuing create/update request for SQL Database "Identity" Blob Auditing Policies(SQL Server ""/ Resource Group ""): sql.ExtendedDatabaseBlobAuditingPoliciesClient#CreateOrUpdate: Failure responding to request: StatusCode=504 -- Original Error: autorest/azure: Service returned an error. Status=504 Code="GatewayTimeout" Message="The gateway did not receive a response from 'Microsoft.Sql' within the specified time period."
on azure-sql-server.tf line 92, in resource "azurerm_mssql_database" "sqlserver":
92: resource "azurerm_mssql_database" "sqlserver" {
To resolve the above error, please try the following:
Try removing the azurerm_mssql_database_extended_auditing_policy resource and replacing it with the old extended_auditing_policy block within azurerm_mssql_database.
Using a storage account for auditing requires enabling 'Allow trusted Microsoft services to access this storage account' on the storage account.
Make sure the Storage Blob Data Contributor role is assigned on the storage account created from Terraform.
Enable a system-assigned managed identity on the existing SQL Server (a sketch of these identity and role steps follows after this list).
As a workaround, try editing the state file to remove the "status": "tainted", line from the "azurerm_mssql_server" resource.
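For the identity and role-assignment steps above, a rough Azure PowerShell sketch (server, resource group, and storage names are placeholders) might look like this:

# Assumptions: Az.Sql, Az.Resources, and Az.Storage modules; placeholder names.
$rg      = 'my-rg'
$server  = 'my-sql-server'
$storage = 'myauditstorage'

# Enable a system-assigned managed identity on the SQL server.
$sql = Set-AzSqlServer -ResourceGroupName $rg -ServerName $server -AssignIdentity

# Grant that identity write access to the auditing storage account.
$scope = (Get-AzStorageAccount -ResourceGroupName $rg -Name $storage).Id
New-AzRoleAssignment -ObjectId $sql.Identity.PrincipalId `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope $scope

# Allow trusted Microsoft services to bypass the storage firewall.
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName $rg -Name $storage -Bypass AzureServices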
For more details, refer to the links below:
azure - Creating SQL Server vulnerability assessment resource using a private Storage Account fails - Stack Overflow.
mssql_server: breaking change in the azure api · Issue #8915 · hashicorp/terraform-provider-azurerm · GitHub.
Export database fails with "The gateway did not receive a response from 'Microsoft.Sql'" - Microsoft Q&A.

Azure DevOps Pipeline agent fails while running Terraform Plan with UnAuthorized error while connecting to a Storage Account

I have a storage account which has:
a) Microsoft network routing selected.
b) 'Publish route-specific endpoints' enabled for Microsoft network routing only.
I have an Azure DevOps pipeline agent running terraform plan. Before running the plan, I get the public IP of the VM (using curl) and run a bash script to add that public IP to the network ACL of the storage account.
However, the plan fails with a "not authorized" error.
As soon as I also select the 'Internet routing' publish option, the plan starts working.
Can anyone shed some light on why this is happening?
PS: attaching the error details from the pipeline:
Error: Error retrieving Container "bootdiag" (Account "xxxxxxxxx" / Resource Group "xx-dev-xx-xxx-001"): containers.Client#GetProperties: Failure responding to request: StatusCode=403 -- Original Error: autorest/azure: Service returned an error. Status=403 Code="AuthorizationFailure" Message="This request is not authorized to perform this operation.\nRequestId:f01c457e-d01e-0036-38b5-f25ba0000000\nTime:2021-01-25T00:57:41.2404471Z"
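For reference, a rough Azure PowerShell sketch of the routing-preference settings being discussed (resource names are placeholders mirroring the error above); publishing the internet routing endpoints is the setting that made the plan work here:

# Assumptions: Az.Storage module; placeholder names.
# Keep Microsoft network routing as the routing choice, but also publish the
# route-specific internet endpoints alongside the Microsoft routing endpoints.
Set-AzStorageAccount -ResourceGroupName 'xx-dev-xx-xxx-001' -Name 'xxxxxxxxx' `
    -RoutingChoice MicrosoftRouting `
    -PublishMicrosoftEndpoint $true `
    -PublishInternetEndpoint $true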

Azure DevOps Pipeline Azure Blob Storage upload file 403 Forbidden Exception

Summary
I'm creating a CI/CD provisioning pipeline for a new Azure Storage Account within an Azure DevOps pipeline and attempting to upload some files into blob storage using AzCopy, running from an Azure PowerShell task in the pipeline.
The Error
The script runs successfully from my local machine but when running in the Azure DevOps pipeline I get the following error (ErrorDateTime is just an obfuscated ISO 8601 formatted datetime):
System.Management.Automation.RemoteException: [ErrorDateTime][ERROR] Error parsing destination location
"https://newStorageAccount.blob.core.windows.net/config/import": Failed to validate
destination: One or more errors occurred. The remote server returned an error: (403) Forbidden.
Error record:
This request is not authorized to perform this operation.
Assumptions
The storage account has been set up to only allow access from specific VNets and IP addresses.
It looks like the firewall or credentials are somehow configured wrongly, but the service principal running the script has been used successfully in other sibling pipeline tasks. To troubleshoot, I've temporarily given the service principal Subscription Owner permissions, and the storage account's Firewall rules tab has "Allow trusted Microsoft services to access this storage account" enabled.
What I've tried...
I've successfully run the script from my local machine with my IP address in the allowed list.
If I enable "Allow access from all networks" in the storage account firewall rules, then the script runs and the file is uploaded successfully.
It appears as if the Azure Pipelines agents running in their own VNet don't have access to my storage account, but I would have thought that requirement would be satisfied by setting "Allow trusted Microsoft services to access this storage account" in the firewall settings.
I'm using the following line within the Azure PowerShell task. I'm happy with the values, because everything works when "All networks" or my IP address is enabled and I run it locally.
.\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
Any thoughts or guidance would be appreciated.
Thanks,
SJB
People seem to be getting mixed results in this GitHub issue, but the AzureFileCopy@4 task works (at least for us) after adding the "Storage Blob Data Contributor" role to the ARM connection's service principal, scoped to the storage account itself. The below is the only necessary step in a pipeline that deploys a repo as a static website in a blob container:
- task: AzureFileCopy@4
  displayName: 'Copy files to blob storage: $(storageName)'
  inputs:
    SourcePath: '$(build.sourcesDirectory)'
    Destination: AzureBlob
    storage: $(storageName)
    ContainerName: '$web'
    azureSubscription: 'ARM Connection goes here' # needs a role assignment before it'll work
(Of course, if you're using Azure CDN like we are, the next step is to clear the CDN endpoint's cache, but that has nothing to do with the blob storage error)
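The role assignment mentioned above could be scripted roughly like this in Azure PowerShell (the application ID and scope are placeholders for your ARM service connection's service principal and target storage account):

# Assumptions: Az.Resources module; placeholder IDs and names.
# The ARM service connection in Azure DevOps is backed by a service principal;
# grant it Storage Blob Data Contributor on the target storage account.
$sp    = Get-AzADServicePrincipal -ApplicationId '<service-connection-app-id>'
$scope = '/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storageName>'

New-AzRoleAssignment -ObjectId $sp.Id `
    -RoleDefinitionName 'Storage Blob Data Contributor' `
    -Scope $scope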
After doing further research, I noticed the following raised issue: Azure DevOps isn't considered a trusted Microsoft service from a storage account perspective.
https://github.com/MicrosoftDocs/azure-docs/issues/19456
My temporary workaround is to:
Set the DefaultAction to Allow, thereby allowing "All networks" access.
Set the DefaultAction back to Deny after the copy action, so that my VNet rules are enforced again.
Try
{
    # Temporarily open the storage firewall (AzureRm module cmdlet; the Az
    # equivalent is Update-AzStorageAccountNetworkRuleSet).
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Allow

    # Upload the files while the default action is Allow.
    .\AzCopy.exe /Source:$SourcePath /Dest:$blobUrlDir /DestKey:$key /Pattern:$FilenamePattern /Y
}
Catch
{
    # Handle errors...
}
Finally
{
    # Always lock the firewall back down, even if the copy fails.
    Update-AzureRmStorageAccountNetworkRuleSet -ResourceGroupName "$ResourceGroupName" -Name "$StorageAccountName" -DefaultAction Deny
}
Thanks,
SJB
Have you considered using the Azure DevOps task "Azure File Copy" instead of a PowerShell script?
see: https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/deploy/azure-file-copy?view=azure-devops
