Getting AuthorizationFailure (403) response on Azure Blob Storage via REST API with SAS Token - azure

After a few days of fighting with this issue, I have decided to post it here in case someone can help out by giving me some pointers.
I have an Azure Storage Account with a blob container of encrypted images. The images have been uploaded via a PUT request (from a browser) to the Azure Blob Storage REST API, using a SAS signature (generated by an Azure VM) and customer-provided keys (x-ms-encryption-* headers). However, when trying to download the images from the Azure VM with a GET request (using curl), I get the following 403 "AuthorizationFailure" error:
<?xml version="1.0" encoding="utf-8"?>
<Error>
<Code>AuthorizationFailure</Code>
<Message>
This request is not authorized to perform this operation.
RequestId:1b203db6-c01e-0013-1553-6adb9b000000
Time:2020-08-04T11:33:42.9494992Z
</Message>
</Error>
Funny thing though, when I perform exactly the same GET request (exactly the same headers) with curl, Postman or the browser from my own local machine and even from other Azure VMs located in different resource groups and different virtual networks, it works as expected and I can successfully download the encrypted image. This confirms that the SAS signature and the headers passed are correctly constructed.
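For reference, the download request is roughly equivalent to this Python sketch (the account, container, blob, SAS token and key below are placeholders; the real request is made with curl using the same headers):

# Minimal sketch of the GET that fails with 403 on the VM but works elsewhere.
import base64
import hashlib
import requests

account = "mystorageaccount"          # placeholder
container = "images"                  # placeholder
blob_name = "photo1.jpg"              # placeholder
sas_token = "sv=2019-12-12&ss=b&..."  # placeholder SAS query string
key = b"0" * 32                       # placeholder 256-bit customer-provided key

url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"
headers = {
    "x-ms-version": "2019-12-12",
    # Customer-provided key (CPK) headers; they must match the key used on upload.
    "x-ms-encryption-key": base64.b64encode(key).decode(),
    "x-ms-encryption-key-sha256": base64.b64encode(hashlib.sha256(key).digest()).decode(),
    "x-ms-encryption-algorithm": "AES256",
}

resp = requests.get(url, headers=headers)
print(resp.status_code)
if resp.status_code != 200:
    print(resp.text)  # the AuthorizationFailure XML shown above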
I have reviewed all the configuration of the VM, Network Security Group, Virtual Network and Storage Account in the Azure Portal and compared it with the other VMs that successfully download the images, and all the settings are exactly the same except for:
In the VM that fails to download the image, I'm authorized as a Contributor for that VM's resource group only, while in the VM that works I'm the Owner of the account.
In the account where I am the Owner, Azure has created a new resource group called NetworkWatcherRG, which seems to have no resources attached to it. For the VM that has the problem, I do not have access to that resource group, so I don't know whether it has been created by Azure or not, or whether it has any impact on the problem I'm facing.
I would really appreciate any ideas or suggestions on what the issue might be.

Related

Azure blob storage - SAS - Data Factory

I was able to test the blob connection successfully, but when I attempt to browse the storage path it shows the error below.
Full error:
Failed to load
Blob operation failed for: Blob Storage on container '' and path '/' get failed with 'The remote server returned an error: (403) Forbidden.'. Possible root causes: (1). Grant service principal or managed identity appropriate permissions to do copy. For source, at least the “Storage Blob Data Reader” role. For sink, at least the “Storage Blob Data Contributor” role. For more information, see https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#service-principal-authentication. (2). It's possible because some IP address ranges of Azure Data Factory are not allowed by your Azure Storage firewall settings. Azure Data Factory IP ranges please refer https://docs.microsoft.com/en-us/azure/data-factory/azure-integration-runtime-ip-addresses. If you allow trusted Microsoft services to access this storage account option in firewall, you must use https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#managed-identity. For more information on Azure Storage firewalls settings, see https://docs.microsoft.com/en-us/azure/storage/common/storage-network-security?tabs=azure-portal.. The remote server returned an error: (403) Forbidden.StorageExtendedMessage=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
Context: I'm trying to copy data from a SQL database to Snowflake, and I am using Azure Data Factory for that. Since this doesn't publish, I enabled the staged copy and connected blob storage.
I have already checked the networking, and it's set to allow all networks. I'm not sure what I'm missing here, because I found a YouTube video where it works, but it doesn't show an issue related or similar to this one: https://www.youtube.com/watch?v=5rLbBpu1f6E.
I also tried leaving the storage path empty, but the trigger for the copy data pipeline doesn't succeed either.
Full error from trigger:
Operation on target Copy Contacts failed: Failure happened on 'Sink' side. ErrorCode=FileForbidden,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error occurred when trying to upload a blob, detailed message: dbo.vw_Contacts.txt,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.WindowsAzure.Storage.StorageException,Message=The remote server returned an error: (403) Forbidden.,Source=Microsoft.WindowsAzure.Storage,StorageExtendedMessage=Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
I created blob storage and generated a SAS token for it. I created a blob storage linked service using the SAS URI, and it was created successfully.
When I tried to retrieve the path, I got the error described above.
I changed the networking settings of the storage account by enabling access from all networks.
I tried to retrieve the path again in Data Factory, and this time it worked; I was able to retrieve the path.
Another way to resolve this issue is by whitelisting the Data Factory IP addresses in the storage account firewall.
From the error message:
'The remote server returned an error: (403) Forbidden.'
It's likely the authentication method you're using doesn't have enough permissions on the blob storage to list the paths. I would recommend using the Managed Identity of the Data Factory to do this data transfer.
1. Take the name of the Data Factory.
2. Assign the "Storage Blob Data Contributor" role, scoped to the container or the storage account, to the ADF Managed Identity from step 1.
3. On your blob linked service inside Data Factory, choose the managed identity authentication method.
Also, if you stage your data transfer on the blob storage, you have to make sure the user can write to the blob storage and also has bulk-load permissions on SQL Server.
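As a quick sanity check that the managed identity (or whichever credential you use) actually has the data-plane role, you can try listing the staging container with the storage SDK. This is only a sketch; it assumes the azure-identity and azure-storage-blob packages, and the account and container names are placeholders:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

account_url = "https://mystorageaccount.blob.core.windows.net"  # placeholder
container_name = "staging"                                      # placeholder

# On an Azure resource this resolves to the managed identity; locally it falls
# back to environment or Azure CLI credentials.
credential = DefaultAzureCredential()
service = BlobServiceClient(account_url=account_url, credential=credential)

container = service.get_container_client(container_name)
for blob in container.list_blobs():  # fails with a 403 error if the role assignment is missing
    print(blob.name)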

ADF Pipeline Errors - RequestContentTooLarge and InvalidContentLink

The ADF pipeline release to the test Data Factory instance is failing with a RequestContentTooLarge error.
So, to overcome the above issue, I modified the pipeline by adding an additional Azure Blob File Copy step that stores the linked templates in a storage account and references them in the pipeline for the deployment. However, when I made the above change, I am getting another error which states: InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?***Sanitized Azure Storage Account Shared Access Signature***'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
I have tried using the SAS token both at the container level and at the storage account level. I have also ensured that the agent and the storage account are in the same VNet, and I have tried removing the firewall restrictions, but it still gives me the same InvalidContentLink error.
The modified pipeline includes the Azure Storage Account step.
How do I resolve this issue?
InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?Sanitized Azure Storage Account Shared Access Signature'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
This error can occur because you are trying to link to a template that might not be present in the storage account.
Make sure you provide the correct URL for the nested template and that it is accessible.
Also, if your storage account has firewall rules enabled, you can't link a nested template from it.
Make sure your storage account, container and blob are publicly available. To achieve this:
Provide a blob-level Shared Access Signature URL: select the file, click on "...", and then click on Generate SAS. (A programmatic alternative is sketched below.)
Refer to the documentation on nested templates for more details.
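If you prefer to generate the blob-level SAS programmatically rather than through the portal, the following sketch shows one way to do it, assuming the azure-storage-blob package (the account name, key, container and blob names are placeholders):

from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

account_name = "xxx"                   # placeholder
account_key = "<storage-account-key>"  # placeholder
container_name = "adf-arm-templates"
blob_name = "ArmTemplate_0.json"

sas = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),       # read-only is enough for linking
    expiry=datetime.utcnow() + timedelta(hours=2),  # keep it valid through the deployment
)

template_link = f"https://{account_name}.blob.core.windows.net/{container_name}/{blob_name}?{sas}"
print(template_link)  # use this URL as the linked template URI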

Azure sql database export to storage blob failed

I tried to export an Azure SQL database to a storage blob, but the operation failed. I have been doing this task daily for the last month, and this issue is new. It shows the following error:
Error encountered during the service operation.Blob https://blob link/dbname-2019-1-16-14-24.bacpac is not writeable. The remote server returned an error: (403) Forbidden.The remote server returned an error: (403) Forbidden.
I had the same problem. I have contacted Azure support and this is their response.
We recently identified a regression in the import/export service that is generating incorrect SAS tokens to the storage accounts.
The engineering team has rolled out the fix, but it might take some time for the fix to get applied worldwide.
Please try the following link to access Azure portal and then perform the export operations:
https://portal.azure.com/?feature.canmodifystamps=true&microsoft_azure_storage=stage1
The portal shows an orange title bar if you open it via the above link, this is expected.
So, it should be already fixed by now.
If you're using SQL scripts for backup following this link, then I suspect that the expiry date of the SHARED ACCESS SIGNATURE has been reached.
Please re-generate SHARED ACCESS SIGNATURE, and then use the new key for backup.
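If you're not sure whether the SAS has expired, you can check the 'se' (signed expiry) parameter of the SAS URL before regenerating it. A rough sketch in Python (the URL is a placeholder):

from datetime import datetime, timezone
from urllib.parse import urlparse, parse_qs

sas_url = "https://myaccount.blob.core.windows.net/backups?sv=2019-12-12&se=2019-01-31T00%3A00%3A00Z&sig=xxx"  # placeholder

params = parse_qs(urlparse(sas_url).query)
expiry = datetime.fromisoformat(params["se"][0].replace("Z", "+00:00"))
if expiry < datetime.now(timezone.utc):
    print(f"SAS expired at {expiry}; regenerate it before running the export.")
else:
    print(f"SAS is still valid until {expiry}.")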
Please let me know if any more issues.
Make sure a firewall rule is not blocking the request. In the Azure portal, go to Storage Accounts → YourStorageAccountName → Firewalls and Virtual Networks (left vertical panel) and set it to allow access from all networks. You can also configure the storage account with exceptions for trusted Microsoft services, as explained here.
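If you'd rather change this setting programmatically than through the portal, a rough sketch with the azure-mgmt-storage and azure-identity packages looks like this (the subscription id, resource group and account name are placeholders):

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import StorageAccountUpdateParameters, NetworkRuleSet

subscription_id = "<subscription-id>"  # placeholder
resource_group = "my-resource-group"   # placeholder
account_name = "mystorageaccount"      # placeholder

client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Allow access from all networks while keeping the trusted Microsoft services exception.
client.storage_accounts.update(
    resource_group,
    account_name,
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(default_action="Allow", bypass="AzureServices")
    ),
)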

Unable to deploy the index and grammar file in KES

I'm using the Knowledge Exploration Service (KES) by Azure. I've prepared a grammar and an index file. Since their size was small, I was able to run the service on my local machine and on an Azure VM.
But now I want to deploy this service. The issue is that when I run the command kes deploy_service, it is unable to download the blob from Azure Storage, even when I try to provide the file from my local machine.
I followed the same steps on an Azure VM and received the same errors.
>kes deploy_service Some.grammar Some.index kes-example
00:00:00 Index: Some.index
00:00:00 ERROR: Invalid value for index parameter: 'Some.index' is not a blob URI.
>kes deploy_service Some.grammar https://storagename.blob.core.windows.net/containername/Some.index kes-example
00:00:00 Index: https://storagename.blob.core.windows.net/containername/Bell.index
00:00:02 ERROR: ResourceNotFound: The storage account 'storagename' was not found.
The container has public access. I can download the file via the browser and even via Azure CLI.
What am I missing here?
EDIT: Adding a sample index file which I've uploaded on Azure Storage with public access. This index file was generated using the Academic example in the documentation.
>kes describe_index https://kesstorage.blob.core.windows.net/kess/Academic.index
ERROR: ResourceNotFound: The storage account 'kesstorage' was not found.
kes.exe is using the old Service Management API. It is querying the API for Storage Accounts in your subscription, but this API predates Azure Resource Manager (ARM), and therefore has no knowledge of ARM Storage Accounts. You will need to use a Classic Storage Account instead.
For a tutorial on how to create a classic storage account, refer to this link: https://learn.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#create-a-storage-account

ACL access abilities for Azure Containers and Blobs

I am looking at using Azure containers and blobs to store images and videos for my website. I found http://msdn.microsoft.com/en-us/library/windowsazure/dd179354.aspx, which talks about the different ACL settings, but it did not answer one of my questions. If a container/blob is set to "No public read access", the site says that only the account owner can read the data. Would this mean that people could not access it by the URL, but my MVC web app hosted on an Azure VM would be able to access it via URL?
Please bear with me if the answer sounds a bit preachy and unnecessarily lengthy :)
Essentially each resource (blob container, blob) in Windows Azure has a unique URL and is accessible via the REST API (thus accessible over the http/https protocol). With ACLs, you basically tell the storage service whether or not to honor the request sent to serve the resource. To read more about the authentication mechanism, you may find this link useful: http://msdn.microsoft.com/en-us/library/windowsazure/dd179428.aspx.
When you set the ACL to No public read access, you're instructing the storage service not to honor any anonymous requests; only authenticated requests will be honored. To create an authenticated request, you would require your account name and key and create an authorization header which gets passed along with the request to access the resource. If this authorization header is not present in your request, the request will be rejected.
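In practice you rarely build that authorization header by hand; the storage SDK performs the Shared Key signing for you. A minimal sketch of an authenticated read, assuming the azure-storage-blob package (the names and the key are placeholders):

from azure.storage.blob import BlobClient

blob = BlobClient(
    account_url="https://myaccount.blob.core.windows.net",
    container_name="mycontainer",
    blob_name="myblob.txt",
    credential="<account-key>",  # the account key; never expose it in client-side code
)

data = blob.download_blob().readall()  # succeeds even with "No public read access"
print(data[:100])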
So long story short, to answer your question even your MVC application won't be able to access the blob via URL unless that authorization header is included in the request. One possibility would be to explore Shared Access Signature (SAS) functionality in blob storage. This would give time-bound restricted permissions to blobs in your storage. So what you would do is create a SAS URL for your blob in your MVC app using your account name and key and use that SAS URL in the application.
To further explain the concept of ACL, let's say you have a blob container called mycontainer and it has a blob called myblob.txt in a storage account named myaccount. For listing blobs in the container, the container URL would be http://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=list and the blob URL would be http://myaccount.blob.core.windows.net/mycontainer/myblob.txt. Following will be the behavior when you try to access these URLs directly through the browser with different ACL:
No public read access
Container URL - Error
Blob URL - Error
Public read access for blobs only
Container URL - Error
Blob URL - Success (will download the blob)
Full public read access
Container URL - Success (will show an XML document containing information about all blobs in the container)
Blob URL - Success (will download the blob)
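To see this behavior for yourself, you can probe both URLs anonymously; a small Python sketch using the hypothetical myaccount/mycontainer/myblob.txt names from the example above:

import requests

container_url = "https://myaccount.blob.core.windows.net/mycontainer?restype=container&comp=list"
blob_url = "https://myaccount.blob.core.windows.net/mycontainer/myblob.txt"

for url in (container_url, blob_url):
    resp = requests.get(url)  # anonymous request: no SAS, no authorization header
    # 2xx when the ACL allows anonymous reads; an error (e.g. 404/403) when it does not.
    print(resp.status_code, url)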
