Storage destination needs to have a Service SAS, not an Account SAS. What Does This Mean?

Hello, recently I have been trying to use the Microsoft Graph request documented here:
https://learn.microsoft.com/en-us/graph/api/user-exportpersonaldata?view=graph-rest-1.0&tabs=http
As stated there, when you make that request you provide a storage location, which is "a shared access signature (SAS) URL to an Azure Storage account, to where data should be exported."
Every time I provide my SAS URL I get this error: "Storage destination needs to have a Service SAS, not an Account SAS"
Can someone please help me understand what this means? The documentation it links to is not clear.

Storage destination needs to have a Service SAS, not an Account SAS
The difference between an Account SAS and a Service SAS is described here: https://learn.microsoft.com/en-us/rest/api/storageservices/delegate-access-with-shared-access-signature#types-of-shared-access-signatures.
You're providing a SAS URL for the entire account (e.g. https://account.blob.core.windows.net/?sas-parameters), whereas you are expected to provide a SAS URL for a specific blob container (e.g. https://account.blob.core.windows.net/blob-container/?sas-parameters).
There are two possible solutions:
Create a SAS URL for a specific blob container; in other words, create a Service SAS, as the error message tells you to do. You can do so using a tool like Microsoft Azure Storage Explorer.
Insert the blob container name into your account SAS URL so that it looks something like this: https://account.blob.core.windows.net/blob-container/?sas-parameters.
Please note that if you're using an Account SAS, it should at least have Write permission on Object for Blob service.
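If you prefer to create the Service SAS in code rather than in Storage Explorer, here is a minimal sketch using the azure-storage-blob Python SDK. The account name, key, and container name are placeholders, and the exact permissions and expiry your scenario needs may differ.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

# Hypothetical values -- substitute your own account and container.
account_name = "myaccount"
account_key = "<account-key>"
container_name = "export-container"

# A service SAS is scoped to a single container, unlike an account SAS.
sas = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, write=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=2),
)

# Container-scoped URL in the shape the export request expects.
print(f"https://{account_name}.blob.core.windows.net/{container_name}?{sas}")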

Related

ADF Pipeline Errors - RequestContentTooLarge and InvalidContentLink

The ADF pipeline release to the test Data Factory instance is failing with the RequestContentTooLarge error shown in the image below.
To overcome that issue, I modified the pipeline by adding an additional Azure Blob File Copy step that stores the linked templates in a storage account and references them in the pipeline for the deployment. However, after making that change I am getting another error: InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?***Sanitized Azure Storage Account Shared Access Signature***'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
I have tried using the SAS token both at the container level and at the storage account level. I have also ensured that the agent and the storage account are on the same VNet, and I have tried removing the firewall restrictions, but it still gives me the same InvalidContentLink error.
The modified pipeline with the Azure Storage Account step:
How do I resolve this issue?
InvalidContentLink: Unable to download deployment content from 'https://xxx.blob.core.windows.net/adf-arm-templates/ArmTemplate_0.json?Sanitized Azure Storage Account Shared Access Signature'. The tracking Id is 'xxxxx-xxxx-x-xxxx-xx'. Please see https://aka.ms/arm-deploy for usage details.
This error can occur because you are trying to link to a template that is not present in the storage account.
Make sure you provide the correct URL for the nested template and that it is accessible.
Also, if your storage account has firewall rules, you can't link a nested template from it.
Make sure your storage account, container and blob are publicly accessible. To achieve this:
Provide a blob-level Shared Access Signature URL: select the file, click on "...", and then click Generate SAS.
Refer to the documentation on nested templates for more background.
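If you'd rather generate that blob-level SAS programmatically, here is a minimal sketch using the azure-storage-blob Python SDK; the account key is a placeholder, and the container and blob names are taken from the error message above purely for illustration.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

# Hypothetical values -- replace with your own account details.
account_name = "myaccount"
account_key = "<account-key>"
container_name = "adf-arm-templates"
blob_name = "ArmTemplate_0.json"

# A blob-level SAS with Read permission lets ARM download the nested template.
sas = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

template_link = (
    f"https://{account_name}.blob.core.windows.net/"
    f"{container_name}/{blob_name}?{sas}"
)
print(template_link)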

Azure Data Explorer oneclick Ingest from blob container (UI)

I'm trying to configure and use the Azure Data Explorer one-click ingest from a blob container (continuous ingestion).
Whatever I try, the URL is never accepted; I always end up with this error:
Invalid URL. Either the URL leads to a blob instead of a container, or the permissions are incorrect. If you just grant permission, please wait couple of minutes and try again.
The URL I'm using follows this pattern:
https://mystorageaccount.blob.core.windows.net/mycontainer?sp=rl&st=2022-04-26T22:01:42Z&se=2032-04-27T06:01:42Z&spr=https&sv=2020-08-04&sr=c&sig=Z4Mlh7s5%2Fm1890kdfzlkYLSIHHDdGJmTSyYXVYsHdn01o%3D
I'm probably missing something, either in the URL syntax or in the SAS generation.
Has anyone successfully used it? Any idea what could be wrong?
Thanks
I finally found out what the issue was.
Probably due to the security in place on my storage account, I had to create a managed private endpoint in the Azure Data Explorer Networking panel, pointing to my storage resource, and then approve that endpoint in the storage account's Networking settings:
https://learn.microsoft.com/en-us/azure/data-explorer/security-network-managed-private-endpoint-create
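For reference, a container-scoped SAS URL with the read and list permissions the ingestion wizard expects (the sp=rl and sr=c parameters in the pattern above) could be generated in code roughly like this; a sketch using the azure-storage-blob Python SDK with placeholder account details.

from datetime import datetime, timedelta, timezone
from azure.storage.blob import generate_container_sas, ContainerSasPermissions

# Hypothetical values -- substitute your own.
account_name = "mystorageaccount"
account_key = "<account-key>"
container_name = "mycontainer"

# Read + list on the container corresponds to sp=rl, sr=c in the final URL.
sas = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=365),
)
print(f"https://{account_name}.blob.core.windows.net/{container_name}?{sas}")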

Azcopy error "This request is not authorized to perform this operation."

I copied a container to another storage account (Data Lake Storage Gen2) based on the document linked below.
When trying, I got the following error:
This request is not authorized to perform this operation using this permission.
https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
If you are using an AAD token, this error is telling you that you need to add a role assignment to the user. Go to Storage account -> Access Control (IAM) -> Add -> Add role assignment, then add Storage Blob Data Owner to your login account.
If this problem persists, please provide more details.
I also faced the same problem. For it to work, I just logged out and logged back in on the azcopy CLI after applying @BowmanZhu's solution:
azcopy logout
azcopy login --tenant-id xxxx-xxxx-xxxx
If you don't want to log in that way, there is always the option of appending a SAS token to the end of the URL. If you don't want to attach the token every time, you can set up permanent access by following one of the options in the official documentation page.
After granting myself the role Storage Blob Data Owner on the container, AzCopy behaved itself and succeeded in copying a file to the blob storage container.
Go to storage account -> container -> Access Control (IAM) -> Add role assignment -> Storage Blob Data Owner.
In my case, my Azure storage account's VNet rules were blocking AzCopy from copying the data to the storage account.
I added my client IP to the firewall's allowed addresses.
The SAS token has probably expired.
When I had this, I discovered it was because I'd used Azure Storage Explorer to generate a SAS that didn't have Read permission, and I think AzCopy was trying to read the size/existence of a blob before writing it.
I got a clue from https://github.com/Azure/azure-storage-azcopy/issues/790, but ultimately I just regenerated a new SAS with Read permission and it worked.
I probably could have looked at modifying the C# code using the Azure Data Movement library to skip the length check, but the spec was later changed to "don't overwrite", so the Read permission is probably needed anyway.
Give appropriate permissions (read, write, create) while generating SAS tokens.
I had a similar issue. This is how it was resolved:
Command used was .\azcopy.exe copy "C:\Users\kriof\Pictures" "https://test645676535storageaccount.blob.core.windows.net/images?sp=rw&st=2022-02-23T11:03:50Z&se=2022-02-23T19:03:50Z&spr=https&sv=2020-08-04&sr=c&sig=QRN%2SMFtU3zaUdd4adRddNFjM2K4ik7tNPSi2WRL0%3D"
The SAS token had only the default (Read) permission. Adding Write permission in the Azure Portal resolved the issue.
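Since several of the answers above come down to which permission flags the sp= parameter of the SAS actually carries, here is a small sketch that decodes them from a URL; the URL below is a sanitized placeholder modeled on the command above.

from urllib.parse import urlparse, parse_qs

# Hypothetical, sanitized destination URL like the one in the command above.
url = ("https://test645676535storageaccount.blob.core.windows.net/images"
       "?sp=rw&sr=c&sv=2020-08-04&sig=<sanitized>")

permissions = parse_qs(urlparse(url).query).get("sp", [""])[0]

# AzCopy generally needs read + write (and often create/list) on the destination;
# a missing flag here matches the "not authorized" symptom described above.
for flag, name in [("r", "read"), ("w", "write"), ("c", "create"), ("l", "list")]:
    print(f"{name}: {'yes' if flag in permissions else 'NO'}")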

Azure ML studio export data Azure Storage V2

I already posted my problem here and they suggested I post it here.
I am trying to export data from Azure ML to Azure Storage but I have this error:
Error writing to cloud storage: The remote server returned an error: (400) Bad Request. Please check the url. (Error 0151)
My blob storage configuration is Storage V2 / Standard with Require secure transfer enabled.
If I set Require secure transfer to disabled, the export works fine.
How can I export data to my blob storage with Require secure transfer enabled?
According to the official tutorial Export to Azure Blob Storage, there are two authentication types for exporting data to Azure Blob Storage: SAS and Account. They are described as follows.
For Authentication type, choose Public (SAS URL) if you know that the storage supports access via a SAS URL.
A SAS URL is a special type of URL that can be generated by using an Azure storage utility, and is available for only a limited time. It contains all the information that is needed for authentication and download.
For URI, type or paste the full URI that defines the account and the public blob.
For private accounts, choose Account, and provide the account name and the account key, so that the experiment can write to the storage account.
Account name: Type or paste the name of the account where you want to save the data. For example, if the full URL of the storage account is http://myshared.blob.core.windows.net, you would type myshared.
Account key: Paste the storage access key that is associated with the account.
I tried a simple module combination, as in the figure and Python code below, to test the issue you got.
import pandas as pd

def azureml_main(dataframe1=None, dataframe2=None):
    dataframe1 = pd.DataFrame(data={'col1': [1, 2], 'col2': [3, 4]})
    return dataframe1,
When I tried to use the Account authentication type with my Blob Storage V2 account, I got the same issue as yours, with error code Error 0151 as below, found by clicking the View error log button under the View output log link.
Error 0151
There was an error writing to cloud storage. Please check the URL.
This error in Azure Machine Learning occurs when the module tries to write data to cloud storage but the URL is unavailable or invalid.
Resolution
Check the URL and verify that it is writable.
Exception Messages
Error writing to cloud storage (possibly a bad url).
Error writing to cloud storage: {0}. Please check the url.
Based on the error description above, the error should be caused by the Export Data module incorrectly generating the blob URL with SAS from the account information. I think the module code is old and not compatible with the new V2 storage API or API version. You can report it to feedback.azure.com.
However, when I switched to the SAS authentication type and entered a blob URL with a container SAS query string, generated via the Azure Storage Explorer tool as below, it worked fine.
Fig 1: Right-click on the container of your Blob Storage account, and click Get Shared Access Signature
Fig 2: Enable the Write permission (recommended to use the UTC timezone) and click the Create button
Fig 3: Copy the Query string value, and build a blob url with a container SAS query string like https://<account name>.blob.core.windows.net/<container name>/<blob name><query string>
Note: The blob must not already exist in the container, otherwise an Error 0057 will be raised.
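Putting Fig 3 into code, the blob URL can be assembled like this; every value below is a placeholder, and the query string is whatever Storage Explorer produced in Fig 2.

# Hypothetical values -- the query string comes from Storage Explorer (Fig 2).
account_name = "myshared"
container_name = "mycontainer"
blob_name = "exported-data.csv"   # must not already exist (see the note above)
query_string = "?<container-sas-query-string>"

blob_url = (
    f"https://{account_name}.blob.core.windows.net/"
    f"{container_name}/{blob_name}{query_string}"
)
print(blob_url)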

Copying blob to another storage account using REST API gives 404 error

In the Logic App I'm using the Blob Service REST API to copy a blob between different accounts.
I have SAS signatures on both source and destination URLs. Not sure what I am doing wrong.
Update
The destination URL (with SAS) is obtained from a Dynamics 365 endpoint. It comes back with an sv value of 2014-02-14. Could this be the problem (the sv is too old, as suggested in the comments)?
I managed to copy the blob in a different way, by reading the contents of the source blob and creating the blob at the destination URL with that content (Put Blob).
Some information for your reference.
I generated a SAS token in the portal and copied a blob from storage account A to B. I tested it in the Logic App and it works fine.
Generate SAS:
Request URL:
PUT https://storageB.blob.core.windows.net/containername/testcopy1?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-08-27T10:43:40Z&st=2018-08-27T02:43:40Z&spr=https&sig=xxxxxxx
Request Headers:
x-ms-copy-source:https://storageA.blob.core.windows.net/containername/2.5.txt?sv=2017-11-09&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-08-27T10:59:19Z&st=2018-08-27T02:59:19Z&spr=https&sig=xxxxxx
In the LogicApp:
Check in the portal:
Update:
I think the old sv value (2014-02-14) is indeed the problem.
Refer to the version mentioned in the article.
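For completeness, the request shown above (a PUT to the destination URL with the source in the x-ms-copy-source header) could be reproduced outside a Logic App roughly like this; the SAS tokens are sanitized placeholders.

import requests

# Hypothetical, sanitized SAS URLs -- both sides carry their own SAS token.
source_url = "https://storageA.blob.core.windows.net/containername/2.5.txt?<sas>"
dest_url = "https://storageB.blob.core.windows.net/containername/testcopy1?<sas>"

# Copy Blob: PUT to the destination, source passed in the x-ms-copy-source header.
resp = requests.put(
    dest_url,
    headers={
        "x-ms-copy-source": source_url,
        "x-ms-version": "2017-11-09",  # recent enough, unlike the sv=2014-02-14 above
    },
)
resp.raise_for_status()
print(resp.status_code)  # expect 202 Accepted once the copy is scheduled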
