I generated a SAS token with the List permission for a folder in a Data Lake Storage Gen2 account.
However, when I try to access it with an HTTP GET request, I get an AuthorizationPermissionMismatch error: "This request is not authorized to perform this operation using this permission."
I know that to list a container with an HTTP request we have to add &comp=list&restype=container to the query string. Is there any specific parameter to add for listing a folder?
Thanks
Once you have generated the SAS token at the container level with the required permissions, you can add a directory filter in the REST API like below:
https://<StorageURL>/<Container>?directory=<DirectoryName>&restype=container&comp=list&<SASToken>
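For illustration, a minimal curl sketch of that call (the account, container, directory and SAS values below are placeholders, not taken from the question):

# Placeholders: substitute your own storage account, container, directory and SAS.
ACCOUNT="mystorageaccount"
CONTAINER="test"
DIRECTORY="folder"
SAS="sv=...&sr=c&sig=..."   # container-level SAS that includes the List (l) permission
curl "https://$ACCOUNT.blob.core.windows.net/$CONTAINER?directory=$DIRECTORY&restype=container&comp=list&$SAS"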
I tested the same in my environment. I created an ADLS Gen2 storage account and a test container, then created a directory named folder and added a few files to it, as shown below:
Then, using a SAS generated at the container level, I called the below REST API:
https://adlsgen2ansuman.blob.core.windows.net/test?directory=folder&restype=container&comp=list&sp=racwdlmeop&st=2022-02-03T06:55:43Z&se=2022-02-03T14:55:43Z&spr=https&sv=2020-08-04&sr=c&sig=xxxxxxxxxxxxxxxxxx
Output:
Related
I tried to follow this link:
List folder in Azure Gen2 storage account with sas
Following the above link from Postman, I am getting an authorization error. What else should be added in the header?
The error I am getting:
I got the same error in my environment; I just added the details below, and now it's working fine in Postman.
Go to the Azure Storage Gen2 account -> click on Access control (IAM) -> click on Add -> Add role assignment, select Contributor as the role, then select your service principal and save.
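If you prefer the CLI, a hedged equivalent of that role assignment (the subscription, resource group and account names below are placeholders):

# Assign the Contributor role to the service principal on the storage account.
az role assignment create \
  --assignee "<service-principal-client-id>" \
  --role "Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"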
SAS token Syntax
https://<StorageURL>/<Container>?directory=<DirectoryName>&restype=container&comp=list&<SASToken>
For more information, follow this SO thread by Ivan Yang and this reference link.
I am trying to create a read-only blob container with Azure AD authentication. My application will upload files to this blob container, and an email will be sent to users inside my organization with a link to a file that they will download with a browser.
I have created a simple storage account and created a blob container inside it.
Access level is currently set to: Private (no anonymous access)
I have given my test user the Reader permission on the storage account and Storage Blob Data Reader on the blob container.
Permissions given to my demouser account that will only have Reader permissions to blob files:
I've uploaded a test file with my admin account:
With my demouser logged into my Azure Organization via Azure Storage Explorer, I can download the file just fine:
However, when I try to download the file with a direct link (https://sareadonly01.blob.core.windows.net/myreadonlycontainer01/TestFile01.txt) from a browser, which is how the users will download these files from the email link, I get this error: "The specified resource does not exist."
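Presumably the browser address bar sends an anonymous request, which is why the blob service reports the resource as missing under Private access. A hedged sketch (not from the original post) showing that the same URL works once an AAD bearer token is attached; the x-ms-version header is required for bearer authorization:

# Fetch an AAD token for the storage resource and pass it as a bearer token.
TOKEN=$(az account get-access-token --resource https://storage.azure.com/ --query accessToken -o tsv)
curl -H "Authorization: Bearer $TOKEN" -H "x-ms-version: 2020-04-08" \
  "https://sareadonly01.blob.core.windows.net/myreadonlycontainer01/TestFile01.txt"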
If I change the access level of the blob container to Blob (anonymous read access for blobs only), then my demouser can download the file in a browser, but so can anyone outside my organization, and this doesn't use AAD authentication.
So how do I create a read-only blob container with AAD authentication and the ability to download files with a direct URL from a browser?
What CLI command can be used to grant a specific App Registration's clientId access to an Azure image's source blob URI?
Background
An image is created using the CLI. That image includes a source blob URI whose address is given in the portal as:
https://myblobid.blob.core.windows.net/vhds/letters-and-numbers-guid.vhd
Other tools such as ARM templates need to be able to access that same source blob URI, but the source blob URI is accessible neither to the calling ARM templates nor when pasted in raw form into the Azure portal.
There seems to be a permission issue.
The ARM templates will be run using a specific clientId associated with an App Registration, which can be assigned any role you tell us it needs to have.
So what CLI command must be run to give the specified clientId the ability to run ARM template commands that can successfully access and use the given source blob URI?
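One plausible shape of such a command, assuming a data-plane read role on the storage account is sufficient (the role choice, subscription and resource-group values are assumptions, not confirmed by the question):

# Grant the App Registration's service principal read access to blob data.
az role assignment create \
  --assignee "<app-registration-client-id>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-account>"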
I created an Azure Storage Account with Azure Data Lake Storage Gen2. I want to upload a file using the REST API. While authorization with Shared Key works fine, I run into problems using an account SAS.
For the path creation I use the Path - Create operation.
# provide Azure Data Lake Storage Gen2 URL as environment variable
$ ADLS_URL="https://xxxxx.blob.core.windows.net/files"
# provide account SAS as environment variable
$ SAS="sv=2017-07-29&ss=bf&..."
# Create a new path in ADLS Gen2
$ curl -vX PUT -H "Content-Length: 112" "$ADLS_URL/example.txt?resource=file&$SAS"
The request fails with 400 An HTTP header that's mandatory for this request is not specified. and the following error message:
<Error>
<Code>MissingRequiredHeader</Code>
<Message>An HTTP header that's mandatory for this request is not specified. RequestId:870e754b-... Time:2020-07-07T...</Message>
<HeaderName>x-ms-blob-type</HeaderName>
</Error>
It turned out that the missing header is required for the creation of a blob in Blob Storage. Since ADLS Gen2 supports both APIs and both provide a similar operation, the request is delegated to the wrong one.
Is there a way to create a path using a PUT operation with a SAS on the ADLS Gen2 API?
Yes, you can create a path (a file in this example) using a PUT operation with a SAS on the ADLS Gen2 API. But you need to take 3 steps: create an empty file, append data to the empty file, and flush the data.
Step 1: after generating a SAS token, you need to call Path - Create to create a file in ADLS Gen2. Note: the file should be empty here, meaning that in the request header, Content-Length should be 0.
The request url looks like this:
https://xxx.dfs.core.windows.net/aaa/myfile999.txt?resource=file&sv=2019-10-10&ss=bfqt&srt=sco&sp=rwdlacupx&se=2020-07-08T10:31:37Z&st=2020-07-08T02:31:37Z&spr=https&sig=xxxx
Here, I tested it with the tool Postman, and it works without issue. The empty file is created in ADLS Gen2, as shown in the Azure portal:
Step 2 and Step 3:
Then you should call Path - Update to append the data.
At last, call Path - Update again to flush the data.
If you don't know how to use Path - Update for these operations, please use Fiddler to see the detailed request information, or just let me know :). Here is a screenshot of the request captured by Fiddler:
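For reference, a hedged curl sketch of all three calls (the file URL and SAS are the ones from step 1, elided here):

FILE_URL="https://xxx.dfs.core.windows.net/aaa/myfile999.txt"
SAS="sv=2019-10-10&...&sig=xxxx"   # the SAS shown above, elided

# Step 1: create an empty file (Content-Length must be 0)
curl -X PUT -H "Content-Length: 0" "$FILE_URL?resource=file&$SAS"

# Step 2: append the content at offset 0 (curl derives Content-Length from the payload)
curl -X PATCH --data-binary "hello adls" "$FILE_URL?action=append&position=0&$SAS"

# Step 3: flush; position must equal the total number of bytes appended (10 here)
curl -X PATCH -H "Content-Length: 0" "$FILE_URL?action=flush&position=10&$SAS"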
BTW, I suggest you directly use the Put Blob API with a SAS token (but you need to specify x-ms-blob-type in the request header), which is just one step to both create the file and upload its content.
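A sketch of that one-step alternative against the Blob endpoint (same placeholders as above):

# Put Blob: one request creates the blob and uploads its content.
curl -X PUT -H "x-ms-blob-type: BlockBlob" --data-binary "hello adls" \
  "https://xxxxx.blob.core.windows.net/files/example.txt?$SAS"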
My goal is to restrict access to an Azure Data Lake Gen2 storage account on a directory level (which should be possible according to Microsoft's promises).
I have two directories, data and sensitive, in a Data Lake Gen2 container. For a specific user, I want to grant read access to the directory data and prevent any access to the directory sensitive.
Following the documentation, I removed all RBAC assignments for that user (on the storage account as well as the Data Lake container) so that there is no inherited read access on the directories. Then I added a read ACL entry on the data directory for that user.
My expectation:
The user can directly download files from the data directory.
The user cannot access files of the sensitive directory.
Reality:
When I try to download files from the data directory, I get a 403 ServiceCode=AuthorizationPermissionMismatch:
az storage blob directory download -c containername -s data --account-name XXX --auth-mode login -d "./download" --recursive
RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.
I expected this to work. Otherwise, I can only grant access by assigning the Storage Blob Data Reader role, but that applies to every directory and file within the container and cannot be overridden by ACL entries. Did I do something wrong here?
According to my research, if you want to grant a security principal read access to a file, you need to give the security principal Execute permissions on the container and on each folder in the hierarchy of folders that leads to the file, in addition to Read permission on the file itself. For more details, please refer to the documentation.
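A hedged CLI sketch of what that could look like for the scenario above (the user's AAD object id is a placeholder; note that az storage fs access set --acl replaces the ACL on the path, so the base user/group/other entries are included):

# Execute (x) on the container root so the user can traverse into it.
az storage fs access set -f containername --account-name XXX --auth-mode login \
  -p / --acl "user::rwx,group::r-x,other::---,user:<object-id>:--x"
# Read + execute on the data directory itself.
az storage fs access set -f containername --account-name XXX --auth-mode login \
  -p data --acl "user::rwx,group::r-x,other::---,user:<object-id>:r-x"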
I found that I could not get ACLs to work without an RBAC role. I ended up creating a custom "Storage Blob Container Reader" RBAC role in my resource group with only the permission "Microsoft.Storage/storageAccounts/blobServices/containers/read", which allows listing containers without granting list or read access to the actual blobs.
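For reference, a sketch of how such a custom role could be created (the subscription and resource-group ids are placeholders):

# Define and create the custom role from a JSON role definition.
cat > container-reader-role.json <<'EOF'
{
  "Name": "Storage Blob Container Reader",
  "Description": "Read/list containers without granting access to blob data.",
  "Actions": ["Microsoft.Storage/storageAccounts/blobServices/containers/read"],
  "DataActions": [],
  "AssignableScopes": ["/subscriptions/<sub-id>/resourceGroups/<rg>"]
}
EOF
az role definition create --role-definition @container-reader-role.json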