How to create a SAS token to list/delete blobs - Azure

I tried creating a SAS token like this (adding the "Read" permission changes nothing). Sample SAS token:

?sv=2017-07-29&ss=b&srt=co&sp=dl&se=2018-03-31T21:24:06Z&st=2018-03-31T09:24:06Z&spr=https&sig=bWsg5sSPZF%2FaBXxfW6RoCH%2BlcFKBT6MFyMKTRM3I2jI%3D

But it didn't work for me. I only want my script to get the blob list, read metadata, and delete old blobs:

$ctx = New-AzureStorageContext -StorageAccountName xxx -SasToken zzz
$Containers = Get-AzureStorageContainer -Context $ctx

This fails with:

Get-AzureStorageContainer : The remote server returned an error: (403)
Forbidden. HTTP Status Code: 403 - HTTP Error Message: This request is
not authorized to perform this operation.

Also, I'd like to know what the minimum possible permissions are to achieve my goal.

So there are two things here:
You're getting a 403 error: Assuming you're using the same SAS token that you mentioned in the question along with the Get-AzureStorageContainer cmdlet, you will get this error. The purpose of this cmdlet is to list blob containers in a storage account, and for that your SAS token needs the Service resource type (the srt value in your SAS token should be sco instead of co). Because that resource type is missing from your SAS token, you are getting the 403 error. However, if you use the same token with Get-AzureStorageBlob, you should not get any error.
Necessary permissions for listing blobs, reading metadata and deleting old blobs: For this, you would need the following:
Allowed Services: Blobs (b)
Allowed resource types: Container (c) and Object (o)
Allowed permissions: List (l), Read (r) and Delete (d)
With this combination you should be able to list blobs in a blob container using Get-AzureStorageBlob, read their metadata and delete the blobs.
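Putting the pieces together, the whole flow can be sketched with the classic Azure.Storage cmdlets; the account name, key, container name and 30-day retention window below are placeholders, not taken from the question:

```powershell
# Context authenticated with the account key (required to mint a SAS)
$keyCtx = New-AzureStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "xxx"

# Account SAS: Blob service, Container+Object resource types, List/Read/Delete permissions
$sas = New-AzureStorageAccountSASToken -Service Blob -ResourceType Container,Object `
    -Permission "rld" -Protocol HttpsOnly -ExpiryTime (Get-Date).AddHours(12) -Context $keyCtx

# Context authenticated with the SAS only
$sasCtx = New-AzureStorageContext -StorageAccountName "mystorageacct" -SasToken $sas

# List blobs in a known container and delete anything older than 30 days
Get-AzureStorageBlob -Container "my-container" -Context $sasCtx |
    Where-Object { $_.LastModified -lt (Get-Date).AddDays(-30) } |
    Remove-AzureStorageBlob
```

Note that Get-AzureStorageBlob takes an explicit container name, so it works with this SAS even though Get-AzureStorageContainer does not.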
UPDATE
So I followed your steps and tried to list the blob containers using the Get-AzureStorageContainer cmdlet, and I got the same error :).
Then I ran the cmdlet with the Debug and Verbose switches and found that for each blob container, this cmdlet tries to get the ACL:
https://account.blob.core.windows.net/my-container?sv=2017-07-29&ss=b&srt=sco&sp=dl&se=2018-03-31T23:28:27Z&st=2018-03-31T15:28:27Z&spr=https&sig=signature&api-version=2017-04-17&restype=container&comp=acl
Confirm The remote server returned an error: (403) Forbidden. HTTP
Status Code: 403 - HTTP Error Message: This request is not authorized
to perform this operation. [Y] Yes [A] Yes to All [H] Halt Command
[S] Suspend [?] Help (default is "Y"): y Get-AzureStorageContainer :
The remote server returned an error: (403) Forbidden. HTTP Status
Code: 403 - HTTP Error Message: This request is not authorized to
perform this operation. At line:1 char:1
+ Get-AzureStorageContainer -Context $ctx -Debug -Verbose
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Get-AzureStorageContainer], StorageException
+ FullyQualifiedErrorId : StorageException,Microsoft.WindowsAzure.Commands.Storage.Blob.Cmdlet.GetAzureStorageContainerCommand
Now the problem is that you can't fetch the ACL for a container using a shared access signature; you need to use the account key (the same goes for creating a shared access signature). This is why you're getting the 403 error back from the service.
Not sure whether you would classify this as a bug in Get-AzureStorageContainer or would rather file a feature request to allow listing blob containers without fetching their ACLs, but the way things are today, you can't list blob containers using this cmdlet and a SAS token.

Related

Azure PowerShell Copy-AzStorageBlob: Invalid parameter: comp

I am trying to copy blobs between storage accounts using the Copy-AzStorageBlob command.
$srcCtx = New-AzStorageContext -StorageAccountName $srcStorageAccountName -SasToken $srcSasToken
$destCtx = New-AzStorageContext -StorageAccountName $destStorageAccountName -SasToken $destSasToken
Copy-AzStorageBlob -SrcBlob "blobPath" -SrcContainer "src" -Context $srcCtx -DestContainer "dest" -DestContext $destCtx
and I get an error that the following parameter is invalid:
QueryParameterName: comp
QueryParameterValue: tags
but I am not using them directly. Do you have any idea what is wrong? I would like to underline that the source and destination contexts are working - I have read and written some blobs using the mentioned contexts. The issue occurs only during blob copying.
The error message:
Copy-AzStorageBlob: Value for one of the query parameters specified in the request URI is invalid.
RequestId:d705aed0-b01e-0013-12c6-244430000000
Time:2022-02-18T12:56:40.1603279Z
Status: 400 (Value for one of the query parameters specified in the request URI is invalid.)
ErrorCode: InvalidQueryParameterValue
Additional Information:
QueryParameterName: comp
QueryParameterValue: tags
Reason:
Content:
<?xml version="1.0" encoding="utf-8"?><Error><Code>InvalidQueryParameterValue</Code><Message>Value for one of the query parameters specified in the request URI is invalid.
RequestId:d705aed0-b01e-0013-12c6-244430000000
Time:2022-02-18T12:56:40.1603279Z</Message><QueryParameterName>comp</QueryParameterName><QueryParameterValue>tags</QueryParameterValue><Reason /></Error>
Headers:
Server: Microsoft-HTTPAPI/2.0
x-ms-request-id: d705aed0-b01e-0013-12c6-244430000000
x-ms-client-request-id: f76e9297-504e-42a7-8c86-54ad81bedad7
x-ms-error-code: InvalidQueryParameterValue
Date: Fri, 18 Feb 2022 12:56:40 GMT
Content-Length: 375
Content-Type: application/xml
Try generating a fresh SAS token through the Azure Portal or Azure Storage Explorer to ensure that it has not expired, copy it into the URL without whitespace, and make sure it has the correct permissions (read, write, list, etc.) required for the operation.
Also make sure the identity has one of the following roles:
1. Storage Blob Data Contributor
2. Storage Blob Data Owner
Try upgrading (or downgrading) the package version of azure.storage.blob.
Try to provide only the required headers, such as x-ms-blob-type, in your request by removing all other unnecessary headers. Check whether an x-ms-tags header is present and remove it if it is not required.
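Following the advice above, a fresh SAS with broader permissions can be minted like this; the account name and key are placeholders, and including the tag permission "t" assumes a reasonably recent Az.Storage version (the error shows the copy cmdlet querying blob tags via comp=tags, which that permission would authorize):

```powershell
# A key-based context is required to mint a SAS
$keyCtx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "xxx"

# Account SAS with read/write/delete/list plus tag (t) permission,
# so the copy cmdlet's tag lookup (comp=tags) is authorized
$sas = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
    -Permission "rwdlt" -ExpiryTime (Get-Date).AddHours(4) -Context $keyCtx

# SAS-based context to use with Copy-AzStorageBlob
$srcCtx = New-AzStorageContext -StorageAccountName "mystorageacct" -SasToken $sas
```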
References:
Error - Value for one of the query parameters specified in the request URI is invalid - Stack Overflow
Get Blob Tags (REST API) - Azure Storage | Microsoft Docs
Copy Blob (REST API) - Azure Storage | Microsoft Docs

Azure setting recursive ACL results in 403 - updating goes okay though

I'm using the Az.Storage cmdlets in Powershell to set the permissions on an Azure Data Lake gen 2 storage account. I have "owner" permissions, along with "Data Storage Owner" via my Azure AD account.
I can run the cmdlet Update-AzDataLakeGen2AclRecursive without issue, but if I instead try to 'replace' the permissions using Set-AzDataLakeGen2AclRecursive, I get the following error:
Set-AzDataLakeGen2AclRecursive : An error occurred while recursively changing the access control list. See the InnerException of type Azure.RequestFailedException with Status=403
and ErrorCode=SetAclMissingAces for more information. You can resume changing the access control list using ContinuationToken= after addressing the error.
At file.ps1:62 char:9
+ Set-AzDataLakeGen2AclRecursive -Context $context -FileSystem ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Set-AzDataLakeGen2AclRecursive], DataLakeAclChangeFailedException
+ FullyQualifiedErrorId : DataLakeAclChangeFailedException,Microsoft.WindowsAzure.Commands.Storage.Blob.Cmdlet.SetAzDataLakeGen2AclRecursiveCommand
I'm not entirely sure why I'd get a 403 in this scenario, as it appears I have the correct permissions on the account already (having created the storage account with that same account, and having read the documentation and found that 'Data Storage Owner' was required).
Any ideas here?
To answer this one - you must make sure your AD principal has permissions at the container level to be able to access it in the first place.
For me, this meant granting my AAD account read/write/execute + default permissions on the container I was attempting to modify before using the Set- cmdlet.
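A sketch of that grant with the Az.Storage Gen2 cmdlets; the filesystem name and the AAD object ID are placeholders, and this assumes the signed-in account is allowed to set ACLs on the container root:

```powershell
$ctx = New-AzStorageContext -StorageAccountName "mydatalake" -UseConnectedAccount

# Start from the container root's existing ACL so we append rather than replace
$acl = (Get-AzDataLakeGen2Item -Context $ctx -FileSystem "my-container").ACL

# Grant the principal rwx as an access ACL and as a default ACL
$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId "<object-id>" `
    -Permission rwx -InputObject $acl
$acl = Set-AzDataLakeGen2ItemAclObject -AccessControlType user -EntityId "<object-id>" `
    -Permission rwx -DefaultScope -InputObject $acl

# Apply to the container (filesystem) root; Set-AzDataLakeGen2AclRecursive should then succeed
Update-AzDataLakeGen2Item -Context $ctx -FileSystem "my-container" -Acl $acl
```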

Connect-AzureAD -Confirm throwing AADSTS900144: The request body must contain the following parameter: 'code'. error

I have downloaded the PowerShell Azure AD module and I'm trying to connect to Azure AD using the command below, but it throws an error.
Connect-AzureAD -Confirm
AADSTS900144: The request body must contain the following parameter: 'code'.
I have specified the correct credentials and still get the above error.
https://social.msdn.microsoft.com/Forums/en-US/281ffa55-1024-4d39-b83f-a7f184fc4da8/cannot-login-to-azure-portal?forum=AzureAvailabilityZones
Looks like it might be an Azure AD authentication issue.
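If it is an auth or cached-token problem, one thing worth trying is a plain interactive sign-in without -Confirm, optionally pinning the tenant explicitly (the tenant ID below is a placeholder):

```powershell
# Interactive sign-in; -TenantId pins the directory in case the default tenant is wrong
Connect-AzureAD -TenantId "00000000-0000-0000-0000-000000000000"
```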

Get-AzDataLakeStoreItem returns GETFILESTATUS failed with Unknown Error for valid items

I'm writing a PowerShell script to set up a new Data Lake Store Gen1 account and create folders with updated ACLs. The New-AzDataLakeStoreAccount statement works fine; New-AzDataLakeStoreItem and Get-AzDataLakeStoreItem fail with a similar Unknown Error.
The following PowerShell code will create a new Data Lake Store; I have a resource group and security group in the subscription as listed below. The second and third statements, for listing and adding a folder, fail. I'm using https://shell.azure.com to execute the PowerShell.
New-AzDataLakeStoreAccount -ResourceGroupName "ade-dev-eastus2" -Name "adedeveastus2" -Location "East US 2" -DefaultGroup (Get-AzADGroup -DisplayName "Technical Operations").Id -Encryption ServiceManaged -Tag @{User="ADE"} -Tier Consumption
Get-AzDataLakeStoreItem -AccountName "adedeveastus2" -Path "/"
New-AzDataLakeStoreItem -AccountName "adedeveastus2" -Path "/Staging" -Folder
The following is the error message for Get-AzDataLakeStoreItem:
Get-AzDataLakeStoreItem : Error in getting metadata for path /.
Operation: GETFILESTATUS failed with Unknown Error: Token Length is 6. Token is most probably malformed. Source: StackTrace: .
Last encountered exception thrown after 5 tries. [There was an error retrieving the managed service access token for resource 'https://datalake.azure.net' using the URI 'http://localhost:50342/oauth2/token?resource=https%3A%2F%2Fdatalake.azure.net&api-version=2018-02-01'. Please check that this managed service is configured to emit tokens at this address and that the associated managed service identity has the appropriate role assignment and try logging in again.,Token Length is 6. Token is most probably malformed.,Token Length is 6. Token is most probably malformed.,Token Length is 6. Token is most probably malformed.,Token Length is 6. Token is most probably malformed.]
[ServerRequestId:]
At line:1 char:1
+ Get-AzDataLakeStoreItem -Account "adedeveastus2" -Path "/"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [Get-AzDataLakeStoreItem], AdlsException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.DataLakeStore.GetAzureDataLakeStoreItem
The error returned when using a Windows Powershell host is more descriptive.
Operation: GETFILESTATUS failed with Unknown Error: The 'User-Agent' header must be modified using the appropriate property or method.
I would expect to get back a DataLakeStoreItem object with properties like Name and Path. I think this may be a general error for other users of ADLS or the Cloud Shell PowerShell?
I can reproduce your issue in the Azure Cloud Shell. When I run the command locally (PSVersion 5.1.17134.228), I get the same error as in this GitHub known issue.
As the Data Lake team said in that link:
This is a problem with the HttpWebRequest class used by our SDK. Setting the user agent for HttpWebRequest differs between .NET Framework and .NET Core:
.NET Framework: webReq.UserAgent = client.GetUserAgent()
.NET Standard: webReq.Headers["User-Agent"] = client.GetUserAgent()
If you try the latter on .NET Framework, you get the error you are seeing above. The Az module uses the .NET Standard DLL of our SDK, so when you use it from Windows PowerShell, it tries to use the .NET Standard DLL on .NET Framework, which gives this error.
And the solution:
I tested this in .NET Core PowerShell. It runs fine. We are moving from HttpWebRequest to HttpClient, which will probably fix the issue. So basically, if you are using Windows PowerShell, use AzureRM; otherwise use Az from PowerShell Core.
This seems to explain the error The 'User-Agent' header must be modified using the appropriate property or method, so you could try the Az module in PowerShell Core. If that is acceptable, you can also use the AzureRM module in Windows PowerShell; it works fine on my side.
Get-AzureRmDataLakeStoreItem -Account "joydatalake2" -Path "/"
New-AzureRmDataLakeStoreItem -Account "joydatalake2" -Path "/Staging" -Folder

SAS URLs not working

I'm trying to create a SAS URL for a blob storage container. I've tried multiple storage accounts and multiple methods of creating the SAS, and all of them give this result when I test the SAS URL in a browser:
<Error>
<Code>AuthenticationFailed</Code>
<Message>
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:d95bf34f-0001-0022-4430-b1a25b000000 Time:2016-05-18T18:12:30.5552096Z
</Message>
<AuthenticationErrorDetail>
Signature did not match. String to sign used was rl 2016-05-18T18:10:00Z 2016-05-19T18:10:00Z /blob/cloudappmanager/$root 2015-04-05
</AuthenticationErrorDetail>
</Error>
I tried Storage Explorer (right-click the container, Get SAS, click OK with defaults), the old Storage Explorer, and PowerShell:
PS C:\Users\virklba> $context = New-AzureStorageContext -StorageAccountName msuscoreaprod
cmdlet New-AzureStorageContext at command pipeline position 1
Supply values for the following parameters:
(Type !? for Help.)
StorageAccountKey: xxxxxxxxx
PS C:\Users\virklba> New-AzureStorageContainerSASToken -Name aadlogs -Context $context -FullUri -Permission rl
https://msuscoreaprod.blob.core.windows.net/aadlogs?sv=2015-04-05&sr=c&sig=xxxxxxxx&se=2016-05-18T19%3A47%3A56Z&sp=rl
All with the same result. Is anyone else seeing this behavior, or is it just me?
You are creating a SAS on the container, and it looks like you are trying to read the container in the browser. When I paste the container SAS into the browser, I get the same error you are getting.
The container SAS (with read permissions) gives you read access to the blobs in the container. So you need to append a blob name to the SAS before you paste it into the browser, in order to read a blob.
For example, this will not work:
https://myaccount.blob.core.windows.net/lotsofblobs?st=2016-05-18T22%3A49%3A00Z&se=2016-05-19T22%3A59%3A00Z&sp=rl&sv=2015-04-05&sr=c&sig=62WHwaZGI60ub1hYcQyKg1%2FE%2F1w9HUrOPGorzoWDLvE%3D
This does work, with myblob.txt appended to the base URL:
https://myaccount.blob.core.windows.net/lotsofblobs/myblob.txt?st=2016-05-18T22%3A49%3A00Z&se=2016-05-19T22%3A59%3A00Z&sp=rl&sv=2015-04-05&sr=c&sig=62WHwaZGI60ub1hYcQyKg1%2FE%2F1w9HUrOPGorzoWDLvE%3D
Please also see Gaurav Mantri's detailed explanation here: Azure Shared Access Signature - Signature did not match
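If you want a URL that works in the browser without appending a blob name by hand, you can also mint the SAS at the blob level; a sketch with the classic cmdlets, where the container and blob names are placeholders:

```powershell
$ctx = New-AzureStorageContext -StorageAccountName "msuscoreaprod" -StorageAccountKey "xxx"

# SAS scoped to a single blob; -FullUri returns a browser-ready URL
New-AzureStorageBlobSASToken -Container "aadlogs" -Blob "myblob.txt" `
    -Permission r -ExpiryTime (Get-Date).AddHours(1) -Context $ctx -FullUri
```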
To fix this, try connecting to the storage account first, and then to the blob.
