Create SAS token for Azure Data Lake directory in PowerShell

I have a storage account in Azure which hosts a data lake. I want to authorize a specific directory with a SAS token, and I was able to configure this by clicking through the portal.
First of all, I created a stored access policy, let's call it "external1". The policy does not define permissions or an expiry date; it only exists so that the SAS token can be revoked before it expires.
After that, I navigated in the container "axexternal" to the directory "/external1/central" and generated a SAS token, specifying the stored access policy, permissions and expiration date.
These steps worked as expected.
I need to re-create those SAS tokens automatically. I chose to use an Automation Account (authorized via its managed identity) and a PowerShell script to execute the recreation. In detail, I regenerate the storage account key first and then recreate the SAS token. Since regenerating the storage account key worked like a charm, I will focus on the code for the SAS token recreation.
Unfortunately, the documentation for SAS tokens is technically available but poor. I am not sure which of the cmdlets I have to use to get the same result as in the Azure portal.
Is it New-AzStorageAccountKey, New-AzStorageContainerSASToken, or New-AzStorageBlobSASToken?
None of the possible combinations of parameters in the documentation seems to fit my needs.
I need to pass these parameters to the appropriate cmdlet:
Which storage access key to use for encryption
Stored Access Policy to use
Expiration Date
Permissions
Moreover, I do not understand the purpose of the -Context parameter in these cmdlets. Is the context used to connect to the storage account and execute the creation of the SAS token, or is it just a way of passing parameters to the cmdlet?
I tried many variations to achieve this goal, but I failed. Can anybody give me some hints, please?
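For completeness, the key-regeneration step mentioned above can be sketched like this (a minimal sketch; the resource group variable and the key name "key1" are assumptions, not from the original post):

```powershell
# Sketch: regenerate the storage account key, then build a context from it.
# $resourceGroup and the key name "key1" are assumptions.
$resourceGroup  = "my-resource-group"
$storageAccount = "mystorageaccount"

# Regenerate key1 (this invalidates the old key1 and any SAS signed with it)
New-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount -KeyName key1

# Fetch the freshly generated key and create a context for signing new SAS tokens
$key = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroup -Name $storageAccount)[0]
$ctx = New-AzStorageContext -StorageAccountName $storageAccount -StorageAccountKey $key.Value
```

The context produced by New-AzStorageContext bundles the account name and credential; the SAS cmdlets then use it to sign the token.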
Here is some code I tried:
$ctx = New-AzStorageContext `
-StorageAccountName $storageAccount `
-StorageAccountKey $key.Value `
-Protocol "Https"
$uri = New-AzStorageBlobSASToken `
-Context $ctx `
-Container $container `
-Blob "/external1/central" `
-Policy $policy `
-StartTime (Get-Date).AddDays(-1) `
-ExpiryTime (Get-Date).AddDays(370) `
-FullUri
Update
I realize that my post was perhaps too unspecific, so let me supplement it with a concrete question.
Given
A data lake container named "axexternal"
A directory "/external1/central"
A stored access policy "external1" which does not define permissions or expiry date
Wanted
A PowerShell script which creates a SAS token for the specific directory "/external1/central" in the container "axexternal"
SAS signed by account key
SAS uses stored access policy "external1"
SAS defines permissions
SAS defines expiration date
Thanks for your help, I really appreciate it!

I tried this in my environment and successfully created an Azure SAS token with a policy.
Initially, I created an access policy in the portal.
Then I executed the command below, and it created a SAS token with a full URL successfully.
Command:
$accountName = "venkat123"
$accountKey = "<your storage account key>"
$containerName = "docker"
$blob = "directory1/demo1mbimage.jpg"
$policy = "external"
$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
New-AzStorageBlobSASToken `
-Context $ctx `
-Container $containerName `
-Blob $blob `
-Policy $policy `
-FullUri
The console output showed the SAS URL, and opening that URL in a browser worked perfectly.
Reference:
New-AzStorageBlobSASToken (Az.Storage) | Microsoft Learn

After investigation, I am now sure that it is currently not possible to get exactly the same outcome with PowerShell as you can in the Azure portal.
The correct cmdlet for my question is New-AzDataLakeGen2SasToken. It is the only way to grant permissions on a whole directory of a data lake.
Unfortunately, this cmdlet has no parameter to accept a stored access policy. A data lake exposes both APIs, Data Lake and Blob. As a consequence, the Blob storage cmdlet New-AzStorageBlobSASToken works as well, but I did not find a way to make it work on a directory. This seems consistent, since for Blob storage a directory is just part of a blob name.
Even if permission on a single blob were sufficient, it would not be possible to use a stored access policy that does not define permissions.
So I ended up using New-AzDataLakeGen2SasToken without a stored access policy. My plan for the case that a SAS token gets exploited is to rename the base folder and create new SAS tokens. This is sufficient for my use case.
I use this kind of code now:
$container = "ext"
$centralPath = "external1/central"
$storageContext = New-AzStorageContext `
-StorageAccountName $storageAccount `
-StorageAccountKey $key.Value
New-AzDataLakeGen2SasToken `
-Context $storageContext `
-FileSystem $container `
-Path $centralPath `
-Permission "rdl" `
-StartTime (Get-Date).AddDays(-1).Date `
-ExpiryTime (Get-Date).AddDays(3).Date `
-Protocol Https `
-FullUri
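The rename-based revocation described above could be sketched with Move-AzDataLakeGen2Item (a sketch under assumptions; the destination name "central-v2" is a made-up example):

```powershell
# Sketch: invalidate outstanding directory SAS tokens by renaming the base folder.
# Existing SAS URLs point at the old path and stop resolving after the move.
Move-AzDataLakeGen2Item `
    -Context $storageContext `
    -FileSystem $container `
    -Path "external1/central" `
    -DestFileSystem $container `
    -DestItemPath "external1/central-v2"
```

After the rename, new SAS tokens are generated for the new path and handed out again.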

Related

Get-AzStorageFileContent: Can not find your azure storage credential

I am using an Automation Account runbook to compare files within a storage account file share, and I have been trying to use Get-AzStorageFileContent to download them so I can compare them.
However, I get the error: "Get-AzStorageFileContent : Can not find your azure storage credential. Please set current storage account using "Set-AzSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable."
When I google "Set-AzSubscription", it doesn't appear to exist, but I am directed to Set-AzContext, which I have tried to use to set the context to the subscription the storage account is in. This produces either the same error when testing in PowerShell ISE, or the error "please provide a valid tenant or a valid subscription" in the runbook (even though I am using the correct IDs for both).
I have noticed that the storage account is in a different subscription than the runbook; could this be breaking it? The same script does allow me to save files to the storage account, so I'm not sure why it would break here.
I am authenticating with a managed identity, if that's relevant.
My code to get the file looks like this:
try {
    Write-Output "get file"
    Set-AzContext -Subscription "--storage account subscription--" -Tenant "--Our tenant--"
    Get-AzStorageFileContent -ShareName "--storage account name--" -Path "--path of file--"
}
catch {
    Write-Output "couldn't get file"
    Write-Warning $_.Exception.Message
    break
}
Get-AzStorageFileContent : Can not find your azure storage credential. Please set current storage account using "Set-AzSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable:
I also got the same error when I tried in my environment.
This issue usually occurs when you do not create a storage context specifying the storage account name and a storage account key, which is required for storage account authentication.
I tried the script below in an Azure runbook, and it worked successfully:
Connect-AzAccount
Set-AzContext -Subscription "<SubscriptionID>"
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
Get-AzStorageFile -ShareName "<FileshareName>" -Context $context
Output:
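Applied to the asker's original cmdlet, the same fix would look like this (a sketch; the placeholders are the same as above, and the file path is a made-up example):

```powershell
# Sketch: pass an explicit storage context to Get-AzStorageFileContent.
# Without -Context the cmdlet cannot find any storage credential, which
# produces exactly the error quoted in the question.
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
Get-AzStorageFileContent -ShareName "<FileshareName>" -Path "<path/of/file>" -Context $context
```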

Error about permission with Powershell command Get-AzureStorageBlob in Azure Runbook

I'm trying to create a runbook in Azure that accesses a blob storage and list the contents. But I keep getting the following error:
The remote server returned an error: (403) Forbidden. HTTP Status Code: 403 - HTTP Error Message: This request is not authorized to perform this operation using this permission.
I checked the following:
Azure Portal -> Storage Account -> Networking -> Check Allow Access From (All Networks / Selected Networks)
It is set to all networks.
I checked the SAS. It's correct.
On the storage account and the container I set the Access Control to Storage Blob Data Reader and Storage Blob Data Owner for the Automation Account's managed identity.
I created an access policy and set its rights to rdl, but I don't know how to reference it from within my PowerShell statement, nor whether it makes any difference.
Who can help me? I've read just about all the articles on the Internet but can't find the answer.
It's the statement Get-AzureStorageBlob that fails.
This is the code in the runbook:
$storage = "opslag" # name of storage account
$blobcontainer = "contener" # name of container
$sas = "****"
Write-Output $storage
Write-Output $blobcontainer
$context = New-AzureStorageContext -StorageAccountName $storage -SasToken $sas
Write-Output $context
$blobs = Get-AzureStorageBlob -Container $blobcontainer -Context $context
To test this in our local environment, we created a storage account and an Automation Account with a PowerShell runbook.
We enabled a managed identity for the Automation Account and granted that identity the Storage Blob Data Reader and Storage Blob Data Owner roles.
In the storage account, we created an access policy with read, delete and list permissions to access the blob contents from PowerShell statements.
Here is the PowerShell script that we ran in the Automation Account runbook, using the same managed identity to authenticate to Azure:
Disable-AzContextAutosave -Scope Process # Ensures you do not inherit an AzContext in your runbook
$AzureContext = (Connect-AzAccount -Identity).context # Connect to Azure with system-assigned managed identity
$AzureContext = Set-AzContext -SubscriptionName $AzureContext.Subscription -DefaultProfile $AzureContext # set and store context
Import-module -name Az.Storage
$storage = "<strgName>" #name of storage account
$blobcontainer = "<containerName>" #name of container
$sas = "<SAStoken>" # Generated SAS token for the container with allowing HTTP & HTTPS protocol.
Write-Output $storage
Write-Output $blobcontainer
$context = New-AzStorageContext -StorageAccountName $storage -SasToken $sas
Write-Output $context
$blobs = Get-AzStorageBlob -Container $blobcontainer -Context $context
Write-Output $blobs
Here is the sample output for reference:

Unable to upload to Azure BLOB storage via Az Module or Az/CLI

I used to have a quite ordinary piece of code to upload files to Azure Blob storage. I'll post just the most important lines:
$context = New-AzStorageContext $StorageName -StorageAccountKey $Key -ErrorAction SilentlyContinue
Set-AzStorageBlobContent -Context $context -Container 'spfx' `
-File $path -Blob "$Package/v$PackageVersion/$([System.IO.Path]::GetFileName($path))" `
-ServerTimeoutPerRequest 1800 -ClientTimeoutPerRequest 1800 -ConcurrentTaskCount 1 -Force
It hangs for a long time, then I get this exception:
The app registration used to log in has full access to the resource group which hosts the storage account. The container is configured to allow anonymous read only. As you can see, for write access I use a secret.
I tried to use the Azure CLI instead of the Az module, but I get the same error message.
Using the portal UI I am able to add files to the blob storage.
If there's literally anything I can try, please let me know. Tell me also if you need more info to help me.
As always, any help will be really appreciated!
Giacomo S. S.
In the second error, you've got a backslash in the path: temp%5Cout.json
Check the script to see where it's coming from

upload files to storage account without SAS

I need to upload files to a storage account without using SAS.
I created an app registration and gave it Contributor access to the storage account.
How can I upload files from PowerShell?
First az login and then azcopy? I tried this way, but it asks me for a token.
Azure PowerShell, the Azure CLI and AzCopy are three different things; you should not mix them together.
If you want to use PowerShell to upload a file with the service principal, then after you create the app registration, get the values for signing in and create a new application secret.
On your storage account, the Contributor role is enough to get the context of the storage account, but note that Contributor does not grant permission to access the blobs directly; for direct blob access you need Storage Blob Data Owner/Contributor, as mentioned in the comment.
Then use the script below (the Get-Credential in another reply is an interactive way; here is a non-interactive way, since service principals are usually used non-interactively for automation):
$azureApplicationId = "<Application-ID>"
$azureTenantId = "<Tenant-ID>"
$azurePassword = ConvertTo-SecureString "<Application-secret>" -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($azureApplicationId, $azurePassword)
Connect-AzAccount -Credential $psCred -TenantId $azureTenantId -ServicePrincipal
$context = (Get-AzStorageAccount -ResourceGroupName <group-name> -Name <storageaccount-name>).Context
Set-AzStorageBlobContent -Container <container-name> -File <localfile-path> -Blob <blob-name> -Context $context
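Granting the data-plane role mentioned above can be sketched with New-AzRoleAssignment (a sketch; the scope string is a placeholder pattern, not from the original answer):

```powershell
# Sketch: grant the service principal data-plane access on the storage account.
# Without a "Storage Blob Data ..." role, Set-AzStorageBlobContent fails even
# though Contributor lets you obtain the storage context.
New-AzRoleAssignment `
    -ApplicationId "<Application-ID>" `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<group-name>/providers/Microsoft.Storage/storageAccounts/<storageaccount-name>"
```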
You can use a Service Principal or a certificate to login using azcopy, and then copy your files to the storage account. Please reference this article for further information.
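A non-interactive AzCopy login with a service principal could be sketched like this (the file and account names are placeholders):

```shell
# Sketch: authenticate azcopy with a service principal, then upload.
# The client secret is read from the AZCOPY_SPA_CLIENT_SECRET environment variable.
export AZCOPY_SPA_CLIENT_SECRET="<Application-secret>"
azcopy login --service-principal \
  --application-id "<Application-ID>" \
  --tenant-id "<Tenant-ID>"

# Upload a local file to the container
azcopy copy "./localfile.txt" \
  "https://<storageaccount-name>.blob.core.windows.net/<container-name>/localfile.txt"
```

The same Storage Blob Data role requirement applies here: the service principal needs data-plane access on the target container or account.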
There are lots of ways to do this. Since you mentioned PowerShell, I'll use it in this example.
# The following assumes you have already created the app registration,
# either through the portal, PS, whatever.
$creds = Get-Credential
# Username = the application ID
# Password = the service principal secret. You need to create this.
Connect-AzAccount `
-Credential $creds `
-Tenant $tenantId `
-ServicePrincipal
# you will need to get the storage accounts context.
# There are a few ways to do this I usually just get the storage account
$context = (Get-AzStorageAccount -Name $saName -ResourceGroupName $rgName).Context
# You will need to give the App Registration permissions to upload the blob.
# If you don't assign permissions this cmdlet will fail.
Set-AzStorageBlobContent `
-Container $containerName `
-File $filePath `
-Blob $blobName `
-Context $context

Create a SAS token for Specific blob and not other blob in same storage account

Regardless of the type of storage account in Azure: is there any way to create a SAS token in PowerShell or the portal (doesn't seem like it) that grants exclusive access to one blob and not the rest of the blobs in the same storage account?
The command below seems to be available, but maybe for a different storage account type and not necessarily for a single blob:
New-AzStorageBlobSASToken
I did create a SAS token with the PowerShell script below, but this token is for the whole blob service:
$SA = Get-AzStorageAccount | Select-Object StorageAccountName,ResourceGroupName,Location,SkuName,CreationTime | Out-GridView -PassThru
$key = Get-AzStorageAccountKey -ResourceGroupName $SA.ResourceGroupName -Name $SA.StorageAccountName
$context = New-AzStorageContext -StorageAccountName $SA.StorageAccountName -StorageAccountKey $key.value[0]
$sas = New-AzStorageAccountSASToken -Service Blob, File, Table, Queue -ResourceType Service, Container, Object -Permission "racwdlup" -Context $context
Write-Output $sas
New-AzStorageBlobSASToken does exactly that. It creates a SAS token for one specific blob (think of blob = file in this case):
https://learn.microsoft.com/en-us/powershell/module/az.storage/new-azstorageblobsastoken?view=azps-2.7.0#examples
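Building on the asker's own script, a minimal sketch of a single-blob SAS could look like this (the container name, blob name and 7-day expiry are made-up examples):

```powershell
# Sketch: account-key-signed SAS scoped to one specific blob only.
# $SA comes from the asker's Get-AzStorageAccount | Out-GridView selection.
$key = Get-AzStorageAccountKey -ResourceGroupName $SA.ResourceGroupName -Name $SA.StorageAccountName
$context = New-AzStorageContext -StorageAccountName $SA.StorageAccountName -StorageAccountKey $key[0].Value
New-AzStorageBlobSASToken `
    -Context $context `
    -Container "mycontainer" `
    -Blob "folder/report.csv" `
    -Permission "r" `
    -ExpiryTime (Get-Date).AddDays(7) `
    -FullUri
```

This token authorizes read access to that one blob only; other blobs in the container or account are not covered.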
