How to connect to a storage account in an Azure Automation runbook

In an Azure Automation runbook I want to connect to a storage account and get the context without using the account key. I can connect with the storage account key, but I don't want to use the key.
FYI
$Context = New-AzStorageContext -StorageAccountName "cordus6abfsuat001" -UseConnectedAccount
echo $Context
ERROR is "Context cannot be null."
I am expecting to connect to the storage account without the storage account key.

You can use a system-assigned managed identity for your Azure Automation account.
Then, in your storage account, go to:
Access Control (IAM)
Add role assignment
There you can assign the Contributor role to your Automation account's identity.
Hope this helps!
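For reference, a minimal sketch of what the runbook could then look like once the role assignment has propagated. This is an assumption on my part, not from the answer above: for blob data-plane calls the identity typically also needs a data role such as Storage Blob Data Reader/Contributor, and the container name below is a placeholder.
# Minimal sketch: authenticate the runbook with its system-assigned managed identity
# and build a key-less storage context with -UseConnectedAccount.
Disable-AzContextAutosave -Scope Process   # do not inherit an AzContext in the runbook
$null = Connect-AzAccount -Identity        # sign in as the Automation account's managed identity
$Context = New-AzStorageContext -StorageAccountName "cordus6abfsuat001" -UseConnectedAccount
# Example data-plane call using the OAuth-based context ("<container-name>" is a placeholder):
Get-AzStorageBlob -Container "<container-name>" -Context $Context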

I am expecting to connect to the storage account without the storage account key.
Alternatively, you can use a connection string and get the context as below (I followed the Microsoft documentation):
Connect-AzAccount
$Context = New-AzStorageContext -ConnectionString "XX"
Write-Host $Context
XX is the connection string of the storage account.
Output:
You can also get it using a SAS token as below:
$Context = New-AzStorageContext -StorageAccountName "rithvayamo" -SasToken "sp=r&st=2022i4n3vHCuHye6PzkDLUbXTnQT2jeNphU1j0%3D"
Write-Host $Context
Output:

Related

Get-AzStorageFileContent: Can not find your azure storage credential

I am using an Automation account runbook to compare files within a storage account file share, and have been trying to use Get-AzStorageFileContent to download them so I can then compare them.
However, I get the error: "Get-AzStorageFileContent : Can not find your azure storage credential. Please set current storage account using "Set-AzSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable."
When I google "Set-AzSubscription" it doesn't appear to exist, but I am directed to Set-AzContext, which I have tried to use to set the context to the subscription the storage account is in. However, this produces either the same error when testing in PowerShell ISE, or the error "please provide a valid tenant or a valid subscription" in the runbook (even though I am using the correct IDs for both).
I have noticed that the storage account is in a different subscription to the runbook; could this be breaking it? The same script does allow me to save files to the storage account, so I'm not sure why it would break here.
I am authenticating with a managed identity if that's relevant.
My code to get the file looks like this:
try {
    Write-Output "get file"
    Set-AzContext -Subscription "--storage account subscription--" -Tenant "--Our tenant--"
    Get-AzStorageFileContent -ShareName "--storage account name--" -Path "--path of file--"
}
catch {
    Write-Output "couldn't get file"
    Write-Warning $_.Exception.Message
    break
}
Get-AzStorageFileContent : Can not find your azure storage credential. Please set current storage account using "Set-AzSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable:
I also got the same error when I tried in my environment.
This issue usually occurs if you do not create a storage context by specifying a storage account name and storage account key, which is required for storage account authentication.
I tried the below script in an Azure runbook, and it worked successfully as detailed below:
Connect-AzAccount
Set-AzContext -Subscription "<SubscriptionID>"
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
Get-AzStorageFile -ShareName <FileshareName> -Context $context
Output:
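If you also need to download the file contents (as in the original question), the same explicit context can be passed to Get-AzStorageFileContent. A small sketch, where the share, file path, and destination are placeholders:
# Sketch: download a file from the share using the context created above.
Get-AzStorageFileContent -ShareName "<FileshareName>" -Path "<path/of/file>" -Destination "<local-path>" -Context $context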

Azure APIM Backup throwing error "Invalid parameter: "This request is not authorized to perform this operation.Parameter name: backupContainerName"

I am using an Azure Automation runbook to back up APIM in the same region using a user-assigned managed identity. I have given the role assignment "Storage Account Contributor" to the user-assigned managed identity, and I have also whitelisted the IP address of the APIM control plane, i.e. "20.44.72.3" for East US 2. Having done all this, I still get the error:
Body:
{
    "error": {
        "code": "InvalidParameters",
        "message": "Invalid parameter: This request is not authorized to perform this operation.\r\nParameter name: backupContainerName (value: [apimbackup])",
        "details": null,
        "innerError": null
    }
}
PowerShell script:
$AzureContext = (Connect-AzAccount -Identity -AccountId XXXX-e9f9-XXXX-ad22-95f821a2c9bc).context
# set and store context
$AzureContext = Set-AzContext -SubscriptionName $AzureContext.Subscription -DefaultProfile $AzureContext
$storageKey = (Get-AzStorageAccountKey -ResourceGroupName "rg-nau2d-XXXX-01" -StorageAccountName "stornau2dXXXXXX")[0].Value
$storageContext = New-AzStorageContext -StorageAccountName "stornau2dXXXXXX" -StorageAccountKey $storageKey
$storageKey
$StorageContext
$resourceGroupName="rg-nau2d-XXXX-01";
$apiManagementName="apim-01";
$containerName="apimbackup";
$backupName= $apiManagementName +"blob1";
$clientId = "XXXX-e9f9-XXXX-ad22-95f821a2c9bc"
Backup-AzApiManagement -Debug -ResourceGroupName $resourceGroupName -Name $apiManagementName -StorageContext $storageContext -TargetContainerName $containerName -TargetBlobName $backupName -AccessType "UserAssignedManagedIdentity" -IdentityClientId $clientId -PassThru
Please note that the storage account you back up to can be in any Azure region except the region where the API Management service is located. For example, if the APIM service is in West US, the storage account should be in another region such as West US 2, and you have to open the API Management control plane IP of West US in the storage firewall. This is because requests to Azure Storage are not SNATed to a public IP from compute (here, the Azure API Management control plane) in the same Azure region; only cross-region storage requests are SNATed to a public IP.
Also, as you are using a user-assigned identity, make sure that identity is assigned the Storage Blob Data Contributor role from the Azure portal, scoped to the storage account you are using for the APIM backup/restore. After that, save the changes, refresh the portal, and try again.
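If you prefer to make that role assignment from PowerShell rather than the portal, a sketch along these lines should work (the identity's object id and the subscription id are placeholders; the resource group and storage account names are the redacted values from the question):
# Sketch: grant the user-assigned identity Storage Blob Data Contributor on the backup storage account.
$scope = "/subscriptions/<subscription-id>/resourceGroups/rg-nau2d-XXXX-01/providers/Microsoft.Storage/storageAccounts/stornau2dXXXXXX"
New-AzRoleAssignment -ObjectId "<identity-object-id>" -RoleDefinitionName "Storage Blob Data Contributor" -Scope $scope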
Note:
Backup is not possible in the Consumption tier but is available in the Premium, Standard, Basic, and Developer tiers of API Management.
Also make sure CORS is not enabled on the target blob storage account (see the sketch after these notes for a quick way to check).
Also check whether the Backup Contributor role is required in your case.
Disable AzContextAutosave so that the runbook does not inherit an AzContext before you create the context used with Backup-AzApiManagement:
Disable-AzContextAutosave -Scope Process
$AzureContext = (Connect-AzAccount .....).context
When creating the blob container that stores the backup, give it blob-level permission:
New-AzStorageContainer -Name "<apim name>" -Context $storageAccount.Context -Permission blob
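To check the CORS point mentioned in the notes above from PowerShell, a small sketch: Get-AzStorageCORSRule should list any rules configured on the blob service, and an empty result means CORS is not configured.
# Sketch: list CORS rules on the blob service of the backup storage account.
Get-AzStorageCORSRule -ServiceType Blob -Context $storageContext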
Please check the references below for more detailed information.
If the issue still seems to be the same, raise a support request from the Support + troubleshooting blade in the Azure portal.
References:
Implement disaster recovery using backup and restore in API Management - Azure API Management | Microsoft Docs
Backup and Restore in Azure API Management | Azure Blog (svenmalvik.com)

Error about permission with Powershell command Get-AzureStorageBlob in Azure Runbook

I'm trying to create a runbook in Azure that accesses a blob storage account and lists its contents. But I keep getting the following error:
The remote server returned an error: (403) Forbidden. HTTP Status Code: 403 - HTTP Error Message: This request is not authorized to perform this operation using this permission.
I checked the following:
Azure Portal -> Storage Account -> Networking -> Check Allow Access From (All Networks / Selected Networks)
It is set to all networks.
I checked the SAS. It's correct.
On the storage account and the container I assigned the Storage Blob Data Reader and Storage Blob Data Owner roles to the Automation account's managed identity under Access Control.
I created an access policy and set its rights to rdl, but I don't know how to reference it from within my PowerShell statement, or whether it makes any difference.
Who can help me? I've read just about all the articles on the Internet but can't find the answer.
It's the statement Get-AzureStorageBlob that fails.
This is the code in the runbook:
$storage = "opslag" #name of storage account
$blobcontainer = "contener" #name of container
$sas = "****"
Write-Output $storage
Write-Output $container
$context = New-AzureStorageContext -StorageAccountName $storage -
SasToken $sas
Write-Output $context
$blobs = Get-AzureStorageBlob -Container $blobcontainer -Context
$context
To test this in our local environment, we created a storage account and an Automation account with a PowerShell runbook.
We enabled a managed identity for the Automation account and granted that identity the Storage Blob Data Reader and Storage Blob Data Owner roles.
In the storage account, we have created an access policy with read, delete, list permissions to access the blob contents from PowerShell statements.
Here is the PowerShell Script that we have run in the Automation account Runbook:
We have used the same managed identity to authenticate to our azure account in the automation account.
Disable-AzContextAutosave -Scope Process # Ensures you do not inherit an AzContext in your runbook
$AzureContext = (Connect-AzAccount -Identity).context # Connect to Azure with system-assigned managed identity
$AzureContext = Set-AzContext -SubscriptionName $AzureContext.Subscription -DefaultProfile $AzureContext # set and store context
Import-Module -Name Az.Storage
$storage = "<strgName>" #name of storage account
$blobcontainer = "<containerName>" #name of container
$sas = "<SAStoken>" # SAS token generated for the container, allowing both the HTTP and HTTPS protocols
Write-Output $storage
Write-Output $blobcontainer
$context = New-AzStorageContext -StorageAccountName $storage -SasToken $sas
Write-Output $context
$blobs = Get-AzStorageBlob -Container $blobcontainer -Context $context
Write-Output $blobs
Here is the sample output for reference:

Why do I get the error 'The provided information does not map to an AD object id.' when executing New-AzRoleAssignment using a Service Principal?

Using PowerShell in an Azure DevOps pipeline, I am trying to assign the key vault's service principal the Storage Account Key Operator Service Role on a storage account.
Command Line
The command is run after I connected to Azure with the service principal:
$credentials = New-Object -TypeName System.Management.Automation.PSCredential($servicePrincipalApplicationId, $clientSecret)
Connect-AzAccount -ServicePrincipal -Credential $credentials -Tenant $tenantId
Here is the command line that I execute :
New-AzRoleAssignment -ApplicationId $keyVaultServicePrincipalId -ResourceGroupName $resourceGroupName -ResourceName $storageAccountName -ResourceType "Microsoft.Storage/storageAccounts" -RoleDefinitionName "Storage Account Key Operator Service Role"
Where:
$keyVaultServicePrincipalId is the pre-registered principal ID for Key Vault. Its value is cfa8b339-82a2-471a-a3c9-0fc0be7a4093.
$resourceGroupName is the name of the resource group in which the storage is located. Its value is accountsmanager-test-global-rg.
$storageAccountName is the name of my storage account. Its value is accountsmanagertest.
Service Principal
Here are the permissions of the service principal under which the command is run.
The command is run as a service principal that has the Owner role in the subscription.
The resource group created in that subscription is also owned by that service principal.
Question
When I run the command, I get the following error:
New-AzRoleAssignment: The provided information does not map to an AD object id.
Why do I get the error The provided information does not map to an AD object id. when executing the command New-AzRoleAssignment?
I can also reproduce this on my side; there are two issues.
1. In your command, the ResourceType should be Microsoft.Storage/storageAccounts, not Microsoft.Storage/storageAccount.
2. In the API permissions of the AD app related to the service principal used in the DevOps service connection, you need to add the application permission Directory.Read.All in Azure Active Directory Graph, not Microsoft Graph.
After the permission takes effect (which can take a while), the command will work fine.
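As a sketch, the same assignment can also be expressed with an explicit -Scope instead of the ResourceGroupName/ResourceName/ResourceType triplet; the subscription id below is a placeholder, and the other values come from the question:
# Sketch: equivalent role assignment using an explicit resource scope.
$scope = "/subscriptions/<subscription-id>/resourceGroups/accountsmanager-test-global-rg/providers/Microsoft.Storage/storageAccounts/accountsmanagertest"
New-AzRoleAssignment -ApplicationId $keyVaultServicePrincipalId -RoleDefinitionName "Storage Account Key Operator Service Role" -Scope $scope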

upload files to storage account without SAS

I need to upload files to a storage account without using SAS.
I created an app registration and gave it Contributor access to the storage account.
How can I upload files from PowerShell?
Do I run az login first and then azcopy? I tried that, but it asks me for a token.
Azure PowerShell, the Azure CLI, and AzCopy are three different things; you should not mix them together.
If you want to use PowerShell to upload a file with the service principal, then after you create the app registration, get the values for signing in and create a new application secret.
For your storage account, the Contributor role is enough for the script below, but note that Contributor does not have permission to access the blobs directly; it just lets you get the context of the storage account and then use that context to access the blobs. To access the blobs directly, you need the Storage Blob Data Owner/Contributor role, as mentioned in the comment.
Then use the script below (the Get-Credential in another reply is an interactive way; here is a non-interactive way, since service principals are usually used non-interactively for automation):
$azureApplicationId = "<Application-ID>"
$azureTenantId = "<Tenant-ID>"
$azurePassword = ConvertTo-SecureString "<Application-secret>" -AsPlainText -Force
$psCred = New-Object System.Management.Automation.PSCredential($azureApplicationId, $azurePassword)
Connect-AzAccount -Credential $psCred -TenantId $azureTenantId -ServicePrincipal
$context = (Get-AzStorageAccount -ResourceGroupName <group-name> -Name <storageaccount-name>).Context
Set-AzStorageBlobContent -Container <container-name> -File <localfile-path> -Blob <blob-name> -Context $context
You can also use a service principal or a certificate to log in with AzCopy, and then copy your files to the storage account. Please see this article for more information.
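A rough sketch of that flow from PowerShell, assuming sign-in with a client secret (all bracketed values are placeholders; see the AzCopy documentation for the authoritative flags):
# Sketch: log AzCopy in with a service principal, then copy a local file into a container.
# AzCopy reads the client secret from the AZCOPY_SPA_CLIENT_SECRET environment variable.
$env:AZCOPY_SPA_CLIENT_SECRET = "<application-secret>"
azcopy login --service-principal --application-id "<application-id>" --tenant-id "<tenant-id>"
azcopy copy "<localfile-path>" "https://<storageaccount-name>.blob.core.windows.net/<container-name>/"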
There are lots of ways to do this. Since you mentioned PowerShell, I'll use it in this example.
# The following assumes you have already created the App registration.
# Either through the portal, PS, whatever.
$creds = Get-Credential
# Username = the application ID
# Password = the service principal secret. You need to create this.
Connect-AzAccount `
-Credential $creds `
-Tenant $tenantId `
-ServicePrincipal
# You will need to get the storage account's context.
# There are a few ways to do this; I usually just get the storage account.
$context = (Get-AzStorageAccount -Name $saName -ResourceGroupName $rgName).Context
# You will need to give the App Registration permissions to upload the blob.
# If you don't assign permissions this cmdlet will fail.
Set-AzStorageBlobContent `
-Container $containerName `
-File $filePath `
-Blob $blobName `
-Context $context
