I have to restore all blob files to the same key vault from which I backed them up. I backed up the files using the referenced website, then deleted all the secrets, and now 'restore all' is not working. I can restore individual secrets, but not all of them at once.
I am trying the following script:
[string]$VaultName = 'NewVault'
Get-AzureKeyVaultSecret -VaultName $VaultName | ForEach-Object {
    Restore-AzureKeyVaultSecret -VaultName $VaultName -InputFile ('C:\Backup1\backup_{0}.blob' -f $_.Name)
}
Reference
Azure Key Vault: Backup Secrets using PowerShell
If you want to restore all secrets in a folder to a key vault, you could use the script below.
[string]$VaultName = 'joykeyvault'
$files = Get-ChildItem C:\Backup1 -Filter Backup_*.blob -Recurse | ForEach-Object { $_.FullName }
foreach ($file in $files) {
    Restore-AzureKeyVaultSecret -VaultName $VaultName -InputFile $file
}
Note: in the screenshot I use the new Az command Restore-AzKeyVaultSecret; in your case you are using the old AzureRM module, so just use Restore-AzureKeyVaultSecret.
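For completeness, an equivalent sketch with the newer Az module, assuming the same C:\Backup1 backup folder layout:
# Restore every backup blob in the folder using the Az cmdlet
$VaultName = 'joykeyvault'
Get-ChildItem C:\Backup1 -Filter Backup_*.blob -Recurse | ForEach-Object {
    Restore-AzKeyVaultSecret -VaultName $VaultName -InputFile $_.FullName
}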
One thing you can check is whether the Azure Key Vault is soft-deleted.
Upon deleting a key vault object, such as a key, the service will place the object in a deleted state, making it inaccessible to any retrieval operations. While in this state, the key vault object can only be listed, recovered, or forcefully/permanently deleted.
https://learn.microsoft.com/en-us/azure/key-vault/general/soft-delete-overview#key-vault-object-recovery
You can check this with this command from Azure Cloud Shell:
az keyvault list-deleted
If you have a soft-deleted key vault, it will show up with:
"type": "Microsoft.KeyVault/deletedVaults"
You can then recover the complete vault like this:
az keyvault recover --name <key-vault-name>
https://learn.microsoft.com/en-us/cli/azure/keyvault?view=azure-cli-latest
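If it turns out that only the secrets (not the vault itself) were soft-deleted, a possible sketch with the Az module is to list the removed secrets and recover them in place instead of restoring from the backup blobs; the vault name is taken from the question, everything else is an assumption about your environment:
# List secrets currently in the soft-deleted state
$VaultName = 'NewVault'
Get-AzKeyVaultSecret -VaultName $VaultName -InRemovedState

# Recover each soft-deleted secret back into the vault
Get-AzKeyVaultSecret -VaultName $VaultName -InRemovedState | ForEach-Object {
    Undo-AzKeyVaultSecretRemoval -VaultName $VaultName -Name $_.Name
}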
I am trying to automate the creation of certain Azure resources via an Azure PowerShell script that is triggered from an Azure DevOps release pipeline. I want to create a function app and automatically grant it read access to secrets in an already existing Key Vault. This Key Vault is in the same Azure subscription.
While I can create most resources following the documentation, there seems to be a lack of documentation regarding the creation of certain resources using Azure PowerShell (or I can't find it).
If I follow the sample from this link, I can accomplish it without a problem by using the UI in the Azure Portal, but I can't find any documentation on Microsoft Docs to do it using PowerShell.
Write-Host "Creating Function App..."
$fnApp = New-AzFunctionApp -Name $functionAppName `
-ResourceGroupName $emailFunctionRg `
-Location "$(AzureRegion)" `
-StorageAccount $storageName `
-Runtime dotnet `
-FunctionsVersion '3' `
-IdentityType SystemAssigned
Write-Host "Function App created!"
Write-Host "Assigning Key Vault access..."
$appId = Get-AzADServicePrincipal -DisplayName $functionAppName
Set-AzKeyVaultAccessPolicy -VaultName EmailSettings -ServicePrincipalName $appId -PermissionsToSecrets Get,List
Write-Host "Key Vault access granted!"
Running Set-AzKeyVaultAccessPolicy fails with "Insufficient privileges to complete the operation." But I am not sure if this is the right path to follow; it was just a guess based on the available functions in the documentation.
Any ideas?
Two potential issues to check out here:
Your app creation assigns the result to $fnApp; perhaps $fnApp, or as commented above, $fnApp.ApplicationId, is what you should be using for the -ServicePrincipalName parameter on the access policy grant.
You don't have privileges to assign RBAC roles. Go to the Key Vault, choose Access Control, then click the Role Assignments tab and verify that your user appears in the list as an Administrator, User Access Administrator, or Owner.
Edit: With respect to the RBAC privilege, since this is running in Azure PowerShell from Azure DevOps, you need to check the role assignment for the Service Connection's service principal: under Azure Active Directory in the Azure Portal, look up the principal used to create the service connection, and make sure THAT gets the correct role on the key vault.
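As a rough way to check the second point from PowerShell rather than the Portal, something like the sketch below should list who holds roles on the vault's scope; the vault name is taken from the question, the rest is an assumption:
# List the RBAC role assignments on the Key Vault's scope
$vault = Get-AzKeyVault -VaultName 'EmailSettings'
Get-AzRoleAssignment -Scope $vault.ResourceId |
    Select-Object DisplayName, RoleDefinitionName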
After a little trial and error, I came to the conclusion that I was not using the right parameter for the Set-AzKeyVaultAccessPolicy cmdlet.
The following script will work (if the service principal running it has the appropriate role, as WaitingForGuacamole mentioned in their answer):
Write-Host "Creating Function App..."
$fnApp = New-AzFunctionApp -Name <FnAppName> `
-ResourceGroupName <ResourceGroupName> `
-Location <AzureRegion> `
-StorageAccount <StorageAccount> `
-Runtime dotnet `
-FunctionsVersion '3' `
-IdentityType SystemAssigned
Write-Host "Function App created!"
Write-Host "Assigning Key Vault access..."
Set-AzKeyVaultAccessPolicy -VaultName <NameOfTheKeyVault> -ObjectId (Get-AzADServicePrincipal -DisplayName <FnAppName>).Id -PermissionsToSecrets <Get, List, etc...>
Write-Host "Key Vault access granted!"
I have 2 different subscriptions (Dev & Prod), and each subscription uses a separate key vault: az-kv-dev for Dev and az-kv-prod for Prod.
Now, we want to read the secrets from the Dev key vault and write them all to the Prod key vault using an Azure DevOps release pipeline. Please note, we do not want to hard-code the passwords inside DevOps.
Is there any way to do that?
I'm not familiar with DevOps, but it seems you could move the key vault (az-kv-dev) to the Prod subscription first, then copy all secrets to az-kv-prod.
Moving Key Vault to a new subscription:
If the two subscriptions are in the same tenant, you can do the move right in the portal: navigate to your key vault -> Overview -> "Move" button. If you move the key vault to a subscription in a new tenant, you can use PowerShell.
Select-AzSubscription -SubscriptionId <your-subscriptionId>
$vaultResourceId = (Get-AzKeyVault -VaultName myvault).ResourceId
$vault = Get-AzResource -ResourceId $vaultResourceId -ExpandProperties
# Point the vault at the new tenant and clear the old access policies
$vault.Properties.TenantId = (Get-AzContext).Tenant.TenantId
$vault.Properties.AccessPolicies = @()
Set-AzResource -ResourceId $vaultResourceId -Properties $vault.Properties
# Sign in again against the new tenant
Clear-AzContext
Connect-AzAccount
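Note that the script above empties AccessPolicies, so after reconnecting you will need to grant yourself access again before you can read the secrets. A minimal sketch, assuming you want to grant the signed-in account:
# Re-grant the signed-in account access to secrets after the tenant change
$upn = (Get-AzContext).Account.Id
Set-AzKeyVaultAccessPolicy -VaultName myvault -UserPrincipalName $upn -PermissionsToSecrets Get, List, Set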
Copy all secrets from new-az-kv-dev to az-kv-prod:
Param(
    [Parameter(Mandatory)]
    [string]$sourceVaultName,
    [Parameter(Mandatory)]
    [string]$destVaultName
)

$secretNames = (Get-AzKeyVaultSecret -VaultName $sourceVaultName).Name
$secretNames.ForEach{
    Set-AzKeyVaultSecret -VaultName $destVaultName -Name $_ `
        -SecretValue (Get-AzKeyVaultSecret -VaultName $sourceVaultName -Name $_).SecretValue
}
For more details about moving to another subscription, see here.
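If moving the vault is not an option, an alternative sketch is to copy the secrets directly by switching subscription context inside the same pipeline run. This assumes the pipeline's service connection has access to both subscriptions and both vaults; the subscription IDs below are placeholders:
# Read everything from the Dev vault
Set-AzContext -Subscription '<dev-subscription-id>'
$secrets = (Get-AzKeyVaultSecret -VaultName 'az-kv-dev').Name | ForEach-Object {
    [pscustomobject]@{
        Name  = $_
        Value = (Get-AzKeyVaultSecret -VaultName 'az-kv-dev' -Name $_).SecretValue
    }
}

# Write everything to the Prod vault
Set-AzContext -Subscription '<prod-subscription-id>'
foreach ($s in $secrets) {
    Set-AzKeyVaultSecret -VaultName 'az-kv-prod' -Name $s.Name -SecretValue $s.Value
}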
I have a requirement to get the secrets from a key vault. This should be done in a PowerShell script that is executed through an Azure DevOps pipeline (not an inline script), with the script file saved in an Azure repo.
Get-AzureKeyVaultSecret -VaultName 'Contoso' -Name 'secret1'
This gets the current version of a specific secret, in this case 'secret1'. You can output it to a file using:
| Out-File -FilePath .\secrets.txt
More information in the official Doc Page: https://learn.microsoft.com/en-us/powershell/module/azurerm.keyvault/get-azurekeyvaultsecret?view=azurermps-6.13.0
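If what you actually need is the secret value itself (rather than the object summary that Out-File would write), one hedged approach is to convert the SecureString before writing it out; bear in mind that writing plain-text secrets to a file in a pipeline is generally something to avoid:
$secret = Get-AzureKeyVaultSecret -VaultName 'Contoso' -Name 'secret1'
# Convert the SecureString value to plain text, write it, then free the buffer
$bstr = [Runtime.InteropServices.Marshal]::SecureStringToBSTR($secret.SecretValue)
try {
    [Runtime.InteropServices.Marshal]::PtrToStringBSTR($bstr) | Out-File -FilePath .\secrets.txt
}
finally {
    [Runtime.InteropServices.Marshal]::ZeroFreeBSTR($bstr)
}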
I am trying to set the secrets inside my Azure Key Vault using the Azure PowerShell task in Azure DevOps.
I use the following code:
Set-AzureKeyVaultSecret -VaultName $KeyvaultName -Name $SecretName -SecretValue $secretvalue
The names and the value are all set up in variables; I also tried using the command without variables.
The value is saved as a secure string using ConvertTo-SecureString.
But when I run this powershell code inside my Azure DevOps Release pipeline I keep getting following Error message:
Cannot retrieve access token for resource 'AzureKeyVaultServiceEndpointResourceId'. Please ensure that you have provided the appropriate access tokens when using access token login.
So I've made sure that the service principal and the build server have the right access to the key vault by adding them both to the access policies with the Get, List, and Set secrets permissions.
I've also added the following lines of code to make sure that the profile is loaded correctly:
########################################################################################
$azureRmProfile = [Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile
$profileClient = New-Object -TypeName Microsoft.Azure.Commands.ResourceManager.Common.RMProfileClient -ArgumentList ($azureRmProfile)
$context = Get-AzureRmContext
$AzureToken = $profileClient.AcquireAccessToken($context.Tenant.Id)
Add-AzureRmAccount -AccessToken $AzureToken.AccessToken -AccountId $AzureToken.UserId
########################################################################################
I added this code at the beginning of the inline script and passed the resulting profile to the -DefaultProfile parameter of the command.
I also enabled the option that allows the script to access the OAuth token.
Has anyone else tried to set a secret from the PowerShell task in Azure DevOps, or does anyone know why the PowerShell script can't get access to the key vault?
The Get-AzureRmContext command provided me with the right output, and I even tried the Get-AzureRmKeyVault command to figure out whether the connection to the environment was already set up correctly. That also didn't give any problems.
The script below works for sure (I use it regularly):
Set-AzContext -SubscriptionId $SubscriptionId
## $SubscriptionId is the subscription ID of the target KV
$Secretvalue = ConvertTo-SecureString $SecretValuePlainText -AsPlainText -Force
## $SecretValuePlainText is the secret to store
Set-AzKeyVaultSecret -VaultName $KeyVaultName -Name $SecretName -SecretValue $Secretvalue -ErrorVariable setSecretError -Expires $ExpirationDate -NotBefore $ActivationDate
## $SecretName, $ExpirationDate, $ActivationDate - obvious :)
Of course, if you refer to a variable that comes not from the script or inline code but from the release pipeline, then use $(variable_name).
The Service Principal/Service Connection we use for this is temporarily an Owner of the target subscription (or of the key vault, up to you).
I had the exact same issue.
Found that the problem was a missing access token.
Namely -KeyVaultAccessToken when you call Add-AzureRmAccount.
Found the solution here: https://github.com/Azure/azure-powershell/issues/4818#issuecomment-376155173
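In other words, the login from the question's snippet needs a second token acquired for the Key Vault resource (https://vault.azure.net); how you obtain that token depends on your environment, so treat this as a sketch only:
# $AzureToken comes from the question's snippet (ARM token);
# $kvToken is assumed to be an access token for https://vault.azure.net
Add-AzureRmAccount -AccessToken $AzureToken.AccessToken `
                   -KeyVaultAccessToken $kvToken `
                   -AccountId $AzureToken.UserId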
I fixed my issue with the following.
I was using a service connection based on a managed identity, and this needed some workaround to access the key vault, like #john mentioned. But the workaround turned out to be unnecessary: by creating a new service connection based on a service principal, the workaround was no longer needed and the issue was fixed.
We use Azure Backup and set our backup vaults to use GRS. We want to use LRS instead. It is understood that this cannot be changed once machines have been added to the vault, and we need to start from scratch. Two questions:
Do I need to remove the current vault first before I set up a new vault for that same server?
Can the current backups be transferred to the new vault?
Changing a Recovery Services vault's storage replication type can be achieved via the Portal or PowerShell. Unfortunately, if the vault already contains one or more protected instances, this option is greyed-out in the Portal, and whilst the cmdlet successfully executes, it doesn't change the underlying value.
Because of this, and because the default value is GeoRedundant, this must be set before any items have been protected.
To set the storage to Locally Redundant via the Portal:
Create/Open the Recovery Services Vault
Scroll-down and select Backup Infrastructure
Select Backup Configuration
Set Storage replication type to Locally-redundant
To achieve the same via PowerShell:
$RG = 'testResourceGroup'
$VaultName = 'testVault'
$Location = 'Central US'
$vault = Get-AzureRmRecoveryServicesVault -ResourceGroupName $RG -Name $VaultName
If (-not $vault) {
    $vault = New-AzureRmRecoveryServicesVault -ResourceGroupName $RG -Location $Location -Name $VaultName
}
Set-AzureRmRecoveryServicesBackupProperties -Vault $vault -BackupStorageRedundancy LocallyRedundant
With regard to removing existing vaults and transferring existing backup points:
The existing vault does not need to be deleted; however, any protected items will need to be removed from it before they can be added to a new vault. It is not sufficient to simply stop backup on the protected item: all the restore points must also be deleted before the item can be added to the new vault.
I cannot find any documentation, or any facility in the Portal or PowerShell, that would allow the migration of existing protected items and/or restore points.
The only way I've been able to change from Geo-Redundant Storage (GRS) to Locally Redundant Storage (LRS) is to create a new empty vault in the old portal (https://manage.windowsazure.com).
In the old portal you can change storage type in "Configuration".
I expect you will also be able to do it with PowerShell, but I haven't tried that.
You can register your server with only one vault. In order to register your server with the new vault, you need to use the new vault credentials downloaded from manage.windowsazure.com.
You can have multiple vaults. If you do not use your current vault in the future, it will stay there. You have to pay for each vault. So, if you don't need it in the future, it may be better to remove it completely.
There is a comprehensive documentation here:
https://azure.microsoft.com/en-us/documentation/services/backup/