I have a requirement to get secrets from Key Vault. This should be done in a PowerShell script that is executed through an Azure DevOps pipeline (not as an inline script), with the script file saved in an Azure repo.
Get-AzureKeyVaultSecret -VaultName 'Contoso' -Name 'secret1'
This gets the current version of a specific secret, in this case 'secret1'. You can output it to a file by piping to:
| Out-File -FilePath .\secrets.txt
More information in the official Doc Page: https://learn.microsoft.com/en-us/powershell/module/azurerm.keyvault/get-azurekeyvaultsecret?view=azurermps-6.13.0
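Putting those pieces together, a minimal sketch of a complete script file for this scenario using the newer Az module ('Contoso', 'secret1', and the output path are placeholders; the pipeline's Azure PowerShell task signs in before the script runs):

```powershell
# Retrieve a secret from Key Vault inside an Azure PowerShell pipeline task.
# Vault and secret names are placeholders -- adjust to your environment.
param(
    [string]$VaultName = 'Contoso',
    [string]$SecretName = 'secret1'
)

# The Azure PowerShell task authenticates before this script runs,
# so no Connect-AzAccount call is needed here.
$secret = Get-AzKeyVaultSecret -VaultName $VaultName -Name $SecretName

# Convert the SecureString value to plain text
# (Az 5.3+ also supports -AsPlainText directly on Get-AzKeyVaultSecret).
$plain = [System.Net.NetworkCredential]::new('', $secret.SecretValue).Password

# Write it out -- note this stores the secret on disk in plain text.
$plain | Out-File -FilePath .\secrets.txt
```

Commit this file to the Azure repo and point the Azure PowerShell task's "Script Path" at it.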
As part of troubleshooting/diagnosing a permissions issue with the service principal that one of our runbooks is using, Microsoft support has asked me to run the same PowerShell code locally so we can capture logs. In order to do this, I need to authenticate exactly the same way as the runbook does, which means authenticating with the certificate that the AzureRunAsConnection is using.
After swapping out some cmdlets that only exist in Azure Automation for equivalent Azure RM commands (e.g. Get-AutomationConnection has to be replaced with Get-AzAutomationAccount and Get-AzAutomationConnection, and you have to switch to using FieldDefinitionValues), the authentication part of my script looks like this:
Set-StrictMode -Version Latest
$ErrorActionPreference = 'Stop'
$PSDefaultParameterValues['*:ErrorAction']='Stop'
'*** Authenticating with Azure'
$automationAccount = Get-AzAutomationAccount `
-Name "MY_AUTOMATION_ACCOUNT" `
-ResourceGroupName "MY_RESOURCE_GROUP"
$connection = $automationAccount | Get-AzAutomationConnection -Name "AzureRunAsConnection"
# Log-in to Az Account Management Graph API (for DNS updates)
Login-AzAccount -ServicePrincipal `
-Tenant $connection.FieldDefinitionValues.TenantID `
-ApplicationId $connection.FieldDefinitionValues.ApplicationID `
-CertificateThumbprint $connection.FieldDefinitionValues.CertificateThumbprint
When I run it, though, I get this error:
Login-AzAccount : No certificate was found in the certificate store with thumbprint
93FAB7F0BA11D08F8ABBAF5C587C77ECB058A8BB
It appears that I need to export the certificate that the AzureRunAsConnection is using so I can import it into my local machine. However, though I can renew the certificate that Azure Automation uses or upload my own, it doesn't appear that there is an easy way to get the current certificate. I know this is by design -- for security -- but for a case like this it's a pain.
I found this article from 2018 that describes how to export the cert while inside a hybrid worker, but the code didn't work for me in Azure Cloud Shell nor locally (Get-AutomationCertificate is undefined):
https://www.lunavi.com/blog/how-to-download-an-azure-automation-connection-certificate-locally
Some guides on the web suggest creating a blob storage account and then writing a script to export to it, but that seems like a lot of effort for something I just need once for this repro.
What's the fastest, easiest way to get this certificate for local debugging/repro?
In the end, I used the article from 2018 to craft a quick and dirty way to get the certificate in a consumable format out of the runbook.
I modified the script that's running in the runbook to add the following lines at the top:
"*** Exporting Run As Certificate for Debugging"
$cert = Get-AutomationCertificate -Name "AzureRunAsCertificate"
$certData = $cert.Export("pfx", 'MySuperSecurePassword')
throw ([Convert]::ToBase64String($certData))
This causes an exception that contains the contents of the certificate as its message. Then when I run this code via the runbook "Test pane", the contents of the certificate are dumped out in base64 format at the top of the output.
I use an exception just so that: 1) the rest of the runbook does not execute, and 2) I am guaranteed to see the message. I found that if I just did a Write-Host before throwing an exception, the output might not get flushed before the script stopped running.
I'm then able to copy the base64-encoded part that's in parentheses, paste it into a file, and decode that file with [System.Convert]::FromBase64String (writing the resulting bytes out with [System.IO.File]::WriteAllBytes) in PowerShell, or with base64 --decode in WSL, to get a PFX file. I can then open that PFX file and import it into my local user store.
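The decode-and-import step can be sketched like this (file names are placeholders; the password must match the one passed to the Export call in the runbook):

```powershell
# Decode the base64 dump into a PFX file ('cert.b64' / 'runas.pfx' are placeholder names).
$base64 = Get-Content .\cert.b64 -Raw
[System.IO.File]::WriteAllBytes("$PWD\runas.pfx", [System.Convert]::FromBase64String($base64))

# Import into the current user's personal store (Windows-only PKI module);
# the password is the one used in $cert.Export() above.
$pfxPassword = ConvertTo-SecureString 'MySuperSecurePassword' -AsPlainText -Force
Import-PfxCertificate -FilePath .\runas.pfx `
    -CertStoreLocation Cert:\CurrentUser\My `
    -Password $pfxPassword
```

After the import, the Login-AzAccount call with -CertificateThumbprint should find the certificate in the local store.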
I have to restore all blob files to the same key vault from which I backed them up. I backed up the files using the referenced website, then deleted all the secrets, and now restoring all of them is not working. I can restore individual secrets, but not all of them at once.
I am trying following script.
[string]$VaultName = 'NewVault'
Get-AzureKeyVaultSecret -VaultName $VaultName | ForEach-Object {
Restore-AzureKeyVaultSecret -VaultName $VaultName -InputFile ('C:\Backup1\backup_{0}.blob' -f $_."Name")
}
Reference
Azure Key Vault: Backup Secrets using PowerShell
If you want to restore all secrets in a folder to the key vault, you can use the script below.
[string]$VaultName = 'joykeyvault'
$files = Get-ChildItem C:\Backup1 -Filter Backup_*.blob -Recurse | % { $_.FullName }
foreach($file in $files){
Restore-AzureKeyVaultSecret -VaultName $VaultName -InputFile $file
}
Note: the newer Az module equivalent is Restore-AzKeyVaultSecret; in your case you are using the old AzureRM module, so just use Restore-AzureKeyVaultSecret.
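For reference, the same loop with the newer Az module would look like this (a sketch; 'joykeyvault' and C:\Backup1 are the placeholder names from above):

```powershell
[string]$VaultName = 'joykeyvault'

# Restore every backup blob in the folder using the Az module cmdlet.
$files = Get-ChildItem C:\Backup1 -Filter Backup_*.blob -Recurse | ForEach-Object { $_.FullName }
foreach ($file in $files) {
    Restore-AzKeyVaultSecret -VaultName $VaultName -InputFile $file
}
```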
One thing you can check is if the Azure Key Vault is soft-deleted
Upon deleting a key vault object, such as a key, the service will place the object in a deleted state, making it inaccessible to any retrieval operations. While in this state, the key vault object can only be listed, recovered, or forcefully/permanently deleted.
https://learn.microsoft.com/en-us/azure/key-vault/general/soft-delete-overview#key-vault-object-recovery
You can check this with this command from Azure Cloud Shell:
az keyvault list-deleted
If you have a soft-deleted key vault it will show up as a:
"type": "Microsoft.KeyVault/deletedVaults"
You can then recover the complete vault like this:
az keyvault recover --name <key-vault-name>
https://learn.microsoft.com/en-us/cli/azure/keyvault?view=azure-cli-latest
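If it is only the secrets (not the whole vault) that are soft-deleted, a sketch with the Az module would be (vault name is a placeholder):

```powershell
# List soft-deleted secrets in the vault, then recover each of them.
Get-AzKeyVaultSecret -VaultName 'NewVault' -InRemovedState |
    ForEach-Object { Undo-AzKeyVaultSecretRemoval -VaultName 'NewVault' -Name $_.Name }
```

Once the secrets are recovered (or permanently purged), the restore-from-backup loop should have something to work with.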
I am trying to set the secrets inside my Azure Key Vault using the Azure PowerShell task in Azure DevOps.
I use the following code:
Set-AzureKeyVaultSecret -VaultName $KeyvaultName -Name $SecretName -SecretValue $secretvalue
with the names and the value all set up inside variables; I also tried it without variables. The value is saved as a secure string using ConvertTo-SecureString.
But when I run this PowerShell code inside my Azure DevOps release pipeline, I keep getting the following error message:
Cannot retrieve access token for resource 'AzureKeyVaultServiceEndpointResourceId'. Please ensure that you have provided the appropriate access tokens when using access token login.
So I've made sure that the service principal and the build server have the right access on the key vault by adding them both to the access policies with the get, list, and set secrets permissions.
I've also added the following lines of code to make sure that the profile is loaded correctly:
########################################################################################
$azureRmProfile = [Microsoft.Azure.Commands.Common.Authentication.Abstractions.AzureRmProfileProvider]::Instance.Profile
$profileClient = New-Object -TypeName Microsoft.Azure.Commands.ResourceManager.Common.RMProfileClient -ArgumentList ($azureRmProfile)
$context = Get-AzureRmContext
$AzureToken = $profileClient.AcquireAccessToken($context.Tenant.Id)
Add-AzureRmAccount -AccessToken $AzureToken.AccessToken -AccountId $AzureToken.UserId
########################################################################################
I added this code at the beginning of the inline script and passed the resulting profile to the -DefaultProfile parameter.
I also enabled the option to enable the script to access the Oauth token.
Has anyone else tried to set a secret from the PowerShell task in Azure DevOps, or does anyone know why the PowerShell script can't get access to the key vault?
The Get-AzureRmContext command provided me with the right output. I even tried the Get-AzureRmKeyVault command to figure out whether the connection to the environment was already set up correctly, and that also didn't give any problems.
The following works for sure (I'm using this regularly):
Set-AzContext -SubscriptionId $SubscriptionId
## $SubscriptionId is a subscription ID where is the target KV
$Secretvalue = ConvertTo-SecureString $SecretValuePlainText -AsPlainText -Force
## $SecretValuePlainText is the secret to store
Set-AzKeyVaultSecret -VaultName $KeyVaultName -Name $SecretName -SecretValue $Secretvalue -ErrorVariable setSecretError -Expires $ExpirationDate -NotBefore $ActivationDate
## $SecretName, $ExpirationDate, $ActivationDate - obvious :)
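For completeness, the date variables can be built like this (example values, not required ones):

```powershell
# Example activation/expiration values for the Set-AzKeyVaultSecret call above.
$ActivationDate = (Get-Date).ToUniversalTime()             # secret usable immediately
$ExpirationDate = (Get-Date).ToUniversalTime().AddYears(2) # secret expires in two years
```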
Of course, if you refer to a variable defined in the release (rather than in the script or inline), use the $(variable_name) syntax.
The service principal/service connection we use for this is temporarily an Owner of the target subscription (or of the key vault, up to you).
I had the exact same issue.
Found that the problem was a missing access token, namely -KeyVaultAccessToken when you call Add-AzureRmAccount.
Found the solution here: https://github.com/Azure/azure-powershell/issues/4818#issuecomment-376155173
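A hypothetical sketch of what that looks like, building on the profile-loading snippet from the question ($KeyVaultToken is assumed to hold an access token acquired for the Key Vault resource, https://vault.azure.net, as described in the linked issue):

```powershell
# Assumption: $AzureToken is the ARM token from the question's snippet, and
# $KeyVaultToken is a token acquired for the https://vault.azure.net resource.
Add-AzureRmAccount -AccessToken $AzureToken.AccessToken `
                   -KeyVaultAccessToken $KeyVaultToken.AccessToken `
                   -AccountId $AzureToken.UserId
```

Without the -KeyVaultAccessToken argument, access-token logins can reach ARM but fail against the Key Vault data plane with exactly the 'AzureKeyVaultServiceEndpointResourceId' error shown above.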
I fixed my issue with the following. I had used a service connection based on a managed identity, and that needed a workaround to access the key vault, like @john mentioned. But the workaround was unnecessary: by creating a new service connection based on a service principal, it was no longer needed and the issue was fixed.
I have a PowerShell script that attempts to retrieve a secret stored in Azure Key Vault using this command:
$password = (Get-AzureKeyVaultSecret -vaultName $vaultName -name $secretName).SecretValueText
It works perfectly fine when I execute my PowerShell script locally. But when I try to do the same on Azure DevOps, it fails with the error below.
[error]Operation returned an invalid status code 'Forbidden'
I feel it isn't an access policy issue, as I am able to successfully perform read/write on my vault using the PowerShell script running locally.
I'm quite sure it is an access policy issue.
Go to your DevOps Project Settings - Pipelines - Service Connections and click on "Update Service Connection" (Use the full version of the dialog). There you can find the Subscription Id and Service Principal ID.
You then have to give explicit permissions to this SPN:
Login-AzureRmAccount -subscription <YourSubscriptionID>
$spn= Get-AzureRmADServicePrincipal -spn <YourSPN>
Set-AzureRmKeyVaultAccessPolicy -VaultName <YourVaultName> -ObjectId $spn.Id -PermissionsToSecrets get,list;
How do I remove datasets in bulk? I have many datasets, but I could not find an option in the UI to delete them all, nor could I find a PowerShell command.
While there is no such option to delete all datasets in the portal, you can do so with a PowerShell snippet:
Get-AzureRmDataFactory -ResourceGroupName <rgname> -Name <factory name> | Get-AzureRmDataFactoryDataset | Remove-AzureRmDataFactoryDataset
You may wish to append -Force to Remove-AzureRmDataFactoryDataset, which will stop the cmdlet from prompting you before deleting each dataset. Keep in mind that datasets cannot be deleted if they are being referenced by some existing pipeline.
I did not find such an option either, but I created a .ps1 script file with a Remove-AzureRmDataFactoryDataset statement (with the -Force parameter) for each dataset and then ran it via PowerShell. It did the job.
Microsoft has updated the PowerShell cmdlets for Azure Data Factory. To run the above command successfully now, you need to use the V2 cmdlets, as shown below:
Get-AzureRmDataFactoryV2 -ResourceGroupName <rgname> -Name <factory name> | Get-AzureRmDataFactoryV2Dataset | Remove-AzureRmDataFactoryV2Dataset -Force