In Azure RM PowerShell, the statement
Remove-AzureRmKeyVault -InputObject $sdvobjKeyVault -Force
always pops up a prompt asking whether I really want to execute the action, so -Force seems to be ignored. How can I delete a Key Vault from a resource group without user interaction via Azure PowerShell?
I'm not seeing the same behavior here. Adding -Force suppresses the confirmation as expected.
Any reason you are using the old AzureRm commands? You should start moving to the Az commands. Here is a good reference for Key Vaults, including how to manage soft delete for vaults.
MS Docs Reference: https://learn.microsoft.com/en-us/azure/key-vault/key-vault-soft-delete-powershell
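With the Az module, the equivalent non-interactive delete looks roughly like the sketch below; the vault name, resource group, and location are placeholders, and the second command only applies if soft delete is enabled on the vault:
# Delete the vault without a confirmation prompt
Remove-AzKeyVault -VaultName "my-vault" -ResourceGroupName "my-rg" -Force
# If the vault is soft-delete enabled, optionally purge the deleted vault as well
Remove-AzKeyVault -VaultName "my-vault" -Location "westeurope" -InRemovedState -Force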
Related
As per my understanding, Key Vault names are globally unique, and the same goes for secrets, so
I won't be able to reuse a Key Vault that exists in the soft-deleted state.
I have multiple Key Vaults, and after deleting them they move into the soft-deleted state.
I want the soft delete option to be enabled automatically,
so that if someone accidentally deletes my Key Vault, I can grant access permissions to recover the secrets.
I can't go to the portal and enable the soft delete option for each Key Vault manually every time; I want this automated.
How can we write a runbook using PowerShell to automate the soft delete option for all Key Vaults?
I have searched the net and found this Microsoft document, but did not find anything related to automating this.
Can anyone help me with this? I would really appreciate it.
Thanks in advance, and have a good day.
I tried creating a runbook using PowerShell for Key Vault in my environment and got the results below.
I created an Automation account to use with the runbook.
Created the runbook and wrote the PowerShell script for soft delete:
# Soft delete option for a single vault
Connect-AzAccount
Get-AzKeyVault -VaultName "XXXXXX"
$vaultId = (Get-AzRecoveryServicesVault -Name "recovery-services" -ResourceGroupName "XXXXX").ID
(Get-AzRecoveryServicesVaultProperty -VaultId $vaultId).SoftDeleteFeatureState
# Soft delete option for multiple vaults
$vaults = Get-AzRecoveryServicesVault
foreach ($vault in $vaults) {
    $properties = Get-AzRecoveryServicesVaultProperty -VaultId $vault.Id
    if ($properties.SoftDeleteFeatureState -eq 'Enabled') {
        Write-Host "Soft delete option is" $properties.SoftDeleteFeatureState "for" $vault.Name "`n" `
            -ForegroundColor Green
    } else {
        Write-Host "Soft delete option is" $properties.SoftDeleteFeatureState "for" $vault.Name "`n" `
            -ForegroundColor Red
    }
}
Saved and published my script, then ran it.
When I checked the job, it succeeded and the status showed as running.
When I checked the Key Vault, soft delete was enabled.
Added a schedule so the runbook runs automatically at the required interval.
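For Key Vault soft delete specifically (the script above queries Recovery Services vaults), a sketch along the following lines could report the state of every Key Vault in the subscription, assuming the Az.KeyVault module is available in the Automation account:
# Report the soft delete state of every Key Vault in the subscription
$keyVaults = Get-AzKeyVault
foreach ($kv in $keyVaults) {
    # Fetching the vault by name returns the full properties, including EnableSoftDelete
    $details = Get-AzKeyVault -VaultName $kv.VaultName
    if ($details.EnableSoftDelete) {
        Write-Host "Soft delete is enabled for" $kv.VaultName -ForegroundColor Green
    } else {
        Write-Host "Soft delete is NOT enabled for" $kv.VaultName -ForegroundColor Red
    }
}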
I am trying to shut down a VM using the Azure Automation account's system-assigned managed identity option.
However, I am ending up with the error below.
Other articles suggest upgrading the modules with Update-ModulesInAutomationToLatestVersion, but I could not update due to the error below. I am not sure what the issue in the script is. The same script works with the AzureRunAsConnection option without issues ( script ). I even checked with a simple login using the system-assigned managed identity; it logs in successfully and fetches the resource group names.
I have tested the script shared above in my Automation account. Below are a couple of observations:
You need to use the Connect-AzureRMAccount -Identity cmdlet instead of Connect-AzAccount to connect to your subscription, because the rest of the script you have written uses AzureRM cmdlets.
If AzureRM cmdlets are used in the runbook script, the job gets suspended with a message stating that AzureRM is going to be retired and suggesting that the Az module be used in the workflow instead.
You can refer to this documentation on how to migrate your PowerShell scripts automatically from AzureRM to Az modules.
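As a stopgap while migrating, the Az module also provides an alias layer so existing AzureRM cmdlet names resolve to their Az equivalents (a migration aid, not a long-term fix):
# Map AzureRM cmdlet names onto the corresponding Az cmdlets for the current user
Enable-AzureRmAlias -Scope CurrentUser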
If you want to start/stop your virtual machines, you can leverage the Azure Automation Start/Stop VMs during off-hours feature.
According to the MICROSOFT DOCUMENTATION, and looking at your script, the AzureRM module is no longer supported; it has been superseded by the latest version of the Az module.
For more information, please refer to the links below:
MICROSOFT DOCUMENT | Using a system-assigned managed identity for an Azure Automation account & Troubleshoot runbook issues.
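For reference, a minimal Az-based runbook that authenticates with the system-assigned managed identity and deallocates a VM might look like the following sketch; the VM and resource group names are placeholders:
# Authenticate as the Automation account's system-assigned managed identity
Connect-AzAccount -Identity
# Stop (deallocate) the VM without prompting for confirmation
Stop-AzVM -ResourceGroupName "MY_RESOURCE_GROUP" -Name "MY_VM" -Force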
As part of troubleshooting/diagnosing a permissions issue with the service principal that one of our runbooks is using, Microsoft support has asked me to run the same PowerShell code locally so we can capture logs. In order to do this, I need to authenticate exactly the same way as the runbook does, which means authenticating with the certificate that the AzureRunAsConnection is using.
After swapping out some cmdlets that only exist in Azure Automation for equivalent commands that work locally (e.g. Get-AutomationConnection has to be replaced with Get-AzAutomationAccount and Get-AzAutomationConnection, and you have to switch to using FieldDefinitionValues), the authentication part of my script looks like this:
Set-StrictMode -Version Latest
$ErrorActionPreference = 'Stop'
$PSDefaultParameterValues['*:ErrorAction']='Stop'
'*** Authenticating with Azure'
$automationAccount = Get-AzAutomationAccount `
-Name "MY_AUTOMATION_ACCOUNT" `
-ResourceGroupName "MY_RESOURCE_GROUP"
$connection = $automationAccount | Get-AzAutomationConnection -Name "AzureRunAsConnection"
# Log-in to Az Account Management Graph API (for DNS updates)
Login-AzAccount -ServicePrincipal `
-Tenant $connection.FieldDefinitionValues.TenantID `
-ApplicationId $connection.FieldDefinitionValues.ApplicationID `
-CertificateThumbprint $connection.FieldDefinitionValues.CertificateThumbprint
When I run it, though, I get this error:
Login-AzAccount : No certificate was found in the certificate store with thumbprint
93FAB7F0BA11D08F8ABBAF5C587C77ECB058A8BB
It appears that I need to export the certificate that the AzureRunAsConnection is using so I can import it into my local machine. However, though I can renew the certificate that Azure Automation uses or upload my own, it doesn't appear that there is an easy way to get the current certificate. I know this is by design -- for security -- but for a case like this it's a pain.
I found this article from 2018 that describes how to export the cert from inside a hybrid worker, but the code didn't work for me in Azure Cloud Shell or locally (Get-AutomationCertificate is undefined):
https://www.lunavi.com/blog/how-to-download-an-azure-automation-connection-certificate-locally
Some guides on the web suggest creating a blob storage account and then writing a script to export to it, but that seems like a lot of effort for something I just need once for this repro.
What's the fastest, easiest way to get this certificate for local debugging/repro?
In the end, I used the article from 2018 to craft a quick and dirty way to get the certificate in a consumable format out of the runbook.
I modified the script that's running in the runbook to add the following lines at the top:
"*** Exporting Run As Certificate for Debugging"
$cert = Get-AutomationCertificate -Name "AzureRunAsCertificate"
$certData = $cert.Export("pfx", 'MySuperSecurePassword')
throw ([Convert]::ToBase64String($certData))
This causes an exception that contains the contents of the certificate as its message. Then when I run this code via the runbook "Test pane", I get the contents of the certificate dumped out in base64 format at the top:
I use an exception just so that: 1) the rest of the runbook does not execute, and 2) I am guaranteed to see the message. I found that if I just did a Write-Host before throwing an exception, the output might not get flushed before the script stopped running.
I'm then able to copy the base64-encoded part that's in parentheses, paste it into a file, and decode the contents of the file ([System.Convert]::FromBase64String in PowerShell, or base64 --decode in WSL) to get a PFX file. I can then open that PFX file and import it into my local user store.
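For that last step, one possible sketch (the file paths are placeholders, and the password matches the one used in the Export call above):
# Decode the base64 dump into a binary PFX file
$base64 = Get-Content -Path 'C:\temp\runas-cert.b64' -Raw
[System.IO.File]::WriteAllBytes('C:\temp\runas-cert.pfx', [Convert]::FromBase64String($base64))
# Import the PFX into the current user's certificate store
$pfxPassword = ConvertTo-SecureString 'MySuperSecurePassword' -AsPlainText -Force
Import-PfxCertificate -FilePath 'C:\temp\runas-cert.pfx' -CertStoreLocation 'Cert:\CurrentUser\My' -Password $pfxPassword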
I think this is an Azure problem, but I am not confident. Today all my deployments from Octopus stopped working. I am getting an error in the logs that makes me feel like it's an authentication issue. I checked that the key in Azure didn't expire. If I click the Save and Test button in Octopus, it's successful. I even made a new app, assigned permissions in Azure, set it up in Octopus, and deployed using that account. Same problem.
If I log in to PowerShell using the service principal, I can run commands and everything is good, but every job is getting this error, which seems like an account issue. I am currently looking at the Azure side, but nothing yet, so I thought I would ask this community as well.
pushd $env:OctopusCalamariWorkingDirectory
try {
    If ([System.Convert]::ToBoolean($OctopusUseServicePrincipal)) {
        # Authenticate via Service Principal
        $securePassword = ConvertTo-SecureString $OctopusAzureADPassword -AsPlainText -Force
        $creds = New-Object System.Management.Automation.PSCredential ($OctopusAzureADClientId, $securePassword)
        # Turn off context autosave, as this will make all authentication occur in memory, and isolate each session from the context changes in other sessions
        Disable-AzureRMContextAutosave -Scope Process
        $AzureEnvironment = Get-AzureRmEnvironment -Name $OctopusAzureEnvironment
        if (!$AzureEnvironment) {
            Write-Error "No Azure environment could be matched given the name $OctopusAzureEnvironment"
            exit -2
        }
        ...
Update 1:
Additional info: I expanded the logs to Verbose and noticed this:
Performing variable substitution on 'F:\Octopus\Work\20201030000719-612-78\AppServiceEnvironment-Octopus\ase.ps1'
Attempt 1 of 5 failed: No Azure environment could be matched given the name AzureCloud
Almost like the variable substitution stopped working? Could be a red herring, but figured I would add the information.
Update 2:
I added OctopusPrintEvaluatedVariables and OctopusPrintVariables, set them to True, and can see the variables are being created correctly. It seems to be something with the connection between Azure and Octopus that has stopped working.
How do I remove datasets in bulk? I have many datasets but could not find an option in the UI to delete them all, nor could I find a PowerShell command.
While there is no such option to delete all datasets in the portal, you can do so with a PowerShell snippet:
Get-AzureRmDataFactory -ResourceGroupName <rgname> -Name <factory name> | Get-AzureRmDataFactoryDataset | Remove-AzureRmDataFactoryDataset
You may wish to append -Force to Remove-AzureRmDataFactoryDataset, which will stop the cmdlet from prompting you before deleting each dataset. Keep in mind that datasets cannot be deleted if they are being referenced by some existing pipeline.
I have not found such an option, but I created a .ps1 script file with a Remove-AzureRmDataFactoryDataset statement (with the -Force parameter) for each dataset and then ran it via PowerShell. It did the job.
Microsoft has updated the PowerShell cmdlets for Azure Data Factory. To run the above query successfully against a V2 data factory, you now need to use the V2 cmdlets, as shown below:
Get-AzureRmDataFactoryV2 -ResourceGroupName <rgname> -Name <factory name> | Get-AzureRmDataFactoryV2Dataset | Remove-AzureRmDataFactoryV2Dataset -Force
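The AzureRM module has since been deprecated in favor of the Az module; assuming the Az.DataFactory module is installed, the equivalent would presumably be:
Get-AzDataFactoryV2 -ResourceGroupName <rgname> -Name <factory name> | Get-AzDataFactoryV2Dataset | Remove-AzDataFactoryV2Dataset -Force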