Please help with error PermanentDeleteNotSupportedOnRootBlob
I generated a SAS token for the blob with all permissions, including Delete and Permanent Delete.
I am making the API call described in the documentation below:
https://learn.microsoft.com/en-us/rest/api/storageservices/delete-blob
request url:
"https://[SAS URL]&deletetype=permanent"
I'm getting a 409 error: "PermanentDeleteNotSupportedOnRootBlob".
The x-ms-delete-snapshots header is set to "include".
On the storage account, the 'Permanent delete soft deleted blobs' option is enabled.
A web search returns nothing for PermanentDeleteNotSupportedOnRootBlob.
I also tried both signing methods (account key and user delegation key) and got the same error.
I tried to reproduce the same in my environment.
Permanent delete is only supported on a soft-deleted snapshot or version, not on the base (root) blob, which is what the error indicates. To permanently delete a blob's data, its snapshots or versions must first be soft deleted.
For that, make sure versioning is enabled for blobs when creating the storage account.
If the account is already created, you can enable versioning from the data protection configuration.
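If you prefer scripting, enabling versioning can be sketched with the Azure CLI. The account and resource-group names below are hypothetical placeholders, and the script only prints the command it would run, so it is safe to execute as-is:

```shell
#!/bin/sh
# Hypothetical names -- substitute your own storage account and resource group.
account="mystorageacct"
rg="my-resource-group"

# Enable blob versioning on an existing storage account (a sketch; verify
# the flag against your installed Azure CLI version before running).
cmd="az storage account blob-service-properties update \
  --account-name $account --resource-group $rg \
  --enable-versioning true"

# Printed rather than executed, so the sketch runs without touching Azure.
echo "$cmd"
```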
With
https://xxxx.blob.core.windows.net/container/blob?sp=r&st=2023-01-27T12:43:35Z&se=2023-01-27T20:43:35Z&spr=https&sv=2021-06-08&sr=b&sig=xxxx%3D&deletetype=permanent
The blob can be deleted successfully
Note:
Make sure the versionid is given correctly.
For an already existing blob, disable soft delete, undelete the blobs, then re-enable all the properties mentioned above and delete again.
Otherwise you may run into a 409 error.
Reference: Delete Blob (REST API) -permanent-delete Azure Storage | Microsoft Learn
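Putting the pieces together, the REST call can be sketched as a curl command. The blob URL, SAS token, and version id below are hypothetical placeholders, and the script only prints the request it would send:

```shell
#!/bin/sh
# Hypothetical placeholders -- substitute your own blob URL, SAS token,
# and the id of a soft-deleted version (or use ?snapshot= for a snapshot).
blob_url="https://mystorageacct.blob.core.windows.net/container/blob"
sas="sp=racwd&sv=2021-06-08&sig=xxxx"
version_id="2023-01-27T12:43:35.0000000Z"

# Permanent delete targets a soft-deleted version or snapshot, never the
# base ("root") blob -- hence PermanentDeleteNotSupportedOnRootBlob when
# the versionid/snapshot parameter is missing.
request="curl -X DELETE '$blob_url?$sas&versionid=$version_id&deletetype=permanent' -H 'x-ms-version: 2021-06-08'"

# Printed rather than executed, so the sketch runs without touching Azure.
echo "$request"
```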
I used the Azure Backup client (MARS) to back up a server I had. The server no longer exists. In the Azure portal I am unable to delete the vault because the resource group contains backup items.
I tried using PowerShell, but Az.RecoveryServices is not meant to be used for the MARS BackupManagementType. You can run Get-AzureRmRecoveryServicesBackupContainer, but Get-AzureRmRecoveryServicesBackupItem then fails because there is no WorkLoadType for MARS.
So I can't delete the backup items from the portal, I can't delete them using PowerShell, and since the server no longer exists, I can't use the MARS agent to delete them either.
You can't delete a Recovery Services vault that has servers registered in it, or that holds backup data.
To gracefully delete a vault, unregister servers it contains, remove vault data, and then delete the vault.
If you try to delete a vault that still has dependencies, an error message is issued, and you will need to manually remove the vault dependencies, including:
Backed up items
Protected servers
Backup management servers (Azure Backup Server, DPM)
Refer to this article for detailed info: https://learn.microsoft.com/en-us/azure/backup/backup-azure-delete-vault
Note: You can use Cloud Shell available in portal to achieve this. Please select PowerShell after you launch Cloud Shell.
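As an alternative to the PowerShell route, the unregister step can be sketched with the Azure CLI's backup commands. The command names and the MAB management type (used for MARS agent containers) should be verified against your CLI version; the resource names are placeholders, and the script only prints the commands:

```shell
#!/bin/sh
# Hypothetical names -- substitute your own vault and resource group.
rg="my-resource-group"
vault="my-recovery-vault"

# List MARS (MAB) containers still registered to the vault, then
# unregister one so the vault can be deleted afterwards.
list_cmd="az backup container list --resource-group $rg --vault-name $vault \
  --backup-management-type MAB"
unregister_cmd="az backup container unregister --resource-group $rg \
  --vault-name $vault --container-name <container-name> \
  --backup-management-type MAB"

# Printed rather than executed, so the sketch runs without touching Azure.
echo "$list_cmd"
echo "$unregister_cmd"
```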
Kindly let us know if the above helps or you need further assistance on this issue.
If I use Get-AzAutomationSchedule on my Automation account, I get nothing back, because I used Remove-AzAutomationSchedule on all of them. But if I look in the portal, every schedule I've deployed is still present. If I select a schedule that I removed using PowerShell and then attempt to update it in the portal, I get the crying rain cloud and it says
NewScheduleBladeV2
MICROSOFT_AZURE_AUTOMATION
NewScheduleBladeV2
The recurrence is also listed as unknown in the list.
This is a problem not only for clarity when viewing in the portal: if I attempt to run my ARM template again with the schedules present, I get an "Internal Server Error" (code 500). I can't redeploy them if I delete them with PowerShell.
Is there any way to send something to Azure to update these? I'm not sure if I need to make an API call or use some Set-Az cmdlet.
Thanks
I tried to reproduce the issue you are facing, but everything worked fine for me using the Get-AzAutomationSchedule and Remove-AzAutomationSchedule cmdlets from Az.Automation module version 1.2.1.
Is this still an issue on your end? If so, can you restart the browser after clearing the cache and deleting the cookies?
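To rule out a stale portal view, you can also ask ARM directly which schedules still exist. Below is a sketch of the REST call; the subscription, resource group, and account names are placeholders, and the script only prints the request (a bearer token from `az account get-access-token` would be needed to actually send it):

```shell
#!/bin/sh
# Hypothetical identifiers -- substitute your own.
sub="00000000-0000-0000-0000-000000000000"
rg="my-resource-group"
acct="my-automation-account"

# List schedules via the ARM REST API; if a schedule shown in the portal
# is absent from this response, the portal view is stale.
url="https://management.azure.com/subscriptions/$sub/resourceGroups/$rg/providers/Microsoft.Automation/automationAccounts/$acct/schedules?api-version=2015-10-31"
request="curl -H 'Authorization: Bearer \$TOKEN' '$url'"

# Printed rather than executed, so the sketch runs without touching Azure.
echo "$request"
```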
How do you trigger Azure CDN to read the latest version of custom certificate from Key Store without downtime?
My CDN setup is working fine, but with Let's Encrypt the certificate is short-lived and requires automated renewal. Running az keyvault certificate import on the Azure CLI to upload the new certificate into Key Vault is trivial enough, but what next? How do you tell Azure CDN to start using the new version of the cert?
Failed attempts
Waiting a couple of hours. Nothing happened.
Running az cdn custom-domain enable-https on a domain that already has HTTPS enabled. Result: an internal misconfiguration and a couple of hours of downtime to first disable the custom domain and then re-enable it. The certificate was updated, though.
From Azure Portal
The Azure Portal tooltip on the custom domain certificate version says "Select the version of the certificate you want to use. By default we'll use the latest version." That is true when creating the endpoint, but how do you switch to the latest version afterwards? The latest version was already selected in the dropdown, so I selected the previous version and then re-selected the latest one. Doing that enabled "Save".
Saving the form resulted in a no-downtime update of the certificate. Nice, but for automation and scripting, not really the way to go.
Things which might do the trick, but I haven't tested yet
Applying the ARM template of the CDN setup.
PowerShell Az.Cdn has the Start-AzCdnEndpoint/Stop-AzCdnEndpoint cmdlets. Maybe helpful, but guaranteed to cause downtime.
Is there anything I can try on next update cycle?
As of 2021-04-05, Azure CDN can be told to use the "Latest" version of a particular KeyVault certificate. I haven't found any news of this change, but it was added to the documentation with this commit.
In order for the certificate to be automatically rotated to the latest version when a newer version of the certificate is available in your Key Vault, please set the certificate/secret version to 'Latest'. If a specific version is selected, you have to re-select the new version manually for certificate rotation. It takes up to 24 hours for the new version of the certificate/secret to be deployed.
In the portal, this can be done by choosing the "Latest" option in the "Certificate/Secret version" dropdown. With the Azure CLI, this can be done with:
az cdn custom-domain enable-https \
--resource-group "$cdnResourceGroupName" \
--profile-name "$cdnProfileName" \
--endpoint-name "$cdnEndpointName" \
--name "$cdnCustomDomainName" \
--user-cert-subscription-id "$subscriptionId" \
--user-cert-group-name "$keyVaultResourceGroupName" \
--user-cert-vault-name "$keyVaultName" \
--user-cert-secret-name "$secretName" \
--user-cert-protocol-type 'sni'
Notice that this command does not set the --user-cert-secret-version parameter; omitting it is how you select the "Latest" behavior.
For anyone who wants to do it manually, the old answer is below.
Running az cdn custom-domain enable-https on a domain having the HTTPS already enabled. Result: an internal misconfiguration and couple hours of downtime to first disable the custom domain and then enable it.
As of 2021-04-05, this can be done with the Azure CLI with:
az cdn custom-domain enable-https \
--resource-group "$cdnResourceGroupName" \
--profile-name "$cdnProfileName" \
--endpoint-name "$cdnEndpointName" \
--name "$cdnCustomDomainName" \
--user-cert-subscription-id "$subscriptionId" \
--user-cert-group-name "$keyVaultResourceGroupName" \
--user-cert-vault-name "$keyVaultName" \
--user-cert-secret-name "$secretName" \
--user-cert-secret-version "$secretVersion" \
--user-cert-protocol-type 'sni'
(When this answer was originally written in 2019 May, the Azure CLI documented a --custom-domain-https-parameters parameter that implied it could be used for this purpose. If the parameter was not supplied, the CLI would run the CDN-managed cert workflow (cert issued by DigiCert). However, it was never properly documented how to actually use the parameter. As of 2021 March, the parameter was removed from the CLI again. Finally, as of 2021 April, the --user-cert-* parameters have been added.)
The equivalent feature was added to the .NET SDK in March 2019, so the NuGet package should allow you to use user-managed certs.
As of 2021 April, Azure PowerShell's Enable-AzCdnCustomDomainHttps commandlet still does not support user-managed certs, only CDN-managed certs.
Or you can use the REST API directly as documented here. Make a POST request to https://management.azure.com/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Cdn/profiles/$cdnProfileName/endpoints/$cdnEndpointName/customDomains/$cdnCustomDomainName/enableCustomHttps?api-version=2018-04-02 with an application/json body that looks like
{
"certificateSource": "AzureKeyVault",
"certificateSourceParameters": {
"#odata.type": "#Microsoft.Azure.Cdn.Models.KeyVaultCertificateSourceParameters",
"deleteRule": "NoAction",
"resourceGroupName": "$resourceGroupName",
"secretName": "$secretName",
"secretVersion": "$secretVersion",
"subscriptionId": "$subscriptionId",
"updateRule": "NoAction",
"vaultName": "$keyVaultName"
},
"protocolType": "ServerNameIndication"
}
$resourceGroupName and $keyVaultName identify the KeyVault. $secretName and $secretVersion identify the certificate. (Don't be confused if the Portal doesn't show any secrets in your KeyVault; a KeyVault certificate with a private key is implicitly a KeyVault secret with the same name and version.)
This API endpoint follows standard REST semantics: it returns HTTP 202 Accepted since it's a long-running async operation. It sets the Location header in the response, and you should GET that URL repeatedly until it resolves to a success or failure status code.
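The 202-then-poll pattern can be sketched in shell. The bearer token and Location URL below are hypothetical placeholders; only the status-to-state mapping runs without network access:

```shell
#!/bin/sh
# Map an HTTP status code from the polled Location URL to an operation state.
handle_status() {
  case "$1" in
    200|201) echo "succeeded" ;;  # long-running operation finished
    202)     echo "pending"   ;;  # still running; poll again
    *)       echo "failed"    ;;  # anything else is an error
  esac
}

# Poll the Location header URL returned by the initial 202 response.
# $TOKEN is a hypothetical placeholder for a valid bearer token.
poll() {
  location="$1"
  while :; do
    code=$(curl -s -o /dev/null -w '%{http_code}' \
      -H "Authorization: Bearer $TOKEN" "$location")
    state=$(handle_status "$code")
    [ "$state" = "pending" ] || { echo "$state"; return; }
    sleep 10
  done
}
```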
Note that the portal also uses the REST API, so you can always derive this from just doing the steps in the portal UI and inspecting the network requests in your browser's developer tools. You will need to get your own oauth2 token, though (by creating an SP).
Aside: To save people the time it took me to discover this when trying it for my own domain, do not be fooled by the documentation or the example in the Azure REST API specs repo. They imply that API version 2017-10-12 supports customHttpsParameters, but in fact only 2018-04-02 and newer do. If you use 2017-10-12, the parameter is silently ignored and the DigiCert automatic cert workflow is used instead.
As suggested by @Arnavion, the Azure CDN Custom Domain REST API can be used to trigger the update of the certificate used by an existing custom domain in Azure CDN. The suggested API docs are at https://learn.microsoft.com/en-us/rest/api/cdn/customdomains/enablecustomhttps. Combining the above example with the API reference information, it is possible to craft a suitable request with PowerShell 6's Invoke-RestMethod. On success, CDN starts rolling the certificate out from Key Vault to the endpoints within a few minutes.
There is no trickery involved in the operation. All that is needed is to gather the five parameters required for UserManagedHttpsParameters, which identify the exact custom domain in the CDN and pinpoint the X.509 certificate in Key Vault, combine them with a valid bearer token, and make the API call to tell CDN to pick up the new certificate from Key Vault and deploy it to all CDN endpoints.
My script doing all this is available at https://gist.github.com/HQJaTu/c5695626ba51c6194845fa60913e911b
How you get the new certificate into Key Vault is another discussion, and my script does not address that. However, uploading a new version of a certificate is much easier than triggering the CDN update.
At the moment, the Azure CLI documentation still doesn't explain how to do this.
I have written a Python script to do it as part of an acme.sh renew hook.
It mainly targets apex custom domains, but you can use it with subdomains as well.
You can find it on GitHub: https://github.com/przemika/azure-byoc-for-custom-domain
I am cleaning out some old items from my Azure account and cannot remove an older-version Backup vault.
I get the following error when I try to delete it:
Vault cannot be deleted as there are existing resources within the
vault. Please ensure there are no backup items, protected servers or
backup management servers associated with this vault. Unregister the
following containers associated with this vault before proceeding for
deletion : COMPUTER-NAME. Unregister all containers from the vault and then
retry to delete vault
Notice the COMPUTER-NAME
That is the name of my computer, but I cannot find the Azure Backup agent installed on that computer. I also cannot find a container with the computer's name in any storage container in my entire Azure account.
Can someone help me figure out how to remove these items?
thanks in advance
The first screenshot shows the Backup vault and the error message I get when I try to delete it.
The second screenshot shows the backup items that remain, which I cannot delete.
The red boxes cover my COMPUTER-NAME.
Looks like my previous answer was turned into a comment due to brevity. Here's an update to make it a better answer. The answer from that link is quoted below for reference.
I have not mapped this answer to the corresponding Azure commands, but I was able to find my way to a solution via the Azure Portal. The steps were as follows:
Selected my Recovery Service resource
Under the Manage section, clicked Backup Infrastructure
Under Management Servers, clicked Protected Servers
In the list that followed, clicked on the row where my Protected Server > Count was greater than 0, in my case, Azure Backup Agent (because the backup agent was installed on my Windows Desktop)
Clicked on my server name in the Protected Server list
Clicked Delete in the card for my protected server
After that completed, I was able to delete the entire vault. These steps may be helpful if you have other Backup Infrastructure resources and possibly even Site Recovery Infrastructure resources associated with a vault.
Update: There's an open issue for Get-AzureRmRecoveryServicesBackupItem not being able to return MARS backup items, which is ultimately what the issue here was.