Azure storage deletion policy does not work

I need to delete all blobs residing in a particular container in my storage account one day after creation. To do that I have tried to set a policy, but I can't make it work so far.
I have tried to create the policy both through the Azure portal and with Terraform. The policy code view shown in the portal is the following:
{
  "rules": [
    {
      "enabled": true,
      "name": "delete_old_records_csv_files",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "delete": {
              "daysAfterCreationGreaterThan": 1
            }
          }
        },
        "filters": {
          "blobTypes": [
            "blockBlob"
          ],
          "prefixMatch": [
            "queryresults"
          ]
        }
      }
    }
  ]
}
I have waited 2 days for the files to get deleted but they did not. Is there something I am doing wrong?
Thanks!

If the blobs were created before the lifecycle management rule was created, they will not be deleted by it.
To apply the policy to existing blobs, you can move or copy them to a new container where the policy will be applied.
Also note that a new or updated policy may take up to 48 hours to take effect.
Ensure that the blobs you are trying to delete match the prefix "queryresults" specified in the policy.
Also check whether any restriction applies to the container or blob, such as an immutability policy or an active lease.
If everything is correct but the blobs are still not being deleted, you can debug your storage account using audit logs; if you see a warning there, it will explain why the policy is not working.
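As a quick local sanity check (not an Azure API call), the selection logic of the rule above can be approximated in a few lines of Python. The blob paths below are made up, and prefixMatch is assumed to be compared against the container/blob path:

```python
from datetime import datetime, timedelta, timezone

def matches_delete_rule(blob_path, created, prefixes, days):
    """True if a blob would be selected by a delete action with the given
    prefixMatch list and daysAfterCreationGreaterThan value.
    blob_path is "container/blobname", which prefixMatch is compared against."""
    old_enough = datetime.now(timezone.utc) - created > timedelta(days=days)
    return old_enough and any(blob_path.startswith(p) for p in prefixes)

# Made-up blob paths; "queryresults" is assumed to be the container name.
two_days_old = datetime.now(timezone.utc) - timedelta(days=2)
print(matches_delete_rule("queryresults/run1.csv", two_days_old, ["queryresults"], 1))  # True
print(matches_delete_rule("otherdata/run1.csv", two_days_old, ["queryresults"], 1))     # False
```

If a blob you expect to be deleted fails this kind of check, the prefix (not the policy engine) is usually the problem.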
Reference:
azure - Lifecycle management rule not working for ADLS Gen2 Storage Account - Stack Overflow by Joel Cochran.

Related

How to enable soft delete for blobs or containers in Azure storage accounts using REST API or Python library

I have searched all over the internet and read many MS docs already, but found nothing about enabling soft delete for Azure Blob Storage using an API.
I want to enable soft delete for blobs and set up a retention policy like in the picture.
I was able to get this done using the REST API PUT method by setting the DeleteRetentionPolicy property of Blob Storage. I referred to this official documentation.
API
https://management.azure.com/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP_NAME>/providers/Microsoft.Storage/storageAccounts/<ACCOUNT_NAME>/blobServices/default?api-version=2022-09-01
BODY
{
  "properties": {
    "deleteRetentionPolicy": {
      "enabled": true,
      "days": <DAYS>,
      "allowPermanentDelete": true
    },
    "isVersioningEnabled": true
  }
}
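For scripting, the same PUT can be issued from Python. This is a minimal sketch, assuming you already have an ARM bearer token; the helper function names here are my own, not part of any SDK:

```python
def soft_delete_body(days, allow_permanent_delete=True):
    """Request body for PUT .../blobServices/default that enables blob soft delete."""
    return {
        "properties": {
            "deleteRetentionPolicy": {
                "enabled": True,
                "days": days,
                "allowPermanentDelete": allow_permanent_delete,
            },
            "isVersioningEnabled": True,
        }
    }

def blob_service_url(subscription_id, resource_group, account, api_version="2022-09-01"):
    """Management-plane URL for the blob service properties of a storage account."""
    return (
        f"https://management.azure.com/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}/providers/Microsoft.Storage"
        f"/storageAccounts/{account}/blobServices/default"
        f"?api-version={api_version}"
    )

# Hypothetical wiring; TOKEN must be a valid ARM bearer token (e.g. via azure-identity):
# import requests
# resp = requests.put(
#     blob_service_url("<SUBSCRIPTION_ID>", "<RESOURCE_GROUP_NAME>", "<ACCOUNT_NAME>"),
#     json=soft_delete_body(7),
#     headers={"Authorization": f"Bearer {TOKEN}"},
# )
# resp.raise_for_status()
```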
Results in the storage account: (screenshot)

Need to deploy an Azure Policy for tags only for VMs

I am deploying an Azure policy for the recommended tags that need to be applied when anyone creates a new VM.
I found one built-in policy: Require a tag on resources.
But when I deployed it, it was applied to all resources, and I need the policy to apply only to VM resources.
Also, how can I use more than one tag in a single policy?
In your policy rule, you must indicate that the policy applies only to VMs.
For example:
...
"policyRule": {
  "if": {
    "field": "type",
    "in": [
      "Microsoft.Compute/virtualMachines",
      "Microsoft.ClassicCompute/virtualMachines"
    ]
  },
  "then": {
    ...
  }
}
...
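To also address the second part of the question (more than one tag in a single policy), the type check can be combined with tag-existence conditions under allOf/anyOf. This is a sketch, assuming a deny effect; the tag names costCenter and environment are placeholders:

```json
"policyRule": {
  "if": {
    "allOf": [
      {
        "field": "type",
        "in": [
          "Microsoft.Compute/virtualMachines",
          "Microsoft.ClassicCompute/virtualMachines"
        ]
      },
      {
        "anyOf": [
          { "field": "tags['costCenter']", "exists": "false" },
          { "field": "tags['environment']", "exists": "false" }
        ]
      }
    ]
  },
  "then": {
    "effect": "deny"
  }
}
```

With this shape, VM creation is denied if any of the listed tags is missing, while non-VM resources are unaffected.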
Hope this helps!

Azure lifecycle policy for container is not working

I have configured an Azure lifecycle policy for a container as below:
{
  "rules": [
    {
      "enabled": true,
      "name": "name",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "delete": {
              "daysAfterModificationGreaterThan": 30
            }
          }
        },
        "filters": {
          "blobTypes": [
            "blockBlob"
          ],
          "prefixMatch": [
            "storageaccount/container"
          ]
        }
      }
    }
  ]
}
So it should delete the blobs that were last modified more than 30 days ago. I am putting backups in the container, and I want to delete the old backups that are more than 30 days old.
I configured this policy two days ago, yet the old backup files have not been removed from the container.
I analyzed this, and as per the links below, it can take up to 24 hours for a new policy or a policy update to take effect:
ref: https://learn.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview?tabs=azure-portal#faq
Lifecycle management policy not working on Azure Data Lake Gen 2
I can't find many documents regarding this issue either.
I have configured 'Firewalls and virtual networks' as 'all networks'. I mention this because, according to the document below, that solution worked for someone else:
https://learn.microsoft.com/en-us/answers/questions/107954/lifecycle-mangement-isn39t-doing-anything.html
Yet there is no change in the containers. Does anyone know the reason and how to troubleshoot this?
Thank you!
Please check if the points below help.
Things to check:
Check whether any immutability policies are configured for blob versions.
If version-level immutability support is enabled for a container and the container contains one or more blobs, then you must delete all blobs in the container before you can delete the container, even if there are no immutability policies in effect for the container or its blobs.
Check whether last access time tracking is enabled, because every time the blobs are accessed they move to the hot tier and may change their modified time. You also need to make sure you are using a V2 storage account: optionally-enable-access-time-tracking
Workaround:
Use the az CLI to add a lifecycle policy that deletes the blobs in the container whose last modification time is more than 30 days ago.
Note: try specifying just the container name in prefixMatch instead of storageaccount/container.
Example policy.json:
{
  "rules": [
    {
      "name": "expirationRule1",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "containername" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 30 }
          }
        }
      }
    }
  ]
}
Then create this lifecycle management policy (policy.json) using the az CLI command:
az storage account management-policy create --account-name youraccountname --policy @policy.json --resource-group myresourcegroup
References:
lifecycle - Unable to add new rule in Storage Management Policy on Azure - Stack Overflow
How to automatically delete old Azure Blob Storage containers - Stack Overflow

Set-AzVMDiagnosticsExtension doesn't work as expected across subscriptions

What I'm trying to do is enable the VM diagnostics extension to send event logs (Application [1,2,3], Security [all], System [1,2,3]) to one unified storage account (let's call it the logs storage) where WADWindowsEventLogsTable is supposed to be created.
The different scenarios I'm trying to implement:
The VM is in the same resource group as the logs storage.
Result: works.
The VM is in a different resource group from the logs storage.
Result: works.
The VM is in a different subscription.
Result: the extension is enabled. However, when I go to the Agent tab, I get the error message "the value must not be empty" under the Storage account section.
(Screenshot: error in the Storage account section of the Agent tab.)
Environment
Windows
PowerShell 7.0.2
DiagnosticsConfiguration.json
{
  "PublicConfig": {
    "WadCfg": {
      "DiagnosticMonitorConfiguration": {
        "overallQuotaInMB": 5120,
        "WindowsEventLog": {
          "scheduledTransferPeriod": "PT1M",
          "DataSource": [
            { "name": "Application!*[System[(Level=1 or Level=2 or Level=3 or Level=4)]]" },
            { "name": "Security!*" },
            { "name": "System!*[System[(Level=1 or Level=2 or Level=3 or Level=4)]]" }
          ]
        }
      }
    },
    "StorageAccount": "logsstorage",
    "StorageType": "TableAndBlob"
  },
  "PrivateConfig": {
    "storageAccountName": "logsstorage",
    "storageAccountKey": "xxxxxxx",
    "storageAccountEndPoint": "https://logsstorage.blob.core.windows.net"
  }
}
PowerShell command:
Set-AzVMDiagnosticsExtension -ResourceGroupName "myvmresourcegroup" -VMName "myvm" -DiagnosticsConfigurationPath "DiagnosticsConfiguration.json"
I even tried explicitly specifying the account name and key:
$storage_key = "xxxxxx"
Set-AzVMDiagnosticsExtension -ResourceGroupName "myvmresourcegroup" -VMName "myvm" -DiagnosticsConfigurationPath "DiagnosticsConfiguration.json" -StorageAccountName "logsstorage" -StorageAccountKey $storage_key
I've spent a lot of time trying to figure out the issue without luck.
The real issue here is that the extension doesn't create the expected table WADWindowsEventLogsTable (or write to it if it already exists).
According to the official documentation (example 3), I should be able to do this:
https://learn.microsoft.com/en-us/powershell/module/az.compute/set-azvmdiagnosticsextension?view=azps-4.3.0
I've submitted an issue to the team on GitHub with more details, but I'm still waiting for their input:
https://github.com/Azure/azure-powershell/issues/12259
This is because the storage account "logsstorage" you specified is in another subscription.
Since you enabled the VM diagnostics extension under a different subscription, you need to modify your DiagnosticsConfiguration.json file and specify a storage account that is in the current subscription.
I managed to get this fixed with some help from Microsoft engineer.
I've detailed the answer in this GitHub issue :
Set-AzVMDiagnosticsExtension doesn't seem working properly across subscriptions
The answer :
I managed to get this working. Thanks to the help from #prernavashistha from Microsoft support, it turned out there's some inconsistency in the documentation.
According to the documentation here :
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/diagnostics-extension-windows-install#powershell-deployment
In PrivateConfig I should pass the storage URI to "storageAccountEndPoint" key :
"PrivateConfig": {
  "storageAccountEndPoint": "https://logsstorage.blob.core.windows.net"
}
However, according to another documentation reference :
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/diagnostics-extension-schema-windows#json
I should pass the Azure storage endpoint :
"PrivateConfig": {
  "storageAccountEndPoint": "https://core.windows.net"
}
I can confirm that using the Azure storage endpoint resolved the issue: I can enable the extension across subscriptions, and I can see logs being written to the correct table as expected.
Thanks

Retention Policy for Azure Containers?

I'm looking to set up a policy for one of my containers so it deletes or only retains data for x days. So if x is 30, that container should only contain files that are less than 30 days old. If the files are sitting in the container for more than 30 days it should discard them. Is there any way I can configure that?
Currently this kind of thing is not supported by Azure Blob Storage. You would need to write something of your own that runs periodically to do this check and delete the old blobs.
On a side note, this feature has been long pending (since 2011): https://feedback.azure.com/forums/217298-storage/suggestions/2474308-provide-time-to-live-feature-for-blobs.
UPDATE
If you need to do it yourself, there are two things to consider:
Code to fetch the list of blobs, find the blobs that need to be deleted, and then delete those blobs. To do this, you can use the Azure Storage SDK, which is available for many programming languages like .NET, Java, Node, PHP, etc. Just use the one you're comfortable with.
Schedule this code to run once daily. To do this, you can use one of the many services available in Azure: Azure WebJobs, Functions, Scheduler, Azure Automation, etc.
If you decide to use Azure Automation, there's a Runbook already available for you that you can use (no need to write your code). You can find more details about this here: https://gallery.technet.microsoft.com/scriptcenter/Remove-Storage-Blobs-that-aae4b761.
Azure Blob Storage lifecycle management (Preview) is now available, and with it we can create policies with different rules.
Here is a rule to delete the blobs that are older than 30 days:
{
  "version": "0.5",
  "rules": [
    {
      "name": "expirationRule",
      "type": "Lifecycle",
      "definition": {
        "filters": {
          "blobTypes": [ "blockBlob" ]
        },
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 30 }
          }
        }
      }
    }
  ]
}
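Assuming the JSON above is saved as policy.json, it can be applied with the Azure CLI; the account and resource group names below are placeholders:

```shell
az storage account management-policy create \
  --account-name mystorageaccount \
  --resource-group myresourcegroup \
  --policy @policy.json
```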
For more details, refer to Azure Blob Storage Lifecycle.
