I need to get all Storage Accounts whose last modified date is more than 6 months ago with a PowerShell script.
I didn't find any cmdlet or function which could provide such information. I thought it would be enough to sort by 'LastModifiedTime', but when I dug deeper I saw that I have a lot of new files inside the containers whose "Modified" property is recent. The question is: how can I access these files with PowerShell? Any cmdlet, function, etc.?
Here is what I used to get the storage accounts before:
function check_stores {
    $stores = Get-AzureRmResource -ODataQuery "`$filter=resourcetype eq 'Microsoft.Storage/storageAccounts'"
    $x = (Get-Date).AddDays(-180)
    foreach ($store in $stores) {
        $storename = $store.Name
        $dates = (Get-AzureRmStorageContainer -ResourceGroupName $store.ResourceGroupName -StorageAccountName $store.Name).LastModifiedTime
        if (!($dates -ge $x)) {
            "Storage Account Name: $storename"
        }
    }
}
check_stores
Not sure if you just want to get the blobs whose LastModifiedTime (LMT) is within the last 180 days.
If so, you don't need to check the container LMT, since it is not related to the blob's last modified time (the container LMT only reflects changes to container properties).
The following script works with the pipeline. If you don't need to check the container LMT, just remove that check:
$x = (Get-Date).AddDays(-180)

# get all storage accounts of the current subscription
$accounts = Get-AzStorageAccount

foreach ($a in $accounts)
{
    # get containers of the storage account with LMT in the last 180 days
    $containers = $a | Get-AzStorageContainer | ? {$_.LastModified -ge $x}
    # if you don't need to check the container LMT, use: $containers = $a | Get-AzStorageContainer

    # get blobs of those containers with LMT in the last 180 days
    $blobs = $containers | Get-AzStorageBlob | ? {$_.LastModified -ge $x}

    # add code to handle the blobs
    echo $blobs
}
I'm trying to add tags to VMs via CSV data. Right now I have the below code:
if ($context.Account -eq $null) {
# Login-AzureAccount
Connect-AzAccount
}
# Select Azure Subscription
$subscriptionId = (Get-AzSubscription | Out-GridView -Title "Select an Azure Subscription ..." -PassThru).SubscriptionId
#Select specified subscription ID
Select-AzSubscription -SubscriptionId $subscriptionId
$InputCSVFilePath = "test.csv"
$csvItems = Import-Csv $InputCSVFilePath
################
foreach ($item in $csvItems){
Clear-Variable r -ErrorAction SilentlyContinue
#$r = Get-AzResource -ResourceGroupName $item.ResourceGroup -Name $item.VM -ErrorAction Continue
$r = Get-AzResource -Name $item.VM
################
if ($r -ne $null){
if ($r.Tags){
# Tag - Client DL
if ($r.Tags.ContainsKey("Client_DL")){
$r.Tags["Client_DL"] = $item.ClientDL
}else{
$r.Tags.Add("Client_DL", $item.ClientDL)
}
# Tag - Priority
if ($r.Tags.ContainsKey("Priority")){
$r.Tags["Priority"] = $item.Priority
}else{
$r.Tags.Add("Priority", $item.Priority)
}
}
}else{
Write-Host "No VM found named $($item.VMName)!"
}
}
I verified that my code does indeed go through the functions, but for some reason the tags are not being set on my VMs. I ran the commands manually in PowerShell and I was able to set a tag by doing:
$r = Get-AzResource -Name TestVM
$r.Tags.Add("Client_DL", "TEST-DL")
Am I missing something? I'm running Set-PSDebug -Trace 2 when running my code and it seems to check out just fine, but the tags aren't getting set/written.
So you're adding the tags in memory but you're not calling any of the Az cmdlets to set the tags back on the resource.
You can see in the example in the docs here they return the VM with Get-AzResource, append their new tag to the existing tags and then use Set-AzResource to write the newly added tag back.
Just be careful of this caveat from the docs: when updating tags through PowerShell, tags are updated as a whole. If you are adding one tag to a resource that already has tags, you will need to include all the tags that you want to be placed on the resource.
Alternatively you could use Update-AzTag which has an -Operation parameter and lets you choose whether you want to merge, replace or delete the existing tags.
https://learn.microsoft.com/en-us/powershell/module/az.resources/update-aztag?view=azps-6.0.0
Ultimately you'd need to write your $r.Tags value back to the resource as the last operation within your if statement.
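For example, a minimal sketch of that final write-back inside the if block (either cmdlet works; the variable and tag names mirror the question's code):
# write the whole in-memory tag collection back to the resource
Set-AzResource -ResourceId $r.ResourceId -Tag $r.Tags -Force

# or merge only the new/updated tags, leaving the others untouched
Update-AzTag -ResourceId $r.ResourceId -Tag @{ Client_DL = $item.ClientDL; Priority = $item.Priority } -Operation Merge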
I'm working on a script to list the blobs in a container which has a ridiculous number of blobs (over 30 million!).
Anyway, I'm using the code from https://learn.microsoft.com/en-us/powershell/module/az.storage/get-azstorageblob?view=azps-3.8.0, which appears to use a continuation token for every 10,000 files.
$MaxReturn = 10000
$ContainerName = "abc"
$Total = 0
$Token = $Null
do
{
    $Blobs = Get-AzStorageBlob -Container $ContainerName -MaxCount $MaxReturn -ContinuationToken $Token
    $Total += $Blobs.Count
    if ($Blobs.Length -le 0) { Break }
    $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
}
While ($Token -ne $Null)
Echo "Total $Total blobs in container $ContainerName"
The problem is that this always ends up hanging or getting stuck and never completes.
It usually gets around halfway and I have to restart it, which kicks off the entire process all over again.
However, I already have the data from the first run; is there a way to get it to start from a specific value rather than from the start?
Let's say I already have the records I need for the first 3 million blobs. How do I tell it to start from 3 million instead of 0?
Or am I not understanding how the process works?
Just a summary of the issue, for others who run into something similar.
How do I tell it to start from 3 million instead of 0?
Since the data in the container is static, you can store the latest ContinuationToken, then run the script with that ContinuationToken next time to get the remaining blobs.
For more details you could refer to this article.
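A minimal sketch of that idea, assuming the Az.Storage continuation token type is Microsoft.Azure.Storage.Blob.BlobContinuationToken (check your module version) and using a hypothetical checkpoint file path:
$MaxReturn = 10000
$ContainerName = "abc"
$MarkerFile = "C:\temp\abc-marker.txt"   # hypothetical checkpoint file
$Token = $Null

# resume from a previously saved marker, if one exists
if (Test-Path $MarkerFile) {
    $Token = New-Object -TypeName Microsoft.Azure.Storage.Blob.BlobContinuationToken
    $Token.NextMarker = Get-Content $MarkerFile
}

do
{
    $Blobs = Get-AzStorageBlob -Container $ContainerName -MaxCount $MaxReturn -ContinuationToken $Token
    if ($Blobs.Count -le 0) { Break }
    # ... record/process this batch of blobs here ...
    $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
    # checkpoint the marker so a restarted run can pick up from here
    if ($Token -ne $Null) { $Token.NextMarker | Set-Content $MarkerFile }
}
While ($Token -ne $Null)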
I want to get the data sizes of Cosmos DB accounts from multiple subscriptions.
For instance, we have a subscription which has 4 Cosmos DB accounts in 4 regions.
PS V:\> Get-AzResource -ResourceType Microsoft.DocumentDb/databaseAccounts | ft
Name ResourceGroupName ResourceType Location
---- ----------------- ------------ --------
Account1 dbcosmosdb Microsoft.DocumentDb/databaseAccounts eastasia
Account2 dbcosmosdb Microsoft.DocumentDb/databaseAccounts eastus2
Account3 dbcosmosdb Microsoft.DocumentDb/databaseAccounts northeurope
Account4 dbcosmosdb Microsoft.DocumentDb/databaseAccounts westus
Now I would like to query all 4 Cosmos DB accounts to get the data size used by each account.
For example, Account1 has 137 GB used so far. I would like to see that number using PowerShell so that I can query multiple subscriptions and add this to my telemetry reporting.
You could use the Get-AzMetric command. Try the script below; it works fine on my side.
$ids = (Get-AzResource -ResourceType Microsoft.DocumentDb/databaseAccounts).ResourceId
foreach ($item in $ids) {
    $name = (Get-AzResource -ResourceId $item).Name
    # DataUsage is reported in bytes; take the latest data point and convert to GB
    $metric = Get-AzMetric -ResourceId $item -MetricName "DataUsage" -WarningAction Ignore
    $data = ($metric.Data | Select-Object -Last 1).Total/1024/1024/1024
    Write-Output "$name : $data GB"
}
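If you need this across several subscriptions, here is a hedged sketch of the same idea wrapped in a subscription loop (it assumes you have access to every subscription returned by Get-AzSubscription):
foreach ($sub in Get-AzSubscription) {
    # switch the current context to this subscription
    Set-AzContext -SubscriptionId $sub.Id | Out-Null
    $ids = (Get-AzResource -ResourceType Microsoft.DocumentDb/databaseAccounts).ResourceId
    foreach ($item in $ids) {
        $name = (Get-AzResource -ResourceId $item).Name
        $metric = Get-AzMetric -ResourceId $item -MetricName "DataUsage" -WarningAction Ignore
        # convert the latest data point from bytes to GB
        $data = ($metric.Data | Select-Object -Last 1).Total / 1GB
        Write-Output "$($sub.Name) / $name : $data GB"
    }
}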
I would like to delete a folder from a container in my Azure Blob Storage account. The container holds 3,000,000+ files, and using Azure Storage Explorer it is a pretty long process (1,000 files / 5 min), so I would like to know if it is possible to delete a folder at once.
I am aware there is no real "folder" in Azure Blob Storage and that it is more a virtual path used to access a blob, but for batch deletion of a huge number of blobs this is problematic.
Ben, I'd recommend using this PowerShell script, which deletes 10,000 blobs at a time:
This PowerShell script, designed to run in Azure Automation, deletes a huge number of blobs in a container by processing them in chunks of 10,000 blobs at a time. When the number of blobs grows beyond a couple of thousand, the usual method of deleting one blob at a time may just get suspended without completing the task.
It can be used to delete all blobs (when the retentionDays parameter is supplied as 0), or only the blobs which have not been modified for the last retentionDays days.
Script can be downloaded here: https://gallery.technet.microsoft.com/Delete-large-number-of-97e04976
<#
.Synopsis
Deletes a large number of blobs in a container of a Storage account, which are older than x days
.DESCRIPTION
This Runbook deletes a huge number of blobs in a container, by processing them in chunks of 10,000 blobs at a time. When the number of blobs grows beyond a couple of thousand, the usual method of deleting each blob at a time may just get suspended without completing the task.
.PARAMETER CredentialAssetName
The Credential asset which contains the credential for connecting to subscription
.PARAMETER Subscription
Name of the subscription attached to the credential in CredentialAssetName
.PARAMETER container
Container name from which the blobs are to be deleted
.PARAMETER AzStorageName
The Storage account name to which the container belongs
.PARAMETER retentionDays
Retention days. Blobs older than this many days will be deleted. To delete all blobs, use 0
.NOTES
AUTHOR: Anurag Singh, MSFT
LASTEDIT: March 30, 2016
#>
function delete-blobs
{
    param (
        [Parameter(Mandatory=$true)]
        [String] $CredentialAssetName,
        [Parameter(Mandatory=$true)]
        [String] $Subscription,
        [Parameter(Mandatory=$true)]
        [String] $container,
        [Parameter(Mandatory=$true)]
        [String] $AzStorageName,
        [Parameter(Mandatory=$true)]
        [Int] $retentionDays
    )

    $Cred = Get-AutomationPSCredential -Name $CredentialAssetName
    $Account = Add-AzureAccount -Credential $Cred
    if(!$Account)
    {
        Write-Output "Connection to Azure Subscription using the Credential asset failed..."
        Break
    }

    Set-AzureSubscription -SubscriptionName $Subscription

    $AzStorageKey = (Get-AzureStorageKey -StorageAccountName $AzStorageName).Primary
    $context = New-AzureStorageContext -StorageAccountName $AzStorageName -StorageAccountKey $AzStorageKey

    $blobsremoved = 0
    $MaxReturn = 10000
    $Total = 0
    $Token = $Null
    $TotalDel = 0
    $dateLimit = (Get-Date).AddDays(-$retentionDays)

    try
    {
        do
        {
            Write-Output "Retrieving blobs"
            $blobs = Get-AzureStorageBlob -Container $container -Context $context -MaxCount $MaxReturn -ContinuationToken $Token
            $blobstodelete = $blobs | where LastModified -LE $dateLimit
            $Total += $blobs.Count
            Write-Output "$Total total retrieved blobs"
            $Token = $blobs[$blobs.Count - 1].ContinuationToken
            if($blobs.Length -le 0)
            {
                break
            }
            if($blobstodelete.Length -le 0)
            {
                continue
            }
            $TotalDel += $blobstodelete.Count
            $blobstodelete | Remove-AzureStorageBlob -Force
            Write-Output "$TotalDel blobs deleted"
        }
        While ($Token -ne $Null)
    }
    catch
    {
        Write-Output $_
    }
}
rclone is a great tool for interacting with cloud storage.
Try rclone purge.
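For example (a hedged sketch; azblob is a placeholder for a remote you have already set up with rclone config, and the container/folder names are illustrative):
rclone purge azblob:mycontainer/myfolder
purge deletes the given path and everything under it.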
As others have already mentioned, you cannot delete a "folder" in Azure Blob Storage. You have to use a workaround like listing all files with a prefix and then running a loop to delete each of them.
In PowerShell, you can combine these 2 steps into one line by running the following command (using the AzureRM module):
Get-AzureStorageBlob -Context $context -Container $container -Blob 'FolderName*' | Remove-AzureStorageBlob -WhatIf
The -WhatIf option will print out the exact actions it is going to take. What I observed is that it prints What if: Performing the operation "Remove blob" on target ... for each file in the "folder", which probably means the deletion still happens file by file.
I'm trying to change the daily data volume cap for all my Application Insights resources on Azure. Is there any way to change it for all of them?
I can't find how to do it using the Azure CLI.
Thank you.
You can change the daily cap with the Azure PowerShell cmdlet Set-AzureRmApplicationInsightsDailyCap.
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionName "Your Sub Name"
function Set-DailyCap {
    $AI = Get-AzureRmApplicationInsights | Select ResourceGroupName, Name
    $AI | foreach {
        Write-Output ("Attempting to set daily cap for App Insights in resource group {0} instance {1}" -f $_.ResourceGroupName, $_.Name)
        Set-AzureRmApplicationInsightsDailyCap -ResourceGroupName $_.ResourceGroupName -Name $_.Name -DailyCapGB 0.2
    }
}
Set-DailyCap
Here is a modified version of @RonDBA's solution, which includes some logic to parse the resource names and set the limits accordingly. Using this script, I was able to update hundreds of daily caps in a matter of seconds.
Import-Module AzureRM.ApplicationInsights
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionName "yoursubscription here"

$ai = Get-AzureRmApplicationInsights | select ResourceGroupName, Name
$ai | foreach {
    $cap = 1
    $color = 'red'
    if ($_.Name -match 'dev') {
        $cap = .12
        $color = 'green'
    }
    if ($_.Name -match 'stg') {
        $cap = .24
        $color = 'blue'
    }
    if ($cap -eq 1)
    {
        if ($_.Name -match 'api') {
            $cap = 1.4
            $color = 'yellow'
        }
        else { $cap = 2.9 }
    }
    write-host ("Attempting to set daily cap at $cap for {0} instance " -f $_.ResourceGroupName) -NoNewline
    write-host $_.Name -ForegroundColor $color
    Set-AzureRmApplicationInsightsDailyCap -ResourceGroupName $_.ResourceGroupName -Name $_.Name -DailyCapGB $cap
}
There is no way you can change the daily cap of your Application Insights component using the Azure CLI or even the Azure REST APIs as of today.
To change it, use the Daily volume cap blade, linked from the Data Volume Management blade. Note that some subscription types have credit which cannot be used for Application Insights. If the subscription has a spending limit, the daily cap blade will have instructions on how to remove it and enable the daily cap to be raised beyond 32.3 MB/day.
Data source/Reference:
https://learn.microsoft.com/en-us/azure/application-insights/app-insights-pricing#data-rate