Copy container via Azure CLI and wait for result

I tried to find a simple way to copy a container from one storage account to another asynchronously via the Azure CLI, something that can be done with azcopy. I don't have azcopy installed on my machine, but the Azure CLI is.
Question: I understand I need to copy one blob after another. How do I check that the copy operation is finished?
The following kind of works, but calling az storage blob show one by one takes a very long time (minutes).
$backup = 'somecontainer'
$exists = (az storage container exists --name $backup --account-name an --account-key ak --output tsv) -match 'true'
if (!$exists) {
    az storage container create --name $backup --account-name mt --account-key mk
}
$blobs = az storage blob list --container-name $backup --account-name an --account-key ak | ConvertFrom-Json
# copy one by one
$blobs.name | % {
    $name = $_
    az storage blob copy start --destination-blob $name --destination-container $backup --source-blob $name --source-container $backup --account-name mt --account-key mk --source-account-name an --source-account-key ak
}
# check operation status
$results = $blobs.name | % {
    az storage blob show --container-name $backup --name $_ --account-name mt --account-key mk | ConvertFrom-Json
}
# still unfinished copy operations:
$results | ? { !($_.properties.copy.completiontime) } | % { $_.name }

As @GeorgeChen mentioned, you can use az storage blob copy start-batch:
az storage blob copy start-batch --account-key 00000000 --account-name MyAccount --destination-container MyDestinationContainer --source-account-key MySourceKey --source-account-name MySourceAccount --source-container MySourceContainer
Here is the documentation link:
https://learn.microsoft.com/en-us/cli/azure/storage/blob/copy?view=azure-cli-latest#az-storage-blob-copy-start-batch
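To avoid polling each blob with a separate az storage blob show call, one option is to query the whole destination container for pending copies in a single listing. This is only a sketch, reusing the account names/keys from the question, and it assumes your Azure CLI version supports the --include c flag on az storage blob list (which asks the service to return copy properties in the listing):

```powershell
# Poll all copy operations with one 'az storage blob list' call per pass,
# instead of one 'az storage blob show' per blob. Names/keys are placeholders.
$backup = 'somecontainer'
do {
    # '--include c' includes copy properties; the JMESPath query keeps only
    # blobs whose copy status is still 'pending'
    $pending = az storage blob list --container-name $backup `
        --account-name mt --account-key mk --include c `
        --query "[?properties.copy.status=='pending'].name" --output tsv
    if ($pending) { Start-Sleep -Seconds 10 }
} while ($pending)
```

When the loop exits, no blob in the container reports a pending copy.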

Get-AzureRmStorageAccount, Dig into Container files and get "Modified" property

I need to get all Storage Accounts whose last modified date is 6 months ago with a PS script.
I didn't find any cmdlet or function which could provide such information. I thought it would be enough to sort by 'LastModifiedTime', but when I dug deeper I saw that I have a lot of new files inside containers with the "Modified" property. The question is: how can I access these files with PowerShell? Any cmdlet, function, etc.?
Here is what I used to get SA before:
function check_stores {
    $stores = Get-AzureRmResource -ODataQuery "`$filter=resourcetype eq 'Microsoft.Storage/storageAccounts'"
    $x = (Get-Date).AddDays(-180)
    foreach ($store in $stores) {
        $storename = $store.Name
        $dates = (Get-AzureRmStorageContainer -ResourceGroupName $store.ResourceGroupName -StorageAccountName $store.Name).LastModifiedTime
        if (!($dates -ge $x)) {
            "Storage Account Name: $storename"
        }
    }
}
check_stores
Not sure if you just want to get the blobs whose LastModifiedTime (aka LMT) is within 180 days.
If so, you don't need to check the container LMT, since it is not related to the blob's last modified time (the container LMT reflects container property modifications).
The following script works with the pipeline. If you don't need to check the container LMT, just remove that check:
$x = (Get-Date).AddDays(-180)
# get all storage accounts of the current subscription
$accounts = Get-AzStorageAccount
foreach ($a in $accounts)
{
    # get containers of the storage account with LMT in the last 180 days
    $containers = $a | Get-AzStorageContainer | ? { $_.LastModified -ge $x }
    # if you don't need to check container LMT, use: $containers = $a | Get-AzStorageContainer
    # get blobs of those containers with LMT in the last 180 days
    $blobs = $containers | Get-AzStorageBlob | ? { $_.LastModified -ge $x }
    # add code to handle the blobs
    echo $blobs
}

ACR - delete only old images - variable reference not valid

I'm trying to clean up old images in my ACR. It has 8 repositories, so first I want to test it on only one of them. The complicated thing is that I need to keep the last 4 images created. So I have this script:
$acrName = ACRttestt
$repo = az acr repository list --name $acrName --top 1
$repo | Convertfrom-json | Foreach-Object {
    $imageName = $_
    (az acr repository show-tags -n $acrName --repository $_ |
        convertfrom-json) | Select-Object -SkipLast 4 | Foreach-Object {
        az acr repository delete -n $acrName --image "$imageName:$_"
    }
}
But I'm receiving the following error:
Failed At line:9 char:58 + ... az acr repository delete -n $acrName
--image "$imageName:$_" + ~~~~~~~~~~~ Variable reference is not valid. ':' was not followed by a valid variable name character. Consider
using ${} to delimit the name.
Any ideas?
Thanks in advance
You need to change "$imageName:$_" into "${imageName}:$_". The script will then look like this:
$acrName = "ACRttestt"
$repo = az acr repository list --name $acrName --top 1
$repo | Convertfrom-json | Foreach-Object {
    $imageName = $_
    (az acr repository show-tags -n $acrName --repository $_ |
        convertfrom-json) | Select-Object -SkipLast 4 | Foreach-Object {
        az acr repository delete -n $acrName --image "${imageName}:$_"
    }
}
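To see why the braces matter, here is a minimal stand-alone sketch (no ACR needed, names are made up): in a double-quoted string, a ':' directly after a variable name is parsed as a scope or drive qualifier, so braces are needed to mark where the name ends.

```powershell
$imageName = 'myrepo'
$tag = 'v1.0'
# "$imageName:$tag" would fail to parse: PowerShell reads 'imageName:' as a
# scope/drive qualifier. Braces delimit the variable name explicitly:
"${imageName}:${tag}"   # myrepo:v1.0
```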

Azure CLI SQL DB Restore time format

I am writing a Powershell script using Azure CLI for doing an Azure SQL Instance restore. This is my script so far:
az login
$AzureSubscription = "SubscriptionName"
az account set --subscription $AzureSubscription
$RGName = "ResourceGroupName"
$SrvName = "AzureSQLServerName"
$RestoreDateTime = (Get-Date).ToUniversalTime().AddHours(-1).ToString()
$RestoreDateTimeString = (Get-Date).ToUniversalTime().AddHours(-1).ToString("yyyy-MM-dd_HH:mm")
$RestoreName = $SrvName + "_" + $RestoreDateTimeString
az sql db restore --dest-name $RestoreName --resource-group $RGName --server $SrvName --name $SrvName --time = $RestoreDateTime
When I run this, I get the following error:
az: error: unrecognized arguments: 7/10/2019 10:39:21 AM
usage: az [-h] [--verbose] [--debug]
[--output {json,jsonc,table,tsv,yaml,none}] [--query JMESPATH]
{sql} ...
I have tried a variety of date-time formats, but I can't seem to get any of them to work. Is there a specific format that is needed? Should I be passing a different value into --time? Any help would be appreciated.
As far as I can tell, the --time parameter wants the datetime formatted as 'Sortable date/time pattern' (yyyy-MM-ddTHH:mm:ss).
This should do it:
$RestoreDateTime = (Get-Date).ToUniversalTime().AddHours(-1)
$RestoreDateTimeString = '{0:yyyy-MM-dd_HH:mm}' -f $RestoreDateTime
$RestoreName = '{0}_{1}' -f $SrvName, $RestoreDateTimeString
# format the datetime as Sortable date/time pattern 'yyyy-MM-ddTHH:mm:ss'
# see: https://learn.microsoft.com/en-us/dotnet/standard/base-types/standard-date-and-time-format-strings
$azRestoreTime = '{0:s}' -f $RestoreDateTime
az sql db restore --dest-name $RestoreName --resource-group $RGName --server $SrvName --name $SrvName --time $azRestoreTime
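For reference, a quick illustration of what the 's' (sortable) standard format specifier produces; the date components here are just an example:

```powershell
# '{0:s}' applies .NET's sortable date/time pattern yyyy-MM-ddTHH:mm:ss
$d = Get-Date -Year 2019 -Month 7 -Day 10 -Hour 10 -Minute 39 -Second 21
'{0:s}' -f $d   # 2019-07-10T10:39:21
```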
Hope that helps

How to read txt/csv line by line using PowerShell

I have a couple of txt/csv files, and I am trying to feed each line of the files into my API commands in a loop.
e.g. servers.txt - the first file contains servers and resource groups:
server1, rg1
server2, rg2
server3, rg3
ips.txt - the other file contains rules & IPs:
rule1, startip1, endip1
rule2, startip2, endip2
rule3, startip3, endip3
The issue is: how do I set up PowerShell to read each line in servers.txt and also ips.txt within the same loop?
I had something like this before, but it doesn't seem to work well. Any thoughts?
$list = Get-Content "servers.txt"
foreach ($data in $list) {
    $server_name, $rg = $data -split ',' -replace '^\s*|\s*$'
    Write-Host "Checking if SQL server belongs in subscription"
    $check = $(az sql server list -g $rg --query "[?name == '$server_name'].name" -o tsv)
    Write-Host $check
    # Get current rules and redirect output to file
    az sql server firewall-rule list --resource-group "$rg" --server "$server_name" --output table | Export-Csv -NoTypeInformation current-natrules.csv
    $new_rules = Get-Content "ips.txt"
    foreach ($data in $new_rules) {
        $rule_name, $start_ip, $end_ip = $data -split ',' -replace '^\s*|\s*$'
        # Create rule with new configs
        Write-Host "Assigning new firewall rules..."
        az sql server firewall-rule create --name "$rule_name" --server "$server_name" --resource-group "$rg" --start-ip-address "$start_ip" --end-ip-address "$end_ip" --output table
    }
    # Validating
    Write-Host "Printing new NAT rules to file..."
    az sql server firewall-rule list --resource-group "$rg" --server "$server_name" --output table | Export-Csv -NoTypeInformation new-natrules.csv
}
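As an alternative to splitting each line by hand, you could let Import-Csv do the parsing. This is only a sketch, assuming the files keep the comma-separated layout shown above; note that Import-Csv does not strip the spaces after the commas, hence the Trim() calls:

```powershell
# Read both files with explicit headers, then nest the loops as before.
$servers = Import-Csv servers.txt -Header Name, ResourceGroup
$rules   = Import-Csv ips.txt     -Header Rule, StartIp, EndIp
foreach ($s in $servers) {
    foreach ($r in $rules) {
        az sql server firewall-rule create --name $r.Rule.Trim() `
            --server $s.Name.Trim() --resource-group $s.ResourceGroup.Trim() `
            --start-ip-address $r.StartIp.Trim() --end-ip-address $r.EndIp.Trim() `
            --output table
    }
}
```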

Delete a "folder" in Azure Blob Storage

I would like to delete a folder from a container in my Azure Blob Storage account. The container holds 3,000,000+ files, and with Azure Storage Explorer it is a pretty long process (about 1,000 files per 5 minutes), so I would like to know if it is possible to delete a folder at once.
I am aware there is no real "folder" in Azure Blob Storage and it is more of a virtual path used to address a blob, but for batch deletion of a huge number of blobs that is exactly the problem.
I'd recommend using this PowerShell script, which allows deletion of 10,000 blobs at a time:
This PowerShell script, designed to run in Azure Automation, deletes a huge number of blobs in a container by processing them in chunks of 10,000 blobs at a time. When the number of blobs grows beyond a couple of thousand, the usual method of deleting one blob at a time may just get suspended without completing the task.
It can be used to delete all blobs (when the parameter retentionDays is supplied as 0), or only those blobs which have not been modified for the last retentionDays number of days.
Script can be downloaded here: https://gallery.technet.microsoft.com/Delete-large-number-of-97e04976
<#
.Synopsis
Deletes large number of blobs in a container of Storage account, which are older than x days
.DESCRIPTION
This Runbook deletes huge number of blobs in a container, by processing them in chunks of 10,000 blobs at a time. When the number of blobs grow beyond a couple of thousands, the usual method of deleting each blob at a time may just get suspended without completing the task.
.PARAMETER CredentialAssetName
The Credential asset which contains the credential for connecting to subscription
.PARAMETER Subscription
Name of the subscription attached to the credential in CredentialAssetName
.PARAMETER container
Container name from which the blobs are to be deleted
.PARAMETER AzStorageName
The Storage Name to which the container belong to
.PARAMETER retentionDays
Retention days. Blobs older than these many days will be deleted. To delete all, use 0
.NOTES
AUTHOR: Anurag Singh, MSFT
LASTEDIT: March 30, 2016
#>
function delete-blobs
{
    param (
        [Parameter(Mandatory=$true)]
        [String] $CredentialAssetName,
        [Parameter(Mandatory=$true)]
        [String] $Subscription,
        [Parameter(Mandatory=$true)]
        [String] $container,
        [Parameter(Mandatory=$true)]
        [String] $AzStorageName,
        [Parameter(Mandatory=$true)]
        [Int] $retentionDays
    )
    $Cred = Get-AutomationPSCredential -Name $CredentialAssetName
    $Account = Add-AzureAccount -Credential $Cred
    if (!$Account)
    {
        write-output "Connection to Azure Subscription using the Credential asset failed..."
        Break;
    }
    set-AzureSubscription -SubscriptionName $Subscription
    $AzStorageKey = (Get-AzureStorageKey -StorageAccountName $AzStorageName).Primary
    $context = New-AzureStorageContext -StorageAccountName $AzStorageName -StorageAccountKey $AzStorageKey
    $blobsremoved = 0
    $MaxReturn = 10000
    $Total = 0
    $Token = $Null
    $TotalDel = 0
    $dateLimit = (get-date).AddDays(-$retentionDays)
    try
    {
        do
        {
            Write-Output "Retrieving blobs"
            $blobs = Get-AzureStorageBlob -Container $container -context $context -MaxCount $MaxReturn -ContinuationToken $Token
            $blobstodelete = $blobs | where LastModified -LE $dateLimit
            $Total += $Blobs.Count
            Write-Output "$Total total Retrieved blobs"
            $Token = $Blobs[$blobs.Count - 1].ContinuationToken;
            if ($Blobs.Length -le 0)
            {
                break;
            }
            if ($blobstodelete.Length -le 0)
            {
                continue;
            }
            $TotalDel += $blobstodelete.Count
            $blobstodelete | Remove-AzureStorageBlob -Force
            Write-Output "$TotalDel blobs deleted"
        }
        While ($Token -ne $Null)
    }
    catch
    {
        write-output $_
    }
}
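For completeness, the runbook would then call the function with its parameters, roughly like this (all values are placeholders):

```powershell
# Example invocation: retentionDays 0 deletes every blob in the container
delete-blobs -CredentialAssetName 'AutomationCred' -Subscription 'MySubscription' `
    -container 'mycontainer' -AzStorageName 'mystorageaccount' -retentionDays 0
```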
rclone is a great tool to interact with cloud storage.
try rclone purge
As others have already mentioned, you cannot delete a "folder" in Azure Blob Storage. You have to use a workaround like listing all files with a prefix and then running a loop to delete each of them.
In PowerShell, you can simplify these 2 steps into one line by running the following command (using the AzureRM module):
Get-AzureStorageBlob -Context $context -Container $container -Blob 'FolderName*' | Remove-AzureStorageBlob -WhatIf
The -WhatIf option will print out the exact actions it is going to take. What I observed is that it prints What if: Performing the operation "Remove blob" on target ... for each file in the "folder", which probably means the deletion still happens file by file.
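If the Azure CLI is an option, a similar one-liner is az storage blob delete-batch, which deletes every blob matching a pattern. A sketch follows; the account and container names are placeholders, and --dryrun previews the deletions much like -WhatIf does:

```powershell
# Preview first, then drop --dryrun to actually delete the "folder"
az storage blob delete-batch --source mycontainer --pattern 'FolderName/*' `
    --account-name an --account-key ak --dryrun
```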
