Listing blobs in loop not being shown - Azure

I've created a script which deletes blobs older than a set date, and I'm running it from an Azure Automation account. When I test it in the "Test pane" it gives the desired output: a list of blobs to be deleted. However, when the runbook actually runs from the Automation account, it doesn't display the list of blobs to be deleted.
The code is below:
### delete blobs older than the specified number of days
param(
[parameter(mandatory=$true)]
[int32]$daysToKeep,
[parameter(mandatory=$true)]
[string]$storageAccount,
[parameter(mandatory=$true)]
[string]$storageContainer,
[parameter(mandatory=$true)]
[string]$storageAccessKey
)
$connectionName = "AzureRunAsConnection"
# Get the connection "AzureRunAsConnection "
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
"Logging in to Azure..."
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
Write-Host "logged into Azure"
$context = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageAccessKey
New-AzureStorageContainer -Name $storageContainer -Context $context -Permission Blob -ErrorAction SilentlyContinue
$EGBlobs = Get-AzureStorageBlob -Container $storageContainer -Context $context | sort-object LastModified | select lastmodified, name
foreach($blob in $EGBlobs)
{
    if($blob.LastModified -lt (Get-Date).AddDays($daysToKeep*-1))
    {
        $blob_date = [datetime]$blob.LastModified.UtcDateTime
        Write-Output "-----------------------------------"
        Write-Output "Purging blob from Storage: " $blob.Name
        Write-Output "-----------------------------------"
        Write-Output "Last Modified Date of the Blob: " $blob_date
        Write-Output "-----------------------------------"
        Remove-AzureStorageBlob -Blob $blob.Name -Container $storageContainer -Context $context
    }
}
I can't see where I'm going wrong. Is this a setting within the Azure Automation account?
Thanks in advance

I'm not sure why this makes a difference, but it does: move the line Remove-AzureStorageBlob -Blob $blob.name -Container $storageContainer -Context $context to the top of the loop body and it will work fine.
It should be:
foreach($blob in $EGBlobs)
{
    if($blob.LastModified -lt (Get-Date).AddDays($daysToKeep*-1))
    {
        Remove-AzureStorageBlob -Blob $blob.Name -Container $storageContainer -Context $context
        $blob_date = [datetime]$blob.LastModified.UtcDateTime
        Write-Output "-----------------------------------"
        Write-Output "Purging blob from Storage: " $blob.Name
        Write-Output "-----------------------------------"
        Write-Output "Last Modified Date of the Blob: " $blob_date
        Write-Output "-----------------------------------"
    }
}
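As a side note, and purely an assumption on my part rather than part of the answer above: Azure Automation records each object written to the output stream as a separate entry, so a call like write-output "Purging blob from Storage: " $blob.name emits two records per log line. Combining the label and the value into a single string keeps each message together in the job output, for example:
# Same loop, but each log line is emitted as one string instead of two objects.
foreach($blob in $EGBlobs)
{
    if($blob.LastModified -lt (Get-Date).AddDays($daysToKeep*-1))
    {
        Remove-AzureStorageBlob -Blob $blob.Name -Container $storageContainer -Context $context
        Write-Output "-----------------------------------"
        Write-Output ("Purging blob from Storage: " + $blob.Name)
        Write-Output ("Last Modified Date of the Blob: " + $blob.LastModified.UtcDateTime)
        Write-Output "-----------------------------------"
    }
}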

Related

PowerShell script to download a file with the current date as filename from Azure blob - download the log file and remove the downloaded content from blob

I have a PowerShell script which downloads the file with the current date as its filename from Azure blob storage. But how do I get a log file of the process, and how do I remove the downloaded file from the Azure blob through the script? Could someone help me with this, please?
Example.
app_09102021.txt
app_10102021.txt
app_11102021.txt
Below is the script.
$container_name = '$XXX'
$destination_path = 'D:\test'
$Ctx = New-AzStorageContext $ZZZZ -StorageAccountKey $CCVCVCVCV
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$Listblobs = 'app_{0:ddMMyyyy}.txt' -f (Get-Date)
# Just download that blob
Get-AzStorageBlobContent -Context $Ctx -Container $container_name -Blob $Listblobs -Destination $destination_path
I have tested this in my environment: the blob is downloaded and then deleted using the commands below, and it was successfully downloaded and removed from Azure as well.
$container_name = 'testaj'
$destination_path = 'C:\Users\Desktop\test'
$Ctx = New-AzStorageContext 'accountname' -StorageAccountKey 'accountkeyrA=='
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$Listblobs ='app_{0:ddMMyyyy}.txt' -f (Get-Date)
$blob = Get-AzStorageBlobContent -Context $Ctx -Container $container_name -Blob $Listblobs -Destination $destination_path -ErrorAction SilentlyContinue
if($blob -ne $null)
{
    Write-Host ("The file $Listblobs has been downloaded to $destination_path")
    Write-Host ("Proceeding to delete the downloaded file from $container_name in Azure!!!")
    Remove-AzStorageBlob -Container $container_name -Blob $Listblobs -Context $Ctx
    Write-Host ("Deleted file $Listblobs from Azure!!!")
}
else {
    Write-Host ("The file does not exist")
}
For more information, please refer to this MS doc: Monitoring Azure Blob Storage
UPDATE:
I am looking for a solution that searches for and matches today's date in the filename, not one that sorts by last-modified date/time.
I tried the code below to download all the blobs with today's date (e.g. abc_04012022, app_04012022) and delete them from Azure.
PS script:
$container_name = 'test'
$destination_path = 'C:\Users\Desktop\test'
$Ctx = New-AzStorageContext 'accountname' -StorageAccountKey 'accountkey=='
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$latestBlob = Get-Date -UFormat '%d%m%Y'
$bloblist = Get-AzStorageBlob -Container $container_name -Context $Ctx |select -Property Name
foreach($item in $bloblist){
    if($item.Name -match $latestBlob){
        Write-Output "Here are the blobs"
        $blob = Get-AzStorageBlobContent -Context $Ctx -Container $container_name -Blob $item.Name -Destination $destination_path -ErrorAction SilentlyContinue
        if($blob -ne $null)
        {
            Write-Output ("The file $($item.Name) has been downloaded to $destination_path")
            Write-Output ("Proceeding to delete the downloaded file from $container_name in Azure!!!")
            Remove-AzStorageBlob -Container $container_name -Blob $item.Name -Context $Ctx
            Write-Output ("Deleted file $($item.Name) from Azure!!!")
        }
        else {
            Write-Output ("The file does not exist")
        }
    }
}

Backup to Azure blob showing completed while backup still in progress

I am using Azure Automation to automate the process of backing up my database to Azure blob storage. It's working fine, but the issue I am trying to resolve is that the status shows completed before the actual backup is done, and if I try to rerun the automation it tells me this:
"ErrorActionPreference" or common parameter is set to Stop: 45183: There is an import or export operation in progress on the database 'database'.
Here is my PowerShell script:
param(
[parameter(Mandatory=$true)]
[String] $ResourceGroupName,
[parameter(Mandatory=$true)]
[String] $DatabaseServerName,
[parameter(Mandatory=$true)]
[String]$DatabaseAdminUsername,
[parameter(Mandatory=$true)]
[String]$DatabaseAdminPassword,
[parameter(Mandatory=$true)]
[String]$DatabaseNames,
[parameter(Mandatory=$true)]
[String]$StorageAccountName,
[parameter(Mandatory=$true)]
[String]$BlobStorageEndpoint,
[parameter(Mandatory=$true)]
[String]$StorageKey,
[parameter(Mandatory=$true)]
[string]$BlobContainerName
# [parameter(Mandatory=$true)]
# [Int32]$RetentionDays
)
$ErrorActionPreference = 'stop'
function Login() {
$connectionName = "AzureRunAsConnection"
try
{
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
Write-Verbose "Logging in to Azure..." -Verbose
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint | Out-Null
}
catch {
if (!$servicePrincipalConnection)
{
$ErrorMessage = "Connection $connectionName not found."
throw $ErrorMessage
} else{
Write-Error -Message $_.Exception
throw $_.Exception
}
}
}
function Create-Blob-Container([string]$blobContainerName, $storageContext) {
Write-Verbose "Checking if blob container '$blobContainerName' already exists" -Verbose
if (Get-AzureStorageContainer -ErrorAction "Stop" -Context $storageContext | Where-Object { $_.Name -eq $blobContainerName }) {
Write-Verbose "Container '$blobContainerName' already exists" -Verbose
} else {
New-AzureStorageContainer -ErrorAction "Stop" -Name $blobContainerName -Permission Off -Context $storageContext
Write-Verbose "Container '$blobContainerName' created" -Verbose
}
}
function Export-To-Blob-Storage([string]$resourceGroupName, [string]$databaseServerName, [string]$databaseAdminUsername, [string]$databaseAdminPassword, [string[]]$databaseNames, [string]$storageKey, [string]$blobStorageEndpoint, [string]$blobContainerName) {
Write-Verbose "Starting database export to databases '$databaseNames'" -Verbose
$securePassword = ConvertTo-SecureString -String $databaseAdminPassword -AsPlainText -Force
$creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $databaseAdminUsername, $securePassword
foreach ($databaseName in $databaseNames.Split(",").Trim()) {
Write-Output "Creating request to backup database '$databaseName'"
$bacpacFilename =$databaseName + "LiveBak_anon" + ".bacpac"
$bacpacUri = $blobStorageEndpoint + "/" + $blobContainerName + "/" + $bacpacFilename
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $resourceGroupName -ServerName $databaseServerName `
-DatabaseName $databaseName -StorageKeytype "StorageAccessKey" -StorageKey $storageKey -StorageUri $BacpacUri `
-AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password -ErrorAction "Stop"
# Print status of the export
# Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink -ErrorAction "Stop"
}
}
# function Delete-Old-Backups([int]$retentionDays, [string]$blobContainerName, $storageContext) {
# Write-Output "Removing backups older than '$retentionDays' days from blob: '$blobContainerName'"
# $isOldDate = [DateTime]::UtcNow.AddDays(-$retentionDays)
# $blobs = Get-AzureStorageBlob -Container $blobContainerName -Context $storageContext
# foreach ($blob in ($blobs | Where-Object { $_.LastModified.UtcDateTime -lt $isOldDate -and $_.BlobType -eq "BlockBlob" })) {
# Write-Verbose ("Removing blob: " + $blob.Name) -Verbose
# Remove-AzureStorageBlob -Blob $blob.Name -Container $blobContainerName -Context $storageContext
# }
# }
Write-Verbose "Starting database backup" -Verbose
$StorageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Login
Create-Blob-Container `
-blobContainerName $blobContainerName `
-storageContext $storageContext
Export-To-Blob-Storage `
-resourceGroupName $ResourceGroupName `
-databaseServerName $DatabaseServerName `
-databaseAdminUsername $DatabaseAdminUsername `
-databaseAdminPassword $DatabaseAdminPassword `
-databaseNames $DatabaseNames `
-storageKey $StorageKey `
-blobStorageEndpoint $BlobStorageEndpoint `
-blobContainerName $BlobContainerName
# Delete-Old-Backups `
# -retentionDays $RetentionDays `
# -storageContext $StorageContext `
# -blobContainerName $BlobContainerName
Write-Verbose "Database backup script finished" -Verbose
All I basically need is for this process to show as running while the backup operation is still in progress, because I am using this automation in a Logic App.
You could use Get-AzureRmSqlDatabaseImportExportStatus
Sample :
PS C:\>Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink "https://management.contoso.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/resource01/providers/Microsoft.Sql/servers/server01/databases/database01/importExportOperationResults/00000000-000-0000-0000-000000000000?api-version=2014-04-01"
OperationStatusLink :
ErrorMessage :
LastModifiedTime : 4/15/2016 10:16:14 PM
QueuedTime : 4/15/2016 10:16:13 PM
StatusMessage : Running, Progress = 5.00 %
Status : InProgress
When you run New-AzureRmSqlDatabaseExport, a job is submitted, but the cmdlet never waits for the job to complete; the subsequent lines are executed straight away.
In your code you have the line below:
Write-Verbose "Database backup script finished" -Verbose
As soon as the export job is submitted, that line runs.
To overcome this, you could use Get-AzureRmSqlDatabaseImportExportStatus to poll the status and only proceed once it reports completion.
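A rough sketch of such a polling loop, placed right after the New-AzureRmSqlDatabaseExport call in your script; the 30-second interval is an arbitrary choice, $exportRequest is the variable returned by the export cmdlet, and the terminal status names ("Succeeded", "Failed") are assumptions based on the cmdlet's documented behaviour rather than values from your output:
# Poll the export operation until it reaches a terminal state.
do {
    Start-Sleep -Seconds 30
    $exportStatus = Get-AzureRmSqlDatabaseImportExportStatus `
        -OperationStatusLink $exportRequest.OperationStatusLink -ErrorAction "Stop"
    Write-Verbose ("Export status: " + $exportStatus.Status + " - " + $exportStatus.StatusMessage) -Verbose
} while ($exportStatus.Status -ne "Succeeded" -and $exportStatus.Status -ne "Failed")

# Fail the runbook if the export did not finish successfully.
if ($exportStatus.Status -ne "Succeeded") {
    throw ("Database export did not succeed: " + $exportStatus.ErrorMessage)
}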

Daily import and export Azure SQL Database

I want to back up the data in an Azure SQL Database daily and save it as a file in blob storage, so that when my system has an error I can import the backup from blob storage to recover the database. For the export case I found Data Factory can do it, but importing the data is hard. What is the best way to solve my problem?
Thanks for your help.
The best way to do this is to schedule a daily job using Azure Automation. Below you will find a PowerShell runbook you can use on Azure Automation to backup your database to an Azure storage account.
<#
.SYNOPSIS
This Azure Automation runbook automates Azure SQL database backup to Blob storage and deletes old backups from blob storage.
.DESCRIPTION
You should use this Runbook if you want manage Azure SQL database backups in Blob storage.
This runbook can be used together with Azure SQL Point-In-Time-Restore.
This is a PowerShell runbook, as opposed to a PowerShell Workflow runbook.
.PARAMETER ResourceGroupName
Specifies the name of the resource group where the Azure SQL Database server is located
.PARAMETER DatabaseServerName
Specifies the name of the Azure SQL Database Server which script will backup
.PARAMETER DatabaseAdminUsername
Specifies the administrator username of the Azure SQL Database Server
.PARAMETER DatabaseAdminPassword
Specifies the administrator password of the Azure SQL Database Server
.PARAMETER DatabaseNames
Comma separated list of databases script will backup
.PARAMETER StorageAccountName
Specifies the name of the storage account where backup file will be uploaded
.PARAMETER BlobStorageEndpoint
Specifies the base URL of the storage account
.PARAMETER StorageKey
Specifies the storage key of the storage account
.PARAMETER BlobContainerName
Specifies the container name of the storage account where backup file will be uploaded. Container will be created if it does not exist.
.PARAMETER RetentionDays
Specifies the number of days backups are kept in blob storage. The script will remove all older files from the container.
For this reason, a dedicated container should be used for this script only.
.INPUTS
None.
.OUTPUTS
Human-readable informational and error messages produced during the job. Not intended to be consumed by another runbook.
#>
param(
[parameter(Mandatory=$true)]
[String] $ResourceGroupName,
[parameter(Mandatory=$true)]
[String] $DatabaseServerName,
[parameter(Mandatory=$true)]
[String]$DatabaseAdminUsername,
[parameter(Mandatory=$true)]
[String]$DatabaseAdminPassword,
[parameter(Mandatory=$true)]
[String]$DatabaseNames,
[parameter(Mandatory=$true)]
[String]$StorageAccountName,
[parameter(Mandatory=$true)]
[String]$BlobStorageEndpoint,
[parameter(Mandatory=$true)]
[String]$StorageKey,
[parameter(Mandatory=$true)]
[string]$BlobContainerName,
[parameter(Mandatory=$true)]
[Int32]$RetentionDays
)
$ErrorActionPreference = 'stop'
function Login() {
$connectionName = "AzureRunAsConnection"
try
{
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
Write-Verbose "Logging in to Azure..." -Verbose
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint | Out-Null
}
catch {
if (!$servicePrincipalConnection)
{
$ErrorMessage = "Connection $connectionName not found."
throw $ErrorMessage
} else{
Write-Error -Message $_.Exception
throw $_.Exception
}
}
}
function Create-Blob-Container([string]$blobContainerName, $storageContext) {
Write-Verbose "Checking if blob container '$blobContainerName' already exists" -Verbose
if (Get-AzureStorageContainer -ErrorAction "Stop" -Context $storageContext | Where-Object { $_.Name -eq $blobContainerName }) {
Write-Verbose "Container '$blobContainerName' already exists" -Verbose
} else {
New-AzureStorageContainer -ErrorAction "Stop" -Name $blobContainerName -Permission Off -Context $storageContext
Write-Verbose "Container '$blobContainerName' created" -Verbose
}
}
function Export-To-Blob-Storage([string]$resourceGroupName, [string]$databaseServerName, [string]$databaseAdminUsername, [string]$databaseAdminPassword, [string[]]$databaseNames, [string]$storageKey, [string]$blobStorageEndpoint, [string]$blobContainerName) {
Write-Verbose "Starting database export to databases '$databaseNames'" -Verbose
$securePassword = ConvertTo-SecureString -String $databaseAdminPassword -AsPlainText -Force
$creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $databaseAdminUsername, $securePassword
foreach ($databaseName in $databaseNames.Split(",").Trim()) {
Write-Output "Creating request to backup database '$databaseName'"
$bacpacFilename = $databaseName + (Get-Date).ToString("yyyyMMddHHmm") + ".bacpac"
$bacpacUri = $blobStorageEndpoint + $blobContainerName + "/" + $bacpacFilename
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $resourceGroupName -ServerName $databaseServerName `
-DatabaseName $databaseName -StorageKeytype "StorageAccessKey" -StorageKey $storageKey -StorageUri $BacpacUri `
-AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password -ErrorAction "Stop"
# Print status of the export
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink -ErrorAction "Stop"
}
}
function Delete-Old-Backups([int]$retentionDays, [string]$blobContainerName, $storageContext) {
Write-Output "Removing backups older than '$retentionDays' days from blob: '$blobContainerName'"
$isOldDate = [DateTime]::UtcNow.AddDays(-$retentionDays)
$blobs = Get-AzureStorageBlob -Container $blobContainerName -Context $storageContext
foreach ($blob in ($blobs | Where-Object { $_.LastModified.UtcDateTime -lt $isOldDate -and $_.BlobType -eq "BlockBlob" })) {
Write-Verbose ("Removing blob: " + $blob.Name) -Verbose
Remove-AzureStorageBlob -Blob $blob.Name -Container $blobContainerName -Context $storageContext
}
}
Write-Verbose "Starting database backup" -Verbose
$StorageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Login
Create-Blob-Container `
-blobContainerName $blobContainerName `
-storageContext $storageContext
Export-To-Blob-Storage `
-resourceGroupName $ResourceGroupName `
-databaseServerName $DatabaseServerName `
-databaseAdminUsername $DatabaseAdminUsername `
-databaseAdminPassword $DatabaseAdminPassword `
-databaseNames $DatabaseNames `
-storageKey $StorageKey `
-blobStorageEndpoint $BlobStorageEndpoint `
-blobContainerName $BlobContainerName
Delete-Old-Backups `
-retentionDays $RetentionDays `
-storageContext $StorageContext `
-blobContainerName $BlobContainerName
Write-Verbose "Database backup script finished" -Verbose

PowerShell script to delete a file from a subfolder in a blob container

I am trying to delete a file from a specific folder (full or diff) in a blob container, but I'm unable to do so.
There is the container, and then two folders, full and diff, and I want to delete files from full only.
Please help.
$context = New-AzureStorageContext -StorageAccountName "storage_name" -StorageAccountKey "key"
$blobs= Get-AzureStorageBlob -Container "container_name" -blob *DIFF*.bak -Context $context
foreach ($blob in $blobs)
{
    $modifieddate = $blob.LastModified
    Write-Host $modifieddate
    if ($modifieddate -ne $null)
    {
        $howold = ([DateTime]::Now - [DateTime]$modifieddate.LocalDateTime)
        if ($howold.TotalDays -ge 5)
        {
            Remove-AzureStorageBlob -Blob $blob.Name -Container "container_name" -Context $context
            Write-Host $blob.Name
        }
    }
}
If you want to use Azure PowerShell to delete blobs from one subfolder in one container, you can use the following script:
$StorageAccountKey=" "
$StorageAccountName=" "
$ContainerName=" "
$Token = $null
$Total = 0
$MaxCount = 5000
$context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
do
{
    $Blobs = Get-AzStorageBlob -Container $ContainerName -MaxCount $MaxCount -ContinuationToken $Token -Context $context -Prefix "your folder name"
    if($Blobs.Length -le 0) { Break; }
    $Token = $Blobs[$Blobs.Count - 1].ContinuationToken;
    foreach($blob in $Blobs){
        Remove-AzStorageBlob -Blob $blob.Name -Container $ContainerName -Context $context
    }
}
While ($Token -ne $null)
Update
I used the latest version of the Azure Az module to do a test.
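To match the original requirement (only blobs under the full folder, and only those older than five days), the -Prefix filter can be combined with the age check from the question. A rough sketch, assuming the virtual folder is named full and using the Az module as in the loop above:
# Sketch: delete blobs under the "full/" virtual folder that are older than 5 days.
$context = New-AzStorageContext -StorageAccountName "storage_name" -StorageAccountKey "key"
$blobs = Get-AzStorageBlob -Container "container_name" -Prefix "full/" -Context $context
foreach ($blob in $blobs)
{
    if ($blob.LastModified -ne $null)
    {
        $howold = [DateTime]::UtcNow - $blob.LastModified.UtcDateTime
        if ($howold.TotalDays -ge 5)
        {
            Write-Host ("Removing blob: " + $blob.Name)
            Remove-AzStorageBlob -Blob $blob.Name -Container "container_name" -Context $context
        }
    }
}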

Copying one blob container's data into another blob container in Azure through runbooks

I have written a script to copy one blob container's data into another blob container.
It copies only the blob names and shows completed; the VHDs inside are not copied. Can anyone please help with the script?
What exactly I am trying to do:
e.g. storage account 1 contains 3 VHDs.
I have created a new storage account in the same location and am trying to copy all VHDs into the new storage account.
In the script I have passed a few parameters to copy the VHDs with the date.
When I run that script it only creates the names, but the VHDs are not copied.
Thanks in advance.
According to your description, you want to copy all blobs to another storage account; we can use this script to do it:
Add this to a new runbook:
$connectionName = "AzureRunAsConnection"
try
{
# Get the connection "AzureRunAsConnection "
$servicePrincipalConnection=Get-AutomationConnection -Name $connectionName
"Logging in to Azure..."
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
}
catch {
if (!$servicePrincipalConnection)
{
$ErrorMessage = "Connection $connectionName not found."
throw $ErrorMessage
} else{
Write-Error -Message $_.Exception
throw $_.Exception
}
}
$RGName = "vm"
$SAName = "jasondisk321"
$ConName = "vhd"
$key = "UUzQRoWeIMHzwzwJW9LxtgmwaJJS/Ac3DoXnPMHFIbUmupDpQ+KXCWG8ISJ4E20zjq7ugPtgN4vtVIv3A4m2Pg=="
$Ctx = New-AzureStorageContext -StorageAccountName $SAName -StorageAccountKey $Key
$List = Get-AzureStorageBlob -Blob * -Container $ConName -Context $Ctx
$List = $List.ICloudBlob.Uri.AbsoluteUri
$storageAccount = "jasondisk322"
$storageKey = "6lRJq6hTS1aHfVF4/iWskq/QS+tu4Jm/2zdz7Mo6AINGZOQKUiHtOAKmdZhBAWbcNEcBQq0YxZjXHgHha/iUKw=="
$destContext = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
$containerName = "vhd"
foreach ( $l in $list ){
    $bn = ($l -split '/')[4]
    Start-AzureStorageBlobCopy -srcUri $l -Context $Ctx -DestContainer $containerName -DestBlob $bn -DestContext $destContext
}
If you want to use PowerShell on your local PC, we can use this script:
Login-AzureRmAccount
$RGName = "vm" #source storage account resource group name
$SAName = "jasondisk321" #source storage account name
$ConName = "vhd" #source container name
$key = "UUzQRoWeIMHzwzwJW9LxtgmwaJJS/Ac3DoXnPMHFIbUmupDpQ+KXCWG8ISJ4E20zjq7ugPtgN4vtVIv3A4m2Pg=="#source storage account key
$Ctx = New-AzureStorageContext -StorageAccountName $SAName -StorageAccountKey $Key
$List = Get-AzureStorageBlob -Blob * -Container $ConName -Context $Ctx
$List = $List.ICloudBlob.Uri.AbsoluteUri
$storageAccount = "jasondisk322"#destination storage account name
$storageKey = "6lRJq6hTS1aHfVF4/iWskq/QS+tu4Jm/2zdz7Mo6AINGZOQKUiHtOAKmdZhBAWbcNEcBQq0YxZjXHgHha/iUKw=="#destination storage account key
$destContext = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageKey
$containerName = "vhd" #destination container name
foreach ( $l in $list ){
    $bn = ($l -split '/')[4]
    Start-AzureStorageBlobCopy -srcUri $l -Context $Ctx -DestContainer $containerName -DestBlob $bn -DestContext $destContext
}
You can do it with AzCopy:
AzCopy /source:https://[SourceStorageAccountName].blob.core.windows.net/vhds /dest:https://[DestStprageAccountName].blob.core.windows.net/vhds /sourcekey:<here-is-source-key> /destkey:<here-is-destination-key> /Pattern:[vhd-name].vhd
more info: https://learn.microsoft.com/en-us/azure/storage/storage-use-azcopy
If you want to automate it, you can use PowerShell with Start-AzureStorageBlobCopy:
Select-AzureSubscription "my subscription"
### Source VHD (West US) - anonymous access container ###
$srcUri = "http://mwwestus1.blob.core.windows.net/source/testcopy1.vhd"
### Target Storage Account (East US) ###
$storageAccount = "mweastus1"
$storageKey = "STORAGEACCOUNTKEY"
### Create the destination context for authenticating the copy
$destContext = New-AzureStorageContext -StorageAccountName $storageAccount `
-StorageAccountKey $storageKey
### Target Container Name
$containerName = "copiedvhds"
### Create the target container in storage
New-AzureStorageContainer -Name $containerName -Context $destContext
### Start the Asynchronous Copy ###
$blob1 = Start-AzureStorageBlobCopy -srcUri $srcUri `
-DestContainer $containerName `
-DestBlob "testcopy1.vhd" `
-DestContext $destContext
more info: https://www.opsgility.com/blog/windows-azure-powershell-reference-guide/copying-vhds-blobs-between-storage-accounts/
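One thing worth noting for the original problem: Start-AzureStorageBlobCopy only starts an asynchronous, server-side copy, so the runbook can report completed while the VHD data is still being transferred. A small sketch, reusing $containerName and $destContext from the example above, that waits for the copy to finish:
### Check the status of the asynchronous copy; -WaitForComplete blocks until it finishes
Get-AzureStorageBlobCopyState -Blob "testcopy1.vhd" `
    -Container $containerName `
    -Context $destContext `
    -WaitForComplete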
UPDATE:
AzCopy /Source:https://myaccount1.blob.core.windows.net/myContainer/ /Dest:https://myaccount2.blob.core.windows.net/myContainer/ /SourceKey:key1 /DestKey:key2 /Pattern:ab /SyncCopy
