Daily import and export of an Azure SQL Database

I want to back up the data in an Azure SQL Database daily and save it as a file in Blob storage, so that when my system has an error I can import the backup from Blob storage to recover the database. For the export case I found Data Factory, but importing the data with it is hard. What is the best way to solve my problem?
Thanks for your help.

The best way to do this is to schedule a daily job using Azure Automation. Below you will find a PowerShell runbook you can use in Azure Automation to back up your database to an Azure storage account.
<#
.SYNOPSIS
This Azure Automation runbook automates Azure SQL database backup to Blob storage and deletes old backups from blob storage.
.DESCRIPTION
You should use this runbook if you want to manage Azure SQL database backups in Blob storage.
This runbook can be used together with Azure SQL Point-In-Time-Restore.
This is a PowerShell runbook, as opposed to a PowerShell Workflow runbook.
.PARAMETER ResourceGroupName
Specifies the name of the resource group where the Azure SQL Database server is located
.PARAMETER DatabaseServerName
Specifies the name of the Azure SQL Database Server that the script will back up
.PARAMETER DatabaseAdminUsername
Specifies the administrator username of the Azure SQL Database Server
.PARAMETER DatabaseAdminPassword
Specifies the administrator password of the Azure SQL Database Server
.PARAMETER DatabaseNames
Comma-separated list of databases the script will back up
.PARAMETER StorageAccountName
Specifies the name of the storage account where the backup file will be uploaded
.PARAMETER BlobStorageEndpoint
Specifies the base URL of the storage account
.PARAMETER StorageKey
Specifies the storage key of the storage account
.PARAMETER BlobContainerName
Specifies the container name of the storage account where the backup file will be uploaded. The container will be created if it does not exist.
.PARAMETER RetentionDays
Specifies the number of days backups are kept in Blob storage. The script will remove all older files from the container.
For this reason a dedicated container should be used only for this script.
.INPUTS
None.
.OUTPUTS
Human-readable informational and error messages produced during the job. Not intended to be consumed by another runbook.
#>
param(
    [parameter(Mandatory=$true)]
    [String] $ResourceGroupName,
    [parameter(Mandatory=$true)]
    [String] $DatabaseServerName,
    [parameter(Mandatory=$true)]
    [String] $DatabaseAdminUsername,
    [parameter(Mandatory=$true)]
    [String] $DatabaseAdminPassword,
    [parameter(Mandatory=$true)]
    [String] $DatabaseNames,
    [parameter(Mandatory=$true)]
    [String] $StorageAccountName,
    [parameter(Mandatory=$true)]
    [String] $BlobStorageEndpoint,
    [parameter(Mandatory=$true)]
    [String] $StorageKey,
    [parameter(Mandatory=$true)]
    [string] $BlobContainerName,
    [parameter(Mandatory=$true)]
    [Int32] $RetentionDays
)
$ErrorActionPreference = 'stop'

function Login() {
    $connectionName = "AzureRunAsConnection"
    try {
        $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
        Write-Verbose "Logging in to Azure..." -Verbose
        Add-AzureRmAccount `
            -ServicePrincipal `
            -TenantId $servicePrincipalConnection.TenantId `
            -ApplicationId $servicePrincipalConnection.ApplicationId `
            -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint | Out-Null
    }
    catch {
        if (!$servicePrincipalConnection) {
            $ErrorMessage = "Connection $connectionName not found."
            throw $ErrorMessage
        } else {
            Write-Error -Message $_.Exception
            throw $_.Exception
        }
    }
}
function Create-Blob-Container([string]$blobContainerName, $storageContext) {
    Write-Verbose "Checking if blob container '$blobContainerName' already exists" -Verbose
    if (Get-AzureStorageContainer -ErrorAction "Stop" -Context $storageContext | Where-Object { $_.Name -eq $blobContainerName }) {
        Write-Verbose "Container '$blobContainerName' already exists" -Verbose
    } else {
        New-AzureStorageContainer -ErrorAction "Stop" -Name $blobContainerName -Permission Off -Context $storageContext
        Write-Verbose "Container '$blobContainerName' created" -Verbose
    }
}
function Export-To-Blob-Storage([string]$resourceGroupName, [string]$databaseServerName, [string]$databaseAdminUsername, [string]$databaseAdminPassword, [string[]]$databaseNames, [string]$storageKey, [string]$blobStorageEndpoint, [string]$blobContainerName) {
    Write-Verbose "Starting database export of databases '$databaseNames'" -Verbose
    $securePassword = ConvertTo-SecureString -String $databaseAdminPassword -AsPlainText -Force
    $creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $databaseAdminUsername, $securePassword
    foreach ($databaseName in $databaseNames.Split(",").Trim()) {
        Write-Output "Creating request to backup database '$databaseName'"
        $bacpacFilename = $databaseName + (Get-Date).ToString("yyyyMMddHHmm") + ".bacpac"
        $bacpacUri = $blobStorageEndpoint + $blobContainerName + "/" + $bacpacFilename
        $exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $resourceGroupName -ServerName $databaseServerName `
            -DatabaseName $databaseName -StorageKeytype "StorageAccessKey" -StorageKey $storageKey -StorageUri $bacpacUri `
            -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password -ErrorAction "Stop"
        # Print status of the export
        Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink -ErrorAction "Stop"
    }
}
function Delete-Old-Backups([int]$retentionDays, [string]$blobContainerName, $storageContext) {
    Write-Output "Removing backups older than '$retentionDays' days from blob: '$blobContainerName'"
    $isOldDate = [DateTime]::UtcNow.AddDays(-$retentionDays)
    $blobs = Get-AzureStorageBlob -Container $blobContainerName -Context $storageContext
    foreach ($blob in ($blobs | Where-Object { $_.LastModified.UtcDateTime -lt $isOldDate -and $_.BlobType -eq "BlockBlob" })) {
        Write-Verbose ("Removing blob: " + $blob.Name) -Verbose
        Remove-AzureStorageBlob -Blob $blob.Name -Container $blobContainerName -Context $storageContext
    }
}
Write-Verbose "Starting database backup" -Verbose
$StorageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Login
Create-Blob-Container `
    -blobContainerName $blobContainerName `
    -storageContext $storageContext
Export-To-Blob-Storage `
    -resourceGroupName $ResourceGroupName `
    -databaseServerName $DatabaseServerName `
    -databaseAdminUsername $DatabaseAdminUsername `
    -databaseAdminPassword $DatabaseAdminPassword `
    -databaseNames $DatabaseNames `
    -storageKey $StorageKey `
    -blobStorageEndpoint $BlobStorageEndpoint `
    -blobContainerName $BlobContainerName
Delete-Old-Backups `
    -retentionDays $RetentionDays `
    -storageContext $StorageContext `
    -blobContainerName $BlobContainerName
Write-Verbose "Database backup script finished" -Verbose

Related

How to set the ComputeModel property to Serverless on an Azure SQL Database using PowerShell?

I'm restoring an Azure SQL Database (Serverless) from a deleted database backup using Get-AzSqlDeletedDatabaseBackup and Restore-AzSqlDatabase PowerShell commandlets. The restore works, but the tags and ComputeModel are not restored with the database.
I've tried using Set-AzSqlDatabase:
Set-AzSqlDatabase -ResourceGroupName $resourcegroupname -DatabaseName $databasename -ServerName $servername -ComputeModel "Serverless" -AutoPauseDelayInMinutes 45
Update: I tried the following code and the Kind is set prior to using the Set-AzResource cmdlet, but it doesn't stick
$resource = Get-AzResource -ResourceGroupName $resourcegroupname -ResourceType "Microsoft.Sql/servers/databases" -Name "$servername/$databasename"
Write-Host "Setting ComputeModel to Serverless..."
$resource.Kind = "v12.0,user,vcore,serverless"
$resource
# resource.Kind is successfully set on the $resource object
Write-Host "Set-AzResource..."
$resource | Set-AzResource -Force
Anyone have any ideas?
Thank you.
Cheers,
Andy
The Get-AzSqlDeletedDatabaseBackup and Restore-AzSqlDatabase PowerShell cmdlets don't expose a property for the ComputeModel.
When restoring or deleting a database backup the ComputeModel is not required; it only needs to be set when you configure the database afterwards.
If you want to get the compute model of an Azure SQL Database, you can use the Get-AzResource command to fetch that information.
Thanks to @joy wang's SO solution, we can get the serverless Azure SQL Database.
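For example, a short sketch of reading it back with Get-AzResource (the resource group, server, and database names are placeholders matching the variables used above):
$resource = Get-AzResource -ResourceGroupName $resourcegroupname `
    -ResourceType "Microsoft.Sql/servers/databases" -Name "$servername/$databasename"
# For a serverless database the Kind string looks like "v12.0,user,vcore,serverless"
$resource.Kind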
Thanks to @holger and @Delliganesh Sevanesan for the help, I was able to implement a solution that restores the most recently deleted database (SQL Database), adds some resource tags, and sets the ComputeModel to Serverless.
Here's the code:
<#
Purpose: Restore the most recently deleted Azure Sql Database
Dependencies:
Az.Sql PowerShell module
#>
# Set variables first
$resourcegroupname = 'myresourcegroup'
$servername = 'myservername'
$databasename = 'mydatabasename'
[hashtable]$tags = @{
    application = "testing"
}
try {
    $deleteddatabases = Get-AzSqlDeletedDatabaseBackup -ResourceGroupName $resourcegroupname -ServerName $servername -DatabaseName $databasename
} catch {
    Write-Error ("Error getting database backups [Get-AzSqlDeletedDatabaseBackup]: " + $_.Exception.Message)
    exit(1)
}
# Get most recent backup in case there are multiple copies
# Assumes index and order in foreach is the same - proven in test
Write-Host "Database backups:"
$index = 0
$MostRecentBackupIndex = 0
$MostRecentBackupDate = (Get-date).AddDays(-2) # initialize variable with date from two days ago
foreach ($db in $deleteddatabases) {
    if ($db.CreationDate -ge $MostRecentBackupDate) {
        $MostRecentBackupIndex = $index
        $MostRecentBackupDate = $db.CreationDate
        Write-Host "Most Recent Database: $($db.DatabaseName) : Created: $($db.CreationDate) : DeleteDate: $($db.DeletionDate)"
    }
    $index++
}
$deleteddatabase = $deleteddatabases[$MostRecentBackupIndex]
Write-Host "----------------------------------------------------------------------------------"
Write-Host "Restoring: $($deleteddatabase.DatabaseName) from: $($deleteddatabase.CreationDate) backup"
Write-Host "----------------------------------------------------------------------------------"
Write-Host "Deleted database info ResourceId: "
Write-Host $deleteddatabase.ResourceId
try {
    Restore-AzSqlDatabase -FromDeletedDatabaseBackup `
        -DeletionDate $deleteddatabase.DeletionDate `
        -ResourceGroupName $resourcegroupname `
        -ServerName $servername `
        -TargetDatabaseName $databasename `
        -ResourceId $deleteddatabase.ResourceID `
        -Edition $deleteddatabase.Edition `
        -Vcore 2 `
        -ComputeGeneration "Gen5"
} catch {
    Write-Error ("Error restoring database [Restore-AzSqlDatabase]: " + $_.Exception.Message)
    exit(1)
}
# Wait a few minutes to allow restore to complete before applying the tags
Start-Sleep -Seconds 180
Write-Host "Applying tags to database..."
try {
    $resource = Get-AzResource -ResourceGroupName $resourcegroupname -ResourceType "Microsoft.Sql/servers/databases" -Name "$servername/$databasename"
    New-AzTag -ResourceId $resource.Id -Tag $tags
} catch {
    Write-Error ("Error adding tags to database [Get-AzResource, New-AzTag]: " + $_.Exception.Message)
    exit(1)
}
Write-Host "Setting ComputeModel to Serverless..."
try {
    # Important - must include the -AutoPauseDelayInMinutes, -MinVcore, and -MaxVcore parameters (thanks holger)
    Set-AzSqlDatabase -ResourceGroupName $resourcegroupname -DatabaseName $databasename -ServerName $servername -ComputeModel Serverless -AutoPauseDelayInMinutes 60 -MinVcore 1 -MaxVcore 2
} catch {
    Write-Error ("Error setting serverless mode [Set-AzSqlDatabase]: " + $_.Exception.Message)
    exit(1)
}
Write-Host "Database restore complete."

Encrypt Azure Storage account key in powershell script

I'm developing a new PowerShell script to download blobs from a specific container, and the problem is security: I do not want to paste the Azure account key in plain text.
I have implemented a solution using the 'ConvertTo-SecureString' command, but the problem still exists, because when I create a connection string to the blob a message appears that says: "Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature. HTTP Status Code 403 - HTTP Error".
With the key in plain text I'm able to create the connection string properly and then list and download all blobs from the container.
I tried other solutions, for example '$Credential = New-Object System.Management.Automation.PSCredential ($ShareUser, $SharePassword)', but that gives another problem: the input is not a valid Base64 string.
Do you know how to avoid these issues and create a secure connection string for an Azure Storage account?
Best regards and thanks in advance
Here is part of my PowerShell script:
$SecurePassword= Read-Host -AsSecureString | ConvertFrom-SecureString
$SecurePassword | Out-File -FilePath C:\test_blob\pass_file.xml
$ConfigFile = 'C:\Users\\config_file.xml'
IF (Test-Path $ConfigFile) {
    [xml]$Config = Get-Content $ConfigFile
    [string] $Server = $Config.Config.Server;
    [string] $SharePassword = $Config.Config.SharePassword;
} ELSE {
    write-host "File does not exist: $ConfigFile"
}
#BlobStorageInformation
$StorageAccountName='test_acc'
$Container='test'
$DestinationFolder= 'C:\Users\user1\Blobs'
$Context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $SharePassword
#List of Blobs
$ListBlob = @()
$ListBlob += Get-AzStorageBlob -Context $Context -Container $Container | Where-Object {$_.LastModified -lt (Get-Date).AddDays(-1)}
Why would you maintain password files or enter the storage key manually when you have Az PowerShell? Just log in using Az PowerShell, set the subscription, and enjoy!
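For completeness, the login and subscription selection that the snippet below assumes would look something like this (the subscription name is a placeholder):
# Sign in and select the subscription (Az module)
Connect-AzAccount
Set-AzContext -Subscription "MySubscriptionName"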
$ResourceGroupName = "YOURRESOURCEGROUPNAME"
$StorageAccountName = "YOURSTORAGEACCOUNTNAME"
$ContainerName = "YOURCONTAINERNAME"
$LocalPath = "D:\Temp"
Write-Output 'Downloading Content from Azure blob to local...'
$storageKey = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName).value[0]
$storageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storageKey
$blobs = Get-AzStorageBlob -Container $ContainerName -Context $storageContext
foreach ($blob in $blobs)
{
    Get-AzStorageBlobContent -Container $ContainerName -Context $storageContext -Force -Destination $LocalPath -Blob $blob.Name
}
Write-Output 'Content Downloaded Successfully !!!'
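If you do still need to keep the key in a file written with ConvertFrom-SecureString (as in the question), remember that the stored string is encrypted; passing it straight to New-AzStorageContext is exactly what produces the 403. A minimal sketch of the round trip, assuming the file was created by the same user on the same machine (DPAPI encryption), which brings the key back to plain text in memory only:
# Read the encrypted string back and recover the plain-text key before use
$secureKey = Get-Content -Path 'C:\test_blob\pass_file.xml' | ConvertTo-SecureString
$plainKey = [System.Net.NetworkCredential]::new('', $secureKey).Password
$Context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $plainKey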

Backup to azure Blob showing completed while backup still in progress

I am using Azure Automation to automate backing up my database to Azure Blob storage. It all works fine, but the issue I am trying to resolve is that the status shows Completed before the actual backup is done, and if I try to rerun the automation it tells me this:
"ErrorActionPreference" or common parameter is set to Stop: 45183: There is an import or export operation in progress on the database 'database'.
Here is my PowerShell script:
param(
    [parameter(Mandatory=$true)]
    [String] $ResourceGroupName,
    [parameter(Mandatory=$true)]
    [String] $DatabaseServerName,
    [parameter(Mandatory=$true)]
    [String] $DatabaseAdminUsername,
    [parameter(Mandatory=$true)]
    [String] $DatabaseAdminPassword,
    [parameter(Mandatory=$true)]
    [String] $DatabaseNames,
    [parameter(Mandatory=$true)]
    [String] $StorageAccountName,
    [parameter(Mandatory=$true)]
    [String] $BlobStorageEndpoint,
    [parameter(Mandatory=$true)]
    [String] $StorageKey,
    [parameter(Mandatory=$true)]
    [string] $BlobContainerName
    # [parameter(Mandatory=$true)]
    # [Int32] $RetentionDays
)
$ErrorActionPreference = 'stop'

function Login() {
    $connectionName = "AzureRunAsConnection"
    try {
        $servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
        Write-Verbose "Logging in to Azure..." -Verbose
        Add-AzureRmAccount `
            -ServicePrincipal `
            -TenantId $servicePrincipalConnection.TenantId `
            -ApplicationId $servicePrincipalConnection.ApplicationId `
            -CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint | Out-Null
    }
    catch {
        if (!$servicePrincipalConnection) {
            $ErrorMessage = "Connection $connectionName not found."
            throw $ErrorMessage
        } else {
            Write-Error -Message $_.Exception
            throw $_.Exception
        }
    }
}
function Create-Blob-Container([string]$blobContainerName, $storageContext) {
    Write-Verbose "Checking if blob container '$blobContainerName' already exists" -Verbose
    if (Get-AzureStorageContainer -ErrorAction "Stop" -Context $storageContext | Where-Object { $_.Name -eq $blobContainerName }) {
        Write-Verbose "Container '$blobContainerName' already exists" -Verbose
    } else {
        New-AzureStorageContainer -ErrorAction "Stop" -Name $blobContainerName -Permission Off -Context $storageContext
        Write-Verbose "Container '$blobContainerName' created" -Verbose
    }
}
function Export-To-Blob-Storage([string]$resourceGroupName, [string]$databaseServerName, [string]$databaseAdminUsername, [string]$databaseAdminPassword, [string[]]$databaseNames, [string]$storageKey, [string]$blobStorageEndpoint, [string]$blobContainerName) {
    Write-Verbose "Starting database export of databases '$databaseNames'" -Verbose
    $securePassword = ConvertTo-SecureString -String $databaseAdminPassword -AsPlainText -Force
    $creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $databaseAdminUsername, $securePassword
    foreach ($databaseName in $databaseNames.Split(",").Trim()) {
        Write-Output "Creating request to backup database '$databaseName'"
        $bacpacFilename = $databaseName + "LiveBak_anon" + ".bacpac"
        $bacpacUri = $blobStorageEndpoint + "/" + $blobContainerName + "/" + $bacpacFilename
        $exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $resourceGroupName -ServerName $databaseServerName `
            -DatabaseName $databaseName -StorageKeytype "StorageAccessKey" -StorageKey $storageKey -StorageUri $bacpacUri `
            -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password -ErrorAction "Stop"
        # Print status of the export
        # Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink -ErrorAction "Stop"
    }
}
# function Delete-Old-Backups([int]$retentionDays, [string]$blobContainerName, $storageContext) {
#     Write-Output "Removing backups older than '$retentionDays' days from blob: '$blobContainerName'"
#     $isOldDate = [DateTime]::UtcNow.AddDays(-$retentionDays)
#     $blobs = Get-AzureStorageBlob -Container $blobContainerName -Context $storageContext
#     foreach ($blob in ($blobs | Where-Object { $_.LastModified.UtcDateTime -lt $isOldDate -and $_.BlobType -eq "BlockBlob" })) {
#         Write-Verbose ("Removing blob: " + $blob.Name) -Verbose
#         Remove-AzureStorageBlob -Blob $blob.Name -Container $blobContainerName -Context $storageContext
#     }
# }
Write-Verbose "Starting database backup" -Verbose
$StorageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Login
Create-Blob-Container `
    -blobContainerName $blobContainerName `
    -storageContext $storageContext
Export-To-Blob-Storage `
    -resourceGroupName $ResourceGroupName `
    -databaseServerName $DatabaseServerName `
    -databaseAdminUsername $DatabaseAdminUsername `
    -databaseAdminPassword $DatabaseAdminPassword `
    -databaseNames $DatabaseNames `
    -storageKey $StorageKey `
    -blobStorageEndpoint $BlobStorageEndpoint `
    -blobContainerName $BlobContainerName
# Delete-Old-Backups `
#     -retentionDays $RetentionDays `
#     -storageContext $StorageContext `
#     -blobContainerName $BlobContainerName
Write-Verbose "Database backup script finished" -Verbose
All I basically need is for this process to show as running while the backup operation is in progress, because I am using this automation in a Logic App.
You could use Get-AzureRmSqlDatabaseImportExportStatus
Sample :
PS C:\>Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink "https://management.contoso.com/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/resource01/providers/Microsoft.Sql/servers/server01/databases/database01/importExportOperationResults/00000000-000-0000-0000-000000000000?api-version=2014-04-01"
OperationStatusLink :
ErrorMessage :
LastModifiedTime : 4/15/2016 10:16:14 PM
QueuedTime : 4/15/2016 10:16:13 PM
StatusMessage : Running, Progress = 5.00 %
Status : InProgress
When you run New-AzureRmSqlDatabaseExport, a job is submitted, but the script never waits for the job to complete; the subsequent lines are executed immediately.
In your code, you have the following line:
Write-Verbose "Database backup script finished" -Verbose
Once the export job is submitted, this line is executed right away.
To overcome this, you could use Get-AzureRmSqlDatabaseImportExportStatus to poll the status in a loop and only proceed once it reports completion.
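A minimal polling sketch along those lines (the 30-second interval is arbitrary):
# Block until the export finishes by polling the operation status
$status = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
while ($status.Status -eq "InProgress") {
    Start-Sleep -Seconds 30
    $status = Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
    Write-Output $status.StatusMessage
}
Write-Output "Export finished with status: $($status.Status)"
Placed at the end of the loop in Export-To-Blob-Storage, this keeps the runbook job (and therefore the Logic App step) in a running state until the export actually completes.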

Listing blobs in Loop not being Shown

I've created a script which deletes blobs older than a set date, and I'm trying to run it using an Automation account. When I test it using the "test pane" it gives the desired output, which is a list of blobs to be deleted; however, when it actually runs under the Automation account it doesn't display the list of blobs being deleted.
The code is below:
### delete blobs older than 30 days
param(
    [parameter(mandatory=$true)]
    [int32]$daysToKeep,
    [parameter(mandatory=$true)]
    [string]$storageAccount,
    [parameter(mandatory=$true)]
    [string]$storageContainer,
    [parameter(mandatory=$true)]
    [string]$storageAccessKey
)
$connectionName = "AzureRunAsConnection"
# Get the connection "AzureRunAsConnection "
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
"Logging in to Azure..."
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint
Write-Host "logged into Azure"
$context = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageAccessKey
New-AzureStorageContainer -Name $storageContainer -Context $context -Permission Blob -ErrorAction SilentlyContinue
$EGBlobs = Get-AzureStorageBlob -Container $storageContainer -Context $context | sort-object LastModified | select lastmodified, name
foreach ($blob in $EGBlobs)
{
    if ($blob.lastmodified -lt (get-date).AddDays($daysToKeep*-1))
    {
        $blob_date = [datetime]$blob.LastModified.UtcDateTime
        Write-Output "-----------------------------------"
        write-output "Purging blob from Storage: " $blob.name
        write-output "----------------------------------- "
        write-output "Last Modified Date of the Blob: " $blob_date
        Write-Output "-----------------------------------"
        Remove-AzureStorageBlob -Blob $blob.name -Container $storageContainer -Context $context
    }
}
I can't see where I'm going wrong; is this a setting within the Azure Automation account?
Thanks in advance
I'm not sure why, but that is in fact how it behaves.
Just move the line Remove-AzureStorageBlob -Blob $blob.name -Container $storageContainer -Context $context to the top of the loop, and it will work fine.
It should be:
foreach ($blob in $EGBlobs)
{
    if ($blob.lastmodified -lt (get-date).AddDays($daysToKeep*-1))
    {
        Remove-AzureStorageBlob -Blob $blob.name -Container $storageContainer -Context $context
        $blob_date = [datetime]$blob.LastModified.UtcDateTime
        Write-Output "-----------------------------------"
        write-output "Purging blob from Storage: " $blob.name
        write-output "----------------------------------- "
        write-output "Last Modified Date of the Blob: " $blob_date
        Write-Output "-----------------------------------"
    }
}

Copy-BlobFromAzureStorage

I'm trying to copy a blob from Azure Storage, and for that I have taken a runbook from the Azure runbook gallery named "Copy-BlobFromAzureStorage". When I test it, it prompts me for "PATHTOPLACEBLOB"; here I have given the default location "C:\", and it runs fine. But the thing is, I don't understand where exactly I can find the stored blob, and it gives "PSComputerName" as "localhost". Can anyone please advise me on this?
Code:
workflow Copy-BlobFromAzureStorage {
    param
    (
        [parameter(Mandatory=$True)]
        [String]
        $AzureSubscriptionName,
        [parameter(Mandatory=$True)]
        [PSCredential]
        $AzureOrgIdCredential,
        [parameter(Mandatory=$True)]
        [String]
        $StorageAccountName,
        [parameter(Mandatory=$True)]
        [String]
        $ContainerName,
        [parameter(Mandatory=$True)]
        [String]
        $BlobName,
        [parameter(Mandatory=$False)]
        [String]
        $PathToPlaceBlob = "C:\"
    )
    $Null = Add-AzureAccount -Credential $AzureOrgIdCredential
    $Null = Select-AzureSubscription -SubscriptionName $AzureSubscriptionName
    Write-Verbose "Downloading $BlobName from Azure Blob Storage to $PathToPlaceBlob"
    Set-AzureSubscription `
        -SubscriptionName $AzureSubscriptionName `
        -CurrentStorageAccount $StorageAccountName
    $blob =
        Get-AzureStorageBlobContent `
            -Blob $BlobName `
            -Container $ContainerName `
            -Destination $PathToPlaceBlob `
            -Force
    try {
        Get-Item -Path "$PathToPlaceBlob\$BlobName" -ErrorAction Stop
    }
    catch {
        Get-Item -Path $PathToPlaceBlob
    }
}
The blob is placed on the sandbox where the Azure Automation runbook is running. There's not much point in leaving it there, since the sandbox is cleaned up after the runbook job finishes, but depending on your scenario it can make sense as an intermediate stop for the blob, such as to edit it or transfer it somewhere else outside of the sandbox (e.g. another Azure Storage account or an FTP server).
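For example, a hedged sketch of forwarding the downloaded file from the sandbox to a second storage account before the job ends (the destination account name, key, and container name are placeholders):
# Upload the file from the runbook sandbox to another storage account
$destContext = New-AzureStorageContext -StorageAccountName "DESTINATIONACCOUNT" -StorageAccountKey "DESTINATIONKEY"
Set-AzureStorageBlobContent -File "$PathToPlaceBlob\$BlobName" `
    -Container "destination-container" `
    -Blob $BlobName `
    -Context $destContext -Force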
