Using Azure Automation to create a cloud service instance using PowerShell

I'm trying to write a script that I can automate on Azure to create a new instance of a cloud service. I am having trouble getting the New-AzureDeployment cmdlet to work.
Both the CSPKG and CSCFG files are stored on Azure under the same storage account but in different containers. They were uploaded using CloudBerry.
param(
[parameter(Mandatory=$False)]
[string] $StorageAccount = 'storageaccount',
[parameter(Mandatory=$False)]
[string] $ServiceName = 'cloudservicesdev',
[parameter(Mandatory=$False)]
[string] $Slot = 'Production',
[parameter(Mandatory=$False)]
[string] $Label = 'BASE'
)
Write-Output "Start of workflow"
$cert = Get-AutomationCertificate -Name 'Credential'
$subID = 'subId'
Set-AzureSubscription -SubscriptionName "SubName" -CurrentStorageAccountName $StorageAccount -Certificate $cert -SubscriptionId $subID
Select-AzureSubscription "SubName"
$package = (Get-AzureStorageBlob -blob "package.cspkg" -Container "package").ICloudBlob.uri.AbsoluteUri
$config = (Get-AzureStorageBlob -blob "Config - Dev.cscfg" -Container "config").ICloudBlob.uri.AbsoluteUri
New-AzureDeployment -ServiceName "$ServiceName" -Slot "$Slot" -Package "$package" -Configuration "$config" -Label "$Label"
I get the following error
2014-12-08 05:57:57 PM, Error: New-AzureDeployment : The given path's format is not supported.
At Create_New_Cloud_Service_Instance:40 char:40
+
+ CategoryInfo : NotSpecified: (:) [New-AzureDeployment], NotSupportedException
+ FullyQualifiedErrorId :
System.NotSupportedException,Microsoft.WindowsAzure.Commands.ServiceManagement.HostedServices.NewAzureDeploymentCommand
I've checked both the $package and $config variables and they are pointing to the file locations I'd expect (https://storageaccount.blob.core.windows.net/package/package.cspkg and https://storageaccount.blob.core.windows.net/config/Config%20-%20Dev.cscfg respectively). They match the URLs I see when I navigate to the files under their containers in Storage.
This looks similar to the examples that I have seen. What have I done wrong?
This is the new code that I used, based on Joe's answer below.
I also used this example script https://gallery.technet.microsoft.com/scriptcenter/Continuous-Deployment-of-A-eeebf3a6 to help with using the Azure Automation sandbox.
param(
[parameter(Mandatory=$False)]
[string] $StorageAccount = 'storageaccount',
[parameter(Mandatory=$False)]
[string] $ServiceName = 'cloudservicesdev',
[parameter(Mandatory=$False)]
[string] $Slot = 'Production',
[parameter(Mandatory=$False)]
[string] $Label = 'BASE'
)
Write-Output "Start of workflow"
$cert = Get-AutomationCertificate -Name 'Credential'
$subID = 'subId'
Set-AzureSubscription -SubscriptionName "SubName" -CurrentStorageAccountName $StorageAccount -Certificate $cert -SubscriptionId $subID
Select-AzureSubscription "SubName"
$package = (Get-AzureStorageBlob -blob "package.cspkg" -Container "package").ICloudBlob.uri.AbsoluteUri
$TempFileLocation = "C:\temp\Config - Dev.cscfg"
$config = (Get-AzureStorageBlobContent -blob "Config - Dev.cscfg" -Container "config" -Destination $TempFileLocation -Force)
New-AzureDeployment -ServiceName "$ServiceName" -Slot "$Slot" -Package "$package" -Configuration $TempFileLocation -Label "$Label"

The -Package parameter of New-AzureDeployment should be passed a storage URI as you are doing, but the -Configuration parameter expects a local file. See http://msdn.microsoft.com/en-us/library/azure/dn495143.aspx for more details.
So you need to, within the runbook, download the file at the $config URI to the Azure Automation sandbox, and then pass that file's local path to New-AzureDeployment.
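A compact sketch of that approach inside the runbook, reusing the blob and container names from the question (the temp path is illustrative):
$localConfig = Join-Path $env:TEMP "Config - Dev.cscfg"
Get-AzureStorageBlobContent -Blob "Config - Dev.cscfg" -Container "config" -Destination $localConfig -Force | Out-Null
New-AzureDeployment -ServiceName $ServiceName -Slot $Slot -Package $package -Configuration $localConfig -Label $Label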

Related

Setup and deploy Azure function using PowerShell

I'm trying to set up and deploy an Azure Function using a PowerShell script, based on this topic: Setup Azure Function from PowerShell
My script looks like this:
#=============Defining All Variables=========
$location = 'Central US'
$resourceGroupName = 'MyResourceGroup'
$subscriptionId = 'MysubscriptionId'
$functionAppName = 'MyfunctionAppName'
$appServicePlanName = 'ASP-test-8b50'
$tier = 'Dynamic'
$archivePath = 'd:\TestAzureFunc.zip'
Connect-AzAccount
#========Creating Azure Resource Group========
$resourceGroup = Get-AzResourceGroup | Where-Object { $_.ResourceGroupName -eq $resourceGroupName }
if ($resourceGroup -eq $null)
{
New-AzResourceGroup -Name $resourceGroupName -Location $location -force
}
#selecting default azure subscription by name
Select-AzSubscription -SubscriptionID $subscriptionId
Set-AzContext $subscriptionId
#========Creating App Service Plan============
New-AzAppServicePlan -ResourceGroupName $resourceGroupName -Name $appServicePlanName -Location $location -Tier $tier
$functionAppSettings = @{
ServerFarmId="/subscriptions/$subscriptionId/resourceGroups/$resourceGroupName/providers/Microsoft.Web/serverfarms/$appServicePlanName";
alwaysOn=$True;
}
#========Creating Azure Function========
$functionAppResource = Get-AzResource | Where-Object { $_.ResourceName -eq $functionAppName -And $_.ResourceType -eq "Microsoft.Web/Sites" }
if ($functionAppResource -eq $null)
{
New-AzResource -ResourceType 'Microsoft.Web/Sites' -ResourceName $functionAppName -kind 'functionapp' -Location $location -ResourceGroupName $resourceGroupName -Properties $functionAppSettings -force
}
#========Defining Azure Function Settings========
$AppSettings = @{'FUNCTIONS_EXTENSION_VERSION' = '~2';
'FUNCTIONS_WORKER_RUNTIME' = 'dotnet';}
Set-AzWebApp -Name $functionAppName -ResourceGroupName $resourceGroupName -AppSettings $AppSettings
#========Deploy Azure Function from zip========
Publish-AzWebapp -ResourceGroupName $resourceGroupName -Name $functionAppName -ArchivePath $archivePath
The script runs without errors, and the resource group and Function App are created as needed. But the Function App's list of functions is empty.
My intuition tells me that I've forgotten something, but I don't know what.
Could you advise me on how to deploy my Azure Function properly?
One workaround you can follow:
Looking at your script, you need to make sure you are providing the function app configuration as in the cmdlets below (from the link you followed):
$AzFunctionAppSettings = @{
APPINSIGHTS_INSTRUMENTATIONKEY = $AppInsightsKey;
AzureWebJobsDashboard = $AzFunctionAppStorageAccountConnectionString;
AzureWebJobsStorage = $AzFunctionAppStorageAccountConnectionString;
FUNCTIONS_EXTENSION_VERSION = "~4";
FUNCTIONS_WORKER_RUNTIME = "dotnet";
}
Also make sure that the storage account connection string you provide in the function app settings is the same as the one for the storage account you are actually using.
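For instance, a sketch of building that connection string with the Az cmdlets ($storageAccountName is an assumed placeholder you would add to the variables at the top of the script):
$storageKey = (Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -Name $storageAccountName)[0].Value
$storageConnectionString = "DefaultEndpointsProtocol=https;AccountName=$storageAccountName;AccountKey=$storageKey;EndpointSuffix=core.windows.net"
# Use $storageConnectionString for AzureWebJobsStorage (and AzureWebJobsDashboard) in the settings above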
Then you can use the Kudu API to check whether the wwwroot folder exists and contains your deployed functions.
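One way to do that check from PowerShell via the Kudu VFS API (a sketch; it assumes you are still signed in with the Az module and uses the site's publishing credentials):
$publishingCreds = Invoke-AzResourceAction -ResourceGroupName $resourceGroupName `
    -ResourceType Microsoft.Web/sites/config -ResourceName "$functionAppName/publishingcredentials" `
    -Action list -ApiVersion 2015-08-01 -Force
$pair = "$($publishingCreds.Properties.PublishingUserName):$($publishingCreds.Properties.PublishingPassword)"
$authHeader = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes($pair)) }
# An empty listing of wwwroot means the zip deployment did not land
Invoke-RestMethod -Uri "https://$functionAppName.scm.azurewebsites.net/api/vfs/site/wwwroot/" -Headers $authHeader -Method Get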
For more information, please refer to the links below:
SO THREAD | PowerShell command Publish-AzWebApp not publishing application
BLOG | How to Deploy Azure Function Apps With PowerShell

Encrypt Azure Storage account key in powershell script

I'm developing a new PowerShell script to download blobs from a specific container, and for security reasons I do not want to paste the Azure account key in plain text.
I implemented a solution using the ConvertTo-SecureString command, but the problem remains: when I create the connection to the blob, I get the message "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. HTTP Status Code 403 - HTTP Error".
With the key in plain text I am able to create the connection properly and then list and download all blobs from the container.
I also tried other solutions, for example $Credential = New-Object System.Management.Automation.PSCredential($ShareUser, $SharePassword),
but then I get another error saying the input is not a valid Base64 string.
Do you know how to avoid these issues and create a secure connection to an Azure Storage account?
Best regards and thanks in advance.
Here is part of my PowerShell script:
$SecurePassword= Read-Host -AsSecureString | ConvertFrom-SecureString
$SecurePassword | Out-File -FilePath C:\test_blob\pass_file.xml
$ConfigFile= 'C:\Users\\config_file.xml'
IF (Test-Path $ConfigFile) {
[xml]$Config= Get-Content $ConfigFile
[string] $Server = $Config.Config.Server;
[string] $SharePassword = $Config.Config.SharePassword;
} ELSE
{
write-host "File do not exists: $ConfigFile"
}
#BlobStorageInformation
$StorageAccountName='test_acc'
$Container='test'
$DestinationFolder= 'C:\Users\user1\Blobs'
$Context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $SharePassword
#List of Blobs
$ListBlob=@()
$ListBlob+= Get-AzStorageBlob -context $Context -container $Container | Where-Object {$_.LastModified -lt (Get-Date).AddDays(-1)}
Why maintain password files or enter the storage key manually when you have Az PowerShell? Just log in with Az PowerShell, set the subscription, retrieve the key at runtime, and enjoy!
$ResourceGroupName = "YOURRESOURCEGROUPNAME"
$StorageAccountName = "YOURSTORAGEACCOUNTNAME"
$ContainerName = "YOURCONTAINERNAME"
$LocalPath = "D:\Temp"
Write-Output 'Downloading Content from Azure blob to local...'
$storageKey = (Get-AzStorageAccountKey -ResourceGroupName $ResourceGroupName -AccountName $StorageAccountName).value[0]
$storageContext = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $storageKey
$blobs = Get-AzStorageBlob -Container $ContainerName -Context $storageContext
foreach($blob in $blobs)
{
Get-AzStorageBlobContent -Container $ContainerName -Context $storageContext -Force -Destination $LocalPath -Blob $blob.Name
}
Write-Output 'Content Downloaded Successfully !!!'
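As a side note on the original 403 error: the value produced by ConvertFrom-SecureString is an encrypted blob, not the key itself, so it cannot be passed directly to New-AzStorageContext. If you do keep the key in an encrypted file, convert it back to plain text first (a sketch; it assumes the file was created by the same user on the same machine, which is what the default DPAPI encryption requires):
$secureKey = Get-Content 'C:\test_blob\pass_file.xml' | ConvertTo-SecureString
$bstr = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($secureKey)
$plainKey = [System.Runtime.InteropServices.Marshal]::PtrToStringBSTR($bstr)
$Context = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $plainKey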

Daily import and export Azure SQL Database

I want to back up the data in an Azure SQL Database daily and save it as a file in Blob storage, so that when my system has an error I can import a backup from Blob storage to recover the database. For the export I found that Data Factory can do it, but importing the data back is hard. What is the best way to resolve my problem?
Thanks for your help.
The best way to do this is to schedule a daily job using Azure Automation. Below you will find a PowerShell runbook you can use on Azure Automation to backup your database to an Azure storage account.
<#
.SYNOPSIS
This Azure Automation runbook automates Azure SQL database backup to Blob storage and deletes old backups from blob storage.
.DESCRIPTION
You should use this Runbook if you want manage Azure SQL database backups in Blob storage.
This runbook can be used together with Azure SQL Point-In-Time-Restore.
This is a PowerShell runbook, as opposed to a PowerShell Workflow runbook.
.PARAMETER ResourceGroupName
Specifies the name of the resource group where the Azure SQL Database server is located
.PARAMETER DatabaseServerName
Specifies the name of the Azure SQL Database Server which script will backup
.PARAMETER DatabaseAdminUsername
Specifies the administrator username of the Azure SQL Database Server
.PARAMETER DatabaseAdminPassword
Specifies the administrator password of the Azure SQL Database Server
.PARAMETER DatabaseNames
Comma separated list of databases script will backup
.PARAMETER StorageAccountName
Specifies the name of the storage account where backup file will be uploaded
.PARAMETER BlobStorageEndpoint
Specifies the base URL of the storage account
.PARAMETER StorageKey
Specifies the storage key of the storage account
.PARAMETER BlobContainerName
Specifies the container name of the storage account where backup file will be uploaded. Container will be created if it does not exist.
.PARAMETER RetentionDays
Specifies the number of days how long backups are kept in blob storage. Script will remove all older files from container.
For this reason dedicated container should be only used for this script.
.INPUTS
None.
.OUTPUTS
Human-readable informational and error messages produced during the job. Not intended to be consumed by another runbook.
#>
param(
[parameter(Mandatory=$true)]
[String] $ResourceGroupName,
[parameter(Mandatory=$true)]
[String] $DatabaseServerName,
[parameter(Mandatory=$true)]
[String]$DatabaseAdminUsername,
[parameter(Mandatory=$true)]
[String]$DatabaseAdminPassword,
[parameter(Mandatory=$true)]
[String]$DatabaseNames,
[parameter(Mandatory=$true)]
[String]$StorageAccountName,
[parameter(Mandatory=$true)]
[String]$BlobStorageEndpoint,
[parameter(Mandatory=$true)]
[String]$StorageKey,
[parameter(Mandatory=$true)]
[string]$BlobContainerName,
[parameter(Mandatory=$true)]
[Int32]$RetentionDays
)
$ErrorActionPreference = 'stop'
function Login() {
$connectionName = "AzureRunAsConnection"
try
{
$servicePrincipalConnection = Get-AutomationConnection -Name $connectionName
Write-Verbose "Logging in to Azure..." -Verbose
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $servicePrincipalConnection.TenantId `
-ApplicationId $servicePrincipalConnection.ApplicationId `
-CertificateThumbprint $servicePrincipalConnection.CertificateThumbprint | Out-Null
}
catch {
if (!$servicePrincipalConnection)
{
$ErrorMessage = "Connection $connectionName not found."
throw $ErrorMessage
} else{
Write-Error -Message $_.Exception
throw $_.Exception
}
}
}
function Create-Blob-Container([string]$blobContainerName, $storageContext) {
Write-Verbose "Checking if blob container '$blobContainerName' already exists" -Verbose
if (Get-AzureStorageContainer -ErrorAction "Stop" -Context $storageContext | Where-Object { $_.Name -eq $blobContainerName }) {
Write-Verbose "Container '$blobContainerName' already exists" -Verbose
} else {
New-AzureStorageContainer -ErrorAction "Stop" -Name $blobContainerName -Permission Off -Context $storageContext
Write-Verbose "Container '$blobContainerName' created" -Verbose
}
}
function Export-To-Blob-Storage([string]$resourceGroupName, [string]$databaseServerName, [string]$databaseAdminUsername, [string]$databaseAdminPassword, [string[]]$databaseNames, [string]$storageKey, [string]$blobStorageEndpoint, [string]$blobContainerName) {
Write-Verbose "Starting database export to databases '$databaseNames'" -Verbose
$securePassword = ConvertTo-SecureString -String $databaseAdminPassword -AsPlainText -Force
$creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $databaseAdminUsername, $securePassword
foreach ($databaseName in $databaseNames.Split(",").Trim()) {
Write-Output "Creating request to backup database '$databaseName'"
$bacpacFilename = $databaseName + (Get-Date).ToString("yyyyMMddHHmm") + ".bacpac"
$bacpacUri = $blobStorageEndpoint + $blobContainerName + "/" + $bacpacFilename
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $resourceGroupName -ServerName $databaseServerName `
-DatabaseName $databaseName -StorageKeytype "StorageAccessKey" -StorageKey $storageKey -StorageUri $bacpacUri `
-AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password -ErrorAction "Stop"
# Print status of the export
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink -ErrorAction "Stop"
}
}
function Delete-Old-Backups([int]$retentionDays, [string]$blobContainerName, $storageContext) {
Write-Output "Removing backups older than '$retentionDays' days from blob: '$blobContainerName'"
$isOldDate = [DateTime]::UtcNow.AddDays(-$retentionDays)
$blobs = Get-AzureStorageBlob -Container $blobContainerName -Context $storageContext
foreach ($blob in ($blobs | Where-Object { $_.LastModified.UtcDateTime -lt $isOldDate -and $_.BlobType -eq "BlockBlob" })) {
Write-Verbose ("Removing blob: " + $blob.Name) -Verbose
Remove-AzureStorageBlob -Blob $blob.Name -Container $blobContainerName -Context $storageContext
}
}
Write-Verbose "Starting database backup" -Verbose
$StorageContext = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageKey
Login
Create-Blob-Container `
-blobContainerName $blobContainerName `
-storageContext $storageContext
Export-To-Blob-Storage `
-resourceGroupName $ResourceGroupName `
-databaseServerName $DatabaseServerName `
-databaseAdminUsername $DatabaseAdminUsername `
-databaseAdminPassword $DatabaseAdminPassword `
-databaseNames $DatabaseNames `
-storageKey $StorageKey `
-blobStorageEndpoint $BlobStorageEndpoint `
-blobContainerName $BlobContainerName
Delete-Old-Backups `
-retentionDays $RetentionDays `
-storageContext $StorageContext `
-blobContainerName $BlobContainerName
Write-Verbose "Database backup script finished" -Verbose

Export SQL Azure db to blob - Start-AzureSqlDatabaseExport : Cannot convert AzureStorageContainer to AzureStorageContainer

I am using this code found on Stack Overflow, and all connections are correct.
Import-Module Azure
Import-Module Azure.Storage
Get-AzureRmSubscription -SubscriptionName "Production" | Select-AzureRmSubscription
# Username for Azure SQL Database server
$ServerLogin = "username"
# Password for Azure SQL Database server
$serverPassword = ConvertTo-SecureString "abcd" -AsPlainText -Force
# Establish credentials for Azure SQL Database Server
$ServerCredential = new-object System.Management.Automation.PSCredential($ServerLogin, $serverPassword)
# Create connection context for Azure SQL Database server
$SqlContext = New-AzureSqlDatabaseServerContext -FullyQualifiedServerName "myspecialsqlserver.database.windows.net" -Credential $ServerCredential
$StorageContext = New-AzureStorageContext -StorageAccountName 'prodwad' -StorageAccountKey 'xxxxx'
$Container = Get-AzureStorageContainer -Name 'automateddbbackups' -Context $StorageContext
$exportRequest = Start-AzureSqlDatabaseExport -SqlConnectionContext $SqlContext -StorageContainer $Container -DatabaseName 'Users' -BlobName 'autobackupotest.bacpac' -Verbose -Debug
I am getting this error. I have spent hours on this.
Start-AzureSqlDatabaseExport : Cannot bind parameter 'StorageContainer'. Cannot convert the
"Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageContainer" value of type
"Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageContainer" to type
"Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageContainer".
At line:31 char:99
+ ... SqlConnectionContext $SqlContext -StorageContainer $Container -Databa ...
+ ~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Start-AzureSqlDatabaseExport], ParameterBindingException
+ FullyQualifiedErrorId : CannotConvertArgumentNoMessage,Microsoft.WindowsAzure.Commands.SqlDatabase.Database.Cmdlet.StartAzureSqlDatabaseExport
I am using AzureRM 3.8.0
Based on your description and code, if you want to start a new SQL database export, I suggest you try the code below. It works well.
$subscriptionId = "YOUR AZURE SUBSCRIPTION ID"
Login-AzureRmAccount
Set-AzureRmContext -SubscriptionId $subscriptionId
# Database to export
$DatabaseName = "DATABASE-NAME"
$ResourceGroupName = "RESOURCE-GROUP-NAME"
$ServerName = "SERVER-NAME"
$serverAdmin = "ADMIN-NAME"
$serverPassword = "ADMIN-PASSWORD"
$securePassword = ConvertTo-SecureString -String $serverPassword -AsPlainText -Force
$creds = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $serverAdmin, $securePassword
# Generate a unique filename for the BACPAC
$bacpacFilename = $DatabaseName + (Get-Date).ToString("yyyyMMddHHmm") + ".bacpac"
# Storage account info for the BACPAC
$BaseStorageUri = "https://STORAGE-NAME.blob.core.windows.net/BLOB-CONTAINER-NAME/"
$BacpacUri = $BaseStorageUri + $bacpacFilename
$StorageKeytype = "StorageAccessKey"
$StorageKey = "YOUR STORAGE KEY"
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
-DatabaseName $DatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
-AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password
$exportRequest
# Check status of the export
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
Then you can use Get-AzureRmSqlDatabaseImportExportStatus to see the detailed status of the export.
I got the same problem after updating some PowerShell packages (I don't remember exactly which); after the update my scripts started to fail.
My solution is:
Install the latest AzureRM from NuGet via PowerShell.
Use the other parameter set of Start-AzureSqlDatabaseExport, which takes -StorageContainerName and -StorageContext rather than -StorageContainer, as in the sketch below.
It looks like if you pass those parameters, the cmdlet creates the container object internally without crashing.
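A minimal sketch of that alternative call, reusing the context and names from the question:
$exportRequest = Start-AzureSqlDatabaseExport -SqlConnectionContext $SqlContext `
    -StorageContext $StorageContext -StorageContainerName 'automateddbbackups' `
    -DatabaseName 'Users' -BlobName 'autobackupotest.bacpac' -Verbose
$exportRequest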

Copy-BlobFromAzureStorage

I'm trying to copy a blob from Azure Storage. For that I have taken a runbook from the Azure runbook gallery named "Copy-BlobFromAzureStorage". When I test it, it prompts me for "PATHTOPLACEBLOB"; I gave the default location "C:\" and it runs fine. But I don't understand where exactly I can find the stored blob, and "PSComputerName" is given as "localhost". Can anyone please advise me on this?
Code:
workflow Copy-BlobFromAzureStorage{
param
(
[parameter(Mandatory=$True)]
[String]
$AzureSubscriptionName,
[parameter(Mandatory=$True)]
[PSCredential]
$AzureOrgIdCredential,
[parameter(Mandatory=$True)]
[String]
$StorageAccountName,
[parameter(Mandatory=$True)]
[String]
$ContainerName,
[parameter(Mandatory=$True)]
[String]
$BlobName,
[parameter(Mandatory=$False)]
[String]
$PathToPlaceBlob = "C:\"
)
$Null = Add-AzureAccount -Credential $AzureOrgIdCredential
$Null = Select-AzureSubscription -SubscriptionName $AzureSubscriptionName
Write-Verbose "Downloading $BlobName from Azure Blob Storage to $PathToPlaceBlob"
Set-AzureSubscription `
-SubscriptionName $AzureSubscriptionName `
-CurrentStorageAccount $StorageAccountName
$blob =
Get-AzureStorageBlobContent `
-Blob $BlobName `
-Container $ContainerName `
-Destination $PathToPlaceBlob `
-Force
try {
Get-Item -Path "$PathToPlaceBlob\$BlobName" -ErrorAction Stop
}
catch {
Get-Item -Path $PathToPlaceBlob
}}
The blob is placed on the sandbox where the Azure Automation runbook is running. There's not much point in leaving it there, since the sandbox is cleaned up after the runbook job finishes. But depending on your scenario, it can make sense as an intermediate location for the blob, for example to edit it or to transfer it somewhere else outside of the sandbox (e.g. another Azure Storage account or an FTP server).
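For example, a sketch of pushing the downloaded file on to a second storage account from within the same runbook (the destination account name, key, and container name are placeholders):
$destContext = New-AzureStorageContext -StorageAccountName "otherstorageaccount" -StorageAccountKey "<destination-account-key>"
Set-AzureStorageBlobContent -File "$PathToPlaceBlob\$BlobName" -Container "archive" -Blob $BlobName -Context $destContext -Force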
