Azure Data Factory webhook execution times out instead of relaying errors - azure

I've attempted to set up a simple webhook execution in Azure Data Factory (v2), calling a simple (parameter-less) webhook for an Azure Automation Runbook I set up.
From the Azure Portal, I can see that the webhook is being executed and my runbook is being run, so far so good. The runbook is (currently) returning an error within 1 minute of execution - but that's fine, I also want to test failure scenarios.
Problem:
Data Factory doesn't seem to be 'seeing' the error result and spins until the timeout (10 minutes) elapses. When I kick off a debug run of the pipeline, I get the same - a timeout and no error result.
Update: I've fixed the runbook and it's now completing successfully, but Data Factory is still timing out and is not seeing the success response either.
Here is a screenshot of the setup:
And here is the portal confirming that the webhook is being run by azure data factory, and is completing in under a minute:
WEBHOOKDATA JSON is:
{"WebhookName":"Start CAMS VM","RequestBody":"{\r\n \"callBackUri\": \"https://dpeastus.svc.datafactory.azure.com/dataplane/workflow/callback/f7c...df2?callbackUrl=AAEAFF...0927&shouldReportToMonitoring=True&activityType=WebHook\"\r\n}","RequestHeader":{"Connection":"Keep-Alive","Expect":"100-continue","Host":"eab...ddc.webhook.eus2.azure-automation.net","x-ms-request-id":"7b4...2eb"}}
So as far as I can tell, things should be in place to pick up on the result (success or failure). Hopefully someone who's done this before knows what I'm missing.
Thanks!

I had assumed that Azure would automatically notify the ADF "callBackUri" with the result once the runbook completed or errored out (since they take care of 99% of the scaffolding without requiring a line of code).
It turns out that is not the case, and anyone wishing to execute a runbook from ADF will have to manually extract the callBackUri from the Webhookdata input parameter, and POST the result to it when done.
I haven't nailed this down yet, since the Microsoft tutorial sites I've found have a bad habit of taking screenshots of the code that does this rather than providing the code itself.
I guess I'll come back and edit this once I have it figured out.
EDIT: I ended up implementing this by leaving my original Webhook untouched, and creating a "wrapper"/helper/utility Runbook that will execute an arbitrary webhook and relay its status to ADF once it's complete.
Here's the full code I ended up with, in case it helps someone else. It's meant to be generic:
Setup / Helper Functions
param
(
[Parameter (Mandatory = $false)]
[object] $WebhookData
)
Import-Module -Name AzureRM.resources
Import-Module -Name AzureRM.automation
# Helper function for getting the current running Automation Account Job
# Inspired heavily by: https://github.com/azureautomation/runbooks/blob/master/Utility/ARM/Find-WhoAmI
<#
Queries the automation accounts in the subscription to find the automation account, runbook and resource group that the job is running in.
AUTHOR: Azure/OMS Automation Team
#>
Function Find-WhoAmI {
[CmdletBinding()]
Param()
Begin { Write-Verbose ("Entering {0}." -f $MyInvocation.MyCommand) }
Process {
# Authenticate
$ServicePrincipalConnection = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount `
-ServicePrincipal `
-TenantId $ServicePrincipalConnection.TenantId `
-ApplicationId $ServicePrincipalConnection.ApplicationId `
-CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint | Write-Verbose
Select-AzureRmSubscription -SubscriptionId $ServicePrincipalConnection.SubscriptionID | Write-Verbose
# Search all accessible automation accounts for the current job
$AutomationResource = Get-AzureRmResource -ResourceType Microsoft.Automation/AutomationAccounts
$SelfId = $PSPrivateMetadata.JobId.Guid
foreach ($Automation in $AutomationResource) {
$Job = Get-AzureRmAutomationJob -ResourceGroupName $Automation.ResourceGroupName -AutomationAccountName $Automation.Name -Id $SelfId -ErrorAction SilentlyContinue
if (!([string]::IsNullOrEmpty($Job))) {
return $Job
}
}
# Only report an error if the job was not found in any automation account
Write-Error "Could not find the current running job with id $SelfId"
}
End { Write-Verbose ("Exiting {0}." -f $MyInvocation.MyCommand) }
}
Function Get-TimeStamp {
return "[{0:yyyy-MM-dd} {0:HH:mm:ss}]" -f (Get-Date)
}
My Code
### EXPECTED USAGE ###
# 1. Set up a webhook invocation in Azure data factory with a link to this Runbook's webhook
# 2. In ADF - ensure the body contains { "WrappedWebhook": "<your url here>" }
# This should be the URL for another webhook.
# LIMITATIONS:
# - Currently, relaying parameters and authentication credentials is not supported,
# so the wrapped webhook should require no additional authentication or parameters.
# - Currently, the callback to Azure data factory does not support authentication,
# so ensure ADF is configured to require no authentication for its callback URL (the default behaviour)
# If ADF executed this runbook via Webhook, it should have provided a WebhookData with a request body.
if (-Not $WebhookData) {
Write-Error "Runbook was not invoked with WebhookData. Args were: $args"
exit 0
}
if (-Not $WebhookData.RequestBody) {
Write-Error "WebhookData did not contain a ""RequestBody"" property. Data was: $WebhookData"
exit 0
}
$parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)
# And this data should contain a JSON body containing a 'callBackUri' property.
if (-Not $parameters.callBackUri) {
Write-Error 'WebhookData was missing the expected "callBackUri" property (which Azure Data Factory should provide automatically)'
exit 0
}
$callbackuri = $parameters.callBackUri
# Check for the "WRAPPEDWEBHOOK" parameter (which should be set up by the user in ADF)
$WrappedWebhook = $parameters.WRAPPEDWEBHOOK
if (-Not $WrappedWebhook) {
$ErrorMessage = 'WebhookData was missing the expected "WRAPPEDWEBHOOK" property (which the user should have added to the body via ADF)'
Write-Error $ErrorMessage
}
else
{
# Now invoke the actual runbook desired
Write-Output "$(Get-TimeStamp) Invoking Webhook Request at: $WrappedWebhook"
try {
$OutputMessage = Invoke-WebRequest -Uri $WrappedWebhook -UseBasicParsing -Method POST
} catch {
$ErrorMessage = ("An error occurred while executing the wrapped webhook $WrappedWebhook - " + $_.Exception.Message)
Write-Error -Exception $_.Exception
}
# Output should be something like: {"JobIds":["<JobId>"]}
Write-Output "$(Get-TimeStamp) Response: $OutputMessage"
$JobList = (ConvertFrom-Json -InputObject $OutputMessage).JobIds
$JobId = $JobList[0]
$OutputMessage = "JobId: $JobId"
# Get details about the currently running job, and assume the webhook job is being run in the same resourcegroup/account
$Self = Find-WhoAmI
Write-Output "Current Job '$($Self.JobId)' is running in Group '$($Self.ResourceGroupName)' and Automation Account '$($Self.AutomationAccountName)'"
Write-Output "Checking for Job '$($JobId)' in same Group and Automation Account..."
# Monitor the job status, wait for completion.
# Check against a list of statuses that likely indicate an in-progress job
$InProgressStatuses = ('New', 'Queued', 'Activating', 'Starting', 'Running', 'Stopping')
# (from https://learn.microsoft.com/en-us/powershell/module/az.automation/get-azautomationjob?view=azps-4.1.0&viewFallbackFrom=azps-3.7.0)
do {
# 1 second between polling attempts so we don't get throttled
Start-Sleep -Seconds 1
try {
$Job = Get-AzureRmAutomationJob -Id $JobId -ResourceGroupName $Self.ResourceGroupName -AutomationAccountName $Self.AutomationAccountName
} catch {
$ErrorMessage = ("An error occurred polling the job $JobId for completion - " + $_.Exception.Message)
Write-Error -Exception $_.Exception
}
Write-Output "$(Get-TimeStamp) Polled job $JobId - current status: $($Job.Status)"
} while ($InProgressStatuses.Contains($Job.Status))
# Get the job outputs to relay to Azure Data Factory
$Outputs = Get-AzureRmAutomationJobOutput -Id $JobId -Stream "Any" -ResourceGroupName $Self.ResourceGroupName -AutomationAccountName $Self.AutomationAccountName
Write-Output "$(Get-TimeStamp) Outputs from job: $($Outputs | ConvertTo-Json -Compress)"
$OutputMessage = $Outputs.Summary
Write-Output "Summary ouput message: $($OutputMessage)"
}
# Now for the entire purpose of this runbook - relay the response to the callback uri.
# Prepare the success or error response as per specifications at https://learn.microsoft.com/en-us/azure/data-factory/control-flow-webhook-activity#additional-notes
if ($ErrorMessage) {
$OutputJson = #"
{
"output": { "message": "$ErrorMessage" },
"statusCode": 500,
"error": {
"ErrorCode": "Error",
"Message": "$ErrorMessage"
}
}
"#
} else {
$OutputJson = #"
{
"output": { "message": "$OutputMessage" },
"statusCode": 200
}
"#
}
Write-Output "Prepared ADF callback body: $OutputJson"
# Post the response to the callback URL provided
$callbackResponse = Invoke-WebRequest -Uri $callbackuri -UseBasicParsing -Method POST -ContentType "application/json" -Body $OutputJson
Write-Output "Response was relayed to $callbackuri"
Write-Output ("ADF replied with the response: " + ($callbackResponse | ConvertTo-Json -Compress))
At a high level, the steps I've taken are to:
Execute the "main" Webhook - get back a "Job Id"
Get the current running job's "context" (resource group and automation account info) so that I can poll the remote job.
Poll the job until it is complete
Put together either a "success" or "error" response message in the format that Azure Data Factory expects.
Invoke the ADF callback.
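If all you need is the bare callback handshake (no wrapped webhook and no job polling), the pattern boils down to a few lines. This is a stripped-down sketch distilled from the full runbook above, not a drop-in replacement for it:
param
(
    [Parameter (Mandatory = $false)]
    [object] $WebhookData
)
# ADF passes the callback URL inside the JSON request body of the webhook call
$body = ConvertFrom-Json -InputObject $WebhookData.RequestBody
$callbackuri = $body.callBackUri
# ... do whatever work the runbook actually exists to do ...
# Tell ADF we are done; without this POST the WebHook activity just waits for its full timeout
$completion = @{
    output     = @{ message = "Done" }
    statusCode = 200
} | ConvertTo-Json -Depth 2
Invoke-WebRequest -Uri $callbackuri -UseBasicParsing -Method POST -ContentType "application/json" -Body $completion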

For those looking, I created a secondary approach to the above solution - one that executes a Runbook (with parameters) from a Webhook, rather than invoking a nested Webhook. This has a couple of benefits:
Parameters can be passed through to the Runbook (rather than requiring parameters to be baked into a new Webhook).
A Runbook from another Azure Automation Account / Resource Group can be invoked.
There's no need to poll the status of the job, since the Start-AzureRmAutomationRunbook cmdlet has a -Wait parameter.
Here's the code:
param
(
# Note: While "WebhookData" is the only root-level parameter (set by Azure Data Factory when it invokes this webhook)
# The user should ensure they provide (via the ADF request body) these additional properties required to invoke the runbook:
# - RunbookName
# - ResourceGroupName (TODO: Can fill this in by default if not provided)
# - AutomationAccountName (TODO: Can fill this in by default if not provided)
# - Parameters (A nested dict containing parameters to forward along)
[Parameter (Mandatory = $false)]
[object] $WebhookData
)
Import-Module -Name AzureRM.resources
Import-Module -Name AzureRM.automation
Function Get-TimeStamp {
return "[{0:yyyy-MM-dd} {0:HH:mm:ss}]" -f (Get-Date)
}
# If ADF executed this runbook via Webhook, it should have provided a WebhookData with a request body.
if (-Not $WebhookData) {
Write-Error "Runbook was not invoked with WebhookData. Args were: $args"
exit 0
}
if (-Not $WebhookData.RequestBody) {
Write-Error "WebhookData did not contain a ""RequestBody"" property. Data was: $WebhookData"
exit 0
}
$parameters = (ConvertFrom-Json -InputObject $WebhookData.RequestBody)
# And this data should contain a JSON body containing a 'callBackUri' property.
if (-Not $parameters.callBackUri) {
Write-Error 'WebhookData was missing the expected "callBackUri" property (which Azure Data Factory should provide automatically)'
exit 0
}
$callbackuri = $parameters.callBackUri
# Check for required parameters, and output any errors.
$ErrorMessage = ''
$RunbookName = $parameters.RunbookName
$ResourceGroupName = $parameters.ResourceGroupName
$AutomationAccountName = $parameters.AutomationAccountName
if (-Not $RunbookName) {
$ErrorMessage += 'WebhookData was missing the expected "RunbookName" property (which the user should have added to the body via ADF)`n'
} if (-Not $ResourceGroupName) {
$ErrorMessage += 'WebhookData was missing the expected "ResourceGroupName" property (which the user should have added to the body via ADF)`n'
} if (-Not $AutomationAccountName) {
$ErrorMessage += 'WebhookData was missing the expected "AutomationAccountName" property (which the user should have added to the body via ADF)`n'
} if ($ErrorMessage) {
Write-Error $ErrorMessage
} else {
# Set the current automation connection's authenticated account to use for future Azure Resource Manager cmdlet requests.
# TODO: Provide the user with a way to override this if the target runbook doesn't support the AzureRunAsConnection
$ServicePrincipalConnection = Get-AutomationConnection -Name "AzureRunAsConnection"
Add-AzureRmAccount -ServicePrincipal `
-TenantId $ServicePrincipalConnection.TenantId `
-ApplicationId $ServicePrincipalConnection.ApplicationId `
-CertificateThumbprint $ServicePrincipalConnection.CertificateThumbprint | Write-Verbose
Select-AzureRmSubscription -SubscriptionId $ServicePrincipalConnection.SubscriptionID | Write-Verbose
# Prepare the properties to pass on to the next runbook - all provided properties except the ones specific to the ADF passthrough invocation
$RunbookParams = @{ }
if($parameters.parameters) {
$parameters.parameters.PSObject.Properties | Foreach { $RunbookParams[$_.Name] = $_.Value }
Write-Output "The following parameters will be forwarded to the runbook: $($RunbookParams | ConvertTo-Json -Compress)"
}
# Now invoke the actual runbook desired, and wait for it to complete
Write-Output "$(Get-TimeStamp) Invoking Runbook '$($RunbookName)' from Group '$($ResourceGroupName)' and Automation Account '$($AutomationAccountName)'"
try {
# Runbooks have this nice flag that lets you wait on their completion (unlike webhook-invoked runbooks)
$Result = Start-AzureRmAutomationRunbook -Wait -Name $RunbookName -AutomationAccountName $AutomationAccountName -ResourceGroupName $ResourceGroupName -Parameters $RunbookParams
} catch {
$ErrorMessage = ("An error occurred while invoking Start-AzAutomationRunbook - " + $_.Exception.Message)
Write-Error -Exception $_.Exception
}
# Digest the result to be relayed to ADF
if($Result) {
Write-Output "$(Get-TimeStamp) Response: $($Result | ConvertTo-Json -Compress)"
$OutputMessage = $Result.ToString()
} elseif(-Not $ErrorMessage) {
$OutputMessage = "The runbook completed without errors, but the result was null."
}
}
# Now for the entire purpose of this runbook - relay the response to the callback uri.
# Prepare the success or error response as per specifications at https://learn.microsoft.com/en-us/azure/data-factory/control-flow-webhook-activity#additional-notes
if ($ErrorMessage) {
$OutputJson = @{
output = @{ message = $ErrorMessage }
statusCode = 500
error = @{
ErrorCode = "Error"
Message = $ErrorMessage
}
} | ConvertTo-Json -depth 2
} else {
$OutputJson = @{
output = @{ message = $OutputMessage }
statusCode = 200
} | ConvertTo-Json -depth 2
}
Write-Output "Prepared ADF callback body: $OutputJson"
# Post the response to the callback URL provided
$callbackResponse = Invoke-WebRequest -Uri $callbackuri -UseBasicParsing -Method POST -ContentType "application/json" -Body $OutputJson
Write-Output "Response was relayed to $callbackuri"
Write-Output ("ADF replied with the response: " + ($callbackResponse | ConvertTo-Json -Compress))

Related

Azure Function gives "InvokeMethodOnNull" error when calling Start-AzContainerGroup

I have a basic Azure Function running Powershell. I have followed instructions from the Microsoft Learn Tutorial to create an HTTP trigger to start a Container Instance. I have modified the tutorial after various attempts, so that the code now just starts an existing Container Instance on my Azure tenant, as follows:
using namespace System.Net
# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)
# Write to the Azure Functions log stream.
Write-Host "PowerShell HTTP trigger function processed a request."
# Interact with query parameters or the body of the request.
$command = $Request.Query.Command
if (-not $command) {
$command = $Request.Body.Command
}
$body = "This HTTP triggered function executed successfully. Either pass 'start' or 'stop' as the 'command' parameter for the appropriate action to be executed on the acme-dns container."
if ($command) {
$body = "Command received: $command"
if ($command = "start") {
Start-AzContainerGroup -ResourceGroupName test -Name dev
}
elseif ($command = "stop") {
Stop-AzContainerGroup -ResourceGroupName test -Name dev
}
if ($?) {
$body = "This HTTP triggered function executed successfully. Started container group $name"
}
else {
$body = "There was a problem starting the container group."
}
}
# Associate values to output bindings by calling 'Push-OutputBinding'.
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
StatusCode = [HttpStatusCode]::OK
Body = $body
})
requirements.psd1:
@{
# For latest supported version, go to 'https://www.powershellgallery.com/packages/Az'.
# To use the Az module in your function app, please uncomment the line below.
'Az' = '9.*'
}
Error text when I try to debug locally:
Azure Functions Core Tools
Core Tools Version: 4.0.4915 Commit hash: N/A (64-bit)
Function Runtime Version: 4.14.0.19631
Functions:
HttpExample: [GET,POST] http://localhost:7071/api/HttpExample
For detailed output, run func with --verbose flag.
[2022-12-13T14:27:14.456Z] Worker process started and initialized.
[2022-12-13T14:27:18.512Z] Host lock lease acquired by instance ID '0000000000000000000000000713F6BA'.
[2022-12-13T14:27:38.039Z] Executing 'Functions.HttpExample' (Reason='This function was programmatically called via the host APIs.', Id=5b3c9e21-290f-4ff4-ac49-21f362414926)
[2022-12-13T14:27:38.238Z] INFORMATION: PowerShell HTTP trigger function processed a request.
[2022-12-13T14:27:56.862Z] EXCEPTION: You cannot call a method on a null-valued expression.
[2022-12-13T14:27:56.863Z]
[2022-12-13T14:27:56.863Z] Exception :
[2022-12-13T14:27:56.864Z] Type : System.Management.Automation.ParentContainsErrorRecordException
[2022-12-13T14:27:56.865Z] Message : You cannot call a method on a null-valued expression.
[2022-12-13T14:27:56.866Z] HResult : -2146233087
[2022-12-13T14:27:56.867Z] CategoryInfo : InvalidOperation: (:) [], ParentContainsErrorRecordException
[2022-12-13T14:27:56.868Z] FullyQualifiedErrorId : InvokeMethodOnNull
[2022-12-13T14:27:56.869Z] InvocationInfo :
[2022-12-13T14:27:56.870Z] ScriptLineNumber : 1777
[2022-12-13T14:27:56.871Z] OffsetInLine : 13
[2022-12-13T14:27:56.872Z] HistoryId : -1
[2022-12-13T14:27:56.873Z] ScriptName : C:\Users\{USER}\Documents\PowerShell\Modules\Az.ContainerInstance\3.1.0\exports\ProxyCmdletDefinitions.ps1
[2022-12-13T14:27:56.874Z] Line : [Microsoft.WindowsAzure.Commands.Utilities.Common.AzurePSCmdlet]::PowerShellVersion = $Host.Runspace.Version.ToString()
[2022-12-13T14:27:56.875Z]
[2022-12-13T14:27:56.876Z] PositionMessage : At C:\Users\{USER}\Documents\PowerShell\Modules\Az.ContainerInstance\3.1.0\exports\ProxyCmdletDefinitions.ps1:1777 char:13
[2022-12-13T14:27:56.877Z] + [Microsoft.WindowsAzure.Commands.Utilities.Common.AzurePS .
[2022-12-13T14:27:56.878Z] + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
[2022-12-13T14:27:56.879Z] PSScriptRoot : C:\Users\{USER}\Documents\PowerShell\Modules\Az.ContainerInstance\3.1.0\exports
[2022-12-13T14:27:56.880Z] PSCommandPath : C:\Users\{USER}\Documents\PowerShell\Modules\Az.ContainerInstance\3.1.0\exports\ProxyCmdletDefinitions.ps1
[2022-12-13T14:27:56.881Z] CommandOrigin : Internal
[2022-12-13T14:27:56.882Z] ScriptStackTrace : at Start-AzContainerGroup<Begin>, C:\Users\{USER}\Documents\PowerShell\Modules\Az.ContainerInstance\3.1.0\exports\ProxyCmdletDefinitions.ps1: line 1777
[2022-12-13T14:27:56.883Z] at <ScriptBlock>, C:\Coding\AzFunctions\PsHttpTrigger\HttpExample\run.ps1: line 20
[2022-12-13T14:27:56.884Z]
[2022-12-13T14:27:56.884Z]
[2022-12-13T14:27:56.913Z] Executed 'Functions.HttpExample' (Failed, Id=5b3c9e21-290f-4ff4-ac49-21f362414926, Duration=18894ms)
[2022-12-13T14:27:56.914Z] System.Private.CoreLib: Exception while executing function: Functions.HttpExample. System.Private.CoreLib: Result: Failure
Exception: You cannot call a method on a null-valued expression.
Stack: at System.Management.Automation.Runspaces.PipelineBase.Invoke(IEnumerable input)
[2022-12-13T14:27:56.914Z] at System.Management.Automation.Runspaces.Pipeline.Invoke()
[2022-12-13T14:27:56.915Z] at System.Management.Automation.PowerShell.Worker.ConstructPipelineAndDoWork(Runspace rs, Boolean performSyncInvoke)
[2022-12-13T14:27:56.916Z] at System.Management.Automation.PowerShell.Worker.CreateRunspaceIfNeededAndDoWork(Runspace rsToUse, Boolean isSync)
[2022-12-13T14:27:56.917Z] at System.Management.Automation.PowerShell.CoreInvokeHelper[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
[2022-12-13T14:27:56.917Z] at System.Management.Automation.PowerShell.CoreInvoke[TInput,TOutput](PSDataCollection`1 input, PSDataCollection`1 output, PSInvocationSettings settings)
[2022-12-13T14:27:56.918Z] at System.Management.Automation.PowerShell.Invoke[T](IEnumerable input, IList`1 output, PSInvocationSettings settings)
[2022-12-13T14:27:56.919Z] at System.Management.Automation.PowerShell.Invoke[T]()
[2022-12-13T14:27:56.919Z] at Microsoft.Azure.Functions.PowerShellWorker.PowerShell.PowerShellExtensions.InvokeAndClearCommands[T](PowerShell pwsh) in /mnt/vss/_work/1/s/src/PowerShell/PowerShellExtensions.cs:line 45
[2022-12-13T14:27:56.920Z] at Microsoft.Azure.Functions.PowerShellWorker.PowerShell.PowerShellManager.InvokeNonOrchestrationFunction(DurableController durableController, IDictionary outputBindings) in /mnt/vss/_work/1/s/src/PowerShell/PowerShellManager.cs:line 301
[2022-12-13T14:27:56.921Z] at Microsoft.Azure.Functions.PowerShellWorker.PowerShell.PowerShellManager.InvokeFunction(AzFunctionInfo functionInfo, Hashtable triggerMetadata, TraceContext traceContext, RetryContext retryContext, IList`1 inputData, FunctionInvocationPerformanceStopwatch stopwatch) in /mnt/vss/_work/1/s/src/PowerShell/PowerShellManager.cs:line 230
[2022-12-13T14:27:56.922Z] at Microsoft.Azure.Functions.PowerShellWorker.RequestProcessor.InvokeFunction(AzFunctionInfo functionInfo, PowerShellManager psManager, FunctionInvocationPerformanceStopwatch stopwatch, InvocationRequest invocationRequest) in /mnt/vss/_work/1/s/src/RequestProcessor.cs:line 336
[2022-12-13T14:27:56.923Z] at Microsoft.Azure.Functions.PowerShellWorker.RequestProcessor.ProcessInvocationRequestImpl(StreamingMessage request, AzFunctionInfo functionInfo, PowerShellManager psManager, FunctionInvocationPerformanceStopwatch stopwatch) in /mnt/vss/_work/1/s/src/RequestProcessor.cs:line 308.
It seems to me that the $Host variable within the AzurePS module is null, but I have no idea how to go about fixing this.
When I run the function from Azure cloud, it times out (Error: 500 - The request timed out. The web server failed to respond within the specified time.)
Running Start-AzContainerGroup from Powershell on my local PC works just fine.
I have modified your code for the requirement of starting and stopping Azure Container groups using a PowerShell Azure Function, and it's working as expected:
using namespace System.Net
# Input bindings are passed in via param block.
param($Request, $TriggerMetadata)
# Write to the Azure Functions log stream.
Write-Host "PowerShell HTTP trigger function processed a request."
Connect-AzAccount -Tenant 'TenantId' -SubscriptionId 'SubscriptionId'
Set-AzContext -Subscription 'SubscriptionId'
$command = $Request.Query.Command
if (-not $command) {
$command = $Request.Body.Command
}
$body = "This HTTP triggered function executed successfully. Either pass 'start' or 'stop' as the 'command' parameter for the appropriate action to be executed on the acme-dns container."
if ($command) {
Write-Host "Command received: $command"
if ($command = "start") {
Start-AzContainerGroup -ResourceGroupName HariTestRG -Name test-cg
$body = "This HTTP triggered function executed successfully. Started container group."
}
elseif ($command = "stop") {
Stop-AzContainerGroup -ResourceGroupName HariTestRG -Name test-cg
$body = "This HTTP triggered function executed successfully. Stopped container group."
}
}
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
StatusCode = [HttpStatusCode]::OK
Body = $body
})
Function API Response:
Result of Input Parameter to the Function API as Command = Start:
Before running this function, I have created the Container Group in Azure using these PowerShell Cmdlets:
$port1 = New-AzContainerInstancePortObject -Port 8000 -Protocol TCP
$port2 = New-AzContainerInstancePortObject -Port 8001 -Protocol TCP
$container = New-AzContainerInstanceObject -Name test-container -Image nginx -RequestCpu 1 -RequestMemoryInGb 1.5 -Port @($port1, $port2)
$containerGroup = New-AzContainerGroup -ResourceGroupName HariTestRG -Name test-cg -Location eastus -Container $container -OsType Linux -RestartPolicy "Never" -IpAddressType Public
Note: A few parts of the above code were taken from MS Doc References 1 & 2.

Azure Automation Purge CDN Endpoint

I am trying to put in place an Azure Automation Runbook with the intent to purge all the cache when a change is made to a blob storage. So far, if I upload 1 file from the Azure portal, that works just fine. But if I try to upload multiple files, some of them fail with the following error.
We can only accept 100 paths for purging concurrently. Please try again in a few minutes.
Here is the code I am using in the automation Runbook process:
param (
[Parameter (Mandatory = $false)]
[object] $WebHookData
)
## Authentication ##
# Runbook must authenticate to purge content
# Connect to Azure with RunAs account
$conn = Get-AutomationConnection -Name "AzureRunAsConnection"
# Connect to Azure Automation
$null = Add-AzAccount `
-ServicePrincipal `
-TenantId $conn.TenantId `
-ApplicationId $conn.ApplicationId `
-CertificateThumbprint $conn.CertificateThumbprint
## declarations ##
# Update parameters below
# CDN Profile name
$profileName = "<CDNProfileName>"
# CND Resource Group
$resourceGroup = "<Resource-Group>"
# CDN Endpoint Name
$endPointName = "<EndPointName>"
# Set Error Action Default
$errorDefault = $ErrorActionPreference
## Execution ##
# Convert Webhook Body to json
try {
$requestBody = $WebHookData.requestBody | ConvertFrom-json -ErrorAction 'stop'
}
catch {
$ErrorMessage = $_.Exception.message
write-error ('Error converting Webhook body to json ' + $ErrorMessage)
Break
}
# Convert requestbody to file path
try {
$ErrorActionPreference = 'stop'
$filePath = $requestBody.data.url -replace "https://<storageaccountname>.blob.core.windows.net",""
}
catch {
$ErrorMessage = $_.Exception.message
write-error ('Error converting file path ' + $ErrorMessage)
Break
}
finally {
$ErrorActionPreference = $errorDefault
}
# Run the purge command against the file
try {
Unpublish-AzCdnEndpointContent -ErrorAction 'Stop' -ProfileName $profileName -ResourceGroupName $resourceGroup `
-EndpointName $endPointName -PurgeContent '/*'
}
catch {
$ErrorMessage = $_.Exception.message
write-error ('Error purging content from CDN ' + $ErrorMessage)
Break
}
Can anyone help with this, or clarify what could be the reason the purge is failing with that error ("BadRequest")?
Thank you so much for your help
From the bottom of the article about the CDN purge endpoint:
Purge requests take approximately 10 minutes to process with Azure CDN
from Microsoft, approximately 2 minutes with Azure CDN from Verizon
(standard and premium), and approximately 10 seconds with Azure CDN
from Akamai. Azure CDN has a limit of 100 concurrent purge requests
at any given time at the profile level.
There is a limit of 100 concurrent purge requests at any given time at the profile level.
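If you want to purge only the changed paths rather than '/*', one way to stay under that limit is to collect the changed paths first and purge them in batches of at most 100 per request, rather than firing one purge per uploaded blob. A rough sketch (assuming a hypothetical $filePaths array of blob paths gathered beforehand, plus the $profileName/$resourceGroup/$endPointName variables from the runbook above):
$batchSize = 100
for ($i = 0; $i -lt $filePaths.Count; $i += $batchSize) {
    # Take the next slice of up to 100 paths
    $batch = $filePaths[$i..([Math]::Min($i + $batchSize, $filePaths.Count) - 1)]
    Unpublish-AzCdnEndpointContent -ErrorAction 'Stop' -ProfileName $profileName -ResourceGroupName $resourceGroup `
        -EndpointName $endPointName -PurgeContent $batch
    # Breathe between batches so purge requests don't pile up at the profile level
    Start-Sleep -Seconds 60
}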

How to delete the ADFPipeline which is having the references Forcefully

I'm currently doing some automation for my ADF. As a part of that, I'm trying to delete all the ADF V2 pipelines. The problem is that my pipelines have many references to other pipelines.
$ADFPipeline = Get-AzDataFactoryV2Pipeline -DataFactoryName $(datafactory-name) -ResourceGroupName $(rg)
$ADFPipeline | ForEach-Object { Remove-AzDataFactoryV2Pipeline -ResourceGroupName $(rg) -DataFactoryName $(datafactory-name) -Name $_.name -Force }
And most of the time I get the error like
The document cannot be deleted since it is referenced by "blabla"
I understand the error: it's saying there are references and the pipeline cannot be deleted. However, when I tried the same deletion in the Azure portal, I was able to delete it regardless of the references. So I want to find out whether it's possible to tell PowerShell to delete a pipeline forcefully even though it has references.
Any other inputs much appreciated!
I ran into the same issue, and found out that it's rather complicated to build the whole dependency graph out of the pipelines' Activities property.
As a working solution (PowerShell):
function Remove-Pipelines {
param (
[Parameter(Mandatory=$true)]
[AllowEmptyCollection()]
[AllowNull()]
[System.Collections.ArrayList]$pipelines
)
if($pipelines.Count -gt 0) {
[System.Collections.ArrayList]$plsToProcess = New-Object System.Collections.ArrayList($null)
foreach ($pipeline in $pipelines) {
try {
$removeAzDFCommand = "Remove-AzDataFactoryV2Pipeline -dataFactoryName '$DataFactoryName' -resourceGroupName '$ResourceGroupName' -Name '$($pipeline.Name)' -Force -ErrorAction Stop"
Write-Host $removeAzDFCommand
Invoke-Expression $removeAzDFCommand
}
catch {
if ($_ -match '.*The document cannot be deleted since it is referenced by.*') {
Write-Host $_
$plsToProcess.Add($pipeline)
} else {
throw $_
}
}
}
Remove-Pipelines $plsToProcess
}
}
Here is the complete solution for clearing the whole DF: "trigger","pipeline","dataflow","dataset","linkedService"
Param(
[Parameter(Mandatory=$true)][string] $ResourceGroupName,
[Parameter(Mandatory=$true)][string] $DataFactoryName
)
$artfTypes = "trigger","pipeline","dataflow","dataset","linkedService"
function Remove-Artifacts {
param (
[Parameter(Mandatory=$true)][AllowEmptyCollection()][AllowNull()][System.Collections.ArrayList]$artifacts,
[Parameter(Mandatory=$true)][string]$artfType
)
if($artifacts.Count -gt 0) {
[System.Collections.ArrayList]$artToProcess = New-Object System.Collections.ArrayList($null)
foreach ($artifact in $artifacts) {
try {
$removeAzDFCommand = "Remove-AzDataFactoryV2$($artfType) -dataFactoryName '$DataFactoryName' -resourceGroupName '$ResourceGroupName' -Name '$($artifact.Name)' -Force -ErrorAction Stop"
Write-Host $removeAzDFCommand
Invoke-Expression $removeAzDFCommand
}
catch {
if ($_ -match '.*The document cannot be deleted since it is referenced by.*') {
Write-Host $_
$artToProcess.Add($artifact)
} else {
throw $_
}
}
}
Remove-Artifacts $artToProcess $artfType
}
}
foreach ($artfType in $artfTypes) {
$getAzDFCommand = "Get-AzDataFactoryV2$($artfType) -dataFactoryName '$DataFactoryName' -resourceGroupName '$ResourceGroupName'"
Write-Output $getAzDFCommand
$artifacts = Invoke-Expression $getAzDFCommand
Write-Output $artifacts.Name
Remove-Artifacts $artifacts $artfType
}
The same approach can be adapted for the "Set-AzDataFactoryV2Pipeline" command as well.
It's worth mentioning that, along with dependency tracking, the Remove/Set sequence of artifact types needs to be right (because of cross-artifact dependencies).
For Set - "linkedService","dataset","dataflow","pipeline","trigger"
For Remove - "trigger","pipeline","dataflow","dataset","linkedService"
Hello and thank you for the question. According to the Remove-AzDataFactoryV2Pipeline doc, the -Force flag simply skips the confirmation prompt. It does not actually 'Force' the deletion in spite of errors.
Since you are already doing automation, might I suggest leveraging the error message to recursively attempt to delete the referencing pipeline. $error[0] gets the most recent error.
(Pseudocode)
try_recurse_delete( pipeline_name )
do_delete(pipeline_name)
if not $error[0].contains("referenced by " + pipeline_name)
then return true
else
try_recurse_delete( get_referencer_name($error[0]) )
Given that pipeline dependencies can be a many-to-many relationship, subsequent pipelines in your for-each loop might already be deleted by the recursion. You will have to adapt your code to react to 'pipeline not found' type errors.
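For what it's worth, a rough PowerShell sketch of that recursion could look like the following (the 'referenced by' regex and the $rg/$df variables are assumptions you would need to adapt to your factory):
function Remove-PipelineRecursive {
    param ([string]$PipelineName)
    try {
        Remove-AzDataFactoryV2Pipeline -ResourceGroupName $rg -DataFactoryName $df -Name $PipelineName -Force -ErrorAction Stop
    }
    catch {
        if ($_.Exception.Message -match 'referenced by (\S+)') {
            # Delete the referencing document first, then retry the original pipeline
            Remove-PipelineRecursive -PipelineName $Matches[1]
            Remove-PipelineRecursive -PipelineName $PipelineName
        }
        else {
            throw
        }
    }
}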

Is there option to auto terminate Azure SQL DW

I am using Azure SQL DW, which has a high hourly cost. So I want to know: is there an option to auto-terminate SQL DW after an hour or so?
You can pause the Azure Data warehouse and then you only pay for the storage used.
You can automate pausing your DWH by using an Azure automation account and a runbook.
This blog explains the process:
https://blogs.msdn.microsoft.com/allanmiller/2017/09/20/pausing-azure-sql-data-warehouse-using-an-automation-runbook/
Markus
Yes, it is possible.
To save costs, you can pause and resume compute resources on-demand. For example, if you won't be using the database during the night and on weekends, you can pause it during those times, and resume it during the day. You won't be charged for DWUs while the database is paused.
When you pause a database:
Compute and memory resources are returned to the pool of available resources in the data center
DWU costs are zero for the duration of the pause.
Data storage is not affected and your data stays intact.
SQL Data Warehouse cancels all running or queued operations.
To pause a database, use the Suspend-AzureRmSqlDatabase cmdlet.
Suspend-AzureRmSqlDatabase -ResourceGroupName "ResourceGroup1" `
-ServerName "Server01" -DatabaseName "Database02"
To start a database, use the Resume-AzureRmSqlDatabase cmdlet.
Resume-AzureRmSqlDatabase -ResourceGroupName "ResourceGroup1" `
-ServerName "Server01" -DatabaseName "Database02"
For more information, please refer to this official document.
As Markus Bohse said, you could also use Automation to do this.
Note: You could also write a runbook to start your database.
Update: If you want to use Java to do this, please refer to this API document.
public void pauseDataWarehouse()
Update:
You could also use the REST API to do this. See this link:
POST https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Sql/servers/{server-name}/databases/{database-name}/pause?api-version=2014-04-01-preview HTTP/1.1
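For example, from PowerShell that call could be issued roughly like this (a sketch only: acquiring the bearer token is not shown, $token is assumed to hold a valid ARM token, and the {placeholders} must be filled in):
$uri = "https://management.azure.com/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.Sql/servers/{server-name}/databases/{database-name}/pause?api-version=2014-04-01-preview"
Invoke-RestMethod -Method Post -Uri $uri -Headers @{ Authorization = "Bearer $token" }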
You can use runbooks or a PowerShell script to pause and resume SQL DWH.
The PowerShell script is below. If you would like a runbook version, let me know.
[CmdletBinding(DefaultParametersetName='None')]
Param
(
[Parameter(Mandatory=$true)][ValidateNotNullOrEmpty()]
[String]
$AzureSubscriptionId,
[Parameter(Mandatory=$true)][ValidateNotNullOrEmpty()]
[String]
$AzureDataWareHouseList="All",
[Parameter(Mandatory=$true)][ValidateSet("Suspend","Resume")]
[String]
$Action
)
function PauseAzureDWH
{
Param
(
[Parameter(Mandatory=$true)][ValidateNotNullOrEmpty()]
[String]
$AzureSubscriptionId,
[Parameter(Mandatory=$true)][ValidateNotNullOrEmpty()]
[String]
$AzureDataWareHouseList="All",
[Parameter(Mandatory=$true)][ValidateSet("Suspend","Resume")]
[String]
$Action
)
try
{
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId $AzureSubscriptionId
if($AzureDataWareHouseList -ne "All")
{
$AzureDWHList = @()
$AzureDWHTotalList = $AzureDataWareHouseList.Split(",")
foreach($DWHitem in $AzureDWHTotalList)
{
$DWH = "*$DWHitem*"
$DWH = Get-AzureRmResource | Where-Object ResourceName -like $DWH
if($DWH -ne $Null)
{
$dwc = $DWH.ResourceName.split("/")
# splat reused parameter lists
$ThisDW = @{
'ResourceGroupName' = $DWH.ResourceGroupName
'ServerName' = $dwc[0]
'DatabaseName' = $dwc[1]
}
$AzureDWHList += $ThisDW
}
else
{
Write-Warning "Given DataWarehouse '$DWHitem' is not found in given subscription"
}
}
}
else
{
[array]$TotalDataWareHouseList = Get-AzureRmResource | Where-Object ResourceType -EQ "Microsoft.Sql/servers/databases" | Where-Object Kind -Like "*datawarehouse*"
$AzureDWHList = @()
foreach($DWH in $TotalDataWareHouseList)
{
$dwc = $DWH.ResourceName.split("/")
$ThisDW = @{
'ResourceGroupName' = $DWH.ResourceGroupName
'ServerName' = $dwc[0]
'DatabaseName' = $dwc[1]
}
$AzureDWHList += $ThisDW
}
}
<# foreach($AzureDWHItem in $AzureDWHList)
{
if(!(Get-AzureRmResource | ? {$_.Name -eq $AzureDWHItem.ServerName}) )
{
throw " AzureDWH : [$AzureDWHItem] - Does not exist! - please Check your inputs "
}
} #>
if($Action -eq "Suspend")
{
Write-Output "Suspending Azure DataWareHouses";
foreach ($AzureDWH in $AzureDWHList)
{
$status = Get-AzureRmSqlDatabase @AzureDWH | Select Status
if($status.Status -eq "Online")
{
Suspend-AzureRmSqlDatabase @AzureDWH
}
}
}
else
{
Write-Output "Resuming Azure DataWareHouses";
foreach ($AzureDWH in $AzureDWHList)
{
$status = Get-AzureRmSqlDatabase @AzureDWH | Select Status
if($status.Status -eq "Paused")
{
Resume-AzureRmSqlDatabase @AzureDWH
}
}
}
}
catch
{
Write-Error " Exception while getting resource details and writing back to CSV"
Write-Error $_.Exception.message
Write-Error " ErrorStack: $Error[0] "
exit 1
}
}
PauseAzureDWH -AzureSubscriptionId $AzureSubscriptionId -AzureDataWareHouseList $AzureDataWareHouseList -Action $Action

Problems Running Azure Automation Powershell to Scale Database Back After Restore Operation

I am trying to scale back a database after the restore operation has completed and am running into some problems. I am getting this exception and wonder if there is something in this script not supported by Azure Automation Workflows?
Parameter set cannot be resolved using the specified named parameters.
workflow insertflowname
{
<#
.SYNOPSIS
The purpose of this runbook is to demonstrate how to restore a particular database to a new database using an Azure Automation workflow. Then it is scaled back to Basic.
.NOTES
#>
# Specify Azure Subscription Name
$subName = 'insertsubscription name'
# Connect to Azure Subscription
Connect-Azure -AzureConnectionName $subName
Select-AzureSubscription -SubscriptionName $subName
# Define source databasename
$SourceDatabaseName = 'insert database name'
# Define source server
$SourceServerName = 'insert source server'
# Define destination server
$TargetServerName = 'insert destination server'
Write-Output "`$SourceServerName [$SourceServerName]"
Write-Output "`$TargetServerName [$TargetServerName]"
Write-Output "`$SourceDatabaseName [$SourceDatabaseName]"
Write-Output "Retrieving recoverable database details for database [$SourceDatabaseName] on server [$SourceServerName]."
$RecoverableDatabase = Get-AzureSqlRecoverableDatabase -ServerName $SourceServerName -DatabaseName $SourceDatabaseName
$TargetDatabaseName = "$SourceDatabaseName-$($RecoverableDatabase.LastAvailableBackupDate.ToString('O'))"
Write-Output "`$TargetDatabaseName [$TargetDatabaseName]"
Write-Output "Starting recovery of database [$SourceDatabaseName] to server [$TargetServerName] as database [$TargetDatabaseName]."
Start-AzureSqlDatabaseRecovery -SourceDatabase $RecoverableDatabase -TargetServerName $TargetServerName -TargetDatabaseName $TargetDatabaseName
$PollingInterval = 10
Write-Output "Monitoring status of recovery operation, polling every [$PollingInterval] second(s)."
$KeepGoing = $true
while ($KeepGoing) {
$operation = Get-AzureSqlDatabaseOperation -ServerName $TargetServerName -DatabaseName $TargetDatabaseName | Where-Object {$_.Name -eq "DATABASE RECOVERY"} | Sort-Object StartTime -Descending
if ($operation) {
$operation[0]
if ($operation[0].State -eq "COMPLETED") { $KeepGoing = $false }
if ($operation[0].State -eq "FAILED") {
# Throw error
$KeepGoing = $false
}
} else {
# Throw error since something went wrong and object was not created
# May want to have this retry a few times before giving up or at least notify somebody
# since at this point the recovery has been kicked off and you don't want the database
# restore to remain at the elevated service level.
$KeepGoing = $false
}
if ($KeepGoing) { Start-Sleep -Seconds $PollingInterval }
}
if ($operation[0].State -eq "COMPLETED") {
Write-Output "Setting service level for database [$TargetDatabaseName] on server [$TargetServerName] to Basic."
$ServiceObjective = Get-AzureSQLDatabaseServiceObjective -ServerName $TargetServerName -ServiceObjectiveName "Basic"
$ServiceObjective
Set-AzureSqlDatabase -ServerName $TargetServerName -DatabaseName $TargetDatabaseName -Edition "Basic" -ServiceObjective $ServiceObjective -MaxSizeGB 2 -Force
}
}
You are probably hitting the issue described here: https://social.msdn.microsoft.com/Forums/en-US/ce6412b8-5cce-4573-befb-6017924ce0d0/whereobject-fails-with-parameter-set-cannot-be-resolved-using-the-specified-named-parameters?forum=azureautomation
Summary:
Use parameter names, don't rely on positional parameters, in PowerShell Workflow. In this case, you need to add the -FilterScript parameter name to Where-Object.
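For example, the Where-Object call in the workflow above would become:
$operation = Get-AzureSqlDatabaseOperation -ServerName $TargetServerName -DatabaseName $TargetDatabaseName |
    Where-Object -FilterScript { $_.Name -eq "DATABASE RECOVERY" } |
    Sort-Object -Property StartTime -Descending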
