Failed to get Credential in Azure Automation Runbook - azure

I am trying to do a basic database operation using an Azure runbook (code below). It fails with an error when I test it in Azure.
By executing the code line by line, I found that the error is caused by Get-AutomationPSCredential. However, the entire script works fine when I run it from PowerShell ISE on my local system.
workflow sqlrunbook
{
    $SqlServer = "devserver001.database.windows.net"
    $Database = "devdb001"

    $SqlCredential = Get-AutomationPSCredential -Name 'SqlCredentialAsset'

    # Get the username and password from the SQL credential
    $SqlUsername = $SqlCredential.GetNetworkCredential().UserName
    $SqlPassword = $SqlCredential.GetNetworkCredential().Password

    InlineScript
    {
        $Conn = New-Object System.Data.SqlClient.SqlConnection("Server=tcp:$using:SqlServer;Database=$using:Database;User ID=$using:SqlUsername;Password=$using:SqlPassword;")
        $Cmd = New-Object System.Data.SqlClient.SqlCommand("insert into TestTable(ID) values(1)", $Conn)
        $Cmd.CommandTimeout = 120

        $Conn.Open()
        $Cmd.ExecuteNonQuery()
        $Conn.Close()
    }
}
sqlrunbook
Can you please let me know if I missed anything while creating the credential, or whether I should enable any configuration to test this runbook in Azure? Thanks in advance.

Try this to confirm the credential asset exists in your Automation account:
Get-AzureAutomationCredential -AutomationAccountName "Contoso17" -Name "MyCredential"
You can also check this link:
https://learn.microsoft.com/en-us/azure/automation/automation-credentials
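If the asset does exist, the most common cause of this failure is a mismatch between the asset name and the name passed to Get-AutomationPSCredential inside the runbook. A minimal defensive sketch, assuming the asset is named 'SqlCredentialAsset' and that the cmdlet returns $null when the asset cannot be found (behaviour can vary by module version):
workflow sqlrunbook
{
    # 'SqlCredentialAsset' must match the credential asset name in the Automation account exactly
    $SqlCredential = Get-AutomationPSCredential -Name 'SqlCredentialAsset'

    if ($null -eq $SqlCredential)
    {
        # Fail fast with a readable message instead of a later null-reference error
        throw "Credential asset 'SqlCredentialAsset' was not found in this Automation account."
    }

    Write-Output "Retrieved credential for user $($SqlCredential.UserName)"
}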

Related

In Azure, how can you configure an alert or notification when a SQL Server failover happened?

In Azure, how can you configure an alert or notification when a SQL Server failover has happened, if you set up a SQL server with failover groups and the failover policy is set to automatic? If it can't be set up in Monitor, can it be scripted elsewhere?
Found a way to script this in Azure using Automation Accounts > Runbook > PowerShell. A simple script like this should do it; you just need to set up the Run As account and trigger it on a schedule or from an alert.
function sendEmailAlert
{
    # Send email
}

function checkFailover
{
    $list = Get-AzSqlDatabaseFailoverGroup -ResourceGroupName "my-resourceGroup" -ServerName "my-sql-server"
    if ($list.ReplicationRole -ne 'Primary')
    {
        sendEmailAlert
    }
}

checkFailover
Azure SQL Database only supports a limited set of alert metrics, and a failover event is not one of them, so you cannot create a built-in alert for when a SQL Server failover happens. You can confirm this in the document: Create alerts for Azure SQL Database and Data Warehouse using Azure portal.
Hope this helps.
Thanks CKelly - that gave me a good kick start for something that should be standard in Azure. I created an Azure Automation account, added the Az.Accounts, Az.Automation and Az.Sql modules, then added a bit more to your code. In Azure I also created a SendGrid account.
# Use the Azure Automation account's Run As connection to log in to Azure
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Connect-AzAccount -ServicePrincipal -Tenant $Conn.TenantID -ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint

# Create email alert
function sendEmailAlert
{
    # Send email
    $From = "<email from>"
    $To = "<email of stakeholders to receive this message>"
    $SMTPServer = "smtp.sendgrid.net"
    $SMTPPort = "587"
    $Username = "<sendgrid username>"
    $Password = "<sendgrid password>"
    $subject = "<email subject>"
    $body = "<text to go in email body>"

    $smtp = New-Object System.Net.Mail.SmtpClient($SMTPServer, $SMTPPort)
    $smtp.EnableSSL = $true
    $smtp.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
    $smtp.Send($From, $To, $subject, $body)
}

# Check for failover and send the alert if the primary server has changed
function checkFailover
{
    $list = Get-AzSqlDatabaseFailoverGroup -ResourceGroupName "<the resource group>" -ServerName "<SQL Database server>"
    if ($list.ReplicationRole -ne 'Primary')
    {
        sendEmailAlert
    }
}

checkFailover
This process may help others.
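To trigger the runbook on a schedule, as mentioned above, something along these lines should work. This is a sketch run from a workstation with the Az.Automation module; the resource group, Automation account and runbook names are placeholders:
# Create an hourly schedule in the Automation account
$schedule = New-AzAutomationSchedule -ResourceGroupName "<the resource group>" -AutomationAccountName "<automation account>" -Name "HourlyFailoverCheck" -StartTime (Get-Date).AddMinutes(10) -HourInterval 1

# Link the schedule to the runbook containing checkFailover
Register-AzAutomationScheduledRunbook -ResourceGroupName "<the resource group>" -AutomationAccountName "<automation account>" -RunbookName "<runbook name>" -ScheduleName "HourlyFailoverCheck"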

How to give a 200 response from elasticsearch without authentication to an app gateway?

I have an issue setting up an ES cluster on Azure. I'd like my cluster to be behind an application gateway and also use Shield authentication.
The problem is that the Azure application gateway needs to send a health ping to the cluster and get back a 200 response, otherwise it returns a 502 "bad gateway". If I create an anonymous user then I can get the cluster to return a 200, but I'd rather not enable an anonymous user and use basic authentication instead.
Is there some endpoint on the cluster that will return a 200 even if the user is not authenticated and anonymous users are turned off?
Thanks!
There is no such endpoint in Elasticsearch. There is status.allowAnonymous in Kibana for the stats API endpoint, but nothing similar in Elasticsearch.
You'd have to define your own user that has access to a specific health-check URL and use that, or enable anonymous access.
The health-check story can have variations: you can check the health of a specific node (/_cluster/health?local=true) or the health of the cluster as a whole. You can also get a 200 by sending a _search request (with preference=_local) to a specific node even if the cluster doesn't have an elected master node, for example, because by default a _search operation is permitted on a node even in that situation.
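For example, here is a minimal sketch of such a check run with basic authentication against a dedicated health-check user (the user, password and endpoint are placeholders; note that the Application Gateway probe itself cannot supply credentials, so this only helps for checks you run yourself):
# Build a basic-auth header for a hypothetical dedicated health-check user
$user = "healthcheck"
$pass = "<password>"
$pair = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$($pass)"))
$headers = @{ Authorization = "Basic $pair" }

# Query the local node's view of cluster health; an HTTP 200 means the node answered
Invoke-RestMethod -Uri "http://localhost:9200/_cluster/health?local=true" -Headers $headers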
In addition to @Andrei's answer, I'd recommend taking a look at Elastic's Azure ARM template, which can deploy a cluster with Application Gateway configured for load balancing and SSL offload to the cluster.
This also works with X-Pack Security by setting up anonymous access for the Application Gateway ping health check, assigned a role with access only to:
cluster:
  - cluster:monitor/main
It would be great if the ping check supported supplying credentials in the future, in which case anonymous access would not be required, locking things down further.
To deploy a cluster with Application Gateway using Azure PowerShell:
function PromptCustom($title, $optionValues, $optionDescriptions)
{
    Write-Host $title
    Write-Host
    $a = @()
    for($i = 0; $i -lt $optionValues.Length; $i++){
        Write-Host "$($i+1))" $optionDescriptions[$i]
    }
    Write-Host
    while($true)
    {
        Write-Host "Choose an option: "
        $option = Read-Host
        $option = $option -as [int]
        if($option -ge 1 -and $option -le $optionValues.Length)
        {
            return $optionValues[$option-1]
        }
    }
}
function Prompt-Subscription() {
    # Choose subscription. If there's only one we will choose automatically
    $subs = Get-AzureRmSubscription
    $subscriptionId = ""

    if($subs.Length -eq 0) {
        Write-Error "No subscriptions bound to this account."
        return
    }
    if($subs.Length -eq 1) {
        $subscriptionId = $subs[0].SubscriptionId
    }
    else {
        $subscriptionChoices = @()
        $subscriptionValues = @()

        foreach($subscription in $subs) {
            $subscriptionChoices += "$($subscription.SubscriptionName) ($($subscription.SubscriptionId))";
            $subscriptionValues += ($subscription.SubscriptionId);
        }

        $subscriptionId = PromptCustom "Choose a subscription" $subscriptionValues $subscriptionChoices
    }

    return $subscriptionId
}
$subscriptionId = "{YOUR SUBSCRIPTION ID}"

try {
    Select-AzureRmSubscription -SubscriptionId $subscriptionId -ErrorAction Stop
}
catch {
    Write-Host "Please Login"
    Login-AzureRmAccount
    $subscriptionId = Prompt-Subscription
    Select-AzureRmSubscription -SubscriptionId $subscriptionId
}
# Specify the template version to use. This can be a branch name, commit hash, tag, etc.
# NOTE: different template versions may require different parameters to be passed, so be sure to check
# the parameters/password.parameters.json file in the respective tag branch
$templateVersion = "master"
$templateSrc = "https://raw.githubusercontent.com/elastic/azure-marketplace/$templateVersion/src"
$elasticTemplate = "$templateSrc/mainTemplate.json"
$location = "Australia Southeast"
$resourceGroup = "app-gateway-cluster"
$name = $resourceGroup
$cert = [System.Convert]::ToBase64String([System.IO.File]::ReadAllBytes("{PATH TO cert.pfx}"))
$clusterParameters = @{
    "artifactsBaseUrl" = $templateSrc
    "esVersion" = "5.1.2"
    "esClusterName" = $name
    # Deploy to same location as the Resource Group location
    "location" = "ResourceGroup"
    # Install X-Pack plugins (installs a trial license)
    "xpackPlugins" = "Yes"
    # Use Application Gateway
    "loadBalancerType" = "gateway"
    "vmDataDiskCount" = 2
    "dataNodesAreMasterEligible" = "Yes"
    "adminUsername" = "russ"
    "authenticationType" = "password"
    "adminPassword" = "{Super Secret Password}"
    "securityAdminPassword" = "{Super Secret Admin User Password}"
    "securityReadPassword" = "{Super Secret Read User Password}"
    "securityKibanaPassword" = "{Super Secret Kibana User Password}"
    "appGatewayCertBlob" = $cert
    "appGatewayCertPassword" = "{Password for cert.pfx (if it has one)}"
}
Write-Host "[$(Get-Date -format 'u')] Deploying cluster"
New-AzureRmResourceGroup -Name $resourceGroup -Location $location
New-AzureRmResourceGroupDeployment -Name $name -ResourceGroupName $resourceGroup -TemplateUri $elasticTemplate -TemplateParameterObject $clusterParameters
Write-Host "[$(Get-Date -format 'u')] Deployed cluster"
Take a look at the other parameters available for the template which can be used to control other elements such as size and number of disks to attach to each data node, setting up Azure Cloud plugin for snapshot/restore, etc.

Problems Running Azure Automation Powershell to Scale Database Back After Restore Operation

I am trying to scale back a database after the restore operation has completed and am running into some problems. I am getting the exception below and wonder if there is something in this script that is not supported by Azure Automation workflows?
Parameter set cannot be resolved using the specified named parameters.
workflow insertflowname
{
    <#
    .SYNOPSIS
        The purpose of this runbook is to demonstrate how to restore a particular database to a new database using an Azure Automation workflow. The new database is then scaled back to Basic.
    .NOTES
    #>

    # Specify Azure subscription name
    $subName = 'insert subscription name'

    # Connect to Azure subscription
    Connect-Azure -AzureConnectionName $subName
    Select-AzureSubscription -SubscriptionName $subName

    # Define source database name
    $SourceDatabaseName = 'insert database name'
    # Define source server
    $SourceServerName = 'insert source server'
    # Define destination server
    $TargetServerName = 'insert destination server'

    Write-Output "`$SourceServerName [$SourceServerName]"
    Write-Output "`$TargetServerName [$TargetServerName]"
    Write-Output "`$SourceDatabaseName [$SourceDatabaseName]"

    Write-Output "Retrieving recoverable database details for database [$SourceDatabaseName] on server [$SourceServerName]."
    $RecoverableDatabase = Get-AzureSqlRecoverableDatabase -ServerName $SourceServerName -DatabaseName $SourceDatabaseName

    $TargetDatabaseName = "$SourceDatabaseName-$($RecoverableDatabase.LastAvailableBackupDate.ToString('O'))"
    Write-Output "`$TargetDatabaseName [$TargetDatabaseName]"

    Write-Output "Starting recovery of database [$SourceDatabaseName] to server [$TargetServerName] as database [$TargetDatabaseName]."
    Start-AzureSqlDatabaseRecovery -SourceDatabase $RecoverableDatabase -TargetServerName $TargetServerName -TargetDatabaseName $TargetDatabaseName

    $PollingInterval = 10
    Write-Output "Monitoring status of recovery operation, polling every [$PollingInterval] second(s)."

    $KeepGoing = $true
    while ($KeepGoing) {
        $operation = Get-AzureSqlDatabaseOperation -ServerName $TargetServerName -DatabaseName $TargetDatabaseName | Where-Object {$_.Name -eq "DATABASE RECOVERY"} | Sort-Object StartTime -Descending
        if ($operation) {
            $operation[0]
            if ($operation[0].State -eq "COMPLETED") { $KeepGoing = $false }
            if ($operation[0].State -eq "FAILED") {
                # Throw error
                $KeepGoing = $false
            }
        } else {
            # Throw error since something went wrong and the object was not created.
            # May want to have this retry a few times before giving up, or at least notify somebody,
            # since at this point the recovery has been kicked off and you don't want the restored
            # database to remain at the elevated service level.
            $KeepGoing = $false
        }
        if ($KeepGoing) { Start-Sleep -Seconds $PollingInterval }
    }

    if ($operation[0].State -eq "COMPLETED") {
        Write-Output "Setting service level for database [$TargetDatabaseName] on server [$TargetServerName] to Basic."
        $ServiceObjective = Get-AzureSqlDatabaseServiceObjective -ServerName $TargetServerName -ServiceObjectiveName "Basic"
        $ServiceObjective
        Set-AzureSqlDatabase -ServerName $TargetServerName -DatabaseName $TargetDatabaseName -Edition "Basic" -ServiceObjective $ServiceObjective -MaxSizeGB 2 -Force
    }
}
You are probably hitting the issue described here: https://social.msdn.microsoft.com/Forums/en-US/ce6412b8-5cce-4573-befb-6017924ce0d0/whereobject-fails-with-parameter-set-cannot-be-resolved-using-the-specified-named-parameters?forum=azureautomation
Summary:
In PowerShell Workflow, use parameter names rather than relying on positional parameters. In this case, you need to add the -FilterScript parameter name to Where-Object.
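Applied to the runbook above, the polling line becomes something like this (a sketch; only the parameter names are spelled out, the logic is unchanged):
$operation = Get-AzureSqlDatabaseOperation -ServerName $TargetServerName -DatabaseName $TargetDatabaseName |
    Where-Object -FilterScript { $_.Name -eq "DATABASE RECOVERY" } |
    Sort-Object -Property StartTime -Descending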

How to operate on a list of IIS application pools on a remote server using PowerShell?

I am trying to build a PowerShell program that would:
connect to the remote server
show the number of active IIS app pools on that server
reset an app pool based on the selection (1, 2, 3, 4, ... n)
Can you please give me some tips?
Give this a try:
[Reflection.Assembly]::LoadWithPartialName('Microsoft.Web.Administration')
$sm = [Microsoft.Web.Administration.ServerManager]::OpenRemote('server1')
$sm.ApplicationPools['AppPoolName'].Recycle()
Building upon the answers already given, try the following. It uses PowerShell remoting, specifically Invoke-Command, so you will need to familiarise yourself with that.
[CmdletBinding(SupportsShouldProcess=$true,ConfirmImpact="High")]
param
(
    [parameter(Mandatory=$true,ValueFromPipeline=$true)]
    [string]$ComputerName,

    [parameter(Mandatory=$false)]
    [System.Management.Automation.PSCredential]$Credential
)

begin
{
    if (!($Credential))
    {
        # Prompt for credentials if not passed in
        $Credential = Get-Credential
    }

    $scriptBlock = {
        Import-Module WebAdministration

        # Get all running app pools
        $applicationPools = Get-ChildItem IIS:\AppPools | ? {$_.state -eq "Started"}

        $i = 0

        # Display a basic menu
        Write-Host "`nApplication Pools`n"
        $applicationPools | % {
            "[{0}]`t{1}" -f $i, $($applicationPools[$i].Name)
            $i++
        }

        # Get their choice
        $response = Read-Host -Prompt "`nSelect Application Pool to recycle"

        # Grab the associated object, which will be null
        # if an out of range choice was entered
        $appPool = $applicationPools[$response]

        if ($appPool)
        {
            "Recycling '{0}'" -f $appPool.name
            $appPool.recycle()
        }
    }
}

process
{
    Invoke-Command -ComputerName $computerName -Credential $credential -ScriptBlock $scriptBlock
}
I can't help with existing code, but here are some links:
Check out remote PowerShell sessions here
Check out the Web Server (IIS) Administration Cmdlets in Windows PowerShell, especially Get-WebApplication and Get-WebAppPoolState
If reset means stop, then you could take a look at Stop-WebAppPool
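Putting those pieces together, a rough sketch using remoting (the server and app pool names are placeholders, and Restart-WebAppPool is used on the assumption that "reset" means recycle/restart):
# Run the WebAdministration cmdlets on the remote server via PowerShell remoting
Invoke-Command -ComputerName 'server1' -ScriptBlock {
    Import-Module WebAdministration

    # List the app pools and their current state
    Get-ChildItem IIS:\AppPools | Select-Object Name, State

    # Restart (recycle) a specific app pool by name
    Restart-WebAppPool -Name 'MyAppPool'
}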

Validating PowerShell PSCredential

Let's say I have a PSCredential object in PowerShell that I created using Get-Credential.
How can I validate the input against Active Directory?
For now I have found this way, but I feel it's a bit ugly:
[void][System.Reflection.Assembly]::LoadWithPartialName("System.DirectoryServices.AccountManagement")

function Validate-Credentials([System.Management.Automation.PSCredential]$credentials)
{
    $pctx = New-Object System.DirectoryServices.AccountManagement.PrincipalContext([System.DirectoryServices.AccountManagement.ContextType]::Domain, "domain")
    $nc = $credentials.GetNetworkCredential()
    return $pctx.ValidateCredentials($nc.UserName, $nc.Password)
}

$credentials = Get-Credential
Validate-Credentials $credentials
[Edit, two years later] For future readers, please note that Test-Credential or Test-PSCredential are better names, because Validate is not an approved PowerShell verb (see Get-Verb).
I believe using System.DirectoryServices.AccountManagement is the less ugly way. Here is a version using ADSI (more ugly?):
$cred = Get-Credential # Read credentials
$username = $cred.UserName
$password = $cred.GetNetworkCredential().Password

# Get current domain using logged-on user's credentials
$CurrentDomain = "LDAP://" + ([ADSI]"").distinguishedName
$domain = New-Object System.DirectoryServices.DirectoryEntry($CurrentDomain, $username, $password)

if ($domain.name -eq $null)
{
    Write-Host "Authentication failed - please verify your username and password."
    exit # Terminate the script
}
else
{
    Write-Host "Successfully authenticated with domain $($domain.name)"
}
I was having a similar issue with an installer and needed to verify the service account details supplied. I wanted to avoid using the AD module in PowerShell, as I wasn't 100% sure it would be installed on the machine running the script.
I did the test using the code below. It is slightly dirty, but it does work.
try {
    # $c is the PSCredential to test; launching a process with it fails if the credentials are invalid
    Start-Process -Credential $c -FilePath ping -WindowStyle Hidden
} catch {
    Write-Error $_.Exception.Message
    break
}
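Tying this back to the naming note in the question, one way to wrap that trick in a reusable Test-Credential function (a sketch; -ErrorAction Stop is added so the failure is catchable, and ping is just an arbitrary harmless executable):
function Test-Credential([System.Management.Automation.PSCredential]$Credential)
{
    try {
        # Launching any process under the supplied credential fails if it is invalid
        Start-Process -Credential $Credential -FilePath ping -WindowStyle Hidden -ErrorAction Stop
        return $true
    } catch {
        return $false
    }
}

$cred = Get-Credential
Test-Credential $cred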
