I need to stop all my VMs in the Azure portal when my balance reaches $0. How can I do it? Maybe with some script?
Unfortunately, you can't do that with a script for now: the balance (or billing) is not exposed via the Azure REST API (or Azure PowerShell).
For the shutdown itself, we use a runbook with the following code inside.
We run it every day at 8 PM. (You could also trigger it from a billing alert: http://www.matthewjbailey.com/create-azure-billing-alert-email/)
workflow ShutDown-AllVMs {
    param (
        [parameter(Mandatory=$true)]
        [String] $VMCredentialName = "ourcred@xyz.com"
    )

    $Credential = Get-AutomationPSCredential -Name $VMCredentialName
    if ($Credential -eq $null) {
        throw "Could not retrieve '$VMCredentialName' credential asset. Check that you created this asset in the Automation service."
    }

    Add-AzureAccount -Credential $Credential
    Select-AzureSubscription BizSpark

    # Stop every VM that is not already stopped and deallocated
    InlineScript {
        Get-AzureVM | ? { $_.Status -ne "StoppedDeallocated" } | Stop-AzureVM -Force
    }
}
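If you would rather create the daily 8 PM trigger from PowerShell than click through the portal, something along these lines should work. This is only a sketch using the classic Azure Automation cmdlets; the Automation account and schedule names below are placeholders:
# Placeholder name -- substitute your own Automation account
$automationAccount = "MyAutomationAccount"
# Create a schedule that fires every day at 8 PM
New-AzureAutomationSchedule -AutomationAccountName $automationAccount -Name "Daily8PMShutdown" -StartTime "20:00" -DayInterval 1
# Attach the schedule to the runbook above
Register-AzureAutomationScheduledRunbook -AutomationAccountName $automationAccount -Name "ShutDown-AllVMs" -ScheduleName "Daily8PMShutdown"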
For alerts:
Set up the alert: http://www.matthewjbailey.com/create-azure-billing-alert-email/
Monitor the alert: http://blogs.technet.com/b/keithmayer/archive/2014/11/08/scripts-to-tools-automate-monitoring-alert-rules-in-microsoft-azure-with-powershell-and-the-azure-service-management-rest-api.aspx
I have not tried all this together, so you may have a 2+2=5 issue :) but have a read of the blogs and you may find that you get 4 :)
I am using an Azure PowerShell task in an Azure (VSTS) pipeline to upload a file to an Azure Classic container.
As part of setting up the Azure PowerShell task (ver 3.1.10), I added the Azure Classic subscription that this container lives in under Project Settings\Service connections.
I then select that subscription in the pipeline task.
When I execute the script, the logs show that the task is both setting and selecting the expected subscription.
However, if I don't explicitly (re-)create and pass in the AzureStorageContext to the Set-AzureStorageBlobContent function, the task fails with:
[error]Could not get the storage context. Please pass in a storage context or set the current storage context.
Is this expected behavior?
Is there any way around having to re-create and pass in the context when it appears that it already exists?
For example, is there an environment variable that might contain the context that was automatically created/selected that I can just pass in?
Update:
I suspect that if the Select-AzureSubscription call seen in the logs used the -Current switch, this would work as I'm expecting it to.
However, since that command is run automatically with no way to configure it via the pipeline task, I can't verify that.
Perhaps this needs to be a feature request?
Excerpts from script:
# Not passing in the context results in the error:
Set-AzureStorageBlobContent -File "$file" -Blob "$blobpath/$blah" -Container $blobname -Properties $properties -Force

# Passing in the context works:
$azureKey = Get-AzureStorageKey "$accountName"
$storagekey = [string]$azureKey.Primary
$context = New-AzureStorageContext "$accountName" -StorageAccountKey $storagekey
Set-AzureStorageBlobContent -File "$file" -Blob "$blobpath/$blah" -Container $blobname -Properties $properties -Context $context -Force
While I probably should just delete this question (having discovered the "answer"), I'll share what I found after more debugging.
TL;DR: this was mostly me not grokking that an Azure subscription (context) does not correlate to an Azure Storage (context).
Is this expected behavior?
Yes.
Simply having a currently set subscription does not mean there's a currently set storage context.
Come to find out, our company has multiple storage accounts in the subscription I was using.
It could be that if a subscription only has one storage account, the function would succeed without specifying a context? Maybe I will research that later.
Is there any way around having to re-create and pass in the context when it appears that it already exists?
No (perhaps because of the multiple storage accounts in the subscription).
I will have to specify/select the current storage context from the current subscription (as I did in the "Passing in the context works" part in my question).
Here's how I arrived at this:
First, I verified what actually was being set (if anything) as the current [subscription] context and then explicitly (re-)set it.
Running the command still failed.
So, it wasn't that the subscription wasn't being set (since it was).
$current = (Get-AzureSubscription -Current).SubscriptionName
Write-Host "current subscription is $current"

$setCurrent = $false
Write-Host "setCurrent is $setCurrent"

$setCurrent = Select-AzureSubscription -Current -SubscriptionName "CDN Subscription" -PassThru
if ($setCurrent)
{
    Write-Host "current set"
    $current = (Get-AzureSubscription -Current).SubscriptionName
    Write-Host "current subscription is $current"
}
else
{
    Write-Host "current not set"
}
It then dawned on me that maybe 'subscription' did not equal 'storage'.
To verify that, I then ran the following:
$current = (Get-AzureSubscription -Current).SubscriptionName
Write-Host "current subscription is $current"

$table = Get-AzureStorageAccount | Format-Table -AutoSize -Property @{Label="Name";Expression={$_.StorageAccountName}},"Label","Location" | Out-String
Write-Host "$table"
The result: 4 storage accounts in the subscription.
Ergo, I will need to specify the account I want to upload to.
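That said, one thing that may be worth trying (I have not verified it here): the classic cmdlets can fall back to a default storage account configured on the subscription itself, which might let Set-AzureStorageBlobContent resolve a context implicitly. A sketch; the storage account name is a placeholder, and depending on your module version the parameter is -CurrentStorageAccountName (older releases used -CurrentStorageAccount):
# Configure a default storage account on the current subscription (placeholder account name)
Set-AzureSubscription -SubscriptionName "CDN Subscription" -CurrentStorageAccountName "mystorageaccount"
# Classic storage cmdlets should now resolve the storage context implicitly
Set-AzureStorageBlobContent -File "$file" -Blob "$blobpath/$blah" -Container $blobname -Properties $properties -Force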
I have created an Azure Data Factory pipeline and deployed it. No issues. When I am in the Azure portal Data Factory blade, I click on the Monitor & Manage button and it takes me to a new tab in MS Edge with the following error:
404 - File or directory not found.
The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable.
Does anyone know what I need to do to be able to monitor my Azure Data Factory pipeline activity?
Thank you.
Not sure about the 404 error; maybe it's just an authentication issue, or permissions in your Azure directory.
To monitor a pipeline, or ADF in general, I would suggest using PowerShell. There are loads of cmdlets to do things beyond the Azure portal UI, including setting a time slice status.
For example, to check what's currently 'In Progress' in your factory, do something like this...
Import-Module Azure

# Params...
$AzureUser = ""         # <<< enter username
$AzurePass = ""         # <<< enter password
$AzureSubscription = "" # <<< enter subscription name
$ResourceGroup = ""     # <<< enter resource group name

# Create credential
$SecurePassword = ConvertTo-SecureString $AzurePass -AsPlainText -Force
$PSCredential = New-Object System.Management.Automation.PSCredential ($AzureUser, $SecurePassword)

# Create Azure connection
Login-AzureRmAccount -Credential $PSCredential | Out-Null

# Set context for subscription (note the Rm cmdlet; the classic Get-AzureSubscription would need a separate Add-AzureAccount login)
$SubId = Get-AzureRmSubscription `
    -SubscriptionName $AzureSubscription | Select-Object SubscriptionId
Set-AzureRmContext -SubscriptionId $SubId.SubscriptionId | Out-Null

# Get ADF details
$ADFName = Get-AzureRmDataFactory `
    -ResourceGroupName $ResourceGroup | Select-Object DataFactoryName

# List activity windows currently in progress
Get-AzureRmDataFactoryActivityWindow `
    -DataFactoryName $ADFName.DataFactoryName `
    -ResourceGroupName $ResourceGroup |
    Where-Object { $_.WindowState -eq "InProgress" }
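And since setting a time slice status came up above, re-running a failed slice looks roughly like this. Again a sketch only: the dataset name and time window are placeholders, and some older module versions used -TableName instead of -DatasetName.
# Reset a failed time slice to 'Waiting' so the factory re-runs it (placeholder dataset and window)
Set-AzureRmDataFactorySliceStatus `
    -ResourceGroupName $ResourceGroup `
    -DataFactoryName $ADFName.DataFactoryName `
    -DatasetName "MyOutputDataset" `
    -StartDateTime "2017-01-01T00:00:00Z" `
    -EndDateTime "2017-01-02T00:00:00Z" `
    -Status "Waiting" `
    -UpdateType "UpstreamInPipeline"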
Here's a link to the other cmdlets.
https://learn.microsoft.com/en-us/powershell/resourcemanager/azurerm.datafactories/v2.5.0/azurerm.datafactories
Hope this helps.
Thank you, Paul. I want to start getting more familiar with Azure PowerShell cmdlets.
I was able to access the portal's Monitor & Manage by opening the portal in Chrome. Seems strange that MS Azure doesn't work in MS Edge but does work in Chrome.
Thank you for your response.
Jon
I'm trying to create a start/stop schedule for my virtual machine. A simple schedule:
Start @ 10am, Stop @ 5pm, don't run on the weekend
Trying to create this automatic schedule is turning into a nightmare!
I don't have time to learn PowerShell.
I looked into doing this through the Automation portal. I have imported a script through the repository.
"Name of Script: Scheduled Virtual Machine Shutdown/Startup by Automys"
and as it is shown in Azure portal:
"assert-autoshutdownshedule."
I now need to edit this script.
Where/how do I input my credentials/parameters? What needs to be changed?
Create a Run As Automation account in the Azure portal,
then import the script below into your runbook.
if (-not (@('Saturday', 'Sunday') -contains (Get-Date).DayOfWeek)) # skip execution if the day of week is Saturday or Sunday
{
    $cred = Get-AutomationPSCredential -Name "Your Credential Asset Name"
    Login-AzureRmAccount -Credential $cred
    Get-AzureRmSubscription
    Select-AzureRmSubscription -SubscriptionName "Your Subscription Name"
    Start-AzureRmVM -Name "VM001" -ResourceGroupName "Your Resource Group" -ErrorAction Continue
}
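For the 5 PM stop, a second runbook on its own schedule can mirror this; a minimal sketch with the same placeholder names (note Stop-AzureRmVM, unlike Start-AzureRmVM, takes -Force to skip the confirmation prompt):
if (-not (@('Saturday', 'Sunday') -contains (Get-Date).DayOfWeek)) # skip execution on weekends
{
    $cred = Get-AutomationPSCredential -Name "Your Credential Asset Name"
    Login-AzureRmAccount -Credential $cred
    Select-AzureRmSubscription -SubscriptionName "Your Subscription Name"
    Stop-AzureRmVM -Name "VM001" -ResourceGroupName "Your Resource Group" -ErrorAction Continue -Force
}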
All the best :)
Let me know if this works.
I have a PaaS VM role that needs to be restarted using the Azure Management Libraries. I tried the following code, but it failed with "BadRequest: The operation is not supported on a role of type MyPaaSVmName". I did successfully restart an IaaS VM using Method 1 below.
Is it possible to restart a PaaS VM role using the Azure Management Libraries?
If not, is there any other way to achieve it using C#?
1.
ComputeManagementClient client = new ComputeManagementClient(cloudCredentials);
client.VirtualMachines.Restart(hostedServiceName, deploymentName, vmName);
2.
ComputeManagementClient client = new ComputeManagementClient(cloudCredentials);
VirtualMachineOperationsExtensions.Restart(client.VirtualMachines, hostserviceName, deploymentName, vmName);
Thank you.
Found the issue.
Method 1 should be as follows, since I am restarting a role instance. Method 2 is wrong.
client.Deployments.RebootRoleInstanceByDeploymentName(hostserviceName, deploymentName, roleName);
Here's how you can do it using Azure PowerShell:
Reset-AzureRoleInstance -ServiceName "MySvc1" -Slot Staging -InstanceName "MyWebRole_IN_0" -Reboot
https://msdn.microsoft.com/en-us/library/azure/dn495202.aspx
And here's a snippet from an Azure Automation runbook which can reboot all of a cloud service's instances, per update domain (so you have no downtime):
https://gallery.technet.microsoft.com/Reboot-Cloud-Service-PaaS-b337a06d
# Note: this is an excerpt from a workflow runbook; foreach -parallel only works inside a workflow
$roleInstances = Get-AzureRole -ServiceName $cloudServiceName -Slot Production -InstanceDetails
Write-Output "Retrieved all role instances for cloud service: $cloudServiceName. Number of instances: $($roleInstances.Count)"

# Group instances per update domain
$roleInstanceGroups = $roleInstances | Group-Object -AsHashTable -AsString -Property InstanceUpgradeDomain
Write-Output "Number of update domains found: $($roleInstanceGroups.Keys.Count)"

# Visit each update domain
foreach ($key in $roleInstanceGroups.Keys)
{
    $perDomainInstances = $roleInstanceGroups.Get_Item($key)
    $count = $perDomainInstances.Count
    Write-Output "Rebooting $count instances in domain $key"

    foreach -parallel ($instance in $perDomainInstances)
    {
        $instanceName = $instance.InstanceName
        Write-Output "Rebooting instance $instanceName"
        Reset-AzureRoleInstance -ServiceName $cloudServiceName -Slot Production -InstanceName $instanceName -Reboot -ErrorAction Stop
    }
}
I have the following PS script that runs fine locally:
Get-AzureVM | Where-Object { $_.Name -eq "my-server-selector" } | select name | ForEach-Object {
    Write-Output $_.Name
    Start-AzureVM $_.Name $_.Name
}
In the context of my local PS console, I add my subscription info and the code executes without a problem; all VMs are printed to the output and the servers are started up.
When I move it to the cloud I need to do a few other things, namely bring the subscription into scope. I do that by creating the credential asset in the portal, adding the account to my script via said credentials, then selecting the correct subscription in the script. I also wrap it in a workflow (there are aspects I intend to parameterize at a later date).
The final code is as follows:
workflow StartServer
{
    $credential = Get-AutomationPSCredential -Name "credential-asset-name"
    Add-AzureAccount -Credential $credential
    Select-AzureSubscription -SubscriptionName "subscription-name"

    Write-Output "Starting the server."
    Get-AzureVM | Where-Object { $_.Name -Contains "my-server-selector" } | select name | ForEach-Object {
        Write-Output $_.Name
        Start-AzureVM $_.Name $_.Name
    }
    Write-Output "Execution Complete."
}
If I remove the Start-AzureVM command, the workflow runs as expected. I get a listing of all the matching VMs printed out. If I attempt to put the command back in, I get the following error:
Parameter set cannot be resolved using the specified named parameters.
So, things I think I know:
- the credentials are working, as I'm getting the correct list of VMs
- the subscription is being correctly set, as it's dumped to the output
- the inner part of the script works in a local PowerShell console without any changes
Can anyone provide any ideas as to what needs to be done differently in an Azure Automation workflow to get this to work?
The fix was to be more explicit in naming the parameters, both in the filter for Where-Object and in the call to Start-AzureVM. I'm not sure why this makes a difference; as I said, the call to write the names of the servers worked without the explicit parameter name, but lo and behold, here it works with it set. (Most likely, PowerShell Workflow runs cmdlets as activities, and activities are stricter about positional parameters than plain PowerShell script.)
The final code of the inner block is as follows:
Get-AzureVM | Where-Object -FilterScript { $_.Name -Contains "my-server-selector" } | select name | ForEach-Object {
    Write-Output $_.Name
    Start-AzureVM -ServiceName $_.Name -Name $_.Name
}
Thanks to @DexterPOSH on Twitter for the direction on -FilterScript.
Please take a look at http://azure.microsoft.com/blog/2014/11/25/introducing-the-azure-automation-script-converter/ which talks about this exact issue. When authoring PowerShell in the ISE to move into Azure Automation, make sure you are testing/writing PowerShell Workflow in the ISE, since PowerShell Workflow has some differences vs plain PowerShell script.
Or, if you need to take a PS script and use it in Azure Automation, make sure you import the script rather than copy/paste it in. Azure Automation will then convert the PS script to a PS Workflow for you. The link above has more details on this.
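To see those differences locally before importing, you can wrap the body in a workflow yourself in the ISE; a minimal sketch reusing the names from the question above:
workflow Test-StartServer
{
    # Inside a workflow, cmdlets run as activities, which require named
    # parameters -- positional arguments can produce the "Parameter set
    # cannot be resolved" error seen in the question
    Get-AzureVM | Where-Object -FilterScript { $_.Name -Contains "my-server-selector" } | ForEach-Object {
        Start-AzureVM -ServiceName $_.Name -Name $_.Name
    }
}
# Running it locally surfaces workflow-only restrictions before you import it
Test-StartServer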