Starting VMs in batches of 5 in parallel - Azure

I have a set of VMs in Azure and need to start them in batches, in parallel. For example, with 100 VMs, I want VMs 1-5 to start in parallel first, then 6-10, and so forth. I am able to start all the VMs in parallel, but I can't find a way of adding limits in the foreach statement:
foreach ($vm in ($vms | Select-Object -First 5))
Any suggestions on how I can do that? Here is my current code:
$vms = Get-AzVM -ResourceGroupName "VmList"
$jobs = @()
foreach ($vm in ($vms | Select-Object -First 5))
{
    $params = @($vm.Name)
    $jobs += Start-Job -ScriptBlock {
        param($ComputerName)
        Start-AzVM -Name $ComputerName -ResourceGroupName "VmList"
    } -ArgumentList $params
}
# Wait for it all to complete
Wait-Job -Job $jobs
# Getting the information back from the jobs
Get-Job | Receive-Job

If you want to keep track yourself, you could use splatting on Select-Object: start with -Skip 0 -First 5, then increment Skip so the next pass is -Skip 5 -First 5, and so on.
$vms = Get-AzVM -ResourceGroupName "VmList"
$batch = @{
    Skip  = 0
    First = 5
}
do
{
    $jobs = @()
    foreach ($vm in ($vms | Select-Object @batch))
    {
        $params = @($vm.Name)
        $jobs += Start-Job -ScriptBlock {
            param($ComputerName)
            Start-AzVM -Name $ComputerName -ResourceGroupName "VmList"
        } -ArgumentList $params
    }
    # Wait for the whole batch to complete
    Wait-Job -Job $jobs
    # Getting the information back from the jobs
    Get-Job | Receive-Job
    $batch.Skip += 5
}
until ($batch.Skip -ge $vms.Count)
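If you can use PowerShell 7+, ForEach-Object -Parallel will do the throttling for you: -ThrottleLimit 5 keeps at most five starts in flight at a time (a rolling window rather than strict batches of five, which is usually what you actually want). A minimal sketch, assuming the Az module is loaded under PowerShell 7:
$vms = Get-AzVM -ResourceGroupName "VmList"
$vms | ForEach-Object -Parallel {
    # At most 5 of these run concurrently
    Start-AzVM -Name $_.Name -ResourceGroupName "VmList"
} -ThrottleLimit 5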

Related

How to loop through multiple Azure subscriptions in parallel

I am trying to use nested ForEach-Object -Parallel to loop through multiple Azure subscriptions in parallel and get data from all VMs in one go. I am using the code below:
$SubsJob = Get-AzSubscription -WarningAction SilentlyContinue | Where-Object {$_.Name -match 'abc'} | ForEach-Object -Parallel {
    $Context = Set-AzContext -Tenant $_.TenantId -SubscriptionId $_.SubscriptionId
    [System.String]$ScriptBlock = {Get-Process}
    $VMsJob = Get-AzVM | ForEach-Object -Parallel {
        $FileName = $using:Context.Subscription.Name + "_$($_.Name)_" + (Get-Random) + ".ps1"
        Out-File -FilePath $FileName -InputObject $using:ScriptBlock -NoNewline
        $Output = Invoke-AzVMRunCommand -Name $_.Name -ResourceGroupName $_.ResourceGroupName -CommandId 'RunPowerShellScript' -ScriptPath $FileName
        $PSCustomObject = [PSCustomObject]@{Subscription = $using:Context.Subscription.Name; ServerName = $_.Name; Output = $Output}
        #Remove-Item -Path $FileName -Force -ErrorAction SilentlyContinue
        Write-Output $PSCustomObject
    } -ThrottleLimit 200 -AsJob
    Write-Output $VMsJob
} -ThrottleLimit 200 -AsJob
I am not able to get it to work, and I am not sure what is wrong. One thing I observed while debugging is that Get-AzVM gets the VM list from all the subscriptions rather than the specific one. I noticed this by looking at the file names generated by Out-File -FilePath $FileName:
sub1_server1_980337551.ps1
sub2_server1_42701325.ps1
server1 is present only in sub1, but it is being picked up in sub2 as well.
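A likely cause is that all the parallel runspaces share the same process-level Az context, so the Set-AzContext calls race and the last one wins for everyone. One hedged workaround is to pass the context object explicitly through the common -DefaultProfile parameter (aliased -AzContext) on each Az cmdlet instead of relying on the ambient context. A simplified sketch, with the inner loop left sequential to avoid $using: complications:
$SubsJob = Get-AzSubscription -WarningAction SilentlyContinue | Where-Object { $_.Name -match 'abc' } | ForEach-Object -Parallel {
    # Keep the context object for this subscription and hand it to every Az call in this runspace
    $Context = Set-AzContext -Tenant $_.TenantId -SubscriptionId $_.SubscriptionId
    Get-AzVM -DefaultProfile $Context | ForEach-Object {
        # per-VM work here; Invoke-AzVMRunCommand also accepts -DefaultProfile $Context
    }
} -ThrottleLimit 5 -AsJob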

Limit the Number of Simultaneously Running Threads

I have a script that does a ping sweep of my work environment, and I am trying to limit the number of simultaneously running threads. I have been playing around with many of the solutions online, but I can't seem to make any progress. Any help will be appreciated.
$scriptblock = {
    Param($comp)
    if (Test-Connection -ComputerName $comp -Quiet -Count 1) {
        [bool]$responding = $true
    }
    else {
        Write-Host "***$comp ERROR -Not responding***"
        $responding = $false
    }
    New-Object psobject -Property @{
        'Computer Name' = $comp
        'Online'        = $responding
    }
}
$CurrentDate = Get-Date
$CurrentDate = $CurrentDate.ToString('MM-dd-yyyy_hh-mm-ss')
$location = Get-Location
$comps = (Get-Content -Path "$location\hostnames.txt").Trim()
$comps | ForEach-Object {Start-Job -ScriptBlock $scriptblock -ArgumentList $_ | Out-Null}
Get-Job | Wait-Job | Receive-Job | Select-Object 'Computer Name','Online' `
    -ExcludeProperty RunspaceId, PSComputerName, PSShowComputerName | Export-Csv -NoTypeInformation "$location\PingTest-$CurrentDate.csv"
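One approach is the throttling pattern used in the threads below: poll the running-job count before starting each new job. A minimal sketch against the code above ($maxJobs = 20 is an arbitrary cap; tune it for your environment):
$maxJobs = 20
$comps | ForEach-Object {
    # Block until a job slot frees up before starting the next ping job
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Seconds 2
    }
    Start-Job -ScriptBlock $scriptblock -ArgumentList $_ | Out-Null
}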

Multi-threaded PowerShell script on a single-processor system

I have access to a single-core single-processor VM with which to do logging for my team. I have the following code:
$sb = {
    Param($_)
    if ($_.CONTROLLER -ne ".xx") {
        $posIP = "10." + $_.IP + $_.CONTROLLER
        if (Test-Connection -ComputerName $posIP -Count 1 -Quiet) {
            $mapPath = "\\" + $posIP + "\c$"
            net use $mapPath $password /user:$userName | Out-Null
            if (Test-Path $mapPath$dataFile) {
                [xml]$periods = Get-Content $mapPath$dataFile
                $endDate = $periods.IndataDbf.ingredient.PeriodDetail.PeriodEndDate | select -Last 1
                $output = "$($_.STORE);$endDate"
            }
            else {
                $output = $_.STORE + ';' + "$dataFile Not Found"
            }
            net use $mapPath /de | Out-Null
        }
        else {
            $output = $_.STORE + ';' + "Map FAILED"
        }
        Write-Output $output
    }
}
Import-Csv $inFile | ForEach-Object {
    while ((Get-Job -State Running).Count -ge 100) {
        Start-Sleep -Seconds 5
    }
    Write-Output $_.STORE
    Start-Job -ScriptBlock $sb -ArgumentList $_ | Write-Verbose
    Get-Job -State Completed -HasMoreData 1 | Receive-Job | Out-File -Append -FilePath $outLog
}
Get-Job | Wait-Job | Receive-Job | Out-File -Append -FilePath $outLog
This runs well, but takes the same amount of time as running the same code in a plain loop without Start-Job. However, the previous logging tool used BATCH files and automatically opened a couple dozen child command windows to process data, then returned, and it runs in under half the time. The code used is the same, so I don't understand why adding more threads didn't make the script run faster. Can anyone tell me why a BATCH file program with a couple dozen child windows runs so much faster with arguably the same code? And why does the Start-Job command not improve the speed at all? I would have thought it would execute multiple threads simultaneously.
Because there is a lot of overhead when using Start-Job, and more whenever you use the pipeline. If you use runspaces instead, it may be faster. Take a look at http://newsqlblog.com/2012/05/22/concurrency-in-powershell-multi-threading-with-runspaces/
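In case the link goes stale, here is a minimal runspace-pool sketch against the $sb above. Unlike Start-Job, runspaces are threads inside the current process, so there is no per-job PowerShell.exe spawn or serialization cost. Note that session variables such as $password and $userName are not visible inside a runspace unless you pass them in:
$pool = [System.Management.Automation.Runspaces.RunspaceFactory]::CreateRunspacePool(1, 25)   # up to 25 concurrent runspaces
$pool.Open()
$work = foreach ($row in Import-Csv $inFile) {
    $ps = [PowerShell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript($sb).AddArgument($row)
    # Keep each instance together with its async handle
    New-Object psobject -Property @{ PS = $ps; Handle = $ps.BeginInvoke() }
}
foreach ($item in $work) {
    $item.PS.EndInvoke($item.Handle) | Out-File -Append -FilePath $outLog
    $item.PS.Dispose()
}
$pool.Close()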

PowerShell Write to Same File Multiple Jobs

I have a script that pulls one line of data from a file on multiple servers. I have a single-threaded version that works just fine, but I want it to run faster. Since I only need one line of one file from each server, I'm sure I could run this in parallel. I pulled code from multiple places to get a multi-threaded script running, but when I try to get all the results to print to one output file, nothing prints. I wonder if anyone can look at my code and tell me why the same script works fine without the jobs, but not after adding them.
$sb = {
    Param($computer, $fileName, $outLog)
    net use "\\$computer\c$" **** /user:****
    if (Test-Path \\$computer\c$\sc\$fileName) {
        [xml]$periods = Get-Content \\$computer\c$\sc\$fileName
        $endDate = $periods.PeriodDetail | select -Last 1
        $output = "$computer;$endDate"
    }
    else {
        $output = "$computer;$fileName Not Found"
    }
    # Synchronize file usage
    $mutex = New-Object System.Threading.Mutex $false, 'SomeUniqueName'
    $mutex.WaitOne() > $null
    # Write data to log
    Out-File -Append -InputObject $output -FilePath $outLog
    # Release file hold
    $mutex.ReleaseMutex()
    net use "\\$computer\c$" /de
}
foreach ($computer in $computerName) {
    while ((Get-Job -State Running).Count -ge 20) {
        Start-Sleep -Seconds 5
    }
    Start-Job -ScriptBlock $sb -ArgumentList $computer, $fileName, $outLog
}
Get-Job | Wait-Job | Receive-Job
Thank you for all the assistance. Here is the resulting code that works pretty well:
$sb = {
    Param($computer, $fileName, $outLog)
    net use "\\$computer\c$" $password /user:$userName | Out-Null
    if (Test-Path \\$computer\c$\sc\$fileName) {
        [xml]$periods = Get-Content \\$computer\c$\sc\$fileName
        $endDate = $periods.IndataDbf.ingredient.PeriodDetail.PeriodEndDate | select -Last 1
        $output = "$computer;$endDate"
    }
    else {
        $output = "$computer;$fileName Not Found"
    }
    Write-Output -InputObject $output
    net use "\\$computer\c$" /de | Out-Null
}
foreach ($computer in $computerName) {
    while ((Get-Job -State Running).Count -ge 20) {
        Start-Sleep -Seconds 5
    }
    Start-Job -ScriptBlock $sb -ArgumentList $computer, $fileName, $outLog
}
Get-Job | Wait-Job | Receive-Job | Out-File -Append -FilePath $outLog
I'm thinking of adding another Get-Job right before the Start-Job, getting only jobs that are complete and have more data, but I haven't tested it yet.
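That would match the pattern already used in the single-processor thread above: drain finished jobs inside the loop so output is written as it arrives. Roughly:
foreach ($computer in $computerName) {
    while ((Get-Job -State Running).Count -ge 20) {
        Start-Sleep -Seconds 5
    }
    # Collect anything already finished before starting the next job
    Get-Job -State Completed -HasMoreData $true | Receive-Job | Out-File -Append -FilePath $outLog
    Start-Job -ScriptBlock $sb -ArgumentList $computer, $fileName, $outLog
}
Get-Job | Wait-Job | Receive-Job | Out-File -Append -FilePath $outLog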

PowerShell v2.0: using multiple threads

Basic script idea:
Hello. I've created a PowerShell script which I use to check the file sizes of certain executables and keep them in a text file. The next time the script runs, if a file size differs, it replaces the one in the text file with the new one.
The structure:
I have a main script and a folder containing many scripts, one for every executable whose file size I want to check. Each script in the folder returns a string containing the link to the executable, which is fed to the main script.
The code:
$progdir = "C:\script\programms"
$items = Get-ChildItem -filter *.ps1 -Path $progdir
$webclient = New-Object System.Net.WebClient
$filesizes = get-content C:\updatechecker\programms\filesizes
if ($filesizes.length -ne $items.length) {
if ($filesizes.length -eq $null) {
Write-Host ("Building filesize database...") -nonewline
}
else {
Write-Host ("Rebuilding filesize database...") -nonewline
}
clear-content C:\programms\filesizes
for ($i=0; $i -le $items.length-1; $i++) {
$command = "c:\programms\" + $items[$i].name
$link = & $command
$webclient.OpenRead($link) | Out-Null
$filesize = $webclient.ResponseHeaders["Content-Length"]
$filesize >> C:\programms\filesizes
}
echo "Done."
}
else {
...
Question:
This for loop is the one I want to run in parallel. I need your advice on how to do this since I'm new to PowerShell. I tried to implement a few things I found, but they didn't work correctly (they took very long to finish, output errors, and left multiple entries of file sizes in my filesizes file). I suspect it's a synchronization issue and somehow I need to lock the critical parts. Isn't there anything like omp parallel for in PowerShell? :P
Any help or advice on how to achieve this would be appreciated :)
Edit:
Get-Job | Remove-Job -Force
$progdir = "C:\programms"
$items = Get-ChildItem -Filter *.ps1 -Path $progdir
$webclient = New-Object System.Net.WebClient
$filesizes = Get-Content C:\programms\filesizes
$jobWork = {
    param ($MyInput)
    $command = "c:\programms\" + $MyInput
    $link = & $command
    $webclient.OpenRead($link) | Out-Null
    $filesize = $webclient.ResponseHeaders["Content-Length"]
    $filesize >> C:\programms\filesizes
}
foreach ($item in $items) {
    Start-Job -ScriptBlock $jobWork -ArgumentList $item.name | Out-Null
}
Get-Job | Wait-Job
Get-Job | Receive-Job | Out-GridView | Out-Null
echo "Done."
Edit 2: Used code I found here: http://ryan.witschger.net/?p=22
$mutex = New-Object -TypeName System.Threading.Mutex -ArgumentList $false, "RandomGlobalMutexName"
$MaxThreads = 4
$SleepTimer = 500
$jobWork = {
    param ($MyInput)
    $webclient = New-Object System.Net.WebClient
    $command = "c:\programms\" + $MyInput
    $link = & $command
    $webclient.OpenRead($link) | Out-Null
    $result = $mutex.WaitOne()
    $file = $webclient.ResponseHeaders["Content-Length"]
    $file >> C:\programms\filesizes
    $mutex.ReleaseMutex()
}
$progdir = "C:\programms"
$items = Get-ChildItem -Filter *.ps1 -Path $progdir
$webclient = New-Object System.Net.WebClient
$filesizes = Get-Content C:\programms\filesizes
Get-Job | Remove-Job -Force
$i = 0
foreach ($item in $items) {
    while ($(Get-Job -State Running).Count -ge $MaxThreads) {
        Start-Sleep -Milliseconds $SleepTimer
    }
    $i++
    Start-Job -ScriptBlock $jobWork -ArgumentList $item.name | Out-Null
}
You can run each iteration of the loop in a background job, which is not the same as a separate thread: it is a whole other PowerShell.exe process. Data is passed back from the background processes through serialization.
To approach it using background jobs, you'll need to define a script block that does the actual work, and then call the script block with parameters in each iteration of the loop. The script block can report status back via Write-Output or by throwing an exception.
You'll probably want to throttle how many concurrent background jobs are running. Here's an example of how to throttle:
$jobItems = "a", "b", "c", "d", "e"
$jobMax = 2
$jobs = #()
$jobWork = {
param ($MyInput)
if ($MyInput -eq "d") {
throw "an example of an error"
} else {
write-output "Processed $MyInput"
}
}
foreach ($jobItem in $jobItems) {
if ($jobs.Count -le $jobMax) {
$jobs += Start-Job -ScriptBlock $jobWork -ArgumentList $jobItem
} else {
$jobs | Wait-Job -Any
}
}
$jobs | Wait-Job
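The results (and any error a job threw) can then be collected the same way as in the other threads above:
$jobs | Wait-Job | Receive-Job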
As an alternative, you might try eventing. Take a look at this thread for some examples of how to implement concurrency using events: PowerShell: Runspace problem with DownloadFileAsync. You might be able to replace DownloadFileAsync with OpenReadAsync.
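On the synchronization problem in the edits above: Start-Job runs each job in a separate PowerShell.exe process, so a $mutex created in the parent session is never visible inside the job. A named system mutex constructed inside the script block does synchronize across processes. A sketch of the second edit's job with that fix (the mutex name is illustrative):
$jobWork = {
    param ($MyInput)
    $webclient = New-Object System.Net.WebClient
    $command = "c:\programms\" + $MyInput
    $link = & $command
    $webclient.OpenRead($link) | Out-Null
    $filesize = $webclient.ResponseHeaders["Content-Length"]
    # Open/create the named mutex HERE, inside the job process;
    # the $mutex from the parent session never reaches this scope.
    $mutex = New-Object System.Threading.Mutex($false, 'Global\FilesizeLogMutex')
    [void]$mutex.WaitOne()
    try {
        $filesize >> C:\programms\filesizes
    }
    finally {
        $mutex.ReleaseMutex()
    }
}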
