I have access to a single-core single-processor VM with which to do logging for my team. I have the following code:
$sb = {
    Param($_)
    if ($_.CONTROLLER -ne ".xx") {
        $posIP = "10." + $_.IP + $_.CONTROLLER
        if (Test-Connection -ComputerName $posIP -Count 1 -Quiet) {
            $mapPath = "\\" + $posIP + "\c$"
            net use $mapPath $password /user:$userName | Out-Null
            if (Test-Path $mapPath$dataFile) {
                [xml]$periods = Get-Content $mapPath$dataFile
                $endDate = $periods.IndataDbf.ingredient.PeriodDetail.PeriodEndDate | select -Last 1
                $output = "$($_.STORE);$endDate"
            }
            else {
                $output = $_.STORE + ';' + "$dataFile Not Found"
            }
            net use $mapPath /de | Out-Null
        }
        else {
            $output = $_.STORE + ';' + "Map FAILED"
        }
        Write-Output $output
    }
}
Import-Csv $inFile | ForEach-Object {
    while ((Get-Job -State Running).Count -ge 100) {
        Start-Sleep -Seconds 5
    }
    Write-Output $_.STORE
    Start-Job -Scriptblock $sb -ArgumentList $_ | Write-Verbose
    Get-Job -State Completed -HasMoreData $true | Receive-Job | Out-File -Append -FilePath $outLog
}
Get-Job | Wait-Job | Receive-Job | Out-File -Append -FilePath $outLog
This runs well, but it takes the same amount of time as running the same code in a plain loop without Start-Job. However, the previous logging solution used batch files that automatically opened a couple dozen child command windows to process the data and return, and it ran in under half the time. The logic is essentially the same, so I don't understand why adding more jobs didn't make the script run faster. Can anyone tell me why a batch-file program with a couple dozen child windows runs so much faster with arguably the same code? And why does Start-Job not improve the speed at all? I would have thought it would execute multiple threads simultaneously.
Because there is a lot of overhead when using Start-Job, and more whenever you use the pipeline. Each job spins up a whole new PowerShell process, and its results have to be serialized back to the parent.
If you use runspaces instead it may be faster. Take a look at http://newsqlblog.com/2012/05/22/concurrency-in-powershell-multi-threading-with-runspaces/
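For comparison, here is a minimal runspace-pool sketch of the same per-row work; the throttle of 25 and the scriptblock body are placeholders, not the poster's actual values:
$pool = [runspacefactory]::CreateRunspacePool(1, 25)   # up to 25 concurrent runspaces in one process
$pool.Open()
$handles = foreach ($row in Import-Csv $inFile) {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    [void]$ps.AddScript({ param($row) "$($row.STORE);done" }).AddArgument($row)   # placeholder work
    [pscustomobject]@{ PowerShell = $ps; Async = $ps.BeginInvoke() }
}
# Collect results as each runspace completes, then clean up
foreach ($h in $handles) {
    $h.PowerShell.EndInvoke($h.Async) | Out-File -Append -FilePath $outLog
    $h.PowerShell.Dispose()
}
$pool.Close()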
Related
I have a script that pulls one line of data from a file on multiple servers. I have a single-threaded version that works just fine, but I want to get it to run faster. Since I only need one line of one file from each server, I'm sure I could run this in parallel. I pulled code from multiple places to get a multi-threaded script running, but when I try to get all the results to print to one output file, nothing prints. I wonder if anyone can look at my code to tell me why this same script, without the Jobs, works fine, but after adding jobs, it doesn't.
$sb = {
    Param($computer, $fileName, $outLog)
    net use "\\$computer\c$" **** /user:****
    if (Test-Path \\$computer\c$\sc\$fileName) {
        [xml]$periods = Get-Content \\$computer\c$\sc\$fileName
        $endDate = $periods.PeriodDetail | select -Last 1
        $output = "$computer;$endDate"
    }
    else {
        $output = "$computer;$fileName Not Found"
    }
    # Synchronize file usage
    $mutex = New-Object System.Threading.Mutex $false, 'SomeUniqueName'
    $mutex.WaitOne() > $null
    # Write data to log
    Out-File -Append -InputObject $output -FilePath $outLog
    # Release file hold
    $mutex.ReleaseMutex()
    net use "\\$computer\c$" /de
}
foreach ($computer in $computerName) {
    while ((Get-Job -State Running).Count -ge 20) {
        Start-Sleep -Seconds 5
    }
    Start-Job -Scriptblock $sb -ArgumentList $computer,$fileName,$outLog
}
Get-Job | Wait-Job | Receive-Job
Thank you for all the assistance. Here is the resulting code that works pretty well:
$sb = {
    Param($computer, $fileName, $outLog)
    net use "\\$computer\c$" $password /user:$userName | Out-Null
    if (Test-Path \\$computer\c$\sc\$fileName) {
        [xml]$periods = Get-Content \\$computer\c$\sc\$fileName
        $endDate = $periods.IndataDbf.ingredient.PeriodDetail.PeriodEndDate | select -Last 1
        $output = "$computer;$endDate"
    }
    else {
        $output = "$computer;$fileName Not Found"
    }
    Write-Output -InputObject $output
    net use "\\$computer\c$" /de | Out-Null
}
foreach ($computer in $computerName) {
    while ((Get-Job -State Running).Count -ge 20) {
        Start-Sleep -Seconds 5
    }
    Start-Job -Scriptblock $sb -ArgumentList $computer,$fileName,$outLog
}
Get-Job | Wait-Job | Receive-Job | Out-File -Append -FilePath $outLog
I'm thinking of doing another Get-Job right before the Start-Job, getting only jobs that are complete with more data, but I haven't tested it yet.
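A minimal, untested sketch of that extra drain step, reusing the same $outLog, would be something like this placed just before the Start-Job call:
# Drain any finished jobs that still have output, then drop the ones already emptied
Get-Job -State Completed -HasMoreData $true | Receive-Job | Out-File -Append -FilePath $outLog
Get-Job -State Completed -HasMoreData $false | Remove-Job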
All the tutorials I have found use a predefined sleep time to throttle jobs.
I need the throttle to wait until a job has completed before starting a new one.
Only 4 jobs can be running at one time.
Currently the script spins up 4 jobs, pauses for 10 seconds, then spins up the rest.
What I want is for the script to only allow 4 jobs to be running at one time, and as soon as a job completes, a new one is kicked off.
Jobs are initialised from a list of server names.
Is it possible to achieve this?
$servers = Get-Content "C:\temp\flashfilestore\serverlist.txt"
$scriptBlock = { #DO STUFF }
$MaxThreads = 4
foreach ($server in $servers) {
    Start-Job -ScriptBlock $scriptBlock -ArgumentList $server
    While ($(Get-Job -State 'Running').Count -ge $MaxThreads) {
        sleep 10 # Need this to wait until a job is complete and kick off a new one.
    }
}
Get-Job | Wait-Job | Receive-Job
You can test the following:
$servers = Get-Content "C:\temp\flashfilestore\serverlist.txt"
$scriptBlock = { #DO STUFF }
Invoke-Command -ComputerName $servers -ScriptBlock $scriptBlock -JobName 'YourJobSpecificName' -ThrottleLimit 4 -AsJob
This command uses the Invoke-Command cmdlet and its AsJob parameter to start a background job that runs a scriptblock on numerous computers. Because the command must not be run more than 4 times concurrently, the command uses the ThrottleLimit parameter of Invoke-Command to limit the number of concurrent commands to 4.
Be careful that the file contains the names of computers that are in a domain, since Invoke-Command reaches them via PowerShell remoting.
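To collect the results later, you can wait on that named job; a small sketch using the same job name as above:
$results = Get-Job -Name 'YourJobSpecificName' | Wait-Job | Receive-Job
$results   # one result per server; unreachable computers surface as job errors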
To avoid reinventing the wheel, I would recommend using one of the existing tools.
One of them is the script Invoke-Parallel.ps1.
It is written in PowerShell, so you can see directly how it is implemented. It is easy to get and does not require any installation.
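A typical call looks roughly like the sketch below; the -ScriptBlock and -Throttle parameter names are an assumption based on common versions of the script, so check the copy you download:
. .\Invoke-Parallel.ps1    # dot-source the script to load the function (assumed layout)
$servers | Invoke-Parallel -ScriptBlock { Test-Connection $_ -Count 1 -Quiet } -Throttle 4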
Another one is the module SplitPipeline.
It may work faster because it is written in C#. It also covers some more use cases, for example slow or infinite input and the use of initialization and cleanup scripts.
In the latter case the code with 4 parallel pipelines will be
$servers | Split-Pipeline -Count 4 {process{ <# DO STUFF on $_ #> }}
I wrote a blog article which covers multithreading any given script via actual threads. You can find the full post here:
http://www.get-blog.com/?p=189
The basic setup is:
# Create a session state and a runspace pool allowing up to $MaxThreads concurrent threads
$ISS = [system.management.automation.runspaces.initialsessionstate]::CreateDefault()
$RunspacePool = [runspacefactory]::CreateRunspacePool(1, $MaxThreads, $ISS, $Host)
$RunspacePool.Open()
# Load the script to run and attach it to a PowerShell instance that uses the pool
$Code = [ScriptBlock]::Create($(Get-Content $FileName))
$PowershellThread = [powershell]::Create().AddScript($Code)
$PowershellThread.RunspacePool = $RunspacePool
# Start the thread asynchronously and keep track of it in a small "job" object
$Handle = $PowershellThread.BeginInvoke()
$Job = "" | Select-Object Handle, Thread, object
$Job.Handle = $Handle
$Job.Thread = $PowershellThread
$Job.Object = $Object.ToString()
# Later: wait for the thread to finish, collect its output, and clean up
$Job.Thread.EndInvoke($Job.Handle)
$Job.Thread.Dispose()
Instead of sleep 10 you could also just wait on a job (-any job):
Get-Job | Wait-Job -Any | Out-Null
When there are no more jobs to kick off, start printing the output. You can also do this within the loop immediately after the above command. The script will receive jobs as they finish instead of waiting until the end.
Get-Job -State Completed | % {
Receive-Job $_ -AutoRemoveJob -Wait
}
So your script would look like this:
$servers = Get-Content "C:\temp\flashfilestore\serverlist.txt"
$scriptBlock = { #DO STUFF }
$MaxThreads = 4
foreach ($server in $servers) {
    Start-Job -ScriptBlock $scriptBlock -ArgumentList $server
    While ($(Get-Job -State Running).Count -ge $MaxThreads) {
        Get-Job | Wait-Job -Any | Out-Null
    }
    Get-Job -State Completed | % {
        Receive-Job $_ -AutoRemoveJob -Wait
    }
}
While ($(Get-Job -State Running).Count -gt 0) {
    Get-Job | Wait-Job -Any | Out-Null
}
Get-Job -State Completed | % {
    Receive-Job $_ -AutoRemoveJob -Wait
}
Having said all that, I prefer runspaces (similar to Ryan's post) or even workflows if you can use them. These are far less resource-intensive than starting multiple PowerShell processes.
Your script looks good. Try adding something like
Write-Host ("current count:" + ($(Get-Job -State 'Running').Count) + " on server:" + $server)
after your while loop to work out whether the job count is dropping where you wouldn't expect it to.
I noticed that every Start-Job command resulted in an additional conhost.exe process in the task manager. Knowing this, I was able to throttle using the following logic, where 5 is my desired number of concurrent threads (so I use 4 in my -gt statement since I am looking for a count greater than):
while((Get-Process conhost -ErrorAction SilentlyContinue).Count -gt 4){Start-Sleep -Seconds 1}
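In context, that check sits at the top of the dispatch loop; a small sketch where the server list and the scriptblock are placeholders:
foreach ($server in (Get-Content serverlist.txt)) {
    # Gate new jobs on the number of conhost processes currently alive
    while ((Get-Process conhost -ErrorAction SilentlyContinue).Count -gt 4) { Start-Sleep -Seconds 1 }
    Start-Job -ScriptBlock { param($s) Test-Connection $s -Count 1 -Quiet } -ArgumentList $server
}
Get-Job | Wait-Job | Receive-Job
Note that other console sessions also own conhost processes, so the baseline count on your machine may differ.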
I'm trying to use Start-Job to run a command to collect security logs from some servers.
I'm parsing a .ini file to get the list of servers, number of days etc.
#___Collect Logs from Servers___#
$servList = $iniContent["SERVERS"]["svr"]
$days = $iniContent["DAYS"]["days"]
$date = $(get-date -format ddMMyyyy)
$err = "Error Collecting $($logType) from $($server) or the Event Log is empty! | $(Get-Date -format g) "
$serv = $servList.Split(",")
foreach ($server in $serv) {
    $outfile = "D:\DCLogs\$($date)_$($server)_$logType.txt"
    $ScriptBlock = cmd /c "D:\CollectLog\Dumpel.exe -f $($outFile) -l $($logType) -s $($server) -d $($days)"
    Start-Job -ScriptBlock $ScriptBlock
    Get-Job | Wait-Job
    $file = Get-ChildItem D:\DCLogs -Filter "$($date)_$($server)*" -Name
    $len = $file.length/1KB # Check LogFile Size
    if ($len -eq 0) {
        $errCount = 1
        Write-Output $err | Out-File $errLog -Append
    }
}
It's only starting one job at a time so I know I'm doing something wrong. If someone could please point out the problem I'd greatly appreciate it.
Thank you.
Amelia
Calling Get-Job | Wait-Job inside the loop just serializes the jobs. Use the loop only to start the jobs, then call Get-Job | Wait-Job once outside the loop.
Also, define your ScriptBlock using braces:
$ScriptBlock = {...}
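Put together, a sketch of the corrected loop; the paths and variables come from the question, and the param names are just there to pass the per-server values into the job:
$ScriptBlock = {
    param($outFile, $logType, $server, $days)
    # Run Dumpel.exe inside the job rather than at script-block definition time
    cmd /c "D:\CollectLog\Dumpel.exe -f $outFile -l $logType -s $server -d $days"
}
foreach ($server in $serv) {
    $outFile = "D:\DCLogs\$($date)_$($server)_$logType.txt"
    Start-Job -ScriptBlock $ScriptBlock -ArgumentList $outFile, $logType, $server, $days
}
Get-Job | Wait-Job   # wait for all servers at once, then check the log sizes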
Basic script idea:
Hello. I've created a PowerShell script that checks the file sizes of certain executables and keeps them in a text file. The next time the script runs, if a file size differs, it replaces the one in the text file with the new value.
The structure:
I have a main script and a folder containing many small scripts, one for each executable whose filesize I want to check. Each script in the folder returns a string containing the link to its executable, which is fed back to the main script.
The code:
$progdir = "C:\script\programms"
$items = Get-ChildItem -filter *.ps1 -Path $progdir
$webclient = New-Object System.Net.WebClient
$filesizes = get-content C:\updatechecker\programms\filesizes
if ($filesizes.length -ne $items.length) {
    if ($filesizes.length -eq $null) {
        Write-Host ("Building filesize database...") -NoNewline
    }
    else {
        Write-Host ("Rebuilding filesize database...") -NoNewline
    }
    Clear-Content C:\programms\filesizes
    for ($i=0; $i -le $items.length-1; $i++) {
        $command = "c:\programms\" + $items[$i].name
        $link = & $command
        $webclient.OpenRead($link) | Out-Null
        $filesize = $webclient.ResponseHeaders["Content-Length"]
        $filesize >> C:\programms\filesizes
    }
    echo "Done."
}
else {
...
Question:
This for loop is the one I want to run in parallel. I need your advice on how to do this, since I'm new to PowerShell. I tried to implement a few things I found, but they didn't work correctly (they took very long to finish, produced errors, or wrote multiple entries of file sizes to my filesizes file). I suspect it's a synchronization issue and that I somehow need to lock the critical parts. Isn't there anything like OpenMP's parallel for in PowerShell? :P
Any help,advice on how to achieve this would be appreciated :)
edit:
Get-Job | Remove-Job -Force
$progdir = "C:\programms"
$items = Get-ChildItem -filter *.ps1 -Path $progdir
$webclient = New-Object System.Net.WebClient
$filesizes = get-content C:\programms\filesizes
$jobWork = {
    param ($MyInput)
    $command = "c:\programms\" + $MyInput
    $link = & $command
    $webclient.OpenRead($link) | Out-Null
    $filesize = $webclient.ResponseHeaders["Content-Length"]
    $filesize >> C:\programms\filesizes
}
foreach ($item in $items) {
    Start-Job -ScriptBlock $jobWork -ArgumentList $item.name | Out-Null
}
Get-Job | Wait-Job
Get-Job | Receive-Job | Out-GridView | Out-Null
echo "Done."
Edit 2: Used code I found here: http://ryan.witschger.net/?p=22
$mutex = New-Object -TypeName System.Threading.Mutex -ArgumentList $false, "RandomGlobalMutexName";
$MaxThreads = 4
$SleepTimer = 500
$jobWork = {
    param ($MyInput)
    $webclient = New-Object System.Net.WebClient
    $command = "c:\programms\" + $MyInput
    $link = & $command
    $webclient.OpenRead($link) | Out-Null
    $result = $mutex.WaitOne();
    $file = $webclient.ResponseHeaders["Content-Length"]
    $file >> C:\programms\filesizes
    $mutex.ReleaseMutex();
}
$progdir = "C:\programms"
$items = Get-ChildItem -filter *.ps1 -Path $progdir
$webclient = New-Object System.Net.WebClient
$filesizes = get-content C:\programms\filesizes
Get-Job | Remove-Job -Force
$i = 0
ForEach ($item in $items) {
    While ($(Get-Job -State Running).Count -ge $MaxThreads) {
        Start-Sleep -Milliseconds $SleepTimer
    }
    $i++
    Start-Job -ScriptBlock $jobWork -ArgumentList $item.name | Out-Null
}
You can run each iteration of the loop in a background job, which is not the same as a separate thread: it is a whole other PowerShell.exe process. Data is passed back from the background processes through serialization.
To approach it using background jobs, you'll need to define a script block that does the actual work and then invoke it with parameters in each iteration of the loop. The script block can report status back via Write-Output or by throwing an exception.
You'll probably want to throttle how many concurrent background jobs are running. Here's an example of how to throttle:
$jobItems = "a", "b", "c", "d", "e"
$jobMax = 2
$jobs = @()
$jobWork = {
    param ($MyInput)
    if ($MyInput -eq "d") {
        throw "an example of an error"
    } else {
        Write-Output "Processed $MyInput"
    }
}
foreach ($jobItem in $jobItems) {
    # Wait for a free slot before starting the next job, so no item is skipped
    while (@($jobs | Where-Object { $_.State -eq 'Running' }).Count -ge $jobMax) {
        $jobs | Where-Object { $_.State -eq 'Running' } | Wait-Job -Any | Out-Null
    }
    $jobs += Start-Job -ScriptBlock $jobWork -ArgumentList $jobItem
}
$jobs | Wait-Job
As an alternative you might try eventing. Take a look at this thread for some examples of how to implement concurrency using events.
PowerShell: Runspace problem with DownloadFileAsync
You might be able to replace DownloadFileAsync with OpenReadAsync
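A rough sketch of that eventing idea, still aiming at the Content-Length header; the source identifier and file path mirror the question, and whether ResponseHeaders is already populated inside the handler is an assumption to verify:
$webclient = New-Object System.Net.WebClient
Register-ObjectEvent -InputObject $webclient -EventName OpenReadCompleted -SourceIdentifier "read_$link" -Action {
    # $Sender is the WebClient that raised the event; append its Content-Length to the list
    $Sender.ResponseHeaders["Content-Length"] >> C:\programms\filesizes
} | Out-Null
$webclient.OpenReadAsync([Uri]$link)   # returns immediately; the -Action block runs when the read completes
Concurrent appends from several handlers can still interleave, so the mutex idea from the second edit still applies.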
I have a PowerShell script that does some batch processing on a bunch of images, and I'd like to do some of it in parallel. PowerShell seems to have some background processing options such as Start-Job, Wait-Job, etc., but the only good resource I found for doing parallel work was writing out the text of a script and running that (PowerShell Multithreading).
Ideally, I'd like something akin to parallel foreach in .net 4.
Something pretty seamless, like:
foreach-parallel -threads 4 ($file in (Get-ChildItem $dir))
{
.. Do Work
}
Maybe I'd be better off just dropping down to c#...
You can execute parallel jobs in Powershell 2 using Background Jobs. Check out Start-Job and the other job cmdlets.
# Loop through the server list
Get-Content "ServerList.txt" | %{
    # Define what each job does
    $ScriptBlock = {
        param($pipelinePassIn)
        Test-Path "\\$pipelinePassIn\c`$\Something"
        Start-Sleep 60
    }
    # Execute the jobs in parallel
    Start-Job $ScriptBlock -ArgumentList $_
}
Get-Job
# Wait for it all to complete
While (Get-Job -State "Running")
{
    Start-Sleep 10
}
# Getting the information back from the jobs
Get-Job | Receive-Job
The answer from Steve Townsend is correct in theory but not in practice, as @likwid pointed out. My revised code takes into account the job-context barrier: nothing crosses that barrier by default! The automatic $_ variable can thus be used in the loop but cannot be used directly within the script block, because the script block runs in a separate context created by the job.
To pass values from the parent context to the child context, use the -ArgumentList parameter on Start-Job to send them, and use param inside the script block to receive them.
cls
# Send in two root directory names, one that exists and one that does not.
# Should then get a "True" and a "False" result out the end.
"temp", "foo" | %{
$ScriptBlock = {
# accept the loop variable across the job-context barrier
param($name)
# Show the loop variable has made it through!
Write-Host "[processing '$name' inside the job]"
# Execute a command
Test-Path "\$name"
# Just wait for a bit...
Start-Sleep 5
}
# Show the loop variable here is correct
Write-Host "processing $_..."
# pass the loop variable across the job-context barrier
Start-Job $ScriptBlock -ArgumentList $_
}
# Wait for all to complete
While (Get-Job -State "Running") { Start-Sleep 2 }
# Display output from all jobs
Get-Job | Receive-Job
# Cleanup
Remove-Job *
(I generally like to provide a reference to the PowerShell documentation as supporting evidence but, alas, my search has been fruitless. If you happen to know where context separation is documented, post a comment here to let me know!)
There are so many answers to this these days:
1. jobs (or thread jobs in PS 6/7, or the ThreadJob module for PS 5)
2. start-process
3. workflows (PS 5 only)
4. the PowerShell API with another runspace
5. invoke-command with multiple computers, which can all be localhost (you have to be admin)
6. multiple session (runspace) tabs in the ISE, or remote PowerShell ISE tabs
PowerShell 7 has ForEach-Object -Parallel as an alternative for #4.
Using Start-ThreadJob in PowerShell 5.1. I wish this worked the way I expect, but it doesn't:
# test-netconnection has a miserably long timeout
echo yahoo.com facebook.com |
start-threadjob { test-netconnection $input } | receive-job -wait -auto
WARNING: Name resolution of yahoo.com microsoft.com facebook.com failed
It works this way. Not quite as nice as ForEach-Object -Parallel in PowerShell 7, but it'll do.
echo yahoo.com facebook.com |
% { $_ | start-threadjob { test-netconnection $input } } |
receive-job -wait -auto | ft -a
ComputerName RemotePort RemoteAddress PingSucceeded PingReplyDetails (RTT) TcpTestSucceeded
------------ ---------- ------------- ------------- ---------------------- ----------------
facebook.com 0 31.13.71.36 True 17 ms False
yahoo.com 0 98.137.11.163 True 97 ms False
Here's workflows with literally a foreach -parallel:
workflow work {
    foreach -parallel ($i in 1..3) {
        sleep 5
        "$i done"
    }
}
work
3 done
1 done
2 done
Or a workflow with a parallel block:
function sleepfor($time) { sleep $time; "sleepfor $time done" }
workflow work {
    parallel {
        sleepfor 3
        sleepfor 2
        sleepfor 1
    }
    'hi'
}
work
sleepfor 1 done
sleepfor 2 done
sleepfor 3 done
hi
Here's an api with runspaces example:
$a = [PowerShell]::Create().AddScript{sleep 5;'a done'}
$b = [PowerShell]::Create().AddScript{sleep 5;'b done'}
$c = [PowerShell]::Create().AddScript{sleep 5;'c done'}
$r1,$r2,$r3 = ($a,$b,$c).begininvoke() # run in background
$a.EndInvoke($r1); $b.EndInvoke($r2); $c.EndInvoke($r3) # wait
($a,$b,$c).streams.error # check for errors
($a,$b,$c).dispose() # clean
a done
b done
c done
In PowerShell 7 you can use ForEach-Object -Parallel:
$Message = "Output:"
Get-ChildItem $dir | ForEach-Object -Parallel {
    "$using:Message $_"
} -ThrottleLimit 4
http://gallery.technet.microsoft.com/scriptcenter/Invoke-Async-Allows-you-to-83b0c9f0
I created an Invoke-Async which allows you to run multiple script blocks/cmdlets/functions at the same time. This is great for small jobs (a subnet scan or a WMI query against hundreds of machines) because the difference between the overhead of creating a runspace and the startup time of Start-Job is pretty drastic. It can be used like so.
With a scriptblock:
$sb = [scriptblock] {param($system) gwmi win32_operatingsystem -ComputerName $system | select csname,caption}
$servers = Get-Content servers.txt
$rtn = Invoke-Async -Set $servers -SetParam system -ScriptBlock $sb
With just a cmdlet/function:
$servers = Get-Content servers.txt
$rtn = Invoke-Async -Set $servers -SetParam computername -Params @{count=1} -Cmdlet Test-Connection -ThreadCount 50
Background jobs are expensive to set up and are not reusable. PowerShell MVP Oisin Grehan
has a good example of PowerShell multi-threading.
(As of 10/25/2010 the site is down, but it is accessible via the Web Archive.)
I've used an adapted version of Oisin's script in a data loading routine here:
http://rsdd.codeplex.com/SourceControl/changeset/view/a6cd657ea2be#Invoke-RSDDThreaded.ps1
To complete previous answers, you can also use Wait-Job to wait for all jobs to complete:
For ($i=1; $i -le 3; $i++) {
    $ScriptBlock = {
        Param (
            [string] [Parameter(Mandatory=$true)] $increment
        )
        Write-Host $increment
    }
    Start-Job $ScriptBlock -ArgumentList $i
}
Get-Job | Wait-Job | Receive-Job
If you're using the latest cross-platform PowerShell (which you should, by the way: https://github.com/powershell/powershell#get-powershell), you can append a single & to run scripts in parallel. (Use ; to run them sequentially.)
In my case I needed to run 2 npm scripts in parallel: npm run hotReload & npm run dev
You can also set up npm to use PowerShell for its scripts (by default it uses cmd on Windows).
Run from project root folder: npm config set script-shell pwsh --userconfig ./.npmrc
and then use single npm script command: npm run start
"start":"npm run hotReload & npm run dev"
This has been answered thoroughly. I just want to post, as a reference, this method I created based on PowerShell jobs.
Jobs are passed in as a list of script blocks. They can be parameterized.
Output of the jobs is color-coded and prefixed with a job index (just like in a VS build process, as this will be used in a build).
It can be used to start up multiple servers at a time, run build steps in parallel, and so on. A usage sketch follows the function below.
function Start-Parallel {
    param(
        [ScriptBlock[]]
        [Parameter(Position = 0)]
        $ScriptBlock,
        [Object[]]
        [Alias("arguments")]
        $parameters
    )
    $jobs = $ScriptBlock | ForEach-Object { Start-Job -ScriptBlock $_ -ArgumentList $parameters }
    $colors = "Blue", "Red", "Cyan", "Green", "Magenta"
    $colorCount = $colors.Length
    try {
        while (($jobs | Where-Object { $_.State -ieq "running" } | Measure-Object).Count -gt 0) {
            $jobs | ForEach-Object { $i = 1 } {
                $fgColor = $colors[($i - 1) % $colorCount]
                $out = $_ | Receive-Job
                $out = $out -split [System.Environment]::NewLine
                $out | ForEach-Object {
                    Write-Host "$i> " -NoNewline -ForegroundColor $fgColor
                    Write-Host $_
                }
                $i++
            }
        }
    } finally {
        Write-Host "Stopping Parallel Jobs ..." -NoNewline
        $jobs | Stop-Job
        $jobs | Remove-Job -Force
        Write-Host " done."
    }
}
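For illustration, a small usage sketch; the two script blocks and the "release" argument are made up, while Start-Parallel is the function above:
# Two fake "build steps" run side by side; both receive "release" via -parameters
Start-Parallel -ScriptBlock {
    param($cfg) 1..3 | ForEach-Object { "client build ($cfg) step $_"; Start-Sleep 1 }
}, {
    param($cfg) 1..3 | ForEach-Object { "server build ($cfg) step $_"; Start-Sleep 1 }
} -parameters "release"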
Sample output: each job's lines appear prefixed with its index (1>, 2>, ...) in a distinct color.
There is a new built-in solution in PowerShell 7.0 Preview 3.
PowerShell ForEach-Object Parallel Feature
So you could do:
Get-ChildItem $dir | ForEach-Object -Parallel {
    .. Do Work
    $_ # this will be your file
} -ThrottleLimit 4