How to run commands in parallel in PowerShell - multithreading

Though this might look like a duplicate, it is different.
I have written a PowerShell script to quickly delete folders containing large numbers of files and subfolders. Below is how I attempted parallel processing, with some examples:
The function:
function Parallel-Delete {
    param(
        [Parameter(ValueFromPipeline=$true, Mandatory=$true, Position=0)] [array]$filelist,
        [Parameter(ValueFromPipeline=$true, Mandatory=$true, Position=1)] [int]$number
    )
    0..($filelist.Count - 1) | Where-Object { $_ % 16 -eq $number } | ForEach-Object { Remove-Item -Path $filelist[$_] }
}
Making a test folder and listing its contents:
$test = [string]"C:\test" + $(Get-Random)
md $test | Out-Null
0..10000 | % { ni "${test}\${_}.txt" } | Out-Null
[array]$filelist = (Get-ChildItem -Path $test -File -Force).FullName
Test1:
0..15 | foreach-object {Invoke-Command -ScriptBlock { Parallel-Delete $filelist $_}}
rd $test
I have confirmed the parallel processes are working, but they use the same amount of resources as the single-threaded process:
Test2 (remake the test folder before running new tests):
(Get-Childitem -Path $test -File -Force).Fullname | Foreach {Remove-Item -Path $_}
rd $test
And the deletion speed of 16 parallel processes isn't 16 times that of the single-threaded process as I expected. Here are the results:
Test1:
PS C:\Windows\System32> Measure-Command {0..15 | foreach-object {Invoke-Command -ScriptBlock { Parallel-Delete $filelist $_}}}
Days : 0
Hours : 0
Minutes : 0
Seconds : 36
Milliseconds : 279
Ticks : 362798015
TotalDays : 0.000419905109953704
TotalHours : 0.0100777226388889
TotalMinutes : 0.604663358333333
TotalSeconds : 36.2798015
TotalMilliseconds : 36279.8015
Test2:
PS C:\Windows\System32> Measure-Command {(Get-Childitem -Path $test -File -Force).Fullname | Foreach {Remove-Item -Path $_}}
Days : 0
Hours : 0
Minutes : 0
Seconds : 25
Milliseconds : 980
Ticks : 259802514
TotalDays : 0.000300697354166667
TotalHours : 0.0072167365
TotalMinutes : 0.43300419
TotalSeconds : 25.9802514
TotalMilliseconds : 25980.2514
And I have tried this using Start-Job:
0..15 | foreach-object {Start-Job -ScriptBlock { Parallel-Delete $filelist $_}}
I didn't get what I expected: it successfully started 16 jobs that literally do nothing, so I stopped them all. I have also noticed that the variable $filelist in the "Command" isn't green...
So I don't know whether the function isn't recognized or the variables aren't passed...
And I have tried a method I found here:
Powershell Start-Process to start Powershell session and pass local variables
With this:
$ScriptBlock = {
function Parallel-Delete {...}
}
remake test folder...
$PowerShell=(Get-Process -Id $pid).path
0..15|%{Start-Process -FilePath $PowerShell -ArgumentList "-Command & {$ScriptBlock Parallel-Delete('$filelist $_')}"}
I have successfully started 16 black PowerShell console windows, and all of them show this:
cmdlet Parallel-Delete at command pipeline position 1
Supply values for the following parameters:
number:
It means the number isn't passed, but it also means $filelist is passed successfully (maybe). I have confirmed that Start-Process with a scriptblock works well with one variable, but it fails to pass multiple variables.
I also know about Invoke-Expression, though I haven't tried it yet. Currently I think the Start-Process method is closest to what I want, if I can get it working.
So that's it: how can I run a custom function with multiple parameters in n parallel processes, pass multiple variables to those processes, and have them run concurrently, separately, and independently of each other, so that the execution speed of the parallel processes is n times that of a single-threaded process producing the same outcome?
Will anyone help me, please? Any help would be appreciated. Thanks in advance.
P.S. I use PowerShell 7.1 x64 on Windows 10 20H2.
Update: I have tried foreach -parallel with this:
function Parallel-Delete {
    param(
        [Parameter(ValueFromPipeline=$true, Mandatory=$true, Position=0)] [array]$filelist,
        [Parameter(ValueFromPipeline=$true, Mandatory=$true, Position=1)] [int]$number
    )
    0..($filelist.Count - 1) | Where-Object { $_ % 16 -eq $number } | ForEach-Object { Remove-Item -Path $filelist[$_] }
}
[array]$filelist = (Get-ChildItem -Path "C:\test\0" -File -Force).FullName
0..15|foreach-object -Parallel {
Parallel-Delete $filelist $_
} -ThrottleLimit 16
And it gives me this error message 16 times:
Parallel-Delete:
Line |
2 | Parallel-Delete $filelist $_
| ~~~~~~~~~~~~~~~
| The term 'Parallel-Delete' is not recognized as a name of a cmdlet, function, script file, or executable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
So the function is not parsed, and the variables aren't green...
And now I have tried:
0..15|foreach-object -Parallel {
-begin {Parallel-Delete {$filelist $_}}
} -ThrottleLimit 16
And it just gives me this error:
ParserError:
Line |
2 | -begin {Parallel-Delete {$filelist $_}}
| ~~
| Unexpected token '$_' in expression or statement.
Please help...

For troubleshooting it is always good to exclude items that might interfere with your expectation. To come closer to your answer, it is often good practice to cut your issue in half. In your specific case, ask yourself what you want to prove:
1. PowerShell commands that run in parallel should process faster.
2. My file system appears not to handle asynchronous commands as expected.
With all respect, the responders in this community (including myself) are more interested in the first topic than the second. In fact, if it concerns 2 (the file system), there are a lot of other factors involved (type of file system, hardware, disk caching, etc.) and your topic belongs more to the Super User community.
In other words, let's separate topic 1 from the file system by just processing Start-Sleep 1:
function Parallel-Delete { Start-Sleep 1 }
(Measure-Command { 0..15 | foreach-object { Parallel-Delete } }).TotalMilliseconds
16012.0359
(Measure-Command { 0..15 | foreach-object { Start-Job { Parallel-Delete } } }).TotalMilliseconds
4865.6182
(Measure-Command { 0..15 | foreach-object -Parallel { function Parallel-Delete { Start-Sleep 1 }; Parallel-Delete } }).TotalMilliseconds
4070.8242
Which is about what one might expect from the performance of parallel processes on a 4-core system.
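As for the failing ForEach-Object -Parallel attempt in your update: parallel runspaces do not see the caller's functions or variables, which is why Parallel-Delete is "not recognized". A minimal sketch of a working version (my own illustration, not a tested benchmark) re-creates the function inside each runspace and pulls the array across with $using:
$funcDef = ${function:Parallel-Delete}.ToString()
0..15 | ForEach-Object -Parallel {
    # re-create the function in this runspace from its source text
    ${function:Parallel-Delete} = $using:funcDef
    # pull the caller's variable across the runspace boundary
    Parallel-Delete $using:filelist $_
} -ThrottleLimit 16
Note that for many small files the disk, not the CPU, is usually the bottleneck, so even this corrected version will not be 16 times faster.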

Related

Performance difference between Runspace and Job

I built a file processor script to convert some files to JSON. It works, but not fast enough, so I am multithreading it. I prefer to use runspace pools, since you can specify a max thread limit and it will run that many threads at a time, adding new work as other threads complete, spiffy. But I've found that if I have, say, 6 threads of work to complete, using runspaces takes ~50 minutes and keeps my computer at 40% CPU, while just using Start-Job for each piece of work pegs my computer at 100% CPU and the work completes in 15 minutes. Am I misconfiguring the runspace pool in some way? Here are simplified examples of each:
### Using Start-Job ###
$files = Get-ChildItem C:\temp -Filter '*.xel' # returns 6 items
foreach ($file in $files) {
    # simplified
    Start-Job -ScriptBlock { C:\temp\FileProcessor.ps1 -filepath $using:file.fullname }
}
### Using Runspace Pool ###
$files = Get-ChildItem C:\temp -Filter '*.xel' # returns 6 items
$Code = {
    param ($filepath)
    # simplified
    C:\temp\FileProcessor.ps1 -filepath $filepath
}
$rsPool = [runspacefactory]::CreateRunspacePool(1, 100)
$rsPool.Open()
$threads = @()
foreach ($file in $files) {
    $PSinstance = [powershell]::Create().AddScript($Code).AddArgument($file.FullName)
    $PSinstance.RunspacePool = $rsPool
    $threads += $PSinstance.BeginInvoke()
}
while ($threads.IsCompleted -contains $False) {}
$rsPool.Dispose()
I may also be misunderstanding runspaces compared to jobs, any help is welcome. Thank you!
Jobs use multiple processes...
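You can see that with a quick check (a small sketch, not from the original answer): each job reports its own process ID, while an in-process runspace reports the caller's.
$hostPid = $PID
# Start-Job: each job runs in a separate PowerShell process
1..3 | ForEach-Object { Start-Job { $PID } } | Wait-Job | Receive-Job   # three different PIDs

# Runspace: a thread inside the current process
$ps = [powershell]::Create().AddScript({ $PID })
$ps.Invoke()    # same value as $hostPid
$ps.Dispose()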

using threads to delete files with specific extensions ignoring files over a certain date

In my profession I make forensic images from "foreign" PCs which I extract later on my local storage.
To clean up the data I'd like to delete all files that aren't relevant for me (including, but not limited to: audio, movies, system files, ...).
Since we're speaking of multiple TB of data, I'd hoped to use threads, especially since my storage is all flash, so disk throughput is less of a limitation.
To speed the process up after an initial manual run, I want the script to exclude files older than 1 day (since I have already covered those in the manual run).
What I have so far:
$IncludeFiles = "*.log", "*.sys", "*.avi", "*.mpg", "*.mkv", "*.mp3", "*.mp4",
                "*.mpeg", "*.mov", "*.dll", "*.mof", "*.mui", "*.zvv", "*.wma",
                "*.wav", "*.MPA", "*.MID", "*.M4A", "*.AIF", "*.IFF", "*.M3U",
                "*.3G2", "*.3GP", "*.ASF", "*.FLV", "*.M4V", "*.RM", "*.SWF",
                "*.VOB"
$ScriptBlock = {
    Param($mypath = "D:\")
    Get-ChildItem -Path $mypath -Recurse -File -Include $file | Where-Object {
        $_.CreationTime -gt (Get-Date).AddDays(-1)
    }
}
foreach ($file in $IncludeFiles) {
    Start-Job -ScriptBlock $ScriptBlock -ArgumentList $file
}
Get-Job | Wait-Job
$out = Get-Job | Receive-Job
Write-Host $out
The only thing that doesn't work is the restriction to files "younger" than 1 day. If I run the script without it, it seems to work perfectly (it gives me a list of files with the extensions I want to remove).
Parameter passing doesn't work the way you seem to expect. Param($mypath = "D:\") defines a parameter mypath with a default value of D:\. That default value is superseded by the value you pass into the scriptblock via -ArgumentList. Also, the variable $file inside the scriptblock and the variable $file outside the scriptblock are not the same. Because of that, the invocation
Start-Job -ScriptBlock $ScriptBlock -ArgumentList '*.log'
will run the command
Get-ChildItem -Path '*.log' -Recurse -File -Include $null | ...
Change your code to something like this to make it work:
$ScriptBlock = {
    Param($extension)
    $mypath = "D:\"
    Get-ChildItem -Path $mypath -Recurse -File -Filter $extension | Where-Object {
        $_.CreationTime -gt (Get-Date).AddDays(-1)
    }
}
foreach ($file in $IncludeFiles) {
    Start-Job -ScriptBlock $ScriptBlock -ArgumentList $file
}
Get-Job | Wait-Job | Receive-Job
Using -Filter should provide better performance than -Include, but accepts only a single string (not a list of strings like -Include), so you can only filter one extension at a time.
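Once the listing looks right, the results can be piped on to the actual deletion; a small sketch (the -WhatIf switch is my addition, for a safe dry run):
Get-Job | Wait-Job | Receive-Job |
    ForEach-Object { Remove-Item -LiteralPath $_.FullName -WhatIf }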

Multi-threaded powershell script on single processor system

I have access to a single-core single-processor VM with which to do logging for my team. I have the following code:
$sb = {
    Param($_)
    if ($_.CONTROLLER -ne ".xx") {
        $posIP = "10." + $_.IP + $_.CONTROLLER
        if (Test-Connection -ComputerName $posIP -Count 1 -Quiet) {
            $mapPath = "\\" + $posIP + "\c$"
            net use $mapPath $password /user:$userName | Out-Null
            if (Test-Path $mapPath$dataFile) {
                [xml]$periods = Get-Content $mapPath$dataFile
                $endDate = $periods.IndataDbf.ingredient.PeriodDetail.PeriodEndDate | select -Last 1
                $output = "$($_.STORE);$endDate"
            }
            else {
                $outPut = $_.STORE + ';' + "$dataFile Not Found"
            }
            net use $mapPath /de | Out-Null
        }
        else {
            $outPut = $_.STORE + ';' + "Map FAILED"
        }
        Write-Output $OutPut
    }
}
Import-Csv $inFile | ForEach-Object {
    while ((Get-Job -State Running).Count -ge 100) {
        Start-Sleep -Seconds 5
    }
    Write-Output $_.STORE
    Start-Job -Scriptblock $sb -ArgumentList $_ | Write-Verbose
    Get-Job -State Completed -HasMoreData 1 | Receive-Job | Out-File -Append -FilePath $outLog
}
Get-Job | Wait-Job | Receive-Job | Out-File -Append -FilePath $outLog
Which runs well, but takes the same amount of time as running the same code without Start-Job, just in a loop. However, the previous logging command used BATCH files and automatically opened a couple dozen child command windows to process the data, then return, and it runs in under half the time. The code used is the same, so I don't understand why adding more threads didn't make the script run faster. Can anyone tell me why a BATCH file program with a couple dozen child windows runs so much faster with arguably the same code? And why does the Start-Job command not improve the speed at all? I would think it would try to execute multiple threads simultaneously.
Because there is a lot of overhead when using Start-Job, and whenever you use the pipeline.
If you use runspaces instead, it may be faster. Take a look at http://newsqlblog.com/2012/05/22/concurrency-in-powershell-multi-threading-with-runspaces/
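For example, a minimal runspace-pool version of the loop above might look like this (a sketch only, assuming $sb, $inFile and $outLog are as defined in the question):
$pool = [runspacefactory]::CreateRunspacePool(1, 100)
$pool.Open()
$work = foreach ($row in (Import-Csv $inFile)) {
    # queue each CSV row on a pool thread instead of a separate process
    $ps = [powershell]::Create().AddScript($sb).AddArgument($row)
    $ps.RunspacePool = $pool
    [pscustomobject]@{ Shell = $ps; Handle = $ps.BeginInvoke() }
}
foreach ($w in $work) {
    # EndInvoke blocks until that pipeline is done and returns its output
    $w.Shell.EndInvoke($w.Handle) | Out-File -Append -FilePath $outLog
    $w.Shell.Dispose()
}
$pool.Dispose()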

Powershell Throttle Multi thread jobs via job completion

All the tutorials I have found use a predefined sleep time to throttle jobs.
I need the throttle to wait until a job is completed before starting a new one.
Only 4 jobs can be running at one time.
So the script will spin up 4 and currently pauses for 10 seconds, then spins up the rest.
What I want is for the script to only allow 4 jobs to be running at one time, and as a job is completed a new one is kicked off.
Jobs are initialised via a list of server names.
Is it possible to achieve this?
$servers = Get-Content "C:\temp\flashfilestore\serverlist.txt"
$scriptBlock = { #DO STUFF }
$MaxThreads = 4
foreach ($server in $servers) {
    Start-Job -ScriptBlock $scriptBlock -ArgumentList $server
    While ($(Get-Job -State 'Running').Count -ge $MaxThreads) {
        sleep 10 # Need this to wait until a job is complete and kick off a new one.
    }
}
Get-Job | Wait-Job | Receive-Job
You can test the following:
$servers = Get-Content "C:\temp\flashfilestore\serverlist.txt"
$scriptBlock = { #DO STUFF }
invoke-command -computerName $servers -scriptblock $scriptBlock -jobname 'YourJobSpecificName' -throttlelimit 4 -AsJob
This command uses the Invoke-Command cmdlet and its AsJob parameter to start a background job that runs a scriptblock on numerous computers. Because the command must not be run more than 4 times concurrently, the command uses the ThrottleLimit parameter of Invoke-Command to limit the number of concurrent commands to 4.
Be careful: the file must contain the names of computers in a domain.
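To collect the output afterwards, you can wait on the named job (a small usage sketch):
Get-Job -Name 'YourJobSpecificName' | Wait-Job | Receive-Job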
In order to avoid reinventing the wheel, I would recommend using one of the existing tools.
One of them is the script Invoke-Parallel.ps1.
It is written in PowerShell, so you can see how it is implemented directly. It is easy to get, and it does not require any installation.
Another one is the module SplitPipeline.
It may work faster because it is written in C#. It also covers some more use cases, for example slow or infinite input, and the use of initialization and cleanup scripts.
With SplitPipeline, the code with 4 parallel pipelines will be:
$servers | Split-Pipeline -Count 4 {process{ <# DO STUFF on $_ #> }}
I wrote a blog article which covers multithreading any given script via actual threads. You can find the full post here:
http://www.get-blog.com/?p=189
The basic setup is:
# Create a runspace pool holding up to $MaxThreads concurrent runspaces
$ISS = [system.management.automation.runspaces.initialsessionstate]::CreateDefault()
$RunspacePool = [runspacefactory]::CreateRunspacePool(1, $MaxThreads, $ISS, $Host)
$RunspacePool.Open()
# Load the script to run and start it on a pool thread
$Code = [ScriptBlock]::Create($(Get-Content $FileName))
$PowershellThread = [powershell]::Create().AddScript($Code)
$PowershellThread.RunspacePool = $RunspacePool
$Handle = $PowershellThread.BeginInvoke()
# Keep the handle, the [powershell] instance, and a label together
$Job = "" | Select-Object Handle, Thread, object
$Job.Handle = $Handle
$Job.Thread = $PowershellThread
$Job.Object = $Object.ToString()
# Later: wait for the thread to finish and clean up
$Job.Thread.EndInvoke($Job.Handle)
$Job.Thread.Dispose()
Instead of sleep 10 you could also just wait on a job (-any job):
Get-Job | Wait-Job -Any | Out-Null
When there are no more jobs to kick off, start printing the output. You can also do this within the loop immediately after the above command. The script will receive jobs as they finish instead of waiting until the end.
Get-Job -State Completed | % {
Receive-Job $_ -AutoRemoveJob -Wait
}
So your script would look like this:
$servers = Get-Content "C:\temp\flashfilestore\serverlist.txt"
$scriptBlock = { #DO STUFF }
$MaxThreads = 4
foreach ($server in $servers) {
    Start-Job -ScriptBlock $scriptBlock -ArgumentList $server
    While ($(Get-Job -State Running).Count -ge $MaxThreads) {
        Get-Job | Wait-Job -Any | Out-Null
    }
    Get-Job -State Completed | % {
        Receive-Job $_ -AutoRemoveJob -Wait
    }
}
While ($(Get-Job -State Running).Count -gt 0) {
    Get-Job | Wait-Job -Any | Out-Null
}
Get-Job -State Completed | % {
    Receive-Job $_ -AutoRemoveJob -Wait
}
Having said all that, I prefer runspaces (similar to Ryan's post), or even workflows if you can use them. These are far less resource-intensive than starting multiple PowerShell processes.
Your script looks good, try and add something like
Write-Host ("current count:" + ($(Get-Job -State 'Running').Count) + " on server:" + $server)
after your while loop to work out whether the job count is going down where you wouldn't expect it.
I noticed that every Start-Job command resulted in an additional conhost.exe process in the task manager. Knowing this, I was able to throttle using the following logic, where 5 is my desired number of concurrent threads (so I use 4 in my -gt statement since I am looking for a count greater than):
while((Get-Process conhost -ErrorAction SilentlyContinue).Count -gt 4){Start-Sleep -Seconds 1}

Can Powershell Run Commands in Parallel?

I have a PowerShell script to do some batch processing on a bunch of images and I'd like to do some parallel processing. PowerShell seems to have some background-processing options such as Start-Job, Wait-Job, etc., but the only good resource I found for doing parallel work was writing the text of a script out and running those (PowerShell Multithreading).
Ideally, I'd like something akin to parallel foreach in .net 4.
Something pretty seamless like:
foreach-parallel -threads 4 ($file in (Get-ChildItem $dir))
{
.. Do Work
}
Maybe I'd be better off just dropping down to c#...
You can execute parallel jobs in Powershell 2 using Background Jobs. Check out Start-Job and the other job cmdlets.
# Loop through the server list
Get-Content "ServerList.txt" | % {
    # Define what each job does
    $ScriptBlock = {
        param($pipelinePassIn)
        Test-Path "\\$pipelinePassIn\c`$\Something"
        Start-Sleep 60
    }
    # Execute the jobs in parallel
    Start-Job $ScriptBlock -ArgumentList $_
}
Get-Job
# Wait for it all to complete
While (Get-Job -State "Running") {
    Start-Sleep 10
}
# Getting the information back from the jobs
Get-Job | Receive-Job
The answer from Steve Townsend is correct in theory but not in practice, as @likwid pointed out. My revised code takes into account the job-context barrier; nothing crosses that barrier by default! The automatic $_ variable can thus be used in the loop, but cannot be used directly within the script block, because it is inside a separate context created by the job.
To pass variables from the parent context to the child context, use the -ArgumentList parameter on Start-Job to send it and use param inside the script block to receive it.
cls
# Send in two root directory names, one that exists and one that does not.
# Should then get a "True" and a "False" result out the end.
"temp", "foo" | % {
    $ScriptBlock = {
        # accept the loop variable across the job-context barrier
        param($name)
        # Show the loop variable has made it through!
        Write-Host "[processing '$name' inside the job]"
        # Execute a command
        Test-Path "\$name"
        # Just wait for a bit...
        Start-Sleep 5
    }
    # Show the loop variable here is correct
    Write-Host "processing $_..."
    # pass the loop variable across the job-context barrier
    Start-Job $ScriptBlock -ArgumentList $_
}
# Wait for all to complete
While (Get-Job -State "Running") { Start-Sleep 2 }
# Display output from all jobs
Get-Job | Receive-Job
# Cleanup
Remove-Job *
(I generally like to provide a reference to the PowerShell documentation as supporting evidence but, alas, my search has been fruitless. If you happen to know where context separation is documented, post a comment here to let me know!)
There are so many answers to this these days:
1. jobs (or thread jobs in PS 6/7, or the ThreadJob module for PS 5)
2. start-process
3. workflows (PS 5 only)
4. powershell api with another runspace
5. invoke-command with multiple computers, which can all be localhost (have to be admin)
6. multiple session (runspace) tabs in the ISE, or remote powershell ISE tabs
PowerShell 7 has ForEach-Object -Parallel as an alternative to #4.
Using start-threadjob in powershell 5.1. I wish this worked like I expect, but it doesn't:
# test-netconnection has a miserably long timeout
echo yahoo.com facebook.com |
start-threadjob { test-netconnection $input } | receive-job -wait -auto
WARNING: Name resolution of yahoo.com microsoft.com facebook.com failed
It works this way. Not quite as nice as foreach-object -parallel in PowerShell 7, but it'll do.
echo yahoo.com facebook.com |
% { $_ | start-threadjob { test-netconnection $input } } |
receive-job -wait -auto | ft -a
ComputerName RemotePort RemoteAddress PingSucceeded PingReplyDetails (RTT) TcpTestSucceeded
------------ ---------- ------------- ------------- ---------------------- ----------------
facebook.com 0          31.13.71.36   True          17 ms                  False
yahoo.com    0          98.137.11.163 True          97 ms                  False
Here's workflows with literally a foreach -parallel:
workflow work {
foreach -parallel ($i in 1..3) {
sleep 5
"$i done"
}
}
work
3 done
1 done
2 done
Or a workflow with a parallel block:
function sleepfor($time) { sleep $time; "sleepfor $time done"}
workflow work {
parallel {
sleepfor 3
sleepfor 2
sleepfor 1
}
'hi'
}
work
sleepfor 1 done
sleepfor 2 done
sleepfor 3 done
hi
Here's an api with runspaces example:
$a = [PowerShell]::Create().AddScript{sleep 5;'a done'}
$b = [PowerShell]::Create().AddScript{sleep 5;'b done'}
$c = [PowerShell]::Create().AddScript{sleep 5;'c done'}
$r1,$r2,$r3 = ($a,$b,$c).begininvoke() # run in background
$a.EndInvoke($r1); $b.EndInvoke($r2); $c.EndInvoke($r3) # wait
($a,$b,$c).streams.error # check for errors
($a,$b,$c).dispose() # clean
a done
b done
c done
In Powershell 7 you can use ForEach-Object -Parallel
$Message = "Output:"
Get-ChildItem $dir | ForEach-Object -Parallel {
"$using:Message $_"
} -ThrottleLimit 4
http://gallery.technet.microsoft.com/scriptcenter/Invoke-Async-Allows-you-to-83b0c9f0
I created an Invoke-Async which allows you to run multiple script blocks/cmdlets/functions at the same time. This is great for small jobs (a subnet scan or a WMI query against hundreds of machines) because the overhead of creating a runspace vs. the startup time of Start-Job is pretty drastic. It can be used like so:
With a scriptblock:
$sb = [scriptblock] {param($system) gwmi win32_operatingsystem -ComputerName $system | select csname,caption}
$servers = Get-Content servers.txt
$rtn = Invoke-Async -Set $servers -SetParam system -ScriptBlock $sb
With just a cmdlet/function:
$servers = Get-Content servers.txt
$rtn = Invoke-Async -Set $servers -SetParam computername -Params @{count=1} -Cmdlet Test-Connection -ThreadCount 50
Background jobs are expensive to set up and are not reusable. PowerShell MVP Oisin Grehan has a good example of PowerShell multi-threading.
(10/25/2010: the site is down, but accessible via the Web Archive.)
I've used an adapted version of Oisin's script in a data loading routine here:
http://rsdd.codeplex.com/SourceControl/changeset/view/a6cd657ea2be#Invoke-RSDDThreaded.ps1
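The reuse point is easy to illustrate (my own sketch, not Grehan's code): a single [powershell] instance and its runspace can be invoked repeatedly, while every Start-Job pays full process startup again.
$ps = [powershell]::Create()
foreach ($n in 1..3) {
    $ps.Commands.Clear()                 # reuse the same runspace for each run
    [void]$ps.AddScript("'run $n'")
    $ps.Invoke()
}
$ps.Dispose()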
To complete previous answers, you can also use Wait-Job to wait for all jobs to complete:
For ($i = 1; $i -le 3; $i++) {
    $ScriptBlock = {
        Param (
            [string] [Parameter(Mandatory=$true)] $increment
        )
        Write-Host $increment
    }
    Start-Job $ScriptBlock -ArgumentList $i
}
Get-Job | Wait-Job | Receive-Job
If you're using the latest cross-platform PowerShell (which you should, btw) https://github.com/powershell/powershell#get-powershell, you can add a single & to run parallel scripts. (Use ; to run them sequentially.)
In my case I needed to run 2 npm scripts in parallel: npm run hotReload & npm run dev
You can also set up npm to use PowerShell for its scripts (by default it uses cmd on Windows).
Run from the project root folder: npm config set script-shell pwsh --userconfig ./.npmrc
and then use a single npm script command: npm run start
"start": "npm run hotReload & npm run dev"
This has been answered thoroughly. Just want to post this method I have created, based on PowerShell jobs, as a reference.
Jobs are passed in as a list of script blocks. They can be parameterized.
Output of the jobs is color-coded and prefixed with a job index (just like in a VS build process, as this will be used in a build).
It can be used to start up multiple servers at a time, or to run build steps in parallel, and so on.
function Start-Parallel {
    param(
        [ScriptBlock[]]
        [Parameter(Position = 0)]
        $ScriptBlock,

        [Object[]]
        [Alias("arguments")]
        $parameters
    )
    $jobs = $ScriptBlock | ForEach-Object { Start-Job -ScriptBlock $_ -ArgumentList $parameters }
    $colors = "Blue", "Red", "Cyan", "Green", "Magenta"
    $colorCount = $colors.Length
    try {
        while (($jobs | Where-Object { $_.State -ieq "running" } | Measure-Object).Count -gt 0) {
            $jobs | ForEach-Object { $i = 1 } {
                $fgColor = $colors[($i - 1) % $colorCount]
                $out = $_ | Receive-Job
                $out = $out -split [System.Environment]::NewLine
                $out | ForEach-Object {
                    Write-Host "$i> " -NoNewline -ForegroundColor $fgColor
                    Write-Host $_
                }
                $i++
            }
        }
    } finally {
        Write-Host "Stopping Parallel Jobs ..." -NoNewline
        $jobs | Stop-Job
        $jobs | Remove-Job -Force
        Write-Host " done."
    }
}
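A hypothetical invocation (the script blocks here are made up for illustration):
Start-Parallel -ScriptBlock {
    param($name) "starting $name"; Start-Sleep 2; "$name done"
}, {
    param($name) "building $name"; Start-Sleep 1; "$name done"
} -parameters "demo"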
There is a new built-in solution in PowerShell 7.0 Preview 3.
PowerShell ForEach-Object Parallel Feature
So you could do:
Get-ChildItem $dir | ForEach-Object -Parallel {
    .. Do Work
    $_ # this will be your file
} -ThrottleLimit 4
