Access local function from a Start-Job scriptblock [duplicate]

This question already has answers here:
Reuse 2 functions in Start-ThreadJob
(1 answer)
How to pass a custom function inside a ForEach-Object -Parallel
(3 answers)
Closed last month.
I have some PowerShell functions I want to modify to use multi-threading.
They follow a pattern where local functions are defined in the Begin block and consumed in the Process block.
When I use Start-Job to multi-thread, I can no longer access the local functions.
Is there a way of scoping the local functions to make them available?
Or is there a better pattern to use for multi-threading advanced functions?
Note: I am using PowerShell 5, so I can't use ForEach-Object -Parallel.
EXAMPLE
function Test-Jobs {
    [CmdletBinding()]
    param (
        [Parameter(ValueFromPipeline=$true)]
        [string[]]$ComputerName
    )
    Begin {
        Write-Host "Begin Block"
        function DoThing {
            param($ComputerName)
            # may be a long running function
            Start-Sleep -Seconds 5
            "Did a thing on $ComputerName"
        }
        $Jobs = @()
    }
    Process {
        Write-Host "Process Block"
        foreach ($Name in $ComputerName) {
            $Jobs += Start-Job -ScriptBlock {
                DoThing $Using:Name
            }
        }
    }
    End {
        Write-Host "End Block"
        Wait-Job $Jobs
        Receive-Job $Jobs
    }
}
OUTPUT
The term 'DoThing' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is
correct and try again.
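For reference, the answers further down all boil down to recreating the function inside the job's session. A sketch of how that could look for the example above, using the [scriptblock]::Create / $function: drive approach shown in those answers (not the only option):

# Sketch only: rebuild DoThing inside each job via -InitializationScript,
# reading the local function's body from the $function: drive.
function Test-Jobs {
    [CmdletBinding()]
    param (
        [Parameter(ValueFromPipeline=$true)]
        [string[]]$ComputerName
    )
    Begin {
        function DoThing {
            param($ComputerName)
            Start-Sleep -Seconds 5
            "Did a thing on $ComputerName"
        }
        # the job runs in a separate process, so hand it the function's source
        $InitDoThing = [scriptblock]::Create("function DoThing { $function:DoThing }")
        $Jobs = @()
    }
    Process {
        foreach ($Name in $ComputerName) {
            $Jobs += Start-Job -InitializationScript $InitDoThing -ScriptBlock {
                DoThing $Using:Name
            }
        }
    }
    End {
        $Jobs | Wait-Job | Receive-Job
        $Jobs | Remove-Job
    }
}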

Related

How to search for a string in multiple text files in an active running log

I am trying to search for a string in multiple text files to trigger an event. The log file is being actively added to by a program. The following script successfully achieves that goal, but it only works for one text file at a time:
$PSDefaultParameterValues = @{"Get-Date:format"="yyyy-MM-dd HH:mm:ss"}
Get-Content -Path "C:\Log 0.txt" -Tail 1 -Wait | ForEach-Object {
    If ($_ -match 'keyword') {
        Write-Host "Down : $_" -ForegroundColor Green
        Add-Content "C:\log.txt" "$(Get-Date) down"
    }
}
Unfortunately it means I have to run 3 instances of this script to search the 3 log files (C:\log 0.txt, C:\log 1.txt and C:\log 2.txt).
What I want to do is run one PowerShell script that searches for that string across all three text files, rather than three separate scripts.
I tried using a wildcard in the path ("C:\log*.txt").
I also tried adding a foreach loop:
$PSDefaultParameterValues = @{"Get-Date:format"="yyyy-MM-dd HH:mm:ss"}
$LogGroup = ('C:\log 0.txt', 'C:\Log 1.txt', 'C:\Log 2.txt')
ForEach ($log in $LogGroup) {
    Get-Content $log -Tail 1 -Wait | ForEach-Object {
        If ($_ -match 'keyword') {
            Write-Host "Down: $_" -ForegroundColor Green
            Add-Content -Path "C:\log.txt" "$(Get-Date) down"
        }
    }
}
This got me no errors but it also didn't work.
I saw others use Get-ChildItem instead of Get-Content but since this worked with one file... shouldn't it work with multiple? I assume it's my lack of scripting ability. Any help would be appreciated. Thanks.
This is how you can apply the logic you already have for one file to multiple logs at the same time. The concept is to spawn one PowerShell instance per log path in the $LogGroup array; each instance monitors its assigned log path and, when the keyword is matched, appends to the main log file.
The instances share the same RunspacePool, which lets us initialize all of them with a SemaphoreSlim instance that ensures thread safety (only one thread can write to the main log at a time).
using namespace System.Management.Automation.Runspaces
using namespace System.Threading

# get the log files here
$LogGroup = ('C:\log 0.txt', 'C:\Log 1.txt', 'C:\Log 2.txt')

# this helps us write to the main log file in a thread-safe manner
$lock = [SemaphoreSlim]::new(1, 1)

# define the logic used for each thread; this is very similar to the
# initial script except for the use of the SemaphoreSlim
$action = {
    param($path)

    $PSDefaultParameterValues = @{ "Get-Date:format" = "yyyy-MM-dd HH:mm:ss" }
    Get-Content $path -Tail 1 -Wait | ForEach-Object {
        if($_ -match 'down') {
            # can I write to this file?
            $lock.Wait()
            try {
                Write-Host "Down: $_ - $path" -ForegroundColor Green
                Add-Content "path\to\mainLog.txt" -Value "$(Get-Date) Down: $_ - $path"
            }
            finally {
                # release the lock so other threads can write to the file
                $null = $lock.Release()
            }
        }
    }
}

try {
    $iss = [initialsessionstate]::CreateDefault2()
    $iss.Variables.Add([SessionStateVariableEntry]::new('lock', $lock, $null))
    $rspool = [runspacefactory]::CreateRunspacePool(1, $LogGroup.Count, $iss, $Host)
    $rspool.ApartmentState = [ApartmentState]::STA
    $rspool.ThreadOptions = [PSThreadOptions]::UseNewThread
    $rspool.Open()

    $res = foreach($path in $LogGroup) {
        $ps = [powershell]::Create($iss).AddScript($action).AddArgument($path)
        $ps.RunspacePool = $rspool

        @{
            Instance    = $ps
            AsyncResult = $ps.BeginInvoke()
        }
    }

    # block the main thread
    do {
        $id = [WaitHandle]::WaitAny($res.AsyncResult.AsyncWaitHandle, 200)
    }
    while($id -eq [WaitHandle]::WaitTimeout)
}
finally {
    # clean all the runspaces
    $res.Instance.ForEach('Dispose')
    $rspool.ForEach('Dispose')
}

PowerShell function -WhatIf for a cmdlet without -WhatIf support?

For the cmdlets below, I tried creating the function with the [CmdletBinding(SupportsShouldProcess)] line, but I'm not sure it will work.
Using: https://learn.microsoft.com/en-us/powershell/module/msonline/set-msoluserlicense?view=azureadps-1.0
How can the script be modified to support the -WhatIf parameter?
function Remove-License {
    [CmdletBinding(SupportsShouldProcess)]
    param ([String] $UserPrincipalName)

    $AssignedLicense = (Get-MsolUser -UserPrincipalName $UserPrincipalName).licenses.AccountSkuId
    $AssignedLicense |
        ForEach-Object {
            Write-Host "Removing $($UserPrincipalName) License $($AssignedLicense)..." -ForegroundColor Red
            Set-MsolUserLicense -UserPrincipalName $upn -RemoveLicenses $_ -Verbose
        }
}

Remove-License -UserPrincipalName 'User.Name@domain.com' -WhatIf
You can use $PSCmdlet.ShouldProcess
if($PSCmdlet.ShouldProcess("Some text to display")){
    Set-MsolUserLicense -UserPrincipalName $upn -RemoveLicenses $_ -Verbose
}
MS Docs:
https://learn.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-shouldprocess?view=powershell-7.2
To complement guiwhatsthat's helpful answer: if you want your function to support common parameters, including -WhatIf as well as -Confirm, $PSCmdlet.ShouldProcess is how you do it.
If instead you want Set-MsolUserLicense itself to support it, you would need to create a proxy command / proxy function around this cmdlet to extend its functionality.
These blogs demonstrate how to do it:
https://devblogs.microsoft.com/scripting/proxy-functions-spice-up-your-powershell-core-cmdlets/
https://devblogs.microsoft.com/powershell/extending-andor-modifing-commands-with-proxies/
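For reference, a proxy-function skeleton can be generated with the ProxyCommand class; a rough sketch (it assumes the MSOnline module is installed so Get-Command can resolve the cmdlet, and the output file name is arbitrary):

# Sketch: generate a proxy-function skeleton for Set-MsolUserLicense
$cmd  = Get-Command Set-MsolUserLicense
$meta = [System.Management.Automation.CommandMetadata]::new($cmd)
$meta.SupportsShouldProcess = $true    # make the proxy advertise -WhatIf / -Confirm
$body = [System.Management.Automation.ProxyCommand]::Create($meta)

# Wrap the generated body in a function definition; afterwards, edit the generated
# process block to guard the real call with $PSCmdlet.ShouldProcess(...)
"function Set-MsolUserLicense {`n$body`n}" | Set-Content .\SetMsolUserLicenseProxy.ps1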
To address some issues in your code: note that -RemoveLicenses takes string[] as input, which means you can pass the whole array of licenses to remove in a single call.
You could use Write-Verbose instead of Write-Host to display information about what the function is doing (since your function already supports -Verbose, this would be logical). Also, -Verbose is permanently enabled on the Set-MsolUserLicense call, which can be confusing if someone using your function does not want to see verbose messages (this is addressed in the example below).
You can also set ConfirmImpact to High; that way the function will always ask for confirmation before removing any licenses, assuming -WhatIf was not used. -Confirm:$false then becomes the way to suppress those confirmation prompts.
function Remove-License {
    [CmdletBinding(SupportsShouldProcess, ConfirmImpact = 'High')]
    param(
        [Parameter(ValueFromPipeline, ValueFromPipelineByPropertyName)]
        [string] $UserPrincipalName
    )

    process {
        if($PSCmdlet.ShouldProcess([string] $UserPrincipalName, 'Remove License')) {
            $licenses = (Get-MsolUser -UserPrincipalName $UserPrincipalName).licenses.AccountSkuId
            $param = @{
                UserPrincipalName = $UserPrincipalName
                RemoveLicenses    = $licenses
                Verbose           = $PSBoundParameters['Verbose']
            }
            Set-MsolUserLicense @param
        }
    }
}
Now that the function supports pipeline processing, you can process multiple users by piping other cmdlets into it, e.g.:
Get-MsolUser -EnabledFilter DisabledOnly -MaxResults 5 | Remove-License -WhatIf

PowerShell concurrency with Start-ThreadJob and ForEach-Object -Parallel

I've been trying to implement a producer-consumer pattern with multiple producers using BlockingCollection<>, Start-ThreadJob and ForEach-Object -Parallel. The results were mixed: some code runs, some freezes, and some just crashes PowerShell. So I'm thinking I must be doing something fundamentally wrong:
using namespace System.Collections.Concurrent

class TestProducerConsumer
{
    [int] $result = 0
    [BlockingCollection[int]] $queue =
        [BlockingCollection[int]]::new()

    [void] producer([int]$i) { $this.queue.Add($i) }

    [void] consumer() {
        $sum = 0
        $it = $this.queue.GetConsumingEnumerable()
        foreach( $i in $it ) { $sum += $i }
        $this.result = $sum
    }

    [void] Run() {
        $job = Start-ThreadJob { ($using:this).consumer() }
        1..10 | ForEach-Object -Parallel {
            #($using:this).producer($_) # freezing
            ($using:this).queue.Add($_) # working
        }
        #Start-Sleep -Seconds 1 # freezing
        $this.queue.CompleteAdding()
        $job | Receive-Job -Wait
    }
}

$t = [TestProducerConsumer]::new(); $t.Run(); $t
In the simplified test case above, there are two lines doing the same thing: One is getting the queue member from the instance and adding the value directly; the other is calling a method on the instance to add the value to the queue. The former works, the latter freezes!?
Also, adding back in the line with Start-Sleep freezes the process.
Tested on Windows 10 with various PowerShell 7.* versions.
EDIT: Probably related to "ForEach-Object -Parallel situationally drops pipeline input" and similar issues.

Calling a user-defined function inside the scriptblock in PowerShell [duplicate]

How do I start a job for a function I just defined?

function FOO { write-host "HEY" }
Start-Job -ScriptBlock { FOO } | Receive-Job

Receive-Job : The term 'FOO' is not recognized as the name of a cmdlet, function, script file, or operable program.
What do I do?
Thanks.
As @Shay points out, FOO needs to be defined for the job. Another way to do this is to use the -InitializationScript parameter to prepare the session.
For your example:
$functions = {
    function FOO { write-host "HEY" }
}

Start-Job -InitializationScript $functions -ScriptBlock {FOO} |
    Wait-Job | Receive-Job
This can be useful if you want to use the same functions for different jobs.
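For example, a small sketch reusing the same $functions block from above for two different jobs:

# The same initialization script seeds both jobs
$job1 = Start-Job -InitializationScript $functions -ScriptBlock { FOO }
$job2 = Start-Job -InitializationScript $functions -ScriptBlock { FOO; FOO }
$job1, $job2 | Wait-Job | Receive-Job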
@Rynant's suggestion of InitializationScript is great.
I thought the purpose of (script) blocks is so that you can pass them around. So depending on how you are doing it, I would say go for:
$FOO = {write-host "HEY"}
Start-Job -ScriptBlock $FOO | Wait-Job | Receive-Job
Of course you can parameterize script blocks as well:
$foo = {param($bar) write-host $bar}
Start-Job -ScriptBlock $foo -ArgumentList "HEY" | wait-job | receive-job
It worked for me as:
Start-Job -ScriptBlock ${Function:FOO}
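For context, ${Function:FOO} exposes the function's body as a script block, so arguments can still be passed with -ArgumentList; a quick sketch (the $Name parameter is only illustrative):

# ${Function:FOO} is the function's script block, so -ArgumentList binds to its parameters
function FOO { param($Name) "HEY $Name" }
Start-Job -ScriptBlock ${Function:FOO} -ArgumentList 'World' | Wait-Job | Receive-Job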
An improvement to @Rynant's answer:
You can define the function as normal in the main body of your script:
Function FOO
{
Write-Host "HEY"
}
and then recycle this definition within a scriptblock:
$export_functions = [scriptblock]::Create(@"
Function Foo { $function:FOO }
"@)
(makes more sense if you have a substantial function body) and then pass them to Start-Job as above:
Start-Job -ScriptBlock {FOO} -InitializationScript $export_functions | Wait-Job | Receive-Job
I like this way, as it is easier to debug jobs by running them locally under the debugger.
The function needs to be inside the scriptblock:
Start-Job -ScriptBlock { function FOO { write-host "HEY" } ; FOO } | Wait-Job | Receive-Job
A slightly different take: a function is just a scriptblock assigned to a variable. Note that it has to be a thread job; it can't be ForEach-Object -Parallel.
$func = { 'hi' } # or
function hi { 'hi' }; $func = $function:hi
start-threadjob { & $using:func } | receive-job -auto -wait
hi
@Ben Power's comment under the accepted answer was my concern also, so I googled how to get function definitions and found Get-Command, though this gets only the function body. But it can also be used if the function comes from elsewhere, like a dot-sourced file. So I came up with the following (bear with my naming convention :)); the idea is to rebuild the function definitions, delimited by newlines:
Filter Greeting {param ([string]$Greeting) return $Greeting}
Filter FullName {param ([string]$FirstName, [string]$LastName) return $FirstName + " " + $LastName}
$ScriptText = ""
$ScriptText += "Filter Greeting {" + (Get-Command Greeting).Definition + "}`n"
$ScriptText += "Filter FullName {" + (Get-Command FullName).Definition + "}`n"
$Job = Start-Job `
-InitializationScript $([ScriptBlock]::Create($ScriptText)) `
-ScriptBlock {(Greeting -Greeting "Hello") + " " + (FullName -FirstName "PowerShell" -LastName "Programmer")}
$Result = $Job | Wait-Job | Receive-Job
$Result
$Job | Remove-Job
As long as the function passed to the -InitializationScript parameter of Start-Job isn't large, Rynant's answer will work, but if the function is large you may run into the error below.
[localhost] There is an error launching the background process. Error
reported: "The filename or extension is too long"
Capturing the function's definition and then using Invoke-Expression on it in the ScriptBlock is a better alternative.
function Get-Foo {
    param
    (
        [string]$output
    )
    Write-Output $output
}

$getFooFunc = $(Get-Command Get-Foo).Definition

Start-Job -ScriptBlock {
    Invoke-Expression "function Get-Foo {$using:getFooFunc}"
    Get-Foo -output "bar"
}

Get-Job | Receive-Job
PS C:\Users\rohopkin> Get-Job | Receive-Job
bar

Passing relative paths of scripts to PowerShell jobs

I have functions in separate files I need to run as jobs in one main file.
I need to be able to pass these functions arguments.
Right now my problem is figuring out how to pass the path of the function files to the jobs in a way that is not completely awful.
I need to have the functions defined at the top of the file for readability (just having a static comment that says "script uses somefunc.ps1" is not adequate)
I also need to refer to the scripts relative path (they will all be in the same folder).
Right now I am using env: to store the path of the scripts, but doing this I need to refer to the script in like 5 places!
This is what I have:
testJobsMain.ps1:
#Store path of functions in env so jobs can find them
$env:func1 = "$PSScriptRoot\func1.ps1"
$env:func2 = "$PSScriptRoot\func2.ps1"

$arrOutput = @()
$Jobs = @()

foreach($i in ('aaa','bbb','ccc') ) {
    $Import  = {. $env:func1}
    $Execute = {func1 -myArg $Using:i}
    $Jobs += Start-Job -InitializationScript $Import -ScriptBlock $Execute
}

$JobsOutput = $Jobs | Wait-Job | Receive-Job
$JobsOutput
$Jobs | Remove-Job

#Clean up env
Remove-Item env:\func1

$arrOutput
func1.ps1
function func1( $myArg ) { write-output $myArg }
func2.ps1
function func2( $blah ) { write-output $blah }
You can simply make an array of paths and then pass one path (or all of them) via the -ArgumentList parameter of Start-Job:
#func1.ps1
function add($inp) {
    return $inp + 1
}

#func2.ps1
function add($inp) {
    return $inp + 2
}

$paths = "$PSScriptRoot\func1.ps1", "$PSScriptRoot\func2.ps1"

$i = 0
ForEach($singlePath in $paths) {
    $Execute = {
        Param(
            [Parameter(Mandatory=$True, Position=1)]
            [String]$path
        )

        Import-Module $path
        return add 1
    }
    Start-Job -Name "Job$i" -ScriptBlock $Execute -ArgumentList $singlePath
    $i++
}

for ($i = 0; $i -lt 2; $i++) {
    Wait-Job "Job$i"
    [int]$result = Receive-Job "Job$i"
}
You can skip all those $i iterators and explicit names; PowerShell names jobs automatically and predictably (Job1, Job2, ...), which would make the code a lot prettier.
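A sketch of that simplification, collecting the job objects that Start-Job returns instead of tracking names by hand (it reuses $paths from above; the $Execute body mirrors the answer's loop):

$Execute = {
    Param([String]$path)
    Import-Module $path
    add 1
}

# Collect the job objects instead of naming them; PowerShell auto-names them Job1, Job2, ...
$Jobs = foreach ($singlePath in $paths) {
    Start-Job -ScriptBlock $Execute -ArgumentList $singlePath
}
$results = $Jobs | Wait-Job | Receive-Job
$Jobs | Remove-Job
$results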
