Fastest way to monitor specific process on multiple computers using powershell - multithreading

I am trying to find a way to check if a specific process exists on about 200 computers. Right now I am doing this using the tasklist command (Get-Process doesn't work on some of the computers), but it takes about 3-4 minutes, which causes my GUI to freeze. How can I multi-thread this check?

You are looking for jobs.
In your particular case:
$scriptBlock = {
    Get-Process -Name 'SomeProcessName'
}

foreach ($computer in $200Computers)
{
    Invoke-Command -ComputerName $computer -ScriptBlock $scriptBlock -AsJob
}

$jobs = Get-Job
# other code goes here.
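As a minimal sketch of what that "other code" might look like (assuming the remote machines allow PowerShell remoting, and with 'SomeProcessName' as a placeholder), you could wait for the jobs and group the results by computer:
# Wait for all remoting jobs to finish, then collect their output.
# Computers where the process is not running simply return nothing.
$results = $jobs | Wait-Job | Receive-Job -ErrorAction SilentlyContinue

# Each remote result carries a PSComputerName property, so grouping on it
# shows which computers currently have the process.
$results | Group-Object -Property PSComputerName | Select-Object Name, Count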

Related

Cross talk threads to allow for file system access in powershell

Reference: Runspace for button event in powershell
https://www.foxdeploy.com/blog/part-v-powershell-guis-responsive-apps-with-progress-bars.html
So I believe my issue is that PowerShell is unable to access the file system from within the memory space of my thread. Is there a way to solve this and access the file system from a multi-threaded application?
Back Story:
So, I run a program that calls an external command-line executable (*.exe, Robocopy) to copy files from a server to a group of computers at a time. We have a classroom environment at my work, so my setup has one folder per room, and I keep a list of all our (static) addresses for each room in its respective folder. We have an update from our developers that we need to push to all of the rooms, and we need to run a slow push so as not to disturb the production environment. It's proprietary, so we can't use a typical solution like Microsoft SCCM. So I created a script to push to the rooms; while it does work, it's not a smooth operation. I'm not actually the one pushing the update, because of the slow update process; I'm just trying to make a stable, smooth-running package for the person who will be doing it. My code works outside of the thread; I tested it, I know it works.
Here is how I concluded that my code works outside of the thread (see the picture): I followed the same setup with my code (a button click event inside of a thread running the form) and placed in the actual working code, which I had tried and tested before making a thread for the interface, after completing the backend operation code testing.
("Region Boe's Addition" refers to Boe Prox, from the link.)
In his example, he is updating from a command line/PowerShell window via a function run inside a thread. I'm running an event from a button inside of a thread, and trying to run a separate thread for the click event(s). Outside of the thread the event works fine, but inside it doesn't work at all.
Basic Code:
# Multi-thread: a thread for the $form, and a thread for the event (as per the referenced link)
$var = [PowerShell]::Create().AddScript({
    $button.Add_Click({
        $var = [PowerShell]::Create().AddScript({ <Thread><Robocopy></Thread> })
    })
})
I needed Start-Process with the -Wait switch to allow the listbox to be updated in between copies, to confirm installation through each step in the loop.
$choice = $comboBox.SelectedItem
# $drive = Get-Location

if (!(Test-Path -PathType Container -Path "L:\$choice"))
{
    # New-Item -ItemType "container" -Path . -Name $choice
    New-Item -ItemType "Directory" -Path . -Name $choice
}

# $folder = $_
# Where is it being stored at?
[System.IO.File]::ReadLines("Y:\$choice\IPs.txt") | foreach {
    ping -a -n 2 -w 2000 $_ | Out-Null
    Test-Connection -Count 2 -TimeToLive 2 $_ | Out-Null
    if ($?)
    {
        RoboCopy /Log:"L:\$choice\$_.log" $source \\$_\c$\tools
        RoboCopy /Log+:"L:\$choice\$choice-MovementLogs.log" $source \\$_\c$\tools
        Start-Process -Wait "P:\psexec.exe" -ArgumentList "\\$_ -d -e -h -s cmd /c reg import C:\tools\dump.reg"
        # Copy-Item -LiteralPath Y:\* -Destination \\$_\c$\tools
        $listBox.Items.Add($_)
    }
}
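A minimal sketch of one way to keep the form responsive, loosely following the runspace pattern from the FoxDeploy link above but updating the listbox via a UI-thread timer that polls a synchronized queue, rather than touching the control from the worker thread. It assumes $button, $listBox and $source already exist and the form's message loop is started afterwards; the Y:\Room101 and L:\Room101 paths are illustrative placeholders, not the original values:
Add-Type -AssemblyName System.Windows.Forms

# Shared, thread-safe state for the UI thread and the worker runspace.
$syncHash = [hashtable]::Synchronized(@{
    Done   = [System.Collections.Queue]::Synchronized((New-Object System.Collections.Queue))
    Source = $source
})

# UI-thread timer: move finished hosts from the queue into the listbox.
$timer = New-Object System.Windows.Forms.Timer
$timer.Interval = 500
$timer.Add_Tick({
    while ($syncHash.Done.Count -gt 0) { [void]$listBox.Items.Add($syncHash.Done.Dequeue()) }
})
$timer.Start()

$button.Add_Click({
    # Run the copy work in its own runspace so the click handler returns immediately.
    $ps = [PowerShell]::Create()
    $ps.Runspace = [runspacefactory]::CreateRunspace()
    $ps.Runspace.Open()
    $ps.Runspace.SessionStateProxy.SetVariable('syncHash', $syncHash)

    [void]$ps.AddScript({
        foreach ($ip in [System.IO.File]::ReadLines('Y:\Room101\IPs.txt')) {
            robocopy $syncHash.Source "\\$ip\c`$\tools" /LOG+:"L:\Room101\$ip.log"
            $syncHash.Done.Enqueue($ip)   # report completion back to the UI thread
        }
    })
    [void]$ps.BeginInvoke()
})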

Issue with Start-ThreadJob ScriptBlock Unable to find powershell script

I am using Start-ThreadJob and a script block to execute a PowerShell script in a new thread. It works fine on my local machine, but on the preprod server I am getting an error.
Here is the code block where I am initiating the new thread:
Start-ThreadJob -InputObject $fileType -ScriptBlock {
    ./Functions/Download-FilesFromFTP.ps1 $args[0] $args[1] $args[2] $args[3] $args[4] $args[5]
} -ArgumentList $ftpServer,$user,$password,$completeSourceFolder,$completeStagingFolderPath,$completeLogFolderPath
As mentioned earlier, this code block works perfectly on my local machine. In the preprod environment I get the following error when I display the jobs using the Get-Job command.
Powershell version on my local
Powershell version on preprod server
The version of the ThreadJob module is the same on both servers.
Start-ThreadJob runs the new thread with the same current location as the caller, which is unrelated to where the executing script is located.
If you want to refer to a file relative to the script's own location, use the automatic $PSScriptRoot variable, and refer to it in the thread script block via the $using: scope:
Start-ThreadJob -InputObject $fileType -ScriptBlock {
    & "$using:PSScriptRoot/Functions/Download-FilesFromFTP.ps1" @args
} -ArgumentList $ftpServer,$user,$password,$completeSourceFolder,$completeStagingFolderPath,$completeLogFolderPath
Note the use of @args in order to also pass all positional arguments, reflected in the automatic $args array, through as individual arguments to the target script via splatting.
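As an illustrative, self-contained demo of the same mechanics (not the original script): arguments supplied via -ArgumentList surface inside the thread as $args, and @args splats them through. $using:PSScriptRoot only has a value when this is run from inside a script file.
Start-ThreadJob -ScriptBlock {
    "Caller's script root: $using:PSScriptRoot"
    "Received $($args.Count) arguments: $args"
} -ArgumentList 'one', 'two', 'three' | Receive-Job -Wait -AutoRemoveJob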

Running Powershell Script from Task Scheduler when User is Logged Off [duplicate]

This question already has an answer here:
Task Scheduler doesn't execute batch file properly
(1 answer)
Closed 6 years ago.
I have a PowerShell script that I'm intending to run from a remote server. The purpose of the script is to do the following:
1. Copy an Excel file from a mapped drive to the remote server.
2. Open the Excel file and run a macro; the macro copies an Access table that's on the remote server and pastes it into the Excel file, then does some manipulation of the data.
3. Save the Excel file, close it, and then copy it back to the mapped drive.
Right now, I'm testing it on my local machine, so it's copying the Excel file from the mapped drive to my C drive, then grabbing the Access table from a location on my local machine. It runs perfectly when I run it from PowerShell. Here is the code:
# collect excel process ids before and after new excel background process is
# started
$priorExcelProcesses = Get-Process -name "*Excel*" | % { $_.Id }
$Excel = New-Object -ComObject Excel.Application
$postExcelProcesses = Get-Process -name "*Excel*" | % { $_.Id }
#run program
$folderPath = "my folder goes here"
$filePath = "my folder gooes here\filename.xlsm"
$tempPath = "C:\Users\Public"
$tempFile = "C:\Users\Public\filename.xlsm"
#copy file from I drive to remote desktop
Copy-Item -Path $filePath -Destination $tempPath
#create Excel variables
$excel = new-object -comobject excel.application
$excel.visible = $False
$excelFiles = Get-ChildItem -Path $tempPath -Include *.xls, *.xlsm -Recurse
#open excel application and run routine
Foreach ($file in $excelFiles)
{
    $workbook = $excel.workbooks.open($tempFile)
    $worksheet = $workbook.worksheets.item(1)
    $excel.Run("getAccessData")
    $workbook.save()
    $workbook.close()
}
#copy file from remote desktop back onto I drive
Copy-Item -Path $tempFile -Destination $folderPath
# try to gently quit
$Excel.Quit()
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Excel)
# otherwise allow stop-process to clean up
$postExcelProcesses | ? { $priorExcelProcesses -eq $null -or $priorExcelProcesses -notcontains $_ } | % { Stop-Process -Id $_ }
I need to have the script run once per day in the middle of the night, so I have been working on making it a scheduled task. I first set up the 'Action' with the following information:
Program/Script:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
Add arguments (optional):
-NoProfile -ExecutionPolicy Bypass -file "C:\Users\nelsonth\emsUpdateDesktop.ps1"
Now, if I run this task while the security option is set to "Run only when user is logged on" and the "Hidden" checkbox is selected, the task runs perfectly.
Since this is going to be running from a remote desktop in the middle of the night, I need the script to run while I am logged off. But when I selected "Run whether user is logged on or not" and "Run with highest privileges" the script no longer runs.
I need this to work so can anyone help me troubleshoot this?
When run as "Run whether user is logged on or not", your PowerShell script will most likely have to map the drives manually or access the file via a UNC path.
Try adding something like the following to the top of your script, checking for the drive letter, then mapping it if not found:
if (-Not (Test-Path 'X:\')) {New-PSDrive -Name "X" -PSProvider FileSystem -Root "\\MyServer\MyShare"}
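A slightly fuller sketch of both options mentioned above; \\MyServer\MyShare is a placeholder for the real share, and the I: drive letter is taken from the comment in the question's script:
# Option 1: re-create the mapped drive inside the scheduled-task session,
# then keep using drive-letter paths as before.
if (-not (Test-Path 'I:\')) {
    New-PSDrive -Name 'I' -PSProvider FileSystem -Root '\\MyServer\MyShare' | Out-Null
}

# Option 2: skip drive letters entirely and address the share by UNC path.
$filePath = '\\MyServer\MyShare\filename.xlsm'
Copy-Item -Path $filePath -Destination 'C:\Users\Public'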

PowerShell Excel Add-in Deployment Issue

I have this script that I got from http://poshcode.org/1811 and modified slightly for my own purposes. The problem I'm having is that the user who asked me to deploy the add-in has informed me that only the first person who logs on to use the add-in can see it installed. This is an Active Directory environment, and the script is set to run as the user logs on.
Here is the script; I will also get the code for the add-in itself later from the person in question.
$Addinfilepath = '\\network\path\to\Excel Add-ins\'
$Addinfilename = Get-ChildItem $Addinfilepath -Name
ForEach ($CustomAddin in $Addinfilename) {
    $Excel = New-Object -ComObject excel.application
    $ExcelWorkbook = $excel.Workbooks.Add()

    if (($ExcelWorkbook.Application.AddIns | Where-Object {$_.name -eq $CustomAddin}) -eq $null) {
        $ExcelAddin = $ExcelWorkbook.Application.AddIns.Add("$Addinfilepath$CustomAddin", $True)
        $ExcelAddin.Installed = "True"
        Write-Host "$CustomAddin added"
    }
    #else{}
    #{Write-Host "$Addinfilename already added"}

    $Excel.Quit()
}
I'm not quite sure what's wrong with the script, but I believe it's installing the add-in to the first person's profile on the PC, and then the script sees it as installed, period, and no longer installs it for the next user who logs on.
Thanks for any advice or help you can give.
Not sure if you noticed it, but you are missing a $ sign in front of CustomAddin when you are trying to match the name.
Where-Object {$_.name -eq CustomAddin})
should be
Where-Object {$_.name -eq $CustomAddin})

PowerShell Split-Path does not work properly with IIS: drive qualifier

I am using PowerShell v3.0 and the IIS Administration Cmdlets to add and remove websites from my IIS 7 instance. I use Import-Module WebAdministration to make sure the IIS: drive is available, and I am able to use Remove-Item to delete items via the IIS: drive. For some reason, though, when I use the following code, Split-Path always returns an empty string, even though the Remove-Item works fine.
$iisPath = Join-Path "IIS:\Sites" $fullPath
Remove-Item -Path $iisPath
$parent = Split-Path -Path $iisPath -Parent
Even if I comment out the Remove-Item line, Split-Path still returns an empty string. The $iisPath value would look something like this:
IIS:\Sites\Application Services\2.5.12\OurProductServicesDirectory
So I would expect $parent to contain:
IIS:\Sites\Application Services\2.5.12
But $parent is always empty. I have also tried creating the $iisPath using $iisPath = "IIS:\Sites\$fullPath", rather than Join-Path, but still get the same result.
Any ideas why Split-Path doesn't seem to work when using the IIS: drive, or how to fix it?
===UPDATE===
So I created a tiny sample script to see if I could reproduce the problem. Here it is:
$Block = {
    Import-Module WebAdministration
    $path = "IIS:\Sites\Application Services\2.5.12\OurProductServicesDirectory\"
    Test-Path -Path $path
    $parent = Split-Path -Path $path -Parent
    Write-Host Parent is $parent
}

$Session = New-PSSession -ComputerName "Our.WebServer.local"
Invoke-Command -Session $Session -ScriptBlock $Block
Using this script $parent does get a value, but the text written to the console is:
True
Parent is IIS:Sites\Application Services
when I expect it to be:
True
Parent is IIS:\Sites\Application Services\2.5.12
So in my simple sample script I do get a value back, but it's the wrong value; it returns the grandparent directory instead of the parent directory, and it removes the backslash from after IIS:.
I'm not sure why I get different results in this sample script than in my main script, but both results appear to be wrong. Any suggestions are appreciated.
So because the IIS: qualifier is made valid by importing the WebAdministration module, I'm going to assume that Split-Path was simply never designed to work with the IIS: qualifier, and that is why it doesn't handle it properly.
The workaround I found was simply to exclude IIS:\Sites\ from my path when using Split-Path. So my original example would change to:
Remove-Item -Path "IIS:\Sites\$fullPath"
$parent = Split-Path -Path $fullPath -Parent
So basically I just leave IIS:\Sites\ off of all my paths, and then explicitly add it when needed, such as when calling Remove-Item, Test-Path, Get-ChildItem, etc. It's not the greatest solution, but it works.
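As an illustrative sketch of that workaround (the site path below is just the example from the question):
Import-Module WebAdministration

# Keep the path relative to IIS:\Sites and only prepend the qualifier
# when a provider cmdlet actually needs it.
$fullPath = 'Application Services\2.5.12\OurProductServicesDirectory'

Remove-Item -Path (Join-Path 'IIS:\Sites' $fullPath)

$parent        = Split-Path -Path $fullPath -Parent      # Application Services\2.5.12
$parentIisPath = Join-Path 'IIS:\Sites' $parent          # IIS:\Sites\Application Services\2.5.12
Test-Path -Path $parentIisPath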
