PowerShell: compress directories using multithreaded jobs

I have a rootDir with 3 dirs under it. I just want to compress each dir and collect the results using multithreaded code, so I tried the following:
Set-Location "C:\test"
$sw = [Diagnostics.Stopwatch]::StartNew()
Get-Job | Remove-Job
$rootDirectory = $PWD
$dirs = $(Get-ChildItem -Path $rootDirectory -Directory).Name
# $dirs = d1, d2, d3
$sb = {
    Param($init)
    Compress-Archive -Path $init -DestinationPath "$init.zip" -CompressionLevel NoCompression
}
$jobs = @()
$dirs | ForEach-Object {
    $jobs += Start-Job -ScriptBlock $sb -ArgumentList $_
}
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs
I got:
The path 'd1' either does not exist or is not a valid file system path.
+ CategoryInfo : InvalidArgument: (d1:String) [Compress-Archive], InvalidOperationException
+ FullyQualifiedErrorId : ArchiveCmdletPathNotFound,Compress-Archive
+ PSComputerName : localhost
If I run the compress command serially, without Start-Job, everything works.

The jobs don't inherit their working directory from your script. Put the Set-Location inside the scriptblock and pass the directory as a parameter:
$rootdir = 'C:\test'
...
$jobs = Get-ChildItem -Path $rootdir -Directory | ForEach-Object {
    Start-Job -ScriptBlock {
        Param($init, $dir)
        Set-Location $dir
        Compress-Archive -Path $init -DestinationPath "${init}.zip" -CompressionLevel NoCompression
    } -ArgumentList $_, $rootdir
}
...
Better yet, pass the full path as the parameter, not just the name:
$rootdir = 'C:\test'
...
$jobs = Get-ChildItem -Path $rootdir -Directory | ForEach-Object {
    Start-Job -ScriptBlock {
        Param($init)
        Compress-Archive -Path $init -DestinationPath "${init}.zip" -CompressionLevel NoCompression
    } -ArgumentList $_.FullName
}
...
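On PowerShell 7+, Start-Job also accepts a -WorkingDirectory parameter, which avoids the Set-Location dance entirely (Windows PowerShell 5.1 does not have it). A minimal sketch:
$rootdir = 'C:\test'
$jobs = Get-ChildItem -Path $rootdir -Directory | ForEach-Object {
    # The job starts in $rootdir, so the bare directory name resolves correctly.
    Start-Job -WorkingDirectory $rootdir -ScriptBlock {
        Param($init)
        Compress-Archive -Path $init -DestinationPath "${init}.zip" -CompressionLevel NoCompression
    } -ArgumentList $_.Name
}
Wait-Job -Job $jobs | Out-Null
Receive-Job -Job $jobs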

Related

Powershell Script Can't Run Properly With Task Scheduler

I have an odd problem. I have a script that works fine when I run it manually. I created a scheduled job in Windows to run this script automatically. The script works fine until the last stage.
$deploymentfiles_mdm = Get-ChildItem 'D:\DeploymentTriggerApp\*'
Write-Host $deploymentfiles_mdm
$timestamp_app = Get-Date -Format o | ForEach-Object { $_ -replace ":", "." }
Write-Host $timestamp_app
$server = Get-Content 'C:\Users\Administrator\Desktop\Scripts\AutoDeployment\ProdMDMapps.txt'
$User = 'domain\user'
$SecurePassword = Get-Content C:\Users\Administrator\Desktop\Scripts\Password.txt | ConvertTo-SecureString
$UserCred = New-Object System.Management.Automation.PSCredential ($User, $SecurePassword)
if (Test-Path -Path $deploymentfiles_mdm)
{
    do {
        try
        {
            $ServerSessions = New-PSSession -ComputerName $server -Credential $UserCred -ErrorAction Stop
            Write-Host ("$ServerSessions")
        }
        catch [Exception]
        {
            Write-Host("Credential is incorrect or password is expired. Either change credential and run CredentialEncryption.ps1 or communicate with dc admin to open expired password!")
        }
    } while (!$ServerSessions)
    Copy-Item "D:\Deployment_Files\*.zip" -ToSession $ServerSessions -Destination "D:\Deployment_Files\" -ErrorAction SilentlyContinue
    try {
        Invoke-Command -Session $ServerSessions -ScriptBlock {
            param($timestampApp)
            $appPath = Get-ChildItem 'D:\MDM\live\bin\'
            Expand-Archive -Path 'D:\Deployment_Files\*.zip' -DestinationPath 'D:\Deployment_Files\' -Force
            Remove-Item -Path 'D:\Deployment_Files\*.zip'
            $nodeProcess = Get-Process | Where-Object { $_.Name -eq "node" }
            if ($nodeProcess -Or $appPath)
            {
                Get-Process | Where-Object { $_.Name -eq "node" } | Select-Object -First 1 | Stop-Process -Force
                New-Item -Path 'D:\Backups\' -Name $timestampApp -ItemType 'directory'
                Get-ChildItem -Path "D:\MDM\live\bin\" -Recurse | Move-Item -Destination "D:\Backups\$timestampApp\"
            }
            Copy-Item "D:\Deployment_Files\bin" -Destination "D:\MDM\live\" -Recurse -Force -ErrorAction SilentlyContinue
            Remove-Item "D:\Deployment_Files\*" -Recurse -Force
            Start-Job -ScriptBlock { node D:\MDM\live\bin\main.js }
        } -ArgumentList $timestamp_app
    }
    catch
    {
        $_.Exception.Message
    }
}
Remove-Item D:\DeploymentTriggerApp\*
At the step Start-Job -ScriptBlock { node D:\MDM\live\bin\main.js } the script can't start the node process. When I run it manually, it works without any problem.
Any suggestions? (The node process needs to run as a background job. If there is an alternative command for that, I can also try that solution.)
The line below solved my problem:
Start-Job -ScriptBlock { & 'C:\Program Files\nodejs\node.exe' D:\MDM\live\bin\main.js }
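The root cause is almost certainly the environment: a scheduled task does not get the same PATH as your interactive session, so the bare node command cannot be resolved inside the job. Hard-coding the full path works; a slightly more defensive sketch (assuming the default Node.js install location) resolves the executable first:
# Resolve node.exe explicitly instead of relying on PATH,
# falling back to the default install location.
$node = (Get-Command node.exe -ErrorAction SilentlyContinue).Source
if (-not $node) { $node = 'C:\Program Files\nodejs\node.exe' }
Start-Job -ScriptBlock { param($exe) & $exe D:\MDM\live\bin\main.js } -ArgumentList $node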

Struggling to export PowerShell job information to Excel

I have a PowerShell script that runs a web request as part of a job. Within the script block run by the job I log the endpoint URI and the response time. After all the jobs are finished, but before I remove them, I try to export the results to Excel.
PS Version:
Major  Minor  Build  Revision
-----  -----  -----  --------
5      1      17763  503
Code:
$Code = {
    Param(
        [array]$Domain,
        [array]$Services,
        [array]$TestData,
        [string]$Path
    )
    $Log = @()
    ## Get Random School ##
    $Random = Get-Random -InputObject $TestData -Count 1
    ## Get Random Service ##
    $RandService = Get-Random -InputObject $Services
    ## Get Random Endpoint ##
    $ServiceEndpoints = Get-Content "$Path\Service.Endpoints.json" | Out-String | ConvertFrom-Json
    $GatewayEndpoints = $ServiceEndpoints.Services.$RandService.Gateway
    $RandomEndpoint = Get-Random -InputObject $GatewayEndpoints
    $Headers = @{
        "Authorization" = "Bearer" + ' ' + $Random.Token
    }
    $Uri = 'https://' + $Domain + $RandomEndpoint
    Try {
        $TimeTaken = Measure-Command -Expression {
            $JsonResponse = Invoke-WebRequest -Uri $Uri -Headers $Headers -ContentType 'application/json' -Method Get -UseBasicParsing
        }
    }
    Catch {
    }
    $ResponseTime = [Math]::Round($TimeTaken.TotalMilliseconds, 1)
    $LogItem = New-Object PSObject
    $LogItem | Add-Member -type NoteProperty -Name 'Endpoint' -Value $Uri
    $LogItem | Add-Member -type NoteProperty -Name 'Time' -Value $ResponseTime
    $Log += $LogItem
    Write-Host $Log
}
#Remove all jobs
Get-Job | Remove-Job
#Start the jobs. Max 4 jobs running simultaneously.
foreach ($Row in $TestData) {
    While ($(Get-Job -State Running).count -ge $MaxThreads) {
        Start-Sleep -Milliseconds 3
    }
    Start-Job -Scriptblock $Code -ArgumentList $Domain, $Services, $TestData, $Path
}
#Wait for all jobs to finish.
While ($(Get-Job -State Running).count -gt 0) {
    Start-Sleep 1
}
$Log | Export-XLSX -Path .\Test.Results\Performance\Performance.Test.Log.xlsx -ClearSheet
#Get information from each job.
foreach ($Job in Get-Job) {
    $Info = Receive-Job -Id ($Job.Id)
}
#Remove all jobs created.
Get-Job | Remove-Job
I cannot seem to get the endpoint URI and the response time out of the script block. When I try to export $Log, all that happens is it creates an empty Excel file.
Write-Host $Log
@{Endpoint=https://domain/customer/v1/years/2019/marks; Time=1233.3}
@{Endpoint=https://domain/customer/v1/years/2019/marks; Time=2131.7}
You have to return $Log from your script block, since $Log lives in another scope. Emit it at the end of the $Code script block and fetch it via Receive-Job.
Change your code to:
$Code = {
    ...
    $Log += $LogItem
    Write-Host $Log
    $Log # return via pipeline to the caller
}
As Niraj Gajjar's answer suggests, you can use the Export-Csv cmdlet to create a CSV file that Excel can open:
#Get information from each job.
foreach ($Job in Get-Job) {
    Receive-Job -Id ($Job.Id) | Export-Csv -Path .\Test.Results\Performance\Performance.Test.Log.csv -Append -NoTypeInformation
}
Global variables inside jobs aren't visible in the main script, since each job runs in a new session with its own global scope.
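A two-line experiment makes the scoping visible; the job only sees values that are passed in explicitly (or referenced with $using:):
$x = 'outer'
Receive-Job -Job (Start-Job { "inside: '$x'" }) -Wait                             # inside: '' ($x is empty in the job)
Receive-Job -Job (Start-Job { param($v) "inside: '$v'" } -ArgumentList $x) -Wait  # inside: 'outer'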
You can use Export-Csv to produce a file you can open in Excel.
Sample code exporting multiple jobs to CSV format:
$jobs = @()                          # INITIALIZING ARRAY
$jobs += Start-Job { appwiz.cpl }    # START JOB 1 AND ADD TO ARRAY
$jobs += Start-Job { compmgmt.msc }  # START JOB 2 AND ADD TO ARRAY
$jobs += Start-Job { notepad.exe }   # START JOB 3 AND ADD TO ARRAY
foreach ($job in $jobs)              # ITERATE OVER JOBS
{
    Export-Csv -InputObject $job "C:\result.csv" -Append   # SAVE TO FILE
}
Note: some of the job's internal properties are not serialized by Export-Csv because the conversion depth is 1, but the basic properties are available.
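If you only need the basics, selecting them first keeps the CSV readable (a sketch; Id, Name, State, PSBeginTime and PSEndTime are standard job properties):
$jobs | Select-Object Id, Name, State, PSBeginTime, PSEndTime |
    Export-Csv "C:\result.csv" -NoTypeInformation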

Execute SQL files in parallel from a directory

I am able to execute all the .sql files from a directory against the server serially using the command below, but we want to execute these files in parallel.
My code for serial execution:
$PSScriptRoot = 'Y:\test\'
$ServerName = $env:COMPUTERNAME
foreach ($f in Get-ChildItem -Path $PSScriptRoot -Filter *.sql | Sort-Object -Descending)
{
    Invoke-Sqlcmd -InputFile $f.fullname -ServerInstance $ServerName
}
You could use jobs to have them execute in the background:
foreach ($f in Get-ChildItem -Path $PSScriptRoot -Filter *.sql | Sort-Object -Descending)
{
    Start-Job -ScriptBlock { Invoke-Sqlcmd -InputFile $args[0].fullname -ServerInstance $args[1] } -ArgumentList $f, $ServerName
}
If there is any output you can collect it via:
Get-Job | Wait-Job | Receive-Job
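With a large directory you may want to cap the number of simultaneous jobs, reusing the polling pattern from the questions above. A sketch, limited to 4 concurrent jobs, reusing $PSScriptRoot and $ServerName from the question and assuming the SqlServer module is available in the job sessions:
$MaxJobs = 4
$jobs = foreach ($f in Get-ChildItem -Path $PSScriptRoot -Filter *.sql | Sort-Object -Descending) {
    # Wait until a slot frees up before launching the next job.
    while (@(Get-Job -State Running).Count -ge $MaxJobs) {
        Start-Sleep -Milliseconds 200
    }
    Start-Job -ScriptBlock {
        param($file, $server)
        Invoke-Sqlcmd -InputFile $file -ServerInstance $server
    } -ArgumentList $f.FullName, $ServerName
}
$jobs | Wait-Job | Receive-Job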

PowerShell Mutex & Jobs

I'm trying to run a PowerShell file delete operation that deletes files based on specific parameters. So, the code goes like this:
$sb = {
    Get-ChildItem -Path $SubFolder | Remove-Item
    if ($Error[0] -eq $null) {
        $results = New-Object PSObject -Property @{
            Foldername = $Subfolder.Name
            TotalFiles = $SubFolder.Count
        }
    } else {
        $errorresult = "Error deleting files from $($Subfolder.Name)"
    }
    #Error Mutex
    $ErrorLogmutex = New-Object System.Threading.Mutex($false, "ErrorLogFileHandler")
    $ErrorLogmutex.WaitOne()
    Out-File -Append -InputObject $errorresult -FilePath $Errorlog
    $ErrorLogmutex.ReleaseMutex()
    #Success Mutex
    $Successmutex = New-Object System.Threading.Mutex($false, "SuccessFileHandler")
    $Successmutex.WaitOne()
    Out-File -Append -InputObject $results -FilePath $successlog
    $Successmutex.ReleaseMutex()
}
#Calling the scriptblock with at most 5 jobs running concurrently
foreach ($Subfolder in $Folder) {
    while ((Get-Job -State Running).Count -ge 5) {
        Start-Sleep -Seconds 5
    }
    Start-Job -ScriptBlock $sb -ArgumentList $arguments | Out-Null
}
The script runs, and I can see the results if I explicitly call Receive-Job -Name JobID, but it does not produce any log files as I thought it would.
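The most likely culprit is the same scoping issue as in the first question on this page: the script block declares no parameters, so $SubFolder, $Errorlog and $successlog are all empty inside each job and the Out-File calls have no path to write to. A sketch of the fix, passing everything in through -ArgumentList (variable names taken from the question):
$sb = {
    param($SubFolder, $Errorlog, $successlog)
    # ... delete files and build $results / $errorresult as before ...
    $Successmutex = New-Object System.Threading.Mutex($false, "SuccessFileHandler")
    [void]$Successmutex.WaitOne()
    try     { Out-File -Append -InputObject $results -FilePath $successlog }
    finally { $Successmutex.ReleaseMutex() }   # always release, even on error
}
foreach ($Subfolder in $Folder) {
    Start-Job -ScriptBlock $sb -ArgumentList $Subfolder, $Errorlog, $successlog | Out-Null
}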

Wait until all threads complete before running next task

I would like to wrap everything inside foreach($computer in $computers) in a Start-Job so the computers are processed simultaneously. The only problem is that I need to wait for all the jobs to complete before running the ConvertTo-Json at the bottom.
$sb = "OU=some,OU=ou,DC=some,DC=domain"
$computers = Get-ADComputer -Filter {(Enabled -eq $true)} -SearchBase "$sb" -Properties *
$hasmanufacturer = New-Object System.Collections.Generic.List[System.Object]
foreach($computer in $computers)
{
$drives = try{#(Get-WMIObject -Class Win32_CDROMDrive -Property * -ComputerName $computer.Name -ErrorAction Stop)} catch {$null}
foreach($drive in $drives)
{
if($drive.Manufacturer)
{
$hasmanufacturer.Add($computer)
continue
}
} # inner foreach
}
ConvertTo-Json $hasmanufacturer
Use Get-Job | Wait-Job before executing the ConvertTo-Json.
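Applied to the question's code, that looks roughly like this (a sketch; each job returns the computer name when a drive reports a manufacturer):
$jobs = foreach ($computer in $computers) {
    Start-Job -ScriptBlock {
        param($name)
        $drives = try { @(Get-WmiObject -Class Win32_CDROMDrive -ComputerName $name -ErrorAction Stop) } catch { $null }
        foreach ($drive in $drives) {
            if ($drive.Manufacturer) { $name; break }   # emit the name once and stop checking drives
        }
    } -ArgumentList $computer.Name
}
$hasmanufacturer = $jobs | Wait-Job | Receive-Job
ConvertTo-Json $hasmanufacturer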
How about using the array of computer names as a parameter to Invoke-Command? It will run, by default, 32 concurrent remote sessions. The number can be changed with the -ThrottleLimit parameter.
$computers = Get-ADComputer -Filter {(Enabled -eq $true)} -SearchBase "OU=Servers,DC=xxx,DC=com" -Properties Name |
    Where-Object { $_.Name -match 'LAX_*' } |
    ForEach-Object { $_.Name }
$computers
$j = Invoke-Command `
    -ComputerName $computers `
    -ScriptBlock { Get-WMIObject -Class Win32_CDROMDrive -Property * -ErrorAction Stop } `
    -AsJob
while ((Get-Job -Id $j.Id).State -eq 'Running') {}
Get-Job -Id $j.Id | Wait-Job
$results = Receive-Job -Id $j.Id
$results
