Does anyone know how to perform IISRESET with a PowerShell script? I'm using the PowerGUI editor with PowerShell 1.0 installed on a Windows 2008 box.
You can do it with the Invoke-Command cmdlet:
invoke-command -scriptblock {iisreset}
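If the goal is to reset IIS on a remote machine, the same cmdlet should work over remoting (assuming PowerShell remoting is enabled on the target; the server name below is a placeholder):
# Hypothetical remote reset; requires PowerShell remoting on "webserver01"
Invoke-Command -ComputerName webserver01 -ScriptBlock { iisreset }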
UPDATE:
You can also simplify the command using the & call operator:
& {iisreset}
Having used & {iisreset} with occasional failures led me to this:
Start-Process "iisreset.exe" -NoNewWindow -Wait
Now it waits for iisreset.exe to end gracefully.
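If you also want the exit code, a small variation with -PassThru should do it (a sketch; -PassThru returns the process object so ExitCode can be read after -Wait):
# -PassThru returns the process object; with -Wait, ExitCode is available afterwards
$proc = Start-Process "iisreset.exe" -NoNewWindow -Wait -PassThru
if ($proc.ExitCode -ne 0) { Write-Warning "iisreset failed with exit code $($proc.ExitCode)" }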
This works well for me. In this application, I don't care about the return code:
Start-Process -FilePath C:\Windows\System32\iisreset.exe -ArgumentList /RESTART -RedirectStandardOutput .\iisreset.txt
Get-Content .\iisreset.txt | Write-Log -Level Info
The Write-Log cmdlet is a custom one I use for logging, but you could substitute something else.
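For example, a minimal stand-in for Write-Log could look like this (the function body and log path are purely illustrative):
# Illustrative substitute for the custom Write-Log cmdlet
function Write-Log {
    param(
        [Parameter(ValueFromPipeline = $true)] [string]$Message,
        [string]$Level = "Info"
    )
    process { Add-Content -Path .\iisreset.log -Value "$(Get-Date -Format s) [$Level] $Message" }
}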
I know that this is very old, but you can run any command-line process from PowerShell's command line, so you would just need a script that calls IISReset with whatever switches you need.
Not sure what you are looking for exactly, but create a script with a body of "iisreset /noforce"
Here's an example: http://technet.microsoft.com/en-us/library/cc785436.aspx
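A minimal script along those lines, with a basic exit-code check added (the warning text is just an example):
# restart-iis.ps1 - a one-line wrapper around iisreset with an exit-code check
iisreset /noforce
if ($LASTEXITCODE -ne 0) { Write-Warning "iisreset returned exit code $LASTEXITCODE" }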
IIS stop or start (tested). WaitForExit and ExitCode work fine:
# No assembly load is needed; System.Diagnostics is available to PowerShell by default.
# Configure the process: hidden window, redirected output so it can be captured.
$procinfo = New-Object System.Diagnostics.ProcessStartInfo
$procinfo.CreateNoWindow = $true
$procinfo.UseShellExecute = $false
$procinfo.RedirectStandardOutput = $true
$procinfo.RedirectStandardError = $true
$procinfo.FileName = "C:\Windows\System32\iisreset.exe"
$procinfo.Arguments = "/stop"
# Start iisreset and block until it exits.
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo = $procinfo
[void]$proc.Start()
$proc.WaitForExit()
$exitCode = $proc.ExitCode
$proc.Dispose()
Write-Host $exitCode
iisreset.exe accepts a computer name as a parameter. The example below shows the basic idea of how to reset IIS on multiple servers:
$servers = @()
$servers += 'server1'
$servers += 'server2'
...
$servers += 'serverN'
Since iisreset.exe doesn't support multivalued parameters, we have to wrap it in a loop:
$servers | %{ iisreset $_ /restart /noforce }
You may want to add simple monitoring:
$servers | %{ Write-Host "`n`n$_`n" -NoNewline ; iisreset $_ /restart /noforce /timeout:30 }
If you have many servers you may be interested in failures only:
$servers | %{ Write-Host "`n`n$_`n" -NoNewline ; iisreset $_ /restart /noforce /timeout:30 | Select-String "failed" }
Multiline version for better readability:
foreach ( $server in $servers ) {
    Write-Host "`n`n$server`n" -NoNewline
    iisreset $server /restart /noforce /timeout:30 | Select-String "failed"
}
I would strongly recommend testing your script with /status before implementing the /restart action:
$servers | %{ iisreset $_ /status }
You may check stopped components with /status as well:
$servers | %{ Write-Host "`n`n$_`n" -NoNewline ; iisreset $_ /status | Select-String "Stopped" }
Reference
/restart is the default action; iisreset.exe uses /restart when no other action parameter is specified.
/noforce prevents iisreset.exe from forcefully terminating the IIS services if a graceful stop fails.
/timeout - sometimes you need to allow the server more time to process the request, to avoid IIS getting stuck in the Stopped state.
I found using the simple command below to be the easiest:
D:\PS\psexec \\server_name iisreset
Related
Summary:
I have 4 systems running Windows Server 2022. One system has the PS code, and the other 3 are targets to run this code on via PS "Invoke-Command". The objective is to turn off the automatic time-sync option on the remote systems, then set the date to the current date plus 1 day, and then check the date on those servers again.
First, the script collects the list of servers:
$servers = Get-content -Path .\testservers.txt
Then it turns off the time-sync option (via the only way I found: the registry):
foreach ($server in $servers){
    Write-Host "Turning off time sync on $server - " -NoNewline -ForegroundColor Yellow
    $scriptBlock = {
        $props = Get-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\W32Time\Parameters\"
        if ($props.Type -ne "NoSync"){
            Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\W32Time\Parameters" -Name Type -Value "NoSync"
        }
    }
    $result = Invoke-Command -ComputerName $server -ScriptBlock $scriptBlock
    Write-Host "$result"
}
Then, I set the date
foreach ($server in $servers){
    Write-Host "Adding 1 day on $server - " -NoNewline -ForegroundColor Yellow
    $scriptBlock = { Set-Date -Date ($Using:currentDateTime).AddDays(1) }
    $result = Invoke-Command -ComputerName $server -ScriptBlock $scriptBlock
    Write-Host "$result"
}
Then after the date change, I recheck the date
foreach ($server in $servers){
    Write-Host "Date on server $server - " -NoNewline
    $result = Invoke-Command -ComputerName $server -ScriptBlock { Get-Date }
    Write-Host "$result"
}
I often see at least one of them switched back by this point. If I do another loop to check the dates perhaps 30 seconds later, they've all reverted to the current date.
(Yes, I know the code can be more streamlined.)
Any ideas? Could this be a matter of the registry change or the time change being done in a PS session and only living within that session?
In my profession I make forensic images of "foreign" PCs, which I later extract to my local storage.
To clean up the data, I'd like to delete all files that aren't relevant to me (including, but not limited to: audio, movies, system files, ...).
Since we're speaking of multiple TB of data, I'd like to use threads, especially since my storage is all flash and disk throughput is less of a limitation.
To speed the process up after an initial manual run, I want the script to exclude files older than 1 day (since I have already handled those in the manual run).
What I have so far:
$IncludeFiles = "*.log", "*.sys", "*.avi", "*.mpg", "*.mkv", "*.mp3", "*.mp4",
"*.mpeg", "*.mov", "*.dll", "*.mof", "*.mui", "*.zvv", "*.wma",
"*.wav", "*.MPA", "*.MID", "*.M4A", "*.AIF", "*.IFF", "*.M3U",
"*.3G2", "*.3GP", "*.ASF", "*.FLV", "*.M4V", "*.RM", "*.SWF",
"*.VOB"
$ScriptBlock = {
    Param($mypath = "D:\")
    Get-ChildItem -Path $mypath -Recurse -File -Include $file | Where-Object {
        $_.CreationTime -gt (Get-Date).AddDays(-1)
    }
}
foreach ($file in $IncludeFiles) {
    Start-Job -ScriptBlock $ScriptBlock -ArgumentList $file
}
Get-Job | Wait-Job
$out = Get-Job | Receive-Job
Write-Host $out
The only thing that doesn't work is the restriction to files "younger" than 1 day. If I run the script without it, it seems to work perfectly (it gives me a list of files with the extensions I want to remove).
Parameter passing doesn't work the way you seem to expect. Param($mypath = "D:\") defines a parameter mypath with a default value of D:\. That default value is superseded by the value you pass into the script block via -ArgumentList. Also, the variable $file inside the script block and the variable $file outside the script block are not the same. Because of that, an invocation like
Start-Job -ScriptBlock $ScriptBlock -ArgumentList '*.log'
will run the command
Get-ChildItem -Path '*.log' -Recurse -File -Include $null | ...
Change your code to something like this to make it work:
$ScriptBlock = {
    Param($extension)
    $mypath = "D:\"
    Get-ChildItem -Path $mypath -Recurse -File -Filter $extension | Where-Object {
        $_.CreationTime -gt (Get-Date).AddDays(-1)
    }
}
foreach ($file in $IncludeFiles) {
    Start-Job -ScriptBlock $ScriptBlock -ArgumentList $file
}
Get-Job | Wait-Job | Receive-Job
Using -Filter should provide better performance than -Include, but it accepts only a single string (not a list of strings like -Include), so you can only filter one extension per job.
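Alternatively, if a single sequential scan of the disk is acceptable, -Include does take the whole list, so everything can be done in one pass without jobs (a sketch reusing the $IncludeFiles list from the question):
# Single-pass alternative: one recursive scan filtered by the full extension list
Get-ChildItem -Path "D:\" -Recurse -File -Include $IncludeFiles |
    Where-Object { $_.CreationTime -gt (Get-Date).AddDays(-1) }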
I'm trying to use Start-Job to run a command to collect security logs from some servers.
I'm parsing a .ini file to get the list of servers, number of days etc.
#___Collect Logs from Servers___#
$servList = $iniContent["SERVERS"]["svr"]
$days = $iniContent["DAYS"]["days"]
$date = $(get-date -format ddMMyyyy)
$err = "Error Collecting $($logType) from $($server) or the Event Log is empty! | $(Get-Date -format g) "
$serv = $servList.Split(",")
foreach ($server in $serv){
    $outfile = "D:\DCLogs\$($date)_$($server)_$logType.txt"
    $ScriptBlock = cmd /c "D:\CollectLog\Dumpel.exe -f $($outFile) -l $($logType) -s $($server) -d $($days)"
    Start-Job -ScriptBlock $ScriptBlock
    Get-Job | Wait-Job
    $file = Get-ChildItem D:\DCLogs -Filter "$($date)_$($server)*" -Name
    $len = $file.length/1KB # Check LogFile Size
    if ($len -eq 0){
        $errCount = 1
        write-output $err | Out-File $errLog -append
    }
}
It's only starting one job at a time so I know I'm doing something wrong. If someone could please point out the problem I'd greatly appreciate it.
Thank you.
Amelia
Get-Job | Wait-Job inside the loop simply serializes the jobs. Use the loop to start the jobs, then run Get-Job | Wait-Job outside the loop.
Also, define your ScriptBlock using braces:
$ScriptBlock = {...}
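For example, a corrected version of the loop might look like this (a sketch built from the variables already in your script; the Dumpel.exe arguments are taken from your code unchanged):
foreach ($server in $serv) {
    $outfile = "D:\DCLogs\$($date)_$($server)_$logType.txt"
    # Braces define a script block instead of running cmd immediately
    $ScriptBlock = {
        param($outFile, $logType, $server, $days)
        cmd /c "D:\CollectLog\Dumpel.exe -f $outFile -l $logType -s $server -d $days"
    }
    Start-Job -ScriptBlock $ScriptBlock -ArgumentList $outfile, $logType, $server, $days
}
# Waiting once, outside the loop, lets the jobs run in parallel
Get-Job | Wait-Job | Receive-Job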
Why isn't the first example equivalent to the second?
1:
$volumeNum = 2
Invoke-Command -ComputerName $IP -Credential $GuestVM -ScriptBlock {"select volume $volumeNum" | diskpart}
2:
Invoke-Command -ComputerName $IP -Credential $GuestVM -ScriptBlock {"select volume 2" | diskpart}
Why doesn't PowerShell evaluate
"select volume $volumeNum"
to
select volume 2
The script block executed via Invoke-Command does not have access to your local session state; it runs in a separate session on the remote machine. If you were running the command on your local computer, it would work.
The issue is that the string "select volume $volumeNum" is not expanded until the script block executes on the remote machine, and $volumeNum is not defined in that remote session.
PowerShell provides a mechanism for passing arguments via Invoke-Command. This works from my local machine to a remote:
Invoke-Command -ComputerName $ip -ScriptBlock { param($x) "hello $x" } -ArgumentList "world"
I believe a similar approach would work for you:
Invoke-Command -ComputerName $IP -Credential $GuestVM -ScriptBlock {param($volumeNum) "select volume $volumeNum" | diskpart} -ArgumentList $volumeNum
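In PowerShell 3.0 and later, the $using: scope modifier is another way to pull a local variable into the remote script block:
# $using: copies the local value of $volumeNum into the remote session (PS 3.0+)
Invoke-Command -ComputerName $IP -Credential $GuestVM -ScriptBlock { "select volume $using:volumeNum" | diskpart }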
Variable references inside a script block are not expanded when the block is created; they are resolved only when the block runs, and the remote session has no $volumeNum. You can work around this by deferring creation of the script block until run time, baking the current value into its text:
$sb = [scriptblock]::Create("`"select volume $volumeNum`" | diskpart")
Invoke-Command -ComputerName $IP -Credential $GuestVM -ScriptBlock $sb
A further note for others coming along: GetNewClosure doesn't work either.
$filt = "*c*"
$cl = { gci D:\testdir $filt }.GetNewClosure()
& $cl  # returns 9 items
Invoke-Command -Computer mylocalhost -Script $cl  # returns 9 items
Invoke-Command -Computer mylocalhost -Script { gci D:\prgs\tools\Console2 $filt }  # returns 4 items
Invoke-Command -Computer mylocalhost -Script { gci D:\prgs\tools\Console2 "*c*" }
I have a PowerShell script that does some batch processing on a bunch of images, and I'd like to do some parallel processing. PowerShell has background-processing options such as Start-Job, Wait-Job, etc., but the only good resource I found for doing parallel work was writing out the text of a script and running that (PowerShell Multithreading).
Ideally, I'd like something akin to Parallel.ForEach in .NET 4.
Something pretty seamless, like:
foreach-parallel -threads 4 ($file in (Get-ChildItem $dir))
{
    .. Do Work
}
Maybe I'd be better off just dropping down to C#...
You can execute parallel jobs in PowerShell 2 using background jobs. Check out Start-Job and the other Job cmdlets.
# Loop through the server list
Get-Content "ServerList.txt" | %{
    # Define what each job does
    $ScriptBlock = {
        param($pipelinePassIn)
        Test-Path "\\$pipelinePassIn\c`$\Something"
        Start-Sleep 60
    }
    # Execute the jobs in parallel
    Start-Job $ScriptBlock -ArgumentList $_
}
Get-Job
# Wait for it all to complete
While (Get-Job -State "Running")
{
    Start-Sleep 10
}
# Getting the information back from the jobs
Get-Job | Receive-Job
The answer from Steve Townsend is correct in theory but not in practice, as @likwid pointed out. My revised code takes the job-context barrier into account: nothing crosses that barrier by default! The automatic $_ variable can thus be used in the loop, but it cannot be used directly within the script block, because the script block runs in a separate context created by the job.
To pass variables from the parent context to the child context, use the -ArgumentList parameter on Start-Job to send it and use param inside the script block to receive it.
cls
# Send in two root directory names, one that exists and one that does not.
# Should then get a "True" and a "False" result out the end.
"temp", "foo" | %{
    $ScriptBlock = {
        # accept the loop variable across the job-context barrier
        param($name)
        # Show the loop variable has made it through!
        Write-Host "[processing '$name' inside the job]"
        # Execute a command
        Test-Path "\$name"
        # Just wait for a bit...
        Start-Sleep 5
    }
    # Show the loop variable here is correct
    Write-Host "processing $_..."
    # pass the loop variable across the job-context barrier
    Start-Job $ScriptBlock -ArgumentList $_
}
# Wait for all to complete
While (Get-Job -State "Running") { Start-Sleep 2 }
# Display output from all jobs
Get-Job | Receive-Job
# Cleanup
Remove-Job *
(I generally like to provide a reference to the PowerShell documentation as supporting evidence but, alas, my search has been fruitless. If you happen to know where context separation is documented, post a comment here to let me know!)
There are so many answers to this these days:
1. Jobs, or thread jobs in PS 6/7 (or via the ThreadJob module in PS 5.1; see the install note after this list)
2. Start-Process
3. Workflows (PS 5 only)
4. The PowerShell API with another runspace
5. Invoke-Command with multiple computers, which can all be localhost (you have to be admin)
6. Multiple session (runspace) tabs in the ISE, or remote PowerShell ISE tabs
PowerShell 7 has ForEach-Object -Parallel as an alternative for #4.
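For option 1 on Windows PowerShell 5.1, the ThreadJob module comes from the PowerShell Gallery; installing it should be a one-liner:
# ThreadJob ships with PowerShell 6/7; on 5.1, install it from the PowerShell Gallery
Install-Module ThreadJob -Scope CurrentUser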
Using Start-ThreadJob in PowerShell 5.1. I wish this worked the way I expect, but it doesn't:
# test-netconnection has a miserably long timeout
echo yahoo.com facebook.com |
    start-threadjob { test-netconnection $input } | receive-job -wait -auto
WARNING: Name resolution of yahoo.com facebook.com failed
It works this way. Not quite as nice as foreach-object -parallel in PowerShell 7, but it'll do.
echo yahoo.com facebook.com |
    % { $_ | start-threadjob { test-netconnection $input } } |
    receive-job -wait -auto | ft -a
ComputerName RemotePort RemoteAddress PingSucceeded PingReplyDetails (RTT) TcpTestSucceeded
------------ ---------- ------------- ------------- ---------------------- ----------------
facebook.com          0 31.13.71.36            True 17 ms                             False
yahoo.com             0 98.137.11.163          True 97 ms                             False
Here are workflows, with literally a foreach -parallel:
workflow work {
    foreach -parallel ($i in 1..3) {
        sleep 5
        "$i done"
    }
}
work
3 done
1 done
2 done
Or a workflow with a parallel block:
function sleepfor($time) { sleep $time; "sleepfor $time done" }
workflow work {
    parallel {
        sleepfor 3
        sleepfor 2
        sleepfor 1
    }
    'hi'
}
work
sleepfor 1 done
sleepfor 2 done
sleepfor 3 done
hi
Here's an example with the API and runspaces:
$a = [PowerShell]::Create().AddScript{sleep 5;'a done'}
$b = [PowerShell]::Create().AddScript{sleep 5;'b done'}
$c = [PowerShell]::Create().AddScript{sleep 5;'c done'}
$r1,$r2,$r3 = ($a,$b,$c).begininvoke() # run in background
$a.EndInvoke($r1); $b.EndInvoke($r2); $c.EndInvoke($r3) # wait
($a,$b,$c).streams.error # check for errors
($a,$b,$c).dispose() # clean
a done
b done
c done
In Powershell 7 you can use ForEach-Object -Parallel
$Message = "Output:"
Get-ChildItem $dir | ForEach-Object -Parallel {
"$using:Message $_"
} -ThrottleLimit 4
http://gallery.technet.microsoft.com/scriptcenter/Invoke-Async-Allows-you-to-83b0c9f0
I created Invoke-Async, which allows you to run multiple script blocks/cmdlets/functions at the same time. It is great for small jobs (a subnet scan, or a WMI query against hundreds of machines) because the overhead of creating a runspace is drastically smaller than the startup time of Start-Job. It can be used like so:
With a script block:
$sb = [scriptblock] {param($system) gwmi win32_operatingsystem -ComputerName $system | select csname,caption}
$servers = Get-Content servers.txt
$rtn = Invoke-Async -Set $servers -SetParam system -ScriptBlock $sb
With just a cmdlet/function:
$servers = Get-Content servers.txt
$rtn = Invoke-Async -Set $servers -SetParam computername -Params @{count=1} -Cmdlet Test-Connection -ThreadCount 50
Background jobs are expensive to set up and are not reusable. PowerShell MVP Oisin Grehan has a good example of PowerShell multi-threading (as of 10/25/2010 the site is down, but it is accessible via the Web Archive).
I've used an adapted version of Oisin's script in a data loading routine here:
http://rsdd.codeplex.com/SourceControl/changeset/view/a6cd657ea2be#Invoke-RSDDThreaded.ps1
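The core idea behind such runspace approaches is a reusable pool of runspaces; here is a minimal sketch of that pattern (not Oisin's actual code, which is no longer online):
# Minimal runspace-pool sketch: up to 4 scripts run concurrently on reused runspaces
$pool = [runspacefactory]::CreateRunspacePool(1, 4)
$pool.Open()
$work = 1..8 | ForEach-Object {
    $ps = [powershell]::Create().AddScript({ param($n) Start-Sleep 1; "item $n done" }).AddArgument($_)
    $ps.RunspacePool = $pool
    [pscustomobject]@{ Shell = $ps; Handle = $ps.BeginInvoke() }
}
$work | ForEach-Object { $_.Shell.EndInvoke($_.Handle); $_.Shell.Dispose() }
$pool.Close()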
To complement the previous answers, you can also use Wait-Job to wait for all jobs to complete:
For ($i=1; $i -le 3; $i++) {
    $ScriptBlock = {
        Param (
            [string] [Parameter(Mandatory=$true)] $increment
        )
        Write-Host $increment
    }
    Start-Job $ScriptBlock -ArgumentList $i
}
Get-Job | Wait-Job | Receive-Job
If you're using the latest cross-platform PowerShell (which you should, by the way: https://github.com/powershell/powershell#get-powershell), you can use a single & to run scripts in parallel (use ; to run them sequentially).
In my case I needed to run 2 npm scripts in parallel: npm run hotReload & npm run dev
You can also set up npm to use PowerShell for its scripts (by default it uses cmd on Windows).
Run this from the project root folder: npm config set script-shell pwsh --userconfig ./.npmrc
and then use a single npm script command: npm run start
"start":"npm run hotReload & npm run dev"
This has been answered thoroughly. I just want to post this method I created based on PowerShell jobs as a reference.
Jobs are passed in as a list of script blocks. They can be parameterized.
Output of the jobs is color-coded and prefixed with a job index (just like in a VS build process, as this will be used in a build).
It can be used to start up multiple servers at a time, or to run build steps in parallel, and so on.
function Start-Parallel {
    param(
        [ScriptBlock[]]
        [Parameter(Position = 0)]
        $ScriptBlock,
        [Object[]]
        [Alias("arguments")]
        $parameters
    )
    $jobs = $ScriptBlock | ForEach-Object { Start-Job -ScriptBlock $_ -ArgumentList $parameters }
    $colors = "Blue", "Red", "Cyan", "Green", "Magenta"
    $colorCount = $colors.Length
    try {
        while (($jobs | Where-Object { $_.State -ieq "running" } | Measure-Object).Count -gt 0) {
            $jobs | ForEach-Object { $i = 1 } {
                $fgColor = $colors[($i - 1) % $colorCount]
                $out = $_ | Receive-Job
                $out = $out -split [System.Environment]::NewLine
                $out | ForEach-Object {
                    Write-Host "$i> " -NoNewline -ForegroundColor $fgColor
                    Write-Host $_
                }
                $i++
            }
        }
    } finally {
        Write-Host "Stopping Parallel Jobs ..." -NoNewline
        $jobs | Stop-Job
        $jobs | Remove-Job -Force
        Write-Host " done."
    }
}
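A hypothetical invocation, with placeholder script blocks:
# Two illustrative jobs running side by side with color-coded, prefixed output
Start-Parallel { ping -n 3 localhost }, { Get-Date; Start-Sleep 2; Get-Date }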
There is a new built-in solution in PowerShell 7.0 Preview 3.
PowerShell ForEach-Object Parallel Feature
So you could do:
Get-ChildItem $dir | ForEach-Object -Parallel {
    # .. Do Work
    $_ # this will be your file
} -ThrottleLimit 4