This question already has answers here:
How do I measure execution time of a command on the Windows command line?
(32 answers)
Equivalent of Unix time command in PowerShell?
(4 answers)
I've got what might be a dumb question, but I can't seem to find the answer anywhere online. On Linux-based systems, typing "time" in the terminal before any command reports how long the command takes in terms of real, user, and system time. For example, typing
time ls
lists the files and folders in the current directory and then reports the real, user, and system time it took to do so. Is there a Windows equivalent to this? I am trying to compare the performance of different algorithms, but I don't have a Linux machine to work on, so I was hoping there was a similar command in Windows.
The following is far from perfect, but it's the closest I could come up with to simulate the behavior of the Unix time command. I'm sure it can be improved a lot.
Basically, I'm creating a cmdlet that receives a script block, spawns a process, and uses GetProcessTimes to get the kernel, user, and elapsed times.
Once the cmdlet is loaded, just invoke it with
Measure-Time -Command {your-command} [-silent]
The -Silent switch suppresses the output generated by the command (i.e. you are interested only in the time measurements).
So for example:
Measure-Time -Command {Get-Process;sleep -Seconds 5} -Silent
The output generated:
Kernel time : 0.6084039
User time : 0.6864044
Elapsed : 00:00:06.6144000
Here is the cmdlet:
Add-Type -TypeDefinition #"
using System;
using System.Runtime.InteropServices;
public class ProcessTime
{
[DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
public static extern bool GetProcessTimes(IntPtr handle,
out IntPtr creation,
out IntPtr exit,
out IntPtr kernel,
out IntPtr user);
}
"#
function Measure-Time
{
[CmdletBinding()]
param ([scriptblock] $Command,
[switch] $Silent = $false
)
begin
{
$creation = 0
$exit = 0
$kernel = 0
$user = 0
$psi = new-object diagnostics.ProcessStartInfo
$psi.CreateNoWindow = $true
$psi.RedirectStandardOutput = $true
$psi.FileName = "powershell.exe"
$psi.Arguments = "-command $Command"
$psi.UseShellExecute = $false
}
process
{
$proc = [diagnostics.process]::start($psi)
$buffer = $proc.StandardOutput.ReadToEnd()
if (!$Silent)
{
Write-Output $buffer
}
$proc.WaitForExit()
}
end
{
$ret = [ProcessTime]::GetProcessTimes($proc.handle,
[ref]$creation,
[ref]$exit,
[ref]$kernel,
[ref]$user
)
$kernelTime = [long]$kernel/10000000.0
$userTime = [long]$user/10000000.0
$elapsed = [datetime]::FromFileTime($exit) - [datetime]::FromFileTime($creation)
Write-Output "Kernel time : $kernelTime"
Write-Output "User time : $userTime"
Write-Output "Elapsed : $elapsed"
}
}
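A note on the unit conversion above: GetProcessTimes returns FILETIME values measured in 100-nanosecond ticks, which is why the kernel and user figures are divided by 10,000,000 to get seconds. Since a .NET TimeSpan uses the same tick size, an equivalent (purely illustrative) conversion would be:
# FILETIME values and TimeSpan both count 100-nanosecond ticks
[TimeSpan]::FromTicks(6864044)   # -> 00:00:00.6864044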
I found a similar question on Super User which covers some alternatives, first and foremost my suggestion to use Measure-Command in PowerShell.
Measure-Command {ls}
Got the syntax wrong in my comment.
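For what it's worth, Measure-Command returns a TimeSpan (wall-clock time only, no user/kernel split), so you can pull out just the figure you need. A minimal sketch, where myAlgorithm.exe is a placeholder for whatever you are timing:
# Measure-Command returns a TimeSpan object
$elapsed = Measure-Command { & .\myAlgorithm.exe }
"Elapsed: $($elapsed.TotalMilliseconds) ms"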
Related
So this issue is a bit convoluted, but I need this for a very specific case in Azure. I'm trying to create an APIM subnet inside an Azure K8s VNet, but I haven't been able to find a return value from the K8s Terraform call that gives me the ID/name of the VNet. Instead, I used a PowerShell command to query Azure and get the name of the new VNet. I was working on this code locally on my Windows box and it works fine:
data "external" "cluster_vnet_name" {
program = [var.runPSVer6 == true ? "pwsh" : "powershell","Select-AzSubscription '${var.subscriptionName}' |out-null; Get-AzVirtualNetwork -ResourceGroupName '${module.kubernetes-service.k8ResourceGroup}' | Select Name | convertto-json}"]
depends_on = [module.kubernetes-service]
}
I have a toggle in my variables for runPSVer6 so that when I run on a Linux machine it will change powershell to pwsh. Now, this is where it starts getting a little weird. When I run this on my Windows machine, it's not an issue; however, when I run this from a Linux box I get the following error:
can't find external program "pwsh"
I have tried a number of different workarounds (such as using the full PowerShell snap path /snap/bin/powershell and putting the commands in a .ps1 file) to no avail. Every single time it throws the error that it can't find pwsh as an external program.
I use this same runPSVer6 toggle for local-exec Terraform commands with no issue, but I need the name of the VNet as a response.
Anyone have any ideas what I'm missing here?
ADDED AFTER SEPT 30th
So I tried the alternative way of firing commands:
variable "runPSVer6" {
type = bool
default = true
}
variable "subscriptionName" {
type = string
}
variable "ResourceGroup" {
type = string
}
data "external" "runpwsh" {
program = [var.runPSVer6 == true ? "pwsh" : "powershell", "test.ps1"]
query = {
subscriptionName = var.subscriptionName
ResourceGroup = var.ResourceGroup
}
}
output "vnet" {
value = data.external.runpwsh.result.name
}
and this appears to allow the command to execute; however, it's not pulling back the result of the JSON response (even though I confirmed that I do get a response).
This is what I'm using for my .ps1:
Param($subscriptionName,$ResourceGroup)
$subscription = Select-AzSubscription $subscriptionName
$name = (Get-AzVirtualNetwork -ResourceGroupName $ResourceGroup | Select Name).Name
Write-Output "{`n`t""name"":""$name""`n}"
When I don't use .name in the output, this is what I get:
data.external.runpwsh: Refreshing state...
Apply complete! Resources: 0 added, 0 changed, 0 destroyed.
Outputs:
vnet = { "name" = "" }
And this is the output from the .ps1:
{
"name":"vnettest"
}
Can you check whether pwsh works in the terminal? It should bring up the PowerShell prompt.
The directory containing pwsh must be added to the PATH; /usr/bin is in my PATH, as you can see below.
ubuntu@myhost:~$ whereis pwsh
pwsh: /usr/bin/pwsh
ubuntu@myhost:~$
ubuntu@myhost:~$ echo $PATH
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin
ubuntu@myhost:~$ pwsh
PowerShell 7.0.3
Copyright (c) Microsoft Corporation. All rights reserved.
https://aka.ms/powershell
Type 'help' to get help.
PS /home/ubuntu>
PS /home/ubuntu> exit
ubuntu@myhost:~$
============
Added later after 29 Sep 2020.
I tried again on Ubuntu 20.
Terraform 13 was downloaded as a zip from the main site.
PowerShell 7.0.3 was installed with snap install powershell --classic.
I tried the test code below, which worked.
variable "runPSVer6" {
default = true
}
data "external" "testps" {
program = [var.runPSVer6 == true ? "pwsh" : "powershell","/tmp/testScript.ps1"]
}
output "ps_out" {
value = data.external.testps.result
}
The output was like...
Outputs:
ps_out = {
"name" = "My Resource Name"
"region" = "West Europe"
}
The /tmp/testScript.ps1 code was a simple output statement:
Write-Output '{ "name" : "My Resource Name", "region" : "West Europe" }'
I tried to null out the PATH variable just to see if I would get the error message you mentioned. I did, as expected:
ubuntu@ip-172-31-53-128:~$ ./terraform apply
data.external.test_ps: Refreshing state...
Error: can't find external program "pwsh"
on main.tf line 5, in data "external" "test_ps":
5: data "external" "test_ps" {
But when I used the full path, it worked again (even /snap/bin/powershell works):
program = [var.runPSVer6 == true ? "/snap/bin/pwsh" : "powershell","/tmp/testScript.ps1"]
I earlier wrongly blamed snap for my issue, but snap does work now.
This does not pinpoint the issue you are having, but maybe try a couple of things just to be sure:
1.) issue "pwsh" in the current directory and see that Powershell prompt does come up.. not sure if you already checked this, but sometime some other characters in path could cause an issue.
2.) Can you run Terraform once after exporting PATH=/snap/bin? (Do it inside a subshell and exit afterwards so that you get back to your old PATH, or re-export the correct PATH after the test.)
3.) If you used the full path, the error message would have been different from "Error: can't find external program "pwsh"". Can you cross-check whether there was a different error message?
This is how the pwsh binary and its symlink look on my machine:
ubuntu@ip-172-31-53-128:~$ /usr/bin/ls -lt /snap/bin/pwsh
lrwxrwxrwx 1 root root 10 Sep 29 15:40 /snap/bin/pwsh -> powershell
ubuntu@ip-172-31-53-128:~$
ubuntu@ip-172-31-53-128:~$ /usr/bin/ls -lt /snap/bin/powershell
lrwxrwxrwx 1 root root 13 Sep 29 15:40 /snap/bin/powershell -> /usr/bin/snap
ubuntu@ip-172-31-53-128:~$
Context: Running an Azure Automation Account solution where a caller PS script executes another PS script (executed on a VM) with parameter passing via 'Invoke-AzureRmVMRunCommand'.
Story: I had a PowerShell (caller) script running that executed another (called) PowerShell script on a remote Azure Windows VM. That flow ran via an Automation Account schedule every day but suddenly stopped working two days ago because the parameter passing from the caller to the called script is not working anymore. I currently blame the MSFT Azure people for breaking my PRD solution.
Here is the caller PS script code for the arguments to pass on:
$hshParams = @{
strSAName = $hshParameters.strStagingSA
strSAAccessKey = $strSAAccessKey
strFileShare = '"' + $strFileShare + '"'
strCopyObjects = $hshParameters.strCopyObjects
strSrcDriveLetter = $strSrcDriveLetter
strDstDriveLetter = $strDstDriveLetter
}
Here is the invocation of the VM-run PS script:
Invoke-AzureRmVMRunCommand -ResourceGroupName $objVM.ResourceGroupName -Name $objVM.Name `
-CommandId 'RunPowerShellScript' -ScriptPath $strRemoteScriptFileNameTmp -Parameter $hshParams
Here is the parameter reception code on the VM-run PS script side:
# Parameters
Param (
[string] $strCopyObjects = $null,
[string] $strSAAccessKey = $null,
[string] $strFileShare = $null,
[string] $strSAName = $null,
[string] $strDstDriveLetter = $null,
[string] $strSrcDriveLetter = $null
)
Until two days ago all those six string values were populated properly and according to the argument setup in the hash table '$hshParams':
$strSAAccessKey = 92LO1Q4tuyeiqxxx
$strFileShare = 129xxxa1.file.core.windows.net\solutionfiles
$strSAName = 12xsa1
$strDstDriveLetter = D
$strSrcDriveLetter = Z
$strCopyObjects = AutoTopUp\Application\Live
Problem: Now five of the string values are suddenly no longer populated and one contains garbage; here is what they look like today:
$strSAAccessKey = []
$strFileShare = []
$strSAName = []
$strDstDriveLetter = []
$strSrcDriveLetter = []
$strCopyObjects = AutoTopUp\Application\Live" -strSAAccessKey 92LO1Q4tuyeiqxxx -strFileShare 129xxxa1.file.core.windows.net\solutionfiles -strSAName 12xsa1 -strDstDriveLetter D -strSrcDriveLetter Z
The solution was not touched; it had just been running as per the schedule. $Args.Count in the VM-run script returns '2'.
My Question: Does anyone have an explanation for this new behaviour? Frustratingly, I did not manage to arrange the parameter passing in a different way, as it is all a bit unclear what the proper way of receiving the hash table values would be. The MSFT help page for 'Invoke-AzureRmVMRunCommand' is (of course) not helping here, nor did I find any other clear guidance on the parameter passing on SO or Google...
A related question is raised in this MSDN thread; just sharing this for the benefit of the broader audience who might face a similar issue.
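As a general aside rather than a confirmed fix for the regression above: one way to make this kind of parameter passing less sensitive to quoting on the constructed command line is to collapse the whole hash table into a single Base64-encoded JSON string on the caller side (e.g. $encoded = [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes(($hshParams | ConvertTo-Json))), passed as -Parameter @{ encodedParams = $encoded }) and decode it in the VM-run script. A rough sketch of the receiving side, where encodedParams is a made-up parameter name:
Param([string] $encodedParams)
# Decode the single Base64/JSON parameter back into an object
$p = [Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($encodedParams)) | ConvertFrom-Json
# The individual values are then available as properties, e.g.:
$strSAName    = $p.strSAName
$strFileShare = $p.strFileShare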
For various reasons, I started to write a PowerShell portscanner, not least to start learning it.
The first iteration used Test-NetConnection. This seemed as if it would be too slow, so I went one level down to use sockets, specifically System.Net.Sockets.TcpClient. (I have started looking at System.Net.Sockets.Socket, as the MS docs mention the Socket.BeginConnect() method, which can begin an asynchronous request for a remote connection, but I'm not sure if this will help yet.)
This still seemed too slow, so I looked at jobs. All that did was consume more resources for not much speed increase, so after much googling I managed to make threading (or what PowerShell calls threading, anyway) work through the use of runspace pools. I thought it was pretty much done, and performance is OK if you're looking at an input file of 5 IP addresses. However, I tried it out with a CIDR /24 this morning, and it took about 20-30 minutes.
[Edit] I should add that this script will take a 'threads' value, but if none is provided it uses a default of number of cores + 1, in order to take advantage of the runspace pool multithreading.
So I started looking at how Fyodor increased his speed, and in The Art of Scanning (Phrack, article 11) he states (whilst talking about TCP connect() scanning):
While making a separate connect() call for every targeted port in a
linear fashion would take ages over a slow connection, you can hasten
the scan by using many sockets in parallel.
That is clearly where some optimisation is available.
So, is anyone able to point me in the direction of how to enable this? As I say, I'm still quite new to PoSH, so I am pushing the limits of my comprehension with runspace pools.
Specifically, I would like some advice on a) whether my instinct to increase socket parallelism in order to increase scan speed is right, b) how to do that, and c) whether System.Net.Sockets.Socket is more appropriate.
function doConnect {
$ipLoopCount = 0
$portLoopCount = 0
# check for randomise switch
if ($randomise) {
$ipArray = makeRange | Sort-Object {Get-Random}
$portArray = makePortRange | Sort-Object {Get-Random}
} else {
# Connects to IPs in order
$ipArray = makeRange
$portArray = makePortRange
}
# initialise runspaces
if ($threads) {
$useThreads = $threads
} else {
$useThreads = ([int]$env:NUMBER_OF_PROCESSORS + 1)
}
$pool = [RunspaceFactory]::CreateRunspacePool(1, $useThreads)
$pool.ApartmentState = "MTA"
$pool.Open()
$runspaces = @()
# set higher priority for powershell process
if ($priority) {
$proc = Get-Process -Id $pid;
$proc.PriorityClass = 'High'
} else{
$proc = Get-Process -Id $pid;
$proc.PriorityClass = 'Normal'
}
# info object
$infoDisplay = @{
InputFile = $inFile
Target_IPs = $ipArray
Target_Ports = $portArray
Process_Priority = $proc.PriorityClass
Threads = $useThreads
}
[PSCustomObject]$infoDisplay
# set up scriptblock to pass to runspaces
$scriptblock = {
Param (
[IPAddress]$sb_ip,
[int]$sb_port
)
# This progress bar doesn't work yet
Write-Progress -Activity "Scan range $StartIPaddress - $EndIPAddress" -Status "% Complete:" -PercentComplete((($portLoopCount)/($ipArray.Length*$portArray.Length))*100)
if ($delay) {
$delay = Get-Random -Maximum 1000 -Minimum 1;
Start-Sleep -m $delay
}
$socket = New-Object System.Net.Sockets.TcpClient
$socket.Connect($sb_ip, $sb_Port)
if ($socket.Connected) {
Write-Output "Connected to $sb_port on $sb_ip"
#} else {
# Write-Output "Failed to connect to port $sb_port on $sb_ip"
}
$socket.Close()
}
foreach ($nIP in $ipArray) {
$ipLoopCount++
foreach ($nPort in $portArray) {
$portLoopCount++
$runspace = [PowerShell]::Create()
$null = $runspace.AddScript($scriptblock)
$null = $runspace.AddArgument($nIP)
$null = $runspace.AddArgument($nPort)
$runspace.RunspacePool = $pool
$runspaces += [PSCustomObject]@{
Pipe = $runspace;
Status = $runspace.BeginInvoke()
}
}
}
while ($runspaces.Status -ne $null) {
$completed = $runspaces | Where-Object { $_.Status.IsCompleted -eq $true }
foreach ($runspace in $completed) {
$runspace.Pipe.EndInvoke($runspace.Status)
$runspace.Status = $null
}
}
$pool.Close()
$pool.Dispose()
}
It may be that PowerShell is entirely the wrong thing to attempt this in, but it is a useful exercise, as the environment is quite locked down and installing a 'proper' portscanner, i.e. nmap, is impossible.
[Edit 2] I don't think reducing the timeout and plumbing that into the logic is the solution that I'm after.
[Edit 3] The parallel switch didn't help.
[Edit 4] I have been thinking about asynchronous socket connections, as this may help the overall connection speed, but then you have to have another thread/process looking after the incoming traffic. I'm unsure as to the efficacy.
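To make the idea in [Edit 4] concrete, here is a minimal, untested sketch (kept separate from the runspace code above) of overlapping many connect attempts with TcpClient.ConnectAsync and a single shared deadline; the target address, the 1..1024 port range and the 2-second timeout are placeholder values:
# Kick off all connect attempts without blocking on any single one
$targetIp = '192.168.1.10'
$attempts = foreach ($p in 1..1024) {
    $client = [System.Net.Sockets.TcpClient]::new()
    [PSCustomObject]@{ Port = $p; Client = $client; Task = $client.ConnectAsync($targetIp, $p) }
}
# Wait once for everything (refused/filtered ports fault their tasks, hence the try/catch)
try { [void][System.Threading.Tasks.Task]::WaitAll([System.Threading.Tasks.Task[]]$attempts.Task, 2000) } catch { }
foreach ($a in $attempts) {
    if ($a.Client.Connected) { "Port $($a.Port) open on $targetIp" }
    $a.Client.Dispose()
}
Whether this actually beats the runspace approach would still need measuring; the point is only that all connects are started before any of them is waited on.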
I'm trying to write a basic chat system just to learn Perl. I'm trying to keep the chat log in one file and print a new message when it appears in the chatlog.dat file, so I've written a function that does almost that, but I have some problems and don't know how to solve them.
So now I have 2 problems!
I can't figure out how to keep the checkFile function always active (like a separate process) so that it continuously checks for new messages.
This problem occurs when I'm trying to write a new message that will be appended to the chat log. The interpreter waits for my input on the my $userMessage = <STDIN>; line, but what if someone else writes a new message? It will not be shown until I press Enter. How do I avoid that?
my ($sec,$min,$hour) = localtime();
while(1){
my $userMessage = <STDIN>;
last if $userMessage eq "::quit";
`echo "($hour:$min:$sec): $userMessage" >>chatlog.dat`;
}
sub checkFile{
my $lastMessage = "";
my $newMessage = "";
while (1) {
my $context = `cat chatlog.dat`;
split(/\n/, $context);
$newMessage = $_[$#_];
if ($newMessage ne $lastMessage) {
print $newMessage;
$lastMessage = $newMessage;
}
}
}
First:
Don't use echo within a Perl script. It's nasty to shell out when you've got perfectly good IO routines.
Using cat to read files is about as nasty as using echo.
Reading <STDIN> like that will be a blocking call, which means your script will pause.
But that's not as bad as it sounds, because otherwise you're running a 'busy wait' loop which will repeatedly cat the file. That is a very bad idea.
You're assuming writing the file like that is an atomic operation, when it's not. You'll hit problems with that too.
What I would suggest you do is look at IO::Handle and also consider using flock to ensure you've got the file locked for IO. You may also wish to consider File::Tail instead.
I would actually suggest, though, that you consider a different mode of IPC, as 'file swapping' is quite inefficient. If you really want to use the filesystem for your IO, you might want to consider using a FIFO pipe: have each 'client' open its own, and have a server reading and coalescing them.
Either way, though, you'll need to use IO::Select or perhaps multithreading, just to swap back and forth between reading and writing. http://perldoc.perl.org/IO/Select.html
Answering my own question
sub checkFile{
my $lastMessage = "";
my $newMessage = "";
my $userName = $_[0];
while (1) {
my $context = `cat chatlog.dat`;
split(/\n/, $context);
$newMessage = $_[$#_];
if ($newMessage ne $lastMessage) {
$newMessage =~ /^\(.+\):\((.+)\) (.+$)/;
if ($1 ne $userName) { print "[$1]: $2";}
$lastMessage = $newMessage;
}
}
}
my $userName = "Rocker";
my ($sec,$min,$hour) = localtime();
my $thr = threads -> create ( \&checkFile, $userName ); #Starting a thread to continuously check for the file update
while (1) {
my $userMessage = <STDIN>; #STDIN will not interfere file checking
last if $userMessage eq "::quit";
`echo "($hour:$min:$sec):($userName) $userMessage" >>chatlog.dat` if $userMessage =~ /\S+/;
}
$thr -> join();
I am trying to get the downloaded script from an iex expression directly from memory and I think there is something I am missing. $MyInvocation.MyCommand.ScriptBlock should get the current script block.
In the example below, what I get is, on one side, the thread function and, on the other side, the iex expression.
How do I get the things in between? I know that the full script is there somewhere in some kind of thread, but I don't get what PowerShell is doing here.
# run-self in iex -
# two down, one up - or why $MyINvocation after iex is the iex command
# how to get the script itself, not the thread-function nor the iex-cmd
# save this script on webserver and call it with: iex((new-object net.webclient).DownloadString('http://some.url/script.ps1') )
$sharedData = [HashTable]::Synchronized(@{})
$sessionstate = [system.management.automation.runspaces.initialsessionstate]::CreateDefault()
$runspacepool = [runspacefactory]::CreateRunspacePool(1,2,$sessionstate,$Host)
$runspacepool.Open()
$selfcallhelper = {
param($sharedData)
$sharedData.Mysource = $MyINvocation.MyCommand.ScriptBlock
}
# start thread
$thread = [powershell]::Create().AddScript($selfcallhelper).AddArgument($sharedData)
$thread.RunspacePool = $runspacepool
$thread.BeginInvoke()
# write output to files in current directory
$sharedData.Mysource | out-file "myscript-from-thread.txt"
$MyINvocation.MyCommand.ScriptBlock | out-file "myscript-from-self.txt"
$MyInvocation always refers to the caller's context. It's the way a bit of script can ask "who called me?"
It is sometimes useful to know where some script comes from, not who invoked it. In cases like this, you can simply invoke a nested script block, e.g.
$selfcallhelper = {
param($sharedData)
$sharedData.Mysource = & { $MyINvocation.MyCommand.ScriptBlock }
}
The change here was to evaluate $MyInvocation inside its own script block.