For various reasons I started to write a PowerShell portscanner, not least as a way to start learning the language.
The first iteration used Test-NetConnection. This seemed as if it would be too slow, so I went one level down to use sockets, specifically System.Net.Sockets.TcpClient. (I have started looking at System.Net.Sockets.Socket, as the MS docs mention the Socket.BeginConnect() method, which can begin an asynchronous request for a remote connection, but I am not sure yet whether this will help.)
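The pattern I have seen suggested for that API is to kick off the connect and then wait on the async handle with a timeout - a minimal sketch only, where the address and the 200 ms timeout are purely illustrative:

$client = New-Object System.Net.Sockets.TcpClient
# start the connect without blocking, then give it at most 200 ms
$iar = $client.BeginConnect('192.0.2.10', 80, $null, $null)
if ($iar.AsyncWaitHandle.WaitOne(200) -and $client.Connected) {
    $client.EndConnect($iar)
    Write-Output 'open'
}
$client.Close()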
Even so, the synchronous TcpClient approach seemed too slow, so I looked at jobs. All this did was consume more resources for not much speed increase, so after much googling I managed to make threading (or what PowerShell calls threading, anyway) work through the use of RunspacePools. I thought it was pretty much done, and performance is OK if you're looking at an input file of 5 IP addresses. However, I tried it out with a CIDR /24 this morning, and it took about 20-30 minutes.
[Edit] I should add that this script will take a 'threads' value, but if none is provided it defaults to the number of cores + 1 in order to take advantage of the RunspacePool multithreading.
So I started looking at how Fyodor increased his speed, and in 'The Art of Port Scanning' (Phrack 51, article 11) he states (whilst talking about TCP connect() scanning):
While making a separate connect() call for every targeted port in a
linear fashion would take ages over a slow connection, you can hasten
the scan by using many sockets in parallel.
That is clearly where some optimisation is available.
So, is anyone able to point me in the direction of how to enable this? As I say, I am still quite new to PoSH, so I am pushing the limits of my comprehension with RunspacePools.
Specifically, I would like some advice on a) whether my instinct that increasing socket parallelism will increase scan speed is right, b) how to do that, and c) whether System.Net.Sockets.Socket is more appropriate.
function doConnect {
    $ipLoopCount = 0
    $portLoopCount = 0
    # check for randomise switch
    if ($randomise) {
        $ipArray = makeRange | Sort-Object {Get-Random}
        $portArray = makePortRange | Sort-Object {Get-Random}
    } else {
        # Connects to IPs in order
        $ipArray = makeRange
        $portArray = makePortRange
    }
    # initialise runspaces
    if ($threads) {
        $useThreads = $threads
    } else {
        $useThreads = ([int]$env:NUMBER_OF_PROCESSORS + 1)
    }
    $pool = [RunspaceFactory]::CreateRunspacePool(1, $useThreads)
    $pool.ApartmentState = "MTA"
    $pool.Open()
    $runspaces = @()
    # set higher priority for powershell process
    if ($priority) {
        $proc = Get-Process -Id $pid
        $proc.PriorityClass = 'High'
    } else {
        $proc = Get-Process -Id $pid
        $proc.PriorityClass = 'Normal'
    }
    # info object
    $infoDisplay = @{
        InputFile        = $inFile
        Target_IPs       = $ipArray
        Target_Ports     = $portArray
        Process_Priority = $proc.PriorityClass
        Threads          = $useThreads
    }
    [PSCustomObject]$infoDisplay
    # set up scriptblock to pass to runspaces
    $scriptblock = {
        Param (
            [IPAddress]$sb_ip,
            [int]$sb_port
        )
        # This progress bar doesn't work yet
        Write-Progress -Activity "Scan range $StartIPaddress - $EndIPAddress" -Status "% Complete:" -PercentComplete((($portLoopCount)/($ipArray.Length*$portArray.Length))*100)
        if ($delay) {
            $delay = Get-Random -Maximum 1000 -Minimum 1
            Start-Sleep -Milliseconds $delay
        }
        $socket = New-Object System.Net.Sockets.TcpClient
        try {
            # a closed/refused port throws a SocketException; trap it so the
            # runspace doesn't surface an error record
            $socket.Connect($sb_ip, $sb_port)
        } catch [System.Net.Sockets.SocketException] {}
        if ($socket.Connected) {
            Write-Output "Connected to $sb_port on $sb_ip"
        #} else {
        #    Write-Output "Failed to connect to port $sb_port on $sb_ip"
        }
        $socket.Close()
    }
    foreach ($nIP in $ipArray) {
        $ipLoopCount++
        foreach ($nPort in $portArray) {
            $portLoopCount++
            $runspace = [PowerShell]::Create()
            $null = $runspace.AddScript($scriptblock)
            $null = $runspace.AddArgument($nIP)
            $null = $runspace.AddArgument($nPort)
            $runspace.RunspacePool = $pool
            $runspaces += [PSCustomObject]@{
                Pipe   = $runspace
                Status = $runspace.BeginInvoke()
            }
        }
    }
    # harvest results as the runspaces finish
    while ($runspaces.Status -ne $null) {
        $completed = $runspaces | Where-Object { $_.Status.IsCompleted -eq $true }
        foreach ($runspace in $completed) {
            $runspace.Pipe.EndInvoke($runspace.Status)
            $runspace.Status = $null
        }
    }
    $pool.Close()
    $pool.Dispose()
}
It may be that PowerShell is entirely the wrong thing to attempt this in, but it is a useful exercise, as the environment is quite locked down and installing a 'proper' portscanner - i.e. nmap - is impossible.
[Edit 2] I don't think reducing the timeout and plumbing that into the logic is the solution that I'm after.
[Edit 3] The parallel switch didn't help.
[Edit 4] I have been thinking about asynchronous socket connections, as this may help the overall connection speed - but then you have to have another thread/process looking after the incoming traffic. I am unsure as to the efficacy.
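Something like the following is what I have in mind - a rough, untested sketch only, where the shared 300 ms deadline and the use of Task.WaitAll are assumptions rather than anything I have benchmarked:

# start every connect up front, then harvest the results in one pass
$attempts = foreach ($nIP in $ipArray) {
    foreach ($nPort in $portArray) {
        $client = New-Object System.Net.Sockets.TcpClient
        [PSCustomObject]@{
            Client = $client
            IP     = $nIP
            Port   = $nPort
            Task   = $client.ConnectAsync($nIP, $nPort) # returns immediately
        }
    }
}
[System.Threading.Tasks.Task[]]$tasks = $attempts.Task
try {
    # one shared deadline for the whole batch instead of a timeout per port
    [void][System.Threading.Tasks.Task]::WaitAll($tasks, 300)
} catch {
    # refused/unreachable targets fault their tasks; ignore them here
}
foreach ($a in $attempts) {
    if ($a.Client.Connected) { Write-Output "Connected to $($a.Port) on $($a.IP)" }
    $a.Client.Close()
}

A real version would have to cap how many clients are in flight at once (a /24 times a long port list would exhaust sockets), but it is the many-sockets-in-parallel shape that the Phrack quote describes.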
I'm having some trouble with using multithreading in PowerShell. I've tried creating a synced hash table and doing ForEach-Object -Parallel -ThrottleLimit 3 -AsJob {...}, but this just gives me errors.
A sample of what I'm trying to do is:
$index = [System.Collections.Hashtable]::Synchronized(@{})
Import-Csv -Path .\csvfile.csv | ForEach-Object -Parallel -ThrottleLimit 3 -AsJob {
    $nameKey = $_.($FILE_HEADER)
    $dirKey = $_.($DIR_HEADER)
    $extKey = $_.($EXTENSION_HEADER)
    $nameLetter = $nameKey.Substring(0,1) # Retrieves the very first character of the name for indexing
    # Confirm we are indexing into a non-null array as we step through dimensions
    if ($null -eq $index[$dirKey])
        { $index[$dirKey] = @{} }
    if ($null -eq $index[$dirKey][$extKey])
        { $index[$dirKey][$extKey] = @{} }
    if ($null -eq $index[$dirKey][$extKey][$nameLetter])
        { $index[$dirKey][$extKey][$nameLetter] = @{} }
    if ($null -eq $index[$dirKey][$extKey][$nameLetter][$nameKey])
        { $index[$dirKey][$extKey][$nameLetter][$nameKey] = 0 }
    $index[$dirKey][$extKey][$nameLetter][$nameKey]++
}
As a result I would have a 4-D hash table where I can call $index[$dirKey][$extKey][$nameLetter][$nameKey] to get a counter representing the number of times this name was added.
I am doing this because this CSV is half a million lines long and simply building my hash table linearly takes two hours. The next stage where I go through these entries takes even longer.
What I am looking for is the ability to run through all the entries of the CSV once and build my index file using as many threads as I want. What is the best way to go about this? Also how do I determine the most sensible number of threads?
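For context, inside ForEach-Object -Parallel each thread gets its own session state, so variables from the calling scope ($index, $FILE_HEADER and friends) are only visible through the $using: prefix, and ++ on a synchronized hashtable is still a read-modify-write. A minimal sketch of that pattern - assuming the header names are plain strings defined earlier - looks like:

$index = [System.Collections.Hashtable]::Synchronized(@{})
$job = Import-Csv -Path .\csvfile.csv | ForEach-Object -Parallel {
    $idx     = $using:index                 # the shared synchronized table
    $nameKey = $_.($using:FILE_HEADER)      # caller variables need $using:
    $dirKey  = $_.($using:DIR_HEADER)
    # ... build the nested levels exactly as above ...
    [System.Threading.Monitor]::Enter($idx.SyncRoot)   # ++ is not atomic
    try     { $idx[$dirKey][$nameKey]++ }
    finally { [System.Threading.Monitor]::Exit($idx.SyncRoot) }
} -ThrottleLimit 3 -AsJob
$job | Receive-Job -Wait

As for the thread count, a common starting point for work like this is the number of logical cores, then measure; beyond that the threads mostly contend for the same lock.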
Given a script that implements multi-threaded operations via a runspace pool, how does one get all the threads to output to a single file? I understand there are synchronization and/or locking issues to deal with; I'm just not sure what options are available.
Here is an example of how my threads are created. The example code hangs.
$_ps = [Powershell]::Create()
$_ps.RunspacePool = $_runspace_pool
$null = $_ps.AddScript({
    Param (
        [Parameter(Mandatory = $true)]
        $ComputerName,
        [Parameter(Mandatory = $true)]
        $LibPath,
        [Parameter(Mandatory = $false)]
        $Logger = $null
    )
    $ErrorActionPreference = 'SilentlyContinue'
    Import-Module -Name "$LibPath\MyObjectModule" -Force
    $_obj = MyObjectModule\New-MyObject
    if ( $Logger ) { $_obj.Logger = $Logger }
    $_obj.InvokeDiscovery($ComputerName)
}) # end of $_ps.AddScript()
# set script parameters
$null = $_ps.AddParameters(@{ComputerName = $_computer; LibPath = $_lib_path; Logger = $_logger})
I'm thinking I might create a synchronized sorted list as a queue that is added to a logger object in my root thread and also passed to each child thread. Separate logger objects could be instantiated in each child thread that will put messages into the synchronized queue. The root thread logger would periodically flush the queue via a call like $logger.flush().
Maybe something like this...
$_queue = some synchronized queue-like object
$_logger = MyLoggerModule\New-MyLogger
$_logger.queue = $_queue
$_ps = [Powershell]::Create()
$_ps.RunspacePool = $_runspace_pool
$null = $_ps.AddScript({
    Param (
        [Parameter(Mandatory = $true)]
        $Queue,
        ...
    )
    $ErrorActionPreference = 'SilentlyContinue'
    Import-Module -Name "$LibPath\MyObjectModule" -Force
    Import-Module -Name "$LibPath\MyLoggerModule" -Force
    $_obj = MyObjectModule\New-MyObject
    $_logger = MyLoggerModule\New-MyLogger
    $_logger.queue = $Queue
    $_obj.InvokeDiscovery($ComputerName)
}) # end of $_ps.AddScript()
# set script parameters
$null = $_ps.AddParameters(@{ComputerName = $_computer; LibPath = $_lib_path; Queue = $_queue})
# wait for threads to complete
do {
    $_logger.flush()
    Start-Sleep -Seconds 5
} while ( threads still running )
Assuming that made sense, is it a workable solution? Are there other options? Am I barking up an impossible tree and should abandon the idea altogether?
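For the 'some synchronized queue-like object' placeholder, one ready-made thread-safe candidate is System.Collections.Concurrent.ConcurrentQueue. A minimal sketch (PowerShell 5+ syntax; the log path is illustrative):

# shared queue that every runspace can enqueue to without locking
$_queue = [System.Collections.Concurrent.ConcurrentQueue[string]]::new()
# in a child thread: just enqueue log lines
$_queue.Enqueue("discovery started on $ComputerName")
# in the root thread: drain periodically, keeping all file writes in one place
$line = $null
while ($_queue.TryDequeue([ref]$line)) {
    Add-Content -Path C:\Temp\discovery.log -Value $line
}

Because only the root thread ever touches the file, no file locking is needed at all.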
I was able to solve my problem by creating a mutex and passing it to the relevant functions. Worked like a charm.
What I'm trying to do
The below script loops through every item in an Array of data streams and requests a summary value for output to a text file. This external request is by far the most expensive part of the process, and so I am now using a Runspacepool to run multiple (5) requests in parallel, and whichever finishes first outputs its results.
These requests all write to a synchronised hashtable, $hash, which holds a running total ($hash.counter) and tracks which thread ($hash.thread) is updating the total and a .txt output file, to avoid potential write collisions.
What isn't working
Each thread is able to update the counter easily enough with $hash.counter += $r, but when I try to read the value into an Add-Content statement:
Add-Content C:\Temp\test.txt "$hash.counter|$r|$p|$ThreadID"
it adds an object reference rather than a number:
System.Collections.Hashtable+SyncHashtable.counter|123|MyStreamName|21252
And so I've ended up passing the counter through a temporary variable that can be used in the string:
[int]$t = $hash.counter+0
Add-Content C:\Temp\test.txt "$t|$r|$p|$ThreadID"
Which does output the true total:
14565423|123|MyStreamName|21252
What I'm asking
Is it possible to remove this temporary variable and output directly from the hashtable? Why does the object reference have a '+' in the middle?
I've had to add logic to 'lock' the hashtable to prevent data collisions. Should this be necessary? I'd been told that synchronised hashtables were supposed to be thread-safe for R/W operations, but without this logic my counter doesn't reach the correct total.
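For what it's worth, the first part has a simple explanation: inside a double-quoted string, PowerShell expands only a simple variable name, so "$hash.counter" expands $hash (which renders as its type name) and then appends the literal text .counter. The + in System.Collections.Hashtable+SyncHashtable is .NET's notation for a nested class - Hashtable::Synchronized() returns an instance of the private nested SyncHashtable wrapper. A subexpression evaluates the member access inside the string and removes the need for the temporary variable:

Add-Content C:\Temp\test.txt "$($hash.counter)|$r|$p|$ThreadID"

On the second part: a synchronized hashtable makes individual reads and writes thread-safe, but $hash.counter += $r is a read-then-write sequence, so some external lock around the increment is indeed still needed.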
Full code for the loop itself is below - I've left out setup of the RunspacePool etc.
ForEach ($i in $Array) {
    # Save down the data stream name and create parameter list for passing to the new job
    $p = $i.Point.Name
    $parameters = @{
        hash = $hash
        conn = $Conn
        p    = $p
    }
    # Instantiate new powershell runspace and send a script to it
    $PowerShell = [powershell]::Create()
    $PowerShell.RunspacePool = $RunspacePool
    [void]$Powershell.AddScript({
        # Receive parameter list, retrieve threadid
        Param (
            $hash,
            $conn,
            $p
        )
        $ThreadID = [appdomain]::GetCurrentThreadId()
        # Send data request to the PI Data Archive using the existing connection
        $q = Get-something (actual code removed)
        [int]$r = $q.Values.Values[0].Value
        # Lock out other threads from writing to the hashtable and output file to prevent collisions.
        # If the thread isn't locked, attempt to lock it. If it is locked, sleep for 1ms and try again. Tracked by synchronised hashtable.
        Do {
            if ($hash.thread -eq 0) {
                $hash.thread = $ThreadID
            }
            # Sleep for 1ms and check that the lock wasn't overwritten by another parallel thread.
            Start-Sleep -Milliseconds 1
        } Until ($hash.thread -eq $ThreadID)
        # Increment the synchronised hash counter. Save the new result to a temporary variable (can't figure out how to get the hash counter itself to output to the file)
        $hash.counter += $r
        [int]$t = $hash.counter + 0
        # Write new counter total, result, point name and thread ID to the output file
        Add-Content C:\Temp\test.txt "$t|$r|$p|$ThreadID"
        # Release the lock on the hashtable and output file
        $hash.thread = 0
    })
    # Add parameter list to the instance (matching the param() list from the script). Invoke the new instance and save a handle for closing it
    [void]$Powershell.AddParameters($parameters)
    $Handle = $PowerShell.BeginInvoke()
    # Save down the handle into the $jobs list for closing the instances afterwards
    $temp = [PSCustomObject]@{
        PowerShell = $Powershell
        Handle     = $Handle
    }
    [void]$jobs.Add($temp)
}
I've got what might be a dumb question, but I can't seem to find the answer anywhere online. On Linux-based systems, typing "time" in the terminal before any command reports how long the command takes in terms of real, user, and system time. For example, typing
time ls
lists the files and folders in the current directory and then gives the amount of real, user, and system time that it took to list them. Is there a Windows equivalent to this? I am trying to compare the performance of different algorithms, but I don't have a Linux machine to work on, so I was hoping there was a similar command in Windows.
The following is far from perfect. But it's the closest I could come up with to simulate UNIX time behavior. I'm sure it can be improved a lot.
Basically I'm creating a cmdlet that receives a script block, generates a process and uses GetProcessTimes to get Kernel, User and Elapsed times.
Once the cmdlet is loaded, just invoke it with
Measure-Time -Command {your-command} [-silent]
The -Silent switch means no output is generated from the command (i.e. you are interested only in the time measures).
So for example:
Measure-Time -Command {Get-Process;sleep -Seconds 5} -Silent
The output generated:
Kernel time : 0.6084039
User time : 0.6864044
Elapsed : 00:00:06.6144000
Here is the cmdlet:
Add-Type -TypeDefinition #"
using System;
using System.Runtime.InteropServices;
public class ProcessTime
{
[DllImport("kernel32.dll", CharSet = CharSet.Unicode)]
public static extern bool GetProcessTimes(IntPtr handle,
out IntPtr creation,
out IntPtr exit,
out IntPtr kernel,
out IntPtr user);
}
"#
function Measure-Time
{
    [CmdletBinding()]
    param ([scriptblock] $Command,
           [switch] $Silent = $false
    )
    begin
    {
        $creation = 0
        $exit = 0
        $kernel = 0
        $user = 0
        $psi = New-Object Diagnostics.ProcessStartInfo
        $psi.CreateNoWindow = $true
        $psi.RedirectStandardOutput = $true
        $psi.FileName = "powershell.exe"
        $psi.Arguments = "-command $Command"
        $psi.UseShellExecute = $false
    }
    process
    {
        $proc = [Diagnostics.Process]::Start($psi)
        $buffer = $proc.StandardOutput.ReadToEnd()
        if (!$Silent)
        {
            Write-Output $buffer
        }
        $proc.WaitForExit()
    }
    end
    {
        $ret = [ProcessTime]::GetProcessTimes($proc.Handle,
                                              [ref]$creation,
                                              [ref]$exit,
                                              [ref]$kernel,
                                              [ref]$user)
        # kernel/user times come back as 100-nanosecond ticks
        $kernelTime = [long]$kernel / 10000000.0
        $userTime = [long]$user / 10000000.0
        $elapsed = [datetime]::FromFileTime($exit) - [datetime]::FromFileTime($creation)
        Write-Output "Kernel time : $kernelTime"
        Write-Output "User time   : $userTime"
        Write-Output "Elapsed     : $elapsed"
    }
}
I found a similar question on Super User which covers some alternatives, first and foremost my suggestion to use Measure-Command in PowerShell.
Measure-Command {ls}
I have a Perl Tk application wherein I create many objects and update the Perl Tk GUI display with information from those objects. I need to add a large number of jobs (say 30k) to the tree in the GUI. If I add all the jobs in one go, the GUI freezes.
Below is the code snippet:
sub Importjobs
{
    #================= start creation of objects =============================
    my JobList $self = shift;
    my $exportedJobList = shift;
    # third parameter: whether to clear the list
    $self->clear () unless shift;
    my $noOfProcsToBeAdded = shift || 3000;
    my $cellCollection = Tasks::CellCollection::instance ();
    my $calcActionsPathHash = $cellCollection->caPathCAHash ();
    my $collectionCellNames = $cellCollection->allCellNames ();
    my @importedJobs = ();
    # if the given job list is empty, add import job list to it
    push @{$self->_importJobList()}, @$exportedJobList;
    $exportedJobList = [];
    # do not import new jobs if the previous jobs are still being created
    foreach my $taskGenJob (@{$self->getTaskGenJobObjs()}) {
        goto FINISH if TaskGenJobState::CREATE == $taskGenJob->state();
    }
    # now get each job and add it into the imported jobs till the count exceeds $noOfProcsToBeAdded
    while (my $jobDescription = shift @{$self->_importJobList()}) {
        my $taskInstantiation = $jobDescription->{'taskInstantiation'};
        my $caPath = $taskInstantiation->{'calcActionPath'};
        my $errMsgPrefix = 'Error importing ' . join ('-', $task, $command, $method, $caPath);
        my @calcActionList;
        if (defined $caPath) {
            my $calcAction = $calcActionsPathHash->{ $caPath };
            unless ($calcAction) {
                my $errMsg = $errMsgPrefix . ": the calcAction is not defined within the current CellCollection : " . $caPath;
                $logger4Perl->error ($errMsg);
                next;
            }
            push @calcActionList, $calcAction;
        } else {
            my @mList;
            if (not defined $method) {
                push @mList, @{$task->getMethods(cellCollection => $cellCollection, command => $command)};
                $method = join(' ', @mList);
            } elsif ($method eq $task_desc::default_method) {
                @mList = ($task_desc::default_method);
            } else {
                @mList = sort (grep { $_ } split(/\s+|__/, $method));
            }
            foreach my $m (@mList) {
                push(@calcActionList, @{$cellCollection->findCalcActions($task, $command, $m)});
            }
        }
        foreach my $calcAction (@calcActionList) {
            my TaskGenJob $job = TaskGenJob->new ();
            $logger4Perl->info ("Adding $caPath");
            push (@importedJobs, $job);
            my $noOfProcsBeingAdded = $job->calculateNoOfJobExecObjs();
            $noOfProcsToBeAdded -= $noOfProcsBeingAdded;
        }
        last if 1 > $noOfProcsToBeAdded;
    }
    #================= End creation of objects =============================
    # Below function updates the GUI display
    $self->addJobs (\@importedJobs);
    #================= Mechanism used so that the GUI stays active after a certain time limit =============================
FINISH:
    if (@{$self->_importJobList()}) {
        $self->parentDlg()->parentWnd()->after(60000,
            sub {
                $GuiTasksAppl::mainDlg->Disable();
                $self->importJobList([], 'noclear', 200);
                $GuiTasksAppl::mainDlg->Enable();
            });
    }
}
Currently the way I am doing it is to add, say, 3000 jobs using the $noOfProcsToBeAdded variable, stay idle for some time, and repeat the process later. During this idle period, a different process handles the jobs already in the GUI.
Can someone propose a better approach than this?
I am expecting ideas around threading and shared memory.
First, if the GUI freezes (and never unfreezes) during your large 30k update, then you might have found a Tk bug, since that shouldn't happen. However, if it's merely unresponsive for a period of time, then it makes sense to mitigate the delay.
In the past, I've used either Tk::repeat() or Tk::after() to drive my UI update routine. The user interface doesn't typically need to be updated at a high rate, so every few hundred milliseconds can be a reasonable delay; the exact rate is largely determined by how responsive an interface you need. Then, during the job import step, append references to a list for the UI update routine and periodically call $MW->update(). The update routine doesn't necessarily need to process the full list during each call, but you don't want the processing to get too far behind.
I'd also recommend some visual indicator to identify that the update is still in progress.
If ImportJobs is computationally expensive, obviously one could perform multi-process / multi-threading tricks to exploit multiple processors on the system. But that'll add a bit of complexity and testing effort.