PowerShell script to loop through folders checking security permissions

I have part of the code; at the moment the CSV file is coming out empty. I also need a way to specify the path/folders to look at. How do I modify the script for that purpose?
Param(
    [String]$path,
    [String]$outfile = ".\outfile.csv"
)
$output = @()
ForEach ($item in (Get-ChildItem -Path $path -Recurse -Directory)) {
    ForEach ($acl in ($item.GetAccessControl().Access)){
        $output += $acl |
            Add-Member `
                -MemberType NoteProperty `
                -Name 'Folder' `
                -Value $item.FullName `
                -PassThru
    }
}
$output | Export-Csv -Path $outfile -NoTypeInformation
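The script already exposes a -path parameter, so the folders to scan can be supplied when it is invoked, which would also explain the empty CSV if -path was never passed. A minimal sketch, assuming the script above is saved as Get-FolderPermissions.ps1 (a hypothetical name):
# Hypothetical invocation; the script name and output paths are placeholders.
.\Get-FolderPermissions.ps1 -path 'D:\Shares\Finance' -outfile 'C:\Temp\finance_acls.csv'
# Several root folders can be covered by calling the script once per root:
'D:\Shares\Finance', 'D:\Shares\HR' | ForEach-Object {
    .\Get-FolderPermissions.ps1 -path $_ -outfile ("C:\Temp\{0}_acls.csv" -f (Split-Path $_ -Leaf))
}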

Ok, let's do this. I've made it into a function, and removed the OutFile part of it. If you want to output it to a file, pipe it to Export-CSV. If you want it saved as a variable, assign it to a variable. Just simpler this way.
Function Get-RecursiveACLs{
    Param(
        [String]$Path=$(Throw "You must specify a path")
    )
    # Emit each access rule with the folder it belongs to; capture or pipe the output as needed.
    GCI $Path -Recurse -Directory|%{
        $PathName=$_.FullName
        $_.GetAccessControl().Access|%{
            Add-Member -InputObject $_ -NotePropertyName "Path" -NotePropertyValue $PathName -PassThru
        }
    }
}
Then it's a simple matter of storing it in a variable like:
$ACLList = Get-RecursiveACLs "C:\Example\Path"
Or piping it to output to a CSV if you would prefer:
Get-RecursiveACLs "C:\Example\Path" | Export-CSV "C:\Results.csv" -NoType
Put the function at the top of your script and call it as needed.
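The exported ACE objects carry a number of nested properties, so the raw CSV can be hard to read. A hedged sketch of trimming it to the columns that usually matter (the property names below are the standard ones on FileSystemAccessRule, plus the Path note property added by the function):
# Keep only spreadsheet-friendly columns before exporting.
Get-RecursiveACLs "C:\Example\Path" |
    Select-Object Path, IdentityReference, FileSystemRights, AccessControlType, IsInherited |
    Export-CSV "C:\Results.csv" -NoType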

Related

Powershell: Convert each file in folder to UTF-8 csv

I need a script to convert Excel files to CSV format with UTF-8 encoding. I think it can be done in PowerShell, but I can't get it to work. Can you see where the error is? Thank you very much in advance.
$configFiles = Get-ChildItem "c:\HR\test"
foreach ($file in $configFiles) {
$a = -join ("c:\HR\test", "\", $file)
Get-Content $a | Set-Content -path -Encoding utf8 $a
}
You can use Doug Finke's awesome module to get this done: https://github.com/dfinke/ImportExcel
To install it:
Install-Module ImportExcel
Then you can do something like this. Note that Out-File has a few different encoding options for UTF8: https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/out-file?view=powershell-5.1#parameters
param(
    $excelFilesDir = (Get-ChildItem $PSScriptRoot\excel_files),
    $csvFilesDir = "$PSScriptRoot\csv_files"
)
# NOTE: I had to supply the path to the module on my system. You may not have to.
Import-Module "<ABSOLUTE>:\<PATH>\<TO>\<MODULE>\ImportExcel" -Force
# Import-Module ImportExcel
if ( (Test-Path -Path $csvFilesDir) -eq $false) {
    New-Item -Path $csvFilesDir -ItemType Directory -Force
}
foreach ($file in $excelFilesDir) {
    Import-Excel $file.FullName |
        ConvertTo-Csv -NoTypeInformation |
        Out-File -FilePath "$csvFilesDir\$($file.Name)" -Encoding utf8 -Force
}
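One detail worth noting: $file.Name keeps the .xlsx extension, so the converted files land in csv_files still named *.xlsx. A small sketch of swapping the extension, under the same assumptions as the loop above:
foreach ($file in $excelFilesDir) {
    # ChangeExtension turns e.g. report.xlsx into report.csv so the output name matches its contents.
    $csvName = [System.IO.Path]::ChangeExtension($file.Name, 'csv')
    Import-Excel $file.FullName |
        ConvertTo-Csv -NoTypeInformation |
        Out-File -FilePath "$csvFilesDir\$csvName" -Encoding utf8 -Force
}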

Code to grab file data into PSObject and sort by LastWriteTime

I am looking to recursively grab a list of recently modified files under two network drives, sort them in descending date order, and make some edits to the CSV file to tidy the list for Excel.
I have cobbled the code below together from a number of sources (I am a PowerShell beginner) and it is now doing what I need (i.e. producing a list).
I need help in going a step further: I cannot sort the resultant CSV file by the file's last write time. Is this because my array is expecting text rather than a numeric field?
I am also returning the domain name as well as the file owner with ((Get-ACL $_.FullName).Owner). I tried using Replace to cut down the string, but had no luck with that approach.
$arr = @()
$days_to_check=$(Get-Date).AddDays(-28)
$items = @(Get-ChildItem '\\ND\dir 1\*.*' -Recurse -ErrorAction SilentlyContinue | where { $_.LastWriteTime -gt $days_to_check})
$items += @(Get-ChildItem '\\ND\dir 1\*.*' -Recurse -ErrorAction SilentlyContinue |
    where { $_.LastWriteTime -gt $days_to_check})
$items | Foreach {
    $obj = New-Object PSObject -prop $hash
    $obj | Add-Member NoteProperty FullName $_.FullName
    $obj | Add-Member NoteProperty Directory $_.Directory
    $obj | Add-Member NoteProperty Name $_.Name
    $obj | Add-Member NoteProperty LastTime $_.LastWriteTime
    $obj | Add-Member NoteProperty Owner ((Get-ACL $_.FullName).Owner)
    $arr += $obj
}
$arr | Format-List
$arr | Sort-Object -Property LastTime -Descending
$arr | Export-CSV -notypeinformation C:\temp\filenamesFO.csv
CSV file sorted by date field
You did sort your array in the output but that's all you did.
If you want to actually export it that way, you have to assign the sort to $arr
Replace
$arr | Sort-Object -Property LastTime -Descending
with
$arr = $arr | Sort-Object -Property LastTime -Descending
You can remove the Owner domain with the following replace: -replace '(.*\\)(.*)','$2'
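For example, applied to a made-up owner string, it keeps only the part after the backslash:
'CONTOSO\jsmith' -replace '(.*\\)(.*)','$2'   # returns 'jsmith'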
Here's a complete example implementing the changes mentioned above.
$arr = new-object -TypeName 'System.Collections.Generic.List[PSObject]'
$days_to_check=$(Get-Date).AddDays(-28)
$items = @(Get-ChildItem '\\ND\dir 1\*.*' -Recurse -ErrorAction SilentlyContinue | where { $_.LastWriteTime -gt $days_to_check})
$items += @(Get-ChildItem '\\ND\dir 1\*.*' -Recurse -ErrorAction SilentlyContinue |
    where { $_.LastWriteTime -gt $days_to_check})
Foreach ($item in $items) {
    $obj = [PSCustomObject]@{
        FullName  = $item.FullName
        Directory = $item.Directory
        Name      = $item.Name
        LastTime  = $item.LastWriteTime
        Owner     = (Get-ACL $item.FullName).Owner -replace '(.*\\)(.*)','$2'
    }
    $arr.add($obj)
}
$arr = $arr | Sort-Object -Property LastTime -Descending
#$arr | Format-List
$arr | Export-CSV -notypeinformation C:\temp\filenamesFO.csv
I made some additional changes:
Instead of using an array, I used a List of PSObject. If you have a lot of files, the processing time will be improved in comparison with an array.
I used the [PSCustomObject] declaration just to show an alternative to all those Add-Member calls. I find it cleaner, but it is up to you in the end.

Add column to merged CSV file

OK, here's what I have code-wise:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
(get-content $a) | set-content $b
This pulls all the data from all the files into one merged file, but I need one additional item: the name of each individual file appended as the first column of the output, for multiple files, several hundred at a time.
Not tested, but something like this should do it:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
Get-ChildItem $a | % {
    # -PassThru is required so Add-Member emits the modified rows into the pipeline.
    Import-Csv $_.Fullname | Add-Member -MemberType NoteProperty -Name 'File Name' -Value $_.Name -PassThru
} | Export-Csv $b
Assuming the CSV files each have the same column headings, I would lean toward using Import-CSV instead of Get-Content so that you can work with the CSV contents as arrays of objects with named properties.
Then all you need to do is iterate through each item of the array and append a property containing the file path, which you can do using the Add-Member cmdlet. Once that's done, export the array of objects using the Export-CSV cmdlet.
$directory = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\"
$search = $directory + "*.csv"
$exportpath = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
$paths = get-childitem $search
$objectArrays = @()
$paths | %{
    $filepath = $_.fullname;
    $objectArray = Import-CSV $filepath;
    $objectArray | %{
        Add-Member -inputobject $_ -Name "SourceFile" -Value $filepath -MemberType NoteProperty};
    $objectArrays += $objectArray}
$objectArrays | export-csv -path $exportpath -notype
This puts the SourceFile property as the last column in the outputted CSV file
OK, a simplification: search the target folder, pipe to a ForEach-Object loop (shorthand % used), capture the file name as a variable, import the CSV, add the source file using the Select-Object cmdlet, convert it back to CSV, end the loop, and pipe to the destination file.
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
GCI $a | %{$FileName=$_.Name;Import-CSV $_|Select @{l='SourceFile';e={$FileName}},*|ConvertTo-CSV -NoType} | set-content $b
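The same one-liner, expanded for readability (functionally equivalent; as in the one-liner, each input file contributes its own CSV header line to the merged output):
GCI $a | % {
    $FileName = $_.Name                                # remember which file the rows came from
    Import-CSV $_ |
        Select @{l='SourceFile';e={$FileName}}, * |    # prepend the source file name as the first column
        ConvertTo-CSV -NoType
} | Set-Content $b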

Decreased output with PowerShell multithreading than with singlethread script

I am using PowerShell 2.0 on a Windows 7 desktop. I am attempting to search the enterprise CIFS shares for keywords/regex. I already have a simple single threaded script that will do this but a single keyword takes 19-22 hours. I have created a multithreaded script, first effort at multithreading, based on the article by Surly Admin.
Can Powershell Run Commands in Parallel?
Powershell Throttle Multi thread jobs via job completion
and the links related to those posts.
I decided to use runspaces rather than background jobs, as the prevailing wisdom says this is more efficient. The problem is that I am only getting partial output with the multithreaded script I have. I am not sure if it is an I/O thing, a memory thing, or something else. Hopefully someone here can help. Here is the code.
cls
Get-Date
Remove-Item C:\Users\user\Desktop\results.txt
$Throttle = 5 #threads
$ScriptBlock = {
    Param (
        $File
    )
    $KeywordInfo = Select-String -pattern KEYWORD -AllMatches -InputObject $File
    $KeywordOut = New-Object PSObject -Property @{
        Matches = $KeywordInfo.Matches
        Path = $KeywordInfo.Path
    }
    Return $KeywordOut
}
$RunspacePool = [RunspaceFactory]::CreateRunspacePool(1, $Throttle)
$RunspacePool.Open()
$Jobs = @()
$Files = Get-ChildItem -recurse -erroraction silentlycontinue
ForEach ($File in $Files) {
    $Job = [powershell]::Create().AddScript($ScriptBlock).AddArgument($File)
    $Job.RunspacePool = $RunspacePool
    $Jobs += New-Object PSObject -Property @{
        File = $File
        Pipe = $Job
        Result = $Job.BeginInvoke()
    }
}
Write-Host "Waiting.." -NoNewline
Do {
    Write-Host "." -NoNewline
    Start-Sleep -Seconds 1
} While ( $Jobs.Result.IsCompleted -contains $false)
Write-Host "All jobs completed!"
$Results = @()
ForEach ($Job in $Jobs) {
    $Results += $Job.Pipe.EndInvoke($Job.Result)
    $Job.Pipe.EndInvoke($Job.Result) | Where {$_.Path} | Format-List | Out-File -FilePath C:\Users\user\Desktop\results.txt -Append -Encoding UTF8 -Width 512
}
Invoke-Item C:\Users\user\Desktop\results.txt
Get-Date
This is the single threaded version I am using that works, including the regex I am using for socials.
cls
Get-Date
Remove-Item C:\Users\user\Desktop\results.txt
$files = Get-ChildItem -recurse -erroraction silentlycontinue
ForEach ($file in $files) {
    Select-String -pattern '[sS][sS][nN]:*\s*\d{3}-*\d{2}-*\d{4}' -AllMatches -InputObject $file | Select-Object matches, path |
        Format-List | Out-File -FilePath C:\Users\user\Desktop\results.txt -Append -Encoding UTF8 -Width 512
}
Get-Date
Invoke-Item C:\Users\user\Desktop\results.txt
I am hoping to build this answer over time, as I don't want to over-comment. I don't know yet why you are losing data with the multithreading, but I think we can increase performance with an updated regex. For starters, you have many greedy quantifiers that I think we can shrink down.
[sS][sS][nN]:*\s*\d{3}-*\d{2}-*\d{4}
Select-String is case-insensitive by default, so you don't need the [sS][sS][nN] portion at the beginning. Do you have to check for multiple colons? :* looks for zero or many colons, and the same goes for the hyphens. Perhaps these would be better as ?, which matches zero or one.
ssn:?\s*\d{3}-?\d{2}-?\d{4}
This assumes you are looking for mostly properly formatted SSNs. If people are hiding them in text, maybe you need to look for other delimiters as well.
I would also suggest writing the results to separate files and combining them after execution, if nothing else just to test.
Hoping this will be the start of a proper solution.
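A quick way to sanity-check the tightened pattern before running it across the shares (the sample strings are made up; only the first two should match):
'SSN: 123-45-6789', 'ssn 123456789', 'account 12-345' |
    Select-String -Pattern 'ssn:?\s*\d{3}-?\d{2}-?\d{4}'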
It turns out that for some reason the Select-String cmdlet was having problems with the multithreading. I don't have enough of a developer background to tell what is happening under the hood. However, I did discover that by using the -Quiet option in Select-String, which turns its output into a Boolean, I was able to get the results I wanted.
The first pattern match in each document gives a true value. When I get a true, I return the path of the document to an array. When that is finished, I run the pattern match against the paths that were output from the script block. This is not quite as effective performance-wise as I had hoped for, but still a pretty dramatic improvement over single-threading.
The other issue I ran into was the reads/writes to disk caused by outputting results to a document at each stage. I have changed that to arrays. While still memory-intensive, it is much quicker.
Here is the resulting code. Any additional tips on performance improvement are appreciated:
cls
Remove-Item C:\Users\user\Desktop\output.txt
$Throttle = 5 #threads
$ScriptBlock = {
    Param (
        $File
    )
    $Match = Select-String -pattern 'ssn:?\s*\d{3}-?\d{2}-?\d{4}' -Quiet -InputObject $File
    if ( $Match -eq $true ) {
        $MatchObjects = Select-Object -InputObject $File
        $MatchOut = New-Object PSObject -Property @{
            Path = $MatchObjects.FullName
        }
    }
    Return $MatchOut
}
$RunspacePool = [RunspaceFactory]::CreateRunspacePool(1, $Throttle)
$RunspacePool.Open()
$Jobs = @()
$Files = Get-ChildItem -Path I:\ -recurse -erroraction silentlycontinue
ForEach ($File in $Files) {
    $Job = [powershell]::Create().AddScript($ScriptBlock).AddArgument($File)
    $Job.RunspacePool = $RunspacePool
    $Jobs += New-Object PSObject -Property @{
        File = $File
        Pipe = $Job
        Result = $Job.BeginInvoke()
    }
}
$Results = @()
ForEach ($Job in $Jobs) {
    $Results += $Job.Pipe.EndInvoke($Job.Result)
}
$PathValue = @()
ForEach ($Line in $Results) {
    $PathValue += $Line.psobject.properties | % {$_.Value}
}
$UniqValues = $PathValue | sort | Get-Unique
$Output = ForEach ( $Path in $UniqValues ) {
    Select-String -Pattern '\d{3}-?\d{2}-?\d{4}' -AllMatches -Path $Path | Select-Object -Property Matches, Path
}
$Output | Out-File -FilePath C:\Users\user\Desktop\output.txt -Append -Encoding UTF8 -Width 512
Invoke-Item C:\Users\user\Desktop\output.txt
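One further tweak worth considering (not in the script above): close the runspace pool and dispose of the pipelines once the results have been collected, so the runspaces and their handles are released promptly. A short sketch using the same variable names:
# After EndInvoke has been called for every job, release the pipelines and the pool.
ForEach ($Job in $Jobs) {
    $Job.Pipe.Dispose()
}
$RunspacePool.Close()
$RunspacePool.Dispose()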

Powershell filter a List by Name and Date

I need a bit of help... I'm new to PowerShell and I want to filter a list (CSV). I would like to remove all lines with certain names in them and cut the list down to the last month. In the script below you can see how far I have got so far.
param(
    [Parameter(ValueFromPipeline=$true,HelpMessage="Enter CSV path(s)")]
    [String[]]$Path = $null
)
if($Path -eq $null) {
    Add-Type -AssemblyName System.Windows.Forms
    $Dialog = New-Object System.Windows.Forms.OpenFileDialog
    $Dialog.InitialDirectory = "$InitialDirectory"
    $Dialog.Title = "Select CSV File(s)"
    $Dialog.Filter = "CSV File(s)|*.csv"
    $Dialog.Multiselect=$true
    $Result = $Dialog.ShowDialog()
    if($Result -eq 'OK') {
        Try {
            $Path = $Dialog.FileNames
        }
        Catch {
            $Path = $null
            Break
        }
    }
    else {
        Write-Host -ForegroundColor Yellow "Notice: No file(s) selected."
        Break
    }
}
$info=Import-Csv "$path" -Delimiter ';'
$info | Get-Member
$info | Format-Table
As you can see, I tried to link the path to a file browser.
For the purposes of discussion, I will assume that the full pathname of the CSV is in the variable $InputPath, and that you want to write the result to a CSV file whose full pathname is in the variable $OutputPath. I will also assume that the CSV file contains a column named 'Name', and that the value from the Name column that you want to exclude is in the variable $ExcludedName. Given that, you can simply do
Import-CSV -Path $InputPath | Where-Object {$_.Name -ne $ExcludedName} | Export-CSV -Path $OutputPath -NoTypeInformation
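The question also asks to cut the list down to the last month. Assuming the CSV has a parseable date column (called 'Date' here, which is a guess at the real column name), the same pipeline can carry a second condition:
# 'Date' is a hypothetical column name; adjust it to match the actual CSV header.
$Cutoff = (Get-Date).AddMonths(-1)
Import-CSV -Path $InputPath |
    Where-Object {$_.Name -ne $ExcludedName -and [datetime]$_.Date -gt $Cutoff} |
    Export-CSV -Path $OutputPath -NoTypeInformation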
You can do this with my code, but don't forget that the first row must contain the column names, the delimiter must be ';', and $nameslist is the array of names that you need to delete:
$info=Import-Csv "D:\testdir\file2.csv" -Delimiter ';'
$nameslist=@('James','John','andrew')
foreach($i in $info){
    if($nameslist -contains $i.Name){
        $i.Name=""
    }
    $i|Export-Csv -Path "D:\testdir\file1.csv" -Delimiter ';' -NoTypeInformation -Force -Encoding UTF8 -Append
}
Try this:
$data = Import-Csv "Path" | Select-Object * -ExcludeProperty Names
$data | export-csv "Path" -Notype
This will drop the Names column.
Try it first without using a function:
Import-Csv <Filename> | Where-Object {$_.<FieldName> -notlike "*<Value>*"}
Also, you might consider something like this:
[CmdletBinding()]
param (
    [Parameter(ValueFromPipeline = $true, HelpMessage = "Enter CSV path(s)")]
    [String[]]$Path = $(
        Add-Type -AssemblyName System.Windows.Forms
        $DialogProperties = @{
            Title       = 'Select CSV File(s)'
            Filter      = 'CSV File(s)|*.csv'
            Multiselect = $True
        }
        $Dialog = New-Object System.Windows.Forms.OpenFileDialog -Property $DialogProperties
        $Result = $Dialog.ShowDialog()   # capture the dialog result so it can be tested
        If ($Result -eq 'OK') {
            $Dialog.FileNames            # emit the selection as the parameter's default value
        } Else {
            Write-Error 'Notice: No file(s) selected.'
        }
    )
)
Process {
    ForEach ($PathItem in $Path) {
        Import-Csv $PathItem | Where-Object { $_.Name -notlike "*NotThisOne*" }
    }
}
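A hypothetical way to run it, assuming the script above is saved as Filter-CsvByName.ps1 (the name and paths are placeholders):
# Filter a specific file and write the result back out:
.\Filter-CsvByName.ps1 -Path 'C:\Data\list.csv' | Export-Csv 'C:\Data\filtered.csv' -NoTypeInformation
# Or run it with no -Path and pick the input file(s) in the dialog:
.\Filter-CsvByName.ps1 | Export-Csv 'C:\Data\filtered.csv' -NoTypeInformation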
