PowerShell: filter a list by name and date - Excel

I need a bit of help... I'm new to PowerShell and I want to filter a list (CSV). I would like to remove all lines containing certain names, and cut the list down to the last month. The script below shows how far I have got so far.
param(
    [Parameter(ValueFromPipeline=$true,HelpMessage="Enter CSV path(s)")]
    [String[]]$Path = $null
)
if($Path -eq $null) {
    Add-Type -AssemblyName System.Windows.Forms
    $Dialog = New-Object System.Windows.Forms.OpenFileDialog
    $Dialog.InitialDirectory = "$InitialDirectory"
    $Dialog.Title = "Select CSV File(s)"
    $Dialog.Filter = "CSV File(s)|*.csv"
    $Dialog.Multiselect = $true
    $Result = $Dialog.ShowDialog()
    if($Result -eq 'OK') {
        Try {
            $Path = $Dialog.FileNames
        }
        Catch {
            $Path = $null
            Break
        }
    }
    else {
        Write-Host -ForegroundColor Yellow "Notice: No file(s) selected."
        Break
    }
}
$info = Import-Csv "$Path" -Delimiter ';'
$info | Get-Member
$info | Format-Table
As you can see, I tried to link the path to a file browser.

For the purposes of discussion, I will assume that the full pathname of the CSV is in the variable $InputPath, and that you want to write the result to a CSV file whose full pathname is in the variable $OutputPath. I will also assume that the CSV file contains a column named 'Name', and that the value from the Name column that you want to exclude is in the variable $ExcludedName. Given that, you can simply do
Import-CSV -Path $InputPath | Where-Object {$_.Name -ne $ExcludedName} | Export-CSV -Path $OutputPath -NoTypeInformation
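If you also need to cut the list down to the last month, a minimal sketch (assuming the CSV has a date column named 'Date' in a format [datetime] can parse, and using the semicolon delimiter from the question; both are assumptions) would be:
# hypothetical column names 'Name' and 'Date'; adjust to the real CSV headers
$cutoff = (Get-Date).AddMonths(-1)
Import-Csv -Path $InputPath -Delimiter ';' |
    Where-Object { $_.Name -ne $ExcludedName -and [datetime]$_.Date -ge $cutoff } |
    Export-Csv -Path $OutputPath -Delimiter ';' -NoTypeInformation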

You can do this with my code, but don't forget that the first row must contain the column names, the delimiter must be ';', and $nameslist is the array of names you want to remove:
$info = Import-Csv "D:\testdir\file2.csv" -Delimiter ';'
$nameslist = @('James','John','andrew')
foreach ($i in $info) {
    if ($nameslist -contains $i.Name) {
        $i.Name = ""
    }
    $i | Export-Csv -Path "D:\testdir\file1.csv" -Delimiter ';' -NoTypeInformation -Force -Encoding UTF8 -Append
}

Try this:
$data = Import-Csv "Path" | Select-Object * -ExcludeProperty Names
$data | export-csv "Path" -Notype
This will remove the Names column.

Try it first without using a function:
Import-Csv <Filename> | Where-Object {$_.<FieldName> -notlike "*<Value>*"}
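For example, with the question's semicolon-delimited file, and assuming a column named Name and an unwanted value of Smith (both hypothetical), that might look like:
Import-Csv C:\temp\list.csv -Delimiter ';' | Where-Object {$_.Name -notlike "*Smith*"}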
Also, you might consider something like this:
[CmdletBinding()]
param (
    [Parameter(ValueFromPipeline = $true, HelpMessage = "Enter CSV path(s)")]
    [String[]]$Path = $(
        Add-Type -AssemblyName System.Windows.Forms
        $DialogProperties = @{
            Title       = 'Select CSV File(s)'
            Filter      = 'CSV File(s)|*.csv'
            Multiselect = $True
        }
        $Dialog = New-Object System.Windows.Forms.OpenFileDialog -Property $DialogProperties
        $Result = $Dialog.ShowDialog()
        If ($Result -eq 'OK') {
            $Dialog.FileNames
        } Else {
            Write-Error 'Notice: No file(s) selected.'
        }
    )
)
Process {
    ForEach ($PathItem in $Path) {
        Import-Csv $PathItem | Where-Object { $_.Name -notlike "*NotThisOne*" }
    }
}

Related

PowerShell script to search a directory of Excel files for a string only searches through one file

I found this script on https://shuaiber.medium.com/
I want to use it to find a certain string in a folder full of Excel files.
The problem I am encountering is that it basically only searches through one file and then stops.
Here is the script:
Function Search-Excel {
[cmdletbinding()]
Param (
[parameter(Mandatory, ValueFromPipeline)]
[ValidateScript({
Try {
If (Test-Path -Path $_) {$True}
Else {Throw "$($_) is not a valid path!"}
}
Catch {
Throw $_
}
})]
[string]$Source,
[parameter(Mandatory)]
[string]$SearchText
#You can specify wildcard characters (*, ?)
)
$Excel = New-Object -ComObject Excel.Application
Try {
$Source = Convert-Path $Source
}
Catch {
Write-Warning "Unable locate full path of $($Source)"
BREAK
}
$Workbook = $Excel.Workbooks.Open($Source)
ForEach ($Worksheet in @($Workbook.Sheets)) {
$Found = $WorkSheet.Cells.Find($SearchText) #What
If ($Found) {
$BeginAddress = $Found.Address(0,0,1,1)
#Initial Found Cell
[pscustomobject]@{
WorkSheet = $Worksheet.Name
Column = $Found.Column
Row =$Found.Row
Text = $Found.Text
Address = $BeginAddress
}
Do {
$Found = $WorkSheet.Cells.FindNext($Found)
$Address = $Found.Address(0,0,1,1)
If ($Address -eq $BeginAddress) {
BREAK
}
[pscustomobject]@{
WorkSheet = $Worksheet.Name
Column = $Found.Column
Row =$Found.Row
Text = $Found.Text
Address = $Address
}
} Until ($False)
}
Else {
Write-Warning "[$($WorkSheet.Name)] Nothing Found!"
}
}
$workbook.close($false)
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$excel)
[gc]::Collect()
[gc]::WaitForPendingFinalizers()
Remove-Variable excel -ErrorAction SilentlyContinue
}
And then I would use
Get-ChildItem -Path "C:\excelfiles" -Recurse -Include *.xls, *.xlsx, *.xlsm | Select-Object -Property Directory, Name | ForEach-Object { "{0}{1}" -f $_.Directory, $_.Name } | Search-Excel -SearchText MyText
I know it's only searching through one file because I looked at another file and tried to get it to send the result back to confirm, yet it doesn't work.
Any help would be greatly appreciated.
You're going to need to include a loop within your function, or put the function within a ForEach-Object loop. For the function change you could do:
Function Search-Excel {
[cmdletbinding()]
Param (
[parameter(Mandatory, ValueFromPipeline)]
[ValidateScript({
Try {
If (Test-Path -Path $_) {$True}
Else {Throw "$($_) is not a valid path!"}
}
Catch {
Throw $_
}
})]
[string]$Source,
[parameter(Mandatory)]
[string]$SearchText
#You can specify wildcard characters (*, ?)
)
$Excel = New-Object -ComObject Excel.Application
ForEach($Path in $Source){
### <the rest of your existing code here, adjusted to work with $Path instead of $Source>
}
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$excel)
[gc]::Collect()
[gc]::WaitForPendingFinalizers()
Remove-Variable excel -ErrorAction SilentlyContinue
}
Or, to work with your existing function you could just do:
Get-ChildItem -Path "C:\excelfiles" -Recurse -Include *.xls, *.xlsx, *.xlsm |
ForEach-Object { Search-Excel -Source $_.FullName -SearchText MyText }
So let's talk about a couple of things with your command.
Get-ChildItem -Path "C:\excelfiles" -Recurse -Include *.xls, *.xlsx, *.xlsm | Select-Object -Property Directory, Name | ForEach-Object { "{0}{1}" -f $_.Directory, $_.Name } | Search-Excel -SearchText MyText
If we break this up into the individual pieces (separated by pipes), we might be able to figure out what's wrong. The first part, Get-ChildItem, is only looking for files matching *.xls, *.xlsx, and *.xlsm, so directories are already excluded. What exactly does this cmdlet return? If you look at the object types, you'll see System.IO.FileInfo, which has a number of built-in properties, one of which is the full file path you need: .FullName.
Currently, Search-Excel is set up to search only one file. If you want to search multiple files at once, you'll need a loop somewhere. In my opinion, the easiest place to do that is outside the function, like this:
Get-ChildItem -Path "C:\excelfiles" -Recurse -Include *.xls, *.xlsx, *.xlsm | Foreach-Object { Search-Excel -Source $_.FullName -SearchText MyText }
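If you would rather keep piping paths straight into Search-Excel, another option (a sketch only; the per-file search body is elided and the validation attribute dropped for brevity) is to give the function a Process block so each piped path is handled as it arrives:
Function Search-Excel {
    [cmdletbinding()]
    Param (
        [parameter(Mandatory, ValueFromPipeline)]
        [string]$Source,
        [parameter(Mandatory)]
        [string]$SearchText
    )
    Begin { $Excel = New-Object -ComObject Excel.Application }
    Process {
        # <your existing per-file search code here, working against $Source>
    }
    End {
        [void][System.Runtime.InteropServices.Marshal]::ReleaseComObject([System.__ComObject]$Excel)
        [gc]::Collect()
        [gc]::WaitForPendingFinalizers()
    }
}
Get-ChildItem -Path "C:\excelfiles" -Recurse -Include *.xls, *.xlsx, *.xlsm | Select-Object -ExpandProperty FullName | Search-Excel -SearchText MyText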

In Azure, load balancer details into a CSV file using a PowerShell script in an Azure DevOps pipeline

The CSV file should contain ResourceGroupName, FrontendIpConfigurationsName, FrontIPAddress and VMstatus. I have written the code below. I am getting the details in the CSV file, but the issue is that frontend IP configuration names starting with the same prefix end up in the same row, like ersfrontend-A1,ersfrontend-B2,ersfrontend-B3,ersfrontend-D1, and the IPs for those names end up in different columns but in the same row. I want them to come out on different rows. Please suggest.
$report = @()
$LBlist = Get-AzLoadBalancer | Where-Object { $_.ResourceGroupName -eq '*$(grp-wildcard)' } | Select-Object
$VM = Get-AzVm -Status | Where-Object { $_.ResourceGroupName -eq '*$(grp-wildcard)' } | Select-Object
$power = $VM.PowerState
foreach ($LB in $LBlist) {
    $Array = "" | Select-Object ResourceGroupName, FrontendIpConfigurationsName, FrontIPAddress, VMstatus
    $Array.ResourceGroupName = $LB.ResourceGroupName
    $Array.FrontendIpConfigurationsName = ($LB.FrontendIpConfigurationsName.name -join ',')
    $Array.FrontendIpAddress = ($LB.FrontendIpConfigurationsName.privateIpAddress -join ',')
    $Array.VMstate = $power
}
$report += $Array
$report |Format-Table ResourceGroupName,FrontendIpConfigurationsName,FrontIPAddress,VMstatus
$report | Export-Csv -NTI -path "$(BuildArtifactStagingDirectory)/LBlist.csv"
In order to create a proper CSV file with headers and rows of data, you need to collect an array of Objects and send that to the Export-Csv cmdlet.
For example
$report = @()
foreach ($LB in $LBlist) {
    $obj = [PsCustomObject]@{
        'ResourceGroupName'            = $LB.ResourceGroupName
        'FrontendIpConfigurationsName' = ($LB.FrontendIpConfigurationsName.name -join ',')
        'FrontendIpAddress'            = ($LB.FrontendIpConfigurationsName.privateIpAddress -join ',')
        'VMstate'                      = $power
    }
    $report += $obj
}
$report | Format-Table ResourceGroupName, FrontendIpConfigurationsName, FrontIPAddress, VMstatus
$report | Export-Csv -Path "$(BuildArtifactStagingDirectory)/LBlist.csv" -NoTypeInformation
Hope that helps
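To get each frontend configuration on its own row, which is what the question asks for, a rough sketch is to loop over the frontend configurations inside the load balancer loop and emit one object per configuration. This assumes the Az module exposes them through the FrontendIpConfigurations property, each entry carrying Name and PrivateIpAddress:
$report = foreach ($LB in $LBlist) {
    foreach ($fe in $LB.FrontendIpConfigurations) {
        [PsCustomObject]@{
            ResourceGroupName            = $LB.ResourceGroupName
            FrontendIpConfigurationsName = $fe.Name
            FrontendIpAddress            = $fe.PrivateIpAddress
            VMstate                      = $power
        }
    }
}
$report | Export-Csv -Path "$(BuildArtifactStagingDirectory)/LBlist.csv" -NoTypeInformation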

Remove matching collection object from text file

I have a list of users that I am storing in a text file. I am trying to update the text file so it removes any user that matches the $NotExpiring variable, which is a collection. I just can't figure out how to update the text file properly if more than one user needs to be removed.
Below is the full function. You can ignore most of it; just look under #STUCK HERE to get to the point.
function Get-NotExpiring{
$NotExpiring=New-Object System.Collections.Generic.List[System.Object]
$MatchedUser=New-Object System.Collections.Generic.List[System.Object]
$textfiles = Get-ChildItem $email_dir
#Day of Span
$Days="20"
#Settings
$Date=Get-Date ((Get-Date).adddays($Days))
$Users=Get-ADUser -filter {(Enabled -eq $True) -and (PasswordNeverExpires -eq $False)} -Properties SamAccountName, DisplayName, msDS-UserPasswordExpiryTimeComputed, Mail | Where-Object { $_.DisplayName -ne $null -and ($_."msDS-UserPasswordExpiryTimeComputed" -gt ($NotExpDate.ToFileTime()))} | Select SamAccountName, Mail, DisplayName, @{Name="ExpiryDate";Expression={([datetime]::fromfiletime($_."msDS-UserPasswordExpiryTimeComputed")).DateTime}}
#Magic
foreach ($Entry in $Users) {
$EntryDate = Get-date($Entry.ExpiryDate)
if ($EntryDate -gt $Date){
$Account = $Entry.SamAccountName
$ExpDate = $Entry.ExpiryDate
$NotExpiring.add($Account)
}
}
#STUCK HERE
foreach($file in $textfiles){
foreach ($user in $NotExpiring){
if((Get-Content "$email_dir\$file") -contains $user){
$temp_get = Get-Content $email_dir\$file | where {$_ -notmatch $user}
}}}
$temp_get}
I tried the below, but it doesn't seem to work if more than one user in $NotExpiring is also in the existing text file. Any help would be appreciated. I know this is a simple fix but I can't seem to figure it out.
Get-Content $email_dir\$file | where {$_ -notmatch $user} | Set-Content <path>.txt
I was able to achieve exactly what I needed using the following solution.
foreach($file in $textfiles){ foreach ($user in $NotExpiring){
if((Get-Content "$email_dir\$file") -contains $user){
$MatchedUser.add($user)
}}
Get-Content "$email_dir\$file" | Where {$MatchedUser -NotContains $_ } | Set Content "$temp_dir\$file"
Copy-Item -path "$temp_dir\$file" -Destination "$email_dir\$file" -ErrorAction SilentlyContinue }
Basically, you are trying to match two arrays.
With Where-Object you process one object at a time, so you have to match the single object $_ against the array $user.
Use:
...| where {$_ -notin $user}
or
...| where {$user -notcontains $_}
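Putting that together, a minimal sketch for the #STUCK HERE part (assuming $textfiles, $email_dir, $temp_dir and $NotExpiring are populated as in the original function) filters each file against the whole collection in one pass:
foreach ($file in $textfiles) {
    # keep only the lines that are not in the $NotExpiring collection
    Get-Content "$email_dir\$file" |
        Where-Object { $NotExpiring -notcontains $_ } |
        Set-Content "$temp_dir\$file"
    Copy-Item -Path "$temp_dir\$file" -Destination "$email_dir\$file" -Force
}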

How to read NTFS permissions in a list of shares from a text file

I am trying to make this easier to use on file servers. I would like to import a file called shares.txt. In shares.txt there are lines like the following, e.g. folder name → root folder and share name → subfolder.
\\10.10.15.240\folder name\share name
\\10.10.15.240\folder name\share2 name
\\10.10.15.240\folder name\share3 name
\\10.10.15.240\folder2 name\share name
\\10.10.15.240\folder2 name\share2 name
\\10.10.15.240\folder2 name\share3 name
and so on.
And I would like to create the output to:
Outputs the data to a text file on \\10.10.15.240\output\ using the folder name and share name as part of the file name
\\10.10.15.240\output\Company_name_$folder name_$share name.csv
\\10.10.15.240\output\Company_name_$folder name_$share2 name.csv
\\10.10.15.240\output\Company_name_$folder name_$share3 name.csv
\\10.10.15.240\output\Company_name_$folder2 name_$share name.csv
\\10.10.15.240\output\Company_name_$folder2 name_$share2 name.csv
\\10.10.15.240\output\Company_name_$folder2 name_$share3 name.csv
and continue.
Do you know how to import this in the script? I tried several things but they all come up with errors.
Script:
I want to write the permissions list to a CSV file instead of writing it directly to Excel.
$ErrorActionPreference = "SilentlyContinue"
$a = New-Object -ComObject Excel.Application
$a.visible = $true
$b = $a.Workbooks.Add()
$intRow = 1
$c = $b.Worksheets.Item(1)
$c.Cells.Item($intRow,1) = "Folder"
$c.Cells.Item($intRow,2) = "Compte/groupe"
$c.Cells.Item($intRow,3) = "Type d'Acces"
$c.Cells.Item($intRow,4) = "Droits"
$d = $c.UsedRange
$d.EntireColumn.AutoFit()|Out-Null
$d.Interior.ColorIndex = 19
$d.Font.ColorIndex = 11
$d.Font.Bold = $true
Remove-Variable arrayOfPath
$depth = 2
$RootFolder = "\\MySRV\Folder"
for ($i=0; $i -le $depth; $i++) {
$arrayOfPath += ,$RootFolder
$RootFolder = $RootFolder + "\*"
}
$arrayOfPath | Get-ChildItem | %{
Get-Acl $_.FullName
} | %{
$intRow = $intRow + 1
$c.Cells.Item($intRow, 1) = $_.Path.ToString().Replace("Microsoft.PowerShell.Core\FileSystem::", "")
$droit = $_.Access
$droit | %{
$c.Cells.Item($intRow, 2) = $_.IdentityReference.ToString();
$c.Cells.Item($intRow, 3) = $_.AccessControlType.ToString();
$c.Cells.Item($intRow, 4) = $_.FileSystemRights.ToString();
$intRow = $intRow+1
}
}
$d.EntireColumn.AutoFit() | Out-Null
Last Update:
$arrayOfPaths= Get-Content "\\UNC PATH\myfile.txt"
Get-ChildItem $arrayOfPaths -Recurse | Where-Object {$_.mode -match "d"} | ForEach-Object {
$csv = $_.FullName -replace '^\\\\[^\\]+\\([^\\]+)\\(.*)', '\\UNC PATH\Company_name_${1}_${2}.csv' # <- construct CSV path here
$path = $_.FullName
Get-Acl $path | Select-Object -Expand Access |
Select-Object @{n='Path';e={$path}}, IdentityReference, AccessControlType,
FileSystemRights |
Export-Csv $csv -Append -NoType
}
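To illustrate what that -replace does, here is one of the two-level sample paths from the question run through the same pattern (the 'UNC PATH' output prefix is the placeholder from the snippet above):
'\\10.10.15.240\folder name\share name' -replace '^\\\\[^\\]+\\([^\\]+)\\(.*)', '\\UNC PATH\Company_name_${1}_${2}.csv'
# -> \\UNC PATH\Company_name_folder name_share name.csv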
Basically you'd do something like this:
loop over the paths
get each folder's ACL
expand the ACEs
select the ACE properties you want exported
add the path with a calculated property
export the selected properties to a CSV
Appending to the output CSV(s) inside the loop allows you to control to which file each ACL is exported. You can for instance make the file name depend on elements of the source path, or the number of elements already written (if you add a counter variable).
Get-ChildItem $arrayOfPaths | ForEach-Object {
$csv = "..." # <- construct CSV path here
$path = $_.FullName
Get-Acl $path | Select-Object -Expand Access |
Select-Object @{n='Path';e={$path}}, IdentityReference, AccessControlType,
FileSystemRights |
Export-Csv $csv -Append -NoType
}
Note that -Append was added to Export-Csv in PowerShell v3. On earlier versions you can sort of emulate it with ConvertTo-Csv and Add-Content.
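A rough sketch of that emulation (assuming the selected ACE objects are in $rows and the target path in $csv; both names are placeholders):
if (-not (Test-Path $csv)) {
    # write the header line once, when the file does not exist yet
    $rows | ConvertTo-Csv -NoTypeInformation | Select-Object -First 1 | Set-Content $csv
}
# append only the data lines, skipping the repeated header
$rows | ConvertTo-Csv -NoTypeInformation | Select-Object -Skip 1 | Add-Content $csv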

Add column to merged CSV file

OK, here's what I have code-wise:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
(get-content $a) | set-content $b
This pulls the data from all the files into one merged file, but I need one additional item: I need to pull the name of each individual file and append it as the first column, for several hundred files at a time.
Not tested but something like this should do it:
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
Get-ChildItem $a | % {
    # -PassThru makes Add-Member emit the modified objects so they reach Export-Csv
    Import-Csv $_.FullName | Add-Member -MemberType NoteProperty -Name 'File Name' -Value $_.Name -PassThru
} | Export-Csv $b
Assuming the CSV files each have the same column headings, I would lean toward using Import-CSV instead of Get-Content so that you can work with the CSV contents as arrays of objects with named properties.
Then all you need to do is iterate through each item of the array and append a property containing the file path, which you can do using the Add-Member cmdlet. Once that's done, export the array of objects using the Export-CSV cmdlet.
$directory = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\"
$search = $directory + "*.csv"
$exportpath = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
$paths = get-childitem $search
$objectArrays = @()
$paths | %{
$filepath = $_.fullname;
$objectArray = Import-CSV $filepath;
$objectArray | %{
Add-Member -inputobject $_ -Name "SourceFile" -Value $filepath -MemberType NoteProperty};
$objectArrays += $objectArray}
$objectArrays | export-csv -path $exportpath -notype
This puts the SourceFile property as the last column in the outputted CSV file
OK, simplification... Search the target folder, pipe to a ForEach-Object loop (shorthand % used), capture the file name in a variable, import the CSV, add the source file using the Select-Object cmdlet, convert it back to CSV, end the loop, and pipe to the destination file.
$a = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\*.csv"
$b = "C:\Users\some.deranged.character\Desktop\SomeAwfulPlace\Checklists\C_F\merge.csv"
GCI $a | %{$FileName=$_.Name;Import-CSV $_|Select @{l='SourceFile';e={$FileName}},*|ConvertTo-CSV -NoType} | set-content $b
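Note that because ConvertTo-CSV runs once per file, each file contributes its own header row to merge.csv. A variant that writes the header only once (assuming every file has the same columns) exports a single time at the end of the pipeline:
GCI $a | % { $FileName = $_.Name; Import-CSV $_.FullName | Select @{l='SourceFile';e={$FileName}},* } | Export-CSV $b -NoTypeInformation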
