PowerShell: comparing data in a CSV against files in a folder - Excel

I'm fairly new to PowerShell.
I'm trying to compare data in a CSV File against random files in a specific folder.
I want to see if and what has changed and then log that in another column called "Changed".
Here's what I've done below; it seems to create a new column called 'Changed' but doesn't put the changes in it.
$Spreadsheet = 'C:\Powershell\CSV\inv.csv'
$SpreadSheetPath = "C:\Powershell\CSV"
Import-Csv $Spreadsheet -Delimiter "|" -Encoding Default | ForEach-Object -
{
$Path += $_.Path
$Filename += $_.Filename
$DateModified += $_.DateModified
$FileSize += $_.FileSize
$MD5Hash += $_.MD5Hash
}
{
$Msg1 = "Path changed"
$Msg2 = "File Name changed"
$Msg3 = "Date Modified changed"
$Msg4 = "File Size changed"
$Msg5 = "MD5 changed"
$Msg6 = "Files are the same"
$psdata = "D:\ps-test\data\*.*"
}
If (($Path -eq $psdata))
{
Import-Csv C:\Powershell\CSV\inv.csv |
Select-Object *,@{Name='Changed';Expression={$Msg6}} |
Export-Csv C:\Powershell\CSV\NewSpreadsheet4.csv
}
Else
{
Import-Csv C:\Powershell\CSV\inv.csv |
Select-Object *,@{Name='Changed';Expression={$Msg1}} |
Export-Csv C:\Powershell\CSV\NewSpreadsheet4.csv
}
Here is an example of what the CSV looks like:
Path Filename Date Modified File Size MD5 Hash
D:\ps-test\data adminmodeinfo.htm 03/11/2010 22:42 1079 BD1C9468D71FD33BB35716630C4EC6AC
E:\ps-test\data admintoolinfo.htm 03/11/2010 22:42 868 24B99B6316F0C49C23F27FEA6FF1C6AC
E:\ps-test\data admin_ban.bmp 03/11/2010 22:42 63480 C856F1F3C58962B456E749F2EA9C933A
E:\ps-test\data baseline.dat 03/20/2010 03:18:33 173818 F13183D88AABD1A725437802F8551A06
E:\ps-test\data blueRule.gif 03/11/2010 22:42 815 D1AEFE884935095DAB42DAFD072AA46F
E:\ps-test\data deffactory.dat 03/20/2010 03:18:33 706 862D4DFD2F49021BB7C145BDAFE62F6F
E:\ps-test\data dividerArt.jpg 03/11/2010 22:42 367 F7050C596C097C0B01A443058CD15E35

There are many issues with your code. I will try to highlight a few of them, link to documentation, and point you in the right direction so that you can resolve the issues yourself. A proper solution would require gathering many more requirements, or writing the code for you (off-topic for Stack Overflow).
Change
| ForEach-Object -
{
to
| ForEach-Object {
In the Foreach-Object, you are concatenating values from each line because you are using +=.
On the first run, $Path contains D:\ps-test\data.
After the second run, it contains D:\ps-test\dataE:\ps-test\data.
At the end of your test data, it contains D:\ps-test\dataE:\ps-test\dataE:\ps-test\dataE:\ps-test\dataE:\ps-test\dataE:\ps-test\dataE:\ps-test\data
The messages are contained in a script block, but it does not look like this is intentional as this is never executed. So after the scriptblock, the variable $Msg1 has not been created; it's blank.
If (($Path -eq $psdata))
The double brackets are not required.
It will always be false because the variable $psdata does not exist; it was only assigned inside a script block that is never executed.
It will always be false because you are comparing literal strings; your input does not literally contain "D:\ps-test\data\*.*". You probably want -like instead of -eq (see the example below).
Even if the paths were compared, the result would be inaccurate because there is no check that the file actually exists on the system.
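For example, a quick illustration of the difference between the two operators:
$path = 'D:\ps-test\data\adminmodeinfo.htm'
$path -eq 'D:\ps-test\data\*.*'      # False: -eq compares the literal characters
$path -like 'D:\ps-test\data\*.*'    # True: -like treats * and ? as wildcards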
Useful links
Test-Path to check if file exists.
Get-FileHash to get the MD5 hash of a file and compare it to the stored value (see the sketch after this list).
Get-ChildItem to get a list of directories/files in a directory.
Write-Output so that you can print variables and make sure they contain what you expect.
about_comparison_operators - -in and -contains will help you.
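Putting a couple of those links together, here is a rough, untested sketch of checking a single CSV row against the file on disk (I'm assuming the column names from your ForEach-Object block, and Get-FileHash requires PowerShell 4.0 or later):
$row = Import-Csv 'C:\Powershell\CSV\inv.csv' -Delimiter '|' | Select-Object -First 1
$fullPath = Join-Path $row.Path $row.Filename
if (Test-Path $fullPath) {
# compare the stored MD5 against the file's current hash
$currentHash = (Get-FileHash -Path $fullPath -Algorithm MD5).Hash
if ($currentHash -ne $row.MD5Hash) { 'MD5 changed' } else { 'Files are the same' }
}
else {
'File does not exist'
}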

This is a suggestion to help you get started. It's not complete and not tested! Let me know if it works as expected and if you have any questions.
Import-Csv 'C:\Powershell\CSV\inv.csv' -Delimiter "|" -Encoding Default | foreach {
# plain assignment (=) rather than += so each row is handled on its own
$Path = $_.Path
$Filename = $_.Filename
$DateModified = $_.DateModified
$FileSize = $_.FileSize
$MD5Hash = $_.MD5Hash
$file = [System.IO.FileInfo](Join-Path $Path $Filename)
if (-not $file.Exists) {
$message = "File does not exist"
}
elseif ($file.LastWriteTime -ne [DateTime]$DateModified) {
$message = "Dates differ"
}
elseif ($file.Length -ne [int]$FileSize) {
$message = "Sizes differ"
}
# and so on...
# (You cannot really compare a changed file name, btw)
New-Object -Type PSObject -Prop @{
Path = $Path
Filename = $Filename
DateModified = $DateModified
FileSize = $FileSize
MD5Hash = $MD5Hash
Message = $message
}
} | Export-CSV 'C:\Powershell\CSV\NewSpreadsheet4.csv'

Related

How to modify excel data and export to text file using PowerShell script?

First time poster here. Apologies if I am not following best practices for posting this question.
I am very new to scripting and PowerShell.
Problem:
I have data in an Excel sheet in this format.
Excel Data Image Link
I want to modify this data and export it into a text file, in this format.
Required Output Image Link
So far I have tried to modify the Excel data by accessing each cell, using code similar to that shown below.
for (($i = 1); $i -lt 4; $i++)
{
$column=$ExcelWorkSheet.Columns.Item(1).Rows.Item($i).Text
$dataType=$ExcelWorkSheet.Columns.Item(2).Rows.Item($i).Text
$c1=("`"" + "$column" + "`""+":")
$c2=("`"" + "$dataType" + "`"" + ",")
$ExcelWorkSheet.Columns.Item(1).Rows.Item($i).Value=$c1
$ExcelWorkSheet.Columns.Item(2).Rows.Item($i).Value=$c2
}
I am still not sure if this is the correct way to go.
What would be the best way to solve this?
Just want to understand what I should do to solve this problem. I am not looking for the exact code.
Step by step instructions or some resources would be helpful.
Thanks!
This might help... maybe...
# Import Stuff
$Data = Import-Csv -Path .\Desktop\data.csv
# New Array
$Output = @()
# Run through Unique Owners
foreach ($Owner in ($Data | Select-Object OWNER -Unique)) {
$Lines = $Data | Where-Object {$_.OWNER -eq $Owner.OWNER}
# Lazy way to do a bit of checking, if same then use it or Break
if ($Lines[0].TABLE_NAME -eq $Lines[1].TABLE_NAME) {
$Out_TableName = $Lines[0].TABLE_NAME
# ID and NAME data
$Out_ID = $Lines | Where-Object {$_.COLUMN_NAME -eq "ID"} | Select-Object COLUMN_NAME, DATA_TYPE, DATA_LENGTH
$Out_NAME = $Lines | Where-Object {$_.COLUMN_NAME -eq "NAME"} | Select-Object COLUMN_NAME, DATA_TYPE, DATA_LENGTH
} else {
# Show the user that something is wrong with this owner's data
Write-Host "Problem with Owner ""$($Owner.OWNER)"" Data?!" -ForegroundColor Red
Break
}
# Output into the array in format
$Output += #"
"$($Owner.OWNER).$($Out_TableName)":{
"$($Out_ID.COLUMN_NAME)": "$($Out_ID.DATA_TYPE) ($($Out_ID.DATA_LENGTH))",
"$($Out_NAME.COLUMN_NAME)": "$($Out_NAME.DATA_TYPE) ($($Out_NAME.DATA_LENGTH))"
}
"#
}
# Put Output in a text file
$Output | Set-Content .\Desktop\output.txt -Force
I should add that I had your data in a CSV like this...
OWNER,TABLE_NAME,COLUMN_NAME,DATA_TYPE,DATA_LENGTH
A,Employee,ID,NUMBER,22
A,Employee,NAME,VARCHAR2,22
B,Department,ID,NUMBER,23
B,Department,NAME,VARCHAR2,24
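For reference, with that sample CSV the script above should write an output.txt roughly like this (each here-string becomes one entry in $Output):
"A.Employee":{
"ID": "NUMBER (22)",
"NAME": "VARCHAR2 (22)"
}
"B.Department":{
"ID": "NUMBER (23)",
"NAME": "VARCHAR2 (24)"
}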

Powershell script to extract data from multiple text files into an excel spreadsheet

I'm pretty new to PS and been struggling for a few days.
I have multiple text files in a folder with specific data that I would like to extract into an excel spreadsheet.
Each file looks like this:
Client n° : xxx Client name : xxx
Computer status
pc group 1 :
n°1 OK n°2 Disconnected n°3 Unresponsive
n°4 Unreachable host n°5 Unresponsive
Data read 11/11/20 12:50:07
Version: x.x.x
I would like to have an output file that looks like this:
Client name and n° OK Disconnected Unresponsive Unreachable host version
xxx/xxx 1 1 2 1 x.x.x
For the status columns, it's the number of PCs with each status that I would like to display, not the PC n°.
At the moment I'm working with multiple .bat files that search for each status and output one file per status:
find /c "Disconnected" *.* > disconnected.txt
find /c "Unresponsive" *.* > unresponsive.txt
Then I sort every single output in Excel, which takes me too much time, so I was wondering if it's possible to automate this task with a script.
I really don't have any knowledge of PS, only basic batch commands.
Let's assume your files are all in one folder and all of them have the .txt extension.
Then you need to loop through these files and parse the data you need from them:
# create a Hashtable to add the different status values in
$status = @{'OK' = 0; 'Disconnected'= 0; 'Unresponsive' = 0; 'Unreachable host'= 0}
# loop through the files in your path and parse the information out
$result = Get-ChildItem -Path 'D:\Test' -Filter '*.txt' -File | ForEach-Object {
switch -Regex -File $_.FullName {
'^Client n°\s*:\s*([^\s]+)\s+Client name\s*:\s*(.+)$' {
# start collecting data for this client
$client = '{0}/{1}' -f $matches[2], $matches[1]
# reset the Hashtable to keep track of the status values
$status = @{'OK' = 0; 'Disconnected'= 0; 'Unresponsive' = 0; 'Unreachable host'= 0 }
}
'^n°\d+' {
# increment the various statuses in the Hashtable (the n°<digit> markers act as separators)
($_ -split 'n°\d+').Trim() | Where-Object { $_ } | ForEach-Object { $status[$_]++ }
}
'^Version:\s(.+)$' {
$version = $matches[1]
# since this is the last line for this client, output the collected data as object
[PsCustomObject]@{
'Client name and n°' = $client
'OK' = $status['OK']
'Disconnected' = $status['Disconnected']
'Unresponsive' = $status['Unresponsive']
'Unreachable host' = $status['Unreachable host']
'Version' = $version
}
}
}
}
# output on screen
$result | Format-Table -AutoSize
# output to CSV file
$result | Export-Csv -Path 'D:\Test\clientdata.csv' -UseCulture -NoTypeInformation
Result on screen:
Client name and n° OK Disconnected Unresponsive Unreachable host Version
------------------ -- ------------ ------------ ---------------- -------
xxx/xxx 1 1 2 1 x.x.x
I used this as an exercise to test my abilities. I created three of the same files, with different data, and tested this script. As long as they are text files in the directory, the script will iterate through each file and pull the data from each as you stated it needs to. If a stray text file gets added, the script does not know nor care and will treat it like the others. If there is data it can find, it will, and it will output that data to the Excel file. Lastly, the file is set to save itself and then immediately close.
It starts by creating the Excel file, then the workbook. (I commented out the naming of the worksheet; if you like, you can add it back.) It finds all text files in a directory, then searches the text for the specific content you specified above.
During the script I commented as much as I thought might be needed to assist with modification later on.
Output formatted like this:
Excel Output
#Create An Excel File
$excel = New-Object -ComObject excel.application
$excel.visible = $True
#Add Workbook and work on its first worksheet (Cells and SaveAs below are used on the worksheet)
$workbook = $excel.Workbooks.Add()
$workbook = $workbook.Worksheets.Item(1)
<#Rename Worksheet
$workbook.Name = 'Client name and #'#>
#create the column headers
$workbook.Cells.Item(1,1) = 'Client name and n°'
$workbook.Cells.Item(1,2) = 'OK'
$workbook.Cells.Item(1,3) = 'Disconnected'
$workbook.Cells.Item(1,4) = 'Unresponsive'
$workbook.Cells.Item(1,5) = 'Unreachable'
$workbook.Cells.Item(1,6) = 'Version'
$workbook.Cells.Item(1,7) = 'Date Gathered'
$move = "C:\Users\iNet\Desktop\Testing"
$root = "C:\Users\iNet\Desktop\Testing"
$files = Get-ChildItem -Path $root -Filter *.txt
#Starting on Row 2
[int]$i = 2
ForEach ($file in $files){
$location = $root+"\"+$file
#Format your client data to output what you want to see.
$ClientData = select-string -path "$location" -pattern "Client"
$ClientData = $ClientData.line
$ClientData = $ClientData -replace "Client n° :" -replace ""
$ClientData = $ClientData -replace "Client name :" -replace "|"
$row = $i
$Column = 1
$workbook.Cells.Item($row,$column)= "$ClientData"
#Data Read Date
$DataReadDate = select-string -path "$location" -pattern "Data read"
$DataReadDate = $DataReadDate.line
$DataReadDate = $DataReadDate -replace "Data read " -replace ""
#Data Read Date, you asked for everything but this.
$row = $i
$Column = 7
$workbook.Cells.Item($row,$column)= "$DataReadDate"
#Version
$Version = select-string -path "$location" -pattern "Version:"
$Version = $Version.line
$Version = $Version -replace "Version: " -replace ""
$row = $i
$Column = 6
$workbook.Cells.Item($row,$column)= "$Version"
#How Many Times Unresponsive Shows Up
$Unresponsive = (Get-Content "$location" | select-string -pattern "Unresponsive").length
$row = $i
$Column = 4
$workbook.Cells.Item($row,$column)= "$Unresponsive"
#How Many Times Disconnected Shows Up
$Disconnected = (Get-Content "$location" | select-string -pattern "Disconnected").length
$row = $i
$Column = 3
$workbook.Cells.Item($row,$column)= "$Disconnected"
#How Many Times Unreachable host Shows Up
$Unreachable = (Get-Content "$location" | select-string -pattern "Unreachable host").length
$row = $i
$Column = 5
$workbook.Cells.Item($row,$column)= "$Unreachable"
#How Many Times OK Shows Up
$OK = (Get-Content "$location" | select-string -pattern "OK").length
$row = $i
$Column = 2
$workbook.Cells.Item($row,$column)= "$OK"
#Iterate by one so each text file goes to its own line.
$i++
}
#Save Document
$output = "\Output.xlsx"
$FinalOutput = $move+$output
#saving & closing the file
$workbook.SaveAs($FinalOutput)
$excel.Quit()

PowerShell does not replace string although you can see it in cmd

I normally find the answer to my problem by going through the site, but this time I have read every question yet still I am in despair and really need an experienced eye.
What I have is basically a structural health monitoring system. I measure strains and receive raw data. This raw data is processed by a MATLAB executable that I wrote myself and then uploaded to an ftp-server. We had a student that automated this with a PowerShell script which was working perfectly until I changed literally one small line in MATLAB and recompiled the code.
I do not understand much about PowerShell, so please be patient with me. The error I receive is "You cannot call a method on a null-valued expression". This occurs when I try to replace a set of strings (just called xxx_xxx) with a date that exists as a variable in PowerShell. I can see xxx_xxx in the command window (see attached image), and I can print out the date that I want to use as the replacement, but somehow it does not work.
I cannot provide a working code snippet because you would need the DAQ to generate data, and as I said, I don't understand the language much. But the code is below. For easier reading, the line on which I am receiving the error is the following:
$outData = $cmdOutput.Replace("xxx_xxx",$snaps[$i].Substring(6,4)+"-"+$snaps[$i].Substring(3,2)+"-"+$snaps[$i].Substring(0,2)+" "+$snaps[$i].Substring(11,8)+";")
If anyone could help me with this, I would be eternally grateful!
$retry=3
while(1){
#$dir = "C:\Users\Petar\Documents\Zoo\PetarData\INPUT DATA\New folder\"
$dir = "C:\Users\Yunus\Documents\Micron Optics\ENLIGHT\Data\" + $(get-date -f yyyy) + "\" + $(get-date -f MM) + "\"
#$outdir = "C:\Users\Petar\Documents\Zoo\PetarData\OUTPUT DATA\New folder\"
$archivedirin = "C:\Users\Yunus\Documents\Elefantenhaus\Archive\IN\"
$archivedirout = "C:\Users\Yunus\Documents\Elefantenhaus\Archive\OUT\"
$tempdir = "C:\Users\Yunus\Documents\Elefantenhaus\Archive\TEMP\"
$prefix = "EHZZ";
$filecount=(Get-ChildItem $dir).Count
$latest = Get-ChildItem -Path $dir | Sort-Object LastAccessTime -Descending | Select-Object -First 1
if($filecount -gt 1){
$exclude = $latest.name
$Files = GCI -path $dir | Where-object {$_.name -ne $exclude}
$dest = $archivedirin + "batch_"+$(get-date -f MM-dd-yyyy_HH_mm_ss)+"\"
new-item -type directory $dest
foreach ($file in $Files){move-item -path ($dir+$file) -destination $dest}
$latest = Get-ChildItem -Path $dest | Sort-Object LastAccessTime -Descending | Select-Object -First 1
$filename = $dest + $latest.name
$s=Get-Content $filename
while($s -eq $null){
if($retry -lt 0){break}
write-host "could not read file"
$retry = $retry -1
$s=Get-Content $filename
}
#read content of input file
$snaps = $s
#loop through the lines in the file until the first occurence of a timestamp, that is our desired line
for ($i = 0; $i -lt $snaps.length; $i++)
{
$ismatch =[regex]::Matches($snaps[$i], '^(\d\d.\d\d.\d\d\d\d\s\d\d+)')
if ( $ismatch -ne $null -and $ismatch[0].Groups[1].Value)
{
$temp=Get-Content $filename | select -skip $i
$filenametemp = $tempdir+"\temp.txt" #temp file path, don't change the filename "temp.txt"
#$filename3 = $tempdir+"\test.txt"
Add-Content $filenametemp $temp
$filename = $archivedirout+$prefix+"_"+$snaps[$i].Substring(8,2)+$snaps[$i].Substring(3,2)+$snaps[$i].Substring(0,2)+"_"+$snaps[$i].Substring(11,2)+$snaps[$i].Substring(14,2)+$snaps[$i].Substring(17,2)+".txt"
$cmdOutput = (cmd /c new_modified.exe $tempdir) | Out-String
write-output $cmdOutput #"$cmdOutput is:"
#IF ([string]::IsNullOrWhitespace($cmdOutput)){
# break
#}
$outData = $cmdOutput.Replace("xxx_xxx",$snaps[$i].Substring(6,4)+"-"+$snaps[$i].Substring(3,2)+"-"+$snaps[$i].Substring(0,2)+" "+$snaps[$i].Substring(11,8)+";")
Add-Content $filename $outData
remove-item -path $filenametemp
break
}
}
#break
}
else
{
write-host "waiting for file"
}
Start-Sleep -s 30
}
I think what is happening is that the output of the external program isn't being piped into a variable correctly. I haven't had a chance to test this but Tee-Object looks like the appropriate method for you.
I would suggest you try replacing...
$cmdOutput = (cmd /c new_modified.exe $tempdir) | Out-String
with...
cmd /c new_modified.exe $tempdir | Tee-Object -Variable cmdOutput
(Note that -Variable takes the variable name without the $ sign.)
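Separately, it is worth guarding against an empty result before calling .Replace(), since "You cannot call a method on a null-valued expression" means exactly that: $cmdOutput was $null (or nothing came back from the pipeline). A rough, untested sketch using the variables from the script above (the commented-out IsNullOrWhitespace check was already heading in this direction; $stamp is just a stand-in for the long Substring expression):
$stamp = $snaps[$i].Substring(6,4) + "-" + $snaps[$i].Substring(3,2) + "-" + $snaps[$i].Substring(0,2) + " " + $snaps[$i].Substring(11,8) + ";"
$cmdOutput = (cmd /c new_modified.exe $tempdir) | Out-String
if ([string]::IsNullOrWhiteSpace($cmdOutput)) {
write-host "new_modified.exe produced no output for $tempdir"
}
else {
$outData = $cmdOutput.Replace("xxx_xxx", $stamp)
Add-Content $filename $outData
}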

Powershell - Optimizing a very, very large csv and text file search and replace

I have a directory with ~ 3000 text files in it, and I'm doing periodic search and replaces on those text files as I transition a program to a new server.
Each text file may have an average of ~3000 lines, and I need to search the files for maybe 300 - 1000 terms at a time.
I'm replacing the server prefix, which is related to the string I'm searching for. So for every one of the CSV entries I'm looking for Search_String and \\Old_Server\Search_String, and making sure that after the program completes, the result is \\New_Server\Search_String.
I cobbled together a PowerShell program, and it works. But it's so slow I've never seen it complete.
Any suggestions for making it faster?
EDIT 1:
I changed Get-Content as suggested, but it still took 3 minutes to search two files (~8000 lines) for 9 separate search terms. I must still be screwing up; a Notepad++ search and replace would still be way faster if done manually 9 times.
I'm not sure how to get rid of the first (Get-Content) because I want to make a copy of the file for backup before I make any changes to it.
EDIT 2:
So this is an order of magnitude faster; it's searching a file in maybe 10 seconds. But now it doesn't write changes to files, and it only searches the first file in the directory! I didn't change that code, so I don't know why it broke.
EDIT 3:
Success! I adapted a solution posted below to make it much, much faster. It's searching each file in a couple of seconds now. I may reverse the loop order, so that it loads the file into the array and then searches and replaces each entry in the CSV rather than the other way around. I'll post that if I get it to work.
Final script is below for reference.
#get input from the user
$old = Read-Host 'Enter the old cimplicity qualifier (F24, IRF3 etc)'
$new = Read-Host 'Enter the new cimplicity qualifier (CB3, F24_2 etc)'
$DirName = Get-Date -format "yyyy_MM_dd_hh_mm"
New-Item -ItemType directory -Path $DirName -force
New-Item "$DirName\log.txt" -ItemType file -force -Value "`nMatched CTX files on $dirname`n"
$logfile = "$DirName\log.txt"
$VerbosePreference = "SilentlyContinue"
$points = import-csv SearchAndReplace.csv -header find #Import CSV File
#$ctxfiles = Get-ChildItem . -include *.ctx | select -expand fullname #Import local directory of CTX Files
$points | foreach-object { #For each row of points in the CSV file
$findvar = $_.find #Store column 1 as string to search for
$OldQualifiedPoint = "\\\\"+$old+"\\" + $findvar #Use escape slashes to escape each individual backslash so it's not read as regex
$NewQualifiedPoint = "\\"+$new+"\" + $findvar #escape slashes are NOT required on the new string
$DuplicateNew = "\\\\" + $new + "\\" + "\\\\" + $new + "\\"
$QualifiedNew = "\\" + $new + "\"
dir . *.ctx | #Grab all CTX Files
select -expand fullname | #grab all of those file names and...
foreach {#iterate through each file
$DateTime = Get-Date -Format "hh:mm:ss"
$FileName = $_
Write-Host "$DateTime - $FindVar - Checking $FileName"
$FileCopied = 0
#Check file contents, and copy matching files to newly created directory
If (Select-String -Path $_ -Pattern $findvar -Quiet ) {
If (!($FileCopied)) {
Copy $FileName -Destination $DirName
$FileCopied = 1
Add-Content $logfile "`n$DateTime - Found $Findvar in $filename"
Write-Host "$DateTime - Found $Findvar in $filename"
}
$FileContent = Get-Content $Filename -ReadCount 0
$FileContent =
$FileContent -replace $OldQualifiedPoint,$NewQualifiedPoint -replace $findvar,$NewQualifiedPoint -replace $DuplicateNew,$QualifiedNew
$FileContent | Set-Content $FileName
}
}
$File.Dispose()
}
If I'm reading this correctly, you should be able to read a 3000 line file into memory, and do those replaces as an array operation, eliminating the need to iterate through each line. You can also chain those replace operations into a single command.
dir . *.ctx | #Grab all CTX Files
select -expand fullname | #grab all of those file names and...
foreach {#iterate through each file
$DateTime = Get-Date -Format "hh:mm:ss"
$FileName = $_
Write-Host "$DateTime - $FindVar - Checking $FileName"
#Check file contents, and copy matching files to newly created directory
If (Select-String -Path $_ -Pattern $findvar -Quiet ) {
Copy $FileName -Destination $DirName
Add-Content $logfile "`n$DateTime - Found $Findvar in $filename"
Write-Host "$DateTime - Found $Findvar in $filename"
$FileContent = Get-Content $Filename -ReadCount 0
$FileContent =
$FileContent -replace $OldQualifiedPoint,$NewQualifiedPoint -replace $findvar,$NewQualifiedPoint -replace $DuplicateNew,$QualifiedNew
$FileContent | Set-Content $FileName
}
}
On another note, Select-String will take the filepath as an argument, so you don't have to do a Get-Content and then pipe that to Select-String.
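That is, using the variables from the script above:
# both forms work; the second lets Select-String open the file itself
Get-Content $FileName | Select-String -Pattern $findvar -Quiet
Select-String -Path $FileName -Pattern $findvar -Quiet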
Yes, you can make it much faster by not using Get-Content... Use a StreamReader instead.
$file = New-Object System.IO.StreamReader -Arg "test.txt"
while (($line = $file.ReadLine()) -ne $null) {
# $line has your line
}
$file.dispose()
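If you use the StreamReader for the replace itself, you would pair it with a StreamWriter, something like this (untested sketch; the paths are placeholders and the replace variables are the ones from the script above):
$inFile = 'C:\temp\test.ctx'
$outFile = 'C:\temp\test.ctx.tmp'
$reader = New-Object System.IO.StreamReader -Arg $inFile
$writer = New-Object System.IO.StreamWriter -Arg $outFile
while (($line = $reader.ReadLine()) -ne $null) {
# apply the same replace chain line by line instead of loading the whole file
$writer.WriteLine(($line -replace $OldQualifiedPoint,$NewQualifiedPoint -replace $findvar,$NewQualifiedPoint -replace $DuplicateNew,$QualifiedNew))
}
$reader.Dispose()
$writer.Dispose()
Move-Item $outFile $inFile -Force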
I wanted to use PowerShell for this and created a script like the one below:
$filepath = "input.csv"
$newfilepath = "input_fixed.csv"
filter num2x { $_ -replace "aaa","bbb" }
measure-command {
Get-Content -ReadCount 1000 $filepath | num2x | add-content $newfilepath
}
It took 19 minutes on my laptop to process a 6.5 GB file. The code above reads the file in batches (using ReadCount) and uses a filter, which should optimize performance.
But then I tried FART and it did the same thing in 3 minutes! Quite a difference!

PowerShell set each string part as a variable for reuse

I have a list of files in a folder, each in this format: custID_invID_prodID or custID_invID_prodID_Boolvalue. For every file I need to break the name into sections based on '_'. Currently I have this code:
$files = Get-ChildItem test *.txt
foreach($f in $files){
$file = @()
$file += ([String]$f).Split("_")
$total = ([String]$f).Split("_") | Measure-Object | select count
Write-Host "${total}"
if($total -eq 2) {
for($i = 2; $i -lt $file.length; $i+=3) {
$file[$i] = $file[$i].trimend(".txt")
Write-Host "${file}"
}
}
}
The problem is that Write-Host "${total}" prints @{Count=#}, where # is the real number of times "_" is found in the file name. How can I use $total inside my if statement to do different operations based upon the number of "_" found?
Would it not be simpler just to assign the parts you want directly to named variables rather than working with an array?
foreach($f in (Get-ChildItem test *.txt)) {
$custId, $invID, $prodID, $Boolvalue = $f.BaseName -split "_"
Write-Host $custId, $invID, $prodID, $Boolvalue
}
If the name only has 3 parts, this will simply leave $Boolvalue empty ($null).
Also note that you don't have to trim the extension off the last element after splitting, just use the BaseName property to get the name without extension.
You need to get the Count property's value, i.e. $total.Count, in your if test. You could also clean it up like this:
$files = Get-ChildItem test *.txt
foreach($f in $files){
$file = @(([String]$f).Split("_"))
Write-Host "$($file.Count)"
if($file.Count -eq 2) {
for($i = 2; $i -lt $file.length; $i+=3) {
$file[$i] = $file[$i].trimend(".txt")
Write-Host "${file}"
}
}
}
If you had included more information about what you were trying to do, we could clean it up a lot more. For example, it seems like you want to do something like this:
Get-ChildItem test *.txt | ForEach-Object {
$file = @($_.BaseName.Split("_"))
Write-Host "$($file.Count)"
if($file.Count -eq 2) {
Write-Host $file
}
}
Seems to me that you're doing it the hard way. Why not:
$x = "aaa_bbb_ccc"
$cnt = $x.Split("_").count
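Tying that back to the original question, you can then branch on that count (a small sketch; the variable names are only examples):
$x = "custID_invID_prodID_Boolvalue"
$parts = $x.Split("_")
if ($parts.Count -eq 4) {
# name has the extra Bool value at the end
$custID, $invID, $prodID, $boolValue = $parts
}
elseif ($parts.Count -eq 3) {
# plain custID_invID_prodID name
$custID, $invID, $prodID = $parts
}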
