I'm trying to replace 2 occurrences of a specific string, but each with a different value.
Here is the file, with its 2 sections:
[SERVER1]
NAME = SERVER1\SQLEXPRESS
ODBCLINK = idms
USER = idms
PSW = idms
[SERVER2]
NAME = SERVER2\SQLEXPRESS
ODBCLINK = backupidms
USER = idms
PSW = idms
For the moment I have this code:
(Get-Content ".\test.ini") | ForEach-Object { $_ -replace ".+\SQLEXPRESS" , "Name = $hostname\SQLEXPRESS" } | Set-Content ".\test.ini"
The goal is to have this:
[SERVER1]
NAME = Paris\SQLEXPRESS
ODBCLINK = idms
USER = idms
PSW = idms
[SERVER2]
NAME = Nantes\SQLEXPRESS
ODBCLINK = backupidms
USER = idms
PSW = idms
I read these 2 strings, Paris and Nantes, from another file.
The actual values are arbitrary: Test1 and Test2 could just as well be ALieoej and PAodj45p.
I think I need a script that searches for a line containing SQLEXPRESS, replaces it with one value, and then, the second time it finds such a line, replaces it with the other value.
This seems like it may be what you are looking for. Note that it will CREATE the test.ini file rather than editing the existing one as you asked.
Set-Content -Value "Test1
Test2" -Path computers.txt
(Get-Content ".\computers.txt") | ForEach-Object {"Name = $_\SQLEXPRESS" } | Set-Content ".\test.ini"
Where computers.txt is your source of computers that have SQL Express instances.
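If you do need to edit the existing test.ini in place instead, here is a rough sketch of one way to do it, assuming computers.txt lists the replacement names in the same order as the [SERVERx] sections appear (e.g. Paris then Nantes):
$names = Get-Content ".\computers.txt"
$i = 0
(Get-Content ".\test.ini") | ForEach-Object {
    if ($_ -match '^NAME\s*=') {
        # replace the Nth NAME line with the Nth name from computers.txt
        $newLine = "NAME = $($names[$i])\SQLEXPRESS"
        $i++
        $newLine
    }
    else {
        $_
    }
} | Set-Content ".\test.ini"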
Hi, I found a solution myself:
Remove-Item $fichierDest -ErrorAction SilentlyContinue
$content = Get-Content $fichierSource
$i = 0
# walk the file; when a [SERVERx] header is found, rewrite the NAME line that follows it
while ($i -lt $content.Length)
{
    $line = $content[$i]
    $line2 = $content[$i + 1]
    if ($line -match "SERVER1")
    {
        $line2 = $line2 -replace ".+\\SQLEXPRESS", "Name = TEST1\SQLEXPRESS"
        Add-Content $fichierDest $line
        Add-Content $fichierDest $line2
        $i += 2
    }
    elseif ($line -match "SERVER2")
    {
        $line2 = $line2 -replace ".+\\SQLEXPRESS", "Name = TEST2\SQLEXPRESS"
        Add-Content $fichierDest $line
        Add-Content $fichierDest $line2
        $i += 2
    }
    else
    {
        Add-Content $fichierDest $line
        $i++
    }
}
New to StackOverflow, I'll do my best to post correctly :)
Hoping someone can help me to get my code running faster.
The code is run against RoboCopy Migration logs from a massive DFS server migration (20 DFS servers being migrated).
The code first captures the source/destination of the log in question and then looks for the 'Newer', 'Older', 'New File' and 'Extra File' entries/rows. It then checks to see if these files exist at each side, what attributes they have and does a DFSR hash check against both sides (as the files are now being replicated via DFSR).
The main concern is if the hashes match for source and destination and if the temporary attribute is in place.
The problem I am having is that there are millions of files logged under these types (the migration was gargantuan), so the script is taking forever to run. On top of that, the client will not allow the ports needed for PSRemoting/Invoke-Command.
At present I am running my code without multi-threading, with a copy on each of the DFS servers looking at its respective logs, but it is still slow.
I have been looking at running a foreach -parallel over the loop through each log row (not the loop over the log files), but:
With so much data within each log/loop, my understanding is that I have to write results out as I go rather than keep them in a PSCustomObject collection, otherwise I would run out of RAM?
I don't really understand how to use mutexes to let multiple threads write to the same CSV.
Can someone please advise me on the above 2 points? And maybe give me some more ideas on what I can do to optimise things?
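For point 2, this is as far as I have got with the mutex idea — just a sketch of my understanding, untested against the real data (it assumes PowerShell 7+ for ForEach-Object -Parallel; the input rows, object properties and output path are placeholders):
# stand-in for the filtered log rows; in the real script this would be the Select-String output
$rows = 1..100
$rows | ForEach-Object -Parallel {
    # build the per-row result object (placeholder properties)
    $obj = [PSCustomObject]@{ Row = $_; Checked = (Get-Date) }
    # a named mutex shared by every runspace guards the single CSV file
    $mutex = [System.Threading.Mutex]::new($false, 'Global\RoboCopyCsvMutex')
    [void]$mutex.WaitOne()
    try {
        $obj | Export-Csv -Path 'C:\Temp\ParallelTest.csv' -NoTypeInformation -Append
    }
    finally {
        $mutex.ReleaseMutex()
        $mutex.Dispose()
    }
} -ThrottleLimit 8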
My full code is below.
#Get Start Time
$ReportStartTime = (Get-Date).ToString('yyy-MM-dd_HH-mm-ss')
If(!(test-path "C:\Temp\MasterReport_$ReportStartTime\")){
new-item -type directory -path "C:\Temp\MasterReport_$ReportStartTime\" | Out-Null
}
"Script Started:$ReportStartTime" >> "C:\Temp\MasterReport_$ReportStartTime\Log_$ReportStartTime.Log"
#Get Logs from folder (Recursive)
$Logs = Try{
Get-ChildItem -path 'C:\Temp\RoboCopyLogs\*\*.log' -Recurse -ErrorAction Stop | Select FullName
}
catch{
$_.Exception >> "C:\Temp\MasterReport_$ReportStartTime\Errors_$ReportStartTime.log"
}
#Initialise Log Counters
$NumberOfFiles = 0
$DesktopFile = 0
$ProcessedFiles = 0
$Totalsize = 0
#Count Logs
$Logs | foreach {
$SourceLog = $_
#Get Logfile
$Log = Get-Content $SourceLog.FullName
#Get Log rows for required Error Types and begin loop
$Log | Select-String -Pattern '(^\t+ +(Newer|Older|New File|Extra File))' `
|foreach {
$NumberOfFiles=$NumberOfFiles+1
If($_ | Select-String -pattern 'Desktop.ini' -SimpleMatch){
$DesktopFile=$DesktopFile+1
}
}
}
$Expected = $NumberOfFiles - $DesktopFile
"Total Files To Check = $NumberOfFiles" >> "C:\Temp\MasterReport_$ReportStartTime\Log_$ReportStartTime.Log"
"Total Files Excluded = $DesktopFile" >> "C:\Temp\MasterReport_$ReportStartTime\Log_$ReportStartTime.Log"
"Total Files To Ingest = $Expected" >> "C:\Temp\MasterReport_$ReportStartTime\Log_$ReportStartTime.Log"
$Main = (Get-Date).ToString('yyy-MM-dd_HH-mm-ss')
"Main Script:$Main" >> "C:\Temp\MasterReport_$ReportStartTime\Log_$ReportStartTime.Log"
$Logs | foreach {
$SourceLog = $_
#Get Logfile
$Log = Get-Content $SourceLog.FullName
#Collect Source and Destination
$S = $Log | Select-String -Pattern 'Source :'
$D = $Log | Select-String -Pattern 'Dest :'
$SourceLocation = $S -replace '\s+Source : ',''
$DestLocation = $D -replace '\s+Dest : ',''
#Get Log rows for required Error Types and begin loop
$Log | Select-String -Pattern '(^\t+ +(Newer|Older|New File|Extra File))' | Select-String -pattern 'Desktop.ini' -SimpleMatch -NotMatch `
|foreach {
#This loop could be a foreach -parallel???
#Check Percent Completed
If($ProcessedFiles -gt 0){
$PercentComplete=[Math]::Ceiling(($ProcessedFiles/$Expected)*100)
If($PercentComplete -match ('([0-9]0)')){
"$($PercentComplete)% Completed" > "C:\Temp\MasterReport_$ReportStartTime\PercentComplete.Log"
($ProcessedFiles/$Expected)*100
}
}
#Count Logs Processed
$ProcessedFiles=$ProcessedFiles+1
#Populate FilePath
$FilePath = $_ -Replace '.*(?=\\\\)', ''
#Populate Error type
$RoboErrorRaw = $_ -replace '\s+','|'
$RoboError = $RoboErrorRaw.split("|")[1]
#Check if file path relates to Source or the Destination and set path variables
if($FilePath -like "$SourceLocation*"){
$SourceFilePath = $FilePath
$DestFilePath = $FilePath.replace($SourceLocation,$DestLocation)
}
Elseif($FilePath -like "$DestLocation*"){
$DestFilePath = $FilePath
$SourceFilePath = $FilePath.replace($DestLocation,$SourceLocation)
$IsAtPartner = Test-Path $SourceFilePath
}
Else{
$DestFilepath = "Could Not Resolve UNC to Source or Destination"
}
#Check if file exists at source and destination
Try{
$IsAtPartner = Test-Path $DestFilePath -ErrorAction Stop
}
catch{
$IsAtPartner = $_.Exception
}
Try{
$IsAtSource = Test-path $SourceFilePath -ErrorAction Stop
}
catch{
$IsAtSource = $_.Exception
}
If($IsAtSource){
#Get the file details
Try{
$SourceFileDetails = Get-ChildItem $FilePath -Force -ErrorAction Stop
}
catch{
$SourceFileDetails = 'Failed'
}
if($SourceFileDetails -ne 'Failed'){
#Check has temp attribute
if((($SourceFileDetails).Attributes -band 0x100) -eq 0x100){
$TempAttribute = "Yes"
}
Else{
$TempAttribute = "No"
}
#Get attributes and last modified
Try{
$AllAttributes = ($SourceFileDetails).Attributes
}
catch{
$AllAttributes = $_.Exception
}
Try{
$Modified = ($SourceFileDetails).LastWriteTime.ToString()
}
catch{
$Modified = $_.Exception
}
}
}
#Check if .bak file
if($filePath -match '\.bak$'){
$Bakfile = "Yes"
}
Else{
$Bakfile = "No"
}
#Get Hashes
If($IsAtPartner -and $IsAtSource){
$HashSource = (Get-DfsrFileHash -Path $SourceFilepath).FileHash
$HashDest = (Get-DfsrFileHash -Path $DestFilepath).FileHash
}
ElseIf(!$IsAtSource -and !$IsAtPartner){
$HashSource = 'File Does not Exist at Source'
$HashDest = 'File Does not Exist At Partner'
}
ElseIf(!$IsAtPartner){
$HashSource = (Get-DfsrFileHash -Path $SourceFilepath).FileHash
$HashDest = 'File Does not Exist At Partner'
}
ElseIf(!$IsAtSource){
$HashSource = 'File Does not Exist at Source'
$HashDest = (Get-DfsrFileHash -Path $DestFilepath).FileHash
}
Else{
$HashSource = 'ERROR'
$HashDest = 'ERROR'
}
#Compare Valid Hashes
If($HashSource -eq $HashDest){
$HashMatch = 'Yes'
}
Else{
$HashMatch = 'No'
}
#Check Filesize where hashes do not match
If($HashMatch -eq 'No'){
$FileSizeMB = ($SourceFileDetails).length/1MB
}
#Create output object
$Obj = [PSCustomObject]@{
ErrorType = $RoboError
FilePath = $SourceFilePath
PartnerUNC = $DestFilePath
IsAtSource = $IsAtSource
IsAtDestination = $IsAtPartner
BakFile = $Bakfile
TempAttribute = $TempAttribute
LastModified = $Modified
AllAttributes = $AllAttributes
HashSource = $HashSource
HashDest = $HashDest
HashMatch = $HashMatch
RoboSource = $SourceLocation
RoboDest = $DestLocation
FileSizeMB = $FileSizeMB
SourceLog = $SourceLog.FullName
}
$Source = $SourceLocation.split('\\')[2]
$Destination = $DestLocation.split('\\')[2]
if(!(test-path "C:\Temp\$($Source)-$($Destination)_$($ReportStartTime)")){
new-item -type directory -path "C:\Temp\$($Source)-$($Destination)_$($ReportStartTime)" | Out-Null
}
#export to csv
$obj | Export-Csv -Path "C:\Temp\$($Source)-$($Destination)_$($ReportStartTime)\RoboCopyLogChecks_$ReportStartTime.csv" -NoTypeInformation -Append
$obj | Export-Csv -Path "C:\Temp\MasterReport_$ReportStartTime\RoboCopyLogChecks_$ReportStartTime.csv" -NoTypeInformation -Append
#Increment total size of data
If($HashMatch -eq "Yes"){
$Totalsize = $Totalsize + $SourceFileDetails.Length
}
clear-variable -name RoboError,SourceFilePath,DestFilePath,IsAtSource,IsAtPartner,Bakfile,TempAttribute,Modified,AllAttributes,HashSource,HashDest,HashMatch,FileSizeMB,Source,Destination
if($SourceFileDetails){
Remove-Variable -name SourceFileDetails
}
}
}
$Completion = (Get-Date).ToString('yyy-MM-dd_HH-mm-ss')
"Script Completed:$Completion Excluded Processed = $DesktopFile ,Total Processed = $ProcessedFiles" >> "C:\Temp\MasterReport_$ReportStartTime\Log_$ReportStartTime.Log"
"Files without Matching Hashses amount to $($Totalsize/1GB)GB" >> "C:\Temp\MasterReport_$ReportStartTime\Log_$ReportStartTime.Log"
Here is some example log data (it can be put in C:\Temp\RoboCopyLogs\Logs\ to run with the above code):
-------------------------------------------------------------------------------
ROBOCOPY :: Robust File Copy for Windows
-------------------------------------------------------------------------------
Started : 24 April 2022 17:29:57
Source : \\Test01\
Dest : \\Test02\
Files : *.*
Exc Files : ~*.*
*.TMP
Exc Dirs : \\Test01\DfsrPrivate
Options : *.* /FFT /TS /L /S /E /DCOPY:DA /COPY:DAT /PURGE /MIR /B /NP /XJD /MT:8 /R:0 /W:0
------------------------------------------------------------------------------
Newer 30720 2021/07/20 14:49:36 \\Test01\Test2121.xls
Older 651776 2020/10/25 21:49:32 \\Test01\testppt.ppt
Older 94720 2019/06/10 11:46:03 \\Test01\Thumbs.db
*EXTRA File 1.7 m 2020/09/17 10:36:57 \\Test02\months.jpg
*EXTRA File 1.8 m 2020/09/17 10:36:57 \\Test02\happy.jpg
New File 6421 2020/10/26 10:32:43 \\Test01\26-10-20.pdf
New File 6321 2020/10/26 10:32:43 \\Test01\Testing20.pdf
I'm pretty new to PS and have been struggling for a few days.
I have multiple text files in a folder with specific data that I would like to extract into an excel spreadsheet.
Each file looks like this:
Client n° : xxx Client name : xxx
Computer status
pc group 1 :
n°1 OK n°2 Disconnected n°3 Unresponsive
n°4 Unreachable host n°5 Unresponsive
Data read 11/11/20 12:50:07
Version: x.x.x
I would like to have an output file that looks like this:
Client name and n° OK Disconnected Unresponsive Unreachable host version
xxx/xxx 1 1 2 1 x.x.x
For the status columns, it's the total number of PCs with each status that I would like to display, not the PC numbers themselves.
At the moment I'm working with multiple .bat files that search for each status and output one file per status:
find /c "Disconnected" *.* > disconnected.txt
find /c "Unresponsive" *.* > unresponsive.txt
Then I sort every single output in Excel, which takes me too much time, so I was wondering if it is possible to automate this task with a script.
I really don't have any knowledge of PS, only basic batch commands.
Let's assume your files are all in one folder and all of them have the .txt extension.
Then you need to loop through these files and parse the data you need from them:
# create a Hashtable to add the different status values in
$status = @{'OK' = 0; 'Disconnected'= 0; 'Unresponsive' = 0; 'Unreachable host'= 0}
# loop through the files in your path and parse the information out
$result = Get-ChildItem -Path 'D:\Test' -Filter '*.txt' -File | ForEach-Object {
switch -Regex -File $_.FullName {
'^Client n°\s*:\s*([^\s]+)\s+Client name\s*:\s*(.+)$' {
# start collecting data for this client
$client = '{0}/{1}' -f $matches[2], $matches[1]
# reset the Hashtable to keep track of the status values
$status = @{'OK' = 0; 'Disconnected'= 0; 'Unresponsive' = 0; 'Unreachable host'= 0 }
}
'^\d+' {
# increment the various statuses in the Hashtable
($_ -split '\d+').Trim() | ForEach-Object { $status[$_]++ }
}
'^Version:\s(.+)$' {
$version = $matches[1]
# since this is the last line for this client, output the collected data as object
[PsCustomObject]@{
'Client name and n°' = $client
'OK' = $status['OK']
'Disconnected' = $status['Disconnected']
'Unresponsive' = $status['Unresponsive']
'Unreachable host' = $status['Unreachable host']
'Version' = $version
}
}
}
}
# output on screen
$result | Format-Table -AutoSize
# output to CSV file
$result | Export-Csv -Path 'D:\Test\clientdata.csv' -UseCulture -NoTypeInformation
Result on screen:
Client name and n° OK Disconnected Unresponsive Unreachable host Version
------------------ -- ------------ ------------ ---------------- -------
xxx/xxx 1 1 2 1 x.x.x
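If you would rather end up with a real .xlsx file than a CSV, and you have the ImportExcel module installed (that part is an assumption on my side), the Export-Csv line could be swapped for:
$result | Export-Excel -Path 'D:\Test\clientdata.xlsx' -AutoSize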
I used this as an exercise to test my abilities. I created three files with the same layout but different data and tested this script against them. As long as they are text files in the directory, the script will iterate through each file and pull the data from each one as you stated it needs to. If a stray text file gets added, the script does not know nor care and will treat it like the others; if there is data it can find, it will, and it will output that data to the Excel file. Lastly, the file is set to save itself and then immediately close.
It starts by creating the Excel file, then the workbook. (I commented out the naming of the workbook; if you like you can add it back.) It then finds all text files in a directory and searches each one for the specific content you described above.
During the script I commented as much as I thought might be needed to assist with modification later on.
Output formatted like this:
Excel Output
#Create An Excel File
$excel = New-Object -ComObject excel.application
$excel.visible = $True
#Add Workbook
$workbook = $excel.Workbooks.Add()
<#Rename Workbook
$workbook= $workbook.Worksheets.Item(1)
$workbook.Name = 'Client name and #'#>
#create the column headers
$workbook.Cells.Item(1,1) = 'Client name and n°'
$workbook.Cells.Item(1,2) = 'OK'
$workbook.Cells.Item(1,3) = 'Disconnected'
$workbook.Cells.Item(1,4) = 'Unresponsive'
$workbook.Cells.Item(1,5) = 'Unreachable'
$workbook.Cells.Item(1,6) = 'Version'
$workbook.Cells.Item(1,7) = 'Date Gathered'
$move = "C:\Users\iNet\Desktop\Testing"
$root = "C:\Users\iNet\Desktop\Testing"
$files = Get-ChildItem -Path $root -Filter *.txt
#Starting on Row 2
[int]$i = 2
ForEach ($file in $files){
$location = $root+"\"+$file
#Format your client data to output what you want to see.
$ClientData = select-string -path "$location" -pattern "Client"
$ClientData = $ClientData.line
$ClientData = $ClientData -replace "Client n° :", ""
$ClientData = $ClientData -replace "Client name :", "|"
$row = $i
$Column = 1
$workbook.Cells.Item($row,$column)= "$ClientData"
#Data Read Date
$DataReadDate = select-string -path "$location" -pattern "Data read"
$DataReadDate = $DataReadDate.line
$DataReadDate = $DataReadDate -replace "Data read ", ""
#Data Read Date, you asked for everything but this.
$row = $i
$Column = 7
$workbook.Cells.Item($row,$column)= "$DataReadDate"
#Version
$Version = select-string -path "$location" -pattern "Version:"
$Version = $Version.line
$Version = $Version -replace "Version: ", ""
$row = $i
$Column = 6
$workbook.Cells.Item($row,$column)= "$Version"
#How Many Times Unresponsive Shows Up
$Unresponsive = (Get-Content "$location" | select-string -pattern "Unresponsive").length
$row = $i
$Column = 4
$workbook.Cells.Item($row,$column)= "$Unresponsive"
#How Many Times Disconnected Shows Up
$Disconnected = (Get-Content "$location" | select-string -pattern "Disconnected").length
$row = $i
$Column = 3
$workbook.Cells.Item($row,$column)= "$Disconnected"
#How Many Times Unreachable host Shows Up
$Unreachable = (Get-Content "$location" | select-string -pattern "Unreachable host").length
$row = $i
$Column = 5
$workbook.Cells.Item($row,$column)= "$Unreachable"
#How Many Times OK Shows Up
$OK = (Get-Content "$location" | select-string -pattern "OK").length
$row = $i
$Column = 2
$workbook.Cells.Item($row,$column)= "$OK"
#Iterate by one so each text file goes to its own line.
$i++
}
#Save Document
$output = "\Output.xlsx"
$FinalOutput = $move+$output
#saving & closing the file
$workbook.SaveAs($FinalOutput)
$excel.Quit()
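One caveat: $excel.Quit() alone does not always release the COM objects, so an EXCEL.EXE process can be left running in the background. If you see that, a cleanup along these lines (a common pattern, adjust as needed) should help:
# release the COM objects so the background EXCEL.EXE process does not linger
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($workbook) | Out-Null
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel) | Out-Null
[GC]::Collect()
[GC]::WaitForPendingFinalizers()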
I want to create a two-way table in Excel by exporting an object from PowerShell. I am able to create a table in PowerShell.
The code is shown below:
class sampleClass {
[String] $var1
[String] $var2
[Bool] $boolVar
sampleClass([String] $var1, [String] $var2, [Bool] $boolVar)
{
$this.var1 = $var1
$this.var2 = $var2
$this.boolVar = $boolVar
}
[String] ToString()
{
return $this.var1 + ": " + $this.var2 + ": " + $this.boolVar
}
}
$s1 = [sampleClass]::new("Comp1", "S1", $false)
$s2 = [sampleClass]::new("Comp2", "S2", $true)
$s3 = [sampleClass]::new("Comp1", "S2", $false)
$s4 = [sampleClass]::new("Comp2", "S1", $false)
$s = @()
$s += $s1
$s += $s2
$s += $s3
$s += $s4
$s | Export-Csv .\out.csv -NoTypeInformation
The output of the above code is shown below:
But that is not the output I want; what I want is shown below:
Kindly help.
This code may be your best bet:
function Transpose-Object
{ [CmdletBinding()]
Param([OBJECT][Parameter(ValueFromPipeline = $TRUE)]$InputObject)
BEGIN
{ # initialize variables just to be "clean"
$Props = @()
$PropNames = @()
$InstanceNames = @()
}
PROCESS
{
if ($Props.Length -eq 0)
{ # when first object in pipeline arrives retrieve its property names
$PropNames = $InputObject.PSObject.Properties | Select-Object -ExpandProperty Name
# and create a PSCustomobject in an array for each property
$InputObject.PSObject.Properties | %{ $Props += New-Object -TypeName PSObject -Property @{Property = $_.Name} }
}
if ($InputObject.Name)
{ # does object have a "Name" property?
$Property = $InputObject.Name
} else { # no, take object itself as property name
$Property = $InputObject | Out-String
}
if ($InstanceNames -contains $Property)
{ # does multiple occurence of name exist?
$COUNTER = 0
do { # yes, append a number in brackets to name
$COUNTER++
$Property = "$($InputObject.Name) ({0})" -f $COUNTER
} while ($InstanceNames -contains $Property)
}
# add current name to name list for next name check
$InstanceNames += $Property
# retrieve property values and add them to the property's PSCustomobject
$COUNTER = 0
$PropNames | %{
if ($InputObject.($_))
{ # property exists for current object
$Props[$COUNTER] | Add-Member -Name $Property -Type NoteProperty -Value $InputObject.($_)
} else { # property does not exist for current object, add $NULL value
$Props[$COUNTER] | Add-Member -Name $Property -Type NoteProperty -Value $NULL
}
$COUNTER++
}
}
END
{
# return collection of PSCustomobjects with property values
$Props
}
}
It will allow you to turn your columns into rows and then export the object. Use it like this:
$s | Transpose-Object | Export-Csv .\out.csv -NoTypeInformation
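If what you are after is specifically a cross-tab (var1 values as rows, var2 values as columns and boolVar in the cells), a sketch along these lines may be closer to it — I am guessing at the layout you want since the screenshots are not shown, and it reuses the $s array from your question:
# unique var2 values become the columns of the cross-tab
$cols = $s | Select-Object -ExpandProperty var2 -Unique | Sort-Object
$table = $s | Group-Object var1 | ForEach-Object {
    $row = [ordered]@{ var1 = $_.Name }
    foreach ($c in $cols) {
        # one cell per (var1, var2) pair; $null if that combination is missing
        $match = $_.Group | Where-Object var2 -eq $c
        $row[$c] = if ($match) { $match.boolVar } else { $null }
    }
    [PSCustomObject]$row
}
$table | Export-Csv .\out.csv -NoTypeInformation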
I have a bunch of files in which I need to replace content: for example, wherever there is 'AA' I need to replace it with 'E1', 'A1' with 'P4', and so on. The same content needs to be changed differently in different files, so for example in the 2nd file 'AA' would become 'P1', 'A1' would become 'E1', etc. To accomplish this I have an Excel sheet with 2 columns like the below:
TC CodeChange
086 AA-E1; A1-P2
099 AA-P2; A1-E1; A2-E2; Z3-E3
100 AA-P2; A1-E2; A2-E3; Z3-O3
PowerShell script which I wrote for the above:
Script 1:
function func3 {
Param($arr3, $pat)
$arr3.GetEnumerator() | ?{$_.key -like $pat} | ForEach-Object {
$output = $_.value
return $output
}
}
$src = "C:\...xlsx"
$src1 = "C:\...\..."
$sheetName = "Sheet1"
$arr = @{};
$objExcel = New-Object -ComObject Excel.Application
$workbook = $objExcel.Workbooks.Open($src)
$sheet = $workbook.Worksheets.Item($sheetName)
$objExcel.Visible = $false
$rowMax = ($sheet.UsedRange.Rows).count
$rowTC, $colTC = 1, 1
$rowCodeChange, $colCodeChange = 1, 2
for ($i=1; $i -le $rowMax-1; $i++) {
$TC = $sheet.Cells.Item($rowTC+$i, $colTC).Text
$CodeChg = [String]($sheet.Cells.Item($rowCodeChange+$i, $colCodeChange).Text)
if ($arr.ContainsKey($TC) -eq $false) {
$arr.Add($TC, $CodeChg)
}
}
$inputfiles = (Get-ChildItem -Path $src1 -Recurse)
foreach ($inputfile in $inputfiles) {
$pat1 = $inputfile.Name.SubString(8, 3)
$val = func3 $arr $pat1
$arry1 = $val -split ';'
Write-Host $arry1.Length
$j = 0
do {
#skipping these 3 items from getting replaced
if (($arry1[$j].Trim() -ne "S1") -and ($arry1[$j].Trim() -ne "S2") -and ($arry1[$j].Trim() -ne "S3")) {
(Get-Content $inputfile.FullName) | ForEach-Object {
$_ -replace "$($arry1[$j].Split('-')[0])","$($arry1[$j].Split('-')[1])"
} | Set-Content $inputfile.FullName
}
$j++
} while ($j -le ($arry1.Length-1))
}
$objExcel.Quit()
Script 2:
function func3 {
param($arr3, $pat)
$arr3.GetEnumerator() | ?{$_.key -like $pat} | ForEach-Object {
$output=$_.value
return $output
}
}
$src = "C:\...xlsx"
$src1 = "C:\..."
$sheetName = "Sheet1"
$arr = @{};
$objExcel = New-Object -ComObject Excel.Application
$workbook = $objExcel.Workbooks.Open($src)
$sheet = $workbook.Worksheets.Item($sheetName)
$objExcel.Visible = $false
$rowMax = ($sheet.UsedRange.Rows).Count
$rowTC, $colTC = 1, 1
$rowCodeChange, $colCodeChange = 1, 2
for ($i=1; $i -le $rowMax-1; $i++) {
$TC = $sheet.Cells.Item($rowTC+$i, $colTC).Text
$CodeChg = [String]($sheet.Cells.Item($rowCodeChange+$i, $colCodeChange).Text)
if ($arr.ContainsKey($TC) -eq $false) {
$arr.Add($TC, $CodeChg)
}
}
$inputfiles = (Get-ChildItem -Path $src1 -Recurse)
foreach ($inputfile in $inputfiles) {
$pat1 = $inputfile.Name.SubString(8, 3)
$val = func3 $arr $pat1
$arry1 = $val -split ';'
Write-Host $arry1.Length
$j = 0
do {
#skipping these 3 items from getting replaced
if (($arry1[$j].Trim() -ne "S1") -and ($arry1[$j].Trim() -ne "S2") -and ($arry1[$j].Trim() -ne "S3")){
$content = [System.IO.File]::ReadAllText($inputfile.FullName).Replace($arry1[$j].Split('-')[0], $arry1[$j].Split('-')[1])
[System.IO.File]::WriteAllText($inputfile.FullName, $content)
Write-Host $arry1[$j].Split('-')[0]' replaced with '$arry1[$j].Split('-')[1]' in file: '$inputfile.FullName
}
$j++
} while ($j -le ($arry1.Length-1))
}
$objExcel.Quit()
The folder where the files live contains files whose names include the same digits as the 'TC' column in my Excel sheet. Example:
TC 086.txt
TC 099.txt
etc.
That way, after I import the contents of the Excel sheet into a hashtable, I extract the digits from the filenames and get the corresponding value for that key in the hashtable. For example, the value for the key '086' would be 'AA-E1; A1-P2'. I then split the items to be replaced from the hashtable value (separated by ;) and store them in an array. Then, using a loop, I try to replace the contents of each file based on the data retrieved from the spreadsheet.
The issue I'm facing with both the approaches is that only the 1st item in each file is getting replaced. The rest of the items are not getting replaced. For example only 'AA' value in file 'TC 086.txt' is getting replaced with 'E1'. 'A1' is not getting replaced with 'P2'.
I found out what the issue was. I basically had to trim the elements of the array
$arry1
after splitting them (separated by ;) and before passing them as parameters to the Replace method. Apparently there was a space before every element in that array except the 1st one (that is how they were stored in the source Excel spreadsheet), so the Replace method was not finding the element in the file and therefore not replacing it. Removing the spaces before the elements solved the issue.
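For anyone hitting the same problem, the change amounts to trimming each piece right after the split, for example (variable names as in my scripts above):
# split on ';' and remove the leading/trailing spaces from every piece
$arry1 = ($val -split ';') | ForEach-Object { $_.Trim() }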
How can I check if a string exists in:
1 text file;
size up until 10GB;
taking into account that the file is only one line;
the file only contains random numbers 1 to 9;
using powershell (because I think it will be more efficient, although I don't know how to program in this language);
I have tried this in batch:
FINDSTR "897516" decimal_output.txt
pause
But as I said, I need a faster and more efficient way to do this.
I also tried this code that I found on Stack Overflow:
$SEL = Select-String -Path C:\Users\fabio\Desktop\CONVERTIDOS\dec_output.txt -Pattern "123456"
if ($SEL -ne $null)
{
echo Contains String
}
else
{
echo Not Contains String
}
But I get the error below, and I don't know if this code is the most solid or adequate. The error:
Select-String : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\Users\fabio\Desktop\1.ps1:1 char:8
+ $SEL = Select-String -Path C:\Users\fabio\Desktop\CONVERTIDOS\dec_out ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Select-String], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.SelectStringCommand
This should do the job:
#################################################################################################################
#
# Searches for a user defined string in the $input_file and counts matches. Works with files of any size.
#
# Adjust source directory and input file name.
#
$source = "C:\adjust\path"
$input_file = "file_name.extension"
#
#
# Define the string you want to search for. Keep quotation marks even if you only search for numbers (otherwise
# $pattern.Length will be 1 and this script will no longer work with files larger than the $split_size)!
#
$pattern = "Enter the string to search for in here"
#
#
# Using Get-Content on an input file with a size of 1GB or more will cause System.OutOfMemoryExceptions,
# therefore a large file gets temporarily split up.
#
$split_size = 100MB
#
#
# Thanks @Bob (https://superuser.com/a/1295082/868077)
#################################################################################################################
Set-Location $source
if (test-path ".\_split") {
while ($overwrite -ne "true" -and $overwrite -ne "false") {
"`n"
$overwrite = Read-Host ' Splitted files already/still exist! Delete and overwrite?'
if ($overwrite -match "y") {
$overwrite = "true"
Remove-Item .\_split -force -recurse
$a = "`n Deleted existing splitted files!"
} elseif ($overwrite -match "n") {
$overwrite = "false"
$a = "`n Continuing with existing splitted files!"
} elseif ($overwrite -match "c") {
exit
} else {
Write-Host "`n Error: Invalid input!`n Type 'y' for 'yes'. Type 'n' for 'no'. Type 'c' for 'cancel'. `n`n`n"
}
}
}
Clear-Host
if ((Get-Item $input_file).Length -gt $split_size) {
while ($delete -ne "true" -and $delete -ne "false") {
"`n"
$delete = Read-Host ' Delete splitted files afterwards?'
if ($delete -match "y") {
$delete = "true"
$b = "`n Splitted files will be deleted afterwards!"
} elseif ($delete -match "n") {
$delete = "false"
$b = "`n Splitted files will not be deleted afterwards!"
} elseif ($delete -match "c") {
exit
} else {
Write-Host "`n Error: Invalid input!`n Type 'y' for 'yes'. Type 'n' for 'no'. Type 'c' for 'cancel'. `n`n`n"
}
}
Clear-Host
$a
$b
Write-Host `n This may take some time!
if ($overwrite -ne "false") {
New-Item -ItemType directory -Path ".\_split" >$null 2>&1
[Environment]::CurrentDirectory = Get-Location
$bytes = New-Object byte[] 4096
$in_file = [System.IO.File]::OpenRead($input_file)
$file_count = 0
$finished = $false
while (!$finished) {
$file_count++
$bytes_to_read = $split_size
$out_file = New-Object System.IO.FileStream ".\_split\_split_$file_count.splt",CreateNew,Write,None
while ($bytes_to_read) {
$bytes_read = $in_file.Read($bytes, 0, [Math]::Min($bytes.Length, $bytes_to_read))
if (!$bytes_read) {
$finished = $true
break
}
$bytes_to_read -= $bytes_read
$out_file.Write($bytes, 0, $bytes_read)
}
$out_file.Dispose()
}
$in_file.Dispose()
}
$i = 1
while (Test-Path ".\_split\_split_$i.splt") {
$cur_file = (Get-Content ".\_split\_split_$i.splt")
$temp_count = ([regex]::Matches($cur_file, "$pattern")).Count
$match_count += $temp_count
$n = $i - 1
if (Test-Path ".\_split\_split_$n.splt") {
if ($cur_file.Length -ge $pattern.Length) {
$file_transition = $prev_file.Substring($prev_file.Length - ($pattern.Length - 1)) + $cur_file.Substring(0,($pattern.Length - 1))
} else {
$file_transition = $prev_file.Substring($prev_file.Length - ($pattern.Length - 1)) + $cur_file
}
$temp_count = ([regex]::Matches($file_transition, "$pattern")).Count
$match_count += $temp_count
}
$prev_file = $cur_file
$i++
}
} else {
$a
$match_count = ([regex]::Matches((Get-Content $input_file -Raw), "$pattern")).Count
}
if ($delete -eq "true") {
Remove-Item ".\_split" -Force -Recurse
}
if ($match_count -ge 1) {
Write-Host "`n`n String '$pattern' found:`n`n $match_count matches!"
} else {
Write-Host "`n`n String '$pattern' not found!"
}
Write-Host `n`n`n`n`n
Pause
This will split a large file into multiple smaller files, search them for $pattern and count the matches (taking file transitions into account).
It also lets you choose to delete or keep the split files afterwards, so you can reuse them and don't have to split the large file every time you run this script.
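For comparison, here is a leaner sketch that avoids writing split files altogether by streaming the file in chunks and keeping a small overlap, so a match crossing a chunk boundary is not missed. It assumes a plain substring search (no regex) and PowerShell 5+; the path and pattern are placeholders to adjust:
$path      = 'C:\adjust\path\file_name.extension'
$pattern   = '897516'
$chunkSize = 64MB
$found     = $false
$reader = [System.IO.StreamReader]::new($path)
try {
    $buffer = New-Object char[] $chunkSize
    $carry  = ''
    while (($read = $reader.Read($buffer, 0, $buffer.Length)) -gt 0) {
        $chunk = $carry + [string]::new($buffer, 0, $read)
        if ($chunk.Contains($pattern)) { $found = $true; break }
        # keep the tail of this chunk so a match split across two chunks is still detected
        $carry = $chunk.Substring([Math]::Max(0, $chunk.Length - ($pattern.Length - 1)))
    }
}
finally {
    $reader.Dispose()
}
if ($found) { "String '$pattern' found!" } else { "String '$pattern' not found!" }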