Excluding lines that do not contain one or more strings from a text file - python-3.x

I have multiple server log files. In total they contain around 500,000 lines of log text. I only want to keep the lines that contain "Downloaded" and "Log". The lines I want to exclude deal with error logs and basic system operations like "client startup", "client restart" and so on.
An example of the lines we are looking for is this one:
[22:29:05]: Downloaded 39 /SYSTEM/SAP logs from System-4, customer (000;838) from 21:28:51,705 to 21:29:04,671
The lines that are kept should be complemented by the date string that is part of the log file name ($date).
Further, as the received logs are rather unstructured, the filtered files should be transformed into one CSV file (columns: timestamp, log downloads, system directory, system type, customer, start time, end time, date [to be added to every line from the file name]). The replace operation that turns spaces into commas is just a first attempt to bring some structure into the data. This file is supposed to be loaded into a Python dashboard program.
At the moment it takes 2.5 minutes to preprocess 3 text files, while the target is 5-10 seconds maximum, if that is even possible.
Thank you very much for your support, as I've been struggling with this since Monday last week. Maybe PowerShell is not the best way to go? I'm open to any help!
At the moment I'm running this PowerShell script:
$files = Get-ChildItem "C:\Users\AnonUser\RestLogs\*" -Include *.log
New-Item C:\Users\AnonUser\RestLogs\CleanedLogs.txt -ItemType file
foreach ($f in $files){
    $date = $f.BaseName.Substring(22,8)
    (Get-Content $f) | Where-Object { ($_ -match 'Downloaded' -and $_ -match 'SAP')} | ForEach-Object {$_ -replace " ", ","}{$_+ ','+ $date} | Add-Content CleanedLogs.txt
}

This is about the fastest I could manage. I didn't test using -split vs -replace or special .NET methods:
$files = Get-ChildItem "C:\Users\AnonUser\RestLogs\*" -Include *.log
New-Item C:\Users\AnonUser\RestLogs\CleanedLogs.txt -ItemType file
foreach ($f in $files) {
    $date = $f.BaseName.Substring(22,8)
    (((Get-Content $f) -match "Downloaded.*?SAP") -replace " ",",") -replace "$","$date" | Add-Content CleanedLogs.txt
}
In general, speed is gained by removing loops and Where-Object "filtering."
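Going one step further toward the asker's actual goal (a CSV with named columns), here is an untested sketch that parses each matching line with a single regex and writes a proper CSV in one pass per file. The regex layout, the Substring offset, and the paths are assumptions taken from the example line and script in the question:
$files = Get-ChildItem "C:\Users\AnonUser\RestLogs\*" -Include *.log
$pattern = '^\[(?<timestamp>[^\]]+)\]: Downloaded (?<downloads>\d+) (?<directory>\S+) logs from (?<system>\S+), customer \((?<customer>[^)]+)\) from (?<start>\S+) to (?<end>\S+)'
$rows = foreach ($f in $files) {
    $date = $f.BaseName.Substring(22,8)  # date portion of the log file name, as in the question
    switch -Regex -File $f.FullName {    # streams the file; no Get-Content or Where-Object needed
        $pattern {
            [pscustomobject]@{
                Timestamp = $matches.timestamp
                Downloads = $matches.downloads
                Directory = $matches.directory
                System    = $matches.system
                Customer  = $matches.customer
                Start     = $matches.start
                End       = $matches.end
                Date      = $date
            }
        }
    }
}
$rows | Export-Csv "C:\Users\AnonUser\RestLogs\CleanedLogs.csv" -NoTypeInformation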

Related

Manipulate strings in a txt file with Powershell - Search for the word in each line containing "." and save them in a new txt file

I have a text file with different entries. I want to manipulate it to always filter out the word containing a dot (using PowerShell):
$file = "C:\Users\test_folder\test.txt"
Get-Content $file
Output:
Compass Zype.Compass 1.1.0 thisisaword
Pomodoro Logger zxch3n.PomodoroLogger 0.6.3 thisisaword
......
......
......
Bla Word Program.Name 1.1.1 this is another entry
As you can see, in all lines, the "second" "word" contains a dot, like "Program.Name".
I want to create a new file which contains just those words, one word per line.
So my file should look something like:
Zype.Compass
zxch3n.PomodoroLogger
Program.Name
What I have tried so far:
Clear-Host
$folder = "C:\Users\test_folder"
$file = "C:\Users\test_folder\test.txt"
$content_txtfile = Get-Content $file
foreach ($line in $content_textfile)
{
    if ($line -like "*.*"){
        $line | Out-File "$folder\test_filtered.txt"
    }
}
But my output is not what I want.
I hope you get what my problem is.
Thanks in advance! :)
Here is a solution using Select-String to find sub strings by RegEx pattern:
(Select-String -Path $file -Pattern '\w+\.\w+').Matches.Value |
Set-Content "$folder\test_filtered.txt"
You can find an explanation and the ability to experiment with the RegEx pattern at RegEx101.
Note that while the RegEx101 demo also shows matches for the version numbers, Select-String gives you only the first match per line (unless argument -AllMatches is passed).
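If you ever do need every dotted token per line (version numbers included), a minimal sketch of the -AllMatches variant mentioned above, reusing the same $file and $folder variables and writing to a hypothetical test_filtered_all.txt:
(Select-String -Path $file -Pattern '\w+\.\w+' -AllMatches) |
    ForEach-Object { $_.Matches.Value } |
    Set-Content "$folder\test_filtered_all.txt"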
This looks like fixed-width fields, and if so you can reduce it to this:
Get-Content $file | # Read the file
%{ $_.Substring(29,36).Trim()} | # Extract the column
?{ $_.Contains(".") } | # Filter for values with "."
Set-Content "$folder\test_filtered.txt" # Write result
Get-Content is slow, and -like is sometimes slower than -match. I prefer -match but some prefer -like.
$filename = "c:\path\to\file.txt"
$output = "c:\path\to\output.txt"
foreach ($line in [System.IO.File]::ReadLines($filename)) {
    if ($line -match "\.") {
        $line | Out-File $output -Append
    }
}
Otherwise for a shorter option, maybe
$filename = "c:\path\to\file.txt"
$output = "c:\path\to\output.txt"
Get-content "c:\path\to\file.txt" | where {$_ -match "\.") | Out-file $output
For other match options aimed at the first column, either name the column (not what you do here) or use a different search pattern.
\. means a period anywhere in the whole line.
If the period is always at the beginning, you can anchor to the start of the line, so:
"^\." means the first character is a period.
If it's always a period before the first tab, you can match anything-except-tab, a period, then anything-except-tab:
"^[^\t]*\.[^\t]*" means: at the start of the line, any number of characters that are not tabs, then a period, then again any number of characters that are not tabs.

Powershell: Replace string in File1 based on string in File2

I am being forced to use Powershell because of my work. I have used it to do a couple of things but one of my codes is now trash because I have to update a string in a file to include a year that is in a second file. Here is what I'm working with:
File1: Contains a few strings, but among them are 48 strings that say:
Jenga_Sequence-XXXX.consensus_Bob_0.6_quality_20
The important part of the string is Sequence-XXXX; sorry for the random placeholders.
File2: is a table that has the strings:
John/USA/Sequence-XXXX/Year
I need to replace the strings in File1 with the corresponding Strings in File2.
Sample Text of File1:
Jenga_Sequence-0001.consensus_Bob_0.6_quality_20
AAAAAAAAAAAAAAAAAAAAAAAAA
Jenga_Sequence-0002.consensus_Bob_0.6_quality_20
aaaaaaaaaaaaaaaaaaaaaaaaa
Jenga_Sequence-0003.consensus_Bob_0.6_quality_20
bbbbbbbbbbbbbbbbbbbbbbbbb
Jenga_Sequence-0004.consensus_Bob_0.6_quality_20
BBBBBBBBBBBBBBBBBBBBBBBBB
Jenga_Sequence-0005.consensus_Bob_0.6_quality_20
QQQQQQQQQQQQQQQQQQQQQ
Sample Table of File2:
|Sequence_ID|Date|
|---------------------------|----------|
|John/USA/Sequence-0003/2020|10/11/2020|
|John/USA/Sequence-0001/2021|1/5/2021|
|John/USA/Sequence-0005/2021|1/10/2021|
|John/USA/Sequence-0004/2020|12/23/2020|
|John/USA/Sequence-0002/2021|1/6/2021|
So, I need PowerShell code that replaces
Jenga_Sequence-0001.consensus_Bob_0.6_quality_20 with John/USA/Sequence-0001/2021,
Jenga_Sequence-0002.consensus_Bob_0.6_quality_20 with John/USA/Sequence-0002/2021,
Jenga_Sequence-0003.consensus_Bob_0.6_quality_20 with John/USA/Sequence-0003/2020, and so on. There are typically 48 of these in a file.
My previous code simply replaced "Jenga_" with "John/USA/" and ".consensus_Bob_0.6_quality_20" with "/2020", but now that we are seeing "/2021" the static code will not work.
I am still open to replacing pieces of the string and having a code that sets the year replacement to the correct year.
That was the angle I was doing a broad search on but I could never find anything specific enough to help.
Any help will be appreciated!
EDIT: Here is the part of my previous code that dealt with the finding and replacing, even though I feel it needs to be trashed:
$filePath = 'Jenga_Combined.txt'
$tempFilePath = "$env:TEMP\$($filePath | Split-Path -Leaf)"
$find = 'Jenga_'
$replace = 'John/USA/'
$find2 = '.consensus_Bob_0.6_quality_20'
$replace2 = '/2020'
(Get-Content -Path $filePath) -replace $find, $replace -replace $find2, $replace2 | Add-Content -Path $tempFilePath
Remove-Item -Path $filePath
Move-Item -Path $tempFilePath -Destination $filePath
EDIT2: The "Real Data" from file2. File2 is a Tab Delimited .txt file which makes it not "look great" when copy and pasting. Hopefully this helps. File1 is exactly like above (although the AAAAA stuff is roughly 30,000 letters long)
Sequence_ID date
John/USA/Sequence-0003/2020 2020-10-11
John/USA/Sequence-0001/2021 2021-01-05
John/USA/Sequence-0005/2021 2021-01-10
John/USA/Sequence-0004/2020 2020-12-23
John/USA/Sequence-0002/2021 2021-01-06
Dan
The common factor here is the Sequence_ID number in both files.
You can do it like this:
$csvData = Import-Csv -Path 'D:\Test\File2.txt' -Delimiter "`t"
$result = switch -Regex -File 'D:\Test\Jenga_Combined.txt' {
    '^Jenga_Sequence-(\d+).*' {
        $replace = $csvData | Where-Object { $_.Sequence_ID -like "*Sequence-$($matches[1])*" }
        if (!$replace) { Write-Warning "No corresponding Sequence_ID $($matches[1]) found!"; $_ }
        else { $replace.Sequence_ID }
    }
    default { $_ }
}
# output on screen
$result
# output to new file
$result | Set-Content -Path 'D:\Test\Jenga_Combined_NEW.txt' -Force
Output on screen:
John/USA/Sequence-0001/2021
AAAAAAAAAAAAAAAAAAAAAAAAA
John/USA/Sequence-0002/2021
aaaaaaaaaaaaaaaaaaaaaaaaa
John/USA/Sequence-0003/2020
bbbbbbbbbbbbbbbbbbbbbbbbb
John/USA/Sequence-0004/2020
BBBBBBBBBBBBBBBBBBBBBBBBB
John/USA/Sequence-0005/2021
QQQQQQQQQQQQQQQQQQQQQ
Of course, you need to change the file paths to match your environment.
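If the input files grow large, the per-line Where-Object scan over $csvData can be replaced by a hashtable built once up front. A sketch under the same file-layout assumptions as the answer above:
$csvData = Import-Csv -Path 'D:\Test\File2.txt' -Delimiter "`t"

# Build the lookup once: sequence number -> full Sequence_ID string
$lookup = @{}
foreach ($row in $csvData) {
    if ($row.Sequence_ID -match 'Sequence-(\d+)') { $lookup[$matches[1]] = $row.Sequence_ID }
}

$result = switch -Regex -File 'D:\Test\Jenga_Combined.txt' {
    '^Jenga_Sequence-(\d+)' {
        if ($lookup.ContainsKey($matches[1])) { $lookup[$matches[1]] }
        else { Write-Warning "No corresponding Sequence_ID $($matches[1]) found!"; $_ }
    }
    default { $_ }
}
$result | Set-Content -Path 'D:\Test\Jenga_Combined_NEW.txt' -Force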

New-item "Illegal Characters in path" when I use a Variable that contains a here string

foreach ($Target in $TargetUSBs)
{
    $LogPath= @"
$SourceUSB\$(((Get-CimInstance -ClassName Win32_volume)|where {$_.DriveType -eq "2" -and $_.DriveLetter -eq $Target}).SerialNumber)_
$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Year)$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Month)
$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Day)_$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Hour)
$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Minute)$(((Get-CimInstance -ClassName Win32_OperatingSystem).LocalDateTime).Second).txt
"@
    $LogPath = $LogPath.Replace("`n","").Trim()
    New-Item -Path "$LogPath"
}
The irony is that when I copy and paste the contents of my variable and manually run New-Item -Path with those contents, it works; but when I use the variable, it does not...
Brief summary of my goal: I am taking a USB labelled ORIGINAL, obtaining the S/N of every USB plugged in at the time, and creating a separate log file for each, with a title consisting of SERIALNUMBER_DATE_TIME.txt; these files are created on the ORIGINAL USB.
$LogPath contains for example the following: E:\Mattel\1949721369_2018912_93427.txt
Yet when I use the variable in New-Item it reports "Illegal characters in path".
FYI, $LogPath is a System.String, not an object.
$TargetUSBs is filled with all USB drives plugged into the system.
This method of using a variable for a path usually works fine for me; the only difference is the here-string I used this time around. Does that cause my problem? I hope not, because I really don't want to fill that variable all on one line. New-Item's help shows <String[]> for the -Path parameter; does this mean I have to use a string array? And if so, how do I convert this to make it work?
Your problem is that Windows uses CRLF line endings (Unix uses only LF), so you still have CR characters in your path.
To fix this just use:
.Replace("`r`n","")
However, you can easily simplify your code so you do not require the messy here-string or the replace/trim...
By using a single Get-Date call you can format the date to your desired output. This means you can build the path as a simple string, with much less code:
foreach ($Target in $TargetUSBs)
{
    $SerialNumber = Get-CimInstance -ClassName Win32_volume | where {$_.DriveType -eq "2" -and $_.DriveLetter -eq $Target} | Select-Object -ExpandProperty SerialNumber
    $DateTime = Get-Date -Format "yyyyMd_Hms"
    # ${} disambiguates the variable name: "$SerialNumber_" would otherwise be parsed as a single variable named SerialNumber_
    New-Item -Path "$SourceUSB\${SerialNumber}_$DateTime.txt"
}
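If the string interpolation gets fiddly (note the braces in ${SerialNumber}_ above, needed so the underscore is not read as part of the variable name), a sketch of the same path built with the -f format operator instead:
$LogPath = '{0}\{1}_{2}.txt' -f $SourceUSB, $SerialNumber, $DateTime
New-Item -Path $LogPath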

Powershell - Pulling string from txt, splitting it, then concatenating it for archive

I have an application where I get a list of new/modified files from git status, take the incomplete strings from that file, concatenate them with the root directory path, and then move those files to an archive. I have it half working, but the way I am running PowerShell does not surface error reports, and the process is obviously erroring out. Here is the code I am trying to use. (It has gone through several iterations; please excuse the commented-out portions.) Basically I am trying to Get-Content from the txt file, then replace / with \ (for some reason the process that creates the txt loves forward slashes...), then split that string at the spaces. The only part of the string I am interested in is the last part, which I am trying to concatenate with the known working root directory; then I am attempting to move those files to an archive location. Before you ask, this is something we are not willing to track in git, due to the nature of the files (they are test outputs that are time-stamped; we want to save them on a per-test-run basis, not in git). I am still fairly new to PowerShell and have been banging my head against this rock for far too long.
Get-Content $outfile | Foreach-Object
{
    #$_.Replace("/","\")
    #$lineSplit = $_.Split(' ')
    $_.Split(" ")
    $filePath = "$repo_dir\$_[-1]"
    $filePath.Replace('/','\')
    "File Path Created: $filePath"
    $untrackedLegacyTestFiles += $filePath
}
Get-Content $untrackedLegacyTestFiles | Foreach-Object
{
    Copy-Item $_ $target_root -force
    "Copying File: $_ to $target_root"
}
}
the $outfile is a text file where each line has a partial file path leading to a txt file generated by a test application we use. This info is provided by git, so it looks like this in the $outfile txt file:
!! Some/File/Path/Doc.txt
The "!!" mean git sees it as a new file, however it could be several characters from a " M" to "??". Which is why I am trying to split it on the spaces and take only the last element.
My desired output would be to take the last element of the split string from the $outfile (Some/File/Path/Doc.txt), concatenate it with $repo_dir to form a complete file path, then move the Doc.txt to an archive location ($target_root).
To combine a path in PowerShell, you should use the Join-Path cmdlet. To extract the path from your string, you can use a regex:
$extractedPath = [regex]::Match('!! Some/File/Path/Doc.txt', '.*\s(.+)$').Groups[1].Value
$filePath = Join-Path $repo_dir $extractedPath
The Join-Path cmdlet will also convert all forward slashes to backslashes, so no need to replace them :-).
Your whole script could look like this:
Get-Content $outfile | Foreach-Object {
$path = Join-Path $repo_dir ([regex]::Match($_, '.*\s(.+)$').Groups[1].Value)
Copy-Item $path $target_root -force
}
If you don't like to use regex in your code, you can also extract the path using:
$extractedPath = '!! Some/File/Path/Doc.txt' -split ' ' | select -Last 1
or
$extractedPath = ('!! Some/File/Path/Doc.txt' -split ' ')[-1]

Powershell - Optimizing a very, very large csv and text file search and replace

I have a directory with ~ 3000 text files in it, and I'm doing periodic search and replaces on those text files as I transition a program to a new server.
Each text file may have an average of ~3000 lines, and I need to search the files for maybe 300 - 1000 terms at a time.
I'm replacing the server prefix, which is related to the string I'm searching for. So for every one of the CSV entries, I'm looking for Search_String and \\Old_Server\Search_String, and making sure that after the program completes, the result is \\New_Server\Search_String.
I cobbled together a powershell program, and it works. But it's so slow I've never seen it complete.
Any suggestions for making it faster?
EDIT 1:
I changed Get-Content as suggested, but it still took 3 minutes to search two files (~8000 lines) for 9 separate search terms. I must still be screwing up; a Notepad++ search and replace would still be way faster if done manually 9 times.
I'm not sure how to get rid of the first (Get-Content) because I want to make a copy of the file for backup before I make any changes to it.
EDIT 2:
So this is an order of magnitude faster; it's searching a file in maybe 10 seconds. But now it doesn't write changes to files, and it only searches the first file in the directory! I didn't change that code, so I don't know why it broke.
EDIT 3:
Success! I adapted a solution posted below to make it much, much faster. It's searching each file in a couple of seconds now. I may reverse the loop order, so that it loads the file into the array and then searches and replaces each entry in the CSV rather than the other way around. I'll post that if I get it to work.
Final script is below for reference.
#get input from the user
$old = Read-Host 'Enter the old cimplicity qualifier (F24, IRF3 etc)'
$new = Read-Host 'Enter the new cimplicity qualifier (CB3, F24_2 etc)'
$DirName = Get-Date -Format "yyyy_MM_dd_hh_mm"
New-Item -ItemType directory -Path $DirName -Force
New-Item "$DirName\log.txt" -ItemType file -Force -Value "`nMatched CTX files on $DirName`n"
$logfile = "$DirName\log.txt"
$VerbosePreference = "SilentlyContinue"
$points = Import-Csv SearchAndReplace.csv -Header find #Import CSV file
#$ctxfiles = Get-ChildItem . -Include *.ctx | select -expand fullname #Import local directory of CTX files
$points | ForEach-Object { #For each row of points in the CSV file
    $findvar = $_.find #Store column 1 as the string to search for
    $OldQualifiedPoint = "\\\\" + $old + "\\" + $findvar #Escape each individual backslash so it's not read as regex
    $NewQualifiedPoint = "\\" + $new + "\" + $findvar #Escaped slashes are NOT required in the replacement string
    $DuplicateNew = "\\\\" + $new + "\\" + "\\\\" + $new + "\\"
    $QualifiedNew = "\\" + $new + "\"
    dir . *.ctx | #Grab all CTX files
        select -expand fullname | #grab all of those file names and...
        foreach { #iterate through each file
            $DateTime = Get-Date -Format "hh:mm:ss"
            $FileName = $_
            Write-Host "$DateTime - $FindVar - Checking $FileName"
            $FileCopied = 0
            #Check file contents, and copy matching files to the newly created directory
            If (Select-String -Path $_ -Pattern $findvar -Quiet) {
                If (!($FileCopied)) {
                    Copy $FileName -Destination $DirName
                    $FileCopied = 1
                    Add-Content $logfile "`n$DateTime - Found $Findvar in $filename"
                    Write-Host "$DateTime - Found $Findvar in $filename"
                }
                $FileContent = Get-Content $Filename -ReadCount 0
                $FileContent = $FileContent -replace $OldQualifiedPoint,$NewQualifiedPoint -replace $findvar,$NewQualifiedPoint -replace $DuplicateNew,$QualifiedNew
                $FileContent | Set-Content $FileName
            }
        }
}
If I'm reading this correctly, you should be able to read a 3000 line file into memory, and do those replaces as an array operation, eliminating the need to iterate through each line. You can also chain those replace operations into a single command.
dir . *.ctx | #Grab all CTX files
    select -expand fullname | #grab all of those file names and...
    foreach { #iterate through each file
        $DateTime = Get-Date -Format "hh:mm:ss"
        $FileName = $_
        Write-Host "$DateTime - $FindVar - Checking $FileName"
        #Check file contents, and copy matching files to the newly created directory
        If (Select-String -Path $_ -Pattern $findvar -Quiet) {
            Copy $FileName -Destination $DirName
            Add-Content $logfile "`n$DateTime - Found $Findvar in $filename"
            Write-Host "$DateTime - Found $Findvar in $filename"
            $FileContent = Get-Content $Filename -ReadCount 0
            $FileContent = $FileContent -replace $OldQualifiedPoint,$NewQualifiedPoint -replace $findvar,$NewQualifiedPoint -replace $DuplicateNew,$QualifiedNew
            $FileContent | Set-Content $FileName
        }
    }
On another note, Select-String will take the filepath as an argument, so you don't have to do a Get-Content and then pipe that to Select-String.
Yes, you can make it much faster by not using Get-Content... use a StreamReader instead.
$file = New-Object System.IO.StreamReader -Arg "test.txt"
while (($line = $file.ReadLine()) -ne $null) {
    # $line has your line
}
$file.Dispose()
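The same idea applies on the write side: a StreamWriter avoids the per-line Out-File/Add-Content overhead. A sketch with hypothetical paths and patterns (note that .NET resolves relative paths against the process working directory, so absolute paths are safer here):
$reader = New-Object System.IO.StreamReader -Arg "C:\temp\test.txt"
$writer = New-Object System.IO.StreamWriter -Arg "C:\temp\out.txt"
try {
    while (($line = $reader.ReadLine()) -ne $null) {
        if ($line -match '\.') {                             # hypothetical filter
            $writer.WriteLine(($line -replace 'aaa','bbb'))  # hypothetical replace
        }
    }
}
finally {
    $reader.Dispose()
    $writer.Dispose()
}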
I wanted to use PowerShell for this and created a script like the one below:
$filepath = "input.csv"
$newfilepath = "input_fixed.csv"
filter num2x { $_ -replace "aaa","bbb" }
measure-command {
    Get-Content -ReadCount 1000 $filepath | num2x | Add-Content $newfilepath
}
It took 19 minutes on my laptop to process a 6.5 GB file. The code above reads the file in batches (using -ReadCount) and uses a filter, which should optimize performance.
But then I tried FART and it did the same thing in 3 minutes! Quite a difference!
