How to send real output without headers to file in PowerShell

I have a PowerShell script where I use Add-Content to send $output to log.txt. $output is generated by Test-Connection nrk.no -count 1 (nrk.no is just a site I trust to stay up most of the time). What I currently get logged is a long string: \\NOTREALNAME\root\cimv2:Win32_PingStatus.Address="nrk.no",BufferSize=32,NoFragmentation=false,RecordRoute=0,ResolveAddressNames=false,SourceRoute="",SourceRouteType=0,Timeout=4000,TimestampRoute=0,TimeToLive=80,TypeofService=0. How can I instead get the long, user-friendly output with hyphens and headers (see below), but with the header part stripped, so that my log file does not repeat the header before every entry?
"The long one with hyphens and headers":

Pipe the output to Format-Table -HideTableHeaders to get table-formatted output without the header:
Test-Connection nrk.no | Format-Table -HideTableHeaders | Out-String -Stream | Add-Content log.txt
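For the exact scenario in the question (a single ping appended to log.txt), the same idea looks roughly like this; -Count 1 comes from the question, while the Where-Object filter is just an optional cleanup for the blank lines Format-Table emits around the table:
Test-Connection nrk.no -Count 1 |
    Format-Table -HideTableHeaders |
    Out-String -Stream |
    Where-Object { $_.Trim() } |   # drop the surrounding blank lines
    Add-Content log.txt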

Related

Trim PowerShell output after a specific line

When running the sfc /scannow command from a PowerShell script, how do you trim the output so that it keeps only the data after "Verification 100% complete."?
Simple code example:
$Result = Invoke-Command { sfc /scannow } | Out-String
A truncated version of this output is:
Verification 97% complete.
Verification 98% complete.
Verification 98% complete.
Verification 99% complete.
Verification 99% complete.
Verification 100% complete.
Windows Resource Protection found corrupt files and successfully repaired them.
For online repairs, details are included in the CBS log file located at
windir\Logs\CBS\CBS.log. For example C:\Windows\Logs\CBS\CBS.log. For offline
repairs, details are included in the log file provided by the /OFFLOGFILE flag.
I also wouldn't mind removing the extraneous line breaks so the final output when returning $Result would simply be:
Windows Resource Protection found corrupt files and successfully repaired them. For online repairs, details are included in the CBS log file located at windir\Logs\CBS\CBS.log. For example C:\Windows\Logs\CBS\CBS.log. For offline repairs, details are included in the log file provided by the /OFFLOGFILE flag.
I have a 1024 character limit for the output, and right now I'm just truncating it with the following:
if ($Result.Length -gt 1024) {
    $Result = $Result.Substring($Result.Length - 1024)
}
But I'd like to get rid of all the "status" type messaging from the output and just retain the final result of the scan.
Thank you!
First and foremost, this is an encoding issue. The sfc command outputs UTF-16, but PowerShell doesn't know that (because it's a native command), so you typically get embedded null characters between the actual characters (try sfc /? | Out-String | Format-Hex to see what I mean).
Removing the embedded null characters would only partially fix the symptoms. On non-English systems the encoding of non-ASCII characters would still be wrong.
To fix the cause, you have to tell PowerShell the output encoding used by the command (see this answer for background, though it focuses on UTF-8):
# Save the current encoding settings and temporarily switch to UTF-16 (aka UnicodeEncoding).
$oldOutputEncoding = $OutputEncoding; $oldConsoleEncoding = [Console]::OutputEncoding
$OutputEncoding = [console]::OutputEncoding = New-Object System.Text.UnicodeEncoding
# No need for Invoke-Command. Just call it directly.
$result = sfc /scannow | Out-String
# Restore the previous settings.
$OutputEncoding = $oldOutputEncoding; [Console]::OutputEncoding = $oldConsoleEncoding
# Encoding fixed, but duplicate newlines remain. Replace them by single newlines:
$result = $result -replace '\r?\n\r?\n', [Environment]::NewLine
# Finally, strip off everything up to and including the `100%` line:
$result = ($result -replace '(?s)^.*100\s*%.*?\r?\n').Trim()
See this RegEx101 demo for an explanation of the RegEx used by the last -replace command. Note that (?s) enables single-line mode so dot . matches newlines.
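If the 1024-character cap mentioned in the question still applies after this cleanup, the question's own truncation can simply be run on the cleaned string:
# Keep only the last 1024 characters, as in the question, but on the cleaned-up $result.
if ($result.Length -gt 1024) {
    $result = $result.Substring($result.Length - 1024)
}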

Looking to validate that a certain string is present in a text file, and send a warning if not

I have a process where files containing data are generated in separate locations, saved to a networked location, and merged into a single file.
At the end of the process, I would like to check that all locations are present in that merged file, and notify me if not.
I am having trouble finding a way to detect that a string specific to each location isn't present (to be used in an if statement); what I have tried doesn't seem to identify the string correctly.
I have tried :
get-childitem -filter *daily.csv.ready \\x.x.x.x\data\* -recurse | where-object {$_ -notin 'D,KPI,KPI,1,'}
I know it's probably easier to do nothing if it is present, and perform the warning action if not, but I'm curious if this can be done in the reverse.
Thank you,
As Doug Maurer points out, your command does not search through the content of the files output by the Get-ChildItem command, because what that cmdlet emits are System.IO.FileInfo (or, potentially, System.IO.DirectoryInfo) instances containing metadata about the matching files (directories) rather than their content.
In other words: the automatic $_ variable in your Where-Object command refers to an object describing a file rather than its content.
However, you can pipe System.IO.FileInfo instances to the Select-String cmdlet, which indeed searches the input files' content:
# Keep only the files whose content does NOT contain the (literal) marker string.
Get-ChildItem -Filter *daily.csv.ready \\x.x.x.x\data\* -Recurse |
    Where-Object { -not ($_ | Select-String -Quiet -SimpleMatch 'D,KPI,KPI,1,') }
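To turn that into the warning the question asks for, a sketch along the following lines may help; the merged-file path and the list of per-location marker strings are hypothetical placeholders you would need to adapt:
# Hypothetical merged file and per-location marker strings (adjust to your environment).
$mergedFile = '\\x.x.x.x\data\merged_daily.csv'
$locationMarkers = 'D,KPI,KPI,1,', 'D,KPI,KPI,2,'

foreach ($marker in $locationMarkers) {
    # -SimpleMatch treats the marker as a literal string; -Quiet makes Select-String return a Boolean.
    if (-not (Select-String -Path $mergedFile -SimpleMatch -Quiet -Pattern $marker)) {
        Write-Warning "Marker '$marker' was not found in $mergedFile"
    }
}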

Delete rows in a .CSV file containing a specific character with PowerShell

I receive an automatic weekly export from a system in .csv format. It contains a lot of usernames consisting of the users' initials (e.g. "fl", "nk"). A few of them are full first and last names separated by a dot (e.g. firstname.lastname). These are the ones which have to be deleted from the .csv file.
My goal here is to write a PowerShell script which deletes all rows containing the character "." (dot) and then saves the same .csv file by overwriting it.
Since I'm very new to PowerShell, I'd highly appreciate a more detailed answer including the actual code. I tried various examples from similar issues I found here, but none of them worked and/or I got error messages, mostly because my syntax wasn't correct.
Additional info. Here is a part of the table.
I tried this code:
Get-Content "D:\file.csv" | Where-Object {$_ -notmatch '\.'} | Set-Content "D:\File.csv"-Force -NoTypeInformation
As Mathias says, it is helpful to see what you have tried so we can help you come to a working result. It is easy to give you something like this:
$csv = Import-Csv -Path C:\Temp\temp.csv -Delimiter ";"
$newCSV = @()
foreach ($row in $csv) {
    if (!$row.username -or $row.username -notlike "*.*") {
        $newCSV += $row
    }
}
$newCSV | Export-Csv -Path C:\Temp\temp.csv -Delimiter ";" -NoTypeInformation
The above code eliminates rows that have a dot in the username field. It leaves rows with an empty username intact thanks to the !$row.username part of the if condition. But I have no idea whether this is helpful, since there is no example CSV file, and there is also no way to know what you have tried so far ;)
Note that I always prefer using ";" as the delimiter, because the file will then already be correctly separated into columns when opened in Excel. If the current file uses ',' as its delimiter, you will need to change that when importing the CSV.
You were very close! For this you don't need a loop, you just need to do it using the correct cmdlets:
(Import-Csv -Path 'D:\file.csv' -Delimiter ';') |
    Where-Object { $_.Initials -notmatch '\.' } |
    Export-Csv -Path 'D:\file.csv' -Delimiter ';' -Force -NoTypeInformation
Get-Content simply reads a text file and returns the lines as a string array, whereas Import-Csv parses the structure and creates objects with properties from the header line.
The parentheses around Import-Csv are needed to ensure the importing/parsing of the file is completely done before the results are piped on. Without them, the resulting file may end up completely empty, because you cannot read and overwrite the same file at the same time.
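If the parentheses feel too subtle, an equivalent way to guarantee that the import finishes before the file is overwritten is to capture the rows in a variable first, e.g.:
# Import completely into memory first, then overwrite the same file.
$rows = Import-Csv -Path 'D:\file.csv' -Delimiter ';'
$rows | Where-Object { $_.Initials -notmatch '\.' } |
    Export-Csv -Path 'D:\file.csv' -Delimiter ';' -Force -NoTypeInformation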

Need to parse thousands of files for thousands of results - prefer PowerShell

I am getting consistently pinged by our government contract holder to search for IP addresses in our logs. I have three firewalls, 30-plus servers, etc., so you can imagine how unwieldy it becomes. To amplify the problem, I have been provided a list of over 1500 IP addresses for which I am to search all log files...
I have all of the logs downloaded and can use PowerShell to go through them one by one, but it takes forever. I need to be able to run the search multi-threaded in PowerShell but cannot figure out the logic to do so. Here's my one-by-one script...
Any help would be appreciated!
$log = (import-csv C:\temp\FWLogs\IPSearch.csv)
$ip = ($log.IP)
ForEach($log in $log){ Get-ChildItem -Recurse -path C:\temp\FWLogs -filter *.log | Select-String $ip -List | Select Path
}
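One way to parallelize a search like this, sketched under the assumption that PowerShell 7+ is available (for ForEach-Object -Parallel) and reusing the paths from the script above, is to search every log file once for all patterns at the same time:
# Escape the IPs so their dots are matched literally; Select-String accepts an array
# of patterns, so each log file only has to be read once.
$ips = (Import-Csv C:\temp\FWLogs\IPSearch.csv).IP | ForEach-Object { [regex]::Escape($_) }
Get-ChildItem -Recurse -Path C:\temp\FWLogs -Filter *.log |
    ForEach-Object -Parallel {
        $_ | Select-String -Pattern $using:ips | Select-Object Path, LineNumber, Line
    } -ThrottleLimit 8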

Adding a header to a '|' delimited CSV file in PowerShell?

I was wondering if anybody knows a way to achieve this without breaking/messing with the data itself?
I have a CSV file which is delimited by '|' and which was created by retrieving data from SharePoint using an SPQuery and exporting it with Out-File (Export-Csv is not an option, since I would have to store the data in a variable, which would eat at the RAM of the server, and querying remotely unfortunately does not work either, so I have to do this on the server itself). Nevertheless, I have the data I need, but I want to perform some manipulations, move and auto-calculate certain data within an Excel file, and export said Excel file.
The problem I have right now is that I sort of need a header to the file. I have tried using the following code:
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$file = Import-Csv inputfilename.csv -Header $header | Export-Csv D:\outputfilename.csv
The issue here is that when I perform the second Export-Csv, it splits at every comma and thus mangles the data; I need the data to remain intact.
I have tried playing with the -Delimiter '|' setting on both the import and the export, but no matter what I do it seems to be cutting off the data. Is there a better way to simply add a row at the top (a header) without messing with the already existing file structure?
I have found out that using a delimiter such as -Delimiter '°' or any other special character removes my problem entirely, but I can never be sure that such a character won't show up in the dataset, and thus (as stated already) I am looking for a more "elegant" solution.
Thanks
One option you have is to create the original CSV with the headers first. Then when you are exporting the SharePoint data, use the switch -Append in the Out-File command to append the SP data to the CSV.
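A minimal sketch of that approach, assuming the same header as in the snippet below and a hypothetical $spData variable holding the lines produced by the SharePoint query:
# Write the header line first, then append the exported SharePoint lines to the same file.
'Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID' | Out-File D:\outputfilename.csv
$spData | Out-File D:\outputfilename.csv -Append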
I wouldn't even bother messing with it in CSV format.
$header ="Name|GF_GZ|GF_Title|GF_UniqueId|GF_OldId|GFURL|GF_ITEMREDIRECTID"
$in_file = '.\inputfilename.csv'
$out_file = '.\outputfilename.csv'
$x = Get-Content $in_file
Set-Content $out_file -Value $header,$x
There's probably a more elegant/refined two-liner for some of this, but this should get you what you need.
