Trim PowerShell output after a specific line - string

When running the sfc /scannow command from a PowerShell script, how do you trim the output to keep only the data after "Verification 100% complete."?
Simple code example:
$Result = Invoke-Command { sfc /scannow } | Out-String
A truncated version of this output is:
Verification 97% complete.
Verification 98% complete.
Verification 98% complete.
Verification 99% complete.
Verification 99% complete.
Verification 100% complete.
Windows Resource Protection found corrupt files and successfully repaired them.
For online repairs, details are included in the CBS log file located at
windir\Logs\CBS\CBS.log. For example C:\Windows\Logs\CBS\CBS.log. For offline
repairs, details are included in the log file provided by the /OFFLOGFILE flag.
I also wouldn't mind removing the extraneous line breaks so the final output when returning $Result would simply be:
Windows Resource Protection found corrupt files and successfully repaired them. For online repairs, details are included in the CBS log file located at windir\Logs\CBS\CBS.log. For example C:\Windows\Logs\CBS\CBS.log. For offline repairs, details are included in the log file provided by the /OFFLOGFILE flag.
I have a 1024 character limit for the output, and right now I'm just truncating it with the following:
if ($Result.Length -gt 1024) {
    $Result = $Result.Substring($Result.Length - 1024)
}
But I'd like to get rid of all the "status" type messaging from the output and just retain the final result of the scan.
Thank you!

First and foremost, this is an encoding issue. The sfc command outputs UTF-16, but PowerShell doesn't know that (because it's a native command), so you typically get embedded null characters between the actual characters (try sfc /? | Out-String | Format-Hex to see what I mean).
Removing the embedded null characters would only partially fix the symptoms: on non-English systems, the encoding of non-ASCII characters will still be wrong.
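For illustration only, a sketch of that symptom-level fix (not recommended as the actual solution):
# Strip the embedded NULs from the question's $Result.
# Non-ASCII characters would still be mis-decoded on non-English systems.
$Result = $Result -replace '\x00', ''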
To fix the cause, you have to tell PowerShell the output encoding used by the command (see this answer for background, albeit it focuses on UTF-8):
# Save the current encoding settings and temporarily switch to UTF-16 (aka UnicodeEncoding).
$oldOutputEncoding = $OutputEncoding; $oldConsoleEncoding = [Console]::OutputEncoding
$OutputEncoding = [Console]::OutputEncoding = New-Object System.Text.UnicodeEncoding
# No need for Invoke-Command. Just call it directly.
$result = sfc /scannow | Out-String
# Restore the previous settings.
$OutputEncoding = $oldOutputEncoding; [Console]::OutputEncoding = $oldConsoleEncoding
# Encoding fixed, but duplicate newlines remain. Replace them by single newlines:
$result = $result -replace '\r?\n\r?\n', [Environment]::NewLine
# Finally, strip off everything up to and including the `100%` line:
$result = ($result -replace '(?s)^.*100\s*%.*?\r?\n').Trim()
See this RegEx101 demo for an explanation of the RegEx used by the last -replace command. Note that (?s) enables single-line mode so dot . matches newlines.
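If the 1024-character limit from the question still applies, it is best enforced only after the cleanup above, so that the status lines no longer eat into the budget; a minimal sketch reusing the question's own truncation logic:
if ($result.Length -gt 1024) {
    # Keep the tail of the message, which carries the scan verdict.
    $result = $result.Substring($result.Length - 1024)
}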

Related

Mass Conversion of (macintosh) .csv to (ms-dos) .csv

I am using a program to export hundreds of rows in an Excel sheet into separate documents, but the problem is that a PLC will be reading the files, and they only save as (Macintosh) .csv with no option for Windows. Is there a way to bulk-convert multiple files with different names into the correct format?
I have used this code for a single file, but I do not have the knowledge to apply it to multiple files in a directory:
$path = 'c:\filename.csv';
[System.IO.File]::WriteAllText($path.Remove($path.Length-3)+'txt',[System.IO.File]::ReadAllText($path).Replace("`n","`r`n"));
Thank you
The general PowerShell idiom for processing multiple files one by one:
Use Get-ChildItem (or Get-Item) to enumerate the files of interest, as System.IO.FileInfo instances.
Pipe the result to a ForEach-Object call, whose script-block argument ({ ... }) is invoked once for each input object received via the pipeline, reflected in the automatic $_ variable.
Specifically, since you're calling .NET API methods, be sure to pass full, file-system-native file paths to them, because .NET's working directory usually differs from PowerShell's. $_.FullName does that.
Therefore:
Get-ChildItem -LiteralPath C:\ -Filter *.csv |
  ForEach-Object {
    [IO.File]::WriteAllText(
      [IO.Path]::ChangeExtension($_.FullName, 'txt'),
      [IO.File]::ReadAllText($_.FullName).Replace("`n", "`r`n")
    )
  }
Note:
In PowerShell type literals such as [System.IO.File], the System. part is optional and can be omitted, as shown above.
[System.IO.Path]::ChangeExtension(), as used above, is a more robust way to obtain a copy of a path with the original file-name extension changed to a given one.
While Get-ChildItem -Path C:\*.csv or even Get-ChildItem C:\*.csv would work too (Get-ChildItem's first positional parameter is -Path), -Filter, as shown above, is usually preferable for performance reasons.
Caveat: While -Filter is typically sufficient, it does not use PowerShell's wildcard language, but delegates matching to the host platform's file-system APIs. This means that range or character-set expressions such as [0-9] and [fg] are not supported, and, on Windows, several legacy quirks affect the matching behavior - see this answer for more information.
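For instance (hypothetical file names used for illustration), a character-range pattern only works with PowerShell's own wildcard matching:
# PowerShell wildcard language: matches report0.csv through report9.csv
Get-ChildItem -Path C:\data\report[0-9].csv
# File-system matching: [0-9] is taken literally, so this would only match
# a file actually named 'report[0-9].csv'
Get-ChildItem -LiteralPath C:\data -Filter 'report[0-9].csv'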

Strange characters found in XML file and PowerShell output after exporting from Excel: ​

I have an XML file that I'm trying to read with PowerShell. However when I read it, the output of some of the XML objects have the following characters in them: ​
I simply downloaded an XML file I needed from a third-party, which opens in Excel. Then I grab the columns I need and paste them into a new Excel Workbook. Then I map the fields with an XML Schema and then export it as an XML file, which I then use for scripting.
In the Excel spreadsheet my data looks clean, but then when I export it and run the PS script, these strange characters appear in the output. The characters even appear in the actual XML file after exporting. What am I doing wrong?
I tried using -Encoding UTF8, but I'm relatively new to PowerShell and am not sure how to appropriately apply it to my script. Appreciate any help!
PowerShell
$xmlpath = 'Path\To\The\File.xml'
[xml]$xmldata = (Get-Content $xmlpath)
$xmldata.applications.application.name
Example of Output
​ABC_DEF_GHI​.com​​
​JKL_MNO_PQRS​.com​
TUV_WXY_Z.com
AB_CD_EF_GH​.com
This is a prime example of why you shouldn't use the idiom [xml]$xmldata = (Get-Content $xmlpath) - as convenient as it is.[1] The problem is indeed one of character encoding: your file is UTF-8-encoded, but Windows PowerShell's Get-Content cmdlet interprets it as ANSI-encoded in the absence of a BOM - this answer explains the encoding part in detail (thanks, choroba).
Instead, to ensure that the XML file's character encoding is interpreted correctly, use the following:
# Note: If you know that $xmlPath contains a *full*, native path,
# you don't need the Convert-Path call.
($xmlData = [xml]::new()).Load((Convert-Path -LiteralPath $xmlPath))
This delegates interpretation of the character encoding to the System.Xml.XmlDocument.Load .NET API method, which not only assumes the proper default for XML (UTF-8), but also respects any explicit encoding specification in the XML declaration, if present (e.g., <?xml version="1.0" encoding="iso-8859-1"?>).
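Once loaded this way, the document can be queried exactly as in the question's original script:
# Same property access as before; only the loading step changed.
$xmlData.applications.application.name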
See also:
the bottom section of this answer for background information.
GitHub proposal #14505, which proposes introducing a New-Xml cmdlet that robustly parses XML files.
[1] If you happen to know the encoding of the input file ahead of time, you can get away with using Get-Content's -Encoding parameter in your original approach ([xml]$xmldata = Get-Content -Encoding utf8 $xmlpath), but the .Load()-based approach is much more robust.

Find and replace a specific string within a specific file type located in wildcard path

Problem:
Update a specific string within numerous configuration files that are found within the subfolders of a partial path using PowerShell.
Expanded Details:
I have multiple configuration files that need a specific string to be updated; however, I do not know the name of these files and must begin my search from a partial path. I must scan each file for the specific string. Then I must replace the old string with the new string, but I must make sure it saves the file with its original name and in the same location it was found. I must also be able to display the results of the script (number of files affected and their names/path). Lastly, this must all be done in PowerShell.
So far I have come up with the following on my own:
$old = "string1"
$new = "string2"
$configs = Get-ChildItem -Path C:\*\foldername\*.config -Recurse
$configs | %{ (Get-Content $_) -Replace $old, $new | Set-Content $_.FullName }
When I run this, something seems to happen.
If the files are open, they will tell me that they were modified by another program.
However, nothing seems to have changed.
I have attempted various modifications of the below code as well. To my dismay, it only seems to be opening and saving each file rather than actually making the change I want to happen.
$configFiles = GCI -Path C:\*\Somefolder\*.config -Recurse
foreach ($config in $configFiles) {
    (GC $config.PSPath) | ForEach-Object {
        $_ -Replace "oldString", "newString"
    } | Set-Content $config.PSPath
}
To further exacerbate the issue, all of my attempts to perform a simple search against the specified string seem to be posing issues as well.
Discussing with several others, and based on what I have learned via SO, the following code SHOULD return results:
GCI -Path C:\*\Somefolder\*.config -Recurse |
Select-String -Pattern "string" |
Select Name
However, nothing seems to happen. I do not know if I am missing something or if the code itself is wrong...
Some questions I have researched and tried that are similar can be found at the below links:
Replacing a text at specified line number of a file using powershell
Find and replacing strings in multiple files
PowerShell Script to Find and Replace for all Files with a Specific Extension
Powershell to replace text in multiple files stored in many folders
UPDATE:
It is possible that I am being thwarted by special characters such as + and /. For example, my string might be: "s+r/ng"
I have applied the escape character that PowerShell says to use, but it seems this is not helping either.
I will continue my research and continue making modifications. I'll be sure to notate anything that gets me to my goal or even a step closer. Thank you all in advance.
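Regarding the UPDATE: -replace interprets its pattern as a regular expression, in which + is a metacharacter, so "s+r/ng" will not match literally, and PowerShell's backtick escape doesn't help there. A minimal sketch, assuming the folder layout from the question, that escapes the string with [regex]::Escape and also reports the affected files:
$old = [regex]::Escape('s+r/ng')   # neutralize regex metacharacters such as +
$new = 'string2'
$changed = Get-ChildItem -Path C:\*\foldername\*.config -Recurse | ForEach-Object {
    $content = Get-Content -LiteralPath $_.FullName
    if ($content -match $old) {
        ($content -replace $old, $new) | Set-Content -LiteralPath $_.FullName
        $_.FullName   # collect the path for the results report
    }
}
"$(@($changed).Count) file(s) affected:"
$changed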

Replace strings in text files with string literals and file names in PowerShell

My google-fu has failed me, so I'd love to get some help with this issue. I have a directory full of markup files (extension .xft). I need to modify these files by adding string literals and the filename (without the file extension) to each file.
For example, I currently have:
<headerTag>
<otherTag>Some text here </otherTag>
<finalTag> More text </finalTag>
What I need to end up with is:
<modifiedHeaderTag>
<secondTag> filenameGoesHere </secondTag>
<otherTag>Some text here </otherTag>
<finalTag> More text </finalTag>
So in this example,
"<modifiedHeaderTag>
<secondTag>"
would be my first string literal (this is a constant that gets inserted into each file in the same place),
filenameGoesHere
would be the variable string (the name of each file) and,
"</secondTag>"
would be my second constant string literal.
I was able to successfully replace text using:
(Get-Content *.xft).Replace("<headerTag>", "<modifiedHeaderTag>")
However, when I tried
(Get-Content *.xft).Replace("<headerTag>", "<modifiedHeaderTag> `n
<secondTag> $($_.Name) </secondTag>")
I just got an error message. Replacing $($_.Name) with ${$_.Name) also had no effect.
I've tried other things, but this method was the closest that I had gotten to success. I would appreciate any help that I can get. It's probably simple and I'm just not seeing something due to inexperience with PowerShell, so a helping hand would be great.
If the above isn't clear enough, I'd be happy to provide more info, just let me know. Thanks everyone!
Here's my approach, assuming you have all of the XFTs in one folder and you want to write the updates back to the same files:
$path = "C:\XFTs_to_Modify"
# Note: -Include only takes effect with -Recurse or a wildcard path, so -Filter is used here.
$xfts = Get-ChildItem $path -Filter "*.xft"
foreach ($xft in $xfts) {
    # BaseName is the file name without its extension, as the question requires.
    $replace = "<modifiedHeaderTag>
<secondTag> $($xft.BaseName) </secondTag>"
    # Read the file being processed (not *.xft, which would re-read every .xft file on each iteration).
    (Get-Content $xft.FullName).Replace("<headerTag>", $replace) | Set-Content $xft.FullName -Force
}
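If the files are large, or if <headerTag> might not sit on a line of its own, a variation (PowerShell 3+) is to read each file as a single string with -Raw instead of an array of lines, e.g. inside the loop:
# -Raw yields one single string instead of an array of lines.
(Get-Content -Raw $xft.FullName).Replace("<headerTag>", $replace) |
    Set-Content $xft.FullName -Force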

PowerShell: how to take a string from a file and put it into a variable

I am writing a PowerShell script to "build" Windows 7 PCs: adding users, printers, and applications, changing settings, and so on. I am adding some printer drivers using PNPUtil, but the problem is I won't know what "Published name" the drivers will be given.
If I put the output from the PNPUtil command into a .txt file, is there a way for me to then take the __.inf Published name and put it into a variable so that I can then use that name to add the printer using $printerclass.CreateInstance()?
You don't have to use a file if PNPUtil only outputs the name you're interested in. That is, you can assign its output to a variable like so:
$result = pnputil.exe
BTW, if you do want to use a file, you read its content back with Get-Content:
pnputil.exe > result.txt
$result = Get-Content result.txt
$line = $result | Foreach {if ($_ -match 'assigned an (\w+\.inf)') {$matches[1]}}
Okay - I found my own solution: Once the .inf file is added, all the driver names in that .inf are stored in the Microsoft update files. I just need to know the specific name of the driver I need from each .inf file in order to add the Printers.
However, I'd still love to know how to get a string from a line from a file using Powershell.
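As for pulling a string from a specific line of a file, a small sketch (reusing the result.txt file from the answer above; the pattern is illustrative):
# Select-String returns MatchInfo objects; .Line holds the matching text.
$match = Select-String -Path result.txt -Pattern '\w+\.inf' | Select-Object -First 1
if ($match) { $line = $match.Line }
# Or index into the file's lines directly (zero-based):
$thirdLine = (Get-Content result.txt)[2]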
