Could someone clarify how sls (Select-String) works compared to grep and findstr?
grep: grep <pattern> files.txt
sls: sls <pattern> files.txt
(default parameter position for sls is pattern then file)
grep examples: grep "search text" *.log ; cat *.log | grep "search text"
sls examples: sls "search text" *.log ; cat *.log | sls "search text"
As an aside, all PowerShell cmdlets are case-insensitive, unlike Linux tools, which are generally case-sensitive, and older Windows tools like findstr, which are case-sensitive too. However, findstr can be used in PowerShell, and it works in situations where sls does not. For example, Get-Service | findstr "Sec" works without a problem, but when we try to use sls in a similar way, Get-Service | sls "Sec", we get nothing. Presumably this fails because sls works with strings while Get-Service returns an object, so that's understandable - but what is findstr doing, then, that lets it see the output as a string?
So my thinking was, "OK, I need to turn the output from Get-Service into a string to work with PowerShell cmdlets", but that doesn't work (or not in a way that I would expect):
Get-Service | Out-String | sls "Sec" (gives results, but odd)
(Get-Service).ToString() | sls "Sec" (.ToString() just returns "System.Object[]")
How, in general, should I turn an object into a string so that I can manipulate the information (in the same way that Get-Service | findstr "Sec" can do so easily)?
I would appreciate it if someone could clarify how things fit together above so that I can make more use of sls. In particular, Get-Service | Out-String | sls "Sec" does return results, just not the ones I was expecting. Is it searching for each of the characters "s", "e", and "c" and therefore returning a lot? If so, that would not be very intuitive, in my opinion.
When you use Out-String by default, it turns the piped input object (an array of service objects in this case) into a single string. Luckily, the -Stream switch allows each line to be output as a single string instead. Regarding case-sensitivity, Select-String supports the -CaseSensitive switch.
# For case-insensitive regex match
Get-Service | Out-String -Stream | Select-String "Sec"
# For case-sensitive regex match
Get-Service | Out-String -Stream | Select-String "Sec" -CaseSensitive
# For case-sensitive non-regex match
Get-Service | Out-String -Stream | Select-String "Sec" -CaseSensitive -SimpleMatch
In all cases, Select-String treats the pattern as a regex (use the -SimpleMatch switch for a literal string match), matches it against each input string, and outputs each entire input string that matched. So if you pipe in only a single string with many lines, all of those lines are returned together on a successful match.
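To see the difference, here is a minimal sketch (the counts are illustrative and assume at least one service's display line contains "Sec"; the exact numbers depend on your machine):
# Without -Stream, the whole table arrives as ONE multi-line string, so at most one match object is returned:
(Get-Service | Out-String | Select-String "Sec" | Measure-Object).Count
# With -Stream, each display line is its own string, so you get one match object per matching line:
(Get-Service | Out-String -Stream | Select-String "Sec" | Measure-Object).Count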
To complement AdminOfThings' helpful answer:
In order to find strings among the lines of the for-display string representations of non-string input objects (as they would print to the console), you indeed have to pipe to Out-String -Stream; by default, simple .ToString() stringification is applied instead[1].
You shouldn't have to do this manually, however: Select-String should do it implicitly, as suggested in GitHub issue #10726.
Curiously, when piping to external programs such as findstr.exe, PowerShell already does apply Out-String -Stream implicitly; e.g:
Get-Date 1/1/2019 | findstr January works (in en-based cultures), because it is implicitly the same as Get-Date 1/1/2019 | Out-String -Stream | findstr January
By contrast, Get-Date 1/1/2019 | Select-String January is the equivalent of (Get-Date 1/1/2019).ToString([cultureinfo]::InvariantCulture) | Select-String January, and therefore does not work, because the input evaluates to 01/01/2019 00:00:00.
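Applying the explicit streaming stringification from above makes the Select-String variant behave like the findstr one (a sketch; as before, it assumes an en-based culture for the month name):
Get-Date 1/1/2019 | Out-String -Stream | Select-String January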
[1] More accurately, .psobject.ToString() is called, either as-is, or - if the object's ToString method supports an IFormatProvider-typed argument - as .psobject.ToString([cultureinfo]::InvariantCulture) so as to obtain a culture-invariant representation - see this answer for more information.
Is it possible to use any sort of diff utility to diff based on filename only, excluding the file extensions? I have multiple dirs that have various versions of a file, i.e. media.mov, media.mp4, media.jpg, etc. I want to make sure all versions were made for each file (thousands of them). So /dir1/media_99.mov and /dir2/media_99.mp4 would yield a TRUE condition. The diff man page does not show an "--ignore-extension" option, and I'm not sure how I could use "--exclude-from=FILE". I can use Linux (preferred) or PowerShell (if I must).
In PowerShell, if you want to know which file names are unique to /dir1, use a Compare-Object call, followed by reducing those file names to their base name (file name without extension), weeding out duplicates, and sorting via Sort-Object:
Compare-Object -PassThru -Property Name (Get-ChildItem -File /dir1) (Get-ChildItem -File /dir2) |
Where-Object SideIndicator -eq '<=' |
ForEach-Object BaseName |
Sort-Object -Unique
Note: The assumption is that both Get-ChildItem calls return at least one file-info object, otherwise the Compare-Object call will fail - guard against that with if statements, if necessary.
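If only the extension-less names matter (as in the question), you could also compare the base names directly; a minimal sketch, assuming the same two directories (the same empty-directory caveat applies):
Compare-Object (Get-ChildItem -File /dir1).BaseName (Get-ChildItem -File /dir2).BaseName |
  Where-Object SideIndicator -eq '<=' | # base names present only in /dir1
  ForEach-Object InputObject |
  Sort-Object -Unique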
Any way to concatenate commands in Powershell for Linux?
This is what I'm trying to run:
pwsh -Command Invoke-ScriptAnalyzer -Path . | ConvertTo-Json | Out-File -FilePath "/home/administrator/scripts/test.json"
So, run the Invoke-ScriptAnalyzer, convert the results to Json and save the result to test.json.
It doesn't recognize anything after the | sign:
./test.sh: line 1: ConvertTo-Json: command not found
./test.sh: line 1: Out-File: command not found
I need this to go in a bash script, hence the need to launch pwsh with the -Command option.
Any ideas?
Thanks
As written, your | symbols are interpreted by your Linux shell (such as bash), not by PowerShell.
There are two solutions:
Preferably, quote the entire -Command argument (a '...' string in a POSIX-compatible shell such as bash uses the string's content verbatim):
pwsh -Command 'Invoke-ScriptAnalyzer -Path . | ConvertTo-Json | Out-File -FilePath "/home/administrator/scripts/test.json"'
Alternatively, individually \-escape all Linux-shell metacharacters that you want to pass through verbatim, such as \| and \" in this case:
pwsh -Command Invoke-ScriptAnalyzer -Path . \| ConvertTo-Json \| Out-File -FilePath \"/home/administrator/scripts/test.json\"
Note: In this case, pwsh receives multiple arguments after -Command. However, PowerShell simply joins them before interpreting the result as PowerShell code.
This differs from the command-line processing of POSIX-compatible shells, which interpret the first post -c argument as an implicit, ad-hoc script, with additional arguments getting passed as verbatim arguments to that script, accessible as usual as $1, ...
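A quick way to see the joining behavior (a sketch; both invocations print hello and world on separate lines, because no shell metacharacters are involved):
pwsh -Command Write-Output hello world     # unquoted: pwsh space-joins the arguments into one command string
pwsh -Command 'Write-Output hello world'   # quoted: the same command string, passed as a single argument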
Currently I'm using this command on Linux:
grep Ban /var/log/fail2ban.log | grep -v 'Restore Ban' | sed 's/\s\s*/ /g' | cut -d" " -f8 | sort | uniq -c | sort -t ' ' -n -b
The log file looks like this:
2019-03-04 07:14:45,778 fail2ban.filter [19052]: INFO [sshd] Found 2*8.1*7.1*9.2*9
2019-03-04 07:14:46,412 fail2ban.actions [19052]: NOTICE [sshd] Ban 2*8.1*7.1*9.2*9
2019-03-04 07:15:04,708 fail2ban.actions [19052]: NOTICE [sshd] Unban 1*9.2*.2*4.1*6
...
The output looks like this:
8 1*2.2*6.1*1.1*5
12 3*.1*.*4.*6
18 1*5.2*8.2*5.4
19 1*2.2*6.1*1.1*4
72 3*.1*6.2*.9*
I already tried it with Get-Content but I don't understand all of the PowerShell syntax.
Your Linux command packs a lot of functionality into a single pipeline.
Your lack of effort to solve the problem yourself notwithstanding, constructing an equivalent PowerShell command is an interesting exercise in contrasting
a Unix-utilities solution with a PowerShell solution:
To set the scene, let me explain what your command does:
grep Ban /var/log/fail2ban.log case-sensitively finds lines that contain the string Ban in file /var/log/fail2ban.log and passes only those on.
grep -v 'Restore Ban' further (case-sensitively) filters out (-v) lines that contain the phrase 'Restore Ban'.
sed 's/\s\s*/ /g' replaces all (g) runs of one or more whitespace characters (\s\s*; in a modern regex dialect you'd use \s+) with a single space ...
... which then allows cut -d" " -f8 to reliably extract the 8th field from each line of the resulting space-separated list (e.g., 2*8.1*7.1*9.2*9).
sort then lexically sorts the resulting lines, and uniq -c weeds out duplicates, while prepending each unique line with the count of duplicates (-c), with 1 indicating a unique line.
Finally, sort -t ' ' -n -b sorts the resulting lines numerically by duplicate count.
In short: your command filters a log file via regex matching, extracts the 8th field from each line, eliminates duplicates, and prints unique fields prefixed with their duplicate count, sorted by duplicate count in ascending order.
Below is a near-equivalent PowerShell command, which:
is more readable (and therefore, of necessity, more verbose)
involves fewer steps
ultimately offers much more flexibility, due to:
sending objects through the pipeline, not just text that must often be (re)parsed - it is this feature that constitutes PowerShell's evolutionary quantum leap from traditional shells.
far superior language features (compared to POSIX-like shells such as bash) that can easily be woven into a pipeline.
That said, the price you pay for the increased power is performance:
Directly comparable commands perform much better using Unix utilities, though the usually higher level of abstraction and flexibility provided by PowerShell cmdlets may make up for that.
Here's the command, with the roughly corresponding Unix-utility calls in comments:
Select-String -CaseSensitive '(?<!Restore )Ban' /var/log/fail2ban.log | #grep,grep -v
ForEach-Object { (-split $_.Line)[7] } | # sed, cut -f8
Group-Object | # uniq -c
Select-Object Count, Name | # construction of output *objects*
Sort-Object Count, Name # sort, sort -n
The command outputs objects with a .Count (duplicate count) and .Name property (the 8th field from the log file), which:
allow for robust additional processing (no parsing of textual output needed).
render in a friendly manner to the console (see below).
Example output:
Count Name
----- ----
8 1*2.2*6.1*1.1*5
12 3*.1*.*4.*6
18 1*5.2*8.2*5.4
19 1*2.2*6.1*1.1*4
72 3*.1*6.2*.9*
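Because the pipeline emits objects rather than text, further processing requires no re-parsing; a minimal sketch (the $banStats variable name and the 10-ban threshold are mine, not from the question):
# Capture the objects emitted by the pipeline shown above.
$banStats = Select-String -CaseSensitive '(?<!Restore )Ban' /var/log/fail2ban.log |
  ForEach-Object { (-split $_.Line)[7] } |
  Group-Object |
  Select-Object Count, Name |
  Sort-Object Count, Name
# Hypothetical follow-up: only the addresses banned more than 10 times.
$banStats | Where-Object Count -gt 10 | ForEach-Object Name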
For an explanation of the command, consult the following help topics, which are also available locally, as part of a PowerShell installation, via the Get-Help cmdlet:
Select-String
about_Regular_Expressions
ForEach-Object
about_Split (the -split operator)
Group-Object
Select-Object
To learn about renaming properties or creating calculated properties, see this answer; a small sketch follows after this list.
Sort-Object
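As for calculated properties, here is a minimal sketch in the context of this command, reusing the $banStats variable from the earlier sketch (the property name IP is my choice, not from the original):
# Hypothetical: expose the grouped value under a friendlier name via a calculated property.
$banStats | Select-Object Count, @{ Name = 'IP'; Expression = { $_.Name } }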
((Get-Content "fail2ban.log") -cmatch "(?<!Restore )Ban" | Select-String -Pattern "[0-9.*]+$" -AllMatches).matches.value | Group-Object | foreach {"$($_.count) $($_.name)"}
Get-Content grabs each line of the fail2ban.log file. The -cmatch operator performs a case-sensitive regex match; applied to the array of lines, it acts as a filter and returns only the matching lines. The regex pattern looks for the string Ban with a negative lookbehind for the string Restore . Select-String then looks for a regex pattern at the end of each line made up of characters in the set (0123456789.*). The .Matches.Value property outputs only the matched strings from the regex. Group-Object groups each identically matched value under the Name property and adds a Count property. Since the OP was capturing a count, I decided to use Group-Object to easily get that. The foreach simply formats the output to match the presentation in the question.
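As a side note on the negative lookbehind used in both answers, here is a minimal sketch with made-up input strings (1.2.3.4 is a placeholder address):
# -cmatch against an array acts as a case-sensitive filter: only the first string survives,
# because "Restore Ban" is excluded by the lookbehind and "Unban" only contains a lowercase "ban".
'NOTICE [sshd] Ban 1.2.3.4', 'NOTICE [sshd] Restore Ban 1.2.3.4', 'NOTICE [sshd] Unban 1.2.3.4' -cmatch '(?<!Restore )Ban'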
Update: I tried text without any back ticks and it is still not replacing.
I have a file test.txt with these records:
name="BLUE_TBL_AC_EA_FREQ"` owner="BLUE_TBL_AC_EA_FREQ"
name="BLUE_TBL_AC_EA_FREQ" owner="RED_TBL_AC_EA_FREQ"
I would like to replace the string for the name attribute, so I would use
name="BLUE_TBL_AC_EA_FREQ"
in the PowerShell script. This is what I did, but the output file does not contain the replaced text; it is the same as the input.
Then I call this PowerShell command from a batch script:
powershell -Command "(Get-Content "test.txt") | Foreach-Object {$_ -replace 'name="BLUE_TBL_AC_EA_FREQ"', 'name="BLUE_TBL_ACEAFREQ"'} | Set-Content "testo.txt""
The batch script does not replace the matching string.
Am I doing something wrong, do I have to escape the double quotes for the string?
It is still not working for me.
End update
I am trying to use PowerShell to replace some text that makes use of the back tick in a file.
In the file there is text such as
' name="BLUE_TBL_AC_EA_`FREQ"'
I would like to replace this text by
' name="BLUE_TBL_ACEAFREQ"'
The replace that I use in PowerShell looks like this:
powershell -command "& {(Get-Content "file.txt") | Foreach-Object {$_ -replace ' name="BLUE_TBL_AC_EA_`FREQ"', ' name="BLUE_TBL_ACEAFREQ"'} | Set-Content "fileO.txt;}"
The leading space in ' name= is needed since there are multiple strings that contain the name="... string.
When I use this command nothing gets replaced. I have tried to escape the back tick by using a double back tick (``) and still nothing gets replaced.
I have searched and read many articles about Powershell and replacing text and the need for escaping characters. I thought I did what was needed but nothing gets replaced.
How do I replace the text that includes a back tick?
It looks like when a replace is used in a batch script and the text contains either a back tick character or a double quote, all of the back ticks and double quotes need to be escaped in the batch script.
The powershell command that was
powershell -Command "(Get-Content "test.txt") | Foreach-Object {$_ -replace 'name="BLUE_TBL_AC_EA_FREQ"', 'name="BLUE_TBL_ACEAFREQ"'} | Set-Content "testo.txt""
Becomes:
powershell -Command "(Get-Content "test.txt") | Foreach-Object {$_ -replace 'name=\"BLUE_TBL_AC_EA_FREQ\"', 'name=\"BLUE_TBL_ACEAFREQ\"'} | Set-Content "testo.txt""
If there is a back tick in the string, then the back tick must be escaped as well, using a backslash, such as
\`
What a mess in a batch script.
Thanks for helping me figure this out.
I think your command has an error; that's why nothing is changed. You missed the double quote after the txt, before the semicolon.
I tried this in my PowerShell and it works.
(Get-Content "C:\test.txt") | Foreach-Object {$_ -replace ' name="BLUE_TBL_AC_EA_`FREQ"', ' name="BLUE_TBL_ACEAFREQ"'} | Set-Content "C:\test.txt"
I'm trying to make a PowerShell script that reports whether there's a file not older than x hours which contains some string pattern. I made this:
Get-ChildItem C:\Folder -recurse | Select-String -pattern "err" | group path | select name | Where {$_.LastWriteTime -gt (Get-Date).AddHours(-12)}
The problem is that the last part of the code, which should select only files younger than x hours, does not work - it shows no files. When I change -gt to -lt, it shows every file in the folder that contains the pattern, including files younger than the defined number of hours.
Does anyone have a solution, please?
Thank you in advance
Your pipeline is in the wrong order. You are piping a collection of MatchInfo objects (the output of Select-String) to Group-Object, which pipes a different collection to Select-Object, and so on. Your call to Where-Object is therefore receiving the output of Select-Object, which is a collection of PSCustomObjects that have only a Name property; since they have no LastWriteTime property, the -gt comparison is never true (and with -lt it always is), which matches what you saw.
What you want is to pipe the file objects themselves to Where-Object, and then pass those file objects down the pipeline:
Get-ChildItem C:\Folder -recurse |
Where {$_.LastWriteTime -gt (Get-Date).AddHours(-12)} |
Select-String -pattern "err" | group path | select name
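If all you need is each distinct file path as a plain string, here is a slightly leaner variant (a sketch; -List stops at the first match per file, so the grouping step becomes unnecessary):
Get-ChildItem C:\Folder -recurse |
  Where {$_.LastWriteTime -gt (Get-Date).AddHours(-12)} |
  Select-String -pattern "err" -List |   # at most one match object per file
  ForEach-Object Path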