I wrote a simple PHP shell script which parses files and outputs certain elements.
It generates lots of output in different (bash) colors: green for OK, yellow for warnings, red for errors, etc.
During development I want to filter some lines out, for example all lines that contain red text.
Can I use a grep (or other) command for this?
I have no idea what your input looks like, but as a proof of concept you can filter any lines in ls output that use green colour:
ls --color=always | grep '^[\[01;32m'
The lookup table for other colours can be found here: http://en.wikipedia.org/wiki/ANSI_escape_code#Colors
Hint: In case you didn't know, the ^[ part above is a literal escape character; it should be entered as Ctrl-V Esc (or indeed Ctrl-V Ctrl-[ on most terminals).
I'm sure there is some option to make grep understand \x1B instead, but I haven't found it.
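For the record, you can avoid typing a literal escape character by letting the shell or the regex engine produce it. A rough sketch, assuming bash or zsh for the $'...' quoting and a GNU grep built with -P (PCRE) support; 32 is green in the table linked above:
ls --color=always | grep $'\e\[01;32m'
ls --color=always | grep -P '\x1b\[01;32m'
To drop the red lines from your own script instead, invert the match with -v and use 31 (the exact sequence depends on how your script emits red).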
As far as I understand, you parse the input once to colorize it anyway, right? Why not 'cut out' warnings/errors in the same function? Make your script use command line options, like myscript --nowarnings
There is a getopt-for-PHP tutorial here
I don't know much PHP, but something like this (a rough sketch):
function paintRed($string, $show) {
    // print the matched line in red only when it should be shown
    if ($show) {
        echo "\033[31m" . $string . "\033[0m\n"; // \033[31m = red, \033[0m = reset
        return 1;
    }
    return 0;
}
Where $show would depend on a command-line option.
This way you only parse the file once, and you give the future users an option to skip OK lines or warnings.
When using the fish shell in a terminal emulator (such as Terminator) together with a command that outputs lots of text, it can be useful to get some color coding on the output. I know that a program can add color-code information to its output, like "grep --color=auto" does. I guess it's possible to make fish scan through the output and add this in special places?
What I want is for the text "error" appearing in the output of any command to always be marked red, and "warning" to always be marked yellow. Does anyone know if this is possible by introducing function files in the ~/.config/fish/functions dir or similar?
This is basically a layering violation. Usually the output of external commands does not go back through the shell. It goes directly to the terminal.
Also, anything you do here has the potential to slow output down. (And because of fish issue #1396, this can be rather extreme).
That said, it's possible if you always pipe to a function like this:
function colorstuff
    while read -l input
        switch $input
            case "*error*"
                set_color red
                echo $input
            case "*warning*"
                set_color yellow
                echo $input
            case "*"
                set_color normal
                echo $input
        end
    end
    set_color normal
end
Use it like somecommand | colorstuff. (And maybe add 2>&1 if you also wish to have stderr colored)
In my tests, this causes a noticeable slowdown, and even with that issue fixed it will still be slower since it has to match every single line.
Really, the right solution is for whatever tool you are using to color its own output, since it knows what the output means. All this function can do is look for keywords.
For general output colorization needs, I added the grc plugin to Tackle for precisely that purpose.
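For context, grc wraps a command and colorizes its output based on per-command config files; typical invocations (assuming grc itself is installed) look like:
grc tail -f /var/log/syslog
grc ping example.com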
I ran into this problem with grep and would like to know if it's a bug or not. The reproducible scenario is a file with the contents:
string
string-
and save it as 'file'. The goal is to use grep with --color=always to output 'string' while excluding 'string-'. Without --color, the following works as expected:
$ grep string file | grep -v string-
but using --color outputs both instances:
$ grep --color=always string file | grep -v string-
I experimented with several variations but it seems --color breaks the expected behavior. Is this a bug or am I misunderstanding something? My assumption is that passing --color should have no effect on the outcome.
@Jake Gould's answer provides a great analysis of what actually happens, but let me try to phrase it differently:
--color=always uses ANSI escape codes for coloring.
In other words: --color=always by design ALTERS its output, because it must add the requisite escape sequences to achieve coloring.
Never use --color=always unless you know the consumer of the output expects ANSI escape sequences - typically, that means human eyeballs looking at a terminal.
If you're not sure how the output will be processed, use --color=auto, which - I believe - causes grep to apply coloring only if its stdout is connected to a terminal.
In a given pipeline, it typically only makes sense to apply --color=auto (or --color=always) to a grep command that is the LAST command in the pipeline.
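Applied to the question's example, a sketch of that advice: do the filtering first, then let a final grep re-match purely for the highlighting:
grep string file | grep -v string- | grep --color=auto string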
When you use --color, grep adds ANSI color coding (I believe?). So your text, which looks like this:
string
string-
Will actually look like this in terms of pure, unprocessed ASCII text:
^[[01;31m^[[Kstring^[[m^[[K
^[[01;31m^[[Kstring^[[m^[[K-
There is some nice info provided in this question thread, including this great answer.
My assumption is that passing --color should have no effect on the outcome.
Nope. The purpose of grep, as with most Unix/Linux tools, is to provide a basic, simple service & do it well. And that service is to search a plain-text (key word here) input file based on a pattern & return the output. The --color option is a small nod to the fact that we are humans & staring at screens of uncolored text all day can drive you nuts. Color coding makes work easier.
So color coding with ANSI is usually considered a final step in a process. It's not the job of grep to assume that if it comes across ANSI in its input it should ignore it. Perhaps a case could be made to add a --decolor option to grep, but I doubt that is a feature worth the effort.
grep is a base level plain-text parsing tool. Nothing more & nothing less.
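That said, if you are stuck with colored input, one workaround is to strip the escape sequences yourself before filtering. A sketch assuming GNU sed, which understands \x1b (grep's coloring also inserts \x1b[K sequences, hence the [mK] class):
grep --color=always string file | sed 's/\x1b\[[0-9;]*[mK]//g' | grep -v string-
Of course this discards the coloring too, which is why applying --color only to the last grep in the pipeline is usually the better fix.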
I'd like to have a command I can insert into a command pipeline that adds color escapes to its input according to vim's syntax highlighting capabilities.
For example:
cat somefile.js | vim - <???> | less
The resulting text would be that of somefile.js, but colorized according to how the current vim configuration would do it in-editor.
It occurs to me that this must be possible. I agree that the example up there isn't what a sane man might call exactly useful, but that doesn't mean the idea never is.
I think your idea has one basic flaw: that nobody ever thought about allowing such a thing.
Clearly vim is capable of doing syntax highlighting. But I'll bet you an ice cream cone that if you can manage to get vim to stream text through and process it, that you won't like the results.
Consider what happens when you pipe text through more (or less if you prefer). When it goes to the terminal, these programs display one screenful and wait for you to hit the space bar. But if you redirect stdout to some other place than the terminal, these programs notice this and simply copy their input to their output unchanged.
If vim doesn't notice that you are piping text through, it is likely to send cursor-movement commands that you probably don't want in your output. If vim does notice, it is likely to just pass the text, and not syntax-color it. Only if vim does do the syntax-coloring but does not inject cursor-movement stuff will your idea work.
You could try it. Here's an answer that discusses piping stuff through vim:
Execute a command within Vim from the command line
But I say why not pipe your text through a program that was designed and intended to have text piped through it? Pygments can colorize every major programming language and markup format.
http://pygments.org/
The major advantage I see for your idea: you can customize the way vim does syntax coloring, get it the way you want it, and then also use vim to process your text. But it's probably not that hard to customize Pygments, and it might even be satisfactory out of the box, in which case it would definitely be the easiest way to go. And Pygments not only has ANSI sequence output, it also has HTML output, RTF, LaTeX, etc. So if you get Pygments working the way you want it to, it should be able to output whatever output format you need; vim will only have the ANSI sequence one.
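A sketch of what that pipeline could look like, assuming the pygmentize command that Pygments installs (the lexer name is just a guess for the JavaScript example above):
pygmentize -l javascript somefile.js | less -R
The -R flag tells less to pass the raw ANSI colour escapes through instead of displaying them as ^[ sequences.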
There's a Perl module called Text::VimColor that I've heard will do kinda what you're looking for.
http://search.cpan.org/dist/Text-VimColor/
But let me ask this: Why do you want it to go through less? Why not use vim as a perfectly good file viewer? view - will read from standard input in read-only mode.
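For example, reusing the pipeline from the question (view is just vim in read-only mode, so your normal syntax highlighting applies):
cat somefile.js | view -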
https://gist.github.com/echristopherson/4090959
Via https://superuser.com/a/554531/7198.
Tried on /etc/passwd and it works surprisingly well!
This might be what you're after:
cat filename.sh | vim - -c 'syntax on' -c 'set syntax=sh'
This is ugly, but you could alias this:
alias vim.sh="vim -c 'syntax on' -c 'set syntax=sh'"
Then use like this:
cat filename.sh | vim.sh -
Use vimcat!
wget -O /usr/local/bin/vimcat "https://www.vim.org/scripts/download_script.php?src_id=23422"
chmod 755 /usr/local/bin/vimcat
vimcat /etc/passwd
See also: https://www.vim.org/scripts/script.php?script_id=4325
I'm using cucumber to run some tests. It colorizes its output using ANSI escapes. This is great, but currently its producing more output than I care about, and shoving things I do care about off the screen. There doesn't seem to be a way to eliminate the other lines from within cucumber, but I can pipe the output through grep to pare down to the ones I care about.
The downside of this solution, though, is that all the colors are lost. I know it's not my shell or grep's fault, because % echo "\e[35mhello\e[00m world" | grep hello works just fine, so it must be cucumber disabling its own color somehow.
How can I preserve the colored output when I pipe the output of cucumber?
Doh. It's covered in cucumber -h. Use the -c flag to force colorized output.
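So the pipeline becomes something like the following, where the grep pattern is only a placeholder for whatever lines you actually care about:
cucumber -c | grep 'pattern-you-care-about'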
I need to read through some gigantic log files on a Linux system. There's a lot of clutter in the logs. At the moment I'm doing something like this:
cat logfile.txt | grep -v "IgnoreThis\|IgnoreThat" | less
But it's cumbersome -- every time I want to add another filter, I need to quit less and edit the command line. Some of the filters are relatively complicated and may be multi-line.
I'd like some way to apply filters as I am reading through the log, and a way to save these filters somewhere.
Is there a tool that can do this for me? I can't install new software so hopefully it's something that would already be installed -- e.g., less, vi, something in a Python or Perl lib, etc.
Changing the code that generates the log to generate less is not an option.
Use the &pattern command within less.
From the man page for less:
&pattern
    Display only lines which match the pattern; lines which do not match the pattern are not displayed. If pattern is empty (if you type & immediately followed by ENTER), any filtering is turned off, and all lines are displayed. While filtering is in effect, an ampersand is displayed at the beginning of the prompt, as a reminder that some lines in the file may be hidden.
    Certain characters are special as in the / command:
    ^N or !
        Display only lines which do NOT match the pattern.
    ^R
        Don't interpret regular expression metacharacters; that is, do a simple textual comparison.
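So for the filters from the question you could open the log directly and type the filter at the less prompt; the leading ! inverts the match, and the | alternation works as long as less was built with extended regex support (typical on Linux):
less logfile.txt
&!IgnoreThis|IgnoreThat
Typing & followed by just ENTER turns the filtering off again.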
Try the multitail tool - as well as letting you view multiple logs at once, I'm pretty sure it lets you apply regex filters interactively.
Based on ghostdog74's answer and the less manpage, I came up with this:
~/.bashrc:
export LESSOPEN='|~/less-filter.sh %s'
export LESS=-R # to allow ANSI colors
~/less-filter.sh:
#!/bin/sh
case "$1" in
    *logfile*.log*) sed -f ~/less-filter.sed "$1" ;;
esac
~/less-filter.sed:
# filter out lines
/deleteLinesLikeThis/d
# change text on lines (useful to colorize using ANSI escapes)
s/this/that/
Then:
less logfileFooBar.log.1 -- the filter is applied automatically.
cat logfileFooBar.log.1 | less -- to see the log without filtering
This is adequate for now but I would still like to be able to edit the filters on the fly.
See the man page of less; there are some options you can use to search for words, for example. It has a line-editing mode as well.
There's an application by Casstor Software Solutions called LogFilter (www.casstor.com) that can edit Windows/Mac/Linux text files and can easily perform file filtering. It supports multiple filters as well as regular expressions. I think it might be what you're looking for.