Color information disappears when writing the output of 'grep --color=auto' to a file - linux

When I run the following with grep aliased to grep --color=auto:
echo abcde | grep 'ab'
it prints abcde with ab in red.
But with
echo abcde | grep 'ab' >foo.txt
foo.txt contains just plain abcde.
I guess in the first case my terminal shows ab in red because of some markers emitted by grep, but foo.txt does not contain them. Is this grep's doing?
Does grep decide what to output depending on where the output goes?
My grep is grep (GNU grep) 2.20

With the auto setting, grep detects where its output is going and disables coloring when the output is redirected (color is enabled only for a terminal).
Use --color=always to force it to color ... always, but I don't think you will find those control sequences nice to view in a text file.

With --color=auto, grep checks whether the output goes to a terminal, and switches colours on only when it does. You need to specify --color=always instead.
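To see what that means in practice, force the color on and look at what lands in the file (the exact bytes depend on your GREP_COLORS setting):
echo abcde | grep --color=always 'ab' >foo.txt
cat -v foo.txt
cat -v makes the escape sequences visible, roughly ^[[01;31m^[[Kab^[[m^[[Kcde with the default colors, which is why the answer above warns that they are not nice to view in a text file.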

Related

How to grep with input ansi color encoding maintained?

The following grep does not maintain the input ANSI color encoding. Is there a way to maintain the original color encoding?
$ builtin printf '%s\n' $'\e[33mx\e[0m' | grep $'\e\[33m'
x
Disable grep's coloring with --color=never:
$ builtin printf '%s\n' $'\e[33mx\e[0m' | grep --color=never $'\e\[33m'
Otherwise, grep inserts an escape sequence before the matched string \e\[33m to color it, and another sequence after it to reset all coloring, which is why the letter x ends up uncolored in the output.
The default coloring mode of grep is --color=auto, which colors the output only if the output is a terminal. So, another way of disabling the coloring would be redirecting the output of grep to somewhere other than the terminal, e.g. cat:
$ builtin printf '%s\n' $'\e[33mx\e[0m' | grep $'\e\[33m' | cat
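To see what is going on, force the coloring on and pipe through cat -v to make the escape sequences visible (the exact sequences depend on your grep version and color settings):
$ builtin printf '%s\n' $'\e[33mx\e[0m' | grep --color=always $'\e\[33m' | cat -v
The matched \e[33m gets wrapped in grep's own color and reset sequences, and the reset placed right after the match is what leaves the trailing x uncolored.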

is it possible to run a linux command on the output of a previous command ASSUMING that the previous command comes first?

I know that I can use `` to get the output of a command, for example:
echo `ls`
but is there a way for me to use the ls command first and then run echo on it? For example: ls <some special redirection> echo? I tried ls > echo and it does not do what I want.
The reason I am asking is that sometimes I write complicated commands to get certain output, for example: bjobs -u username01 | grep normal | awk '{print $1}' is a simple "complicated" command (sometimes 6 or 7 are chained together). Currently I have to do
Mycommand `(complicated string of commands)`
but I would much rather just do
(complicated string of commands) <some special redirection> Mycommand
is this possible?
You may use xargs
ls | xargs echo
When you need to do more with the results, you can parse them.
Parsing ls should be avoided; this just rewrites the example:
ls | while read -r file; do
  echo "I found ${file}"
done
This construction can be useful for more difficult parsing:
echo "red ford 2012
blue mustang 1998" | while read color car year; do
echo "My ${color} ${car} is from the year ${year}"
done
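Applied to the question's own example (assuming the hypothetical Mycommand accepts the extracted job IDs as command-line arguments), the xargs approach would look like this:
bjobs -u username01 | grep normal | awk '{print $1}' | xargs Mycommand
If Mycommand needs to run once per job ID instead of getting all of them at once, xargs -n 1 Mycommand does that.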

linux command grep -is "abc" filename|wc -l

What does the -s option mean there, and what is the pipe into wc for? I know it eventually counts the number of times abc appears in the file filename, but I am not sure what -s does or what piping to wc means.
linux command grep -is "abc" filename|wc -l
output
47
-s means "suppress error messages about unreadable files" and the pipe to wc means "take the output and send it to the wc -l command" which effectively counts the number of lines matched. You can accomplish the same with the -c option to grep: grep -isc "abc" filename
Consider,
command_1 | command_2
The role of the pipe is to take the output of the command written before it (command_1 here) and supply that output to the command written after it (command_2 here).
The man page has everything you would want to know about the options for grep:
-s, --no-messages
Suppress error messages about nonexistent or unreadable files.
Portability note: unlike GNU grep, traditional grep did not conform to POSIX.2, because traditional grep lacked a -q option and its -s option behaved like GNU grep's -q option. Shell scripts intended to be portable to traditional grep should avoid both -q and -s and should redirect output to /dev/null instead.
The pipe to wc -l is what gives you the count of how many lines the string "abc" appeared on. It isn't necessarily the number of times the string appeared in the file since one line with multiple occurrences is going to be counted as only 1.
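If you do want the number of occurrences rather than the number of matching lines, one option (assuming a grep that supports -o, as GNU grep does) is to print every match on its own line and count those:
grep -io "abc" filename | wc -l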
grep man page says:
-s, --no-messages suppress error messages
grep returns the lines that have abc (case insensitive) in them. You pipe them to wc to get a count of the number of lines.
From man grep:
-s, --no-messages
Suppress error messages about nonexistent or unreadable files.
The wc command counts lines, words, and characters. With -l it returns the number of lines.

Preserve colouring after piping grep to grep

There is a similar question in Preserve ls colouring after grep’ing, but it annoys me that if you pipe colored grep output into another grep, the coloring is not preserved.
As an example, grep --color WORD * | grep -v AVOID does not keep the color of the first output, but for me ls | grep FILE does keep the color. Why the difference?
grep sometimes disables the color output, for example when writing to a pipe. You can override this behavior with grep --color=always
The correct command line would be
grep --color=always WORD * | grep -v AVOID
This is pretty verbose; alternatively, you can just add the line
alias cgrep="grep --color=always"
to your .bashrc, for example, and use cgrep as the colored grep. If you redefine grep itself, you might run into trouble with scripts that rely on specific grep output and don't like ASCII escape codes.
A word of advice:
When using grep --color=always, the actual strings being passed on to the next pipe will be changed. This can lead to the following situation:
$ grep --color=always -e '1' * | grep -ve '12'
11
12
13
Even though the option -ve '12' should exclude the middle line, it does not, because there are color escape sequences between the 1 and the 2.
The existing answers only address the case when the FIRST command is grep (as asked by the OP, but this problem arises in other situations too).
More general answer
The basic problem is that the command BEFORE | grep tries to be "smart" by disabling color when it realizes the output is going to a pipe. This is usually what you want, so that ANSI escape codes don't interfere with your downstream program.
But if you want colorized output emanating from earlier commands, you need to force color codes to be produced regardless of the output sink. The forcing mechanism is program-specific.
Git: use -c color.status=always
git -c color.status=always status | grep -v .DS_Store
Note: the -c option must come BEFORE the subcommand status.
Others
(this is a community wiki post so feel free to add yours)
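GNU ls: use --color=always if your ls (for example, an ls --color=auto alias) would otherwise switch color off when writing to a pipe:
ls --color=always | grep FILE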
Simply repeat the same grep command at the end of your pipe.
grep WORD * | grep -v AVOID | grep -v AVOID2 | grep WORD

Pipe output to use as the search specification for grep on Linux

How do I pipe the output of grep as the search pattern for another grep?
As an example:
grep <Search_term> <file1> | xargs grep <file2>
I want the output of the first grep as the search term for the second grep. The above command is treating the output of the first grep as the file name for the second grep. I tried using the -e option for the second grep, but it does not work either.
You need to use xargs's -i switch:
grep ... | xargs -ifoo grep foo file_in_which_to_search
This takes the string given after -i (foo in this case) and replaces every occurrence of it in the command with each line of the first grep's output.
This is the same as:
grep `grep ...` file_in_which_to_search
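A hedged aside: GNU xargs has deprecated the lowercase -i, and BSD/macOS xargs only accepts the capital -I form (see the note further down), so a more portable spelling is:
grep ... | xargs -I {} grep {} file_in_which_to_search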
Try
grep ... | fgrep -f - file1 file2 ...
If using Bash then you can use backticks:
> grep -e "`grep ... ...`" files
the -e flag and the double quotes are there to ensure that any output from the initial grep that starts with a hyphen isn't then interpreted as an option to the second grep.
Note that the double quoting trick (which also ensures that the output from grep is treated as a single parameter) only works with Bash. It doesn't appear to work with (t)csh.
Note also that backticks are the standard way to get the output from one program into the parameter list of another. Not all programs have a convenient way to read parameters from stdin the way that (f)grep does.
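The same substitution can also be written with the $(...) form, which nests more cleanly than backticks; the quoting advice above still applies:
grep -e "$(grep ... ...)" files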
I wanted to search for text in files (using grep) that had a certain pattern in their file names (found using find) in the current directory. I used the following command:
grep -i "pattern1" $(find . -name "pattern2")
Here pattern2 is the pattern in the file names and pattern1 is the pattern searched for
within files matching pattern2.
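A hedged aside: the $(find ...) substitution word-splits on whitespace, so it breaks on file names containing spaces. Letting find invoke grep directly avoids that:
find . -name "pattern2" -exec grep -i "pattern1" {} +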
edit: Not strictly piping but still related and quite useful...
This is what I use to search for a file from a listing:
ls -la | grep 'file-in-which-to-search'
Okay breaking the rules as this isn't an answer, just a note that I can't get any of these solutions to work.
% fgrep -f test file
works fine.
% cat test | fgrep -f - file
fgrep: -: No such file or directory
fails.
% cat test | xargs -ifoo grep foo file
xargs: illegal option -- i
usage: xargs [-0opt] [-E eofstr] [-I replstr [-R replacements]] [-J replstr]
[-L number] [-n number [-x]] [-P maxprocs] [-s size]
[utility [argument ...]]
fails. Note that a capital I is necessary. If I use that, all is good.
% grep "`cat test`" file
kinda works in that it returns a line for the terms that match but it also returns a line grep: line 3 in test: No such file or directory for each file that doesn't find a match.
Am I missing something or is this just differences in my Darwin distribution or bash shell?
I tried this way, and it works great.
[opuser@vjmachine abc]$ cat a
not problem
all
problem
first
not to get
read problem
read not problem
[opuser@vjmachine abc]$ cat b
not problem xxy
problem abcd
read problem werwer
read not problem 98989
123 not problem 345
345 problem tyu
[opuser@vjmachine abc]$ grep -e "`grep problem a`" b --col
not problem xxy
problem abcd
read problem werwer
read not problem 98989
123 not problem 345
345 problem tyu
[opuser@vjmachine abc]$
You should run grep in such a way that it extracts file names only; see the -l parameter (a lowercase L):
grep -l someSearch * | xargs grep otherSearch
With a plain grep, the output is much more than just the file names. For instance, when you do
grep someSearch *
you will pipe information like this to xargs:
filename1: blablabla someSearch blablabla something else
filename2: bla someSearch bla otherSearch
...
Passing any of the above lines to xargs makes no sense.
But when you do grep -l someSearch *, your output will look like this:
filename1
filename2
Such output can now be passed to xargs.
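If the matched file names may contain spaces, a safer variant (assuming GNU grep and xargs, which support NUL-separated lists) is:
grep -lZ someSearch * | xargs -0 grep otherSearch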
I have found the following command to work, using $() with my first command inside the parentheses so that the shell executes it first.
grep "$(dig +short <hostname>)" file
I use this to look through files for an IP address when I am given a host name.
