grep recursion - inconsistencies - linux

I'm looking for a particular string (mainly in .c files) recursively from the root directory.
When I use this, I get a list back almost immediately.
grep -rl "F_capture" .
However, if I try to speed things up by just searching .c files:
grep -r --include=*.c "F_capture" .
I end up with a slew of recursive directory warnings like this:
grep: warning: ./sys/block/fd0/device/bus/drivers/i8042/i8042/serio1/input:event1/subsystem/event3/device/bus/drivers/pcnet32/0000:00:03.0/subsystem: recursive directory loop
When I tried suppressing the warnings using the -s parameter, I don't get the warnings but I don't get anything back either - seems like it's going off into never never land.
grep -rsl --include="*.c" "F_capture" .
So I guess my question is: why does the first grep return something immediately, while the variants targeting a specific file type seem to hang? I would think the targeted search would be faster.

You can try this:
find . -type f -name "*.c" -print | xargs grep -l "F_capture"

Try the following command:
grep "F_capture" `find -type f -name "*.c"`

"find" specific contents [linux]

I would like to go through all the files in the current directory (or its sub-directories) and have it echo back the names of files only if they contain certain words.
More detail:
find -type f -name "*hello*" will give me all file names that have "hello" in their names. But instead of that, I want to search through the files' contents, and if a file's content contains "hello", print out the name of that file.
Is there a way to approach this?
You can use GNU find and GNU grep as
find /path -type f -exec grep -Hi 'hello' {} +
This is efficient in that it doesn't invoke a separate grep instance for every file returned by find. It rests on the assumption that find actually returns files for grep to search. If you're unsure whether any files will match, a fool-proof way is to use xargs with the -r flag, which runs the command following xargs only if the pipe delivers any input (pairing find's -print0 with xargs' -0 keeps file names with spaces intact):
find /path -type f -print0 | xargs -r0 grep -Hi 'hello'
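For illustration (the file name here is hypothetical), -H prefixes every match with the file it came from, so the output looks like:
/path/notes.txt:Hello world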

find -exec doesn't recognize argument

I'm trying to count the total lines in the files within a directory. To do this I am trying to use a combination of find and wc. However, when I run find . -exec wc -l {}\;, I receive the error find: missing argument to -exec. I can't see any apparent issues, any ideas?
You simply need a space between {} and \;
find . -exec wc -l {} \;
Note that if there are any sub-directories below the current location, wc will generate an error message for each of them that looks something like this:
wc: ./subdir: Is a directory
To avoid that problem, you may want to tell find to restrict the search to files:
find . -type f -exec wc -l {} \;
Another note: it's a good idea to use the -exec option. Too often people pipe commands together expecting the same result; for instance, here it would be:
find . -type f | xargs wc -l
The problem with piping commands in such a manner is that it breaks if any file name has spaces in it. For instance, if a file were named "a b", wc would receive "a" and then "b" separately, and you would get two error messages along the lines of a: No such file and b: No such file.
Unless you know for a fact that your file names never contain spaces (or other non-printable characters), when you do need to pipe commands together, tell every tool in the pipeline to use the NUL character (\0) as a separator instead of whitespace. So the previous command would become:
find . -type f -print0 | xargs -0 wc -l
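A quick demonstration (hypothetical file name) of why the NUL separator matters:
touch './a b'                               # file name contains a space
find . -type f | xargs wc -l                # wc sees ./a and b as two files
find . -type f -print0 | xargs -0 wc -l     # './a b' arrives as one argument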
With version 4.0 or later of bash, you don't need your find command at all:
shopt -s globstar
wc -l **/*
There's no simple way to skip directories, which, as Gui Rava pointed out, you might want to do, unless you can differentiate files from directories by name alone. For example, maybe directories never have . in their name, while all the files have at least one extension:
wc -l **/*.*
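If extensionless files matter, a small loop is one workaround (a sketch, at the cost of one wc invocation per file) that tests each match and skips directories:
shopt -s globstar
for f in **/*; do
    [ -f "$f" ] && wc -l "$f"    # only count regular files
done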

Display multiple files in Linux/Unix

I'm looking to display 3 different files, if they exist. I thought the following would work, but it doesn't:
ls -R | grep 6-atom2D.vector$ 6-atom2D.klist 6-atom2D.struct
How can I do it?
Knowing the (base) filenames, you can use find:
find . -name '6-atom2D.vector' -o -name '6-atom2D.klist' -o -name '6-atom2D.struct'
It searches recursively by default.
For case-insensitive search, use -iname instead.
ls -R | egrep "6-atom2D\.vector$|6-atom2D\.klist|6-atom2D\.struct"
If $ is supposed to be an end-of-line regexp anchor, then you might need to use \> instead; that works for me at least.
Edit: added a backslash before each . so it matches a literal dot.
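If the goal is to display the files' contents rather than just find them, the same find expression can hand the matches to cat (grouping with \( \) so -exec applies to all three names; this is an extension of the find answer above, not part of it):
find . \( -name '6-atom2D.vector' -o -name '6-atom2D.klist' -o -name '6-atom2D.struct' \) -exec cat {} +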

Piping find results into grep for fast directory exclusion

I am successfully using find to create a list of all files in the current directory and its subdirectories, excluding those in the subdirectory "cache". Here's my first bit of code:
find . -wholename './cach*' -prune -o -print
I now wish to pipe this into a grep command. It seems like that should be simple:
find . -wholename './cach*' -prune -o -print | xargs grep -r -R -i "samson"
... but this is returning results that are mostly from the cache directory. I've tried removing the xargs reference, but that does what you'd expect: it runs grep on the text of the file names rather than on the files themselves. My goal is to find "samson" in any files that aren't cached content.
I'll probably get around this issue by just using doubled greps in this instance, but I'm very curious about why this one-liner behaves this way. I'd love to hear thoughts on a way to modify it while still using these two commands (as there are speed advantages to doing it this way).
(This is in CentOS 5, btw.)
The wholename match may be the reason why it's still including "cache" files. If you're executing the find command in the directory that contains the "cache" folder, it should work. If not, try changing it to -name '*cache*' instead.
Also, you do not need the -r or -R for your grep, that tells it to recurse through directories - but you're testing individual files.
You can update your command using the piped version, or a single-command:
find . -name '*cache*' -prune -o -print0 | xargs -0 grep -il "samson"
or
find . -name '*cache*' -prune -o -exec grep -iq "samson" {} \; -print
Note, the -l in the first command tells grep to list only the matching file, not the line(s) that match. The -q in the second achieves the same effect: it tells grep to respond quietly (reporting only via its exit status), so find will then just print the filename.
You've told grep itself to recurse (twice! -r and -R are synonyms). Since one of the arguments you're passing is . (the top directory), grep is searching in every file (some of them twice, or even more if they're in subdirectories).
If you're going to use find and grep, do this:
find . -path './cach*' -prune -o -print0 | xargs -0 grep -i "samson"
Using -print0 and -0 makes your script work even with file names that contain spaces or punctuation characters.
However, you probably don't need to bother with find here, since GNU grep is capable of excluding directories:
grep -R --exclude-dir='cach*' -i "samson" .
(This also excludes ./deeply/nested/directory/cache. If you only want to exclude cache directories at the toplevel, use find as you did.)
Use the -exec option of find instead of piping the results to another command. From there you can use grep "samson" {} \; to look for samson in each file found.
For example:
find . -wholename './cach*' -prune -o -exec grep "samson" "{}" +
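A small refinement (an assumption, not part of the original answer): adding -type f keeps grep off directories, and -l lists only the matching file names:
find . -wholename './cach*' -prune -o -type f -exec grep -l "samson" {} +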

What's the best way to find a string/regex match in files recursively? (UNIX)

I have had to do this several times, usually when trying to find in what files a variable or a function is used.
I remember using xargs with grep in the past to do this, but I am wondering if there are any easier ways.
grep -r REGEX .
Replace . with whatever directory you want to search from.
The portable method* of doing this is
find . -type f -print0 | xargs -0 grep pattern
-print0 tells find to use ASCII NUL characters as the separator, and -0 tells xargs the same thing. If you don't use them, you will get errors on files and directories that contain spaces in their names.
* as opposed to grep -r, grep -R, or grep --recursive which only work on some machines.
This is one of the cases for which I've started using ack (http://petdance.com/ack/) in lieu of grep. From the site, you can get instructions to install it as a Perl CPAN module, or you can get a self-contained version that can be installed without dealing with dependencies.
Besides the fact that it defaults to recursive searching, it allows you to use Perl-strength regular expressions, use regexes to choose which files to search, etc. It has an impressive list of options. I recommend visiting the site and checking it out. I've found it extremely easy to use, and there are tips for integrating it with vi(m), emacs, and even TextMate if you use that.
If you're looking for a fixed-string match, use
fgrep -r pattern .
which can be faster than grep, since the pattern is not treated as a regular expression.
More about the subject here: http://www.mkssoftware.com/docs/man1/grep.1.asp
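On systems where fgrep is deprecated, grep -F is the equivalent spelling, so the same recursive fixed-string search can also be written as:
grep -rF pattern .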
grep -r if you're using GNU grep, which comes with most Linux distros.
On most other UNIXes it's not installed by default, so try this instead:
find . -type f | xargs grep regex
If you use the zsh shell you can use
grep REGEX **/*
or
grep REGEX **/*.java
This can run out of steam if there are too many matching files (the argument list becomes too long).
The canonical way though is to use find with exec.
find . -name '*.java' -exec grep REGEX {} \;
or
find . -type f -exec grep REGEX {} \;
The -type f bit restricts the match to regular files, skipping directories and other special files.
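One caveat (not mentioned above): when find hands grep a single file per invocation, grep prints only the matching lines, without the file name. Adding -H, or batching arguments with + instead of \;, brings the file names back:
find . -name '*.java' -exec grep -H REGEX {} +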
I suggest changing the answer to:
grep REGEX -r .
The -r switch doesn't indicate regular expression. It tells grep to recurse into the directory provided.
This is a great way to find the exact expression recursively with one or more file types:
find . \( -name '*.java' -o -name '*.xml' \) | xargs egrep REGEX
Where
-name '*.<filetype>' -o
is repeated inside the parentheses \( \) for each additional file type you want to add to the recursive search.
As an alias it looks like this in bash (the '\'' sequences embed single quotes inside the single-quoted alias body):
alias fnd='find . \( -name '\''*.java'\'' -o -name '\''*.xml'\'' \) | xargs egrep'
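Usage then looks like this (the pattern is whatever you would normally hand to egrep):
fnd REGEX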
