Linux find files and grep, then list by date

I'm trying to find files named formClass.php that contain the string checkCookie, and I want to list the files by date, showing the date, size, owner and group. The files are in my home directory.
I have this working, but it doesn't show the date, owner, etc.:
find /home -name formClass.php -exec grep -l "checkCookie" {} \;
I was thinking that I could add "trhg" to the list like this, but it didn't work:
find /home -name formClass.php -exec grep -ltrhg "checkCookie" {} \;
Thanks

Try find /home -name formClass.php -print0 | xargs -0 grep -l --null checkCookie | xargs -0 ls -l
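The -print0 / --null / -0 options keep the pipeline safe for file names containing spaces or newlines. Since you want the listing sorted by date, a minimal variation is to add -t so ls sorts by modification time (newest first; add -r for oldest first):
find /home -name formClass.php -print0 | xargs -0 grep -l --null checkCookie | xargs -0 ls -lt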

ls -lt `grep -lr --include=formClass.php "checkCookie" *`
Take a look at man ls for other sorting options.

It seems there are a few ways to do this. I found this and it worked for me.
find /home -name formClass.php -exec grep -l "checkCookie" {} \; | xargs ls -ltrhg
Thank you for the other suggestions.
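For completeness: GNU find can print the date, size, owner and group itself via -printf, so ls is not needed at all. A sketch, assuming GNU findutils (the %T, %s, %u, %g and %p directives are GNU extensions):
find /home -name formClass.php -exec grep -q checkCookie {} \; -printf '%TY-%Tm-%Td %TH:%TM %s %u %g %p\n' | sort
Because the ISO-style timestamp comes first, piping to sort orders the output by date.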

Related

Find all files matching a pattern, with total size

In order to find all log files matching a pattern in all subdirectories, I used the command:
du -csh *log.2017*
But this command does not search in subdirectories. Is there any way to get the total size of all files matching a pattern across all subdirectories?
This will do the trick:
find . -name "*log.2017*" | xargs du -csh
find . -name "*log.2017*" -type f -exec stat -c "%s" {} \; | paste -sd+ | bc
You can use the find command:
find /path -type f -name "*log.2017*" -exec stat -c "%s" {} \; | paste -sd+ | bc
It will do the search recursively.
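A hedged alternative, assuming GNU coreutils: hand all matches to a single du invocation with --files0-from so it prints one grand total, even for file names with spaces:
find . -type f -name "*log.2017*" -print0 | du -ch --files0-from=- | tail -n 1
Here -c adds a total line, -h prints it human-readable, and tail -n 1 keeps only that total.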

Using find and grep in a specific file in a specific directory

I have the following directory structure:
./A1
./A2
./A3
./B
./C
In each one of the A* directories I have:
./A*/logs
./A*/test
in the logs directory I have:
./log-jan-1
./log-jan-2
./log-feb-1
How do I grep for a string in all January logs in the A directories?
I tried this, but it did not find the string although it is present in the log files:
find . -type d -name 'A*' print | xargs -n1 -I PATH grep string - PATH/logs/log-jan*
What am I doing wrong?
Why don't you simply use
grep string ./A*/logs/log-jan*
?
If it is not a typo, you should use -print (or -print0) instead of print.
But as find + xargs + grep constructs are hard to debug, you should test in sequence:
find . -type d -name 'A*' -print
find . -type d -name 'A*' -print0 | xargs -0 -n1 -I PATH echo grep string - PATH
and finally:
find . -type d -name 'A*' -print0 | xargs -0 -n1 -I PATH grep string - PATH/logs/log-jan*
In your use case, -print and -print0 should give the same results, but having been burnt by -print before, I always use -print0 with xargs.
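One more caveat: even with -print0, the trailing log-jan* is handed to grep literally, because xargs does not run a shell that could expand the glob, and the stray - makes grep also try to read standard input, which is not wanted here. A working variant (a sketch, fine as long as the directory names contain no shell metacharacters) wraps the grep in sh -c so the glob is expanded:
find . -type d -name 'A*' -print0 | xargs -0 -I PATH sh -c 'grep string PATH/logs/log-jan*'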

Print the directory where the 'find' linux command finds a match

I have a bunch of directories; some of them contain a '.todo' file.
/storage/BCC9F9D00663A8043F8D73369E920632/.todo
/storage/BAE9BBF30CCEF5210534E875FC80D37E/.todo
/storage/CBB46FF977EE166815A042F3DEEFB865/.todo
/storage/8ABCBF3194F5D7E97E83C4FD042AB8E7/.todo
/storage/9DB9411F403BD282B097CBF06A9687F5/.todo
/storage/99A9BA69543CD48BA4BD59594169BBAC/.todo
/storage/0B6FB65D4E46CBD8A9B1E704CFACC42E/.todo
I'd like the 'find' command to print only the directory, like this:
/storage/BCC9F9D00663A8043F8D73369E920632
/storage/BAE9BBF30CCEF5210534E875FC80D37E
/storage/CBB46FF977EE166815A042F3DEEFB865
...
Here's what I have so far, but it lists the '.todo' file as well:
#!/bin/bash
STORAGEFOLDER='/storage'
find $STORAGEFOLDER -name .todo -exec ls -l {} \;
It should be dead simple, but I'm giving up :(
To print the directory name only, use -printf '%h\n'. It's also recommended to quote your variable with double quotes.
find "$STORAGEFOLDER" -name .todo -printf '%h\n'
If you want to process the output:
find "$STORAGEFOLDER" -name .todo -printf '%h\n' | xargs ls -l
Or use a loop with process substitution to make use of a variable:
while read -r DIR; do
ls -l "$DIR"
done < <(exec find "$STORAGEFOLDER" -name .todo -printf '%h\n')
The loop processes one directory at a time, whereas with xargs the directories are passed to ls -l in one shot.
To make sure each directory is processed only once, add uniq:
find "$STORAGEFOLDER" -name .todo -printf '%h\n' | uniq | xargs ls -l
Or
while read -r DIR; do
ls -l "$DIR"
done < <(exec find "$STORAGEFOLDER" -name .todo -printf '%h\n' | uniq)
If you don't have bash, and you don't mind that changes to variables made inside the loop won't be preserved outside it, you can just use a pipe:
find "$STORAGEFOLDER" -name .todo -printf '%h\n' | uniq | while read -r DIR; do
ls -l "$DIR"
done
The quick and easy answer for stripping off a file name and showing only the directory it’s in is dirname:
#!/bin/bash
STORAGEFOLDER='/storage'
find "$STORAGEFOLDER" -name .todo -exec dirname {} \;

Shell command for counting words in files

I want to run a command that will count the number of words in all files (from a selected set of files).
If I do find ABG-Development/ -name "*.php" | grep "<?" | wc -l, it searches only the file names, not the file contents.
And I tried one more way, find ABG-Development/ -name "*.php" -exec grep "<?" {} \; | wc -l, but I got an error.
In the above example I need to know how many times "<?" occurs.
Please help..
Use xargs:
find ABG-Development/ -name "*.php" -print0 | xargs -0 grep "<?" | wc -l
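Note that wc -l here counts matching lines, not occurrences, so a line containing two <? tags is counted once. To count every occurrence, grep -o (supported by GNU and BSD grep, not strict POSIX) prints each match on its own line first:
find ABG-Development/ -name "*.php" -print0 | xargs -0 grep -o "<?" | wc -l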

Find all files matching 'name' on a linux system, and search them for 'text'

I need to find all instances of 'filename.ext' on a linux system and see which ones contain the text 'lookingfor'.
Is there a set of linux command line operations that would work?
find / -type f -name filename.ext -exec grep -l 'lookingfor' {} +
Using a + to terminate the command is more efficient than \; because find sends a whole batch of files to grep instead of sending them one by one. This avoids a fork/exec for each single file which is found.
A while ago I did some testing to compare the performance of xargs vs {} + vs {} \; and I found that {} + was faster. Here are some of my results:
time find . -name "*20090430*" -exec touch {} +
real 0m31.98s
user 0m0.06s
sys 0m0.49s
time find . -name "*20090430*" | xargs touch
real 1m8.81s
user 0m0.13s
sys 0m1.07s
time find . -name "*20090430*" -exec touch {} \;
real 1m42.53s
user 0m0.17s
sys 0m2.42s
Go to the respective directory and type the following command:
find . -name "*.ext" | xargs grep 'lookingfor'
A simpler one would be:
find / -type f -name filename.ext -print0 | xargs -0 grep 'lookingfor'
-print0 for find and -0 for xargs handle file names that contain spaces or other special characters, and the xargs batching also avoids problems with a very large number of files in a single directory.
Try:
find / -type f -name filename.ext -exec grep -H -n 'lookingfor' {} \;
find searches recursively, starting from the root /, for files named filename.ext; for every match it runs grep on the file, searching for lookingfor, and prints the line number (-n) and the file name (-H) for each matching line.
I find the following command the simplest way:
grep -R --include="filename.ext" lookingfor
or add -i to search case-insensitively:
grep -i -R --include="filename.ext" lookingfor
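If you only need to know which files contain the text (as the question asks), add -l to print just the file names, and give a starting directory such as / to search the whole system:
grep -Rl --include="filename.ext" lookingfor /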
