Linux find command: get all text in the matching file and print the file path

I need to get all the text in the matching files in the folder, but at the same time I need to get the matching file's path. How can I get the matching file path as well using the following command?
find . -type f -name release.txt | xargs cat

try
find . -type f -name release.txt -exec grep -H '' {} \;
grep -H with an empty pattern prints every line of each file prefixed with its file path.

Skip xargs, just do:
find . -type f -name release.txt -exec sh -c 'echo "$1"; cat "$1"' _ {} \;
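If there are many matching files, you could batch them into a single shell invocation; a sketch of the same idea (the "==> path <==" header format is just illustrative, not from the original answer):
find . -type f -name release.txt -exec sh -c 'for f in "$@"; do printf "==> %s <==\n" "$f"; cat "$f"; done' _ {} +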

Related

Find command: how to process found files with gunzip and then grep for a pattern

Example:
find . -name 'audit_log*.gz' -print -exec gunzip -c {} \| grep IP \;
I need to add something to this to get:
- the file name
- the list of IPs from the audit_log*.gz files
You've used the semicolon incorrectly. It should come at the end of the find command, before the pipe. Try this one-liner:
find . -type f -iname 'audit_log*.gz' -exec gunzip -c {} \; | grep IP
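If zgrep is available (it ships with gzip on most systems) and passes -H through to grep, a sketch that also prints the archive's name next to each matching IP line:
find . -type f -iname 'audit_log*.gz' -exec zgrep -H IP {} \;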

I want to get the output of the find command in a shell script

I am trying to write a script that finds the files that are older than 10 hours in the sub-directories listed in "HS_client_list", and sends the output to a file, "find.log".
#!/bin/bash
while IFS= read -r line; do
echo Executing cd /moveit/$line
cd /moveit/$line
# Find files more than 600 minutes (10 hours) old.
find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log
done < HS_client_list
However, while the script is able to cd to the folders from HS_client_list (this file contains the names of the subdirectories), the find command (find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log) is not working: the output file is empty. When I run find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log directly as a command it works, but from the script it doesn't.
You are overwriting the file in each iteration.
You can use xargs to run find on multiple directories; but you have to use an alternate replacement string so that xargs does not substitute into the {} of the -execdir command.
sed 's%^%/moveit/%' HS_client_list |
xargs -I '<>' find '<>' -type f -iname "*.enc" -mmin +600 -execdir basename {} \; > /home/infa91punv/find.log
The xargs ls did not seem to perform any useful functionality, so I took it out. Generally, don't use ls in scripts.
With GNU find, you could avoid the call to an external utility, and use the -printf predicate to print just the part of the path name that you care about.
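For example, a sketch of the same pipeline using -printf '%f\n' (GNU find prints only the file name for %f) in place of the -execdir basename call:
sed 's%^%/moveit/%' HS_client_list |
xargs -I '<>' find '<>' -type f -iname "*.enc" -mmin +600 -printf '%f\n' > /home/infa91punv/find.log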
For added efficiency, you could invoke a shell to collect the arguments:
sed 's%^%/moveit/%' HS_client_list |
xargs sh -c 'find "$@" -type f -iname "*.enc" -mmin +600 -execdir basename {} \;' _ >/home/infa91punv/find.log
This will pass as many directories as possible to a single find invocation.
If you want to keep your loop, the solution is to put the redirection after done. I would still factor out the cd, and take care to quote the variable interpolation.
while IFS= read -r line; do
find /moveit/"$line" -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';'
done < HS_client_list >/home/infa91punv/find.log

Pass a large variable into the diff command via bash

I am writing a bash script which does a checksum (md5sum) on a forum web directory.
The idea is to checksum all the files in the directory and then compare the result to a text file which has a list of known checksums.
The script works if I write the output to a text file and then run a diff between that file and my list of known checksums, but I would rather not write to a text file and then have to remove it at the end of the script, which is why I am using a variable.
The script below fails with the error:
/usr/bin/diff: Argument list too long
cd /var/www/html/forum/
VAR1=$(find . -type d \( -name store_sitemap \) -prune -o -type f -exec md5sum {} \; | grep -v "files\|that\|change")
/usr/bin/diff "${VAR1}" "/root/scripts/forum_checkum_original.txt"
How can I pass my variable along so that I can run the diff command on it?
EDIT: With the help of user devnull (thank you again), here is the completed and working script:
cd /var/www/html/forum/
MAIL=$(/usr/bin/diff <(find . -type d \( -name store_sitemap \) -prune -o -type f -exec md5sum {} \; | grep -v "files\|that\|change") /root/scripts/forum_checkum_original.txt)
if [[ -n $MAIL ]]; then
echo "$MAIL" | mail -s "Forum Checksum" yourmailaddress@yourdomain.com
else
echo "no files have been changed"
fi
diff compares files, not variables. Use Process Substitution instead.
An equivalent of what you're trying to do would be:
/usr/bin/diff <(find . -type d \( -name store_sitemap \) -prune -o -type f -exec md5sum {} \; | grep -v "bidorbuy.log") /root/scripts/forum_checkum_original.txt
If you want to keep it in a variable, you can give diff the variable through a file descriptor by doing:
diff <(echo "$MAIL") "/root/scripts/forum_checkum_original.txt"
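In bash you could also feed the variable to diff's standard input with a here-string; a sketch, relying on diff accepting - to mean stdin (GNU diff does):
diff - "/root/scripts/forum_checkum_original.txt" <<< "$MAIL"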

In Unix, cmd to search for a file recursively and retrieve the file instead of just the path of the file

In Unix, what is the single command that lets me search for a file recursively and then retrieve the file itself instead of just the path of the file?
What do you mean by retrieve?
You can simply use the -exec argument to find.
$ find /path/to/search -type f -name '*.txt' -exec cat {} \;
$ find /path/to/search -type f -name 'pattern' -exec cp {} /path/to/new \;
The second one should work.
cat `find /wherever/you/want/to/start/from -name name_of_file`
Note those quotes are backquotes (`).
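The same idea with the newer $( ) form of command substitution (a sketch; note that either form breaks on file names containing whitespace, in which case the -exec variants above are safer):
cat $(find /wherever/you/want/to/start/from -name name_of_file)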

how to find files containing a string using egrep

I would like to find the files containing a specific string under Linux.
I tried something like the following but could not succeed:
find . -name *.txt | egrep mystring
Here you are sending the file names (output of the find command) as input to egrep; you actually want to run egrep on the contents of the files.
Here are a couple of alternatives:
find . -name "*.txt" -exec egrep mystring {} \;
or even better
find . -name "*.txt" -print0 | xargs -0 egrep mystring
Check the find command's help to see what the individual arguments do.
The first approach will spawn a new process for every file, while the second will pass more than one file as an argument to egrep; the -print0 and -0 flags are needed to deal with potentially nasty file names (they allow file names to be separated correctly even when a name contains a space, for example).
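To see why this matters, imagine a hypothetical file named 'my notes.txt' (the name is only an example):
find . -name "*.txt" | xargs egrep mystring            # 'my notes.txt' gets split into two arguments
find . -name "*.txt" -print0 | xargs -0 egrep mystring # 'my notes.txt' is kept as a single argument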
try:
find . -name '*.txt' | xargs egrep mystring
There are two problems with your version:
Firstly, *.txt will first be expanded by the shell, giving you a listing of files in the current directory which end in .txt, so for instance, if you have the following:
[dsm@localhost:~]$ ls *.txt
test.txt
[dsm@localhost:~]$
your find command will turn into find . -name test.txt. Just try the following to illustrate:
[dsm@localhost:~]$ echo find . -name *.txt
find . -name test.txt
[dsm@localhost:~]$
Secondly, egrep does not take filenames from STDIN. To convert them to arguments, you need to use xargs.
find . -name *.txt | egrep mystring
That will not work, as egrep will be searching for mystring within the output generated by find . -name *.txt, which is just the paths to the *.txt files.
Instead, you can use xargs:
find . -name *.txt | xargs egrep mystring
You could use
find . -iname '*.txt' -exec egrep mystring \{\} \;
Here's an example that will return the file paths of all *.log files that have a line that begins with ERROR:
find . -name "*.log" -exec egrep -l '^ERROR' {} \;
there's a recursive option to egrep you can use:
egrep -R "pattern" *.log
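Note that the *.log glob only matches files in the current directory. With GNU grep (where egrep is grep -E), a sketch that keeps the search recursive but restricts it to *.log files everywhere below the current directory:
egrep -R --include='*.log' "pattern" .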
If you only want the filenames:
find . -type f -name '*.txt' -exec egrep -l pattern {} \;
If you want filenames and matches:
find . -type f -name '*.txt' -exec egrep pattern {} /dev/null \;
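With a grep that supports -H (GNU grep does), a sketch of the same output without the /dev/null trick:
find . -type f -name '*.txt' -exec egrep -H pattern {} \;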
