Find specific folders then search specific files inside them for a word - linux

I am trying to combine find and grep so that I can find folders whose names start with k0 and then search a specific file, "test.log", inside them for the word ERROR.
Something like:
find . -type d -name "k0*" -print | xargs grep ERROR test.log
Unfortunately, this command doesn't work as intended.

Try this; I am assuming you have multiple files named test.log inside the folders whose names start with k0:
for file in $(find ./k0* -name 'test.log'); do
grep -w 'ERROR' "$file"
done
You can make this into a one-liner command like this:
for file in $(find ./k0* -name 'test.log'); do grep -w 'ERROR' "$file"; done
It's executable in the terminal if you just paste it.
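If you'd rather avoid the loop, a rough single-command sketch (not from the original answer) is to let find match the whole path, so any test.log below a k0* directory is picked up, and hand the results straight to grep:
find . -path '*/k0*/test.log' -type f -exec grep -wH ERROR {} +
Here -path matches against the full path rather than just the file name, and grep -H forces the name of the matching file to be printed in front of each line.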

Related

Find and show information from logs inside a folder in linux

I'm trying to create a little script in bash on Linux that allows me to find out whether the tag 103=16 appears anywhere inside a log.
I have multiple folders named, for example, l51prdsrv-api1.nebex.local, l51prdsrv-oe1.nebex.local, etc. Inside those folders are .log files like TRADX_gsoe3.log, TRADX_gseuoe2.log, and so on.
I need to find out whether the tag 103=16 is present inside those logs.
I'm trying this command
find . /opt/FIXLOGS/l51prdsrv* -iname "TRADX_" -type f | grep -e 103=16
But all it does is show the log file names, not their content, so I can't see whether the tag 103=16 is there.
First of all, you are not searching for files of the form TRADX_something.log, but only for files that are named exactly TRADX_ (case-insensitively, so TradX_ would also be found).
Then you are feeding grep the names of the files, but never looking into their content. From the grep man page, you can see that the file content can be supplied either via stdin or by specifying the file names on the command line. In your case, the latter is the way to go. Therefore you can either do a
find . /opt/FIXLOGS/l51prdsrv* -iname "TRADX_*.log" -type f -exec grep -F 103=16 {} \;
if you are only interested in the matching lines, or a
find . /opt/FIXLOGS/l51prdsrv* -iname "TRADX_*.log" -type f -exec grep -F 103=16 {} /dev/null \;
if you also want to see the file names where the pattern matches. The reason is that grep prints the file name only if it sees more than one file name on its command line, and /dev/null provides a second, dummy file (with GNU grep, the -H option achieves the same). find replaces the {} with the file name.
BTW, I used -F for grep instead of your -e, because you don't seem to use any regular-expression features anyway.
But you don't need find for this task. An alternative would be an explicit loop:
shopt -s nocaseglob # make globbing case-insensitive
shopt -s globstar # turn on ** globbing
for f in {.,/opt/FIXLOGS/l51prdsrv*}/**/tradx_*.log
do
[[ -f $f ]] && grep -F 103=16 "$f" /dev/null
done
While the loop looks more complicated at first glance, it is easier to extend if you want to do more with the files than just grep their lines, for instance taking specific actions on the files that contain the pattern, as in the sketch below.
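For instance, a rough sketch of such an extension (the echoed message and the commented-out copy are purely hypothetical actions, not part of the original answer):
shopt -s nocaseglob # make globbing case-insensitive
shopt -s globstar   # turn on ** globbing
for f in {.,/opt/FIXLOGS/l51prdsrv*}/**/tradx_*.log
do
    # act only on the logs that actually contain the tag
    if [[ -f $f ]] && grep -qF 103=16 "$f"; then
        echo "tag 103=16 found in $f"
        # cp "$f" /some/archive/dir/   # hypothetical follow-up action
    fi
done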
You are doing:
find . /opt/FIXLOGS/l51prdsrv* -iname "TRADX_" -type f | grep -e 103=16
I propose you do:
find . /opt/FIXLOGS/l51prdsrv* -iname "TRADX_*.log" -type f -exec grep -e "103=16" {} /dev/null \;
What's the difference?
find ... -type f
=> gives you a list of files.
When you add | grep -e 103=16, then you perform that on the filenames.
When you add -exec grep ..., then you perform it on the files themselves.
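If you prefer the pipe style of the original attempt, a rough equivalent is to pass the file names to grep through xargs instead of piping them into grep's standard input (this sketch assumes find and xargs support -print0/-0, as the GNU versions do):
find . /opt/FIXLOGS/l51prdsrv* -iname "TRADX_*.log" -type f -print0 | xargs -0 grep -H "103=16"
The -print0/-0 pair keeps file names with spaces intact, and grep -H prints the file name in front of each matching line.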

Bash - Find directory containing specific log files

I've created a script to quickly analyze some logs and automatically provide advices to solve problems based on errors found.
All works as expected.
However, it appears that the folder structure containing these logs can change (it depends on the system configuration), and then my script no longer works.
I would like a way to find the directory containing specific files, such as the logs or the appinfo.txt file.
Once obtained, I could use it as a variable and finally solve my problem.
Here is an example:
AppLogDirectory='Your_Special_Command_You_Will_HelpMe_To_Find'
grep -i "Error" $AppLogDirectory/esl*.log
Log format is: ESL.randomValue.log
Files analyzed: appinfo.txt, system.txt, etc.
As suggested in the comment section, I edited my original post with more detail to clarify the context; below is an example:
Log files (esl.xxx.tt.ss.log) can be in a random directory, like:
/var/log/ApplicationName/logs/
/opt/ApplicationName/logs/
/var/data/ApplicationName/Extended/logs/
Because of the random directory, I need a way to print the directory names of the files that match the esl*.log pattern (without the esl file name).
Use find and pass the output to xargs with grep, like so, which runs grep on multiple files and prints the output together with the file name where the pattern was found:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -print0 | xargs -0 grep -i 'Error'
Or simply use -exec ... \+, which gives the same effect, without the need for xargs:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -exec grep -i 'Error' {} +
To find the directories which contain the files that contain the desired pattern, use grep -l to print file names only (not the lines that match), and pipe the results to xargs dirname to print the directory names. If you need the unique dir names, pipe it further to sort -u:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -exec grep -il 'Error' {} + | xargs dirname | sort -u
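If the goal is specifically to fill the AppLogDirectory variable from the question, a minimal sketch along the same lines (the /var and /opt starting points are only taken from the question's examples, and it assumes a single matching directory):
AppLogDirectory=$(find /var /opt -type f -name 'esl*.log' -exec grep -il 'Error' {} + | xargs dirname | sort -u | head -n 1)
grep -i 'Error' "$AppLogDirectory"/esl*.log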
SEE ALSO:
GNU find manual
To search for files based on their contents
xargs
Solution found thanks to you, thank you again!
#Ask for extracted tar.gz folder
read -p "Where did you extract the tar.gz file? " r1
#directory path where the esl files are located
logpath=`find "$r1" -name "esl*.log" | xargs dirname | sort -u`
#Search value (here "Error") into all esl*.log
grep 'Error' $logpath/esl*.log | awk '{print $8}'

Copy multiple file from multiple directories with new filename

I want to make a specific kind of copy.
Let me explain.
So here is my main folder:
Sub-Directory-name-01\filename-01.jpg
Sub-Directory-name-01\filename-02.jpg
Sub-Directory-name-01\filename-03.jpg
Sub-Directory-name-01\special-filename-01.jpg
Sub-Directory-name-02\filename2-01.jpg
Sub-Directory-name-02\filename2-02.jpg
Sub-Directory-name-02\filename2-03.jpg
Sub-Directory-name-02\special-filename2-01.jpg
Sub-Directory-name-02\filename2-01.jpg
Sub-Directory-name-02\filename2-02.jpg
Sub-Directory-name-02\filename2-03.jpg
Sub-Directory-name-02\special-filename2-01.jpg
I want to copy all files from all dirs and:
- keep original file
- copy 2 times the original file
- add a prefix to the new name
- prefix-01 for first copy
- prefix-02 for second copy
- keep the new files in the same dir as original file
I already succeeded with a command to copy one time with one prefix.
It works inside a single sub-directory:
for file in *.jpg; do cp "$file" "prefix-$file"; done
I tried to do it for all sub-dirs but I got an error:
find . -type f \( -iname "*.jpg" ! -iname "special-*.jpg" \) | xargs cp -v "$file" "prefix-$file"
(Yes, I exclude a special name.)
But I got this error:
cp: target `./Sub-Directory-name-01/filename-01.jpg' is not a directory
I don't know how to solve my problem, or how to add the 2nd copy in the command.
Thanks
Edit: I haven't found any similar question, nor any answer that solves this problem.
Note that above, $file is set only by the for file in ...; do ...; done loop, i.e. in your xargs command line you were just using the last leftover value from the loop.
Some things to consider:
need to process each file separately => use xargs -l1 (process one line at a time).
need to separate DIR/FILENAME, as the needed command is something like 'cp $DIR/$FILENAME $DIR/prefix-01-$FILENAME' (and likewise for prefix-02); use find ... -printf "%h %f\n" for this
for each line, need to do a couple of things (prefix-01, prefix-02) => use a scriptlet via sh -c '<scriptlet>'
better skip prefix-0?-*.jpg files from find, to be able to re-run it without "accumulating" copies
A possible implementation would be:
find . -type f \( -iname "*.jpg" ! -iname "special-*.jpg" ! -name "prefix-0?-*.jpg" \) -printf "%h %f\n" | \
xargs -l1 sh -c 'cp -v "$1/$2" "$1/prefix-01-$2"; cp -v "$1/$2" "$1/prefix-02-$2"' --
As xargs runs sh -c '<scriptlet>' -- DIR FILE for each line, the scriptlet will properly evaluate $1 and $2 respectively.
--jjo
PS: directory separator in Unix-like systems is / :)
[Update: fixed to use %f instead of %P, as per comments below]
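As an aside, a rough alternative sketch (not part of the original answer) that avoids the xargs line splitting entirely, so paths containing spaces survive, is to let find hand the full paths to a small sh scriptlet:
find . -type f \( -iname "*.jpg" ! -iname "special-*.jpg" ! -name "prefix-0?-*.jpg" \) -exec sh -c '
  for f; do
    dir=${f%/*}     # directory part of the path
    base=${f##*/}   # file name part
    cp -v "$f" "$dir/prefix-01-$base"
    cp -v "$f" "$dir/prefix-02-$base"
  done
' -- {} +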

From directories create files changing their ending

I have several directories with a pattern:
$ find -name "*.out"
./trnascanse.out
./darn.out
./blast_rnaz.out
./erpin.out
./rnaspace_cli.out
./yass.out
./atypicalgc.out
./blast.out
./combine.out
./infernal.out
./ecoli.out
./athaliana.out
./yass_carnac.out
./rnammer.out
I can get the list into a file with find -name "*.out" > files, because I want to create for each directory a file ending with .ref instead of .out: trnascanse.ref, darn.ref, blast_rnaz.ref and so on.
I would say that this is possible with some grep and touch, but I don't know how to do it. Any idea? Or is creating each one manually the only way (as I did with these directories)? Thanks.
Here's one way:
for d in *.out ; do echo touch "${d%.out}.ref" ; done
The ${d%.out} expands $d and removes the trailing .out. Read about it in the bash man page.
If the output of the above one-liner looks OK, pipe it to sh, or remove the echo and re-run it.
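As a quick illustration of that expansion, using one of the names from the question:
d=./blast_rnaz.out
echo "${d%.out}.ref"   # prints ./blast_rnaz.ref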
Use this:
find -maxdepth 1 -type d -name "*.out" -exec bash -c 'touch "${1%.out}.ref"' _ {} \;

How to list specific type of files in recursive directories in shell?

How can we find specific types of files, i.e. doc and pdf files, present in nested directories?
command I tried:
$ ls -R | grep .doc
but if there is a file name like alok.doc.txt, the command will display that too, which is obviously not what I want. What command should I use instead?
If you are more comfortable with ls and grep, you can do what you want using a regular expression in the grep command (the ending '$' character indicates that .doc must be at the end of the line; that will exclude file.doc.txt):
ls -R | grep "\.doc$"
More information about using grep with regular expressions is in the man page.
The ls command output is mainly intended to be read by humans. For advanced querying and automated processing, you should use the more powerful find command:
find /path -type f \( -iname "*.doc" -o -iname "*.pdf" \)
If you have bash 4.0+:
#!/bin/bash
shopt -s globstar
shopt -s nullglob
for file in **/*.{pdf,doc}
do
echo "$file"
done
find . | grep "\.doc$"
This will show the path as well.
Some of the other methods that can be used:
echo *.{pdf,docx,jpeg}
stat -c %n * | grep 'pdf\|docx\|jpeg'
We had a similar question. We wanted a list - with paths - of all the config files in the etc directory. This worked:
find /etc -type f \( -iname "*.conf" \)
It gives a nice list of all the .conf files with their paths. Output looks like:
/etc/conf/server.conf
But, we wanted to DO something with ALL those files, like grep those files to find a word, or setting, in all the files. So we use
find /etc -type f \( -iname "*.conf" \) -print0 | xargs -0 grep -Hi "ServerName"
to find, via grep, ALL the config files in /etc that contain a setting like "ServerName". Output looks like:
/etc/conf/server.conf: ServerName "default-118_11_170_172"
Hope you find it useful.
Sid
Similarly, if you prefer using the wildcard character * (rather than the regex suggestions), you can just use ls with both the -l flag, to list one file per line (like grep does), and the -R flag like you had. Then you can specify the files you want to search for with *.doc.
I.E. Either
ls -l -R *.doc
or if you want it to list the files on fewer lines.
ls -R *.doc
If you have files with extensions that don't match the file type, you could use the file utility.
find "$PWD" -type f -exec file -N {} \; | grep "PDF document" | awk -F: '{print $1}'
Instead of $PWD you can use the directory you want to start the search in. file even prints out the PDF version.
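A similar sketch (assuming your file version supports --mime-type) keys off the MIME type instead of the human-readable description, which is a bit less likely to vary between file versions:
find . -type f -exec sh -c 'file -b --mime-type "$1" | grep -qx "application/pdf" && printf "%s\n" "$1"' _ {} \;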
