Combining find -execdir with grep - linux

I want to quickly go through a bunch of files in various subfolders within a root directory to get the video and audio format types.
I can get the information I need from a single file using:
ffprobe file.mp4 2>&1 >/dev/null | grep "Stream"
I can run ffprobe for each file inside the root folder using
find . -name "*.mp4" -execdir ffprobe "{}" \;
What I am struggling with is to use grep with the second command to filter the output I want (as per the first command) and pipe the entire output into a file.
What are the missing links here?

Something like this should work:
find . -name "*.mp4" -execdir ffprobe "{}" \; 2>&1 | grep "Stream" > some_file.txt
The principle is the same really, it can be generalized as
some_command 2>&1 | grep "Stream"
where some_command in this case is
find . -name "*.mp4" -execdir ffprobe "{}" \;
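If you also want to know which file each Stream line came from, the loop below is one way to sketch it. To keep the example runnable anywhere, it creates a tiny stand-in for ffprobe first (the real tool writes its report to stderr, which the stub mimics); the paths and file names are throwaway placeholders:

```shell
# Sketch: prefix each "Stream" line with the file it came from.
set -eu
tmp=$(mktemp -d)
mkdir -p "$tmp/bin" "$tmp/videos/sub"

# Stand-in for ffprobe, so the sketch runs without the real tool:
cat > "$tmp/bin/ffprobe" <<'EOF'
#!/bin/sh
echo "  Stream #0:0: Video: h264" >&2
echo "  Stream #0:1: Audio: aac" >&2
EOF
chmod +x "$tmp/bin/ffprobe"
PATH="$tmp/bin:$PATH"

touch "$tmp/videos/a.mp4" "$tmp/videos/sub/b.mp4"

# -print0 plus read -d '' keeps file names with spaces intact;
# sed labels each matching line with its file name:
find "$tmp/videos" -name '*.mp4' -print0 |
while IFS= read -r -d '' f; do
    ffprobe "$f" 2>&1 | grep "Stream" | sed "s|^|$f: |"
done > "$tmp/report.txt"
cat "$tmp/report.txt"
```

With real video files you would drop the stub and point find at your root directory.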

Related

Linux search file with given name containing string recursively

From Linux shell, Let's say I'm in directory /dir and I want to find, recursively in all subfolders, all the files which contain in the name the string name_string and, inside, the string content_string. name_string might be at the beginning, center or end of the file name. How could I do that?
I was trying to use grep as:
grep -r content_string /dir/*name_string*
But I haven't been lucky so far.
Thanks!
The find command's -exec grep can solve your question, as in this example:
find /dir -name "*name_string*" -exec grep "content_string" {} /dev/null \;
This, however, will not only show you the name of the file, but also the line containing the content_string. In case you just want the name of the file:
find /dir -name "*name_string*" -exec grep -l "content_string" {} \;
Obviously, you can use -exec with other commands (head, tail, chmod, ...)
You could also use find with xargs
find /dir -name "*name_string*" -print0 | xargs -0 grep "content_string"
With find's -print0 and xargs' -0, file names are passed NUL-terminated (so even names containing spaces survive), and grep is executed only once, with all the found files as its parameters:
grep content_string file1 file2 file3 filen
# it will be much faster, because there is no fork/exec overhead per file, unlike:
grep content_string file1
grep content_string file2
grep content_string file3
..
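A self-contained sketch of the batched form, using throwaway files created for the demo (note the file name with spaces, which -print0/-0 handles safely):

```shell
set -eu
d=$(mktemp -d)
printf 'hello content_string\n' > "$d/a name_string b.txt"
printf 'no match here\n'        > "$d/name_string other.txt"

# grep runs once over the whole batch; with more than one file it
# prefixes each match with the file name it was found in:
find "$d" -name "*name_string*" -print0 | xargs -0 grep "content_string"
```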

Bash - Find directory containing specific logs file

I've created a script to quickly analyze some logs and automatically provide advice for solving problems based on the errors found.
All works as expected.
However, it appears that the folder structure containing these logs can change (it depends on the system configuration), and then my script no longer works.
I would like a way to find the directory containing specific files, such as the logs or the appinfo.txt file.
Once obtained, I could use it as a variable and finally solve my problem.
Here is an example:
AppLogDirectory='Your_Special_Command_You_Will_HelpMe_To_Find'
grep -i "Error" $AppLogDirectory/esl*.log
Log format is: ESL.randomValue.log
Files analyzed: appinfo.txt, system.txt, etc.
As suggested in the comment section, I have edited my original post with more detail to clarify the context; below is an example:
Log files (esl.xxx.tt.ss.log ) can be in random directory, like:
/var/log/ApplicationName/logs/
/opt/ApplicationName/logs/
/var/data/ApplicationName/Extended/logs/
Because of the random directory, I need a way to print the directory names of the files that match the esl*.log pattern (without the esl file name).
Use find and pass the output to xargs with grep, like so, which runs grep on multiple files and prints the output together with the file name where the pattern was found:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -print0 | xargs -0 grep -i 'Error'
Or simply use -exec ... \+, which gives the same effect, without the need for xargs:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -exec grep -i 'Error' {} \+
To find the directories which contain the files that contain the desired pattern, use grep -l to print file names only (not the lines that match), and pipe the results to xargs dirname to print the directory names. If you need the unique dir names, pipe it further to sort -u:
find /path/to/files /another/path/to/other/files \( -name 'appinfo.txt' -o -name 'system.txt' -o -name 'esl*.log' \) -exec grep -il 'Error' {} \+ | xargs dirname | sort -u
SEE ALSO:
GNU find manual
To search for files based on their contents
xargs
Solution found thanks to you thank you again!
#Ask for the extracted tar.gz folder
read -p "Where did you extract the tar.gz file? " r1
#Directory path where the esl files are located
logpath=`find "$r1" -name "esl*.log" | xargs dirname | sort -u`
#Search for the value (here "Error") in all esl*.log files
grep 'Error' $logpath/esl*.log | awk '{print $8}'
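The directory-finding step can be checked in isolation. A self-contained sketch, with throwaway paths standing in for the real application directories:

```shell
set -eu
top=$(mktemp -d)
mkdir -p "$top/var/log/App/logs" "$top/opt/App/logs"
echo "2024-01-01 Error: disk full" > "$top/var/log/App/logs/esl.1.log"
echo "all good"                    > "$top/opt/App/logs/esl.2.log"

# Keep only the directories whose esl*.log files actually contain "Error":
logpath=$(find "$top" -name 'esl*.log' -exec grep -il 'Error' {} + |
          xargs dirname | sort -u)
echo "$logpath"
```

Only the first directory is printed, since its log is the only one matching the pattern.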

Moving files with a specific modification date; "find | xargs ls | grep | -exec" fails w/ "-exec: command not found"

I am using CentOS 7.
If I want to find files that have a specific name and a specific modification date, and then move those files to another folder, I issue the command
find -name 'fsimage*' | xargs ls -ali | grep 'Oct 20' | -exec mv {} /hdd/fordelete/ \;
with the following error
-bash: -exec: command not found
xargs: ls: terminated by signal 13
As another answer already explains, -exec is an action for find; you can't use it as a shell command. Conversely, xargs and grep are commands, and you can't use them as find actions, just like you can't use a pipe | inside find.
But more importantly, even though you could use ls and grep on find's results just to move files older than some amount of time, you shouldn't. Such a pipeline is fragile and fails in many corner cases, like symlinks or files with newlines in their names.
Instead, use find. You'll find it quite powerful.
For example, to mv files modified more than 7 days ago, use the -mtime test:
find -name 'fsimage*' -mtime +7 -exec mv '{}' /some/dir/ \;
To mv files modified on a specific/reference date, e.g. 2017-10-20, you can use the -newerXY test:
find -name 'fsimage*' -newermt 2017-10-20 ! -newermt 2017-10-21 -exec mv '{}' /some/dir/ \;
Also, if your mv supports the -t option (target dir first, multiple files after), you can use the {} + placeholder in find to pass multiple files at once, reducing the total number of mv invocations (thanks @CharlesDuffy):
find -name 'fsimage*' -mtime +7 -exec mv -t /some/dir/ '{}' +
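A self-contained check of the -newermt date window, assuming GNU touch (for -d) and GNU mv (for -t); the directories are temporary ones created for the demo:

```shell
set -eu
d=$(mktemp -d)
mkdir "$d/src" "$d/dest"
touch -d '2017-10-20 12:00' "$d/src/fsimage_old"   # on the target date
touch "$d/src/fsimage_new"                         # modified now

# Move only the files last modified on 2017-10-20:
find "$d/src" -name 'fsimage*' -newermt 2017-10-20 ! -newermt 2017-10-21 \
    -exec mv -t "$d/dest" '{}' +
ls "$d/dest"
```

Only fsimage_old lands in dest; the file modified today fails the `! -newermt 2017-10-21` test and stays put.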
the -exec as you wrote it is quite meaningless; moreover, it seems you are mixing find syntax with shell syntax (-exec, as you wrote it, should be passed to find)
there are probably more concise ways of doing, but this should do what you expect:
find -name 'fsimage*' -type f | xargs ls -ali | grep 'Oct 20' | awk '{ print $NF }' | while read file; do mv "$file" /hdd/fordelete/ ; done
nevertheless, you should take care not to just copy/paste things from the web that you do not really understand; you may wreck your system...

Bash: How to tail then copy multiple files (eg using xargs)?

I've been trying various combinations of xargs and piping but I just can't get the right result. Previous questions don't quite cover exactly what I want to do:
I have a source directory somewhere, lets say /foo/source, with a mix of different files
I want to copy just the csv files found in source to a different destination, say /foo/dest
But I ALSO at the same time need to remove 232 header rows (eg using tail)
I've figured out that I need to pipe the results of find into xargs, which can then run commands on each find result. But I'm struggling to tail then copy. If I pipe tail into cp, cp does not seem to receive the file (missing file operand). Here's some examples of what I've tried so far:
find /foo/source -name "*.csv" | xargs -I '{}' sh -c 'tail -n +232 | cp -t /foo/dest'
cp: missing file operand
find /foo/source -name "*.csv" | xargs -I '{}' sh -c 'tail -n +232 {} | cp -t /foo/dest'
Result:
cp: failed to access '/foo/dest': No such file or directory ...
find /foo/source -name "*.csv" | xargs -I '{}' sh -c 'tail -n +232 {} > /foo/dest/{}'
sh: /foo/dest/foo/source/0001.csv: No such file or directory ...
Any pointers would be really appreciated!
Thanks
Just use find with exec and copy the file name in a variable:
find your_dir -name "*.csv" -exec sh -c 'f="$1"; tail -n +5 "$f" > dest_dir/$(basename "$f")' -- {} \;
Here f="$1" makes $f hold the name of the file, with its full path (find passes {} as the first argument after --). Then it is a matter of redirecting the output of tail into the file, stripping the path from it with basename.
Or, based on Random832's suggestion below in comments (thanks!):
find your_dir -name "*.csv" -exec sh -c 'tail -n +5 "$1" > dest_dir/$(basename "$1")' -- {} \;
Your last command is close, but the problem is that {} is replaced with the full pathname, not just the filename. Use the basename command to extract the filename from it.
find /foo/source -name "*.csv" | xargs -I '{}' sh -c 'tail -n +232 {} > /foo/dest/$(basename {})'
As an alternative to find and xargs you could use a for loop, and as an alternative to tail you could use sed, consider this:
source=/foo/source
dest=/foo/dest
for csv in "$source"/*.csv; do sed '232,$ !d' "$csv" > "$dest/$(basename "$csv")"; done
Using GNU Parallel you would do:
find /foo/source -name "*.csv" | parallel tail -n +232 {} '>' /foo/dest/{/}
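To convince yourself the -exec sh -c variant behaves, here is a self-contained run with a 2-line header instead of 232, over a throwaway source directory; the destination is passed as a second argument so it needs no quoting tricks inside the sh -c string:

```shell
set -eu
src=$(mktemp -d); dst=$(mktemp -d)
printf 'header1\nheader2\ndata1\ndata2\n' > "$src/a.csv"

# Strip the first 2 header rows while copying (use +233 for 232 rows):
find "$src" -name '*.csv' \
    -exec sh -c 'tail -n +3 "$1" > "$2/$(basename "$1")"' -- {} "$dst" \;
cat "$dst/a.csv"
```

The copy in dst contains only the data rows, with the source file untouched.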

search a string in a file with case insensitive file name

I want to grep for a string in all the files which have a particular pattern in their name, matched case-insensitively.
For eg if I have two files ABC.txt and aBc.txt, then I want something like
grep -i 'test' *ABC*
The above command should look in both the files.
You can use find and then grep on the results of that:
find . -iname "*ABC*" -exec grep -i "test" {} \;
Note that this will run grep once on each file found. If you want to run grep once on all the files (in which case you risk running into the command line length limit), you can use a plus at the end:
find . -iname "*ABC*" -exec grep -i "test" {} \+
You can also use xargs to process a really large number of results more efficiently:
find . -iname "*ABC*" -print0 | xargs -0 grep -i test
The -print0 makes find output 0-terminated results, and the -0 makes xargs able to deal with this format, which means you don't need to worry about any special characters in the filenames. However, it is not totally portable, since it's a GNU extension.
If you don't have a find that supports -print0 (for example SVR4), you can still use -exec as above or just
find . -iname "*ABC*" | xargs grep -i test
But you should be sure your filenames don't have newlines in them, otherwise xargs will treat each line of a filename as a new argument.
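A quick self-contained check of the -iname / grep -i combination, using files created for the demo:

```shell
set -eu
d=$(mktemp -d)
echo "This is a TEST line" > "$d/ABC.txt"
echo "test here too"       > "$d/aBc.txt"
echo "test"                > "$d/other.txt"

# -iname matches the file name case-insensitively, grep -i the contents;
# -l keeps the output to just the matching file names:
find "$d" -iname "*abc*" -exec grep -li "test" {} \+
```

Both ABC.txt and aBc.txt are listed; other.txt never reaches grep because its name doesn't match.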
You can use find to match the file names and grep, which supports regular expressions, to search the contents. For your question, a command like the one below works (\< and \> match test as a whole word in GNU grep):
find . -iname "*ABC*" -exec grep -i '\<test\>' {} \;
