Loop over file names from `find`? - linux

If I run this command:
sudo find . -name *.mp3
then I can get a listing of lots of mp3 files.
Now I want to do something with each mp3 file in a loop. For example, I could create a while loop and, inside it, assign the first file name to the variable file and do something with that file. Then I could assign the second file name to the variable file, do something with it, and so on.
How can I achieve this using a Linux shell command? Any help is appreciated, thanks!

For this, use the read builtin:
sudo find . -name '*.mp3' |
while IFS= read -r filename
do
echo "$filename" # ... or any other command using "$filename"
done
Note the quotes around *.mp3 (they stop the shell from expanding the glob before find sees it) and the IFS= and -r flags (they stop read from trimming whitespace and interpreting backslashes). Provided that your filenames don't contain the newline (\n) character, this should work fine.
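If filenames might contain newlines, a hedged variant delimits them with NUL bytes instead; this assumes GNU find's -print0 and bash's read -d '':
sudo find . -name '*.mp3' -print0 |
while IFS= read -r -d '' filename
do
echo "$filename" # entries are NUL-delimited, so any character except NUL is safe
done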

My favourites are
find . -name '*.mp3' -exec cmd {} \;
or
find . -name '*.mp3' -print0 | xargs -0 cmd
While Loop
As others have pointed out, you can frequently use a while read loop to read filenames line by line, but it has the drawback of not allowing newlines in filenames (who uses those?).
xargs vs. -exec cmd {} +
Summarizing the comments saying that -exec...+ is better, I prefer xargs because it is more versatile:
works with other commands than just find
allows 'batching' (grouping) of command lines, say xargs -n 10 (ten at a time; see the sketch after this list)
allows parallelizing, say xargs -P4 (max 4 concurrent processes running at a time)
does privilege separation (as in the OP's case, where he uses sudo find: using -exec would run all commands as the root user, whereas with xargs that isn't necessary):
sudo find -name '*.mp3' -print0 | sudo xargs -0 require_root.sh
sudo find -name '*.mp3' -print0 | xargs -0 nonroot.sh
in general, pipes are just more versatile (logging, sorting, remoting, caching, checking, parallelizing, etc.: you can do all of that)
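For instance, a minimal sketch of batching plus parallelism, with echo standing in for a real command:
find . -name '*.mp3' -print0 | xargs -0 -n 10 -P 4 echo
Each echo receives up to ten filenames, and up to four of them run at once.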

How about using the -exec option to find?
find . -name '*.mp3' -exec mpg123 '{}' \;
That will call the command mpg123 for every file found, i.e. it will play all the files, in the order they are found.
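If the command can take several filenames at once, the + terminator batches them into as few invocations as possible (a sketch, assuming mpg123 accepts a list of files):
find . -name '*.mp3' -exec mpg123 {} +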

for file in $(sudo find . -name '*.mp3')
do
# do something with "$file"
done
Note that this splits find's output on whitespace, so it breaks on filenames containing spaces; prefer the read loop or -exec approaches above.
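A safer bash alternative, assuming bash 4.4+ for mapfile -d, reads the NUL-delimited output into an array first:
mapfile -d '' files < <(sudo find . -name '*.mp3' -print0)
for file in "${files[@]}"
do
echo "$file" # do something with "$file"
done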

Related

Bash find is showing the files but returning "No such file or directory"

I have a bash script I cannot get working. I am a dead-set beginner in bash; this is actually the first script I've ever used. I'm trying to get omxplayer to play a list of files in a directory. When the script runs, I get feedback showing the file, then the error that there is no such file or directory. Please help me?
#!/bin/sh
find /media/pi/88DC-E668/MP3/ -name "*.mp3" -exec PLAY={} \;; omxplayer "$PLAY";
This is the output:
find: `PLAY=/media/pi/88DC-E668/MP3/Dance.mp3': No such file or directory
find: `PLAY=/media/pi/88DC-E668/MP3/Whitemary.mp3': No such file or directory
find: `PLAY=/media/pi/88DC-E668/MP3/Limo.mp3': No such file or directory
find: `PLAY=/media/pi/88DC-E668/MP3/Silo.mp3': No such file or directory
File "" not found.
Easy way:
find /media/pi/88DC-E668/MP3 -name \*.mp3 -exec omxplayer {} \;
or
while IFS= read -r -d '' mp3
do
omxplayer "$mp3"
done < <(find /media/pi/88DC-E668/MP3 -name \*.mp3 -print0)
or
find /media/pi/88DC-E668/MP3 -name \*.mp3 -print0 | xargs -0 -n1 omxplayer
You can omit the -n1 if omxplayer can handle multiple filenames. In that case the first command could be written as:
find /media/pi/88DC-E668/MP3 -name \*.mp3 -exec omxplayer {} +
but the simplest will probably be
shopt -s globstar # needed for ** to match recursively; it is off by default in bash
for mp3 in /media/pi/88DC-E668/MP3/{,**/}*.mp3
do
omxplayer "$mp3"
done
EDIT: I stand corrected, but won't delete the answer, as you can also learn from the mistakes of others. See the comment, and rather use this answer :)
So please don't do it like this, as it is a typical "happy path" solution, meaning it works if you know what you're doing and you know your paths (e.g. that they don't contain spaces). I keep forgetting that many people don't yet know that spaces in paths are evil.
Just use xargs to pass what you found to your player like this:
#!/bin/sh
find /media/pi/88DC-E668/MP3/ -name "*.mp3" | xargs omxplayer
The -exec foo part means run the command foo for each path found.
In your case, -exec PLAY={}, the {} part is replaced with the path name, ending up with something like -exec PLAY=/media/pi/88DC-E668/MP3/Dance.mp3, so find tries to run the command PLAY=/media/pi/88DC-E668/MP3/Dance.mp3, which fails because there isn't actually any such program to execute.
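If you really do want each path in a variable, one hedged way is to hand the paths to a small inline shell via -exec sh -c (the _ is a placeholder for the inline shell's $0):
find /media/pi/88DC-E668/MP3 -name '*.mp3' -exec sh -c 'for f; do omxplayer "$f"; done' _ {} +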
xargs is the usual way to do what you're trying to do, as described in another comment already.
You could also do:
find /media/pi/88DC-E668/MP3/ -name \*.mp3 |
while IFS= read -r f; do
omxplayer "$f"
done

How to cp files with spaces in the filename when the files are provided by find

I would like to ensure that all files found by find with given criteria are properly copied to the required location.
$from = '/some/path/to/the/files'
$ext = 'custom_file_extension'
$dest = '/new/destination/for/the/files/with/given/extension'
cp 'find $from -name "*.$ext"' $dest
The problem here is that when a file found with the proper extension contains a space, cp cannot copy it properly.
You don't do that; you can't splat filenames with spaces that way.
Either use one of the techniques from http://mywiki.wooledge.org/BashFAQ/001 to read the output of find line by line or into an array, or use find -exec to do the copy work.
Something like this:
from='/some/path/to/the/files'
ext='custom_file_extension'
dest='/new/destination/for/the/files/with/given/extension'
find "$from" -name "*.$ext" -exec cp -t "$dest" {} +
Using -exec command + here means that find will execute only as many cp commands as it needs, based on command-line length limits. Using -exec command ; here would run one cp per file found (but is more portable to older systems).
See comment from gniourf_gniourf about the use of -t in that cp command to make -exec command + work correctly.
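If your cp lacks GNU's -t (for example on BSD systems), a hedged portable workaround passes the destination first and shifts it off inside an inline shell:
find "$from" -name "*.$ext" -exec sh -c 'dest=$1; shift; cp "$@" "$dest"' _ "$dest" {} +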
Use -exec:
find "$from" -name "*.$ext" -exec cp {} "$dest" \;
You need to copy the files one by one:
for file in "$from"/*."$ext"; do
cp "$file" "$dest"
done
I just use a glob here, and it's enough and complete. I think find may introduce problems if the file name contains funny characters.
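Note that unlike find, this glob does not descend into subdirectories. A recursive variant, assuming bash 4+ with globstar:
shopt -s globstar nullglob # ** recurses; nullglob skips the loop when nothing matches
for file in "$from"/**/*."$ext"
do
cp "$file" "$dest"
done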
The solution for this sort of problem is xargs -0 and the -print0 flag for find.
-print0 instructs find to terminate each result with a NUL character instead of a newline, while -0 tells xargs to expect input in that format.
Finally, the -J option for xargs lets you put the arguments in the right place for a copy.
find "$from" -name "*.$ext" -print0 | xargs -0 -J % cp % "$dest"
It's better to use the -exec argument of the find command to do this:
find . -type f -name "*.ext" -exec cp {} ./destination_dir \;
I've checked this case with files containing spaces and it works for me. Also, don't forget to specify -type f if you want to find only files, not directories.

Insert line into multiple specified files

I want to insert a line at the start of multiple files of a specified type, located in the current directory or its subdirectories.
I know that using
find . -name "*.csv"
can help me list the files I want to insert into,
and using
sed -i '1icolumn1,column2,column3' test.csv
can insert one line at the start of a file,
but I do NOT know how to pipe the filenames from the find command to the sed command.
Could anybody give me any suggestion?
Or is there any better solution to do this?
BTW, is it possible to do this in a one-line command?
Try using xargs to pass the output of find as command-line arguments to the next command, here sed:
find . -type f -name '*.csv' -print0 | xargs -0 sed -i '1icolumn1,column2,column3'
Another option would be to use the -exec option of find:
find . -type f -name '*.csv' -exec sed -i '1icolumn1,column2,column3' {} \;
Note: it has been observed that xargs is the more efficient way, and it can run multiple processes using the -P option.
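For instance, a hedged sketch running up to four parallel sed processes, safe here because each file is edited independently:
find . -type f -name '*.csv' -print0 | xargs -0 -n 25 -P 4 sed -i '1icolumn1,column2,column3'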
This way:
find . -type f -name "*.csv" -exec sed -i '1icolumn1,column2,column3' {} +
-exec does all the magic here. The relevant part of man find:
-exec command ;
Execute command; true if 0 status is returned. All following arguments
to find are taken to be arguments to the command until an argument consisting
of `;' is encountered. The string `{}' is replaced by the current file name
being processed everywhere it occurs in the arguments to the command, not just
in arguments where it is alone, as in some versions of find. Both of
these constructions might need to be escaped (with a `\') or quoted to protect
them from expansion by the shell. See the EXAMPLES section for examples of
the use of the -exec option. The specified command is run once for each
matched file. The command is executed in the starting directory. There
are unavoidable security problems surrounding use of the -exec action;
you should use the -execdir option instead
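For example, the -execdir variant mentioned at the end of that excerpt runs the command from each file's own directory:
find . -type f -name '*.csv' -execdir sed -i '1icolumn1,column2,column3' {} +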

How to give the output of one command as the argument of another command in a shell script?

I have compiled the Linux kernel, and now I want to copy all *.ko files into one separate folder,
so
find ./kernel -name "*.ko"
It will give me the list of all .ko files.
Now I want to give this list as arguments to the cp command,
like
cp -rpf $filer_ko temp/
So how can I do this in a shell script and on the terminal?
This will find all the files that end in .ko and pipe that list into a loop that iterates over each file (even ones with spaces in their names, though that's probably not the case here) and copies each file into the temp directory.
find ./kernel -name "*.ko" | while IFS= read -r file; do cp "$file" temp/; done
I think the code below should satisfy your need; note that I have only given it a dry run.
Store the result in a variable, then loop over it to process the entries one by one:
#!/bin/bash
op=$(find ./kernel -name "*.ko")
for zf in $op # note: word splitting breaks on filenames with spaces
do
cp -rpf "$zf" temp/
tail "$zf"
done
xargs is normally used for this, but not here: find has this ability built in.
Try this:
find ./kernel -name "*.ko" -exec cp -rpf {} temp/ \;
Another approach: use xargs:
find ./kernel -name "*.ko" -print0 | xargs -0 cp -rpf -t temp/

Piping find results into grep for fast directory exclusion

I am successfully using find to create a list of all files in the current subdirectory, excluding those in the subdirectory "cache." Here's my first bit of code:
find . -wholename './cach*' -prune -o -print
I now wish to pipe this into a grep command. It seems like that should be simple:
find . -wholename './cach*' -prune -o -print | xargs grep -r -R -i "samson"
... but this is returning results that are mostly from the cache directory. I've tried removing the xargs reference, but that does what you'd expect: it runs grep on the text of the file names rather than on the files themselves. My goal is to find "samson" in any files that aren't cached content.
I'll probably get around this issue by just using doubled greps in this instance, but I'm very curious about why this one-liner behaves this way. I'd love to hear thoughts on a way to modify it while still using these two commands (as there are speed advantages to doing it this way).
(This is in CentOS 5, btw.)
The wholename match may be the reason why it's still including "cache" files. If you're executing the find command in the directory that contains the "cache" folder, it should work. If not, try changing it to -name '*cache*' instead.
Also, you do not need -r or -R for your grep; those tell it to recurse through directories, but you're testing individual files.
You can update your command using the piped version, or a single-command:
find . -name '*cache*' -prune -o -print0 | xargs -0 grep -il "samson"
or
find . -name '*cache*' -prune -o -exec grep -iq "samson" {} \; -print
Note: the -l in the first command tells grep to list the matching file rather than the matching line(s). The -q in the second does something similar; it tells grep to respond quietly, so find will then just print the filename.
You've told grep itself to recurse (twice! -r and -R are synonyms). Since one of the arguments you're passing is . (the top directory), grep is searching in every file (some of them twice, or even more if they're in subdirectories).
If you're going to use find and grep, do this:
find . -path './cach*' -prune -o -print0 | xargs -0 grep -i "samson"
Using -print0 and -0 makes your script work even with file names that contain spaces or punctuation characters.
However, you probably don't need to bother with find here, since GNU grep is capable of excluding directories:
grep -R --exclude-dir='cach*' -i "samson" .
(This also excludes ./deeply/nested/directory/cache. If you only want to exclude cache directories at the toplevel, use find as you did.)
Use the -exec option on find instead of piping the results to another command. From there you can use -exec grep "samson" {} + to look for samson in each file listed.
For example:
find . -wholename './cach*' -prune -o -exec grep "samson" "{}" +
