Running command recursively in linux

I'm trying to come up with a command that would run mp3gain FOLDER/SUBFOLDER/*.mp3 in each subfolder, but I'm having trouble understanding why this command doesn't work:
find . -type d -exec mp3gain \"{}\"/*.mp3 \;
When I run it, I get the error Can't open "./FOLDER/SUBFOLDER"/*.mp3 for reading for each folder and subfolder.
If I run the command manually with mp3gain "./FOLDER/SUBFOLDER"/*.mp3 it works. What's going wrong?

If you have a fixed data structure like
folder1/subfolder1/
folder1/subfolder2/
folder2/subfolder1/
[...]
and you are using zsh or bash version >= 4.0, you could try
mp3gain **/*.mp3
But to make sure check the output of
ls **/*.mp3
before getting serious with mp3gain.
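Note that in bash the recursive ** glob is off by default (zsh has it out of the box), so a quick sketch of the full sequence in bash would be:
shopt -s globstar      # enable recursive ** matching (bash >= 4.0)
ls **/*.mp3            # dry run: check which files would be picked up
mp3gain **/*.mp3       # then run the real command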

When you run mp3gain "./FOLDER/SUBFOLDER"/*.mp3 from your shell, the *.mp3 is getting expanded by the shell before being passed to mp3gain. When find runs it, there is no shell involved, and the *.mp3 is getting passed literally to mp3gain. The latter has no idea how to deal with wildcards (because normally it doesn't have to).
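If you want to keep find in the picture, one workaround (a sketch, not tested against your tree) is to have find start a small shell for each directory so that the glob is expanded there:
# A shell, not mp3gain, expands the glob inside each directory found.
# Directories without any .mp3 files still pass the literal pattern
# through, so mp3gain will complain for those just like in the question.
find . -type d -exec sh -c 'mp3gain "$1"/*.mp3' sh {} \;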

Hmmm. I just tried this to test how the directory is parsed, replacing mp3gain with echo, and it works:
find . -type d -exec echo {}\/\*.mp3 \;
Try running your version of the command but with echo to see the file output for yourself:
find . -type d -exec echo \"{}\"/*.mp3 \;
Seems the quotes get in the way in your original command.

This works:
find /music -name '*mp3' -exec mp3gain -r -k {} \;

Related

Embedding a bash command inside the mv command

I have a directory that contains a list of files having the following format:
240-timestamp1.ts
240-timestamp2.ts
...
360-timestamp1.ts
360-timestamp2.ts
Now, I want to implement a bash command which matches the files that start with '240' and renames them so that instead of '240-timestampX.ts' the files look like '240-human-readable-timestampX.ts'.
I have tried the following:
find . -maxdepth 1 -mmin +5 -type f -name "240*"
-exec mv $0 {$0/240-***and here I want to insert
either stat -c %y filename or date -d #timestampX***} '{}' \;
I'm stuck here because I don't know if I can embed a bash command inside the mv command. I know the task may look a bit confusing and over-complicated, but I would like to know if it is possible to do so. Of course I can create a bash script that goes through all the files in the directory in a while loop and changes their respective names, but somehow I think that a single command would be more efficient (even if less readable).
The OS is Linux Ubuntu 12.04.5
The shell is bash
Thank you both Kenavoz and Kurt Stutsman for the proposed solutions. Both your answers perform the task; however, I marked Kenavoz's answer as the accepted one because of the degree of similarity between my question and Kenavoz's answer. Even if it is indeed possible to do it in a cleaner way with omitting the find command, it is necessary in my case to use the respective command because I need to find files older than X units of time. So thank you both once again!
In case you want to keep your mmin option, you can use find and process the found files with a bash command using xargs:
find . -maxdepth 1 -mmin +5 -type f -name "240*.ts" | xargs -L 1 bash -c 'mv "$1" "240-$(stat -c %y "$1").ts"' bash
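Note that the pipe splits input on whitespace; if the file names can contain spaces, a NUL-delimited variant of the same idea is safer (a sketch):
# -print0 / xargs -0 keep file names with spaces intact;
# the found name is passed to bash as $1 (arg0 is just "bash").
find . -maxdepth 1 -mmin +5 -type f -name "240*.ts" -print0 |
  xargs -0 -I{} bash -c 'mv "$1" "240-$(stat -c %y "$1").ts"' bash {}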
In bash if all your files are in a single directory, you don't need to use find at all. You can do a for loop:
for file in 240-*; do
    hr_timestamp=$(date -d "$(echo "$file" | sed 's/.*-\([0-9]*\)\.ts/\1/')")
    mv "$file" "240-$hr_timestamp.ts"
done
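If the number embedded in the file name is a Unix epoch timestamp (an assumption; the question doesn't say), GNU date needs an @ prefix to parse it, e.g.:
# Hypothetical file name 240-1467196800.ts; "@" tells date the value
# is seconds since the epoch rather than a date string.
ts=$(echo "240-1467196800.ts" | sed 's/.*-\([0-9]*\)\.ts/\1/')
date -d "@$ts"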

bash find -exec sometimes works and sometimes doesn't

I'm probably missing something, but this one-liner in a bash script that cycles through some scripts that dump data from different sources:
find . -name 'dump-*.sh' -exec {} "$DUMP_LOG" &>>"$DUMP_LOG" \;
will work when I execute the bash script containing this one-liner directly, but it doesn't work when I invoke it as the cmd_preexec in rsnapshot. It doesn't produce any errors; it just doesn't do anything.
I tried adding '(/bin/)bash -c', like this:
find . -name 'dump-*.sh' -exec bash -c '{} "$DUMP_LOG" &>>"$DUMP_LOG"' \;
but then I get an error about (/bin/)bash not existing, even if I run the script directly.
OK, silly me. Of course the first parameter of the find command needs to point at the right directory; rsnapshot doesn't run the script from the directory I expected.
find /usr/local/sbin -name 'dump-*.sh' -exec {} "$DUMP_LOG" &>>"$DUMP_LOG" \;
has solved the problem.
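More generally, programs like rsnapshot or cron rarely start your script in the directory you use interactively, so either give find an absolute path (as above) or pin the working directory at the top of the script. A minimal sketch, using the /usr/local/sbin path from the command above:
#!/bin/bash
# Don't rely on the caller's working directory; rsnapshot's cwd
# is not the one you get in an interactive shell.
cd /usr/local/sbin || exit 1
find . -name 'dump-*.sh' -exec {} "$DUMP_LOG" \; &>> "$DUMP_LOG"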

Find files in a dir, executing a command with execdir and redirecting

It seems like I am unable to find a direct answer to this question.
I appreciate your help.
I'm trying to find all files with a specific name in a directory, read the last 1000 lines of each file and copy them into a new file in the same directory. As an example:
Find all files named xyz.log in the current directory, copy the last 1000 lines to a file abc.log (which doesn't exist).
I tried to use the following command with no luck:
find . -name "xyz.log" -execdir tail -1000 {} > abc.log \;
The problem I'm having is that for all the files found, the output is written to abc.log in the CURRENT directory and not in the directory where xyz.log resides. Clearly the redirection to abc.log is handled by my shell, not by find with -execdir.
Can you guys suggest a way to fix this? I appreciate any information/help.
EDIT: I tried find . -name "xyz.log" -execdir sh -c "tail -1000 {} > abc.log" \; as suggested by some of the friends, but it gives me the error sh: ./tail: No such file or directory. Do you have any idea what the problem is?
Luckily the solution using -printf works fine.
The simplest way is this:
find . -name "xyz.log" -execdir sh -c 'tail -1000 "{}" >abc.log' \;
A more flexible alternative is to first print out the commands and then execute them all with sh:
find . -name "xyz.log" -printf 'tail -1000 "%p" >"%h/abc.log"\n' | sh
You can remove the | sh from the end when you're trying it out/debugging.
There is a bug in some versions of findutils (4.2 and 4.3, though it was fixed in some 4.2.x and 4.3.x versions) that cause execdir arguments that contain {} to be prefixed with ./ (instead of the prefix being applied only to {} it is applied to the whole quoted string). To work around this you can use:
find . -name "xyz.log" -execdir sh -c 'tail -1000 "$1" >abc.log' sh {} \;
sh -c 'script' arg0 arg1 runs the sh script with arg0, arg1, etc. passed to it. By convention, arg0 is the name of the executable (here, "sh"). From the script you can access the arguments using $0 (corresponding to "sh"), $1 (corresponding to find's expansion of {}), etc.
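A quick way to see the argument assignment (a trivial example, not specific to find):
# $0 receives "sh", $1 receives /some/file.log
sh -c 'echo "zeroth: $0  first: $1"' sh /some/file.log
# prints: zeroth: sh  first: /some/file.log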
The redirect isn't passed into -execdir, so abc.log shows up in the directory you run the command in. -execdir also doesn't like embedded redirects, but you can work around the problem by passing -execdir a shell command with the redirect embedded, like this:
find . -name "xyz.log" -execdir sh -c '/usr/bin/tail -1000 {} > abc.log' \;
Much credit to this blog post (not mine):
http://www.microhowto.info/howto/act_on_all_files_in_a_directory_tree_using_find.html
Edit
I put the full path to tail in the command (assuming it's in /usr/bin on your system), since sh may load a .profile with a PATH that differs from your current shell.
Here's another, non-find approach (well, sorta: it still uses find, but doesn't try to shoehorn find into doing the whole thing):
while IFS= read -r f
do
    d=$(dirname "${f}")
    tail -n 1000 "${f}" > "${d}/abc.log"
done < <(find . -type f -name xyz.log -print)
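If any of the paths can contain spaces or newlines, a NUL-delimited version of the same loop is more robust (a sketch; bash-specific because of the process substitution):
while IFS= read -r -d '' f
do
    d=$(dirname "${f}")
    tail -n 1000 "${f}" > "${d}/abc.log"
done < <(find . -type f -name xyz.log -print0)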

Find not working in script, working in terminal prompt

I'm trying to run a bash script in Linux (Ubuntu but also Fedora), but the find command won't work.
search=\"*${exten[iterext]}\"
find $direc{iterdir} $r_option -iname $search exec -rm {} \\\;
Now to explain the variables:
exten is an array with file extensions read from a text file (no problem here).
direc is also an array, of directories read from the command line.
iterdir and iterext are integer loop counters.
Now I have two problems:
1 - This find command will not delete (or display, for that matter) anything if I run it inside a script; however, if I put an echo before the find and copy-paste the output to a command prompt, find works fine. I've tried the script under Ubuntu and Fedora, so I assume it's not a bash configuration issue. I should note that the issue seems to be $search, as I replaced $search with a hardcoded string (like "*txt") and it works inside the script, so it seems to be a quotation issue.
2 - When I run that entire find command I also get find: missing argument to '-exec'.
Please help :-( it's driving me insane.
Start simple by placing everything in the find command then worry about parameterizing it.
${exten[iterext]} should be ${exten[$iterext]}
$direc{iterdir} should be ${direc[$iterdir]}
exec -rm should be -exec rm
\\\; should be \;
Quote your variables to prevent word splitting
The following will perform a dry run thanks to the echo. Simply remove the echo when you are satisfied with the output to perform the deletions.
find "${direc[$iterdir]}" "$r_option" -name "*${exten[$iterext]}" -exec echo rm {} \;
Your use of quotes seems a little odd to me. Try this:
find "$direc{iterdir}" $r_option -iname "*${exten[iterext]}" -exec -rm "{}" ";"
Oh, and run your shell script with the -x option. This will print every command line before it is executed.
set -x
find "$direc{iterdir}" $r_option -iname "*${exten[iterext]}" -exec -rm "{}" ";"
set +x

Shell command to find files in a directory pattern

With a shell command I need to list all files on my server in the following directory pattern:
/home/*/public_html/images/*.php
There are a few and it's taking a long time to do this manually. I really have no idea when it comes to these commands.
find /path/to/directory/. -path "*/match/this/path/*" -type f -name "*.php"
Shell Script:
find /home/*/public_html/images -iname "*php" -exec echo {} \;
You can then change the -exec command to do whatever actions you want to the returned files. In this case, we echo them, but you could easily perform other actions as well.
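For example, to show a long listing for each match instead of just echoing the path (a trivial variation):
# Same search, different action: print owner, size and mtime per match
find /home/*/public_html/images -iname "*.php" -exec ls -l {} \;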
Let bash expand the files for you and use ls to list them:
ls /home/*/public_html/images/*.php
Example output:
/home/grant/public_html/images/bar.php
/home/grant/public_html/images/foo.php
/home/marcog/public_html/images/helloworld.php
Use the PHP glob function
glob('/home/*/public_html/images/*.php')
It will return an array of the matching path strings. You can also just use:
ls /home/*/public_html/images/*.php
or:
for i in /home/*/public_html/images/*.php;
do
some_command "$i"
done
from the shell.
