How can I tell find to print a list of files based on the results of running a pipeline on each file? - linux

I know we can use find -exec ... to specify a command to run on each file, and output only those files for which the command succeeds; for example, find . -exec test -d {} \; -print will print out all the directories. I would like to give -exec a pipeline, and have find return the files for which the last command of the pipeline returned true.
To be specific, I would like to run jar -t on each file and grep the output for a class name. I tried find . -name \*jar -exec jar -tf {} \|grep -q foo \; -print, but it returns all the files. How can I change that?

Using a command pipeline as the argument to -exec would only work if find ran it through a subshell. But find just executes a single command with an exec() function call, and all following tokens up to the ; are passed to that command as arguments.
So when you need shell features such as pipes, you have to invoke the shell yourself:
$ find . -name '*.jar' -exec sh -c 'jar -tf {} | grep -q foo' \; -print
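Note that substituting {} directly into the sh -c string can misbehave if a filename contains quotes or $( ) sequences; a slightly safer variant of the same idea passes the filename to the inner shell as a positional parameter:
$ find . -name '*.jar' -exec sh -c 'jar -tf "$1" | grep -q foo' sh {} \; -print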

Related

Better way of using the redirection symbol in conjunction with find -exec

My goal is to empty every file in a directory. I DON'T want to actually delete the file, I just want to delete its contents.
If you want to do this with a single file you can do > file.txt
If I want to run this operation on every file in a directory I can do this:
find . -exec /bin/bash -c '> {}' \;
Notice how the above command has to call /bin/bash. This is because simply running the command like find . -exec > {} \; fails with find: invalid argument `;' to `-exec'. I suspect this is because the redirection symbol is confusing the command.
I would like to run this command without needing to run /bin/bash within -exec
How can this be done?
One easy way to do this is by using truncate:
find -type f -exec truncate -s0 {} +
If you want to only use bash, you could use a while loop:
find -type f -print0 |
while IFS= read -r -d '' file; do
    > "$file"
done
Finally, if you didn't mind using bash -c, it would be better to do it as follows to avoid calling bash so many times:
find -type f -exec bash -c 'for file; do > "$file"; done' - {} +
although I don't like that solution.
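For reference, for file without an in list iterates over the positional parameters, which is why the filenames appended after the - become the loop items; a minimal illustration:
$ bash -c 'for f; do echo "got: $f"; done' - a b c
got: a
got: b
got: c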

Missing Syntax of moving file from one folder to another [duplicate]

I was helped out today with a command, but it doesn't seem to be working. This is the command:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 && rm {}\;
The shell returns
find: missing argument to `-exec'
What I am basically trying to do is go through a directory recursively (if it has other directories) and run the ffmpeg command on the .rm file types and convert them to .mp3 file types. Once this is done, remove the .rm file that has just been converted.
A -exec command must be terminated with a ; (so you usually need to type \; or ';' to avoid interpretation by the shell) or a +. The difference is that with ;, the command is called once per file, while with + it is called as few times as possible (usually once, but there is a maximum length for a command line, so it might be split up) with all filenames. See this example:
$ cat /tmp/echoargs
#!/bin/sh
echo $1 - $2 - $3
$ find /tmp/foo -exec /tmp/echoargs {} \;
/tmp/foo - -
/tmp/foo/one - -
/tmp/foo/two - -
$ find /tmp/foo -exec /tmp/echoargs {} +
/tmp/foo - /tmp/foo/one - /tmp/foo/two
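(As an aside, the maximum command-line length that forces + to split into several invocations can be inspected with getconf; the value below is just an example and varies by system:)
$ getconf ARG_MAX
2097152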
Your command has two errors:
First, you use {};, but the ; must be a parameter of its own.
Second, the command ends at the &&. You specified "run find, and if that was successful, remove the file named {};". If you want to use shell features in the -exec command, you need to explicitly run it in a shell, such as -exec sh -c 'ffmpeg ... && rm'.
However, you should not put the {} inside the shell command string; it will cause problems when filenames contain special characters. Instead, you can pass additional parameters to the shell after -c command_string (see man sh):
$ ls
$(echo damn.)
$ find * -exec sh -c 'echo "{}"' \;
damn.
$ find * -exec sh -c 'echo "$1"' - {} \;
$(echo damn.)
You see the $ thing is evaluated by the shell in the first example. Imagine there was a file called $(rm -rf /) :-)
(Side note: The - is not strictly needed, but the first argument after the command string is assigned to the variable $0, which is a special variable normally containing the name of the program being run, and setting that to a real parameter is a little unclean, though it probably won't cause any harm here; so we set it to just - and start with $1.)
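A quick illustration of how the arguments after the -c string land in $0 and $1 (the values here are arbitrary):
$ sh -c 'echo "0=$0 1=$1"' - hello
0=- 1=hello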
So your command could be something like
find -exec bash -c 'ffmpeg -i "$1" -sameq "$1".mp3 && rm "$1"' - {} \;
But there is a better way. find supports -and and -or, so you can do things like find -name foo -or -name bar. That also works with -exec, which evaluates to true if the command exits successfully, and to false if not. See this example:
$ ls
false true
$ find * -exec {} \; -and -print
true
It only runs the -print if the command ran successfully, which it did for true but not for false.
So you can use two -exec clauses chained with -and, and the latter will only be executed if the former ran successfully.
Try putting a space before each \;
Works:
find . -name "*.log" -exec echo {} \;
Doesn't Work:
find . -name "*.log" -exec echo {}\;
I figured it out now. When you need to run two commands in exec in a find you need to actually have two separate execs. This finally worked for me.
find . -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 \; -exec rm {} \;
You have to put a space between {} and \;
So the command will be like:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 && rm {} \;
Just for your information:
I have just tried using the "find -exec" command on a Cygwin system (UNIX emulated on Windows), and there it seems that the backslash before the semicolon must be removed:
find ./ -name "blabla" -exec wc -l {} ;
For anyone else having issues when using a GNU find binary in a Windows command prompt: the semicolon needs to be escaped with ^
find.exe . -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 ^;
You need to do some escaping I think.
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} \-sameq {}.mp3 \&\& rm {}\;
Just in case anyone sees a similar "missing -exec args" in Amazon Opsworks Chef bash scripts, I needed to add another backslash to escape the \;
bash 'remove_wars' do
  user 'ubuntu'
  cwd '/'
  code <<-EOH
    find /home/ubuntu/wars -type f -name "*.war" -exec rm {} \\;
  EOH
  ignore_failure true
end
Also, if anyone else gets the "find: missing argument to -exec" error, this might help:
In some shells you don't need to do the escaping, i.e. you don't need the "\" in front of the ";":
find <file path> -name "myFile.*" -exec rm -f {} ;
Both {} and && will cause problems due to being expanded by the command line. I would suggest trying:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i \{} -sameq \{}.mp3 \; -exec rm \{} \;
In my case I needed to execute "methods" from my bash script, which does not work when using -exec bash -c, so I am adding another solution I found here as well:
UploadFile() {
    curl ... -F "file=$1"
}
find . | while read -r file; do
    UploadFile "$file"
done
This thread pops up first when searching for solutions to execute commands for each file from find, so I hope it's okay that this solution does not use the -exec argument
I got the same error when I left a blank space after the ending ; of an -exec command. So, remove any blank space after the ;.
If you are still getting "find: missing argument to -exec" try wrapping the execute argument in quotes.
find <file path> -type f -exec "chmod 664 {} \;"

Find files in a dir, executing a command with execdir and redirecting

It seems like I am unable to find a direct answer to this question.
I appreciate your help.
I'm trying to find all files with a specific name in a directory, read the last 1000 lines of each file, and copy them into a new file in the same directory. As an example:
Find all files named xyz.log in the current directory, copy the last 1000 lines to file abc.log (which doesn't exist).
I tried to use the following command with no luck:
find . -name "xyz.log" -execdir tail -1000 {} > abc.log \;
The problem I'm having is that the output for all the matched files gets written to abc.log in the CURRENT directory, and not in the directory where each xyz.log resides. Clearly the redirection to abc.log is applied to the whole find command by my shell, rather than per directory by -execdir.
Can you guys suggest a way to fix this? I appreciate any information/help.
EDIT - I tried find . -name "xyz.log" -execdir sh -c "tail -1000 {} > abc.log" \; as suggested by some of the friends, but it gives me this error: sh: ./tail: No such file or directory. Do you guys have any idea what the problem is?
Luckily the solution to use -printf is working fine.
The simplest way is this:
find . -name "xyz.log" -execdir sh -c 'tail -1000 "{}" >abc.log' \;
A more flexible alternative is to first print out the commands and then execute them all with sh:
find . -name "xyz.log" -printf 'tail -1000 "%p" >"%h/abc.log"\n' | sh
You can remove the | sh from the end when you're trying it out/debugging.
There is a bug in some versions of findutils (4.2 and 4.3, though it was fixed in some 4.2.x and 4.3.x releases) that causes -execdir arguments containing {} to be prefixed with ./ (instead of the prefix being applied only to {}, it is applied to the whole quoted string). To work around this you can use:
find . -name "xyz.log" -execdir sh -c 'tail -1000 "$1" >abc.log' sh {} \;
sh -c 'script' arg0 arg1 runs the sh script with arg0, arg1, etc. passed to it. By convention, arg0 is the name of the executable (here, "sh"). From the script you can access the arguments using $0 (corresponding to "sh"), $1 (corresponding to find's expansion of {}), etc.
The redirect isn't passed into -execdir, so abc.log shows up in the directory you run the command in; -execdir itself can't handle an embedded redirect. But you can work around the problem by passing -execdir a shell command with the redirect inside it, like this:
find . -name "xyz.log" -execdir sh -c '/usr/bin/tail -1000 {} > abc.log' \;
Much credit to this blog post (not mine):
http://www.microhowto.info/howto/act_on_all_files_in_a_directory_tree_using_find.html
Edit
I put the full path to tail in the command (assuming it's in /usr/bin on your system), since sh may load a .profile with a PATH that differs from your current shell.
Here's another non-find approach (well, sorta - it still uses find, but doesn't try to shoehorn find into doing the whole thing):
while read -r f
do
    d=$(dirname "${f}")
    tail -n 1000 "${f}" > "${d}/abc.log"
done < <(find . -type f -name xyz.log -print)
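If the filenames may contain leading whitespace or newlines, a null-delimited variant of the same loop is a bit more robust (a sketch; like the loop above, it relies on bash for the process substitution):
while IFS= read -r -d '' f
do
    d=$(dirname "${f}")
    tail -n 1000 "${f}" > "${d}/abc.log"
done < <(find . -type f -name xyz.log -print0)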

Run expand on find results

I'm trying to run the expand shell command on all files found by a find command. I've tried -exec and xargs but both failed. Can anyone explain to me why? I'm on a Mac, for the record.
find . -name "*.php" -exec expand -t 4 {} > {} \;
This just creates a file {} with all the output instead of overwriting each individual found file itself.
find . -name "*.php" -print0 | xargs -0 -I expand -t 4 {} > {}
And this just outputs
4 {}
xargs: 4: No such file or directory
Your command does not work for two reasons.
The output redirection is done by the shell and not by find. That means that the shell will redirect find's output into the file {}.
The redirection would occur immediately. That means that the file will be written even before it is read by the expand command. So it's not possible to redirect a command's output into the input file.
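A quick way to see the second point, using a throwaway file:
$ echo hello > f.txt
$ expand -t 4 f.txt > f.txt
$ cat f.txt
The cat prints nothing: the shell truncated f.txt before expand ever read it.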
Unfortunately expand doesn't provide a way to write its output into a file, so you have to use output redirection. If you use bash you could define a function that executes expand, redirects the output into a temporary file, and moves the temporary file back over the original. The problem is that find runs the command in a new shell, which knows nothing about functions defined in your current shell.
But there is a solution:
expand_func () {
    expand -t 4 "$1" > "$1.tmp"
    mv "$1.tmp" "$1"
}
export -f expand_func
find . -name \*.php -exec bash -c 'expand_func {}' \;
The function expand_func is exported to subshells using export -f, and instead of executing expand itself with find -exec, you execute a new bash that runs the exported expand_func.
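Since substituting {} directly into the bash -c string can break on filenames containing quotes or $( ) sequences, a variant of the same call that passes the name as a positional parameter instead would be:
find . -name \*.php -exec bash -c 'expand_func "$1"' bash {} \;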
'expand' isn't really worth the trouble.
You can just use sed instead:
find . -name "*.php" | xargs sed -i -e 's/\t/ /g'
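Note that this replaces each tab with a single space. If the goal is four spaces per tab (still only an approximation of expand, which pads to tab stops), a GNU sed version would look like the line below; the stock BSD sed on a Mac needs an empty suffix after -i and does not understand \t, so this is GNU-sed-only as written:
find . -name "*.php" -print0 | xargs -0 sed -i -e 's/\t/    /g'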

Why does find -exec mv {} ./target/ + not work?

I want to know exactly what {} \; and {} \+ and | xargs ... do. Please clarify these with explanations.
The three commands below run and produce the same output, but the first command takes a little more time and its output format is also a little different.
find . -type f -exec file {} \;
find . -type f -exec file {} \+
find . -type f | xargs file
It's because the first one runs the file command for every file coming from the find command. So, basically it runs as:
file file1.txt
file file2.txt
But the latter two commands run file once for all the files, like below:
file file1.txt file2.txt
Then I ran the following commands; the first one runs without a problem, but the second one gives an error message.
find . -type f -iname '*.cpp' -exec mv {} ./test/ \;
find . -type f -iname '*.cpp' -exec mv {} ./test/ \+ #gives error:find: missing argument to `-exec'
For the command with {} \+, it gives me the error message
find: missing argument to `-exec'
Why is that? Can anyone please explain what I am doing wrong?
The manual page (or the online GNU manual) pretty much explains everything.
find -exec command {} \;
For each result, command {} is executed. All occurrences of {} are replaced by the filename. The ; is prefixed with a backslash to prevent the shell from interpreting it.
find -exec command {} +
Each result is appended to command and executed afterwards. Taking the command-line length limitations into account, I guess that this command may be executed multiple times, with the manual page supporting me:
the total number of invocations of the command will be much less than the number of matched files.
Note this quote from the manual page:
The command line is built in much the same way that xargs builds its command lines
That's why no characters are allowed between {} and + except for whitespace. + makes find detect that the arguments should be appended to the command just like xargs.
The solution
Luckily, the GNU implementation of mv can accept the target directory as an option, with either -t or the longer --target-directory. Its usage will be:
mv -t target file1 file2 ...
Your find command becomes:
find . -type f -iname '*.cpp' -exec mv -t ./test/ {} \+
From the manual page:
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of `;' is encountered. The string `{}' is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find. Both of these constructions might need to be escaped (with a `\') or quoted to protect them from expansion by the shell. See the EXAMPLES section for examples of the use of the -exec option. The specified command is run once for each matched file. The command is executed in the starting directory. There are unavoidable security problems surrounding use of the -exec action; you should use the -execdir option instead.
-exec command {} +
This variant of the -exec action runs the specified command on the selected files, but the command line is built by appending each selected file name at the end; the total number of invocations of the command will be much less than the number of matched files. The command line is built in much the same way that xargs builds its command lines. Only one instance of `{}' is allowed within the command. The command is executed in the starting directory.
I encountered the same issue on Mac OSX, using a ZSH shell: in this case there is no -t option for mv, so I had to find another solution.
However the following command succeeded:
find .* * -maxdepth 0 -not -path '.git' -not -path '.backup' -exec mv '{}' .backup \;
The secret was to quote the braces. No need for the braces to be at the end of the exec command.
I tested under Ubuntu 14.04 (with BASH and ZSH shells), it works the same.
However, when using the + sign, it seems indeed that it has to be at the end of the exec command.
The standard equivalent of find -iname ... -exec mv -t dest {} + for find implementations that don't support -iname or mv implementations that don't support -t is to use a shell to re-order the arguments:
find . -name '*.[cC][pP][pP]' -type f -exec sh -c '
  exec mv "$@" /dest/dir/' sh {} +
By using -name '*.[cC][pP][pP]', we also avoid the reliance on the current locale to decide what's the uppercase version of c or p.
Note that +, contrary to ;, is not special in any shell, so it doesn't need to be quoted (though quoting won't harm, except of course with shells like rc that don't support \ as a quoting operator).
The trailing / in /dest/dir/ is so that mv fails with an error instead of renaming foo.cpp to /dest/dir in the case where only one cpp file was found and /dest/dir didn't exist or wasn't a directory (or symlink to directory).
find . -name "*.mp3" -exec mv --target-directory=/home/d0k/Музика/ {} \+
No, the difference between + and \; is the other way around: + appends the files to the end of the exec command and then runs the command, while \; runs the command once for each file.
The problem is that find . -type f -iname '*.cpp' -exec mv {} ./test/ \+ should be find . -type f -iname '*.cpp' -exec mv {} ./test/ +; there is no need to escape or terminate the +.
I haven't used xargs in a long time, but I think it works like +.
