Variable causing issue while doing the test command in unix: find $PWD -type d -exec sh -c 'test "{}" ">" "$PWD/$VersionFolders"' \; -print|wc -l - linux

Variable causing issue while doing the test command in unix. The command is:
find $PWD -type d -exec sh -c 'test "{}" ">" "$PWD/$VersionFolders"' \; -print|wc -l
Input values:
Here $PWD is the current directory, which contains:
b1_v.1.0
b1_v.1.2
b1_v.1.3
b1_v.1.4
The given version folder ($VersionFolders) is:
b1_v.1.2
The command should check whether any folders exist in the current directory that are greater than the given version folder, and count or display them.
The comparison should not depend on the folders' creation date or time.
Expected output:
b1_v.1.3
b1_v.1.4
If I hard-code the directory it works fine, but when I pass it as a variable it returns all folders.
This command works fine:
find $PWD -type d -exec sh -c 'test "{}" ">" "$PWD/b1_v.1.2"' \; -print|wc -l
This command with the variable does not work:
find $PWD -type d -exec sh -c 'test "{}" ">" "$PWD/$VersionFolders"' \; -print|wc -l

The variable $VersionFolders won't get expanded inside the single quotes, and apparently you did not export it to make it visible to subprocesses.
An obscure but common hack is to put it in $0 (which nominally should contain the name of the shell itself, and is the first argument after sh -c '...') because that keeps the code simple.
find . -type d \
-exec sh -c 'test "$1" ">" "$0"' \
"$VersionFolders" {} \; \
-print | wc -l
But as @chepner remarks, you can run -exec test {} ">" "$VersionFolders" \; directly.
The shell already does everything in the current directory, so you don't need to spell out $PWD. Perhaps see also What exactly is current working directory?
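Another option, since the problem is simply the variable's visibility in the subshell: export it, and pass {} as a positional parameter so the inner shell does the expansion. A sketch along the lines of the question's command (untested):
export VersionFolders
find "$PWD" -type d -exec sh -c 'test "$1" ">" "$PWD/$VersionFolders"' sh {} \; -print | wc -l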

Related

How to use grep to reverse search files in a folder

I'm trying to create a script which will find missing topics from multiple log files. These log files are filled top-down, so the newest logs are at the bottom of each file. I would like to grep only the last line from each file which includes UNKNOWN_TOPIC_OR_PARTITION. This should be done in multiple files with completely different names. Is grep the best solution, or is there another solution that suits my needs? I already tried adding tail, but that doesn't seem to work.
missingTopics=$(grep -Ri -m1 --exclude=*.{1,2,3,4,5} UNKNOWN_TOPIC_OR_PARTITION /app/tibco/log/tra/domain/)
You could try a combination of find, tac and grep:
find /app/tibco/log/tra/domain -type f ! -name '*.[1-5]' -exec sh -c \
'tac "$1" | grep -im1 UNKNOWN_TOPIC_OR_PARTITION' "sh" '{}' \;
tac prints files in reverse; the -exec sh -c SCRIPT "sh" '{}' \; action of find executes the shell SCRIPT each time a file matching the previous tests is found. The SCRIPT is executed with "sh" as parameter $0 and the path of the found file as parameter $1.
If performance is an issue you can probably improve it with:
find . -type f ! -name '*.[1-5]' -exec sh -c 'for f in "$@"; do \
tac "$f" | grep -im1 UNKNOWN_TOPIC_OR_PARTITION; done' "sh" '{}' +
which will spawn fewer shells. If security is also a concern, you can replace -exec with -execdir (even though with this SCRIPT I do not immediately see any exploit).
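For reference, the -execdir variant only changes the action name; find then runs the command from the directory containing each matched file (untested sketch):
find /app/tibco/log/tra/domain -type f ! -name '*.[1-5]' -execdir sh -c \
'tac "$1" | grep -im1 UNKNOWN_TOPIC_OR_PARTITION' "sh" '{}' \;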

BASH: Filter list of files by return value of another command

I have series of directories with (mostly) video files in them, say
test1
1.mpg
2.avi
3.mpeg
junk.sh
test2
123.avi
432.avi
432.srt
test3
asdf.mpg
qwerty.mpeg
I create a variable (video_dir) with the directory names (based on other parameters) and use that with find to generate the basic list. I then filter it for file types based on another variable (video_types), because there are sometimes non-video files in the dirs, by piping it through egrep. Then I shuffle the list around and save it out to a file. That file is later used by mplayer to slideshow through the list.
I currently use the following command to accomplish that. I'm sure it's a horrible way to do it, but it works for me and it's quite fast even on big directories.
video_dir="/test1 /test2"
video_types=".mpg$|.avi$|.mpeg$"
find ${video_dir} -type f |
egrep -i "${video_types}" |
shuf > "$TEMP_OUT"
I now would like to add the ability to filter out files based on the resolution height of the video file. I can get that from:
mediainfo --Output='Video;%Height%' filename
Which just returns a number. I have tried using the -exec functionality of find to run that command on each file.
find ${video_dir} -type f -exec mediainfo --Output='Video;%Height%' {} \;
but that just returns the list of heights, not the filenames, and I can't figure out how to reject files based on a comparison, like <480.
I could do a for loop, but that seems like a bad (slow) idea.
Using info from @mark-setchell I modified it to:
video_dir="test1"
find ${video_dir} -type f \
-exec bash -c 'h=$(mediainfo --Output="Video;%Height%" "$1"); [[ $h -gt 480 ]]' _ {} \; -print
Which works.
You can replace your egrep with the following so you are still inside the find command (-iname is case insensitive and -o represents a logical OR):
find test1 test2 -type f \
\( -iname "*.mpg" -o -iname "*.avi" -o -iname "*.mpeg" \) \
NEXT_BIT
The NEXT_BIT can then -exec bash and exit with status 0 or 1 depending on whether you want the current file included or excluded. So it will look like this:
-exec bash -c 'H=$(mediainfo -output ... "$1"); [ $H -lt 480 ] && exit 1; exit 0' _ {} \;
So, taking note of @tripleee's advice in the comments about superfluous exit statements, I get this:
find test1 test2 -type f \
\( -iname "*.mpg" -o -iname "*.avi" -o -iname "*.mpeg" \) \
-exec bash -c 'h=$(mediainfo ...options... "$1"); [ $h -lt 480 ]' _ {} \; -print
This Q&A was focused on one particular case, so the accepted answer is not as general as it could be.
find
If the list of files comes from find, one can use its filtering facilities, e.g. -exec:
find ${video_dir} -type f \
-exec COMMAND \; \
-print
Here
COMMAND is not enclosed in quotes -- find reads everything after -exec and up to a \;
find will expand {} to the current file name (including path -- you might find -execdir helpful, which will cd to the file's directory and replace {} with the leaf file name)
The exit code of COMMAND is treated as follows:
0 -> true
non-0 -> false
Note that you can build more complex expressions (e.g. -not -exec ...), which will be evaluated "from left to right, according to the rules of precedence ... -and is assumed where the operator is omitted." (per man find)
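For instance, negating the mediainfo check from the question prints the files whose height is not greater than 480 (a sketch, untested):
find ${video_dir} -type f \
-not -exec bash -c 'h=$(mediainfo --Output="Video;%Height%" "$1"); [[ $h -gt 480 ]]' _ {} \; \
-print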
xargs
If the list of files comes from elsewhere (and is available on stdin), you can use xargs as follows (from "If xargs is map, what is filter?"):
ls | xargs -I{} bash -c "COMMAND '{}' && echo '{}'"
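Note that splicing {} into the command string is fragile when file names contain special characters (see the discussion of {} inside sh -c further down this page); a safer variant of the same idea passes the name as a positional parameter, with COMMAND standing for whatever filter you use:
ls | xargs -I{} bash -c 'COMMAND "$1" && echo "$1"' _ {}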
Here is my solution.
#!/bin/bash
shopt -s nullglob
video_dir=(/test1 /test2)
while IFS= read -rd '' file; do
if [[ $file = *.@(mpg|avi|mpeg|mp4) ]]; then
h=$(mediainfo --Output="Video;%Height%" "$file")
(( h >= 480 )) && echo "$file"
fi
done < <(find "${video_dir[@]}" -type f -print0)
With this solution you can process everything inside the while-read loop.

How to exclude new lines (carriage returns) using find regex on GNU/Linux cli?

I'm trying to rename all files in a directory with the JPG extension to lowercase jpg.
I've made this bash code with the help of this post:
find . -regex ".*\.JPG" -exec sh -c 'echo "$0" | sed -r "s/\.JPG/\.jpg/" && mv "$0" "$1"' {} \;
But I get the following error:
./IMG_1352.jpg
mv: cannot move './IMG_1352.JPG' to '': No such file or directory
(and so on...)
I think I need to swap the names' places, but I don't know how.
TL;DR: Try this code:
find . -name '*.JPG' -type f -exec sh -c 'mv "$0" "${0/.JPG/.jpg}"' {} \;
More commentary:
Your code is pretty close to where it needs to be. There is an issue, however, inasmuch as $1 is not defined.
I ran the following code to figure out what value was going to be in $1.
$ ls
bye.jpg hi.JPG
$ find . -regex ".*\.JPG" -exec sh -c 'echo "$0" | sed "s/\.JPG/\.jpg/" && echo "0:$0 1:$1" ' {} \;
./hi.jpg
0:./hi.JPG 1:
According to my results above, there is no value in $1. You should be aware that $1 actually refers to the second parameter passed to the sh. That is, sh -c 'code' $0 $1 ...
Anyway, you'll need to capture the result of the sed command in a variable to pass it to the move command as follows:
find . -regex ".*\.JPG" -exec sh -c '
lower=$( echo "$0" | sed -r "s/\.JPG/\.jpg/" )
mv "$0" "$lower"
' {} \;
All that said, you could make this more concise. As PS suggests, the for loop is a good choice. Also, the JPG -> jpg replacement is more readable with PS's suggestion. If you are sold on using find, you can incorporate PS's suggestion as follows:
find . -regex ".*\.JPG" -exec sh -c '
filename=$0
mv "$filename" "${filename/.JPG/.jpg}"
' {} \;
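For completeness, a sketch of the for-loop variant referred to above, limited to a single (non-recursive) directory and assuming bash:
for f in ./*.JPG; do
  [ -e "$f" ] || continue   # skip the literal pattern if nothing matches
  mv "$f" "${f%.JPG}.jpg"
done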

Missing Syntax of moving file from one folder to another [duplicate]

I was helped out today with a command, but it doesn't seem to be working. This is the command:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 && rm {}\;
The shell returns
find: missing argument to `-exec'
What I am basically trying to do is go through a directory recursively (if it has other directories) and run the ffmpeg command on the .rm file types and convert them to .mp3 file types. Once this is done, remove the .rm file that has just been converted.
An -exec command must be terminated with a ; (so you usually need to type \; or ';' to avoid interpretation by the shell) or a +. The difference is that with ;, the command is called once per file; with +, it is called as few times as possible (usually once, but there is a maximum length for a command line, so it might be split up) with all filenames. See this example:
$ cat /tmp/echoargs
#!/bin/sh
echo $1 - $2 - $3
$ find /tmp/foo -exec /tmp/echoargs {} \;
/tmp/foo - -
/tmp/foo/one - -
/tmp/foo/two - -
$ find /tmp/foo -exec /tmp/echoargs {} +
/tmp/foo - /tmp/foo/one - /tmp/foo/two
Your command has two errors:
First, you use {};, but the ; must be a parameter of its own.
Second, the command ends at the &&. You specified "run find, and if that was successful, remove the file named {};".
However, you should not put the {} inside the shell command string; it will cause problems when there are special characters in file names. Instead, you can pass additional parameters to the shell after -c command_string (see man sh):
$ ls
$(echo damn.)
$ find * -exec sh -c 'echo "{}"' \;
damn.
$ find * -exec sh -c 'echo "$1"' - {} \;
$(echo damn.)
You see the $ thing is evaluated by the shell in the first example. Imagine there was a file called $(rm -rf /) :-)
(Side note: the - is not needed, but the first argument after the command string is assigned to the special variable $0, which normally contains the name of the program being run; setting that to a real parameter is a little unclean, though it probably won't cause any harm here, so we set it to just - and start with $1.)
So your command could be something like
find -exec bash -c 'ffmpeg -i "$1" -sameq "$1".mp3 && rm "$1"' - {} \;
But there is a better way: find supports -and and -or, so you may do stuff like find -name foo -or -name bar. That also works with -exec, which evaluates to true if the command exits successfully, and to false if not. See this example:
$ ls
false true
$ find * -exec {} \; -and -print
true
It only runs the -print if the command was successful, which it was for true but not for false.
So you can use two exec statements chained with an -and, and it will only execute the latter if the former was run successfully.
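Applied to the original command, that could look like the following sketch (with GNU find, {} is also substituted inside {}.mp3 for the \; form):
find /home/me/download/ -type f -name "*.rm" \
-exec ffmpeg -i {} -sameq {}.mp3 \; -and -exec rm {} \;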
Try putting a space before each \;
Works:
find . -name "*.log" -exec echo {} \;
Doesn't Work:
find . -name "*.log" -exec echo {}\;
I figured it out now. When you need to run two commands with -exec in a find, you actually need two separate -execs. This finally worked for me:
find . -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 \; -exec rm {} \;
You have to put a space between {} and \;
So the command will look like this:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 && rm {} \;
Just for your information:
I have just tried using the "find -exec" command on a Cygwin system (UNIX emulated on Windows), and there it seems that the backslash before the semicolon must be removed:
find ./ -name "blabla" -exec wc -l {} ;
For anyone else having issues when using GNU find binary in a Windows command prompt. The semicolon needs to be escaped with ^
find.exe . -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 ^;
You need to do some escaping I think.
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} \-sameq {}.mp3 \&\& rm {}\;
Just in case anyone sees a similar "missing -exec args" in Amazon Opsworks Chef bash scripts, I needed to add another backslash to escape the \;
bash 'remove_wars' do
user 'ubuntu'
cwd '/'
code <<-EOH
find /home/ubuntu/wars -type f -name "*.war" -exec rm {} \\;
EOH
ignore_failure true
end
Also, if anyone else gets the "find: missing argument to -exec" error, this might help:
In some shells you don't need to do the escaping, i.e. you don't need the "\" in front of the ";".
find <file path> -name "myFile.*" -exec rm -f {} ;
Both {} and && will cause problems due to being expanded by the command line. I would suggest trying:
find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i \{} -sameq \{}.mp3 \; -exec rm \{} \;
In my case I needed to execute "methods" from my bash script, which does not work when using -exec bash -c, so I'll add another solution I found here as well:
UploadFile() {
curl ... -F "file=$1"
}
find . | while IFS= read -r file
do
UploadFile "$file"
done
This thread pops up first when searching for solutions to execute commands for each file from find, so I hope it's okay that this solution does not use the -exec argument
I got the same error when I left a blank space after the ending ; of an -exec command. So, remove any blank space after the ;.
If you are still getting "find: missing argument to -exec" try wrapping the execute argument in quotes.
find <file path> -type f -exec "chmod 664 {} \;"

Find and basename not playing nicely

I want to echo out the filename portion of a find on the linux commandline. I've tried to use the following:
find www/*.html -type f -exec sh -c "echo $(basename {})" \;
and
find www/*.html -type f -exec sh -c "echo `basename {}`" \;
and a whole host of other combinations of escaping and quoting various parts of the text. The result is that the path isn't stripped:
www/channel.html
www/definition.html
www/empty.html
www/index.html
www/privacypolicy.html
Why not?
Update: While I have a working solution below, I'm still interested in why "basename" doesn't do what it should do.
The trouble with your original attempt:
find www/*.html -type f -exec sh -c "echo $(basename {})" \;
is that the $(basename {}) code is executed once, before the find command is executed. The output of the single basename is {} since that is the basename of {} as a filename. So, the command that is executed by find is:
sh -c "echo {}"
for each file found, but find actually substitutes the original (unmodified) file name each time because the {} characters appear in the string to be executed.
If you wanted it to work, you could use single quotes instead of double quotes:
find www/*.html -type f -exec sh -c 'echo $(basename {})' \;
However, making echo repeat to standard output what basename would have written to standard output anyway is a little pointless:
find www/*.html -type f -exec sh -c 'basename {}' \;
and we can reduce that still further, of course, to:
find www/*.html -type f -exec basename {} \;
Could you also explain the difference between single quotes and double quotes here?
This is routine shell behaviour. Let's take a slightly different command (but only slightly — the names of the files could be anywhere under the www directory, not just one level down), and look at the single-quote (SQ) and double-quote (DQ) versions of the command:
find www -name '*.html' -type f -exec sh -c "echo $(basename {})" \; # DQ
find www -name '*.html' -type f -exec sh -c 'echo $(basename {})' \; # SQ
The single quotes pass the material enclosed direct to the command. Thus, in the SQ command line, the shell that launches find removes the enclosing quotes and the find command sees its $9 argument as:
echo $(basename {})
because the shell removes the quotes. By comparison, the material in the double quotes is processed by the shell. Thus, in the DQ command line, the shell (that launches find — not the one launched by find) sees the $(basename {}) part of the string and executes it, getting back {}, so the string it passes to find as its $9 argument is:
echo {}
Now, when find does its -exec action, in both cases it replaces the {} by the filename that it just found (for sake of argument, www/pics/index.html). Thus, you get two different commands being executed:
sh -c 'echo $(basename www/pics/index.html)' # SQ
sh -c "echo www/pics/index.html" # DQ
There's a (slight) notational cheat going on there — those are the equivalent commands that you'd type at the shell. The $2 of the shell that is launched actually has no quotes in it in either case — the launched shell does not see any quotes.
As you can see, the DQ command simply echoes the file name; the SQ command runs the basename command and captures its output, and then echoes the captured output. A little bit of reductionist thinking shows that the DQ command could be written as -print instead of using -exec, and the SQ command could be written as -exec basename {} \;.
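Spelled out, those two reduced equivalents are (a sketch):
find www -name '*.html' -type f -print                # DQ equivalent: prints the path
find www -name '*.html' -type f -exec basename {} \;  # SQ equivalent: prints the leaf name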
If you're using GNU find, it supports the -printf action which can be followed by Format Directives such that running basename is unnecessary. However, that is only available in GNU find; the rest of the discussion here applies to any version of find you're likely to encounter.
Try this instead:
find www/*.html -type f -printf '%f\n'
If you want to do it with a pipe (more resources needed):
find www/*.html -type f -print0 | xargs -0 -n1 basename
That's how I batch-resize files with ImageMagick, deriving the output filename from the source:
find . -name header.png -exec sh -c 'convert -geometry 600 {} $(dirname {})/$(basename {} ".png")_mail.png' \;
I had to accomplish something similar, and found that following the practices mentioned for avoiding looping over find's output, and using find with sh, sidestepped these problems with {} and -printf entirely.
You can try it like this:
find www/*.html -type f -exec sh -c 'echo $(basename $1)' find-sh {} \;
The summary is: don't reference {} directly inside of sh -c; instead pass it to sh -c as an argument, then you can reference it with a positional parameter inside sh -c. The find-sh is just there as a dummy to take up $0; there is more utility in doing it that way and using {} for $1.
I'm assuming the use of echo is really to simplify the concept and test the function. There are easier ways to simply echo, as others have mentioned, but an ideal use case for this scenario might be using cp, mv, or any more complex command where you want to reference the found file names more than once and you need to get rid of the path, e.g. when you have to specify the filename in both source and destination or when you are renaming things.
So for instance, if you wanted to copy only the html documents to your public_html directory (Why? because Example!) then you could:
find www/*.html -type f -exec sh -c 'cp /var/www/$(basename $1) /home/me/public_html/$(basename $1)' find-sh {} \;
Over on unix stackexchange, user wildcard's answer on looping with find goes into some great gems on usage of -exec and sh -c. (You can find it here: https://unix.stackexchange.com/questions/321697/why-is-looping-over-finds-output-bad-practice)
