Run command from variables in shell script - linux

I wrote this piece of code to scan a directory for files newer than a reference file while excluding specific subdirectories.
#!/bin/bash
dateMarker="date.marker"
fileDate=$(date +%Y%m%d)
excludedDirs=('./foo/bar' './foo/baz' './bar/baz')
excludedDirsNum=${#excludedDirs[@]}
for (( i=0; i < $excludedDirsNum; i++)); do
myExcludes=${myExcludes}" ! -wholename '"${excludedDirs[${i}]}"*'"
done
find ./*/ -type f -newer $dateMarker $myExcludes > ${fileDate}.changed.files
However the excludes are just being ignored. When I "echo $myExcludes" it looks just fine and furthermore the script behaves just as intended if I replace "$myExcludes" in the last line with the output of the echo command. I guess it's some kind of quoting/escaping error, but I haven't been able to eliminate it.

Seems to be a quoting problem; try using arrays:
#!/bin/bash
dateMarker=date.marker
fileDate=$(date +%Y%m%d)
excludedDirs=('./foo/bar' './foo/baz' './bar/baz')
args=(find ./*/ -type f -newer "$dateMarker")
for dir in "${excludedDirs[@]}"
do
args+=('!' -wholename "$dir")
done
"${args[#]}" > "$fileDate.changed.files"
Maybe you also need -prune:
args=(find ./*/)
for dir in "${excludedDirs[@]}"
do
args+=('(' -wholename "$dir" -prune ')' -o)
done
args+=('(' -type f -newer "$dateMarker" -print ')')
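As in the first snippet, the assembled array is then executed directly and its output redirected (reusing $fileDate from above):
"${args[@]}" > "$fileDate.changed.files"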

You need myExcludes to evaluate to something like this:
\( -name foo/bar -o -name foo/baz -o -name bar/baz \)
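A sketch of building that grouped expression safely in an array (note -path rather than -name, since -name never matches a pattern containing a slash; variable names follow the question):
#!/bin/bash
dateMarker="date.marker"
fileDate=$(date +%Y%m%d)
excludedDirs=('./foo/bar' './foo/baz' './bar/baz')
excludes=('(')
for dir in "${excludedDirs[@]}"; do
    excludes+=(-path "$dir*" -o)
done
excludes[${#excludes[@]}-1]=')'   # swap the trailing -o for the closing paren
find ./*/ -type f -newer "$dateMarker" ! "${excludes[@]}" > "$fileDate.changed.files"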

Related

Find and copy multiple files that contain a pattern in Linux

When I need to copy multiple files in the same dir I can just:
cp file{20..30} newLocation
But when I combine that with find, it doesn't work.
find . -name 'file{20..30}' -exec cp '{}' newLocation ';'
What am I doing wrong?
The {20..30} range syntax is a special feature of Bash's command-line parser. It is not part of standard POSIX globbing, such as find's -name test performs, and it's not even recognized by Bash in some contexts where you might like it to be.
You already know the simpler, more direct alternative that I would otherwise recommend for your example case. You could also do something like
find . '(' -name 'file2[0-9]' -o -name file30 ')' -exec cp '{}' newLocation ';'
, though that doesn't work very well if the endpoints of the range are determined dynamically.
If the point of using find is to avoid problems arising from some of the files not existing, then you might consider addressing it like this:
for f in file{20..30}; do
[[ -e "$f" ]] && cp "$f" newLocation
done
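If the endpoints are computed at runtime, a C-style loop sidesteps the fact that brace expansion happens before variables are expanded (a minimal sketch; first and last are hypothetical variables):
first=20 last=30   # hypothetical endpoints, e.g. computed earlier in the script
for (( i = first; i <= last; i++ )); do
    f="file$i"
    [[ -e $f ]] && cp "$f" newLocation
done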
As mentioned in the other answer, enclose the names in an opening \( and closing \), with -o and -name between them.
Something like this should be able to build that list from your input files or strings.
#!/usr/bin/env bash
format() {
local f
declare -ag files
for f; do
files+=( -o -name "$f" )
done
}
format file{10..20}
Now check the value of "${files[@]}"
declare -p files
Output
declare -a files=([0]="-o" [1]="-name" [2]="file10" [3]="-o" [4]="-name" [5]="file11" [6]="-o" [7]="-name" [8]="file12" [9]="-o" [10]="-name" [11]="file13" [12]="-o" [13]="-name" [14]="file14" [15]="-o" [16]="-name" [17]="file15" [18]="-o" [19]="-name" [20]="file16" [21]="-o" [22]="-name" [23]="file17" [24]="-o" [25]="-name" [26]="file18" [27]="-o" [28]="-name" [29]="file19" [30]="-o" [31]="-name" [32]="file20")
To see what it would look like when used as input to find:
printf '%s\n' "\( ${files[*]} \)"
Output
\( -o -name file10 -o -name file11 -o -name file12 -o -name file13 -o -name file14 -o -name file15 -o -name file16 -o -name file17 -o -name file18 -o -name file19 -o -name file20 \)
To use that array files as an input to find
find . -type f \( "${files[@]:1}" \) -exec bash -c 'echo cp -v -- "$@" /destination' _ {} +
The "${files[@]:1}" removes the leading -o
Remove the echo if you're satisfied with the output, so the copying actually occurs.
Or just use globstar with nullglob:
#!/usr/bin/env bash
shopt -s globstar nullglob
cp -v ./**/file{10..20} /destination
The leading ./ means the current directory. It could instead be
/path/to/source/**/file{10..20}
where /path/to/source/ is the directory where the files in question are.

Circumvent Argument list too long in script (for loop)

I've seen a few answers regarding this, but as a newbie, I don't really understand how to implement that in my script.
it should be pretty easy (for those who can do stuff like this)
I'm using a simple
for f in "/drive1/"images*.{jpg,png}; do
but this is simply overloading and giving me
Argument list too long
How is this easiest solved?
Argument list too long workaround
The argument list length is limited by your system configuration:
getconf ARG_MAX
2097152
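GNU xargs can report the effective limit too, which accounts for the space your environment variables already occupy:
xargs --show-limits </dev/null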
But after some discussion about the differences between bash specifics and system (OS) limitations (see the comments from that other guy), this question seems to miss the point:
In the comments, the OP tried something like:
ls "/simple path"/image*.{jpg,png} | wc -l
bash: /bin/ls: Argument list too long
This happens because of an OS limitation, not bash!
But tested with the OP's code, this works fine:
for file in ./"simple path"/image*.{jpg,png} ;do echo -n a;done | wc -c
70980
Like:
printf "%c" ./"simple path"/image*.{jpg,png} | wc -c
Reduce line length by reducing the fixed part:
First step: you could reduce the argument length by:
cd "/drive1/"
ls images*.{jpg,png} | wc -l
But when the number of files grows, you'll hit the limit again...
More general workaround:
find "/drive1/" -type f \( -name '*.jpg' -o -name '*.png' \) -exec myscript {} +
If you want this to NOT be recursive, you may add -maxdepth as 1st option:
find "/drive1/" -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.png' \) \
-exec myscript {} +
There, myscript will be run with the filenames as arguments. The command line for myscript is built up until it reaches a system-defined limit.
myscript /drive1/file1.jpg '/drive1/File Name2.png' /drive1/...
From man find:
-exec command {} +
This variant of the -exec action runs the specified command on
the selected files, but the command line is built by appending
each selected file name at the end; the total number of invoca‐
tions of the command will be much less than the number of
matched files. The command line is built in much the same way
that xargs builds its command lines. Only one instance of `{}'
is allowed within the command.
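If you prefer xargs, this is a roughly equivalent formulation (a sketch; -print0 and -0 keep filenames containing spaces or newlines intact):
find "/drive1/" -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.png' \) -print0 |
    xargs -0 myscript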
In-script sample
You could create your script like this:
#!/bin/bash
target=( "/drive1" "/Drive 2/Pictures" )
[ "$1" = "--run" ] && exec find "${target[#]}" -type f \( -name '*.jpg' -o \
-name '*.png' \) -exec $0 {} +
for file ;do
echo Process "$file"
done
Then you have to run this with --run as an argument.
works with any number of files! (recursively! see the maxdepth option)
permits many targets
permits spaces and special characters in file and directory names
you can run the same script directly on files, without --run:
./myscript hello world 'hello world'
Process hello
Process world
Process hello world
Using pure bash
Using arrays, you could do things like:
allfiles=( "/drive 1"/images*.{jpg,png} )
[ -f "$allfiles" ] || { echo No file found.; exit ;}
echo Number of files: ${#allfiles[@]}
for file in "${allfiles[@]}";do
echo Process "$file"
done
There's also a while read loop:
find "/drive1/" -maxdepth 1 -mindepth 1 -type f \( -name '*.jpg' -o -name '*.png' \) |
while IFS= read -r file; do
    echo Process "$file"
done
or, more robustly, with zero-terminated file names:
find "/drive1/" -maxdepth 1 -mindepth 1 -type f \( -name '*.jpg' -o -name '*.png' \) -print0 |
while IFS= read -r -d '' file; do
    echo Process "$file"
done
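One caveat (an addition, not from the original answer): piping into while runs the loop body in a subshell, so any variables you set there vanish afterwards. Feeding the loop through process substitution avoids that:
count=0
while IFS= read -r -d '' file; do
    echo Process "$file"
    (( count++ ))
done < <(find "/drive1/" -maxdepth 1 -type f \( -name '*.jpg' -o -name '*.png' \) -print0)
echo "Processed $count files"   # count survives here, unlike with a pipe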

BASH: Filter list of files by return value of another command

I have a series of directories with (mostly) video files in them, say
test1
  1.mpg
  2.avi
  3.mpeg
  junk.sh
test2
  123.avi
  432.avi
  432.srt
test3
  asdf.mpg
  qwerty.mpeg
I create a variable (video_dir) with the directory names (based on other parameters) and use that with find to generate the basic list. I then filter based on another variable (video_type) for file types (because there are sometimes non-video files in the dirs), piping it through egrep. Then I shuffle the list around and save it out to a file. That file is later used by mplayer to slideshow through the list.
I currently use the following command to accomplish that. I'm sure it's a horrible way to do it, but it works for me and it's quite fast even on big directories.
video_dir="/test1 /test2"
video_types=".mpg$|.avi$|.mpeg$"
find ${video_dir} -type f |
egrep -i "${video_types}" |
shuf > "$TEMP_OUT"
I now would like to add the ability to filter out files based on the resolution height of the video file. I can get that from:
mediainfo --Output='Video;%Height%' filename
Which just returns a number. I have tried using the -exec functionality of find to run that command on each file.
find ${video_dir} -type f -exec mediainfo --Output='Video;%Height%' {} \;
but that just returns the list of heights, not the filenames, and I can't figure out how to reject files based on a comparison, like <480.
I could do a for next loop but that seems like a bad (slow) idea.
Using info from @mark-setchell I modified it to:
video_dir="test1"
find ${video_dir} -type f \
-exec bash -c 'h=$(mediainfo --Output="Video;%Height%" "$1"); [[ $h -gt 480 ]]' _ {} \; -print
Which works.
You can replace your egrep with the following so you are still inside the find command (-iname is case insensitive and -o represents a logical OR):
find test1 test2 -type f \
\( -iname "*.mpg" -o -iname "*.avi" -o -iname "*.mpeg" \) \
NEXT_BIT
The NEXT_BIT can then -exec bash and exit with status 0 or 1 depending on whether you want the current file included or excluded. So it will look like this:
-exec bash -c 'H=$(mediainfo -output ... "$1"); [ $H -lt 480 ] && exit 1; exit 0' _ {} \;
So, taking note of @tripleee's advice in comments about superfluous exit statements, I get this:
find test1 test2 -type f \
\( -iname "*.mpg" -o -iname "*.avi" -o -iname "*.mpeg" \) \
-exec bash -c 'h=$(mediainfo ...options... "$1"); [ $h -lt 480 ]' _ {} \; -print
This Q&A was focused on one particular case, so the accepted answer is not as general as it could be.
find
If the list of files comes from find, one can use its filtering facilities, e.g. -exec:
find ${video_dir} -type f \
-exec COMMAND \; \
-print
Here
COMMAND is not enclosed in quotes -- find reads everything after -exec and up to a \;
find will expand {} to the current file name (including path -- you might find -execdir helpful, which will cd to the file's directory and replace {} with the leaf file name)
The exit code of COMMAND is treated as follows:
0 -> true
non-0 -> false
Note that you can build more complex expressions (e.g. -not -exec ...), which will be evaluated "from left to right, according to the rules of precedence ... -and is assumed where the operator is omitted." (per man find)
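For example, to invert the filter and keep only the files at or below 480 pixels, you could negate the -exec test (a sketch reusing the mediainfo call from the question):
find test1 test2 -type f \
    -not -exec bash -c 'h=$(mediainfo --Output="Video;%Height%" "$1"); [[ $h -gt 480 ]]' _ {} \; \
    -print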
xargs
If the list of files comes from elsewhere (and is available on stdin), you can use xargs as follows (from If xargs is map, what is filter?):
ls | xargs -I{} bash -c "COMMAND '{}' && echo '{}'"
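The ls-based version is fragile with unusual filenames; a null-delimited variant of the same filter idiom is safer (a sketch, assuming GNU xargs; COMMAND is the placeholder from above):
find . -maxdepth 1 -type f -print0 |
    xargs -0 -I{} bash -c 'COMMAND "$1" && printf "%s\n" "$1"' _ {}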
Here is my solution.
#!/bin/bash
shopt -s nullglob
video_dir=(/test1 /test2)
while IFS= read -rd '' file; do
if [[ $file = *.@(mpg|avi|mpeg|mp4) ]]; then
h=$(mediainfo --Output="Video;%Height%" "$file")
(( h >= 480 )) && echo "$file"
fi
done < <(find "${video_dir[@]}" -type f -print0)
With this solution you can process everything inside the while read loop.

Exclude list of files from find

If I have a list of filenames in a text file that I want to exclude when I run find, how can I do that? For example, I want to do something like:
find /dir -name "*.gz" -exclude_from skip_files
and get all the .gz files in /dir except for the files listed in skip_files. But find has no -exclude_from flag. How can I skip all the files in skip_files?
I don't think find has an option like this, but you could build a command using printf and your exclude list:
find /dir -name "*.gz" $(printf "! -name %s " $(cat skip_files))
Which is the same as doing:
find /dir -name "*.gz" ! -name first_skip ! -name second_skip .... etc
Alternatively you can pipe from find into grep:
find /dir -name "*.gz" | grep -vFf skip_files
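Note that grep -F still does substring matching, so a skip_files line like 1.gz would also filter out 11.gz; adding -x restricts it to whole-line matches (assuming skip_files holds the same full paths that find prints):
find /dir -name "*.gz" | grep -vxFf skip_files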
This is what I usually do to remove some files from the result (in this case I looked for all text files but wasn't interested in a bunch of Valgrind memcheck reports we have here and there):
find . -type f -name '*.txt' ! -name '*mem*.txt'
It seems to be working.
I think you can try it like this:
find /dir \( -name "*.gz" ! -name skip_file1 ! -name skip_file2 ...so on \)
find /var/www/test/ -type f \( -iname "*.*" ! -iname "*.php" ! -iname "*.jpg" ! -iname "*.png" \)
The above command gives a list of all files excluding files with the .php, .jpg and .png extensions. This command works for me in PuTTY.
Josh Jolly's grep solution works, but has O(N**2) complexity, making it too slow for long lists. If the lists are sorted first (O(N*log(N)) complexity), you can use comm, which has O(N) complexity:
find /dir -name '*.gz' |sort >everything_sorted
sort skip_files >skip_files_sorted
comm -23 everything_sorted skip_files_sorted | xargs . . . etc
man your computer's comm for details.
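With bash process substitution you can skip the temporary files and express the same idea in one line:
comm -23 <(find /dir -name '*.gz' | sort) <(sort skip_files)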
This solution goes through all files (rather than excluding them from the find command itself), but produces output that skips files on an exclusion list.
I found that useful while running a time-consuming command (find /dir -exec md5sum {} \;).
You can create a shell script to handle the skipping logic and run commands on the files found (make it executable with chmod, replace echo with other commands):
$ cat skip_file.sh
#!/bin/bash
found=$(grep "^$1$" files_to_skip.txt)
if [ -z "$found" ]; then
# run your command
echo "$1"
fi
Create a file with the list of files to skip named files_to_skip.txt (on the dir you are running from).
Then use find using it:
find /dir -name "*.gz" -exec ./skip_file.sh {} \;
This should work:
find * -name "*.gz" $(printf "! -path %s " $(<skip_files.txt))
Working out
Assuming skip_files has a filename on each line, you can get the list of filenames via $(<skip_files.txt). E.g. echo $(<skip_files.txt) should print them all out.
For each filename you want to have a ! -path filename expression. To build this, use $(printf "! -path %s " $(<skip_files.txt))
Then, put it together with a filter on -name "*.gz"
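One caveat: the printf trick relies on word splitting, so it breaks if any listed name contains whitespace. A hedged array-based sketch that survives such names:
excludes=()
while IFS= read -r name; do
    excludes+=(! -path "$name")
done < skip_files.txt
find * -name "*.gz" "${excludes[@]}"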

file search bash script

I am trying to make a very simple bash script to find files matching the given name in the directory structure of the current directory. So, I used the find command like this:
ARGS=1
E_BADARGS=65
E_NOFILE=66
if [ $# -ne "$ARGS" ] # Correct number of arguments not passed
then
echo "Usage: `basename $0` filename"
exit $E_BADARGS
fi
echo `find ./ -type f -name \$1`
This works fine, but unlike when I use the find command on the command line, the resulting file paths are not separated by newlines but just by spaces. This naturally isn't easy to read on the screen. How can I echo so that each file it finds is separated by a newline?
I would change your find command to this one:
find . -maxdepth 1 -type f -name "$1"
Note that the double quotes are quite important in the find command, so that the glob pattern is passed to find instead of being expanded by the shell first. Also, I added -maxdepth 1 to search for files only in the current directory.
As @codaddict noted, echo is unnecessary here. But it's also a good exercise to understand why your code behaves this way. Hint: compare
echo `find ./ -type f -name \$1`
and
echo "`find ./ -type f -name \$1`"
Try changing
echo `find ./ -type f -name \$1`
to
find ./ -type f -name "$1"
