I have a piece of code that should work, but it doesn't.
I want to iterate through the files and subdirectories of the directories given on the command line and see which of them are files. The program never enters the if statement.
for i in $@;do
for j in `ls $i`;do
if [ -f $j ];then
echo $j is a file!
fi
done
done
Things can go wrong with your approach. Do it this way.
for i in "$#" ; do
for j in "$i"/* ; do
if [ -f "$j" ]; then
echo "$j is a regular file!"
fi
done
done
Changes:
Quoted the "$@" to avoid problems with file paths containing spaces or newlines.
Used shell globbing in the inner loop, as parsing ls output is not a good idea (see http://mywiki.wooledge.org/ParsingLs).
Double-quoted the variable expansion inside the test, again to allow for files with spaces or newlines.
Added "regular" to the output, because that is what this specific test operator checks for (it excludes not only directories but also devices, FIFOs and other special files).
You could simplify a bit if you are so inclined:
for i in "$#" ; do
for j in "$i"/* ; do
! [ -f "$j" ] || echo "$j is a regular file!"
done
done
If you want to use find, you need to make sure you only list files at a depth of one level (or else the results could be different from your code). You can do it this way:
find "$#" -mindepth 1 -maxdepth 1 -type f -exec echo "{} is a file" \;
Please note that this will still be a bit different, as globbing (by default) excludes files that start with a period. Adding shopt -s dotglob to the loop-based solution would allow globbing to consider all files, which should then make both solutions operate on the same files.
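For example, a sketch of the loop-based solution with dotglob turned on (and switched off again afterwards):
shopt -s dotglob
for i in "$@"; do
    for j in "$i"/*; do
        [ -f "$j" ] && echo "$j is a regular file!"
    done
done
shopt -u dotglob   # restore the default so later globs skip dotfiles again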
I think you'll be better off using find:
find "$@" -type f
I have a folder with a ton of old photos with many duplicates. Sorting it by hand would take ages, so I wanted to use the opportunity to use bash.
Right now I have the code:
#!/bin/bash
directory="~/Desktop/Test/*"
for file in ${directory};
do
for filex in ${directory}:
do
if [ $( diff {$file} {$filex} ) == 0 ]
then
mv ${filex} ~/Desktop
break
fi
done
done
And getting the exit code:
diff: {~/Desktop/Test/*}: No such file or directory
diff: {~/Desktop/Test/*:}: No such file or directory
File_compare: line 8: [: ==: unary operator expected
I've tried modifying working code I've found online, but it always seems to spit out some error like this. I'm guessing it's a problem with the nested for loop?
Also, why does it seem there are different ways to call variables? I've seen examples that use ${file}, "$file", and "${file}".
You have the {} in the wrong places:
if [ $( diff {$file} {$filex} ) == 0 ]
They should be at:
if [ $( diff ${file} ${filex} ) == 0 ]
(though the braces are optional now), but you should allow for spaces in the file names:
if [ $( diff "${file}" "${filex}" ) == 0 ]
Now it simply doesn't work properly, because when diff finds no differences, it generates no output (and you get errors because the == operator then has nothing on its left-hand side). You could sort of fix it by double-quoting the value from $(…) (if [ "$( diff … )" == "" ]), but you should simply and directly test the exit status of diff:
if diff "${file}" "${filex}"
then : no difference
else : there is a difference
fi
and maybe for comparing images you should be using cmp (in silent mode) rather than diff:
if cmp -s "$file" "$filex"
then : no difference
else : there is a difference
fi
In addition to the problems Jonathan Leffler pointed out:
directory="~/Desktop/Test/*"
for file in ${directory};
~ and * won't get expanded inside double quotes. The * will get expanded when you use the variable without quotes, but since the ~ won't, the shell looks for files under a directory literally named "~" (not your home directory) and finds no matches. Also, as Jonathan pointed out, using variables (like ${directory}) without double quotes will get you into trouble with filenames that contain spaces or other metacharacters. The better way to do this is to not put the wildcard in the variable; use it when you reference the variable, with the variable in double quotes and the * outside them:
directory=~/"Desktop/Test"
for file in "${directory}"/*;
Oh, and another note: when using mv in a script it's a good idea to use mv -i to avoid accidentally overwriting another file with the same name.
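Applied to the script above, that would look like:
mv -i "${filex}" ~/Desktop   # -i prompts before overwriting an existing file with the same name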
And: use shellcheck.net to sanity-check your code and point out common mistakes.
If you are simply interested in knowing if two files differ, cmp is the best option. Its advantages are:
It works for text as well as binary files, unlike diff which is for text files only
It stops after finding the first difference, and hence it is very efficient
So, your code could be written as:
if ! cmp -s "$file" "$filex"; then
# files differ...
mv "$filex" ~/Desktop
# any other logic here
fi
Hope this helps. I didn't understand what you are trying to do with your loops and hence didn't write the full code.
You can use diff "$file" "$filex" &>/dev/null and get the last command's result with $?:
#!/bin/bash
SEARCH_DIR="."
DEST_DIR="./result"
mkdir -p "$DEST_DIR"
directory="."
ls $directory | while read file;
do
ls $directory | while read filex;
do
if [ ! -d "$filex" ] && [ ! -d "$file" ] && [ "$filex" != "$file" ];
then
diff "$file" "$filex" &>/dev/null
if [ "$?" == 0 ];
then
echo "$filex is a duplicate. Copying to $DEST_DIR"
mv "$filex" "$DEST_DIR"
fi
fi
done
done
Note that you can also use the fslint or fdupes utilities to find duplicates.
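For example, with fdupes (assuming it is installed; the path is just the test directory from the question):
fdupes -r ~/Desktop/Test      # list groups of duplicate files, recursing into subdirectories
fdupes -r -d ~/Desktop/Test   # interactively choose which duplicates to delete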
Task: concatenate an array of strings with a delimiter; the delimiter is "/".
Meta-task: I have a folder with many files. I need to copy them into another folder.
So I need to get the "name of file" and the "path to folder".
What's wrong: the delimiter "/" works incorrectly. It doesn't get concatenated with my strings. If I try to use "\/", the string disappears entirely.
What's going on?
loc_path='./test/*'
delim='\/'
for itt in $loc_path; do
IFS=$delim
read -ra res <<< "$itt"
str=''
for ((i = 1; i <= ${#res[@]}; i++)); do
#str=($str${res[$i]}$delim)
str="$str${res[$i]}$delim"
done
echo $str
done
Please give two answers:
how to solve the task problem
a better way to solve the meta-task
There is an issue with delim='\/'. Firstly, you do not need to escape the slash. Secondly, all characters are already protected between single quotes.
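In other words, the assignment can simply be:
delim='/'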
There is a syntax issue with your concatenation. You must not use parentheses here! In an assignment they create an array (and elsewhere they open a subshell), which is not what you want.
To solve your 'meta-task', you should avoid using IFS or read. They are tricky to use (for example, by modifying IFS globally as you do, you change how echo displays the res array, which can mislead you while you troubleshoot). I suggest you use simpler tools such as basename.
Here are a few scripts to solve your meta (homework?) task:
# one line :-)
cp src/* dst/
# to illustrate basename etc
for file in "$SRC/"*; do
dest="$DST/$(basename $file)"
cp "$file" "$dest"
done
# with a change of directory
cd "$SRC"
for file in *; do cp "$file" "$DST/$file"; done
cd -
# Change of directory and a sub shell
(cd "$SRC" ; for file in *; do cp "$file" "$DST/$file"; done)
Task solution:
arr=( string1 string2 string3 ) # array of strings
str=$( IFS='/'; printf '%s' "${arr[*]}" ) # concatenated with / as delimiter
$str will be the single string string1/string2/string3.
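As a quick usage example of the same trick on path-like components (the sample values are made up):
parts=( . test "file 1.txt" )                    # pieces of a path, one containing a space
joined=$( IFS='/'; printf '%s' "${parts[*]}" )   # join them with / as the delimiter
echo "$joined"                                   # ./test/file 1.txt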
Meta task solution:
A few files:
cp path/to/source/folder/* path/to/dest/folder
Note that * matches any type of file and that it does not match hidden names. For hidden names, use shopt -s dotglob in bash. This will fail if there are thousands of files (argument list too long).
Few or many files, only non-directories:
for pathname in path/to/source/folder/*; do
[ ! -d "$pathname" ] && cp "$pathname" path/to/dest/folder
done
or, with find,
find path/to/source/folder -maxdepth 1 ! -type d -exec cp {} path/to/dest/folder \;
The difference between these two is that the shell loop will refuse to copy symbolic links that resolve to directories, while the find command will copy them.
If any of this isn't particularly clear, please let me know and I'll do my best to clarify.
I basically need to sort a set of files with various extensions and similar patterns to the filename, into directories and subdirectories that match the pattern and type of extension.
To elaborate a bit:
All files, regardless of extension, begin with the pattern "zz####" where #### is a number from 1 to 900; "zz1.zip through zz950.zip, zz1.mov through zz950.mov, zz1.mp4 through zz950.mp4"
Some files contain additional characters; "zz360_hello_world.zip"
Some files contain spaces; "zz370_hello world.zip"
I need these files to be sorted and moved into directories and subdirectories following a particular format: "/home/hello/zz1/zip, /home/hello/zz1/vid"
If the directories and/or subdirectories don't exist, I need them created.
Example:
zz400_testing.zip ----> /home/hello/zz400/zip
zz400 testing video.mov ----> /home/hello/zz400/vid
zz500.zip ----> /home/hello/zz500/zip
zz500_testing another video.mp4 ----> /home/hello/zz500/vid
I found a few answers around here for simpler use-cases, but wasn't able to get anything working for my particular needs.
Any help at all would be much appreciated.
Thank you!
EDIT: Adding the code I've been messing with
for f in *.zip; do
set=`echo "$f"|sed 's/[0-9].*//'`
dir="/home/demo/$set/photos"
mkdir -p "$dir"
mv "$f" "$dir"
done
I think I'm just having trouble wrapping my head around how to match with regex. I've got this far with it:
[demo#alpha grep]$ echo zz433.zip|sed 's/[0-9].*//'
zz
The script will run the mkdir, and even move the zip files into their proper place. I just can't get it to create the proper top-level directory (zz433).
The sed command here doesn't do what you're trying to achieve:
set=`echo "$f"|sed 's/[0-9].*//'`
The meaning of the regular expression [0-9].* is "a digit followed by anything".
The s/// command of sed performs a replacement.
The result is effectively removing everything from the input starting from the first digit.
So for "zz360_hello_world.zip" it removes everything starting from "3",
leaving only "zz".
Note also that the pattern *.zip doesn't match your description of the files: you're looking for files starting with "zz" and a number from 1 up to 900. If you don't mind including numbers > 900, you can write the loop expression like this:
for f in zz[0-9][^0-9]* zz[0-9][0-9][^0-9]* zz[0-9][0-9][0-9][^0-9]*; do
Or the same thing more compactly:
for f in zz{[0-9],[0-9][0-9],[0-9][0-9][0-9]}[^0-9]*; do
These are glob patterns.
zz[0-9][^0-9]* means "start with 'zz', followed by a digit, followed by a non-digit, followed by anything".
In the above example I use three patterns to cover the cases of "zz" followed by 1, 2 or 3 digits, followed by a non-digit.
The second example is a more compact form of the first,
the idea is that a{b,c}d expands to abd and acd.
Next, to get the appropriate prefix, you could use pattern matching with a case statement and extract substrings.
The syntax of these patterns is the same glob syntax as in the previous example in the for statement.
case "$f" in
zz[0-9][0-9][0-9]*) prefix=${f:0:5} ;;
zz[0-9][0-9]*) prefix=${f:0:4} ;;
zz[0-9]*) prefix=${f:0:3} ;;
esac
It seems you also want to create a grouping by file type. You could get the file extension by chopping off everything up to and including the last dot with ext=${f##*.}, and then use a case statement as in the earlier example to map extensions to the desired directory names.
Putting the above together:
for f in zz{[0-9],[0-9][0-9],[0-9][0-9][0-9]}[^0-9]*; do
case "$f" in
zz[0-9][0-9][0-9]*) prefix=${f:0:5} ;;
zz[0-9][0-9]*) prefix=${f:0:4} ;;
zz[0-9]*) prefix=${f:0:3} ;;
esac
ext=${f##*.}
case "$ext" in
mov|mp4) group=vid ;;
*) group=$ext ;;
esac
dir="/home/demo/$prefix/$group"
mkdir -p "$dir"
mv "$f" "$dir"
done
I've answered part of my own question!
for f in *.zip; do
set=`echo "$f"|grep -o -P 'zz[0-9]+.{0,0}'`
dir="/home/demo/$set/photos"
mkdir -p "$dir"
mv "$f" "$dir"
done
Basically, this script will grab files like:
zz232.zip
zz233test.zip
zz234 test.zip
Then it will create the top-level directory (zz####), the photos sub-directory, and move the file into place:
/home/demo/zz232/photos/zz232.zip
/home/demo/zz233/photos/zz233test.zip
/home/demo/zz234/photos/zz234 test.zip
Moving on to expanding the script for additional functionality.
Thanks all!
How about:
#!/bin/bash
IFS=$'\n'
for file in *; do
if [[ $file =~ ^(zz[0-9]+).*\.(zip|mov|mp4)$ ]]; then
ext=${BASH_REMATCH[2]}
if [ $ext = "mov" -o $ext = "mp4" ]; then
ext="vid"
fi
dir="/home/hello/${BASH_REMATCH[1]}/$ext"
mkdir -p $dir
mv "$file" $dir
fi
done
Hope this helps.
Let's say I have the following files in my current directory:
1.jpg
1original.jpg
2.jpg
2original.jpg
3.jpg
4.jpg
Is there a terminal/bash/linux command that can do something like
if the file [an integer]original.jpg exists,
then move [an integer].jpg and [an integer]original.jpg to another directory.
Executing such a command will cause 1.jpg, 1original.jpg, 2.jpg and 2original.jpg to be in their own directory.
NOTE
This doesn't have to be one command. It can be a combination of simple commands. Maybe something like: copy the original files to a new directory, then run some regular expression filter on the files in the new directory to get a list of file names from the old directory that still need to be copied over, etc.
Turning on extended glob support will allow you to write a regular-expression-like pattern. This can handle files with multi-digit integers, such as '87.jpg' and '87original.jpg'. Bash parameter expansion can then be used to strip "original" from the name of a found file to allow you to move the two related files together.
shopt -s extglob
for f in +([[:digit:]])original.jpg; do
mv "$f" "${f/original/}" otherDirectory
done
In an extended pattern, +( x ) matches one or more of the things inside the parentheses, analogous to the regular expression x+. Here, x is any digit. Therefore, we match all files in the current directory whose name consists of 1 or more digits followed by "original.jpg".
${f/original/} is an example of bash's pattern substitution. It removes the first occurrence of the string "original" from the value of f. So if f is the string "1original.jpg", then ${f/original/} is the string "1.jpg".
well, not directly, but it's a one-liner (edit: not anymore):
for i in [0-9].jpg; do
orig=${i%.*}original.jpg
[ -f $orig ] && mv $i $orig another_dir/
done
edit: I should probably explain my solution:
for i in [0-9].jpg: execute the loop body for each jpg file whose name is a single digit; the whole filename is stored in $i
orig=${i%.*}original.jpg: save in $orig the expected filename of the corresponding "original" file
[ -f $orig ]: check via test(1) (the [ ... ] stuff) whether the original file for $i exists. If yes, move both files to another_dir. This is done via &&: the part after it is only executed if the test was successful.
This should work for any strictly numeric prefix, e.g. 234.jpg:
for f in *original.jpg; do
pre=${f%original.jpg}
if [[ -e "$pre.jpg" && "$pre" -eq "$pre" ]] 2>/dev/null; then
mv "$f" "$pre.jpg" targetDir
fi
done
"$pre" -eq "$pre" gives an error if not integer
EDIT:
this fails if both original.jpg and .jpg exist.
$pre is then the null string, and "$pre" -eq "$pre" is true.
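One possible guard for that edge case (a sketch, not part of the original answer: it simply requires the prefix to be non-empty):
for f in *original.jpg; do
    pre=${f%original.jpg}
    if [[ -n "$pre" && -e "$pre.jpg" && "$pre" -eq "$pre" ]] 2>/dev/null; then
        mv "$f" "$pre.jpg" targetDir
    fi
done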
The following would work and is easy to understand (replace out with the output directory, and {1..9} with the actual range of your numbers):
for x in {1..9}
do
if [ -e ${x}original.jpg ]
then
mv $x.jpg out
mv ${x}original.jpg out
fi
done
You can obviously also enter it as a single line.
You can use regex statements to find "matches" in the file names that you are looking through, then perform your actions on the "matches" you find.
integer=0; while [ $integer -le 9 ] ; do if [ -e ${integer}original.jpg ] ; then mv -vi ${integer}.jpg ${integer}original.jpg lol/ ; fi ; integer=$[ $integer + 1 ] ; done
Note that here, "lol" is the destination directory. You can change it to anything you like. Also, you can change the 9 in while [ $integer -le 9 ] to check integers larger than 9. Right now it starts at 0* and stops after checking 9*.
Edit: If you want to, you can replace the semicolons in my code with carriage returns and it may be easier to read. Also, you can paste the whole block into the terminal this way, even if that might not immediately be obvious.
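For reference, the same loop spread over multiple lines (with the old-style $[ ... ] arithmetic written in the more common $(( ... )) form) might look like:
integer=0
while [ $integer -le 9 ]; do
    if [ -e ${integer}original.jpg ]; then
        mv -vi ${integer}.jpg ${integer}original.jpg lol/
    fi
    integer=$(( integer + 1 ))
done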
I'm trying to create a simple for loop that will take all the output of an ls command and put each item into a variable. So far I have:
for i in ls /home/svn
do
echo $i
done
but it gives me an error.
Because the ls needs to be executed:
for i in $(ls ...); do
echo $i
done
Also, you might want to consider shell globbing instead:
for i in /home/svn/*; do
echo $i
done
... or find, which allows very fine-grained selection of the properties of items to find:
for i in $(find /home/svn -type f); do
echo $i
done
Furthermore, if you can have white space in the segments of the path or the file names themselves, use a while loop (previous example adjusted):
find /home/svn -type f|while read i; do
echo $i
done
while reads line-wise, so that the white space is preserved.
Concerning calling basename, you have two options:
# Call basename
echo $(basename $i)
# ... or use string substitution
echo ${i##*/}
To explain the substitution: ## removes the longest front-anchored (prefix) match of the pattern from the string, # removes the shortest front-anchored match, %% removes the longest back-anchored (suffix) match, and % removes the shortest back-anchored match.
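A quick illustration of the four forms (the sample path is made up):
p=/home/svn/archive.tar.gz
echo "${p##*/}"   # archive.tar.gz          (longest prefix matching */ removed)
echo "${p#*/}"    # home/svn/archive.tar.gz (shortest prefix matching */ removed)
echo "${p%%.*}"   # /home/svn/archive       (longest suffix matching .* removed)
echo "${p%.*}"    # /home/svn/archive.tar   (shortest suffix matching .* removed)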
You don't need to use ls to go over the files in this case; the following will do the job: for i in /home/svn/*; do echo $i; done
You want to assign the output of the ls command to i, so you need to enclose it in backticks or the $() operator:
for i in $(ls /home/svn)
do
echo $i
done
That's because you're doing it wrong in the first place.
for i in /home/svn/*
do
echo "$i"
done