Arguments in bash - Linux

This script moves all the doc files to a specified directory. I have managed to add an argument, but the problem I'm facing is passing the full path that the files should move to. For example, I want to run the script like this:
./loo -d followed by the path the files are moving to (i.e. ./loo -d with the destination as the second argument)
This is my code:
#!/bin/bash
From="/home/elg19/lone/doc"
To="/home/elg19/documents"
if [ "$1" = -d ]; then
    cd "$From"
    for i in pdf txt doc; do
        find . -type f -name "*.${i}" -exec mv "{}" "$To" \;
    done
fi

I'm not sure what the exact problem is.
Do you need to put quotes around the full path because it contains spaces?
./loo -d "full path with spaces"
Similarly to $1, the full path can be retrieved with $2.
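For example, a minimal change to the script above (just a sketch, keeping its structure) would be to take the destination from the second argument and fall back to the old default when none is given:
#!/bin/bash
From="/home/elg19/lone/doc"
To="${2:-/home/elg19/documents}"    # use the 2nd argument if given, else the old default
if [ "$1" = -d ]; then
    cd "$From" || exit 1
    for i in pdf txt doc; do
        find . -type f -name "*.${i}" -exec mv "{}" "$To" \;
    done
fi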

how about this?
#!/bin/bash
from=/home/elg19/lone/doc
if [[ $1 = -d ]]; then
    to=$2
else
    to=/home/elg19/documents
fi
find "$from" -type f \( -name '*.pdf' -o -name '*.txt' -o -name '*.doc' \) -exec bash -c 'dest=$1; shift; mv "$@" "$dest"' _ "$to" {} +

Related

Bash Globbing Pattern Matching for Imagemagick recursive convert to pdf

I have the following 2 scripts, which recursively convert folders of images to PDFs for my wife's Japanese manga Kindle, using find and ImageMagick convert:
#!/bin/bash
_d="$(pwd)"
echo "$_d"
find . -type d -exec echo "Will convert in the following order: {}" \;
find . -type d -exec echo "Converting: '{}'" \; -exec convert '{}/*.jpg' "$_d/{}.pdf" \;
and the same for PNG
#!/bin/bash
_d="$(pwd)"
echo "$_d"
find . -type d -exec echo "Will convert in the following order: {}" \;
find . -type d -exec echo "Converting: '{}'" \; -exec convert '{}/*.png' "$_d/{}.pdf" \;
Unfortunately I am not able to make one universal script that works for all image formats.
How do I make one script that works for both?
I would also need JPG and PNG, as well as jpeg and JPEG.
Thanks in advance.
I wouldn't use find at all, just a loop:
#!/usr/bin/env bash
# enable recursive globs
shopt -s globstar
for dir in **/*/; do
    printf "Converting jpgs in %s\n" "$dir"
    convert "$dir"/*.jpg "$dir/out.pdf"
done
If you want to combine .jpg and .JPG in the same pdf, add nocaseglob to the shopt line. Want to add .jpeg to the mix? Add extglob and change "$dir"/*.jpg to "$dir"/*.@(jpg|jpeg)
You can do more complicated actions if you turn the find exec into a bash function (or even a standalone script).
#!/bin/bash
do_convert()(
    shopt -s nullglob
    for dir in "$@"; do
        files=("$dir"/*.{jpg,JPG,PNG,jpeg,JPEG})
        if [[ -z $files ]]; then
            echo 1>&2 "no suitable files in $dir"
            continue
        fi
        echo "Converting $dir"
        convert "${files[@]}" "$dir.pdf"
    done
)
export -f do_convert
pwd
echo "Will convert in the following order:"
find . -type d
# find . -type d -exec bash -c 'do_convert {}' \;
find . -type d -exec bash -c 'do_convert "$@"' -- {} \+
nullglob makes *.xyz return nothing if there is no match, instead of returning the original string unchanged
p/*.{a,b,c} expands into p/*.a p/*.b p/*.c before the * are expanded
x()(...) instead of the more normal x(){...} uses a subshell so we don't have to remember to unset nullglob again or clean up any variable definitions
export -f x makes function x available in subshells
we skip conversion if there are no suitable files
with the slightly more complicated find command, we can reduce the number of invocations of bash (probably doesn't save a great deal in this particular case)
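As a quick illustration of the nullglob and brace-expansion points above (the directory p and file a.jpg are made-up examples):
shopt -s nullglob
mkdir -p p && touch p/a.jpg
echo p/*.{jpg,png}    # prints "p/a.jpg": the unmatched p/*.png expands to nothing
shopt -u nullglob
echo p/*.{jpg,png}    # prints "p/a.jpg p/*.png": the unmatched pattern stays literal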
How about a one-liner?
dry-run
find -name \*.jpg -or -name \*.png | xargs -I xxx echo "xxx =>" xxx.pdf
run
find -name \*.jpg -or -name \*.png | xargs -I xxx convert xxx xxx.pdf
help
-name match name
-or logical or => both jpg and png
xargs maps each input line into a name to execute a command on
-I select a replacement string; it works like {} in find
NOTE
instead of $(pwd), which is a command substitution, you can use the variable $PWD
xxx maps to a name, and xxx.pdf still has the matched extension found by find, which means filename.png becomes filename.png.pdf. If this is not desired, you can strip it with sed
to run the convert commands in parallel you can use -P 0 with xargs -- see xargs --help
With sed to remove extensions
dry-run
find -name \*.jpg -or -name \*.png | sed 's/\.\(png\|jpg\)$//' | xargs -I xxx echo "xxx =>" xxx.pdf
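Putting the pieces together, one possible parallel "run" form (a sketch assuming GNU find and xargs; it keeps the original extension for the source file and strips it only for the output name) would be:
# NUL-delimited to survive odd filenames; -P 4 runs four conversions at once (-P 0 means "as many as possible")
find . \( -name '*.jpg' -o -name '*.png' \) -print0 |
    xargs -0 -P 4 -I xxx sh -c 'convert "$0" "${0%.*}.pdf"' xxx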
@shawn Your solution works; just as I stated in the comments, I am too stupid to name the resulting pdf properly (after the folder name) and save it in the script caller's directory. Nevertheless, it solves my case-insensitive jpg, jpeg, png problems just fine.
Here is shawn's solution:
#!/bin/bash
# enable recursive globs
shopt -s globstar nocaseglob extglob
for dir in **/*/; do
    printf "Converting (jpg|jpeg|png) in %s\n" "$dir"
    convert "$dir"/*.@(jpg|jpeg|png) "$dir/out.pdf"
done
@jhnc Your solution works out of the box; it does exactly what I intended, and I really like calling functions, or even standalone scripts, to increase complexity. One drawback is that I cannot Ctrl-C the process; is that because it is threaded, or because it runs in a subshell? I think you were missing an exit statement at the end of the function; it never stopped.
#!/bin/bash
do_convert()(
    shopt -s nullglob
    for dir in "$@"; do
        files=("$dir"/*.{jpg,JPG,png,PNG,jpeg,JPEG})
        if [[ -z $files ]]; then
            echo 1>&2 "no suitable files in $dir"
            continue
        fi
        echo "Converting $dir"
        convert "${files[@]}" "$dir.pdf"
    done
    exit
)
export -f do_convert
pwd
echo "Will convert in the following order:"
find . -type d
# find . -type d -exec bash -c 'do_convert {}' \;
find . -type d -exec bash -c 'do_convert "$@"' -- {} \+
To everyone else: it's already after midnight again. I guess this is a trivial question for you guys, and I am very grateful for ALL your answers; I didn't have the time to try everything.
I find Linux bash very challenging.
A lot of ways to skin this cat. My thought is:
for F in $(find . -type f -print); do
    TYPE=$(file -b --mime-type "$F")
    if [ "$TYPE" = image/png ]; then
        : ## do png conversion here
    elif [ "$TYPE" = image/jpeg ]; then
        : ## do jpg conversion here
    fi
done
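If the file names may contain spaces, a more defensive variant of the same idea (the convert calls are only illustrative placeholders) could be:
find . -type f -print0 | while IFS= read -r -d '' F; do
    TYPE=$(file -b --mime-type "$F")
    case "$TYPE" in
        image/png)  convert "$F" "${F%.*}.pdf" ;;   # png conversion here
        image/jpeg) convert "$F" "${F%.*}.pdf" ;;   # jpg conversion here
    esac
done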

Linux multiple FIND statements and IF condition / bash

I'm trying to write a script that checks that there is no file whose name contains the parameter $1 at any position. If the statement is true (there is no such file), the script should create a directory with the same name as parameter $1, move the files from the symlink current to the newly created directory, and after that find and remove all files that are not in /current/ and contain #tmp or numbers in their name. I've written something like the code below, but it's not working and I don't know why. Can I get any help? :)
if[-ne $1]; then mkdir $1 && mv /current/* data/data2/data3/$1
find /home/ -not -path /current/ -and -not -newer /current/ -and -name *#tmp" -exec rm -r {} \; -name "*[0-9]*" -exec rm -r {} \;
else
echo "There is this tag already "
fi
If you want to check for a non-existing directory, you can use this:
if [ ! -d "$1" ]
instead of
if [ -ne $1 ]
Update: the previous command will only look for a matching folder name. If you want to search for folders containing your parameter, you could use something like this:
if [[ $(find . -maxdepth 1 -type d -name "*${1}*") = "" ]]; then
    # ... commands when your param is not part of any dir ...
fi
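Putting that check back into the original script, a rough sketch (the paths /current/ and data/data2/data3/ are taken from the question, and the cleanup find is left as a placeholder) might look like:
#!/bin/bash
if [[ $(find . -maxdepth 1 -type d -name "*${1}*") = "" ]]; then
    mkdir "$1" && mv /current/* "data/data2/data3/$1"
    # removal of the old '#tmp' / numbered files outside /current/ would go here
else
    echo "There is this tag already"
fi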

How to rename multiple files at once

I have lots of files, directories and sub-directories on my file system.
For example:
/path/to/file/test-poster.jpg
/anotherpath/my-poster.jpg
/tuxisthebest/ohyes/path/exm/bold-poster.jpg
I want to rename all file names from *-poster.jpg to folder.jpg.
I have tried with sed and awk with no success.
A little help?
You can do it with find:
find -name "*poster.jpg" -exec sh -c 'mv "$0" "${0%/*}/folder.jpg"' '{}' \;
Explanation
Here, for each filename matched, find executes:
sh -c 'mv "$0" "${0%/*}/folder.jpg"' '{}'
Where '{}' is the filename passed as an argument to the command_string:
mv "$0" "${0%/*}/folder.jpg"
So, at the end, $0 will have the filename.
Finally, ${0%/*}/folder.jpg expands to the path of the old filename and adds /folder.jpg.
Example
Notice I'm replacing mv with echo
$ find -name "*poster.jpg" -exec sh -c 'echo "$0" "${0%/*}/folder.jpg"' '{}' \;
./anotherpath/my-poster.jpg ./anotherpath/folder.jpg
./path/to/file/test-poster.jpg ./path/to/file/folder.jpg
./tuxisthebest/ohyes/path/exm/bold-poster.jpg ./tuxisthebest/ohyes/path/exm/folder.jpg
Try this script; it should rename all the files as required.
for i in $(find . -name "*-poster.jpg"); do folder=$(echo "$i" | awk -F"-poster.jpg" '{print $1}'); mv -iv "$i" "$folder.folder.jpg"; done
You can replace . with the directory these files are placed in, in the find . -name "*-poster.jpg" part of the script. Let me know if it works for you.
You can try it like this:
find -name '*poster*' -type f -exec sh -c 'mv "{}" "$(dirname "{}")"/folder.jpg' \;
find all files whose names contain poster == find -name '*poster*' -type f
take the directory path of each file with dirname and append "folder.jpg" to it, then move the file there == -exec sh -c 'mv "{}" "$(dirname "{}")"/folder.jpg' \;

Unix bash: find directories with 2 explicit file extensions

I am trying to create a small bash script that essentially looks through a directory that includes hundreds of subdirectories. SOME of these subdirectories include a textfile.txt and an htmlfile.html, where the names textfile and htmlfile are variable.
I only really care about subdirectories that have both the .txt and the .html; all other subdirectories can be ignored.
I then want to list all the .html files and .txt files that are in the same subdirectory.
This seems like a pretty simple issue to solve, but I am at a loss. All I can really get working is a line of code that outputs subdirectories that have either an .html file or a .txt file, with no association with the actual subdirectory they are in, and I am pretty new at bash scripting so I can't go any further.
#!/bin/bash
files="$(find ~/file/ -type f -name '*.txt' -or -name '*.html')"
for file in $files
do
echo $file
done
The following find command checks every subdirectory and, if it has both html and txt files, lists all of them:
find . -type d -exec env d={} bash -c 'ls "$d"/*.html &>/dev/null && ls "$d"/*.txt &>/dev/null && ls "$d/"*.{html,txt}' \;
Explanation:
find . -type d
This looks for all subdirectories of the current directory.
-exec env d={} bash -c '...' \;
This sets the environment variable d to the value of the found subdirectory and then executes the bash command that is contained within the single quotes (see below).
ls "$d"/*.html &>/dev/null && ls "$d"/*.txt &>/dev/null && ls "$d/"*.{html,txt}
This is the bash command that is executed. It consists of three statements and-ed together. The first checks to see if directory d has any html files. If so, the second statement runs and it checks to see if there are any txt files. If so, the last statement is executed and it lists all html and txt files in the directory d.
This command is safe for all file and directory names containing spaces, tabs, or other difficult characters.
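A variant of the same idea that avoids parsing ls output (just a sketch; it relies on nullglob and bash arrays inside the inner shell) would be:
find . -type d -exec env d={} bash -c '
    shopt -s nullglob
    h=("$d"/*.html); t=("$d"/*.txt)
    (( ${#h[@]} && ${#t[@]} )) && printf "%s\n" "${h[@]}" "${t[@]}"
' \;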
You could do it by searching recursively with the globstar option:
shopt -s globstar
for file in **; do
    if [[ -d $file ]]; then
        for sub_file in "$file"/*; do
            case "$sub_file" in
                *.html) html=1 ;;
                *.txt)  txt=1 ;;
            esac
        done
        [[ $html && $txt ]] && echo "$file"
        html=""
        txt=""
    fi
done
You can make use of -o
#!/bin/bash
files=$(find ~/file/ -type f \( -name '*.txt' -o -name '*.html' \))
for file in $files
do
    echo "$file"
done
#!/bin/bash
#A quick peek into a dir to see if there's at least one file that matches pattern
dir_has_file() { dir="$1"; pattern="$2";
    [ -n "$(find "$dir" -maxdepth 1 -type f -name "$pattern" -print -quit)" ]
}
# Assumes there are no newline characters in the filenames, but will behave correctly with subdirectories that match *.html or *.txt
find "$1" -type d |
while IFS= read -r d
do
    dir_has_file "$d" '*.txt' &&
    dir_has_file "$d" '*.html' &&
    # Now print all the matching files
    find "$d" -maxdepth 1 -type f \( -name '*.txt' -o -name '*.html' \)
done
This script takes the root directory to look into as the first argument ($1).
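For example, if the script were saved as pairs.sh (the name is only for illustration), it could be run as:
bash pairs.sh ~/file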
The test command is what you need to check for the existence of each file in each of the subdirs:
find . -type d -exec sh -c "if test -f {}/$file1 -a -f {}/$file2 ; then ls {}/*.{txt,html} ; fi" \;
where $file1 and $file2 are the two .txt and .html files you are looking for.

Include folder name in renaming a file in Linux

I've already used the command below to rename the files in multiple directories and change JPG to jpg, so I have consistency.
find . -name '*.jpg' -exec sh -c 'mv "$0" "${0%.JPG}.jpg"' {} \;
Do you have any idea how to change that to include the folder name in the name of the file?
I am executing that in a folder that contains about 2000 folders (SKUs) of products ... and inside every SKU folder there are 9 images: 1.jpg, 2.jpg ... 9.jpg.
So the bottom line is I have 2000 sets of images named 1.jpg, 2.jpg ... 9.jpg. I need those file names to be unique, for example:
folder-name-1.jpg ... folder-name-2.jpg ... and so on, in every folder.
Any help will be appreciated.
For example, I can do it as follows:
$ find . -iname '*.jpg' | while read fn; do name=$(basename "$fn") ; dir=$(dirname "$fn") ; mv "$fn" "$dir/$(basename "$dir")-$name" ;done
./lib/bukovina/version.jpg ./lib/bukovina/bukovina-version.jpg
./lib/bukovina.jpg ./lib/lib-bukovina.jpg
You can use this find one-liner:
find . -name '*.jpg' -execdir \
    bash -c 'd="${PWD##*/}"; f="${1#./}"; [[ "$f" != "$d-"* ]] && mv "$f" "./$d-$f"' - '{}' \;
This command uses a safe approach: it checks whether the image name is not already prefixed with the current directory name (GNU find's -execdir passes names as ./file, so the ./ prefix is stripped first). You can run it multiple times and image names won't be renamed again after the first run.
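To preview the renames before doing them, the same command can be run with echo in front of mv as a simple dry-run:
find . -name '*.jpg' -execdir \
    bash -c 'd="${PWD##*/}"; f="${1#./}"; [[ "$f" != "$d-"* ]] && echo mv "$f" "./$d-$f"' - '{}' \;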
To get the folder name of a file you can do $(basename $(dirname ${FILE})), where ${FILE} is a path that may be relative but must contain at least one folder before the file name in it. This should not be a problem with find. If it is, just run it from one directory up.
find . -name '*.jpg' -exec sh -c 'mv "$0" "$(dirname "$0")/$(basename "$(dirname "$0")")-$(basename "$0")"' {} \;
Or, if you have JPEGs in your current directory:
find ../<dirname> -name '*.jpg' -exec sh -c 'mv "$0" "$(dirname "$0")/$(basename "$(dirname "$0")")-$(basename "$0")"' {} \;
