How to pass source and destination paths to rsync as variables [duplicate] - linux

This question already has answers here:
How can I use a variable that contains a space?
(2 answers)
Tilde expansion in quotes
(3 answers)
Closed 3 years ago.
I'm writing a bash script to copy files from my local machine to a remote server using rsync. There is a space in the folder name "local folder" on my local machine; I don't know if this makes any difference.
If I write it as plain text it works:
rsync -arv --filter=":- .gitignore" /local\ folder/path/ user@123.4.5.6:~/
I want to put the source and destination paths as variables, but I can't get it to work. The first thing I tried is this:
SOURCE="/local\ folder/path/"
DESTINATION="user@123.4.5.6:~/"
rsync -arv --filter=":- .gitignore" $SOURCE $DESTINATION
I see this error:
rsync: change_dir "/local folder/path//~" failed: No such file or directory (2)
It seems to be a) running source and destination together, and b) not seeing the server address.
I've tried a number of variations, including braces:
rsync -arv --filter=":- .gitignore" ${SOURCE} ${DESTINATION}
Using quotes:
rsync -arv --filter=":- .gitignore" "${SOURCE}" "${DESTINATION}"
and putting the options into an array:
OPTIONS=( --protect-args -arv --filter=":- .gitignore" )
rsync "${OPTIONS[@]}" $SOURCE $DESTINATION
I have also tried this after checking it in https://www.shellcheck.net/
#!/bin/bash
SOURCE="/folder name/path"
DESTINATION=user@123.4.5.6:~/
rsync -arv --filter=":- .gitignore" "$SOURCE" $DESTINATION
and also:
#!/bin/bash
SOURCE="/folder\ name/path"
DESTINATION=user@123.4.5.6:~/
rsync -arv --filter=":- .gitignore" "$SOURCE" $DESTINATION
Each time I get the same error. What simple thing am I missing here? I've looked at various examples including:
https://www.redpill-linpro.com/sysadvent/2015/12/03/rsync-tricks.html
https://serverfault.com/questions/354112/rsync-and-bash-command-substitution
The space isn't the issue, or at least not the only issue: I've tried using double quotes around my variable names as suggested here.
Thanks!

You need to quote your variables, and drop the backslash from the value: inside double quotes a backslash before a space is kept as a literal character, so "/local\ folder/path/" names a directory literally called "local\ folder". Try:
SOURCE="/local folder/path/"
DESTINATION="user@123.4.5.6:~/"
rsync -arv --filter=":- .gitignore" "$SOURCE" "$DESTINATION"
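To see why the unquoted version fails, you can count how many words each expansion produces. This is a minimal sketch: count_args is a throwaway helper, and the paths are the asker's placeholders.

```shell
#!/bin/bash
# Quotes alone preserve the space; no backslash is needed inside the value.
SOURCE="/local folder/path/"
DESTINATION="user@123.4.5.6:~/"

# Throwaway helper: prints how many arguments it received.
count_args() { echo $#; }

count_args $SOURCE     # unquoted: word-split into "/local" and "folder/path/"
count_args "$SOURCE"   # quoted: stays a single argument
```

The unquoted call prints 2 and the quoted call prints 1, which is exactly the splitting rsync complained about.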


star wildcard in bash [duplicate]

This question already has answers here:
Rename multiple files in shell [duplicate]
(4 answers)
Closed 4 years ago.
I've got a small problem with my bash script. I'm trying to rename every file with a txt extension in the current directory to a text extension. For example, 1.txt becomes 1.text.
My script looks like this now:
#!/bin/bash
FILES=`ls /home/name/*.txt`
NAME=*.txt
RENAME=*.text
for file in FILES
do
mv $NAME $RENAME
done
I've tried every combination of single quotes, double quotes and backticks, and I get errors every time.
How do I handle the "*" wildcard in bash?
Thanks.
That's not at all how you do that.
#!/bin/bash
shopt -s nullglob
OLD=.txt
NEW=.text
FILES=(/home/name/*"$OLD")
for file in "${FILES[@]}"
do
mv "$file" "${file%$OLD}$NEW"
done
There are a number of issues with your script. Firstly, you shouldn't run ls and attempt to store its output like that. If you want to iterate over those files, just do it in the loop:
for file in /home/name/*.txt
Now the shell is doing all the work for you, and as a bonus handling any kind of weird filenames that you might have.
In your example you were looping over the literal string "FILES", not the variable, but I guess that was just a typo.
The built-in way to change the filename is to use a parameter expansion to remove the old one, then concatenate with the new one:
old=txt
new=text
for file in /home/name/*"$old"
do
mv "$file" "${file%$old}$new"
done
If it is possible that there are no files matching the glob, then by default /home/name/*.txt will not be expanded, and your loop will run once with the literal pattern as $file. Then you have a couple of options:
use shopt -s nullglob so that /home/name/*.txt expands to null, and the loop is skipped
add an explicit check inside the loop to ensure that $file exists before trying to mv:
for file in /home/name/*"$old"
do
[ -e "$file" ] || continue
mv "$file" "${file%$old}$new"
done
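The ${file%...} expansion can be checked on its own, without touching any files:

```shell
#!/bin/bash
old=txt
new=text
file=/home/name/1.txt

# Strip the shortest match of $old from the end, then append $new.
echo "${file%$old}$new"   # prints /home/name/1.text
```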
You can use rename to rename the files. With the util-linux version of rename:
rename .txt .text /home/name/*.txt
(The Perl rename shipped by Debian/Ubuntu takes an expression instead: rename 's/\.txt$/.text/' /home/name/*.txt.)
And if you want to do this by looping, you can
for FILE in /data/tmp/*.txt; do
mv "${FILE}" "${FILE%.txt}.text"
done

Bash script can't find file in relative path directory [duplicate]

This question already has answers here:
When to wrap quotes around a shell variable?
(5 answers)
Closed 1 year ago.
I'm new to bash and I'm trying to create a script that finds an archive in a given directory; $1 is the name of the archive.
When the given path is ./1/ar.tgz the script works. But when path is ../data 1/01_text.tgz I have the following problem:
dirname: extra operand "1/01_text.tgz"
and then No such file or directory.
Here is my code fragment:
VAR=$1
DIR=$(dirname ${VAR})
cd $DIR
What am I doing wrong?
Ahmed's answer is right, but you also need to enclose VAR in double quotes. The correct code fragment is:
VAR=$1
DIR=$(dirname "$VAR")
cd "$DIR"
The space is causing the problem: cd $DIR gets expanded to cd ../data 1/01_text.tgz and cd doesn't know what to make of the third "argument". Add quotes around the directory: cd "$DIR".
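Both behaviours are easy to reproduce with the asker's path (a sketch; the directory need not exist for dirname to work):

```shell
#!/bin/bash
VAR="../data 1/01_text.tgz"

# Unquoted, dirname would receive two operands, "../data" and
# "1/01_text.tgz", producing the "extra operand" error.
# Quoted, it receives one path and strips the filename:
DIR=$(dirname "$VAR")
echo "$DIR"   # prints: ../data 1
```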

Issues with nested for loop

From a central Linux box (RHEL 6.3) I'm attempting to push a set of zip files to a series of other Linux hosts and then unzip those files on each target host. My central host is hpdb1.
#Push zip files to other hosts
for i in {2..8}; do
scp -r /software/stage/*.zip root@hpdb$i:/software/stage
done
#Unzip files to /software/stage
for i in {2..8}; do
ssh hpdb$i "for f in /software/stage/*.zip; do unzip /software/stage/"$f"; done";
done
The first for loop to push the files works fine. However, when running the nested for loop I get the following error:
[root@hpdb1 ~]# for i in {2..8}; do ssh hpdb$i "for f in /software/stage/*.zip; do unzip /software/stage/"$f"; done"; done
unzip: cannot find or open /software/stage/, /software/stage/.zip or /software/stage/.ZIP.
unzip: cannot find or open /software/stage/, /software/stage/.zip or /software/stage/.ZIP.
Looks like the $f variable is not getting interpreted. Any ideas?
Updated for answer
This code works.
for i in {2..7}; do
ssh hpdb$i 'for f in /software/stage/*.zip; do unzip "$f" -d /software/stage; done';
done
The problem may be the nested double-quotes. You probably want the outer quotes to be single-quotes so that the embedded $s are not expanded before getting sent to the remote server.
My first thought is to use the other quote character, like so:
for i in {2..8}; do ssh hpdb$i 'for f in /software/stage/*.zip; do unzip "$f"; done'; done
(Note that $f already holds the full path, so there is no need to prefix it with /software/stage/ again.)
Although you could use different quotes as suggested by another answer, this will alter variable expansion behaviour and may be undesirable in some cases.
You can instead escape the enclosed quotes by pre-pending them with a backslash. Escape the dollar sign too, so that $f is expanded by the remote shell rather than the local one:
for i in {2..8}; do
ssh hpdb$i "for f in /software/stage/*.zip; do unzip \"\$f\"; done";
done
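The difference in expansion timing is easy to demonstrate locally with echo standing in for ssh (a sketch; no remote host involved):

```shell
#!/bin/bash
f="expanded-locally"

# Double quotes: the local shell substitutes $f before the string is used,
# so the remote end would see a fixed value (empty, in the asker's case).
echo "for f in /software/stage/*.zip; do unzip \"$f\"; done"

# Single quotes: $f is passed through literally, so the remote shell
# expands it on each iteration of its own loop.
echo 'for f in /software/stage/*.zip; do unzip "$f"; done'
```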

How to find,copy and rename files in linux?

I am trying to find all files in a directory and sub-directories and then copy them to a different directory. However some of them have the same name, so I need to copy the files over and then if there are two files have the same name, rename one of those files.
So far I have managed to copy all found files with a unique name over using:
#!/bin/bash
if [ ! -e $2 ] ; then
mkdir $2
echo "Directory created"
fi
if [ ! -e $1 ] ; then
echo "image source does not exists"
fi
find $1 -name IMG_****.JPG -exec cp {} $2 \;
However, I now need some sort of if statement to figure out if a file has the same name as another file that has been copied.
Since you are on Linux, you are probably using cp from coreutils. If that is the case, let it do the backup for you by using cp --backup=t.
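For example, a sketch run in a throwaway directory (cp --backup=t is a GNU coreutils option, so this needs GNU cp):

```shell
#!/bin/bash
# Two source trees containing a file with the same name but different contents.
tmp=$(mktemp -d)
mkdir "$tmp/a" "$tmp/b" "$tmp/dest"
echo first  > "$tmp/a/IMG_0001.JPG"
echo second > "$tmp/b/IMG_0001.JPG"

# --backup=t makes numbered backups (.~1~, .~2~, ...) instead of overwriting.
cp --backup=t "$tmp/a/IMG_0001.JPG" "$tmp/dest/"
cp --backup=t "$tmp/b/IMG_0001.JPG" "$tmp/dest/"

ls "$tmp/dest"   # IMG_0001.JPG and IMG_0001.JPG.~1~
```

The second copy wins the plain name and the first copy is preserved as IMG_0001.JPG.~1~, which is exactly the collision handling the question asks for.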
Try this approach: put the list of files in a variable and copy each file looking if the copy operation succeeds. If not, try a different name.
In code:
FILES=$(find "$1" -name 'IMG_*.JPG')
for FILE in $FILES; do
cp -n "$FILE" destination
# Check return error of latest command (i.e. cp)
# through the $? variable and, in case
# choose a different name for the destination
done
Inside the for statement, you can also put some incremental integer to try different names incrementally (e.g., name_1, name_2 and so on, until the cp command succeeds).
You can do:
shopt -s globstar   # ** only recurses with bash's globstar option enabled
for file in "$1"/**/IMG_*.jpg ; do
target=$2/$(basename "$file")
SUFF=0
while [[ -f "$target$SUFF" ]] ; do
(( SUFF++ ))
done
cp "$file" "$target$SUFF"
done
in your script in place of the find command to append integer suffixes to identically-named files
You can use rsync with the following switches for more control
rsync --backup --backup-dir=DIR --suffix=SUFFIX -az <source dire> <destination dir>
Here (from man page)
-b, --backup
With this option, preexisting destination files are renamed as each file is transferred or deleted. You can control where the backup file goes and what (if any) suffix gets appended using the --backup-dir and --suffix options.
--backup-dir=DIR
In combination with the --backup option, this tells rsync to store all backups in the specified directory on the receiving side. This can be used for incremental backups. You can additionally specify a backup suffix using the --suffix option (otherwise the files backed up in the specified directory will keep their original filenames).
--suffix=SUFFIX
This option allows you to override the default backup suffix used with the --backup (-b) option. The default suffix is a ~ if no --backup-dir was specified, otherwise it is an empty string.
You can use rsync to sync two folders on the local file system or with a remote file system; it even works over an ssh connection.
rsync is amazingly powerful. See the man page for all the options.

old rsync and spaces in filenames

Source directory is determined like so:
SHOW=${PWD##*/}
SRC=wells@server.com:"/mnt/bigfish/video/TV/${SHOW}/"
So it comes out something like:
wells@server.com:/mnt/bigfish/video/TV/The Name Of the Show With Spaces/
Then trying to run rsync like so:
rsync -avz -e ssh "${SRC}" .
But it tells me that ""/mnt/bigfish/video/TV/The" is not a directory, ""/mnt/bigfish/video/TV/Name" is not a directory, etc, for however many space-delimited words are in the name of the source directory.
How can I rectify this egregiously annoying issue?
UPDATE I'm running this on OS 10.6, and I ended up string-replacing spaces with escaped spaces like so:
SRC=wells@kittenfactory.com:"/mnt/bigfish/video/TV/${SHOW// /\ }/"
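That replacement can be verified without running rsync. A sketch with a placeholder host; the backslash is doubled here to make the inserted character explicit:

```shell
#!/bin/bash
SHOW="The Name Of the Show With Spaces"

# Replace every space with backslash-space before handing the path to rsync,
# so the remote shell does not word-split the directory name.
SRC="server:/mnt/bigfish/video/TV/${SHOW// /\\ }/"
echo "$SRC"   # server:/mnt/bigfish/video/TV/The\ Name\ Of\ the\ Show\ With\ Spaces/
```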
From the rsync manual:
-s, --protect-args
This option sends all filenames and most options to the remote rsync without allowing the remote shell to interpret them. This means that spaces are not split in names, and any non-wildcard special characters are not translated (such as ~, $, ;, &, etc.). Wildcards are expanded on the remote host by rsync (instead of the shell doing it).
As your question is about OS X: according to the Apple rsync manual, you can accomplish this using either single quotes with escaped spaces, or the ? wildcard:
rsync -av host:'file\ name\ with\ spaces' /dest
rsync -av host:file?name?with?spaces /dest
Just had to do this, and using single quotes works perfectly:
rsync -r --partial --progress --exclude=".cvs" --exclude=".svn" --exclude=".git" --rsh=ssh root@datakeep.local:'/volume1/devel/__To\ SORT/___XXXXX\ Saves\ 2011-04' ./Saves2011
This works:
rsync -avz -e ssh "wells#server.com:\"/mnt/bigfish/video/TV/${SHOW}/\""
So set:
SRC=wells#server.com:\"/mnt/bigfish/video/TV/${SHOW}/\"
At least, here on Debian it works like a charm, no OS 10 available to test with here.
You can do this on OSX if you're dealing with arguments in a script. Note that tr only translates single characters one-for-one, so it cannot insert a two-character escape; sed is needed to put a backslash before each space:
ESCAPED_SRC="$(printf '%s\n' "$SRC" | sed 's/ /\\ /g')"
ESCAPED_DEST="$(printf '%s\n' "$DEST" | sed 's/ /\\ /g')"
rsync -ravP "$ESCAPED_SRC" "$ESCAPED_DEST"
