Loop in shell script not working on remote server - linux

The code below tries to ssh from my local server to a remote server and run some commands:
ssh root@$remoteip 'bash -s' <<END3
gcdadirs=`strings binary | egrep '.gcda$'`
for dir in ${gcdadirs[@]}; do
directory=$(dirname ${dir})
echo $dir >> dirstr.txt
mkdir -p $directory
chown $root:$root $directory
chmod 777 $directory
done
END3
The above creates a directory structure on the remote server, which works fine.
I want to tar up the same directory structure, so I'm using the same logic as above:
ssh root@$remoteip 'bash -s' <<END3
touch emptyfile
tar -cf gcda.tar emptyfile
gcdadirs=`strings binary | egrep '.gcda$'`
for dir in ${gcdadirs[@]}; do
tar -rf gcda.tar $dir
done
END3
The above piece of code should create a tar containing all the directories returned by the for loop. I tried the logic by copying the code to the remote server and running it there, and it worked. But if I connect via ssh from my local server to the remote server and try it, it does not enter the for loop: nothing is appended to the tar file created with the empty file on the second line.

Try <<'END3'
Note the quotes around END3: they prevent shell substitutions inside the here-document. You want the $ signs to be transferred to the other side of the ssh connection, not interpreted locally. The same goes for the backticks.
Extracted from the comments as the accepted answer. Posting as community wiki
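For example, with the delimiter quoted the second snippet becomes (nothing else changes; $remoteip on the ssh line still expands locally because it is outside the here-document):
ssh root@$remoteip 'bash -s' <<'END3'
touch emptyfile
tar -cf gcda.tar emptyfile
gcdadirs=`strings binary | egrep '.gcda$'`
for dir in ${gcdadirs[@]}; do
tar -rf gcda.tar $dir
done
END3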

Related

How to make Ubuntu bash script wait on password input when using scp command

I want to run a script that deletes files on my computer and copies over another file from a connected host using the scp command.
Here is the script:
#!/bin/bash
echo "Moving Production Folder Over"
cd ~
sudo rm -r Production
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production
I would want to cd into the Production directory after it is copied over. How can I go about this? Thanks!
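One way to sequence this: let scp finish (it blocks, and prompts for the password itself if no key is set up), then cd only if the copy succeeded. A minimal sketch using the paths from the question; note that a cd inside a script changes the script's own working directory, not that of the terminal you launched it from:
#!/bin/bash
echo "Moving Production Folder Over"
cd ~ || exit 1
sudo rm -r Production
# scp returns only after the transfer (and its password prompt) completes
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production &&
cd /home/user2/Production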

How can I copy file from local server to remote with creating directories which absent via SSH?

I can copy file via SSH by using SCP like this:
cd /root/dir1/dir2/
scp filename root@192.168.0.19:$PWD/
But if some directories are absent on the remote server, for example the remote server has only /root/ and doesn't have dir1 and dir2, then I can't do it and I get an error.
How can I copy a file over SSH while creating any missing directories, and what is the easiest way to do it?
By "easiest" I mean that the current path should be obtained only from $PWD, i.e. the script must be easily movable without any changes.
This command will do it:
rsync -ahHv --rsync-path="mkdir -p $PWD && rsync" filename -e "ssh -v" root@192.168.0.19:"$PWD/"
I can make the same directories on the remote server and copy the file to it via SSH using SCP like this:
cd /root/dir1/dir2/
ssh -n root@192.168.0.19 "mkdir -p '$PWD'"
scp -p filename root@192.168.0.19:$PWD/
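A minimal sketch that wraps that mkdir-then-scp pair in a tiny helper (the function name copy_with_dirs is made up here); it stays portable because only $PWD is used:
copy_with_dirs() {
    # $1 = user@host, $2 = file to copy; recreate the current path remotely first
    ssh -n "$1" "mkdir -p '$PWD'" && scp -p "$2" "$1:$PWD/"
}
# usage, from /root/dir1/dir2/:
# copy_with_dirs root@192.168.0.19 filename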

mput Not Transferring All Files During FTP Transfer

I'm having issues with my Unix FTP script...
It's only transferring the first three files in the directory that I'm local cd'ing into during the FTP session.
Here's the bash script that I'm using:
#!/bin/sh
YMD=$(date +%Y%m%d)
HOST='***'
USER='***'
PASSWD=***
FILE=*.png
RUNHR=19
ftp -inv ${HOST} <<EOF
quote USER ${USER}
quote PASS ${PASSWD}
cd /models/rtma/t2m/${YMD}/${RUNHR}/
mkdir /models/rtma/t2m/${YMD}/
mkdir /models/rtma/t2m/${YMD}/${RUNHR}/
lcd /home/aaron/grads/syndicated/rtma/t2m/${YMD}/${RUNHR}Z/
binary
prompt
mput ${FILE}
quit
EOF
exit 0
Any ideas?
I encountered the same issue. I had to transfer 400K files, but mput * or mput *.pdf would not move all the files in one go.
I tried a timeout: failed.
I tried -r (recursive): failed.
I tried increasing the data/control timeout in IIS: failed.
I tried -i and prompt in the script: failed.
Finally I used portable FileZilla, connected to the source, and transferred all the files.

Automating mkdir, chmod and scp across all the servers

This seems like a simple issue, but I'm not able to figure it out. I am trying to run a couple of small scripts on a server and I'm having issues with them. I have an allhosts file containing the list of servers, in the same location as the .sh files.
Script to create a directory structure across all 20 servers with 777 permissions:
#!/bin/bash
for q in `cat allhosts`
do
ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/"
chmod -R 777 $q "/opt/acd/hgf/tom/hanks/" >/dev/null 2>&1
done
In the above script, it only creates the directory paths and does not change the permissions on them. I tried running the chmod command in a separate script, but it was no use.
Script to scp the contents of hanks to the hanks folder created on the new servers:
#!/bin/bash
for q in `cat allhosts`
do
scp /opt/acd/hgf/tom/hanks/* $q:/opt/acd/hgf/tom/hanks/ >/dev/null 2>&1
done
In this script too, when I run it, it doesn't copy anything to any of the servers.
I know this is a very small issue, but please check and let me know where I am going wrong. Thanks in advance.
The first script is failing because it is running the chmod on the local machine. You should run it on the remote machine via ssh - you could combine this with the other ssh invocation as follows:
ssh $q "mkdir -p /opt/acd/hgf/tom/hanks/ ; chmod -R 777 /opt/acd/hgf/tom/hanks/"
I'd guess the second script is failing because the first script isn't setting permissions; it looks okay.
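Putting it together, a minimal sketch of both steps in one loop (assuming allhosts lists one host per line and key-based ssh authentication is already set up):
#!/bin/bash
while read -r q; do
    # create the path and set permissions on the remote host in a single ssh call
    # (-n stops ssh from swallowing the rest of allhosts from stdin)
    ssh -n "$q" "mkdir -p /opt/acd/hgf/tom/hanks/ && chmod -R 777 /opt/acd/hgf/tom/hanks/"
    # then copy the local contents across
    scp /opt/acd/hgf/tom/hanks/* "$q:/opt/acd/hgf/tom/hanks/"
done < allhosts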

Copying files and dirs on remote server while excluding some of them

Server 1 is connected to Server 2 via SSH.
We know this:
I can execute a command such as
ssh server2 "cp -rv /var/www /tmp"
which will copy the entire /var/www dir to /tmp. However, inside /var/www we have the following structure (sample ls output below):
$ ls
/web1
/web2
/web3
file1.php
file2.php
file3.php
How can I execute a cp command that will exclude /web1, /web3, file1.php and file3.php? (Obviously just copying web2 and file2 is not an option, since there are significantly more files than just these 6.)
Note: I am using this to backup Server2 prior to RSYNCing from Server1.
The first two posters both have good suggestions about rsync. Here's a more complete outline of the process.
(1) You want to back up server 2 before you sync from server 1, so let's do that with rsync. Here's the command as seen from server 1 (assuming it has access to server 2):
ssh user@server2 "rsync $RSYNC_OPTS /var/www/ /path/to/backup"
(2) With server 2 backed up, let's now sync from server 1 (again, as seen from server 1)
rsync $RSYNC_OPTS /path/to/www/ user@server2:/var/www/
As long as you use sane RSYNC_OPTS, the backup and sync should both be reasonable. Richard had a reasonable suggestion for the options:
RSYNC_OPTS="--exclude-from rsync-exclude.txt --stats -avz --numeric-ids -e ssh"
If you want an accurate reproduction, I'd recommend --delete or --delete-after as well. Be sure to look up the details of any options you're unfamiliar with.
For this you should really be using rsync.
I tend to use an rsync-exclude.txt file to specify what I don't want, as it's more future-proof.
/public_ftp/.ftpquota
/tmp
/var/local/backups/rsyncs
/backup/rsync
/proc
/dev
So a command could be:
rsync --exclude-from rsync-exclude.txt --stats -avz -e ssh \
--numeric-ids /syncfrom/dir user@example.com:/backup/sync-to-dir
Edit:
In the case of a local server you can still use rsync, however you could also use tar and exclude what you don't want.
(cd dir1;tar --exclude 'web2/*' -cf -) | (cd dir2; tar -xvf -)
or
find dir1 dir2 >exclude-files
(cd dir1;tar --exclude-from exclude-files -cf -) | (cd dir2; tar -xvf -)
I do the same thing when deploying new releases to my webserver. Is it possible for you to use rsync over ssh? rsync lets you specify an --exclude option, with the dirs/files to skip given either on the command line or via a file. Here's a pretty good writeup that I've used in the past:
http://articles.slicehost.com/2007/10/10/rsync-exclude-files-and-folders
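A minimal sketch of that --exclude form, mirroring the cp command from the question (the /tmp/www-backup destination is just a placeholder):
ssh server2 "rsync -av --exclude 'web1/' --exclude 'web3/' --exclude 'file1.php' --exclude 'file3.php' /var/www/ /tmp/www-backup/"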
Since what you want to do is "copy all files except those", following your example you could do this:
ssh server2 "cp -rv /var/www/!(web1|web3|file1.php|file3.php) /tmp"
But remember that this is a very ugly way to back up your server :p
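Note that the !( ) pattern needs bash's extglob option, which is normally off in the non-interactive shell that ssh starts, so a sketch that enables it explicitly could look like:
ssh server2 "bash -O extglob -c 'cp -rv /var/www/!(web1|web3|file1.php|file3.php) /tmp'"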
