Script to transfer a folder through SSH properly - linux

I want to transfer a folder over SSH from a client to a server and from the server back to the client. The name of the folder is always the same. Before the transfer I need to create a backup of the folder I am about to overwrite.
So I've created two scripts, one for the download (server to client) and one for the upload (client to server).
down_src.sh
#!/bin/bash
rm -rf ~/Projects/GeoPump/project_bk
mv ~/Projects/GeoPump/project ~/Projects/GeoPump/project_bk
rsync --delete -azvv --rsync-path="mkdir -p ~/Projects/GeoPump/ && rsync" -e ssh pi@25.30.116.202:~/Projects/GeoPump/project ~/Projects/GeoPump/
up_src.sh
#!/bin/bash
ssh pi@25.30.116.202 'rm -rf ~/Projects/GeoPump/project_bk'
ssh pi@25.30.116.202 'mv ~/Projects/GeoPump/project ~/Projects/GeoPump/project_bk'
rsync --delete -azvv --rsync-path="mkdir -p ~/Projects/GeoPump/ && rsync" -e ssh ~/Projects/GeoPump/project pi@25.30.116.202:~/Projects/GeoPump/
When I run up_src.sh, for example, I need to enter the server password three times:
andrea@andrea-GL552VW:~/Projects/GeoPump$ sudo ./up_src.sh
pi@25.30.116.202's password:
pi@25.30.116.202's password:
opening connection using: ssh -l pi 25.30.116.202 "mkdir -p ~/Projects/GeoPump/ && rsync" --server -vvlogDtprze.iLsfx --delete . "~/Projects/GeoPump/" (10 args)
pi@25.30.116.202's password:
sending incremental file list
Now my questions are:
Is this the correct way to do this kind of transfer?
Can anyone suggest the proper way to write these scripts without entering the password multiple times?

For the up_src.sh script to work, you will have to ensure that the user on 25.30.116.202 has the required permissions, without having to resort to ... && sudo rsync
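To stop the password prompt from appearing once per command, one option (a minimal sketch, assuming OpenSSH and that the scripts run as the user whose ~/.ssh/config this is) is to enable connection multiplexing, so the two ssh calls and the rsync in up_src.sh all reuse a single authenticated connection:
# ~/.ssh/config on the client: reuse one authenticated connection per host
Host 25.30.116.202
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 10m
With this in place the first command prompts for the password once and the remaining commands attach to the existing control socket; setting up public/private keys removes the prompt entirely.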

Related

How to make Ubuntu bash script wait on password input when using scp command

I want to run a script that deletes a file on the computer and copies over another file from a connected host using the scp command.
Here is the script:
#!/bin/bash
echo "Moving Production Folder Over"
cd ~
sudo rm -r Production
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production
I would want to cd into the Production directory after it is copied over. How can I go about this? Thanks!
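A sketch using the question's own paths (the mkdir is added so the final cd has a directory to enter, and the script name is hypothetical): a cd inside a script only changes the script's own shell, so source the script instead of executing it if you want the directory change to persist in your terminal; scp itself stops and waits for the password.
#!/bin/bash
# hypothetical name move_production.sh; run as:  source ./move_production.sh
echo "Moving Production Folder Over"
rm -rf ~/Production
mkdir -p /home/user2/Production
# scp blocks here and prompts for the remote password on the terminal
scp -r host@192.168.123.456:/home/user1/Documents/Production/file1 /home/user2/Production/
# this cd persists in the calling shell only because the script is sourced
cd /home/user2/Production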

SCP is creating two directories (one inside the other) with the same name, where it is supposed to create only one

This is the bash script I'm trying:
#!/usr/bin/bash
password="SECRET"
sshpass -p "$password" ssh root@destination_ip "mkdir -p /PVs"
sshpass -p "$password" scp -r /path/to/clear-nginx-deployment/ root@destination_ip:/PVs/
Basically, I'm trying to copy my local directory clear-nginx-deployment into the remote directory /PVs as it is.
The result I get on the destination machine, after running the script:
/PVs/clear-nginx-deployment/clear-nginx-deployment/content.html
The result which I expect:
/PVs/clear-nginx-deployment/content.html
Please help me get through this.
Thanks.
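One way around the extra nesting (a sketch that swaps scp for rsync, assuming rsync is installed on both machines) is to rely on rsync's trailing-slash rule, which copies the contents of the source directory, so the result no longer depends on what already exists under /PVs:
#!/usr/bin/bash
password="SECRET"
sshpass -p "$password" ssh root@destination_ip "mkdir -p /PVs"
# trailing slash on the source means "contents of", giving /PVs/clear-nginx-deployment/content.html
sshpass -p "$password" rsync -a -e ssh /path/to/clear-nginx-deployment/ root@destination_ip:/PVs/clear-nginx-deployment/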

combining ssh and scp command in shell script

Is there any way I can combine the following commands into one command? I do not want to log in each time for each command.
sshpass -p 'somepwd' ssh user@server "mkdir -p /home/user/test"
sshpass -p 'somepwd' scp file.sh user@server:/home/user/test
sshpass -p 'somepwd' scp /test/somefile.txt user@server:/home/user/test
sshpass -p 'somepwd' ssh user@server -C "cd /home/user/test;./file.sh"
I did check the answer for combining multiple commands when using ssh and scp; based on that I would still need three logins: one for the first ssh and mkdir, one for scp, and one for ssh and running the shell script.
Is there a better solution?
Use public/private keys instead of password authentication. Not only will this simplify the use of ssh, it is much more secure, especially after you disallow password authentication on the server you are connecting to. Using password authentication means you will get hacked, or your server has already been compromised and you don't know it yet. The rest of this answer assumes you have set up public/private keys.
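A minimal key setup looks like this (a sketch; user@server is the account from the question, and the key is created without a passphrase only for brevity):
# one-time, on the client
ssh-keygen -t ed25519
ssh-copy-id user@server
# from now on ssh/scp to user@server no longer ask for the account password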
I see you have files in /test. Don't put your work in the root directory; this invites security issues. Instead, work in your home directory unless you are experienced with setting up permissions properly.
Because file.sh is in your current directory (whatever that is) and you want a file from /test/, you cannot use rsync. rsync would be a good choice if all your files lived in the same directory.
Here is what we are left with; I have not messed with the location of /test/ because I don't know enough about the task:
ssh user@server "mkdir -p /home/user/test"
scp file.sh user@server:/home/user/test
scp /test/somefile.txt user@server:/home/user/test
ssh user@server -C "cd /home/user/test;./file.sh"
With GNU tar and ssh:
tar -c file.sh test/somefile.txt | sshpass -p 'somepwd' ssh user@server -C "tar -C / --transform 's|test/||;s|^|/home/user/test/|' --show-transformed-names -xv; cd /home/user/test; ./file.sh"
For more secure methods to pass the password with sshpass, see man sshpass.
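For illustration, a sketch of two of those modes (the file name ~/.sshpw is hypothetical): -e takes the password from the SSHPASS environment variable and -f reads the first line of a file, so the password does not appear on the command line itself.
export SSHPASS='somepwd'
sshpass -e ssh user@server "mkdir -p /home/user/test"
printf '%s\n' 'somepwd' > ~/.sshpw && chmod 600 ~/.sshpw
sshpass -f ~/.sshpw scp file.sh user@server:/home/user/test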

How to use lftp to transfer segmented files?

I want to transfer a file from my server to another one. The network between these servers isn't very good, so I want to use lftp to speed things up. My script is like this:
lftp -u user,password -e "set sftp:connect-program 'ssh -a -x -i /key'; mirror --use-pget=5 -i data.tar.gz -r -R /data/ /tmp; quit" sftp://**.**.**.**:22
I found that data.tar.gz wasn't segmented. But when I use the same approach to download a file, it works.
What should I do?
Segmented uploads are not implemented in lftp. If you have ssh access to the other server, log in there and use lftp to download the file instead. If there were many files, you could also upload different files in parallel using mirror's -P option.
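A sketch of the first suggestion (the host name source_server is a placeholder; the key path is taken from the question): log in to the destination machine and pull the file from the source with pget, which downloads a single file in several segments.
# run on the destination server; -n 5 opens five connections for the one file
lftp -u user,password -e "set sftp:connect-program 'ssh -a -x -i /key'; pget -n 5 /data/data.tar.gz -o /tmp/data.tar.gz; quit" sftp://source_server:22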

How can I copy a file from a local server to a remote one via SSH, creating any missing directories?

I can copy a file via SSH using SCP like this:
cd /root/dir1/dir2/
scp filename root#192.168.0.19:$PWD/
But if some directories are absent on the remote server, for example the remote server has only /root/ and does not have dir1 and dir2, then this fails with an error.
How can I copy the file via SSH while creating the missing directories, and what is the easiest way to do it?
By "easiest" I mean that the script should obtain the current path only from $PWD, i.e. it must be portable without any changes.
This command will do it:
rsync -ahHv --rsync-path="mkdir -p $PWD && rsync" filename -e "ssh -v" root@192.168.0.19:"$PWD/"
Alternatively, I can create the same directories on the remote server and then copy the file to them via SCP like this:
cd /root/dir1/dir2/
ssh -n root@192.168.0.19 "mkdir -p '$PWD'"
scp -p filename root@192.168.0.19:$PWD/
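Another variant (a sketch that, like the question asks, relies only on $PWD) pipes tar over ssh; tar recreates any missing directory components when it extracts relative to /:
cd /root/dir1/dir2/
# ${PWD#/} strips the leading slash, so the archive holds root/dir1/dir2/filename
tar -C / -cf - "${PWD#/}/filename" | ssh root@192.168.0.19 'tar -xpf - -C /'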
