Rsync to Amazon EC2 Instance - Linux

I have an EC2 instance running and I am able to SSH into it.
However, when I try to rsync, it gives me the error Permission denied (publickey).
The command I'm using is:
rsync -avL --progress -e ssh -i ~/mykeypair.pem ~/Sites/my_site/* root@ec2-XX-XXX-XXX-XXX.us-west-2.compute.amazonaws.com:/var/www/html/
I also tried
rsync -avz ~/Sites/mysite/* -e "ssh -i ~/.ssh/id_rsa.pub" root@ec2-XX-XXX-XXX-XXX.us-west-2.compute.amazonaws.com:/var/www/html/
Thanks,

I just received that same error. I had been consistently able to ssh with:
ssh -i ~/path/mykeypair.pem \
ubuntu@ec2-XX-XXX-XXX-XXX.us-west-2.compute.amazonaws.com
But the longer rsync construction kept causing errors. Without quotes, rsync parses -i as one of its own options instead of passing it to ssh, so I ended up encasing the whole ssh statement in quotes and using the full path to the key. In your example:
rsync -avL --progress -e "ssh -i /path/to/mykeypair.pem" \
~/Sites/my_site/* \
root@ec2-XX-XXX-XXX-XXX.us-west-2.compute.amazonaws.com:/var/www/html/
That seemed to do the trick.

Below is what I used, and it worked. The source was EC2 and the target was my home machine.
sudo rsync -azvv -e "ssh -i /home/ubuntu/key-to-ec2.pem" ec2-user@xx.xxx.xxx.xx:/home/ec2-user/source/ /home/ubuntu/target/

Use rsync to copy files between servers.
Copy a file from the local machine to the server:
rsync -avz -e "ssh -i /path/to/key.pem" /path/to/file.txt <username>@<ip/domain>:/path/to/directory/
Copy a file from the server to the local machine:
rsync -avz -e "ssh -i /path/to/key.pem" <username>@<ip/domain>:/path/to/directory/file.txt /path/to/directory/
Note: run the commands with sudo if you are not the root user.

After suffering a little bit, I believe this will help:
I am using the below command and it has worked without problems:
rsync -av --progress -e ssh /folder1/folder2/* root@xxx.xxx.xxx.xxx:/folder1/folder2
First consideration:
Use the --rsync-path option to point at the rsync binary on the remote side; see the sketch after the script below.
I prefer to call rsync through a variable in a shell script:
#!/bin/bash
RSYNC=/usr/bin/rsync  # no spaces around "=" in shell assignments
$RSYNC [options] [source] [destination]
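A minimal sketch of --rsync-path itself, assuming the remote rsync lives at /usr/local/bin/rsync (adjust to your server; the host and key are the placeholders from the question):
rsync -av -e "ssh -i /path/to/mykeypair.pem" --rsync-path=/usr/local/bin/rsync ~/Sites/my_site/ root@ec2-XX-XXX-XXX-XXX.us-west-2.compute.amazonaws.com:/var/www/html/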
Second consideration:
Create a public key with the command below for communication between the servers in question. It will not be the same as the key provided by Amazon.
ssh-keygen -t rsa
Do not forget to enable the corresponding authentication on the target server in /etc/ssh/sshd_config (Ubuntu and CentOS).
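For example, assuming the new key pair was created at the default ~/.ssh/id_rsa, install its public half on the target server with:
ssh-copy-id -i ~/.ssh/id_rsa.pub user@target-server
(user@target-server is a placeholder; ssh-copy-id appends the key to ~/.ssh/authorized_keys on the target.)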
Sync files from one EC2 instance to another
http://ask-leo.com/how_can_i_automate_an_sftp_transfer_between_two_servers.html
Use the -v option for verbose output, to better identify errors.
Third consideration:
If both servers are on EC2, restrict access with a security group.
In the security group of the destination server, add an inbound rule:
TCP port: 22
Source: the IP (or security group name) of the source server
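If you prefer the AWS CLI to the console, a minimal sketch of that rule (both group IDs below are placeholders):
aws ec2 authorize-security-group-ingress --group-id sg-destination --protocol tcp --port 22 --source-group sg-source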

This worked for me:
nohup rsync -zravu --partial --progress -e "ssh -i xxxx.pem" ubuntu@xx.xx.xx.xx:/mnt/data /mnt2/ &
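Since nohup redirects output to nohup.out by default, you can check on the detached transfer later with:
tail -f nohup.out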

Related

SCP and sshpass - Can't copy from remote source to local destination using script on PIs - debian11

I am struggling to copy files from a remote source to my local destination.
I am using scp, and I have tried adding sshpass to send the password.
I have a script that copies from my local source to a remote destination, which works:
sudo sshpass -p "pi" ssh -o StrictHostKeyChecking=no pi#$VAR_IP ls /some_dir
This just connects once so I don't have to accept the host key interactively the first time.
sudo sshpass -p "pi" scp /path_to_app/$VAR_APP pi#$VAR_IP:/home/pi/$VAR_APP/
This successfully copies from my local source to my remote destination.
Now, even though the scp documentation says I can scp from a remote source to a local destination, I can't seem to get it to work. Here is how I am trying to do it in a different script:
sudo sshpass -p "pi" ssh -o StrictHostKeyChecking=no pi#$VAR_IP ls /some_dir
This is just to initialize the connection so I don't have to accept the host key, same as in the last script.
sudo sshpass -p "pi" scp pi#$VAR_IP:/home/pi/$VAR_APP/logs/file /some_local_dir/
This gives me the error: scp: /home/pi/App_Name/logs/file: No such file or directory
The path doesn't exist locally but does on the remote, so it seems scp is looking for it locally instead of remotely. Any ideas?
I looked at all the related posts about this and the man pages but can't find an answer to my specific case
I cannot do the key-based authentication thing, as I have too many sites; it would take forever.
I saw in one of the posts someone tried it without sshpass, I tried it too like this:
sudo scp pi:pi@$VAR_IP:/home/pi/$VAR_APP/logs/file /some_local_dir/
This gave me the error: ssh: Could not resolve hostname pi: Name or service not known
I don't think it works like that so I didn't go further down that vein
I hope I gave enough info with clarity
any help would really be appreciated
thank you so much for your time and input
You mention that this command is not working: sudo sshpass -p "pi" scp pi@$VAR_IP:/home/pi/$VAR_APP/logs/file /some_local_dir/
Did you check this?
sudo sshpass -p "pi" ssh pi#$VAR_IP 'ls -l /home/pi/$VAR_APP/logs/file /some_local_dir/' to check the directory is exist
If the issue is still there, I recommend trying pssh and pscp, parallel versions of ssh and scp that can do the same kind of thing as sshpass.
I managed to fix it; for anyone who comes across this problem, here is how I found the fix:
The file I was looking for was owned by root, but I was sshing in as pi.
Even though I ran the script with sudo and put sudo before sshpass,
that does not mean scp runs under sudo; each command on the line needs its own sudo.
e.g.:
sudo sshpass -p "pi" scp pi#IP:/file /local_dir/
This doesn't work because sshpass has sudo but scp does not. However,
sudo sshpass -p "pi" sudo scp pi#IP:/file /local_dir/
This works perfectly because scp now has sudo rights.
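Note that both sudos on that line run on the local machine. If the real problem is a remote file readable only by root, another option is to do the privileged read on the remote end and stream it back, for example (a sketch assuming the pi user can run sudo non-interactively on the remote machine):
ssh pi@$VAR_IP "sudo cat /home/pi/$VAR_APP/logs/file" > /some_local_dir/file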

Combining ssh and scp commands in a shell script

Is there any way I can combine the following commands into one? I do not want to log in each time for each command.
sshpass -p 'somepwd' ssh user@server "mkdir -p /home/user/test"
sshpass -p 'somepwd' scp file.sh user@server:/home/user/test
sshpass -p 'somepwd' scp /test/somefile.txt user@server:/home/user/test
sshpass -p 'somepwd' ssh user@server -C "cd /home/user/test;./file.sh"
I did check the answer about combining multiple commands when using ssh and scp; based on that, I would still need 3 logins: one for the first ssh and mkdir, one for scp, and one for ssh running the shell script.
Is there a better solution?
Use public/private keys instead of password authentication. Not only will this simplify the use of ssh, it is much more secure, especially after you disallow password authentication on the server you are connecting to. Using password authentication means you will get hacked, or your server has already been compromised and you don't know it yet. The rest of this answer assumes you have set up public/private keys.
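A minimal sketch of that setup, run on the machine you connect from (assuming no key pair exists yet):
ssh-keygen -t rsa
ssh-copy-id user@server
ssh user@server
The first command creates the key pair, the second appends the public half to ~/.ssh/authorized_keys on the server, and the third should log you in without a password prompt.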
I see you have files in /test. Don't put your work in the root directory, this invites security issues. Instead, work in your home directory unless you are experienced with setting up permissions properly.
Because file.sh is in your current directory (whatever that is) and you want a file from /test/, you cannot use rsync. rsync would be a good choice if all your files lived in the same directory.
Here is what we are left with; I have not messed with the location of /test/ because I don't know enough about the task:
ssh user@server "mkdir -p /home/user/test"
scp file.sh user@server:/home/user/test
scp /test/somefile.txt user@server:/home/user/test
ssh user@server -C "cd /home/user/test;./file.sh"
With GNU tar and ssh:
tar -c file.sh test/somefile.txt | sshpass -p 'somepwd' ssh user@server -C "tar -C / --transform 's|test/||;s|^|/home/user/test/|' --show-transformed-names -xv; cd /home/user/test; ./file.sh"
For more secure methods to pass the password with sshpass, see man sshpass.
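If you would rather keep the four commands separate, another approach (not from this answer) is OpenSSH connection multiplexing: authenticate once, then let the remaining commands reuse the open connection. A sketch:
# open a master connection that stays alive for 5 minutes after last use
OPTS="-o ControlMaster=auto -o ControlPath=~/.ssh/cm-%r@%h-%p -o ControlPersist=5m"
sshpass -p 'somepwd' ssh $OPTS user@server "mkdir -p /home/user/test"
# these reuse the control socket, so no further logins are needed
scp $OPTS file.sh /test/somefile.txt user@server:/home/user/test
ssh $OPTS user@server "cd /home/user/test; ./file.sh"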

How can I copy a file from a local server to a remote one via SSH, creating any missing directories?

I can copy a file via SSH using scp like this:
cd /root/dir1/dir2/
scp filename root@192.168.0.19:$PWD/
But if some directories are absent on the remote server, for example the remote server has only /root/ and doesn't have dir1 and dir2, then the copy fails with an error.
How can I copy the file while creating the missing directories over SSH, in the easiest way?
By easiest I mean that the script should get the current path only from $PWD, i.e. it must be portable without any changes.
This command will do it:
rsync -ahHv --rsync-path="mkdir -p $PWD && rsync" filename -e "ssh -v" root@192.168.0.19:"$PWD/"
It works because --rsync-path replaces the command started on the remote side, so the mkdir -p runs there before the remote rsync does.
I can make the same directories on the remote server and copy the file to them via SSH using scp like this:
cd /root/dir1/dir2/
ssh -n root@192.168.0.19 "mkdir -p '$PWD'"
scp -p filename root@192.168.0.19:$PWD/

How to copy entire folder from Amazon EC2 Linux instance to local Linux machine?

I connected to Amazon's Linux instance over ssh using a private key. I am trying to copy an entire folder from that instance to my local Linux machine.
Can anyone tell me the correct scp command to do this?
Or do I need something more than scp?
Both machines are Ubuntu 10.04 LTS
Another way to do it is:
scp -i "insert key file here" -r "insert ec2 instance here" "your local directory"
One mistake I made was scp -ir. The key has to be after the -i, and the -r after that.
so
scp -i amazon.pem -r ec2-user@ec2-##-##-##:/source/dir /destination/dir
Call scp from client machine with recursive option:
scp -r user@remote:src_directory dst_directory
scp -i {key path} -r ec2-user@54.159.147.19:{remote path} {local path}
For EC2 Ubuntu, go to your .pem file's directory:
scp -i "yourkey.pem" -r ec2user@DNS_name:/home/ubuntu/foldername ~/Desktop/localfolder
You could even use rsync.
rsync -aPSHiv remote:directory .
This is how I copied a file from an Amazon EC2 instance to a local Windows PC:
pscp -i "your-key-pair.pem" username@ec2-ip-compute.amazonaws.com:/home/username/file.txt C:\Documents\
For Linux to copy a directory:
scp -i "your-key-pair.pem" -r username#ec2-ip-compute.amazonaws.com:/home/username/dirtocopy /var/www/
Connecting to Amazon requires key-pair authentication.
Note:
Username most probably is ubuntu.
I use sshfs to mount the remote directory on the local machine, then do whatever I want with it; the exact commands may vary on your system.
This is also important and related to the answer above: copying all files in a local directory to EC2 (a Unix answer).
Copy the entire local folder to a folder in EC2:
scp -i "key-pair.pem" -r /home/Projects/myfiles ubuntu#ec2.amazonaws.com:/home/dir
Copy only the entire contents of local folder to folder in EC2:
scp -i "key-pair.pem" -r /home/Projects/myfiles/* ubuntu#ec2.amazonaws.com:/home/dir
I do not like to use scp for a large number of files, as it does a 'transaction' per file. The following is much better:
cd local_dir; ssh user@server 'cd remote_dir_parent; tar -c remote_dir' | tar -x
You can add the z flag to tar to compress on the server and decompress on the client.
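With the z flag added on both sides, that becomes:
cd local_dir; ssh user@server 'cd remote_dir_parent; tar -cz remote_dir' | tar -xz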
One way I found on YouTube is to connect a local folder to a shared folder on the EC2 instance; the sharing is instantaneous.

How to pass password to scp?

I know it is not recommended, but is it at all possible to pass the user's password to scp?
I'd like to copy a file via scp as part of a batch job and the receiving server does, of course, need a password and, no, I cannot easily change that to key-based authentication.
Use sshpass:
sshpass -p "password" scp -r user#example.com:/some/remote/path /some/local/path
or, so that the password does not show up in the bash history:
sshpass -f "/path/to/passwordfile" scp -r user#example.com:/some/remote/path /some/local/path
The above copies the contents of the remote path to your local machine.
Install:
ubuntu/debian
apt install sshpass
centos/fedora
yum install sshpass
mac w/ macports
port install sshpass
mac w/ brew
brew install https://raw.githubusercontent.com/kadwanev/bigboybrew/master/Library/Formula/sshpass.rb
Just generate an SSH key, like:
ssh-keygen -t rsa -C "your_email@youremail.com"
Copy the content of ~/.ssh/id_rsa.pub
and add it to the remote machine's ~/.ssh/authorized_keys.
Make sure the remote machine has permissions 0700 on the ~/.ssh folder and 0600 on ~/.ssh/authorized_keys.
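On the remote machine that would be, for example:
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys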
If you are connecting to the server from Windows, the Putty version of scp ("pscp") lets you pass the password with the -pw parameter.
This is mentioned in the documentation here.
curl can be used as an alternative to scp to copy a file, and it supports a password on the command line:
curl --insecure --user username:password -T /path/to/sourcefile sftp://desthost/path/
You can script it with a tool like expect (there are handy bindings too, like Pexpect for Python).
You can use an 'expect' script in a Unix terminal.
For example create 'test.exp' :
#!/usr/bin/expect
spawn scp /usr/bin/file.txt root@<ServerLocation>:/home
set pass "Your_Password"
expect {
password: {send "$pass\r"; exp_continue}
}
run the script
expect test.exp
I hope that helps.
You may use ssh-copy-id to add an ssh key:
which ssh-copy-id  # check whether it exists
If it exists:
ssh-copy-id "user#remote-system"
Here is an example of how to do it with the Expect tool (Perl's Expect module):
sub copyover {
    # spawn scp and drive it like an interactive user
    $scp = Expect->spawn("/usr/bin/scp ${srcpath}/$file $who:${destpath}/$file");
    # wait up to 30 seconds for the prompt ("ssword: " matches both "Password:" and "password:")
    $scp->expect(30,"ssword: ") || die "Never got password prompt from $dest:$!\n";
    print $scp 'password' . "\n";
    # wait for the shell prompt to confirm the transfer finished
    $scp->expect(30,"-re",'$\s') || die "Never got prompt from parent system:$!\n";
    $scp->soft_close();
    return;
}
Nobody mentioned it, but Putty scp (pscp) has a -pw option for password.
Documentation can be found here: https://the.earth.li/~sgtatham/putty/0.67/htmldoc/Chapter5.html#pscp
Once you set up ssh-keygen as explained above, you can do
scp -i ~/.ssh/id_rsa /local/path/to/file remote@ip.com:/path/in/remote/server/
If you want to lessen the typing each time, you can edit your .bash_profile file and put in:
alias remote_scp='scp -i ~/.ssh/id_rsa /local/path/to/file remote@ip.com:/path/in/remote/server/'
Then run source ~/.bash_profile. Afterwards, typing remote_scp in your terminal runs the scp command without a password.
Here's a poor man's Linux/Python/Expect-like example based on this blog post: Upgrading simple shells to fully interactive TTYs. I needed this for old machines where I can't install Expect or add modules to Python.
Code:
(
echo 'scp jmudd@mysite.com:./install.sh .'
sleep 5
echo 'scp-passwd'
sleep 5
echo 'exit'
) |
python -c 'import pty; pty.spawn("/usr/bin/bash")'
Output:
scp jmudd@mysite.com:install.sh .
bash-4.2$ scp jmudd@mysite.com:install.sh .
Password:
install.sh 100% 15KB 236.2KB/s 00:00
bash-4.2$ exit
exit
Make sure password authentication is enabled on the target server. If it runs Ubuntu, open /etc/ssh/sshd_config on the server, find any PasswordAuthentication=no lines and comment them all out (put # at the start of the line), save the file, and run sudo systemctl restart ssh to apply the configuration. If there is no such line, you're done.
Add -o PreferredAuthentications="password" to your scp command, e.g.:
scp -o PreferredAuthentications="password" /path/to/file user#server:/destination/directory
Make sure you have the "expect" tool first; if not, install it:
# apt-get install expect
Create a script file with the following content (# vi /root/scriptfile):
spawn scp /path_from/file_name user_name_here@to_host_name:/path_to
expect "password:"
send "put_password_here\n"
interact
Execute the script file with the expect tool:
# expect /root/scriptfile
Copy files from one server to another (in scripts):
Install PuTTY on Ubuntu or other Linux machines; PuTTY comes with pscp, and we can copy files with pscp.
apt-get update
apt-get install putty
echo n | pscp -pw "Password#1234" -r user_name@source_server_IP:/copy_file_path/files /path_to_copy/files
For more options see pscp help.
Using scp non-interactively from Windows:
Install the community edition of NetCmdlets.
Import the module.
Use Send-PowerShellServerFile -AuthMode password -User MyUser -Password not-secure -Server YourServer -LocalFile C:\downloads\test.txt -RemoteFile C:\temp\test.txt to send the file with a non-interactive password.
If you observe a strict host key checking error, add the -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null options.
The complete example is as follows:
sshpass -p "password" scp -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null root#domain-name.com:/tmp/from/psoutput /tmp/to/psoutput
You can use the steps below; this works for me!
Step 1 -
Create a normal file, say "fileWithScpPassword", which contains the ssh password for the destination server.
Step 2 - Use sshpass -f followed by the password file name and then the normal scp command:
sshpass -f "fileWithScpPassword" scp /filePathToUpload user#ip:/destinationPath/
One easy way I do this:
Use the same scp command as you would with ssh keys, i.e.
scp -C -i <path_to_openssh_key> <local_file_path> user@<ip_address_VM>:<remote_file_path>
for transferring a file from local to remote,
but instead of providing the correct <path_to_openssh_key>, use some garbage path. Because of the wrong key path, you will be asked for the password instead, and you can simply enter it to get the work done!
An alternative would be to add the public half of the user's key to the authorized_keys file on the target system. On the system you are initiating the transfer from, you can run an ssh-agent daemon and add the private half of the key to the agent. The batch job can then be configured to use the agent to get the private key, rather than prompting for the key's password.
This should be doable on either a UNIX/Linux system or on the Windows platform using Pageant and pscp.
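A minimal sketch of the agent approach on Linux (the key path is the usual default, not something this answer prescribes):
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa
scp /path/to/file user@server:/destination/
ssh-add asks for the key's passphrase once; after that, the batch job's scp calls run without prompting for as long as the agent holds the key.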
All the solutions mentioned above work only if you have the app installed, or if you have the admin rights to install expect or sshpass.
I found this very useful link explaining how to simply start the scp in the background:
$ nohup scp file_to_copy user@server:/path/to/copy/the/file > nohup.out 2>&1
https://charmyin.github.io/scp/2014/10/07/run-scp-in-background/
I found this really helpful answer here.
rsync -r -v --progress -e ssh user@remote-system:/address/to/remote/file /home/user/
Not only can you pass the password there, but it will also show a progress bar while copying. Really awesome.
