How can I upload an entire folder, including its subfolders, using sftp on Linux?

I have tried put -r directory/*, which only uploaded the files and not the folders, and gave me the error Couldn't canonicalise.
Any help would be greatly appreciated.

For people actually wanting a direct answer to this question (instead of being told to use something other than sftp)...
put -r local/path/to/directoryName
The uploaded directory must already exist in the working directory on the server, so you might need to create it first.
mkdir directoryName
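Putting the two steps together, a session might look something like this (the host and directory name here are placeholders):
sftp user@remotehost
sftp> mkdir directoryName
sftp> put -r local/path/to/directoryName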

Here you can find a detailed explanation of how to copy a directory using scp. In your case, it would be something like:
$ scp -r foo your_username@remotehost.edu:/some/remote/directory/bar
This will copy the directory "foo" from the local host to a remote host's directory "bar".
Here -r means recursively copy entire directories.
You can also use rcp with similar syntax. The only difference between them is that scp uses secure shell and rcp uses remote shell.
BTW, the "Couldn't canonicalise" error you mentioned appears when the sftp server is unable to access the file/directory named in the command.
UPDATE: For users who want to use put specifically, please refer to Ben Thielker's answer here.

sftp> mkdir source
sftp> put -r source
Uploading source/ to /home/myself/source
Entering source/
source/file1
source/file2

If you have issues using sftp, you can use ncftp instead.
On CentOS, install it with:
yum install ncftp
To copy a whole directory recursively:
ncftpput -R -v -u username -P 21 ftp.server.dev /remote-path/ /localdirectory

Use scp instead. It uses SSH too and can easily handle recursion.


Does the SCP command work for non-empty directories, too?

Is it possible to copy a non-empty directory from a local to a remote system, with the SCP command or something else?
Yes, you can; you just need to add the -r flag for directories.
You can check the scp manual online, or check the related Stack Overflow link.
It is easy :)
The command for copying a directory is much like the one for copying files. The only difference is that you need the -r flag for recursion.
To copy a directory from a local to a remote system, use the -r option:
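For example (the hostname and paths below are placeholders):
scp -r /path/to/local/dir your_username@remotehost.edu:/some/remote/directory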
The -r flag should solve your problem according to the man pages.

SCP not working in EC2 (AWS)

I can SSH into the EC2 instance:
ssh -i "my_key.pem" ec2-user#my-public-ip
However, scp doesn't work:
scp -r –i "my_key.pem" ./my_file ec2-user#my-public-ip:/home/ec2-user/my_file
Permission denied (publickey).
lost connection
I've also tried using public instance DNS, but nothing changes.
Any idea why is this happening and how to solve it?
The only way for this to happen is if the private key my_key.pem is not found in the current directory. It is possible you ran ssh from a different directory than the one you ran scp from.
Try the following with full path to your key:
scp -r -i /path/to/my_key.pem ./my_file ec2-user@my-public-ip:/home/ec2-user/my_file
If it fails, post the output with the -v option; it will tell you exactly where the problem is:
scp -v -r -i /path/to/my_key.pem ./my_file ec2-user@my-public-ip:/home/ec2-user/my_file
I am a bit late, but this might be helpful to someone.
Do not use the full /home/ec2-user path. Instead, use the file or folder name directly.
E.g. the following command will put my_file in the home folder (i.e. /home/ec2-user):
scp -r -i "my_key.pem" ./my_file ec2-user@my-public-ip:my_file
Or, say you have a folder at /home/ec2-user/my_data.
Then use the following command to copy your file into that folder:
scp -r -i "my_key.pem" ./my_file ec2-user@my-public-ip:my_data
Stupidly late addendum:
To avoid specifying the private key every time, just add the following to your ~/.ssh/config file (create it if it's not already there):
# a memorable alias
Host testserver
    # your server IP
    Hostname 12.34.56.67
    # user to connect as
    User ec2-user
    # path to the private key
    IdentityFile /path/to/key.pem
    PasswordAuthentication no
Then a simple ssh testserver should work from anywhere (and consequently your scp too).
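For example, the earlier copy can then drop the key and IP entirely (paths as in the commands above):
scp -r ./my_file testserver:/home/ec2-user/my_file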
I use it to connect with Vim via scp using:
vim scp://testserver/relative/file/path
or
vim scp://testserver//absolute/file/path
and
vim scp://testserver/relative/dir/path/ (note the trailing slash)
to edit files and browse folders, respectively, directly from my local machine (thus using my precious .vimrc <3 configuration).
Solution found here
Hope this helps! :)
I was facing this issue today and found a solution that worked for me (not elegant, but it worked). This solution is good if you want to download something once and roll back all the settings afterwards.
Solution:
When I specified the -v option while using scp, I noticed the certificate was being denied for some reason, so I went to /etc/ssh/sshd_config on the server, set PasswordAuthentication yes, and then ran systemctl restart sshd.
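For reference, the server-side steps were roughly these (a sketch; use whatever editor you like, and this assumes a systemd-based distro):
sudo vi /etc/ssh/sshd_config     # set: PasswordAuthentication yes
sudo systemctl restart sshd      # pick up the new setting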
After this procedure I went to my local machine and used:
scp -v -r myname@VPC:/home/{user}/filename.txt path/on/local/machine
provided the password when prompted, and the file transfer was successful.
Hope this helps to someone :)

Downloading or moving a directory in SFTP? -r gives an illegal argument

I have a folder inside of my sftp server that I want to either download or move. When I try to rename it (to move it), I get the following error -
Couldn't rename file "/my/directory/<directory>" to "/my/directory/path/../directory/newname": Failure
Passing in -r doesn't work either:
get: Invalid flag -r
How can I either download or move these directories without the -r flag? I do not have the ability to upgrade sftp.
Thanks
Personally I like to use WinSCP with Wine, but since that's not an option here, try this command from http://linux.die.net/man/1/sftp and check if it works:
rename oldpath newpath
PS: You can't use mv to do that
download or move?
get -r remote_directory
downloads the directory to your machine.
rename bla blub
does the renaming

copy directory from another computer on Linux

On a computer with an IP address like 10.11.12.123, I have a folder named document. I want to copy that folder to my local folder /home/my-pc/doc/ using the shell.
I tried like this:
scp -r smb:10.11.12.123/other-pc/document /home/my-pc/doc/
but it's not working.
You can use the command below to copy your files.
scp -r <source> <destination>
(-r: Recursively copy entire directories)
eg:
scp -r user#10.11.12.123:/other-pc/document /home/my-pc/doc
To identify your current location you can use the pwd command, e.g.:
kasun#kasunr:~/Downloads$ pwd
/home/kasun/Downloads
If you want to copy from B to A while you are logged into B:
scp /source username@a:/destination
If you want to copy from B to A while you are logged into A:
scp username@b:/source /destination
In addition to the comment, when you look at your host-to-host copy options on Linux today, rsync is by far the most robust solution around. It is brought to you by the SAMBA team[1] and continues to enjoy active development. Most distributions include the rsync package by default. (If not, you should find an easily installable package for your distro, or you can download it from rsync.samba.org.)
The basic use of rsync for host-to-host directory copy is:
$ rsync -uav srchost:/path/to/dir /path/to/dest
-uav recursively copies only new or changed files (-ua), preserving file and directory times and permissions, while providing verbose output (-v). You will be prompted for the username/password on 10.11.12.123 unless you have set up ssh keys to allow public/private key authentication (see ssh-keygen for key generation).
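For example, a one-time key setup might look like this (the user and host below are placeholders):
ssh-keygen -t ed25519               # generate a key pair; accept the default path
ssh-copy-id user@10.11.12.123       # install the public key on the remote host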
If you notice, the syntax is basically the same as that for scp, with a slight difference in the options (e.g. scp -rv srchost:/path/to/dir /path/to/dest). rsync will use ssh for secure transport by default, so you will want to ensure sshd is running on your srchost (10.11.12.123 in your case). If you have name resolution working (or a simple entry in /etc/hosts for 10.11.12.123), you can use the hostname for the remote host instead of the remote IP. Regardless, you can always transfer the files you are interested in with:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
Note: do NOT include a trailing / after document if you want to copy the directory itself. If you do include a trailing / after document (i.e. 10.11.12.123:/other-pc/document/), you are telling rsync to copy the contents (i.e. the files and directories under document) into /home/my-pc/doc/ without also creating the document directory itself.
The reason rsync is far superior to other copy apps is that it provides options to truly synchronize filesystems and directory trees, both locally and between your local machine and a remote host. Meaning, in your case, if you have used rsync to transfer files to /home/my-pc/doc/ and then change or delete files on 10.11.12.123, you can simply call rsync again and have the changes/deletions reflected in /home/my-pc/doc/. (Look at the several flavors of the --delete option for details in rsync --help or in man 1 rsync.)
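For example, a follow-up run that also propagates deletions might look like this (same source and destination as above; double-check the --delete semantics before relying on it):
rsync -uav --delete 10.11.12.123:/other-pc/document /home/my-pc/doc/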
For these, and many more reasons, it is well worth the time to familiarize yourself with rsync. It is an invaluable tool in any Linux user's hip pocket. Hopefully this will solve your problem and get you started.
Footnotes
[1] the same folks that "Opened Windows to a Wider World", allowing seamless connections between Windows/Linux hosts via the native Windows Server Message Block (SMB) protocol. samba.org
If the two directories you mentioned (document and /home/my-pc/doc/) are on the same machine (10.11.12.123),
then:
cp -ai document /home/my-pc/doc/
else:
scp -r document/ root@10.11.12.123:/home/my-pc/doc/

How to copy all files via FTP in rsync

I have an online account with a host which gives me an FTP account with a username and password.
I have another account with a company which gave me FTP and rsync.
Now I want to transfer all my files from the old FTP host to the new one with rsync.
Is it possible to do this via rsync only? I don't want to first copy everything to my computer and then upload it again.
Let's call the machine with only FTP src.
Let's call the machine with FTP and SSH dst.
ssh dst
cd destination-direction
wget --mirror --ftp-user=username --ftp-password=password \
     --no-host-directories ftp://src/pathname/
Note that running wget with --ftp-password on the command line will give away the password to anyone else on the system. (As well as transferring it over the wire in the clear, but you knew that.)
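One way to avoid putting the password on the command line (assuming the same host and credentials as above) is to give wget the credentials via a ~/.netrc entry instead:
printf 'machine src login username password password\n' >> ~/.netrc
chmod 600 ~/.netrc     # keep the credentials file private
wget --mirror --no-host-directories ftp://src/pathname/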
If you don't have access to wget, then they might have ncftp or lftp or ftp installed. I just happen to know wget the best. :)
Edit: To use plain ftp, you'll need to do something more like:
ftp src
user username
pass password
bin
cd /pathname
ls
At this point, note all the directories on the remote system. Create each one with !mkdir. Then change into the directory both locally and remotely:
lcd <dirname>
cd <dirname>
ls
Repeat for all the directories. Use mget * to get all the files.
If this looks awful, it is because it is. FTP wasn't designed for this, and if your new host doesn't have better tools (be sure to look for ncftp and lftp and maybe something like ftpmirror), then either compile better tools yourself or get good at writing scripts around the bad tools. :)
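If lftp does turn out to be available, the directory-by-directory dance above collapses to something like this (host, credentials, and paths are placeholders):
lftp -u username,password -e 'mirror /pathname destination-directory; quit' src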
Or if you could get a shell on src, that'd help immensely too. FTP is just not intended for transferring thousands of files.
Anyway, this avoids bouncing through your local system, which ought to help throughput significantly.
There's always the trusty FUSE filesystems, CurlFtpFS and SSHFS. Mount each server with the appropriate filesystem and copy across using standard utilities. Probably not the fastest way to do it, but quite possibly the least labor-intensive.
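A rough sketch of that approach (the mount points, hosts, and credentials below are placeholders, and both curlftpfs and sshfs need to be installed):
mkdir -p ~/mnt/oldftp ~/mnt/newhost
curlftpfs ftp://username:password@old-ftp-host ~/mnt/oldftp
sshfs user@new-host:/destination ~/mnt/newhost
cp -a ~/mnt/oldftp/. ~/mnt/newhost/                          # plain copy between the two mounts
fusermount -u ~/mnt/oldftp && fusermount -u ~/mnt/newhost    # unmount when done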
I was looking for a simple solution to sync a remote folder to a local folder via FTP while only replacing new files. I ended up with a little wget script based on sarnold's answer that I thought might be helpful to others too, so here it is:
#!/bin/bash
HOST="1.2.3.4"
USER="username"
PASS="password"
LDIR="/path/to/local/dir"   # can be empty
RDIR="/path/to/remote/dir"  # can be empty

# wget options used below:
#   --continue              resume files that were already partially transferred
#   --mirror                shorthand for --recursive --level=inf --timestamping --no-remove-listing
#   --no-host-directories   don't recreate an 'ftp://src/' folder structure for synced files
# The && ensures wget only starts if the cd was successful.
cd "$LDIR" && \
wget \
    --continue \
    --mirror \
    --no-host-directories \
    --ftp-user="$USER" \
    --ftp-password="$PASS" \
    "ftp://$HOST/$RDIR"
