copy a whole folder to SFTP server - linux

I need to upload a whole folder to an SFTP server. The only way I can see is via the sftp prompt, so I execute
sftp> put /var/sites/c/public_html/wp-content/uploads/* /wp-content/uploads/
but I get
skipping non-regular file /var/sites/c/public_html/wp-content/uploads/2010
and no files are copied. What do I need to do to achieve my goal of uploading a whole folder (subfolders and files) to the SFTP server?

put uploads a single file.
To upload multiple files, use mput.
If that doesn't work, try switching to scp instead of sftp.
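For example, a minimal sketch with an OpenSSH client, where mput behaves like put and expands wildcards against local regular files (the *.jpg pattern is only an illustration):
sftp> mput /var/sites/c/public_html/wp-content/uploads/*.jpg /wp-content/uploads/
Note that this still skips directories, as the error above shows, so it does not by itself upload a whole tree.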

put supports the -r switch on my machine (I'm using OpenSSH_6.4p1, OpenSSL 1.0.1e 11 Feb 2013). If your sftp doesn't support -r, you can also use scp. This should work, since both sftp and scp use ssh to push files to the remote side, and scp can push files recursively on almost every system I've seen so far.
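For example, a minimal session assuming a reasonably recent OpenSSH client (the host name is a placeholder; if the upload fails because the remote directory is missing, create it first with mkdir):
sftp user@example-host
sftp> put -r /var/sites/c/public_html/wp-content/uploads /wp-content/
Or, non-interactively with scp:
scp -r /var/sites/c/public_html/wp-content/uploads user@example-host:/wp-content/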

Related

Copying sub-folders from remote machine to remote machine directory

I have a directory (/usr/share/hub-bucket/GameImages/) that has subdirectories containing files, and I want to transfer those subdirectories to another machine, to the location /usr/share/hub-bucket/GameImages/. Both are remote machines, and I can access the remote destination using an SSH private key and passphrase. In the future I will also need to keep the remote source and destination folders/files in sync. How can this be implemented? I have used SCP for file transfers but not for folders/subfolders.
You can use the -r flag to copy files recursively with scp:
scp -r /usr/share/hub-bucket/GameImages/ user@remotehost:/usr/share/hub-bucket/GameImages/
A better and often faster option is to use rsync, which is usually more efficient since it only transfers files that differ between the two hosts.
If you use scp you can use the -r option, like this:
scp -r /usr/share/hub-bucket/GameImages/ user@remote-host:/usr/share/hub-bucket/GameImages/
You can also use the rsync command:
rsync -avz /usr/share/hub-bucket/GameImages/ user@remote-host:/usr/share/hub-bucket/GameImages/
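Note that rsync cannot take two remote paths in a single invocation, so to copy or later sync directly between the two machines you would SSH into the source host and run rsync from there. A sketch, assuming your key is usable from the source host (e.g. via ssh-agent forwarding):
ssh user@source-host 'rsync -avz /usr/share/hub-bucket/GameImages/ user@dest-host:/usr/share/hub-bucket/GameImages/'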

copy/move files on remote server linux

I log into server_a and run a .sh file containing the following script:
scp user@server_b:/my_folder/my_file.xml user@server_b:/my_new_folder/
to copy files from my_folder to my_new_folder on server_b. It doesn't throw an error, but no files are copied.
Notes:
server_b is accessed with pre-set RSA keys.
server_a: Unix
server_b: Ubuntu
I can SCP files from/to these servers without any issues.
The end goal is to move, or copy and remove, files.
There are two possibilities:
Connect from server_a to server_b and do a local copy:
ssh user@server_b "cp /my_folder/my_file.xml /my_new_folder/"
Or copy via server_a. Your original command would require server_b to be able to authenticate to itself, which is probably not the case; the -3 option routes the transfer through the local host instead:
scp -3 user@server_b:/my_folder/my_file.xml user@server_b:/my_new_folder/
Also note that your command copies only one file, not multiple files as the title says.
If you are already logged on to the server, why authenticate again?
scp user@server_b:/my_folder/my_file.xml user@server_b:/my_new_folder/
Run scp from the directory containing the file (or give its full path) and add the -v parameter to see debug information:
scp -v /my_folder/my_file.xml user@server_b:/my_new_folder/
Since the source is a single file and not a directory, you do not need the -r parameter.
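Since the stated end goal is to move files, one hedged option is to run the move itself on server_b over SSH (mv here replaces the separate copy and remove steps; adjust paths as needed):
ssh user@server_b "mv /my_folder/my_file.xml /my_new_folder/"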

copy directory from another computer on Linux

On a computer with an IP address like 10.11.12.123, I have a folder named document. I want to copy that folder to my local folder /home/my-pc/doc/ using the shell.
I tried this:
scp -r smb:10.11.12.123/other-pc/document /home/my-pc/doc/
but it's not working.
You can use the command below to copy your files:
scp -r <source> <destination>
(-r: recursively copy entire directories)
e.g.:
scp -r user@10.11.12.123:/other-pc/document /home/my-pc/doc
To identify your current location you can use the pwd command, e.g.:
kasun#kasunr:~/Downloads$ pwd
/home/kasun/Downloads
If you want to copy from B to A while logged into B:
scp /source username@a:/destination
If you want to copy from B to A while logged into A:
scp username@b:/source /destination
In addition to the comment, when you look at your host-to-host copy options on Linux today, rsync is by far the most robust solution around. It is brought to you by the SAMBA team [1] and continues to enjoy active development. Most distributions include the rsync package by default (if not, you should find an easily installable package for your distro, or you can download it from rsync.samba.org).
The basic use of rsync for host-to-host directory copy is:
$ rsync -uav srchost:/path/to/dir /path/to/dest
In -uav, the -a option recursively copies files while preserving file and directory times and permissions, -u transfers only new or changed files, and -v provides verbose output. You will be prompted for the username/password on 10.11.12.123 unless you have set up SSH keys for public/private key authentication (see ssh-keygen for key generation).
If you notice, the syntax is basically the same as that for scp, with a slight difference in the options (e.g. scp -rv srchost:/path/to/dir /path/to/dest). rsync uses ssh for secure transport by default, so you will want to ensure sshd is running on your srchost (10.11.12.123 in your case). If you have name resolution working (or a simple entry in /etc/hosts for 10.11.12.123) you can use the hostname for the remote host instead of the IP. Regardless, you can always transfer the files you are interested in with:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
Note: do NOT include a trailing / after document if you want to copy the directory itself. If you do include a trailing / after document (i.e. 10.11.12.123:/other-pc/document/) you are telling rsync to copy the contents (i.e. the files and directories under document) into /home/my-pc/doc/ without also creating the document directory there.
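A side-by-side sketch of the two forms:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/     # creates /home/my-pc/doc/document
$ rsync -uav 10.11.12.123:/other-pc/document/ /home/my-pc/doc/    # copies the contents straight into /home/my-pc/doc/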
The reason rsync is far superior to other copy tools is that it provides options to truly synchronize filesystems and directory trees, both locally and between your local machine and a remote host. Meaning, in your case, if you have used rsync to transfer files to /home/my-pc/doc/ and files are then changed or deleted on 10.11.12.123, you can simply call rsync again and have the changes/deletions reflected in /home/my-pc/doc/. (Look at the several flavors of the --delete option, detailed in rsync --help or man 1 rsync.)
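For example, a sketch of such a mirroring pull using the same hosts as above; --delete removes destination files that no longer exist on the source, so it is worth previewing with --dry-run first:
$ rsync -uav --delete --dry-run 10.11.12.123:/other-pc/document /home/my-pc/doc/
$ rsync -uav --delete 10.11.12.123:/other-pc/document /home/my-pc/doc/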
For these, and many more reasons, it is well worth the time to familiarize yourself with rsync. It is an invaluable tool in any Linux user's hip pocket. Hopefully this will solve your problem and get you started.
Footnotes
[1] the same folks that "Opened Windows to a Wider World", allowing seamless connections between Windows/Linux hosts via the native Windows Server Message Block (SMB) protocol. samba.org
If the two directories you mentioned (document and /home/my-pc/doc/) are on the same 10.11.12.123 machine,
then:
cp -ai document /home/my-pc/doc/
else:
scp -r document/ root@10.11.12.123:/home/my-pc/doc/

How to send/transfer/move files between servers programmatically without the scp or rsync commands?

Is there a way to copy/send/transfer files from one server to another programmatically (using Node.js/Scala/Python) without SCP or rsync?

FTP specific files

Can we FTP specific files from a directory, where the specific files to be transferred are listed in a config file?
Can we use a for loop once logged into ftp (in a script) for this purpose?
Will normal FTP work when transferring files from Unix to a Windows FTP server?
Thanks,
Ravi
You can use straight shell. This assumes your login directory is /home/ravi.
Try this one time only:
echo "machine serverB user ravi password ravipasswd" > /home/ravi/.netrc
chmod 600 /home/ravi/.netrc
Test that .netrc works: ftp serverB should log you straight in.
Then a shell script reads config.file, which is just a list of files to send:
while read fname
do
    ftp serverB <<EOF
get $fname
bye
EOF
done < config.file
Leave the closing EOF in column 1 of the script file; putting a comment on that line would break the here-document.
This gets files from serverB. Change get $fname to put $fname to send files from serverA to serverB instead.
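For sending, a put-oriented sketch of the same loop; the binary command is added on the assumption that some files may not be plain text, since FTP's default ASCII mode can translate line endings when the receiving end is a Windows server (question 3):
while read fname
do
    ftp serverB <<EOF
binary
put $fname
bye
EOF
done < config.file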
That certainly is possible. You can transfer the files listed in a config file by writing a script that uses an FTP client (either built into the language or by calling a CLI client). The protocol is system-independent, so transferring files between systems running different operating systems works. There is only one catch: remember that MS-Windows uses a case-insensitive file system, while other systems differ in that respect.
