Subdirectory not getting copied to remote server using rsync on Linux

I am copying directories and files to a remote Linux machine using rsync and incrontab.
Copying files to the remote server works fine.
Incrontab
/data/AMOS_SHARE/CHV_BE/ IN_MODIFY,IN_CREATE,IN_DELETE,IN_CLOSE_WRITE,IN_MOVE /data/AMOS/jboss/chv_rsync.sh
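For reference, an incrontab entry takes the form <watched path> <event mask> <command>; an annotated sketch of the entry above (the comment line is for illustration only):
# <watched path>           <event mask>                                           <command to run>
/data/AMOS_SHARE/CHV_BE/   IN_MODIFY,IN_CREATE,IN_DELETE,IN_CLOSE_WRITE,IN_MOVE   /data/AMOS/jboss/chv_rsync.sh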
Rsync
#!/bin/bash
# Relax permissions, then push the whole tree to the remote host
chmod -R 775 /data/AMOS_SHARE/CHV_BE
rsync -avuzh /data/AMOS_SHARE/CHV_BE/ jboss@xx.xx.xx.xx:/data/AMOS_SHARE/CHV_BE/
I created some files in the /data/AMOS_SHARE/CHV_BE/ folder and they were copied fine; creating a folder there also works. But whenever I create files in a subfolder, they are not copied.
Please help me out.

Recursive monitoring is not yet implemented in incrond, so events in sub-directories are not monitored. You can work around this by adding additional watchers for each sub-directory, but I would recommend using
another tool:
Watcher
You can also try the inotifywait tool (example):
inotifywait /tmp/test_dir -m -r
and parse the output of this command.
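A minimal sketch of that approach, assuming the inotify-tools package is installed and reusing the paths and remote host placeholder from the question:

#!/bin/bash
# Watch the tree recursively (-r), stay running (-m), and re-sync on every relevant event
inotifywait -m -r -e close_write,create,delete,move /data/AMOS_SHARE/CHV_BE/ |
while read -r watched_dir event filename; do
    # Any event in any subdirectory triggers a full re-sync of the tree
    rsync -avuzh /data/AMOS_SHARE/CHV_BE/ jboss@xx.xx.xx.xx:/data/AMOS_SHARE/CHV_BE/
done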

Related

rsync not copying all files and folders

Using rsync in cron to copy all files from a web server to a remote server with the following command:
rsync -arvzh /home/foldername/* root@***.***.***.***:/home/foldername >> /var/log/filename`date +%d%m%y`.log 2>&1
This works fine for all folders and files owned by ftpuser and ftpgroup.
However, if any files or folders are added by www-data, it does not copy or sync them.
How can I get it to copy all files and folders recursively no matter who owns them, please?
Many thanks
P

Copy a directory from another computer on Linux

On a computer with an IP address like 10.11.12.123, I have a folder named document. I want to copy that folder to my local folder /home/my-pc/doc/ using the shell.
I tried this:
scp -r smb:10.11.12.123/other-pc/document /home/my-pc/doc/
but it's not working.
You can use the command below to copy your files:
scp -r <source> <destination>
(-r: recursively copy entire directories)
e.g.:
scp -r user@10.11.12.123:/other-pc/document /home/my-pc/doc
To identify your current location you can use the pwd command, e.g.:
kasun@kasunr:~/Downloads$ pwd
/home/kasun/Downloads
If you want to copy from B to A while logged into B:
scp /source username@a:/destination
If you want to copy from B to A while logged into A:
scp username@b:/source /destination
In addition to the comment, when you look at your host-to-host copy options on Linux today, rsync is by far the most robust solution around. It is brought to you by the SAMBA team[1] and continues to enjoy active development. Most distributions include the rsync package by default (if not, you should find an easily installable package for your distro, or you can download it from rsync.samba.org).
The basic use of rsync for host-to-host directory copy is:
$ rsync -uav srchost:/path/to/dir /path/to/dest
-uav recursively copies only new or changed files (-u) while preserving file and directory times and permissions (-a, archive mode) and providing verbose output (-v). You will be prompted for the username/password on 10.11.12.123 unless you have set up ssh keys to allow public/private key authentication (see ssh-keygen for key generation).
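A quick sketch of that key setup, assuming OpenSSH's standard tools and using the question's host as a placeholder:

# Generate a key pair (accept the default location; an empty passphrase enables unattended use)
ssh-keygen -t rsa
# Install the public key on the remote host; subsequent scp/rsync runs will not prompt for a password
ssh-copy-id user@10.11.12.123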
If you notice, the syntax is basically the same as that for scp, with a slight difference in the options (e.g. scp -rv srchost:/path/to/dir /path/to/dest). rsync will use ssh for secure transport by default, so you will want to ensure sshd is running on your srchost (10.11.12.123 in your case). If you have name resolution working (or a simple entry in /etc/hosts for 10.11.12.123), you can use the hostname for the remote host instead of the remote IP. Regardless, you can always transfer the files you are interested in with:
$ rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
Note: do NOT include a trailing / after document if you want to copy the directory itself. If you do include a trailing / after document (i.e. 10.11.12.123:/other-pc/document/), you are telling rsync to copy the contents (i.e. the files and directories under document) into /home/my-pc/doc/ without also creating the document directory itself.
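A short illustration of the difference, using the same paths:

# Without a trailing slash: creates /home/my-pc/doc/document/...
rsync -uav 10.11.12.123:/other-pc/document /home/my-pc/doc/
# With a trailing slash: puts the files directly under /home/my-pc/doc/
rsync -uav 10.11.12.123:/other-pc/document/ /home/my-pc/doc/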
The reason rsync is far superior to other copy apps is that it provides options to truly synchronize filesystems and directory trees, both locally and between your local machine and a remote host. Meaning, in your case, if you have used rsync to transfer files to /home/my-pc/doc/ and then change or delete files on 10.11.12.123, you can simply call rsync again and have the changes/deletions reflected in /home/my-pc/doc/. (Look at the several flavors of the --delete option for details in rsync --help or in man 1 rsync.)
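For example, a sketch of a true mirror run using the --delete option just described (same placeholder host and paths):

# Propagate remote deletions too: files removed on 10.11.12.123 are also removed locally
rsync -uav --delete 10.11.12.123:/other-pc/document /home/my-pc/doc/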
For these, and many more reasons, it is well worth the time to familiarize yourself with rsync. It is an invaluable tool in any Linux user's hip pocket. Hopefully this will solve your problem and get you started.
Footnotes
[1] the same folks that "Opened Windows to a Wider World", allowing seamless connection between Windows/Linux hosts via the native Windows Server Message Block (SMB) protocol. samba.org
If the two directories you mentioned (document and /home/my-pc/doc/) are on the same 10.11.12.123 machine,
then:
cp -ai document /home/my-pc/doc/
else:
scp -r document/ root@10.11.12.123:/home/my-pc/doc/

How can I upload an entire folder that contains other folders, using sftp on Linux?

I have tried put -r directory/*, which only uploaded the files and not the folders, and gave me the error: Couldn't canonicalise.
Any help would be greatly appreciated.
For people actually wanting a direct answer to this question (instead of being told to use something other than sftp)...
put -r local/path/to/directoryName
The uploaded directory must already exist in the working directory on the server, so you might need to create it first:
mkdir directoryName
Here you can find a detailed explanation of how to copy a directory using scp. In your case, it would be something like:
$ scp -r foo your_username@remotehost.edu:/some/remote/directory/bar
This will copy the directory "foo" from the local host to the remote host's directory "bar".
Here -r means recursively copy entire directories.
You can also use rcp with similar syntax. The only difference between them is that scp uses secure shell and rcp uses remote shell.
BTW, the "Couldn't canonicalise" error you mentioned appears when the sftp server is unable to access the file/directory given in the command.
UPDATE: For users who want to use put specifically, please refer to Ben Thielker's answer here.
sftp> mkdir source
sftp> put -r source
Uploading source/ to /home/myself/source
Entering source/
source/file1
source/file2
If you have issues using sftp, you can use ncftp.
For CentOS:
yum install ncftp
To copy a whole directory recursively:
ncftpput -R -v -u username -P 21 ftp.server.dev /remote-path/ /localdirectory
Use scp instead. It uses SSH too and can easily handle recursion.

Copy all files on cron run - regular task

I've got a virtual dedicated server (running FreeBSD 6.x) and I need to regularly back up files from one folder to another, say from /home/LOGIN/data/www/mydomain.com/test1 to /home/LOGIN/data/www/mydomain.com/test2. I've tried different approaches:
rsync -a /home/LOGIN/data/www/mydomain.com/test1 /home/LOGIN/data/www/mydomain.com/test2
cp /home/LOGIN/data/www/mydomain.com/test1 /home/LOGIN/data/www/mydomain.com/test2
cp www/mydomain.com/test1 /www/mydomain.com/test2
but none of this worked; it gave error 127 and others.
AFAIK this could also be done with a PHP script run by cron.
What is the better way?
Try
cp /home/LOGIN/data/www/mydomain.com/test1/* /home/LOGIN/data/www/mydomain.com/test2
and make sure the test2 directory exists.
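As an aside, exit code 127 from cron usually means "command not found": cron runs with a minimal PATH, so spell out absolute paths in the crontab entry. A hypothetical entry (the schedule and binary path are illustrative):

# Copy test1 into test2 every night at 02:00, using absolute paths throughout
0 2 * * * /bin/cp -Rp /home/LOGIN/data/www/mydomain.com/test1/ /home/LOGIN/data/www/mydomain.com/test2/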

shell script cd fails even though the path is correct

I need a script to extract a tar archive at a specified location.
I did something simple like:
cp test.tar /var/www/html
cd /var/www/html
tar xvf test.tar
If I execute the commands by hand, everything is OK. If I save them in a .sh file and run it with bash script.sh, I get the following error: ": Not a directory cd: /var/www/html". Any idea why?
Thanks for your time.
Notes: I tried the script on a virtual machine (CentOS 5.5) and it worked fine; the problem occurs on the real machine where I want to use it (I used the same OS disk image and the same configuration as on the virtual machine... this makes it really, really odd for me).
Added: I also tried invoking something like service mysqld start... this also fails, saying that a directory doesn't exist (yet if I run it by hand, it works).
I solved the problem, and it is quite interesting.
I created the script on a virtual machine running under Windows with a CentOS guest; the line ending on Windows is "\r\n" while on Linux it is "\n".
The script worked on the VM because the line endings were correct, while on the second computer, running native Linux, they were incorrect. I recreated exactly the same script on Linux and everything went back to normal.
Note... the mkdir part worked because I used another, simplified script written on Linux.
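A quick way to check for and strip the Windows line endings described above (GNU tools assumed; dos2unix, if installed, does the same job):

# Show line endings: CRLF lines end with ^M when viewed with cat -v
cat -v script.sh | head
# Strip the trailing carriage returns in place
sed -i 's/\r$//' script.sh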
On a related note, I have found that the "~" character does not always expand in scripts, so if you are using it, try replacing it with the full path.
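For instance, tilde expansion only happens when the character is unquoted:

cd ~/Downloads      # expands to /home/<user>/Downloads
cd "~/Downloads"    # fails: looks for a directory literally named "~"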
It looks like your cp might be copying test.tar to a plain file named html under the www directory. Make sure that html exists and is a directory before you try to cp:
mkdir -p /var/www/html   # create the target directory (and any parents) if missing
cp test.tar /var/www/html
cd /var/www/html
tar xvf test.tar
