I am using rsync in cron to copy all files from a web server to a remote server with the following command:
rsync -arvzh /home/foldername/* root@***.***.***.***:/home/foldername >> /var/log/filename`date +%d%m%y`.log 2>&1
This works fine for all folders and files owned by ftpuser and ftpgroup.
However, if any files or folders are added by www-data, it does not copy or sync them.
How can I get it to copy all files and folders recursively, no matter who owns them, please?
Many thanks
P
I have files in a directory: 1500.txt, 1501.txt, 1502.txt, etc.
I have to sync only the last N files from a remote directory to a local directory via SSH, where "last" means the files ordered by name descending (1502.txt, 1501.txt, etc.).
Old files in the local directory (those not copied in this iteration) must be deleted.
LFTP has an order-by-name option, but it doesn't have a file-limit option. Maybe rsync has such an option?
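As far as I know rsync has no built-in "last N by name" selector, but you can build the file list yourself and feed it to rsync with --files-from. A minimal, untested sketch, assuming flat file names without spaces and GNU tools on both ends (the host, paths and N below are placeholders):

#!/bin/bash
remote=user@remote.host      # placeholder
remote_dir=/path/to/remote   # placeholder
local_dir=/path/to/local     # placeholder
N=10

# Build the list of the last N file names (ordered by name, descending) on the remote side.
ssh "$remote" "ls -1 '$remote_dir' | sort -r | head -n $N" > /tmp/last_files.txt

# Copy only the listed files.
rsync -avz --files-from=/tmp/last_files.txt "$remote:$remote_dir/" "$local_dir/"

# Delete local files that were not copied in this iteration.
cd "$local_dir" || exit 1
ls -1 | grep -vxF -f /tmp/last_files.txt | xargs -r rm -f --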
Let me assure you that the operations I'm going to perform are on my live production server, so there is no scope for any error.
The server as well as my local machine are using Linux operating system only.
I'm able to log in through the terminal on my machine with the following command and entering the password:
ssh root@32.77.491.13
Then I typed the following command to go into the directory '/var/www' on the server:
cd /var/www
Now what I want is to create a folder titled 'website_backup_19_02_2016' inside the folder '/var/www/'.
In the newly created folder 'website_backup_19_02_2016' all the files and folders present in '/var/www/' should be copied (except the newly created folder 'website_backup_19_02_2016').
Can someone please provide me the exact set of commands, in sequence, so that by executing them I can take a backup of my website without any hassle?
You can issue the commands below:
1) Create directory
# mkdir /var/www/website_backup_19_02_2016
2) Copy the files, excluding the website_backup_19_02_2016 directory. You can achieve this using the rsync tool.
# cd /var/www/
# rsync -av --progress --exclude='website_backup_19_02_2016/' * website_backup_19_02_2016/
" * " --> for all your files and directories in /var/www/
Note: You can do a dry run with rsync first to check which files and directories will actually be copied. This will be important for you.
# rsync -avn --progress --exclude='website_backup_19_02_2016/' * website_backup_19_02_2016/
Read more options for rsync from man page.
This should work:
cd /var/www
mkdir website_backup_19_02_2016
rsync -av --exclude='website_backup_19_02_2016' /var/www/ /var/www/website_backup_19_02_2016
This answers your question, but if I were you, I would use a different date format (YYYY-MM-DD) that works better for listing and sorting. This would be easy to run in a script:
bck=website_backup_$(date +%Y-%m-%d)
cd /var/www && mkdir "${bck}"
rsync -av --exclude="$bck" /var/www/{,$bck}
What's the command in CentOS to purge/delete all contents in ALL "public_html" folders for all users at once? (I have a web server that I just cloned to make it into a mail server, but I don't want to keep all the public_html files.)
Well, if all the files are under /home and you want to remove all the contents of /home/*/public_html, you could do:
rm -rf /home/*/public_html/*
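If you want to check what that glob matches before deleting anything, you could list it first (just an optional safety check):
ls -d /home/*/public_html/*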
I am trying to create a shell script to copy folders, and the files within those folders, from one Linux machine to another Linux machine. After copying, I would like to delete only the files that were copied. I want to retain the folder structure as is.
E.g.
Machine X has a main folder named F with subfolders A, B, C, each of which contains 10 files.
I would like to make the copy in such a way that machine Y will have a folder named F with subfolders A, B, C containing the same files. Once the copy of all folders and files is complete, it should delete all the files in the source folder but retain the folders.
The code below is untested. Use with care and back up first.
Something like this should get you started:
#!/bin/bash
srcdir=...
set -ex
# Copy the tree to the remote machine, then delete only the files,
# leaving the directory structure in place.
rsync \
    --verbose \
    --recursive \
    "${srcdir}/" \
    user@host:/dstdir/
find "${srcdir}" -type f -delete
Set the srcdir variable and the remote argument to rsync to taste.
The rsync options are just from memory, so they may need tweaking. Read the documentation, especially options regarding deletion, backup, permissions and links.
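One option worth checking is --remove-source-files, which makes rsync delete each file on the sender after it has been successfully transferred, so the separate find step becomes unnecessary. A sketch under the same assumptions (untested, placeholder paths):

#!/bin/bash
srcdir=...
set -ex
# --remove-source-files deletes each transferred file from the sender
# but keeps the directories, which matches the requirement above.
rsync \
    --verbose \
    --recursive \
    --remove-source-files \
    "${srcdir}/" \
    user@host:/dstdir/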
(I'd rather not answer questions that show no signs of effort, but my fingers were itching, so there you go.)
scp the files, check the exit code of the scp and then delete the files locally.
Something like scp files user@remotehost:/path/ && rm files
If scp fails, the second part of the command won't execute.
I am copying a directory and its files to a remote Linux machine using rsync and incrontab.
Copying files to the remote server works fine.
Incrontab
/data/AMOS_SHARE/CHV_BE/ IN_MODIFY,IN_CREATE,IN_DELETE,IN_CLOSE_WRITE,IN_MOVE /data/AMOS/jboss/chv_rsync.sh
Rsync
#!/bin/bash
chmod -R 775 /data/AMOS_SHARE/CHV_BE
rsync -avuzh /data/AMOS_SHARE/CHV_BE/ jboss#xx.xx.xx.xx:/data/AMOS_SHARE/CHV_BE/
I created some files in the /data/AMOS_SHARE/CHV_BE/ folder and it worked fine; creating a folder in there also works fine. But whenever I create files in a subfolder, it doesn't work.
Please help me out.
Recursive monitoring is not implemented in incrond yet, so events in sub-directories are not monitored. You can work around this by adding additional watches for the sub-directories, but I would recommend using
another tool:
Watcher
You can also try the inotifywait tool (example):
inotifywait /tmp/test_dir -m -r
and parse the output of this command.
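For instance, a minimal sketch of such a loop (untested; it reuses the paths and rsync command from the question and assumes inotify-tools is installed):

#!/bin/bash
WATCH_DIR=/data/AMOS_SHARE/CHV_BE
DEST=jboss@xx.xx.xx.xx:/data/AMOS_SHARE/CHV_BE/

# -m keeps watching, -r also watches subdirectories.
# Each output line looks like: <watched-path> <EVENT[,EVENT]> <filename>
inotifywait -m -r -e modify,create,delete,close_write,move "$WATCH_DIR" |
while read -r path events file; do
    rsync -avuzh "$WATCH_DIR/" "$DEST"
done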