How to keep the history of files changed on an rsync server - linux

How can I instruct an rsync server to keep a copy of the old versions of the files that were updated?
Background info:
I have a simple RSYNC server running on Linux which I am using as a backup of a large file system (many TB). Let's call it the backup server.
On the source server, we run daily:
$ rsync -avzc /local/folder user@backup_server::remote_folder
In theory, no files should be changed on the source server; we should only receive new files. Nonetheless, it is possible that some updates are legitimate (very, very seldom). If rsync detects the change, it overwrites the old version of the file on the backup server with the new one. Now, here is the problem: if the change was a mistake, I lose the data and have no way to recover it.
Ideally, I'd like that rsync server keeps a backup of the replaced files. Is there a way to configure that?

My backups are local to the same machine (but on a different drive, mounted at /backup/).
I use --backup-dir=/backup/backups-`date +%F`/ but then it starts nesting things inside each other rather than giving me a flat set of backups-yyyy-mm-dd/ directories in the /backup/ folder.
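A minimal sketch of the kind of invocation that avoids the nesting, assuming the synced data lives under /backup/data/ and the dated backup directories are kept outside that tree so later runs don't pick them up (all paths here are illustrative):

# keep replaced files in a dated directory *outside* the sync destination
rsync -avzc --backup --backup-dir=/backup/history/backups-$(date +%F)/ \
    /local/folder /backup/data/

With the backups kept out of the destination, each day's replaced files land in their own backups-yyyy-mm-dd/ directory instead of being swept up by the next run.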

If someone has a similar issue, there is an easy solution:
Run a simple cron job that changes access rights on the destination computer.

Related

How to monitor a directory for file changes without using inotifywait?

I use a VM for development, and my IDE runs on the host. I have discovered that inotifywait does not work with shared folders, as I am sharing a local folder with my Linux guest using VirtualBox.
Basically, I have a simple bash script which needs to watch a directory and wait for any file changes. Inotifywait would be the best option but I cannot get it to work with my shared folder.
I was wondering if there is another option for my problem?
Depending on the sizes of the files and the nature of the changes, you could:
Create a checksum (MD5, CRC, SHA-256) of the files and watch for changes (see the sketch after this list)
Check the size of the files and watch for changes
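A rough illustration of the checksum approach, polling with md5sum (the directory path and interval are arbitrary):

#!/bin/bash
# poll a directory and print files whose checksums change between passes
WATCH_DIR=/path/to/shared/folder
OLD=$(mktemp)
find "$WATCH_DIR" -type f -exec md5sum {} + | sort > "$OLD"
while true; do
    sleep 5
    NEW=$(mktemp)
    find "$WATCH_DIR" -type f -exec md5sum {} + | sort > "$NEW"
    diff "$OLD" "$NEW"    # differing lines are added, changed, or removed files
    mv "$NEW" "$OLD"
done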

How can I work on files on my server and keep them in sync?

I have set up a development web server using VMWare and Debian. It's all set up fine, but I have a problem.
I need to be able to work with the files on the server, or a copy of them. But, it's important that both sets of files are in sync. For example, in my text editor if I'm working on index.php I don't want to have to upload with FTP each time, and I don't want to manually keep track of what files I've edited etc.
Any ideas on how I can achieve this?
Besides version control, you can achieve this with sshfs. It basically lets you mount a remote directory on your local system.
More info:
http://en.wikipedia.org/wiki/SSHFS
https://www.digitalocean.com/community/tutorials/how-to-use-sshfs-to-mount-remote-file-systems-over-ssh
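For example, something like this (hostname and paths are placeholders):

# mount the remote web root locally so the editor works on it directly
mkdir -p ~/mysite-mount
sshfs user@devserver:/var/www/mysite ~/mysite-mount
# ... edit files under ~/mysite-mount as if they were local ...
# unmount when finished
fusermount -u ~/mysite-mount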
After much searching I felt the best solution for my case is to use lsyncd to upload files to the development server anytime a change is made.
Although I use Git, I felt that setting up a Git server and having to commit and push every time I make a change isn't what I want to be doing. Using lsyncd I'm able to use Git on my local machine to keep track of the project.
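In case it helps, lsyncd can be started without a config file for a simple one-directory setup, roughly like this (host and paths are placeholders, and the exact flags may differ between lsyncd versions):

# watch the local working copy and push changes to the dev server over ssh
lsyncd -rsyncssh ~/projects/mysite devserver /var/www/mysite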

Deploy Mercurial Changes to Website Hosting Account

I want to move only the website files changed since the published revision to a hosting account using SSH or FTP. The hosting account is Linux based but does not have any version control installed, so I can't simply do an update there, and the solution must run on the local development machines.
I'm essentially trying to do what http://www.deployhq.com/ does, but for free. I want to publish changes without having to re-upload everything or manually choose the files to move. I'm open to simply using a bash script that compares versions and copies each file (how? not that great with bash) since we'll be using Linux for development, but something with a web interface would be nice.
Thanks in advance for the help!
This seems more like a job for rsync than one for hg, given that the target doesn't have hg installed.
Something like this:
rsync -avz /path/to/local/files/ remote_host:/remote/path/
This transfers all files recursively (recursion is implied by -a) from .../local/files/ and places them in /remote/path/. The -z compresses the data in transit and -a preserves file attributes.
rsync takes care of only transferring files that have changed. Be sure to watch for trailing slashes when specifying source paths; they matter (see the example below).
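To illustrate the trailing-slash point, using the same placeholder paths as above:

# copies the *contents* of files/ into /remote/path/
rsync -avz /path/to/local/files/ remote_host:/remote/path/
# copies the directory itself, ending up as /remote/path/files/
rsync -avz /path/to/local/files remote_host:/remote/path/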

ftp push changed files to server

Is there an easy way for me to automatically search "recursively" through a directory and put changed files up through ftp to my live server in their correct spots?
CLI is ideally what I'm looking for.
I'm tired of manually searching out the files I need to upload and doing it individually or in a queue; I'm trying to make this quick and painless.
If the server is under your control, you might want to try rsync instead of FTP.
rsync is the way to go for keeping directories in sync. +1 for Frederic.
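For example, assuming SSH access to the live server (host and paths are placeholders):

# push only changed files, preserving attributes and deleting removed ones
rsync -avz --delete /path/to/local/site/ user@liveserver:/var/www/site/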
The other way to go is change management, like Subversion. Once you set it up, files checked in over time can be brought to production with a simple "svn up" command.
Subversion: http://www.wandisco.com/subversion/os/downloads

How to do version control via ftp?

I have a web dev. client using a shared host that doesn't allow shell access, and thus no access to SVN, Git, etc. I've tried to convince him to move to one of the many cheap options that allow it, but he won't do it. If I use version control on my staging server, are there any tools that will allow me to replicate the changes to production via ftp? Locally I have both mac & windows, the staging server is linux, so something that works on any of those platforms....
Using your Linux staging server you could keep a separate checked-out copy that you use specifically for that host, and then use a utility to mirror that directory to the host server.
LFTP is useful for this kind of thing. It's available for most Linux distributions and includes a 'mirror' function:
Mirror specified source directory to local target directory. If target directory ends with a slash, the source base name is appended to target directory name. Source and/or target can be URLs pointing to directories.
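A rough example of how that looks in practice (server name, user, and paths are made up):

lftp -u deployuser ftp.example.com
# inside the lftp session, push the checked-out copy up to the web root
mirror -R --only-newer /var/checkouts/site /public_html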
Some kind of FTP mirror software is what you need. I haven't tested it, but a quick search gave me this Java application. You could run that over your up-to-date checked-out repository.
A good tool for keeping an SVN repo and an FTP copy in sync is svn2web. May I suggest creating a separate branch for the production copy and doing merges to that branch for uploading to the production server.
You probably need to write a batch file (see the sketch after this list) that is able to:
Export the SVN repository
Upload the exported files to your Linux server via FTP
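A bare-bones sketch of such a script, using lftp for the upload step (repository URL, credentials, and paths are placeholders):

#!/bin/bash
# 1. export a clean copy of the repository (no .svn folders)
svn export --force http://svnserver/repos/mysite/trunk /tmp/mysite-export
# 2. mirror the exported tree up to the web host over FTP
lftp -u deployuser,secret -e "mirror -R --only-newer /tmp/mysite-export /public_html; quit" ftp.example.com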
Short of finding / implementing some FUSE-based CoW file system that supports immutable versions... I'd just find another (more developer-friendly) host. As far as I know, no FTP server supports this natively, nor can I think of any elegant means of putting it in place with script hackery.
I could be wrong.
This question (and answer) really helped me just now as I implemented version control via gitolite on a separate server and lftp.
Here’s what I did:
Set up gitolite on my Ubuntu staging server
Created a bare repo (i.e. foo.git) on the staging server
Cloned foo.git into a working directory on the staging server
Cloned foo.git into a working directory on my local development machine
Developed locally
Pushed changes to the foo.git repo on the staging server
On the staging server, went to the working directory and pulled in the changes from foo.git
lftp-ed into the shared host (like you mention above)
Once in the shared host, ran:
mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory
Notes on the mirror command options:
-R - this pushes the source/directory to the target/directory (without it, mirror pulls from the target to the source; think reverse).
--only-newer - without this option, even if you only changed one file, the mirror command sends all the files in the source directory over to the target directory. With this option, only the changed (newer) files are transferred over the wire.
--delete - deletes files that are no longer in the source directory but are still in the target directory. One of my pushes involved deleting expired assets; without this option, those files would have stayed put on my shared host after executing the mirror command.
--parallel=10 - transfers 10 files at once (instead of 1 by default). This made the process much faster.
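For what it's worth, the same mirror call can also be run non-interactively, which is handy inside a deploy script (user, host, and paths are placeholders):

lftp -u deployuser -e "mirror -R --only-newer --delete --parallel=10 /source/directory/ /target/directory; quit" ftp.sharedhost.example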
While this is what worked for me, I'm sure there are ways to improve on this. I was grateful for this question and thought I'd share my experience.
rsync will do this, although it works over SSH or its own daemon rather than a plain FTP connection. You probably already have it installed if you're on a Unix-like system.
