I am using OpenVZ Web Panel to manage my virtual machines. For some reason, OVZ Web Panel's "Daily Backup" option only stores a single daily backup of each virtual machine. I have set "Backups to keep" under the user's profile settings to values higher than 1 and to "unlimited", but the setting is ignored, and only one backup copy is rotated every morning. I need at least 7 daily snapshot backups for each virtual machine.
Does anyone know how to make it store more backup copies? I have searched forums, but nobody else seems to have this issue, and the documentation is not clear about it either. I have changed the owner of the virtual machine and restarted OWP, but still no luck.
The easiest (and laziest) way to do this, IMHO, is to write a script and a cron job to run that script.
e.g.
#!/bin/sh
# stamp today's backup with the date, then purge copies older than 5 days
cd /path/to/files
mv filename "filename.$(date +%y%m%d)"
find /path/to/files -name 'filename.*' -mtime +5 -exec rm {} \;
Put that in a script file, where filename is the name of the one file being generated, and then add an entry to e.g. /etc/crontab.
The find that executes the rm deletes files over 5 days old, so you get 5 days' worth of backups; adjust +5 (e.g. to +7) to keep more.
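For example, a crontab entry to run it every morning (the script path /usr/local/bin/rotate-backups.sh and the 06:30 time are placeholders, adjust to taste):
# /etc/crontab takes an extra user field between the schedule and the command
30 6 * * * root /usr/local/bin/rotate-backups.sh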
Probably a lot easier than requesting a feature from the devs of the program, trying to modify it yourself, etc. (unless there is some already-developed feature...)
About a week ago I ran the command mv /* .. on my server. I was trying to move all the files from my current directory to the parent directory, but I ended up wrecking my whole server :)
Is there a way to prevent this from happening again?
I can recommend using a minimalistic file manager like Midnight Commander to transfer files.
However, the answer to your question is no: if you're working with root permissions, you have every ability to destroy your system.
"With great power comes great responsibility." (Benjamin Franklin Parker, known as Uncle Ben)
You can limit the access rights by working as a different user than root and escalating only when needed.
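For example (a sketch; command names vary by distribution, this assumes a Debian-style system):
adduser deploy     # create an unprivileged account
su - deploy        # do day-to-day work as that user, escalate only when needed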
Backups!!!
As mentioned by Chris, you can set up a system with user permissions, group permissions, and so on. On top of this, I have the impression that you missed a dot: mv ./* .. instead of mv /* .. (I hope you did not set the root directory / as your home directory; if you did, change this immediately).
But most of all: regular backups!!! UNIX/Linux doesn't have a system restore the way Windows does, nor is there a recycle bin. Therefore, regular backups (to another machine, obviously) are a must.
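As a minimal sketch of what that could look like (the host name backuphost and the destination path are placeholders; assumes SSH access to the other machine):
rsync -a /etc /home backuphost:/srv/backups/$(hostname)/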
I logged in to my CentOS 7 server (an Azure virtual machine) this morning and found that it's running low on disk space, so I'm looking for unneeded log files and other files that are safe to remove. Is it safe to delete the files under /var/lib/azsec?
Here's a screenshot of the folder:
[screenshot: /var/lib/ folder listing]
Thanks in advance,
Actually, it's not safe to delete them outright, because if you later need to check the logs to investigate something, there will be no logs left to check.
So I suggest making a backup of the logs and storing them somewhere else first. Then you can delete them from the VM to free up space.
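A minimal sketch of that approach (the archive name and the off-box destination backuphost:/srv/log-archive/ are placeholders):
tar -czf /tmp/azsec-logs-$(date +%F).tar.gz /var/lib/azsec
scp /tmp/azsec-logs-$(date +%F).tar.gz backuphost:/srv/log-archive/
rm -rf /var/lib/azsec/*    # only after verifying the copy arrived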
How can I instruct an rsync server to keep a copy of the old versions of files that were updated?
Background info:
I have a simple rsync server running on Linux, which I am using as a backup of a large file system (many TB). Let's call it the backup server.
On the source server, we run daily:
$ rsync -avzc /local/folder user@backup_server::remote_folder
In theory, no files should change on the source server; we should only receive new files. Nonetheless, some updates might be legitimate (very, very seldom). If rsync detects a change, it overwrites the old version of the file on the backup server with the new one. Now, here is the problem: if the change was a mistake, I lose the data and have no way to recover it.
Ideally, I'd like the rsync server to keep a backup of the replaced files. Is there a way to configure that?
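rsync supports exactly this via --backup and --backup-dir: anything that would be overwritten is moved aside instead of being lost. A sketch based on the command above (the old-versions directory name is a placeholder; with a daemon-style :: destination, a relative backup dir is resolved on the receiving side under the module):
$ rsync -avzc --backup --backup-dir=old-versions/$(date +%F) /local/folder user@backup_server::remote_folder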
My backups are local to the same machine (but on a different drive, mounted at /backup/).
I use --backup-dir=/backup/backups-`date +%F`/, but then it starts nesting things rather than leaving a flat set of backups-yyyy-mm-dd/ directories in the /backup/ folder.
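If the dated directories are nesting, the backup dir probably sits inside the tree being synced, so each run sweeps up the previous runs' backups. A sketch that keeps them apart (paths are placeholders):
rsync -a --backup --backup-dir=/backup/old/backups-$(date +%F)/ /source/ /backup/current/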
If someone has a similar issue, there is an easy solution: run a simple cron job that changes the access rights on the destination computer.
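Presumably something along these lines, making the received files read-only so a later transfer cannot silently overwrite them (the path and schedule are guesses):
# /etc/crontab on the backup server
0 3 * * * root chmod -R a-w /backup/remote_folder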
I want to move only the website files changed since the last published revision to a hosting account using SSH or FTP. The hosting account is Linux-based but doesn't have any version control installed, so I can't simply do an update there, and the solution must run on the local development machines.
I'm essentially trying to do what http://www.deployhq.com/ does, but for free. I want to publish changes without having to re-upload everything or manually choose the files to move. I'm open to simply using a bash script that compares versions and copies each file (how? I'm not that great with bash) since we'll be using Linux for development, but something with a web interface would be nice.
Thanks in advance for the help!
This seems more like a job for rsync than for hg, given that the target doesn't have hg installed.
Something like this:
rsync -avz /path/to/local/files/ remote_host:/remote/path/
This would transfer all files recursively from .../local/files/ and place them in /remote/path/. The -a flag preserves file attributes (and implies recursion), while -z compresses data in transit.
rsync takes care of only transferring files that have changed. Be sure to watch for trailing slashes when specifying source paths; they matter (the rsync man page explains the exact semantics).
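For example, the trailing slash on the source decides whether the directory itself or only its contents are copied:
rsync -avz /path/to/local/files remote_host:/remote/path/    # creates /remote/path/files/
rsync -avz /path/to/local/files/ remote_host:/remote/path/   # copies the contents into /remote/path/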
Is there an easy way for me to automatically search "recursively" through a directory and put changed files up through ftp to my live server in their correct spots?
CLI is ideally what I'm looking for.
I'm tired of manually hunting down the files I need to upload and transferring them one by one or in a queue; I'm trying to make this quick and painless.
If the server is under your control, you might want to try rsync instead of FTP.
rsync is the way to go for keeping directories in sync. +1 for Frederic.
The other way to go is change management, like Subversion. Once you set it up, files checked in over time can be brought to production with a simple "svn up" command.
Subversion: http://www.wandisco.com/subversion/os/downloads
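A sketch of that workflow, assuming the live site directory is itself a working copy checked out from your repository (the path is a placeholder):
cd /var/www/site
svn up    # pulls down only the files changed since the last update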