So I have GitLab installed on our server, and I also followed their guide on how to set up backups.
Goals
1. Create a cron task to back up the data every Tuesday through Saturday at 2:00 AM
2. Upload the created backup file to a Windows mounted drive
3. Remove backup files older than 2 weeks (14 days) on both the local server and the Windows mounted drive
So far only 2½ of my goals are achieved.
For #3, setting gitlab_rails['backup_keep_time'] = 1209600 only cleans up the files on the local server but not the uploaded files on the mounted Windows drive.
What do I need to do so that GitLab cleans both backup locations?
Additional Info
I have used the GitLab CE Omnibus installation.
Currently our version is GitLab CE 9.1.2 df1403f
I couldn't find a way to have GitLab take care of this for me, so I just created another cron task:
0 3 * * * find /path/to/mounted/drive/ -mindepth 1 -maxdepth 1 -name "*_gitlab_backup.tar" -mtime +13 -delete
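For reference, the first two goals can be covered with crontab entries along these lines (a sketch only; the backup path, the mount point, and the use of rsync for the copy are assumptions, not my exact setup):
# Goal 1: create a backup every Tuesday-Saturday (day-of-week 2-6) at 2:00 AM
0 2 * * 2-6 /opt/gitlab/bin/gitlab-rake gitlab:backup:create CRON=1
# Goal 2: copy any new backup archives to the mounted Windows drive half an hour later
30 2 * * 2-6 rsync -a /var/opt/gitlab/backups/ /path/to/mounted/drive/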
Related
I have configured the gitlab.rb file and reconfigured the GitLab server with gitlab-ctl reconfigure to apply the configuration changes.
I generated a GitLab backup with the following command:
gitlab-backup create
On the first try, 6 old backups were deleted. However, I have more backups in the /etc/gitlab/config_backup folder. I made a second attempt with the backup creation command and it did not delete any old backups.
In the /etc/gitlab/config_backup folder a lot of old backups still remain.
By the way, the date configuration of the server is correct.
What can I do in order to delete all the old backups? Do I need to remove them manually?
It appears your backup name is different: note how your Creating backup archive: XXXXX does not match any of your gitlab_config_XXX.tar backup names.
I would hazard that you have some other backup task that is backing up your /etc/gitlab folder (which is never backed up by gitlab-backup, as you can see in your first screen capture).
It would also help if you checked your gitlab_rails['backup_path'] = "/path/here" setting and verified your backup location, which most likely is not, and should not be, /etc/gitlab.
I found a similar issue and had to pass the "--delete-old-backups" argument to get the old backups to purge.
gitlab-ctl backup-etc --delete-old-backups
This wasn't required with the main "gitlab-backup create" call, just with the "gitlab-ctl backup-etc" in my case.
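If you schedule both from cron, a combined entry might look something like this (a sketch assuming the usual Omnibus paths; adjust them to your environment):
# Nightly data backup, followed by the config backup with old-backup purging
0 2 * * * /opt/gitlab/bin/gitlab-backup create CRON=1 && /opt/gitlab/bin/gitlab-ctl backup-etc --delete-old-backups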
By running sudo gitlab-rake gitlab:backup:create we can create a GitLab backup manually. But is there any way to take GitLab backups periodically, for example daily or monthly?
PS: I want the backup file to be in a local folder, not in any cloud storage.
A good start would be the "Configuring cron to make daily backups" section of the GitLab documentation.
To schedule a cron job that backs up your repositories and GitLab metadata, use the root user:
sudo su -
crontab -e
There, add the following line to schedule the backup for every day at 2 AM:
0 2 * * * /opt/gitlab/bin/gitlab-rake gitlab:backup:create CRON=1
I have two machines, both running Linux with CentOS 7.
I have installed the rsync packages on both of them and I am able to sync a directory from one machine to the other.
Right now I am doing the syncing manually; each time I want to sync, I run the following line:
rsync -r /home/stuff root@123.0.0.99:/home
I was wondering if there is a way to configure rsync to do the syncing automatically, either every so often or, preferably, whenever there is a new file or subdirectory in the home directory?
Thank you for your help.
Any help would be appreciated.
If you want to run rsync every so often, you can use cron jobs, which can be configured to run a specific command at a given interval. If you want to run rsync whenever there is an update or modification, you can use lsyncd. Check this article about using lsyncd.
Update:
As links might get outdated, I will add this brief example (you are free to modify it to whatever works best for you):
First create an SSH key on the source machine and then add the public key to the ~/.ssh/authorized_keys file on the destination machine.
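A minimal sketch of that key setup, assuming the host and user from the config below:
# On the source machine: generate a key pair (no passphrase, so lsyncd/cron can use it)
ssh-keygen -t rsa -f ~/.ssh/id_rsa -N ""
# Install the public key into ~/.ssh/authorized_keys on the destination machine
ssh-copy-id root@123.0.0.99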
On the source machine, update the ~/.ssh/config file with the following content:
# ~/.ssh/config
...
Host my.remote.server
    IdentityFile ~/.ssh/id_rsa
    IdentitiesOnly yes
    HostName 123.0.0.99
    User root
    Port 22
...
And configure lsyncd with the following, then restart the lsyncd service (restart command shown after the config):
# lsyncd.conf
...
sync {
    default.rsyncssh,
    source = "/home/stuff",
    host = "my.remote.server",
    targetdir = "/home/stuff",
    excludeFrom = "/etc/lsyncd/lsyncd.exclude",
    rsync = {
        archive = true,
    }
}
...
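On CentOS 7 the restart mentioned above would typically be done through systemd (the unit name is an assumption and may differ on your install):
# Reload the new configuration
sudo systemctl restart lsyncd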
You can set up an hourly cron job to do this.
rsync in itself is quite efficient in that it only transfers changes.
You can find more info in the cron documentation.
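A minimal sketch of such a cron entry, reusing the paths and host from the question (the -a/-z flags and passwordless key-based SSH are assumptions):
# Hourly one-way sync of /home/stuff to the remote machine
0 * * * * rsync -az /home/stuff root@123.0.0.99:/home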
I want to back up my Lacie OS 3.x 4 TB NAS to a remote server using the native web interface.
The best solution for me would be to use rsync; unfortunately I do not have SSH shell access on the device.
I tried to back up my device with a "compatible rsync server", but without success:
Going to Backup > New Backup, Network backup, selecting all my shares, Rsync compatible server.
I'm typing in working SSH credentials for my Debian backup server (which has rsync 3.0.9), but it doesn't list any rsync destination, so I can't continue the backup schedule.
The web interface also provides an option for a "NetBackup Server", but I don't know how to install that on Debian (I'm not sure it's the Symantec product).
Also, the NAS provides working SFTP access, but I only want to back up modified files (because backing up 4 TB each time is a bit greedy).
Any solution?
With some help, I finally discovered that rsync can be used as a daemon with preconfigured destinations:
On the Debian side, by creating an /etc/rsyncd.conf containing:
lock file = /var/run/rsync.lock
log file = /var/log/rsyncd.log
pid file = /var/run/rsyncd.pid
[documents]
path = /home/juan/Documents
comment = The documents folder of Juan
uid = juan
gid = juan
read only = no
list = yes
auth users = rsyncclient
secrets file = /etc/rsyncd.secrets
hosts allow = 192.168.1.0/255.255.255.0
And an /etc/rsyncd.secrets file containing:
rsyncclient:passWord
user:password
Do not forget to restrict its permissions:
chmod 600 /etc/rsyncd.secrets
And then launch
rsync --daemon
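To check that the daemon is reachable, a client on the allowed subnet can list the module with the double-colon daemon syntax (the Debian host's address here is just an example):
# List the contents of the [documents] module as the configured auth user
rsync --list-only rsyncclient@192.168.1.10::documents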
After that, I could finally see the rsync destination when configuring the backup on my NAS.
Source: http://www.jveweb.net/en/archives/2011/01/running-rsync-as-a-daemon.html
I want to somehow automatically upload files every 5 minutes. I want to upload/transfer the files from my Linux VPS to my web host.
What I'm trying to do is upload some log files generated on my VPS to my web host so administrators can access them behind an .htaccess file.
Use wput along with cron to FTP files to your host:
wput [options] [file]... ftp://[username[:password]@]hostname[:port][/[path/][file]]
You will probably have to install the tool, as it's not included by default (at least it hasn't been on most of my installs).
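A sketch of the matching cron entry, with the log file, host, and destination folder as placeholders:
# Every 5 minutes: upload the log file to the web host over FTP
*/5 * * * * wput /var/log/myapp.log ftp://username:password@ftp.example.com/logs/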
You'll want to set up a cron job for this. The Wikipedia page for this has a nice overview of how the crontab file is laid out. However, you should check your distribution's documentation for better information (they could be using a different version or a completely different cron daemon).
The line you'd add to the crontab would look something like this:
*/5 * * * * <user to run command as> <your command>
See also: http://www.unixgeeks.org/security/newbie/unix/cron-1.html
Hopefully your web host provides SCP or FTP servers to allow you to copy files over. How do you transfer files when you're uploading your web site files?
If it's FTP, use the ftp command:
ftp -u user:password@host/destination_folder/ sourcefile.txt
If it's SCP, use the scp command:
scp foobar.txt username@host:/some/remote/directory
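To run either of these from cron every 5 minutes without a password prompt, you need non-interactive authentication; for scp that means an SSH key (a sketch with placeholder paths, assuming the public key is already installed on the host):
# Every 5 minutes: copy the log file to the web host over SSH
*/5 * * * * scp -i /home/me/.ssh/id_rsa /var/log/myapp.log username@host:/some/remote/directory/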