Take backups for GitLab periodically

By running sudo gitlab-rake gitlab:backup:create we can create a GitLab backup manually. But is there any way to take GitLab backups periodically, e.g. daily or monthly?
PS: I want the backup file to be in a local folder, not in any cloud storage.

A good start would be the documentation section "Configuring cron to make daily backups".
To schedule a cron job that backs up your repositories and GitLab metadata, use the root user:
sudo su -
crontab -e
There, add the following line to schedule the backup for every day at 2 AM:
0 2 * * * /opt/gitlab/bin/gitlab-rake gitlab:backup:create CRON=1
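Since the backups should stay on local disk, it may also help to prune old archives so the partition does not fill up. A hedged sketch of a combined crontab (the 7-day retention window and the default /var/opt/gitlab/backups path are assumptions; GitLab can also expire backups itself via the backup_keep_time setting in /etc/gitlab/gitlab.rb):

```
# Daily backup at 2 AM, as above
0 2 * * * /opt/gitlab/bin/gitlab-rake gitlab:backup:create CRON=1
# Assumed retention policy: delete local backup archives older than 7 days
0 3 * * * find /var/opt/gitlab/backups -name '*.tar' -mtime +7 -delete
```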

Related

Gitlab - sync online repo with a remote server?

I am working on a small team and we are using a Centralized Workflow strategy where we each have a local copy of the project. We then commit and push to Gitlab.
The server actually running the code (ETL scripts) is completely separate though. Currently, we use PyCharm to manage version control and there is a checkbox which syncs the file to a specified remote server. This has worked fine for the most part, but it is not ideal if multiple people are working on the same project - when a commit is made it will transfer and overwrite the file on the remote server.
What is the best way for me to make sure that the remote server which runs the ETL scripts only uses the master branch for each project directly from Gitlab? This way, we can make sure only 'finalized' code is used and we can address any conflicts.
I ended up making a crontab to achieve this and it works.
crontab -e
The schedule below navigates to the parent directory, then loops through all sub-folders and runs git pull every 5 minutes. If anyone sees any huge flaws, let me know.
*/5 * * * * cd /path/to/folder && for i in */.git; do ( echo "$i"; cd "$i/.."; git pull; ); done
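One caveat with a plain git pull: if local edits ever appear on the ETL server, the pull will stop on a merge conflict and the repo silently goes stale. A hedged variant of the same loop, written as a small function instead of a one-liner, that forces each checkout to exactly match origin/master (same folder layout as above; note that reset --hard discards any local changes, which is the point here, but worth knowing):

```shell
#!/bin/sh
# Sketch: reset every repo under a base directory to origin/master.
# Usage: sync_all /path/to/folder   (same layout as the one-liner above)
sync_all() {
    base="$1"
    for gitdir in "$base"/*/.git; do
        repo="${gitdir%/.git}"
        (
            cd "$repo" || exit 1
            git fetch -q origin
            # Force the working tree to match the remote master branch,
            # discarding any local edits made on this server.
            git reset -q --hard origin/master
        )
    done
    return 0
}
```

A crontab entry could then call this script every 5 minutes, just like the one-liner.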

puppet agent run backup of cron files

Looks like I had a wrong entry in the root crontab, and when I ran the puppet agent it got overwritten.
Error: Could not prefetch cron provider 'crontab': Could not parse line "====" at root:10
Is there any way to recover the cron entries that were there before I ran the puppet agent?

Setting up Magento 2.1.1 Cronjob

I am a bit lost on how to set up a cronjob for Magento 2.1.1 in cPanel using Cron Jobs. If someone could guide me that would be awesome. I did a search and found 4 cron.php files in the file manager.
/public_html/store/vendor/magento/magento2-base/pub/cron.php
/public_html/store/pub/cron.php
/public_html/store/update/cron.php
/public_html/store/update/dev/tests/integration/framework/cron.php
So this is what you want to do:
Navigate to Cpanel
Open Cron Jobs
Add a new job. In the Common Settings drop-down, select the Once Per Five Minutes option.
In the Command field, enter wget -q -O /dev/null http://www.example.com/cron.php (make sure you use your own domain name).
Click Add New Cron Job.
Make certain the cron.php file is placed in the root folder of your Magento site.
That's it.
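For reference, the cPanel steps above amount to a crontab line like this (a sketch; www.example.com is the placeholder from the answer and must be replaced with your store's domain, and Magento 2 also documents running bin/magento cron:run directly instead of hitting cron.php over HTTP):

```
# Hypothetical crontab equivalent of the cPanel "Once Per Five Minutes" setting
*/5 * * * * wget -q -O /dev/null http://www.example.com/cron.php
```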

enable rsync to run permanently

I have two machines, both running Linux with CentOS 7.
I have installed the rsync packages on both of them and I am able to sync a directory from one machine to the other.
Right now I am doing the syncing manually; each time I want to sync I run the following line:
rsync -r /home/stuff root@123.0.0.99:/home
I was wondering if there is a way of configuring rsync to do the syncing automatically, either every some amount of time or, preferably, whenever there is a new file or sub-directory in the home directory?
Any help would be appreciated.
If you want to run rsync every some amount of time, you can use cron jobs, which can be configured to run a specific command at a given interval. If you want to run rsync whenever there is an update or modification, you can use lsyncd; check this article about using lsyncd.
Update:
As links might get outdated, I will add this brief example (You are free to modify it with what works best for you):
First, create an SSH key on the source machine, then add the public key to the ~/.ssh/authorized_keys file on the destination machine.
On the source machine, update the ~/.ssh/config file with the following content:
# ~/.ssh/config
...
Host my.remote.server
    HostName 123.0.0.99
    User root
    Port 22
    IdentityFile ~/.ssh/id_rsa
    IdentitiesOnly yes
...
Then configure lsyncd with the following, and restart the lsyncd service:
# lsyncd.conf
...
sync {
    default.rsyncssh,
    source = "/home/stuff",
    host = "my.remote.server",
    targetdir = "/home/stuff",
    excludeFrom = "/etc/lsyncd/lsyncd.exclude",
    rsync = {
        archive = true,
    },
}
...
You can set up an hourly cron job to do this.
rsync in itself is quite efficient in that it only transfers changes.
You can find more info about cron in the cron documentation.
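For the cron route, a hedged example of an hourly entry (the paths and the my.remote.server alias come from the SSH config above; the -a archive flag and --delete mirror behavior are assumptions about wanting an exact copy):

```
# Hypothetical hourly mirror of /home/stuff to the remote host
0 * * * * rsync -az --delete /home/stuff/ my.remote.server:/home/stuff/
```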

Cronjob on Amazon EC2 Deleted?

I had a cronjob set up to run a php script daily, which went well for about a month. Today, I realized it didn't run the script so I opened up the crontab. The crontab is completely empty - what happened?
I don't know too much about cronjobs, but as far as I understand, they do not delete themselves if the server is reset. How can I make sure cronjobs are always running and don't get deleted?
It is probably under a different user. Check the root user's crontab with sudo crontab -e. Each user has its own crontab, and there is one for the whole system as well. Note: through the crontab configuration you can disable per-user crontabs.
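To check whether the job simply lives under another user, here is a hedged sketch that walks every account in /etc/passwd and prints its crontab. This assumes you run it as root (crontab -l -u needs privilege); users without a crontab are silently skipped thanks to the 2>/dev/null:

```shell
#!/bin/sh
# List every user's crontab; users without one are skipped.
list_crontabs() {
    for u in $(cut -d: -f1 /etc/passwd); do
        entries=$(crontab -l -u "$u" 2>/dev/null) || continue
        printf '== %s ==\n%s\n' "$u" "$entries"
    done
    return 0
}
```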
