Linux bash backup script - 12, 9, 6, 3, 1 months

I am writing a bash backup script, and it's working excellently so far. The problem is that it clutters up my hard drive in no time.
The backup runs weekly on Sundays.
I would like to:
Save the most recent 4 backups
Save the backup which is 3 months old
Save the backup which is 6 months old
Save the backup which is 12 months old
Now, how do I achieve this?
I think I can work out how to "check if a file exists", but I'm having trouble getting my head around how to delete the correct backups.
The backup that is 3 months old will be 3 months and 1 week old by next week - and thus be deleted.
Is there any ingeniously simple way to work around this that I may have overlooked?
Thanks in advance,

If you give your backup files a consistent naming scheme like 10.29.15-BACKUP.zip, you can always do this really easily. Easiest is two separate folders: one for daily backups and one for archives.
So in your bash script:
#BACKUP PROCESS HAPPENS HERE, PLACES BACKUP NAMED 10.29.15-BACKUP.zip in /home/User/DailyBackups FOLDER, WHICH WE WILL CALL $CurrentBackup
#Get Date from 3 months ago
ChkDate=`date --date="-3 months" +%m.%d.%y`
#See if this file exists
ls /home/User/BackupArchive/$ChkDate-BACKUP.zip
#If it does exist then copy current backup to BackupArchive Folder and Remove any backups older than 367 days from the BackupArchive Folder
if [[ $? == 0 ]]; then
cp /home/User/DailyBackups/$CurrentBackup /home/User/BackupArchive/$CurrentBackup
find /home/User/BackupArchive/*-BACKUP.zip -mtime +367 -exec rm {} \;
fi
#Remove all but the most recent 4 Backups
for i in `ls -t /home/User/DailyBackups/*-BACKUP.zip | tail -n +5`; do
rm "$i"
done
I used 367 to account for a 366 day leap year and just in case your one year backup was a bit off, like 366 days and 1 minute.
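The same existence check can be looped over the 3, 6, and 12 month marks in one pass. Here is a sketch, using the same MM.DD.YY-BACKUP.zip naming scheme and folder layout as above (adjust the paths to yours; the cp is left commented out so you can dry-run it first):

```shell
#!/bin/bash
# Sketch: run the archive check for each retention mark (3, 6, 12 months),
# using the MM.DD.YY-BACKUP.zip naming scheme from the script above.
ArchiveDir=/home/User/BackupArchive
DailyDir=/home/User/DailyBackups
CurrentBackup=$(date +%m.%d.%y)-BACKUP.zip

for months in 3 6 12; do
    ChkDate=$(date --date="-$months months" +%m.%d.%y)
    if [ -e "$ArchiveDir/$ChkDate-BACKUP.zip" ]; then
        echo "found the $months month archive ($ChkDate); keeping $CurrentBackup"
        # cp "$DailyDir/$CurrentBackup" "$ArchiveDir/$CurrentBackup"
    fi
done
```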

I had a similar task to delete files up to n date. What I had to do was:
1. Generate an interval date from today's date (like 3 months ago)
[this post has a good writeup about getting specific dates:
http://stackoverflow.com/questions/11144408/convert-string-to-date-in-bash]
2. Loop over all the files in the location and get their time/date stamp with
date -r <filename> +%Y
date -r <filename> +%m
date -r <filename> +%d
3. Compare the file date to the interval date from today's date, and keep the file if it matches or delete it if not.
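The numbered steps can be sketched like this (comparing epoch seconds with date +%s instead of separate year/month/day values, which turns step 3 into a single numeric test; the directory path is a placeholder):

```shell
#!/bin/bash
# Sketch of the steps above: compute an interval date, then compare each
# file's timestamp against it. Epoch seconds make step 3 one comparison.
BackupDir=/home/User/DailyBackups        # placeholder location

# 1. interval date from today's date (3 months ago), as epoch seconds
cutoff=$(date --date="-3 months" +%s)

# 2. loop over the files and get their timestamps
for f in "$BackupDir"/*; do
    [ -e "$f" ] || continue
    ftime=$(date -r "$f" +%s)
    # 3. keep or delete based on the comparison
    if [ "$ftime" -lt "$cutoff" ]; then
        echo "would delete: $f"          # swap echo for rm once verified
    fi
done
```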
Hope this helps you to get the concept going

Suppose you named the backup according to the date:
% date +%Y-%m-%d
2015-10-29
Then you can compute the date one year ago like this:
% date +%Y-%m-%d -d "-1 year"
2014-10-29
and the date 5 weeks ago like this:
% date +%Y-%m-%d -d "-5 weeks"
2015-09-24
So you can set up cron jobs which run every 3 months and every Sunday,
and delete the backups made one year ago and 5 weeks ago, like this:
# Every 3 months (on the 1st), run the backup script
1 2 1 */3 * /path/to/backup/script.sh > /path/to/backupdir/$(date +\%Y-\%m-\%d-Q)
# Every 3 months (on the 1st), delete the backup made on that date the previous year
1 2 1 */3 * /bin/rm /path/to/backupdir/$(date +\%Y-\%m-\%d-Q -d "-1 year")
# Every Sunday, if backup does not exist, run backup script
1 3 * * 7 if [ ! -f /path/to/backupdir/$(date +\%Y-\%m-\%d-Q) ]; then /path/to/backup/script.sh > /path/to/backupdir/$(date +\%Y-\%m-\%d); fi
# Every Sunday, delete backup 5 weeks old
1 3 * * 7 /bin/rm /path/to/backupdir/$(date +\%Y-\%m-\%d -d "-5 weeks")
(In a crontab, a literal % must be escaped as \%; unescaped, cron treats % as a newline.)
Note that
We want to be careful not to run the backup script twice on the same day, for
example, when a quarterly backup happens on a Sunday. If the quarterly backup
cronjob is set to run (at 2:00AM), before the weekly backups (at 3:00AM), then
we can prevent the weekly backup from running by testing if the backup filename
already exists. This is the purpose of using
[ ! -f /path/to/backupdir/$(date +\%Y-\%m-\%d-Q) ]
When we delete backups which are 5 weeks old, we do not want to delete quarterly backups.
We can prevent that from happening by naming the quarterly backups a little differently than the weekly backups, such as with a Q:
% date +%Y-%m-%d-Q
2015-10-29-Q
so that the command to remove weekly backups,
/bin/rm /path/to/backupdir/$(date +\%Y-\%m-\%d -d "-5 weeks")
will not remove quarterly backups.

Related

Calculating the time until the start of next hour in Bash

Hi, I have a bash script called by a systemd service that needs to run until the start of the next hour. Currently I have been using:
currentTime=$(date +"%s")
nextHour=$(date -d "$(date -d 'next hour' '+%H:00:00')" '+%s')
duration=$(((nextHour-currentTime)*1000))
This works except when calculating the difference between 11pm and midnight, where, as far as I can tell, it gets the current day's midnight from 23 hours previous.
Oct 13 23:00:05 host bash[2019]: 1665698405
Oct 13 23:00:05 host bash[2019]: 1665615600
Oct 13 23:00:05 host bash[2019]: -82805000
I figure I could add a conditional check with a different calculation where needed, or perhaps look at a systemd timer for triggering the service, but since the service needs to activate on boot/reboot as well as run hour to hour, this setup seemed more appropriate.
I would appreciate any advice on why this is happening, and on the most streamlined way to avoid it.
This isn't really a question about bash, but seems to be more about date. Given that date has multiple different implementations, it seems wiser to choose a different tool to do the calculation. I suspect perl is more standardized and (almost) as available as date, so you might try to get the difference with:
perl -MTime::Seconds -MTime::Piece -E '
my $t = localtime; my $m = ($t + ONE_HOUR)->truncate(to => "hour"); say $m - $t'
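A pure-bash alternative (not part of the perl approach above) is to sidestep date string parsing entirely with modular arithmetic on the epoch: the next hour boundary is always 3600 - (now mod 3600) seconds away, so midnight needs no special case. Note this assumes the local UTC offset is a whole number of hours, since epoch hours align with UTC:

```shell
#!/bin/bash
# Milliseconds until the next hour boundary, via modular arithmetic on
# the epoch: no date string parsing, no midnight edge case.
now=$(date +%s)
duration=$(( (3600 - now % 3600) * 1000 ))
echo "$duration"
```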

Command to validate between the current day and a 7-day-old date in a bash script - linux

I want to validate that the date given as input is 7 days old or more from the current day, in a bash script on Linux.
This command provides the date 7 days previous to now, in seconds (since 1/1/1970):
minus7=$(date -d "7 days ago" "+%s")
This command provides the date in seconds of the user's input date:
tstDate=$(date -d "$yourInputDate" "+%s")
then
let delta=$tstDate-$minus7
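Putting the pieces together (a sketch; the default input date here is a placeholder for whatever your script reads in): if delta is less than or equal to zero, the input date is 7 days old or more.

```shell
#!/bin/bash
# Sketch: validate that an input date is 7 or more days before today.
yourInputDate=${1:-2020-01-01}          # placeholder demo input
minus7=$(date -d "7 days ago" "+%s")
tstDate=$(date -d "$yourInputDate" "+%s")
delta=$((tstDate - minus7))

if [ "$delta" -le 0 ]; then
    echo "$yourInputDate is 7 or more days old"
else
    echo "$yourInputDate is less than 7 days old"
fi
```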

My cron job should be running only on Sunday, but it's running on other days as well?

I have a cronjob whose end goal is to make a database backup on the first Sunday of every month (and remove the previous month's backup in the process).
Here's what I have it defined as:
0 1 1-7 * * test `date +\%w` -eq 0 && rm /tmp/firstSundayBackup*; mysqldump -u user -ppassword database > /tmp/firstSundayBackup-`date +\%Y-\%m-\%d`.sql
However, looking in my /tmp/ folder I'll see multiple of these backups made during the first week on days that aren't Sunday.
Shouldn't the
test `date +\%w` -eq 0 && REST_OF_JOB
stop the code from running on any day that's not Sunday?
Shouldn't the
test `date +\%w` -eq 0 && REST_OF_JOB
stop the code from running on any day that's not Sunday?
It does, but the && only guards the rm command: the ; terminates that list, so the mysqldump will run regardless.
You can fix this simply by putting brackets around the two commands, after the &&:
test `date +\%w` -eq 0 && (rm /tmp/firstSundayBackup*; mysqldump -u user -ppassword database > /tmp/firstSundayBackup-`date +\%Y-\%m-\%d`.sql)
As a major aside: your job will run every day, because you've told Cron to run it every day. Is there any good reason why you're doing this and then trying to short-circuit most of those executions, rather than just telling Cron to run it on Sundays? That seems like the most intuitive way to approach this, so what would go wrong if you scheduled it as:
0 1 1-7 * 0
I think it's possible to do this without the date test by relying on an edge case in cron's implementation: when the day-of-week field starts with *, cron requires both the day-of-month and day-of-week fields to match, rather than either one. Try 0 1 1-7 * */7 REST_OF_JOB (*/7 still only matches Sunday, but now the 1-7 day-of-month restriction applies as well).

How to write a script for backup using bacula?

I am very new to shell scripting and bacula. I want to create a script that schedules a backup using bacula.
How do I do that?
Any lead is appreciated.
Thanks.
If you are going to administer your own Linux system, learn bash. The man page is really quite detailed and useful. Do man bash.
If you are really new to Linux and command-lines, administering bacula is not for newbies. It is a fairly comprehensive backup system, for multiple systems, with a central database, which means that it is also complex.
There are much simpler tools available on Linux to perform simple system backups, which are just as reliable. If you just want to back up your home directory, tar or zip are excellent tools. In particular, tar can do both full backups and incremental backups.
Assuming that you really want to use bacula and have enough information to write a couple of simple scripts, then even so, the original request is ambiguous.
Do you mean schedule a periodic cron job to accomplish backups unattended? Or do you mean to schedule a single invocation of bacula at a determined time and date?
In either case, it's a good idea to create two simple scripts: one to perform a full backup, and one to perform an incremental backup. The full backup should be run, say, once a week or once a month, and the incremental backup should be run every day, or once a week -- depending on how often your system data changes.
Most modest sites undergoing daily usage would have a daily incremental backup with a full backup on the weekends (say, Sunday). This way, if the system crashed on, say, Friday, you would need to recover with the most recent full backup (on the previous Sunday), and then recover with each day's incremental backup (Mon, Tue, Wed, Thu). You would probably lose data changes that had occurred on the day of the crash.
If the rate of data change was hourly, and recovery at an hourly rate was important, then the incrementals should be scheduled for each hour, with full backups each night.
An important consideration is knowing what, exactly, is to be backed up. Most home users want their home directory to be recoverable. The OS root and application partitions are often easily recoverable without backups. Alternatively, these are backed up on a very infrequent schedule (say once a month or so), since they change much less frequently than the user's home directory.
Another important consideration is where to put the backups. Bacula supports external storage devices, such as tapes, which are not mounted filesystems. tar also supports tape archives. Most home users have some kind of USB or network-attached storage that is used to store backups.
Let's assume that the backups are to be stored on /mnt/backups/, and let's assume that the user's home directory (and subdirectories) are all to be backed up and made recoverable.
% cat <<'EOF' >/usr/local/bin/full-backup
#!/bin/bash
# full-backup SRCDIRS [--options]
# incr-backup SRCDIRS [--options]
#
# set destdir to the path at which the backups will be stored
# each backup will be stored in a directory of the date of the
# archive, grouped by month. The directories will be:
#
# /mnt/backups/2014/01
# /mnt/backups/2014/02
# ...
# the full and incremental files will be named this way:
#
# /mnt/backups/2014/01/DIR-full-2014-01-24.192832.tgz
# /mnt/backups/2014/01/DIR-incr-2014-01-25.192531.tgz
# ...
# where DIR is the name of the source directory.
#
# There is also a file named ``lastrun`` which is used for
# its last mod-time which is used to select files changed
# since the last backup.
PROG=${0##*/} # prog name: full-backup or incr-backup
destdir=/mnt/backups
now=`date +"%F-%H%M%S"`
monthdir=`date +%Y/%m`
dest=$destdir/$monthdir
mkdir -p $dest
while (( $# > 0 )) ; do
dir="$1" ; shift ;
options='' # collect options
while [[ $# -gt 0 && "$1" == --* ]]; do # any options?
options="$options $1"
shift
done
basedir=`basename $dir`
fullfile=$dest/$basedir-full-$now.tgz
incrfile=$dest/$basedir-incr-$now.tgz
lastrun=$destdir/lastrun
case "$PROG" in
full*) archive="$fullfile" newer= kind=Full ;;
incr*) archive="$incrfile" newer="--newer $lastrun" kind=Incremental ;;
esac
cmd="tar cfz $archive $newer $options $dir"
echo "$kind backup starting at `date`"
echo ">> $cmd"
eval "$cmd"
echo "$kind backup done at `date`"
touch $lastrun # mark the end of the backup date/time
done
exit
EOF
(cd /usr/local/bin ; ln -s full-backup incr-backup )
chmod +x /usr/local/bin/full-backup
Once this script is configured and available, it can be scheduled with cron. See man cron. Use crontab -e to create and edit a crontab entry to invoke full-backup once a week (say), and another crontab entry to invoke incr-backup once a day. The following are sample crontab entries (see man 5 crontab for details on syntax) for performing incremental and full backups, as well as removing old archives.
# run incremental backups on all user home dirs at 3:15 every day
15 3 * * * /usr/local/bin/incr-backup /Users
# run full backups every sunday, at 3:15
15 3 * * 7 /usr/local/bin/full-backup /Users
# run full backups on the entire system (but not the home dirs) on the 1st of every month
30 4 1 * * /usr/local/bin/full-backup / --exclude=/Users --exclude=/tmp --exclude=/var
# delete old backup files (more than 60 days old) on the 1st of every month
15 3 1 * * find /mnt/backups -type f -mtime +60 -delete
Recovering from these backups is an exercise left for later.
Good luck.
I don't think it makes sense to have a cron-scheduled script activate Bacula.
The standard way to schedule a backup using bacula is:
1) Install the Bacula file daemon on the machine you want to back up, and then
2) Configure your Bacula Director to schedule the backup
ad 1)
If your machine to backup is Debian or Ubuntu, you can install the Bacula file daemon from the shell like this:
shell> apt-get install bacula-fd (bacula-fd stands for Bacula File Daemon)
If your machine to backup is Windows, then you need to download the Bacula file daemon and install it. You can download it here: http://sourceforge.net/projects/bacula/files/Win32_64/ (select the version that matches your Bacula server version)
ad 2)
You need to find the bacula-dir.conf file on your Bacula server (if you installed Bacula Director on a Ubuntu machine, then the path is : /etc/bacula/bacula-dir.conf)
The bacula-dir.conf schedule section is very flexible and therefore also somewhat complicated, here is an example :
Schedule {
Name = "MonthlyCycle"
Run = Level=Full on 1 at 2:05 # full backup on the 1st of every month at 2:05.
Run = Level=Incremental on 2-31 at 2:05 # incremental backup all other days.
}
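A Schedule only takes effect once a Job references it by name. For completeness, a minimal Job resource in bacula-dir.conf might look like the following sketch (the Client, FileSet, Storage, Pool and Messages names here are placeholders that must match resources defined elsewhere in your configuration):

```
Job {
  Name = "BackupMyMachine"
  Type = Backup
  Client = mymachine-fd
  FileSet = "Full Set"
  Schedule = "MonthlyCycle"
  Storage = File
  Pool = Default
  Messages = Standard
}
```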
Note that there is a lot more configuration necessary to run Bacula; here is a full tutorial on how to install, configure, back up and restore with Bacula: http://webmodelling.com/webbits/miscellaneous/bacula.aspx (disclaimer: I wrote the Bacula tutorial myself)

Keep getting error: "date --date=4 days ago: command not found"

I set up a script on my dedicated server to back up all of my Cpanel backup files to Amazon S3, and I've got it running every night via cron.
It ran perfectly last night and backed everything up, but then proceeded to delete everything it had backed up. It appears to have something to do with the date command: if I pull the "deletion" portion of the script out, put it into another file, and run it as an echo, I can't get the date to echo properly. I keep getting a "command not found" error:
Here's the full code for the backup script:
#!/bin/bash
##Notification email address
_EMAIL=klawncare1212#gmail.com
ERRORLOG=/var/log/s3_backup_logs/backup.err`date +%F`
ACTIVITYLOG=/var/log/s3_backup_logs/activity.log`date +%F`
##Directory which needs to be backed up
SOURCE=/backup/cpbackup/daily
##Name of the backup in bucket
DESTINATION=`date +%F`
##Backup degree
DEGREE=4
#Clear the logs if the script is executed second time
:> ${ERRORLOG}
:> ${ACTIVITYLOG}
##Uploading the daily backup to Amazon s3
/usr/bin/s3cmd -r put ${SOURCE} s3://MK_Web_Server_Backup/${DESTINATION}/ 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
ret2=$?
##Sent email alert
msg="BACKUP NOTIFICATION ALERT FROM `hostname`"
if [ $ret2 -eq 0 ];then
msg1="Amazon s3 DAILY Backup Uploaded Successfully"
else
msg1="Amazon s3 DAILY Backup Failed!!\n Check ${ERRORLOG} for more details"
fi
echo -e "$msg1"|mail -s "$msg" ${_EMAIL}
#######################
##Deleting backup's older than DEGREE days
## Delete from both server and amazon
#######################
DELETENAME=$(dateĀ --date="${DEGREE} days ago" +%F)
/usr/bin/s3cmd -r --force del s3://MK_Web_Server_Backup/${DELETENAME} 1>>${ACTIVITYLOG} 2>>${ERRORLOG}
And here is the sample code that I am testing to try and simply echo the date code above:
#!/bin/bash
##Backup degree
DEGREE=4
#######################
##Deleting backup's older than DEGREE days
## Delete from both server and amazon
#######################
DELETENAME=$(dateĀ --date="4 days ago" +%F)
echo ${DELETENAME}
What am I doing wrong? Every time I run this small test script on my CentOS linux server through SSH I get the following error:
"date --date=4 days ago: command not found"
So, it's not having any trouble inserting the "degree" variable value. And, if I simply take and run the same command at the prompt in SSH (date --date="4 days ago" +%F), it works like a charm, outputting the data just as I would expect it to.
What am I doing wrong?
You are probably picking up different versions of the date command when you run it from a regular terminal vs. from the script under cron, because they use different paths. Either use the full path to the version of the date command you want to use, or explicitly set the path at the beginning of the script.
mydate4=`date -d "4 days ago " +%F`
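For example, pinning the path at the top of the script removes the ambiguity (a sketch; /usr/bin:/bin is a typical PATH, but check where date lives on your system with which date):

```shell
#!/bin/bash
# Pin PATH so the script resolves the same date binary as your login shell.
PATH=/usr/bin:/bin
DEGREE=4
DELETENAME=$(date --date="${DEGREE} days ago" +%F)
echo "${DELETENAME}"
```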
