Command line to remove oldest backup - linux

I have the following directory containing multiple Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip files.
/opt/
/opt/files/
/opt/files/private/*
/opt/files/backup.sh
/opt/files/backup.txt
/opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip
I currently manage my backups with a daily cron job: 0 0 * * * cd /opt/files/ && ./backup.sh > /opt/files/backup.txt
As you can imagine, this directory gets bigger and bigger over time. I would now like to create another script (or cron job, if it works as a single command) that deletes the oldest /opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip after 14 days (so that I always have the 14 most recent backups).
It would be great if you could explain your answer.

find /opt/files -name 'Backup*.zip' -a -mtime +14 -ls
If you are satisfied that the files being matched are the ones to delete, replace -ls with -exec rm {} \;
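Since the file names contain spaces, a quoted pattern keeps the match tight. A minimal sketch along those lines (paths taken from the question; run the -print version first and only switch to -delete once the matches look right):
find /opt/files -maxdepth 1 -name 'Backup from *.zip' -mtime +14 -print
find /opt/files -maxdepth 1 -name 'Backup from *.zip' -mtime +14 -delete
Keep in mind that -mtime +14 matches files at least 15 whole days old, so with daily backups this leaves you roughly the 14 or 15 most recent archives.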


Creating cron on RHEL 7 server to delete files older than a week [duplicate]

I want to delete scripts in a folder that are older than 10 days, counting from the current date.
The scripts look like:
2012.11.21.09_33_52.script
2012.11.21.09_33_56.script
2012.11.21.09_33_59.script
The script will run every 10 days via crontab, which is why I need the current date.
find is the common tool for this kind of task:
find ./my_dir -mtime +10 -type f -delete
EXPLANATIONS
./my_dir your directory (replace with your own)
-mtime +10 older than 10 days
-type f only files
-delete no surprise. Remove it to test your find filter before executing the whole command
And take care that ./my_dir exists, to avoid bad surprises!
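For example, a safe workflow is to run the filter without -delete first and add it only once the listed files are exactly the ones you expect:
find ./my_dir -mtime +10 -type f           # dry run: list what would be deleted
find ./my_dir -mtime +10 -type f -delete   # same filter, now actually deleting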
Just spicing up the find command above into a shell script that deletes older files, but with logging and calculation of the elapsed time:
#!/bin/bash
path="/data/backuplog/"
timestamp=$(date +%Y%m%d_%H%M%S)
filename="log_$timestamp.txt"
log="$path$filename"
days=7
START_TIME=$(date +%s)
# delete log files older than $days days, recording what was removed in today's log
find "$path" -maxdepth 1 -name "*.txt" -type f -mtime +"$days" -print -delete >> "$log"
echo "Backup:: Script Start -- $(date +%Y%m%d_%H%M)" >> "$log"
# ... code for backup ... or any other operation, appending its output >> "$log"
END_TIME=$(date +%s)
ELAPSED_TIME=$(( END_TIME - START_TIME ))
echo "Backup :: Script End -- $(date +%Y%m%d_%H%M)" >> "$log"
echo "Elapsed Time :: $(date -u -d "@$ELAPSED_TIME" +%Hh:%Mm:%Ss)" >> "$log"
The code adds a few things:
log files named with a timestamp
the log folder is specified
find looks for *.txt files only in the log folder
-type f ensures you only delete files
-maxdepth 1 ensures you don't descend into subfolders
log files older than 7 days are deleted (assuming this is for a backup log)
it notes the start / end time
it calculates the elapsed time for the backup operation
Note: to test the code, just use -print instead of -print -delete. But do check your path carefully.
Note: ensure your server time is set correctly via date, and that the timezone/NTP are configured correctly. Additionally, check file times with 'stat filename'.
Note: -mtime can be replaced with -mmin for finer control, since -mtime works in whole days and discards fractions (older than 2 days, i.e. +2, actually means at least 3 days old):
-mtime +$days ---> -mmin +$((60*24*$days))
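For example, applied to the script above with days=7, the substitution gives -mmin +10080 (7 × 24 × 60 minutes):
find "$path" -maxdepth 1 -name "*.txt" -type f -mmin +$((60*24*days)) -print -delete >> "$log"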
If you can afford to go purely by the files' modification time, you can simply run
find -mmin +14400 -delete
(14400 minutes corresponds to 10 days.)

How to delete files after X days of lifetime, not X days since last modification. Is it even possible? (Linux)

I run some kind of server on my Linux machine, and I use a simple bash script to delete some files every 3 days and other files every 7 days, using the find command. But my files are saved periodically, so their last modification date is always the current day and the files never get deleted; it only worked the first time, when the conditions were met. I can't find a way to delete those files based on their creation date rather than their modification date.
Here's my simple script:
#!/bin/sh
while true
do
java -server file.jar nogui
echo ">$(tput setaf 3)STARTING REBOOT$(tput sgr0) $(tput setaf 7)(Ctrl+C To Stop!)$(tput sgr0)"
find /folder/* -mtime +7 -exec rm -rf {} \;
find /folder/* -mtime +3 -exec rm -rf {} \;
find /logs/* -mtime +1 -exec rm -rf {} \;
echo ">Rebooting in:"
for i in 5 4 3 2 1
do
echo ">$i..."
sleep 1
done
done
If someone could help me with this, I would be really thankful!
Just an idea, don't shoot... :-)
If the files are not system files automatically generated by some process, but are, let's say, server log files, you could echo the creation date inside the file (e.g. at the end or the beginning) and grep that value later to decide whether the file must be removed or kept.
No, it is not possible. Standard Linux filesystems do not track creation time at all. (ctime, sometimes mistaken for creation time, is metadata change time -- as compared to mtime, which is data modification time).
That said, there are certainly ugly hacks available. For instance, if you have the following script invoked by incron (or, less efficiently, cron) to record file creation dates:
#!/bin/bash
mkdir -p .ctimes
for f in *; do
    if [[ -f $f ]] && [[ ! -e .ctimes/$f ]]; then
        touch ".ctimes/$f"
    fi
done
...then you can look for files in the .ctimes directory that are older than three days, and delete both the markers and the files they stand for:
#!/bin/bash
find .ctimes -mtime +3 -type f -print0 | while IFS= read -r -d '' filename; do
    realname=${filename#.ctimes/}
    rm -f -- "$filename" "$realname"
done
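A hypothetical way to glue the two scripts together with plain cron (the script names and the /folder path are placeholders, not from the question); incron would react to file creation immediately instead of polling:
*/5 * * * * cd /folder && /usr/local/bin/record_ctimes.sh    # create a marker for every new file
0 4 * * * cd /folder && /usr/local/bin/purge_by_ctime.sh     # delete files whose marker is older than 3 days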
If you are on an ext4 filesystem there is some hope: ext4 stores the creation time in the inode table entry i_crtime ('file creation time, in seconds since the epoch', per the docs), and you can retrieve it using the stat and debugfs utilities.
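For example, a rough sketch of reading the creation time on ext4 (the device name and file name are assumptions; debugfs needs root):
stat myfile.log                        # recent coreutils/kernels show a "Birth:" line
inode=$(stat -c %i myfile.log)         # get the file's inode number
debugfs -R "stat <$inode>" /dev/sda1   # look for the crtime: line in the output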

Exclude directories and delete old backup

I am using the following simple script to take a backup of all of my websites via tar:
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
#Parent backup directory
backup_parent_dir="/backup/httpdocs"
#Create backup directory and set permissions
backup_date=`date +%Y_%m_%d_%H_%M`
backup_dir="${backup_parent_dir}/${backup_date}"
echo "Backup directory: ${backup_dir}"
mkdir -p "${backup_dir}"
chmod 755 "${backup_dir}"
SRCDIR=/var/www/sites #Location of Important Data Directory (Source of backup).
tar -cpzf $backup_dir/$FILENAME $SRCDIR
Now, this is working fine, but I need 2 things if they can be done via the same script:
Can I exclude some folders within the /var/www/sites directory? For example, if I don't want the /var/www/sites/abc.com/logs folder to be backed up, can I define it and some other sub-directories within this script?
This script tars all sites into the specified folder /backup/httpdocs through a cron job which runs nightly, and I have to delete old tarballs (older than 7 days) manually. Is there any possibility, through the same script, to check whether any backup older than 7 days exists and delete it automatically?
EDIT:
Thanks everyone, this is what I am using now; it takes the backup excluding log files and deletes anything older than 7 days:
#!/bin/bash
#START
TIME=`date +%b-%d-%y` # This Command will add date in Backup File Name.
FILENAME=backup-$TIME.tar.gz # Here i define Backup file name format.
# Parent backup directory
backup_parent_dir="/backup/httpdocs"
# Create backup directory and set permissions
backup_date=`date +%Y_%m_%d_%H_%M`
backup_dir="${backup_parent_dir}/${backup_date}"
echo "Backup directory: ${backup_dir}"
mkdir -p "${backup_dir}"
chmod 755 "${backup_dir}"
SRCDIR=/var/www/vhosts # Location of Important Data Directory (Source of backup).
tar -cpzf $backup_dir/$FILENAME $SRCDIR --exclude=$SRCDIR/*/logs
find ${backup_parent_dir} -name '*' -type d -mtime +2 -exec rm -rfv "{}" \;
#END
Exclude from tar:
tar -cpzf $backup_dir/$FILENAME --exclude=/var/www/sites/abc.com/logs $SRCDIR
Find and delete old backups:
find ${backup_parent_dir} -type f -name 'backup-*.tar.gz' -mtime +7 -delete
The filter of find is conservative: selecting names that match backup-*.tar.gz probably renders the -type f (files only) option useless; I added it just in case you also have directories with such names. The -mtime +7 option is to be checked by you, because "older than 7 days" is not accurate enough. Depending on what you have in mind it may be +6, +7 or +8; please have a look at the find man page and decide for yourself. Note that the selection of backups to delete is not based on their names but on their date of last modification. If you modify them after they are created, it may not be what you want. Let us know.
Use the --exclude option
tar -cpzf $backup_dir/$FILENAME $SRCDIR --exclude=$SRCDIR/*/logs
Name your backup files with an identifier that is derived from the day of the week. This will ensure the new file for each day will overwrite any existing file.
Day:
%a Day of the week - abbreviated name (Mon)
%A Day of the week - full name (Monday)
%u Day of the week - number (Monday = 1)
%d Day of the month - 2 digits (05)
%e Day of the month - digit preceded by a space ( 5)
%j Day of the year - (001-366)
%w Day of the week - number (Sunday = 0)
From: http://ss64.com/bash/date.html
For example:
TARGET_BASENAME=$(date +%u)
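A rough sketch of how that could plug into the backup script from the question (reusing $backup_dir and $SRCDIR from above; the backup- prefix is just an example):
TARGET_BASENAME=$(date +%u)   # 1 = Monday ... 7 = Sunday
tar -cpzf "$backup_dir/backup-$TARGET_BASENAME.tar.gz" "$SRCDIR"
After seven days the same weekday number comes around again, so the previous week's archive is simply overwritten.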
Option 1, when there are not many files/directories to exclude:
tar -cpzf $backup_dir/$FILENAME --exclude=$SRCDIR/dir_ignore --exclude=$SRCDIR/*.log $SRCDIR
Or, if you have many entries to exclude, it is much better done in a file:
tar -cpzf $backup_dir/$FILENAME -X /path/to/exclude.txt $SRCDIR
where the /path/to/exclude.txt file looks like:
/var/www/dir_to_ignore
/var/www/*.log
You cannot use variables in the exclude file, but you can use wildcards.
The second question was answered very well by the previous answers; I personally love
find ${backup_parent_dir} -type f -name 'backup-*.tar.gz' -mtime +7 -delete

Remove log files using cron job

Hi. I want to remove all log files older than 7 days from a folder, but leave all the other files. Can I use the command below? How do you specify that it only deletes files with the .log extension?
find /path/to/file -mtime +7 -exec rm -f {} \;
Do I need to write this command into some file, or can I just type it at the command prompt and have it run automatically every day?
I have no idea how to run a cron job in Linux.
Use a wildcard, and just put it in your crontab; use crontab -e to edit your crontab jobs.
See example:
* * * * * find /path/to/*.log -mtime +7 -exec rm -f {} \;
To complement the answer, here is a quick rundown of how to work with your crontab in Linux.
You edit your personal crontab by running crontab -e.
This gets saved to /var/spool/cron/<username>. The file is named after the owner's username, so root's would be /var/spool/cron/root. Everything in the file is run as the owner of the file.
The syntax for crontab is as follows:
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root
HOME=/
# For details see man 4 crontabs
# Example of job definition:
# .---------------- minute (0 - 59)
# | .------------- hour (0 - 23)
# | | .---------- day of month (1 - 31)
# | | | .------- month (1 - 12) OR jan,feb,mar,apr ...
# | | | | .---- day of week (0 - 6) (Sunday=0 or 7) OR sun,mon,tue,wed,thu,fri,sat
# | | | | |
# * * * * * user-name command to be executed
When you are editing your own personal crontab, via crontab -e, you leave out the user-name field, because the user is inferred by the filename (see first paragraph).
That being said, your entry should look like this:
0 5 * * * find /path/to/*.log -mtime +7 -delete
This will run every day, at 5:00 AM, system time. I don't think you need it to run any more frequently than daily, given the fact that you are removing files that are 7 days old.
Please don't overuse the -exec option when the -delete option does exactly what you want. -exec spawns a new process for every file and is excessively wasteful of system resources.
When you are done, you can use crontab -l to list your personal crontab.
P.S. The default editor on most Linux systems is vi. If you do not know vi, use something simple like nano by setting your environment variable: export EDITOR=nano
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
To create a cron job, put a file containing the following in the /etc/cron.daily dir:
#!/bin/sh
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
You should use crontab -e to edit your crontab and schedule the job. It might look something like this:
0 1 * * * /usr/bin/find /path/to/file -name '*.log' -mtime +7 -exec rm -f {} \;
This will recursively remove all .log files under /path/to/file every day at 1 AM.
Since this is about log files, you should look at logrotate. It runs daily from a system cron job and will rotate logs for you based on rules in /etc/logrotate.conf, which usually includes the /etc/logrotate.d directory. So there is no need for crontab or find.
You can also run it from your own cron job, with your own configuration, if you have no access to add a file to /etc/logrotate.d.
There are plenty of examples in /etc/logrotate.d.
It expects your application to write to a single file; it is not for an application that logs to a different file each day, and an application generally need not do that. If the application keeps the log file open, logrotate can run a postrotate script to tell the application to reopen the log file.
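For illustration, a minimal hypothetical /etc/logrotate.d/myapp entry (the log path and service name are assumptions, not from the question):
/var/log/myapp/*.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    postrotate
        systemctl reload myapp > /dev/null 2>&1 || true
    endscript
}
rotate 7 keeps seven rotated copies so old logs age out automatically, and the postrotate hook tells the application to reopen its log file.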
You guys are doing it the HARD way. Try using the clear command:
* * * * 0 clear > /home/user/CronLog.txt
where 0 is Sunday (6 would be Saturday; 7 also means Sunday). The ">" truncates the log, as opposed to ">>", which appends to it. If the log file belongs to root, type in "root" before "clear" (system crontab syntax), like this:
* * * * 0 root clear > /home/user/CronLog.txt
After googling around on this particular topic, I found that many people recommend using the -delete option like so:
* * * * * find /path/to/*.log -mtime +7 -delete;
The benefits of this version are that it is easy to remember and that it performs better, since -exec spawns a new process for every file that is to be deleted.
Here are some references:
https://linuxaria.com/howto/linux-shell-how-to-use-the-exec-option-in-find-with-examples
https://unix.stackexchange.com/questions/167823/find-exec-rm-vs-delete
This will delete log files older than 7 days:
* * * * * find /path/to -name '*.log' -mtime +7 -exec rm -f {} \;
This will delete log files older than 30 minutes:
* * * * * find /path/to -name '*.log' -mmin +30 -exec rm -f {} \;

deleting old files using crontab

I use the following crontab entry to back up my DB daily:
0 2 * * * MYSQL_PWD=password mysqldump -u user db_name > $HOME/db_backups/db_name-$(date +\%Y-\%m-\%d-\%H-\%M).sql 2>> $HOME/db_backups/cron.log
I want to add another crontab entry that will delete the DB dumps that are older than one month.
Any thoughts?
find /db_backups/ -mtime +30 -delete
This command would delete DB backups older than 30 days.
Just create another cron:
0 3 * * * find $HOME/db_backups -name "db_name*.sql" -mtime +30 -exec rm {} \; >> $HOME/db_backups/purge.log 2>&1
It will find all backups older than 30 days and delete them.
There is a tool called tmpreaper that securely deletes files matching certain criteria, such as an access or modification date n days in the past.
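For instance, a rough sketch using the backup directory from the question (check the tmpreaper man page on your distribution; --test only reports what would be removed):
tmpreaper --test 30d "$HOME/db_backups"
tmpreaper 30d "$HOME/db_backups"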
