deleting old files using crontab - linux

I use the following crontab record to back up my DB daily:
0 2 * * * MYSQL_PWD=password mysqldump -u user db_name > $HOME/db_backups/db_name-$(date +\%Y-\%m-\%d-\%H-\%M).sql 2>> $HOME/db_backups/cron.log
I want to add another crontab record that will delete the DB dumps that are older than one month.
Any thoughts?

find $HOME/db_backups/ -type f -mtime +30 -delete
This command deletes DB backups older than 30 days. Note that the dumps from the question land in $HOME/db_backups, not /db_backups, so the path has to match.
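Before wiring a destructive find into cron, it can help to rehearse it with -print first; a small sketch against throwaway files (GNU touch -d assumed, filenames illustrative):

```shell
# Rehearse the cron command on throwaway files before trusting it.
dir=$(mktemp -d)

# One stale dump and one fresh dump (GNU touch -d assumed).
touch -d '40 days ago' "$dir/db_name-2024-01-01-02-00.sql"
touch "$dir/db_name-today.sql"

# Dry run: -print shows what WOULD be deleted.
find "$dir" -name 'db_name*.sql' -type f -mtime +30 -print

# Same predicates, now with -delete: only the 40-day-old dump goes.
find "$dir" -name 'db_name*.sql' -type f -mtime +30 -delete
```

Adding -type f and a -name pattern also keeps cron.log (which lives in the same directory per the question) safe from the purge.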

Just create another cron entry:
0 3 * * * find $HOME/db_backups -name "db_name*.sql" -mtime +30 -exec rm {} \; >> $HOME/db_backups/purge.log 2>&1
It will find all backups older than 30 days and delete them.

There is a tool called tmpreaper that securely deletes files matching certain criteria, such as an access or modification date n days in the past.
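If tmpreaper is available, the whole job can collapse into one crontab line; a sketch assuming Debian's tmpreaper syntax (the 30d age spec and the --test dry-run flag; check man tmpreaper on your system):

```shell
# Dry run from a shell first: tmpreaper --test 30d $HOME/db_backups
0 3 * * * tmpreaper 30d $HOME/db_backups
```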

Related

CRON to Clear out /tmp files older than 30min old

The server's /tmp directory has been filling up with file uploads. These files should be removed once an upload is done, but in some cases they aren't, and I am investigating that issue. Meanwhile, I am looking for a cron task, run every 30 minutes, that deletes files starting with 202 from /tmp once they are 30 minutes old.
So far, I have */30 * * * * rm -rf /tmp/202*
Found this solution. Works pretty well.
0 * * * * find /tmp -regextype posix-egrep -regex '/tmp/[0-9]{8}-[0-9]{6}-.*-file-.*' -type f -mmin +60 -delete
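To convince yourself the regex and the -mmin cutoff do what you expect before scheduling it, a quick sketch against a scratch directory (GNU touch -d assumed; the regex is loosened to `.*/` here since the scratch path is not /tmp):

```shell
dir=$(mktemp -d)

# A stale upload that matches the pattern, and an unrelated file.
touch -d '45 minutes ago' "$dir/20240101-120000-abc-file-upload.tmp"
touch "$dir/keepme.txt"

# -regex must match the whole path; -mmin +30 means "last modified
# more than 30 minutes ago".
find "$dir" -regextype posix-egrep \
     -regex '.*/[0-9]{8}-[0-9]{6}-.*-file-.*' -type f -mmin +30 -delete

ls "$dir"   # only keepme.txt is left
```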

Cron Job to auto delete folder older than 7 days in Centos

I have a folder with large files in CentOS. How can I delete the files older than 30 minutes?
Please suggest your ideas and snippets.
It's simple: just use find and add the following line in crontab:
30 * * * * find /path/to/dir -type f -mmin +30 -exec rm -f {} \;
The above entry will run every 30 minutes and delete ONLY files older than 30 minutes from the directory /path/to/dir.

Command line to remove oldest backup

I have the following directory containing multiple Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip files.
/opt/
/opt/files/
/opt/files/private/*
/opt/files/backup.sh
/opt/files/backup.txt
/opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip
With a daily cronjob 0 0 * * * cd /opt/files/ && ./backup.sh > /opt/files/backup.txt I am currently managing my backups.
As you can imagine, this directory gets bigger and bigger over time. I now would like to create another script (or cronjob if it works with one command) to delete the oldest /opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip after 14 days (so that I have 14 recent backups all the time).
It would be great if you could explain your answer.
find /opt/files -name 'Backup*.zip' -a -mtime +14 -ls
The backups sit directly in /opt/files, so that is the directory to search; this lists the matches first. If you are satisfied the files being matched are the ones to delete, replace -ls with "-exec rm {} \;"
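Since the stated goal is "14 recent backups at all times", an age test (-mtime +14) can over- or under-shoot when backup runs are missed; a count-based sketch that keeps the 14 newest zips instead (assumes filenames without newlines):

```shell
dir=$(mktemp -d)

# Fake 16 daily backups with distinct ages (names illustrative).
for i in $(seq 1 16); do
  touch -d "$i days ago" "$dir/Backup $i.zip"
done

# ls -t sorts newest first; tail -n +15 emits everything past the
# 14 newest, i.e. the backups to discard.
( cd "$dir" && ls -t -- *.zip | tail -n +15 | while IFS= read -r f; do
    rm -- "$f"
  done )

ls "$dir" | wc -l   # 14
```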

Schedule cronjob to remove files modified before X days

I want to create and schedule a cron job to remove files modified more than x days ago.
I have taken the following steps:
Created a shell script (named Script.sh) as below:
#!/bin/sh
15 2 * * 2-6 find /usr/sch/cbm/files/newui/log -type f -mtime +2 exec rm {} \;
I have put this file in /var/spool/cron/crontabs and in the /usr/bin folder, because I wasn't sure where exactly to place it.
When I check with the crontab -e command, I can see the entry.
But I didn't see any effect on my files. I am not sure whether my job is scheduled or whether I still need to do something else.
Please guide me.
crontab <file>
where <file> is the file you created above, will put the file in the right place and set everything up for cron to run it.
crontab -l
will display the list of cron jobs for the currently logged in user.
Note that the file itself does not execute so no need for #!/bin/sh as the first line. It is just a data file that cron interprets.
man 5 crontab
and
man 1 crontab
for more information.
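For example, the entry from the question could be saved as a plain data file, say mycron.txt (name hypothetical), fixing the missing dash on -exec along the way, and installed with crontab mycron.txt:

```shell
# mycron.txt -- crontab data, not a shell script, so no #!/bin/sh line
15 2 * * 2-6 find /usr/sch/cbm/files/newui/log -type f -mtime +2 -exec rm {} \;
```

crontab -l should then show the entry.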
You have to insert the cron job line into the cron file of the user who executes that job.
su - username
crontab -e
add cron job line
save and exit
Example for removing files older than x days:
00 00 * * * find /path/to/folder -mtime +x -exec rm {} \;
Or you can do that only for certain files in the folder.
For example, to remove files ending with .log:
00 00 * * * find /path/to/folder/*.log -mtime +x -exec rm {} \;
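One caveat with the /path/to/folder/*.log form: the shell expands that glob when cron launches the command, so it errors out when nothing matches and never descends into subdirectories. Letting find do the matching with -name avoids both; a small sketch:

```shell
dir=$(mktemp -d)
mkdir "$dir/sub"

# Two old logs (one nested) plus a file that must survive.
touch -d '10 days ago' "$dir/a.log" "$dir/sub/b.log" "$dir/keep.txt"

# -name is evaluated by find itself and recurses into subdirectories.
find "$dir" -name '*.log' -mtime +7 -exec rm -f {} \;

find "$dir" -type f    # only keep.txt remains
```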

Remove log files using cron job

Hi. I want to remove all log files older than 7 days from a folder, but leave all the other files. Can I use the command below? How do I specify that it deletes only the files with a .log extension?
find /path/to/file -mtime +7 -exec rm -f {} \;
Do I need to write this command into some file, or can I just type it at the command prompt and have it run automatically every day?
I have no idea how to run a cron job in Linux.
Use a wildcard, and just put it in your crontab; use crontab -e to edit your crontab jobs.
See example:
* * * * * find /path/to/*.log -mtime +7 -exec rm -f {} \;
Just to complement the answer, check this nice article on how to work with your crontab in Linux.
You edit your personal crontab by running crontab -e.
This gets saved to /var/spool/cron/<username>. The file will be the owners username, so root would be /var/spool/cron/root. Everything in the file is run as the owner of the file.
The syntax for crontab is as follows:
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root
HOME=/
# For details see man 4 crontabs
# Example of job definition:
# .---------------- minute (0 - 59)
# | .------------- hour (0 - 23)
# | | .---------- day of month (1 - 31)
# | | | .------- month (1 - 12) OR jan,feb,mar,apr ...
# | | | | .---- day of week (0 - 6) (Sunday=0 or 7) OR sun,mon,tue,wed,thu,fri,sat
# | | | | |
# * * * * * user-name command to be executed
When you are editing your own personal crontab, via crontab -e, you leave out the user-name field, because the user is inferred by the filename (see first paragraph).
That being said, your entry should look like this:
0 5 * * * find /path/to/*.log -mtime +7 -delete
This will run every day, at 5:00 AM, system time. I don't think you need it to run any more frequently than daily, given the fact that you are removing files that are 7 days old.
Please don't overuse the -exec option when the -delete option does exactly what you want. -exec forks a new process for every file, which is excessively wasteful of system resources.
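-delete is not universally available (historically a GNU/BSD extension); where it is missing, the POSIX `{} +` terminator gets most of the same benefit by passing many paths to a single rm instead of forking one process per file. A sketch:

```shell
dir=$(mktemp -d)
touch -d '10 days ago' "$dir/a.log" "$dir/b.log"
touch "$dir/fresh.log"

# `{} +` batches many filenames into one rm invocation (POSIX),
# unlike `{} \;`, which runs rm once per file.
find "$dir" -name '*.log' -mtime +7 -exec rm -f {} +

ls "$dir"   # fresh.log
```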
When you are done, you can use crontab -l to list your personal crontab.
P.S. The default editor on most Linux systems is vi; if you do not know vi, use something simple like nano by setting your environment variable: export EDITOR=nano
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
To create a cron job, put a file containing the following in the /etc/cron.daily dir:
#!/bin/sh
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
You should use crontab -e to edit your crontab and schedule the job. It might look something like this:
0 1 * * * /usr/bin/find /path/to/file -name '*.log' -mtime +7 -exec rm -f {} \;
This will recursively remove all .log files older than 7 days under the directory /path/to/file, every day at 1am.
Since this is about log files, you should look at logrotate. It runs daily from a system cron job and rotates logs for you based on rules in /etc/logrotate.conf, which usually includes the /etc/logrotate.d directory. So there is no need for crontab or find.
You can also have your own cron job if you have no access to add file to /etc/logrotate.d for your own configuration.
There are plenty of examples in /etc/logrotate.d.
It expects your application to write to a single file; it is not for an application that logs to a different file each day. An application generally need not do that: if it keeps the log file open, logrotate can run a postrotate script to tell it to reopen the file.
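As a concrete illustration, a minimal drop-in rule might look like this (the path /var/log/myapp/app.log and the file name myapp are hypothetical; the directives are standard logrotate ones):

```shell
# /etc/logrotate.d/myapp
/var/log/myapp/app.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
}
```

logrotate -d /etc/logrotate.conf dry-runs the full configuration without touching any files.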
You guys are doing it the HARD way. Try using the clear command:
* * * * 0 clear > /home/user/CronLog.txt
where 0 is Sunday (7 also means Sunday, not Saturday). The ">" truncates the log, as opposed to ">>", which appends to it. If the entry goes in the system crontab, add the user field "root" before "clear", like this:
* * * * 0 root clear > /home/user/CronLog.txt
After googling around on this particular topic, I found that many people recommend using the -delete option like so:
* * * * * find /path/to/*.log -mtime +7 -delete
The benefit of this version is that it is easy to remember and it performs better, since -exec spawns a new process for every file that is to be deleted.
Here are some references:
https://linuxaria.com/howto/linux-shell-how-to-use-the-exec-option-in-find-with-examples
https://unix.stackexchange.com/questions/167823/find-exec-rm-vs-delete
This will delete log files older than 7 days:
* * * * * find /path/to -name '*.log' -mtime +7 -exec rm -f {} \;
This will delete log files older than 30 minutes:
* * * * * find /path/to -name '*.log' -mmin +30 -exec rm -f {} \;
