Schedule cronjob to remove files modified before X days - cron

I want to create and schedule a cron job to remove files modified more than x days ago.
I have taken the following steps for that.
Created a shell script (named Script.sh) as below:
#!/bin/sh
15 2 * * 2-6 find /usr/sch/cbm/files/newui/log -type f -mtime +2 exec rm {} \;
I have put this file in "/var/spool/cron/crontabs" and in /usr/bin because I wasn't sure where exactly to place it.
When I check with the crontab -e command, the entry appears there.
But I didn't see any effect on my files. I am not sure whether my job is scheduled or whether I still need to do something else.
Please guide me.

crontab <file>
where file is the file you created above, will install it in the right place and make sure everything is set up for cron to run it.
crontab -l
will display the list of cron jobs for the currently logged in user.
Note that the file itself does not execute, so there is no need for #!/bin/sh as the first line; it is just a data file that cron interprets.
man 5 crontab
and
man 1 crontab
for more information.
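As a concrete sketch of the workflow described above (the path is taken from the question; the file name mycron.txt is made up, and note the hyphen on -exec, which the script in the question was missing):

```
# mycron.txt — a crontab data file, not a script, so no #!/bin/sh line
15 2 * * 2-6 find /usr/sch/cbm/files/newui/log -type f -mtime +2 -exec rm {} \;
```

Install it with crontab mycron.txt and confirm it with crontab -l.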

You have to insert the cron job line into the crontab of the user who should execute that job:
su - username
crontab -e
add cron job line
save and exit
Example for removing files older than x days:
00 00 * * * find /path/to/folder -mtime +x -exec rm {} \;
Or you can restrict it to certain files in the folder, for example files ending with .log:
00 00 * * * find /path/to/folder/*.log -mtime +x -exec rm {} \;
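What the entry above does can be tried in a throwaway directory (the directory and file names here are invented for the demo, and touch -d assumes GNU coreutils):

```shell
#!/bin/sh
# Create a sandbox with one old and one fresh file.
mkdir -p demo_logs
touch -d "10 days ago" demo_logs/old.log   # mtime 10 days in the past
touch demo_logs/new.log                    # mtime: now
# Same find invocation the cron line would run, with x = 7:
find demo_logs -type f -mtime +7 -exec rm {} \;
ls demo_logs
```

Since -mtime +7 matches files modified more than 7 whole days ago, only old.log is removed and new.log survives.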

Related

Delete old directories using cronjob is not working in LINUX

I have tested this directory-deleting command in a Linux terminal and it works fine:
find /home/TEST_/ -maxdepth 0 -mtime +6 -exec rm -r {} ;
printf "deleted IPSIM old directory"
But when I set up the cron job to clean up the directories, I get the error below:
find: missing argument to `-exec'
deleted IPSIM old directory
Crontab:
00 00 * * 3 cd /home/cronjob; sh cleanup_regress_SIM.sh
Can someone help with this and point out where I am going wrong?
The correct way to put that script command in crontab is:
00 00 * * 3 /home/cronjob/cleanup_regress_SIM.sh
In more detail:
You do not need to use cd; just specify the full path to the script.
#!/bin/sh or #!/bin/bash is already defined at the beginning of the script, so it will run in the correct environment. There is no need to invoke it with sh (the script does need to be executable, though).
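One caveat worth spelling out: when cron invokes a script by path, without a leading sh, the file must carry the execute bit. A quick sketch with a hypothetical script name:

```shell
#!/bin/sh
# Write a trivial script, make it executable, and run it by path.
printf '#!/bin/sh\necho ok\n' > demo_cleanup.sh
chmod +x demo_cleanup.sh     # without this, running ./demo_cleanup.sh fails
./demo_cleanup.sh
```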
I have tested a copy of this on my own system and it works fine. I don't know what kind of system you are running that throws these errors. Here is what works.
test.sh (slightly different)
#!/bin/bash
echo $(find /home/gerge/Documents/Arduino/wifi* -maxdepth 0 -mtime +30 -exec ls {} \;)>>/home/gerge/test.log
echo "command done">>/home/gerge/test.log
crontab (run every minute for testing)
*/1 * * * * /home/gerge/test.sh
The content of test.log
wifiConfigPortal.ino wifiRelayLogin.ino wifiRGB.ino wifiRGBsimple.ino
command done
wifiConfigPortal.ino wifiRelayLogin.ino wifiRGB.ino wifiRGBsimple.ino
command done
wifiConfigPortal.ino wifiRelayLogin.ino wifiRGB.ino wifiRGBsimple.ino
command done
I would recommend checking whether other scripts are able to run as cron jobs. If you get the same error, there is some bigger issue.

Will a missing shebang stop the script from running?

I have a crontab job to purge the logs from /var/log/nginx folder. The crontab was set up like this:
15 23 * * * /scripts/logcleanup.sh > /dev/null 2>&1
The logcleanup.sh script is very simple; it only has two lines:
find /var/log/nginx -mtime +5 -type f -delete;
find /var/log/nginx -size +50M -type f -delete;
I expected the script to run every night at 23:15. However, it doesn't get executed and the files larger than 50 MB are still inside the log folder. Is this caused by the missing shebang "#!/usr/bin/env bash"?
Thanks.
Redirect the output to a log file instead of /dev/null to get feedback:
15 23 * * * /scripts/logcleanup.sh > ~/logcleanup.sh.log 2>&1
All the errors will be logged and hopefully, you will find out what is wrong.

Command line to remove oldest backup

I have the following directory containing multiple Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip files:
/opt/
/opt/files/
/opt/files/private/*
/opt/files/backup.sh
/opt/files/backup.txt
/opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip
I currently manage my backups with a daily cron job: 0 0 * * * cd /opt/files/ && ./backup.sh > /opt/files/backup.txt
As you can imagine, this directory gets bigger and bigger over time. I would now like to create another script (or cron job, if it works with one command) to delete each Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip after 14 days (so that I always have the 14 most recent backups).
It would be great if you could explain your answer.
find /opt/files -name 'Backup*.zip' -mtime +14 -ls
(The backups live in /opt/files itself, with names starting "Backup from", so the path given to find must be /opt/files, not /opt/files/Backup.)
If you are satisfied that the files being matched are the ones to delete, replace -ls with -exec rm {} \;
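The dry-run-then-delete pattern can be exercised in a sandbox (names invented; touch -d assumes GNU coreutils). Because find hands each path to rm as a single argument, filenames containing spaces are handled safely:

```shell
#!/bin/sh
mkdir -p demo_backups
touch -d "20 days ago" "demo_backups/Backup from 01.01.2024 at 00:00:00.zip"
touch "demo_backups/Backup from today.zip"
# Dry run: -ls prints what would be matched
find demo_backups -name '*.zip' -mtime +14 -ls
# Real deletion of backups older than 14 days
find demo_backups -name '*.zip' -mtime +14 -exec rm {} \;
```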

Remove log files using cron job

Hi. I want to remove all log files older than 7 days from a folder, but leave all the other files. Can I use the command below? How do you specify that it only deletes the files with the .log extension?
find /path/to/file -mtime +7 -exec rm -f {} \;
Do I need to write this command into some file, or can I just type it at the command prompt and have it run automatically every day?
I have no idea how to run a cron job in Linux.
Use a wildcard, and just put it in your crontab; use crontab -e to edit your crontab jobs.
See example:
* * * * * find /path/to/*.log -mtime +7 -exec rm -f {} \;
To complement the answer, here is a note on how to work with your crontab in Linux.
You edit your personal crontab by running crontab -e.
This gets saved to /var/spool/cron/<username>. The file will be the owners username, so root would be /var/spool/cron/root. Everything in the file is run as the owner of the file.
The syntax for crontab is as follows:
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root
HOME=/
# For details see man 4 crontabs
# Example of job definition:
# .---------------- minute (0 - 59)
# | .------------- hour (0 - 23)
# | | .---------- day of month (1 - 31)
# | | | .------- month (1 - 12) OR jan,feb,mar,apr ...
# | | | | .---- day of week (0 - 6) (Sunday=0 or 7) OR sun,mon,tue,wed,thu,fri,sat
# | | | | |
# * * * * * user-name command to be executed
When you are editing your own personal crontab, via crontab -e, you leave out the user-name field, because the user is inferred by the filename (see first paragraph).
That being said, your entry should look like this:
0 5 * * * find /path/to/*.log -mtime +7 -delete
This will run every day, at 5:00 AM, system time. I don't think you need it to run any more frequently than daily, given the fact that you are removing files that are 7 days old.
Please don't use the -exec option when the -delete option does exactly what you want to do: -exec spawns a new process for every file to be deleted, which is excessively wasteful of system resources.
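The difference can be tried in a sandbox (directory and file names invented; touch -d assumes GNU coreutils). -delete removes matches inside the find process itself, with no extra rm processes:

```shell
#!/bin/sh
mkdir -p demo_del
touch -d "8 days ago" demo_del/old.log demo_del/old.txt
touch demo_del/new.log
# Only *.log files older than 7 days are removed; the .txt survives
find demo_del -name '*.log' -mtime +7 -delete
ls demo_del
```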
When you are done, you can use crontab -l to list your personal crontab.
P.S. The default editor on most Linux systems is vi; if you do not know vi, use something simple like nano by setting your environment variable: export EDITOR=nano
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
To create a cron job, put a file containing the following in the /etc/cron.daily dir:
#!/bin/sh
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
You should use crontab -e to edit your crontab and schedule the job. It might look something like this:
0 1 * * * /usr/bin/find /path/to/file -name '*.log' -mtime +7 -exec rm -f {} \;
This will recursively remove all .log files in the directory /path/to/file every day at 1am.
Since this is about log files, you should look at logrotate. It runs daily from a system cron job and rotates logs for you based on rules in /etc/logrotate.conf, which usually includes the /etc/logrotate.d directory. So there is no need for crontab or find.
You can also use your own cron job if you don't have access to add a file under /etc/logrotate.d for your own configuration.
There are plenty of examples in /etc/logrotate.d.
It expects your application to write to a single file; it is not for an application that logs to a different file each day (an application generally need not do that). If the application keeps the log file open, logrotate can run a postrotate script to tell the application to reopen it.
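A sketch of what a drop-in file such as /etc/logrotate.d/myapp might look like (the path and values are illustrative assumptions, not taken from the question):

```
/var/log/nginx/*.log {
    daily
    rotate 7         # keep 7 rotated logs, matching the 7-day retention above
    compress
    missingok
    notifempty
}
```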
You guys are doing it the HARD way. Try using the clear command:
0 0 * * 0 clear > /home/user/CronLog.txt
where the 0 in the day-of-week field is Sunday (6 would be Saturday). The ">" empties the log, as opposed to ">>", which appends to it. If this goes in the system crontab, add the user field (e.g. "root") before the command, like this:
0 0 * * 0 root clear > /home/user/CronLog.txt
After googling around on this particular topic, I found that many people recommend using the -delete option, like so:
* * * * * find /path/to/*.log -mtime +7 -delete
The benefits of this version are that it is easy to remember and that it performs better, since -exec spawns a new process for every file that is to be deleted.
Here are some references:
https://linuxaria.com/howto/linux-shell-how-to-use-the-exec-option-in-find-with-examples
https://unix.stackexchange.com/questions/167823/find-exec-rm-vs-delete
This will delete log files older than 7 days:
* * * * * find /path/to -name '*.log' -mtime +7 -exec rm -f {} \;
This will delete log files older than 30 minutes:
* * * * * find /path/to -name '*.log' -mmin +30 -exec rm -f {} \;

Bash script not deleting files in given directory

I found this bash script online that I want to use to delete files older than 2 days:
#!/bin/bash
find /path/to/dir -type f -mtime +2 -exec rm {} \;
I set up a cron job to run the script (I set it a couple of minutes ahead for testing, but it should run once every 24 hours):
54 18 * * * /path/to/another/dir/script.sh
I saved and exited correctly, so the crontab was updated.
Why does it not delete the files in the directory?
What if you try adding an echo at the end of the script and logging the output:
cron1.sh >> /var/log/cron1.log
You could try this, but I'm not sure it will work:
-exec rm -rf {} \;
Most cron jobs do not have PATH set. You must fully qualify the find command.
#!/bin/bash
/usr/bin/find /path/to/dir -type f -mtime +2 -exec rm {} \;
If you capture the stdout and stderr as recommended by damienfrancois, you'd probably see the message "command not found: find". If you don't capture them, cron usually sends the output to the cron job owner's email, unless configured not to do so.
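Alternatively, PATH can be set at the top of the crontab itself, so commands resolve without absolute paths (the PATH value below is a typical default, an assumption to adjust for your system):

```
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
54 18 * * * /path/to/another/dir/script.sh
```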
