CRON to Clear out /tmp files older than 30 min - cron

The server's root /tmp directory has been filling up with file uploads. These files should be removed once the upload is done, but in some cases they aren't. While I investigate that issue, I'm looking for a cron task that runs every 30 minutes and deletes files starting with 202 from the /tmp directory that are more than 30 minutes old.
So far, I have /30 * * * rm -rf /tmp/202*

Found this solution. Works pretty well.
0 * * * * find /tmp -regextype posix-egrep -regex '/tmp/[0-9]{8}-[0-9]{6}-.*-file-.*' -type f -mmin +60 -delete
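If you want the 30-minute cadence described in the question rather than an hourly run, a minimal variant of the same approach (assuming the stuck uploads really do all start with 202 and sit directly under /tmp) would be:
*/30 * * * * find /tmp -maxdepth 1 -type f -name '202*' -mmin +30 -delete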

Related

Crontab to delete files

I'm very new to Linux, and right now I am trying to delete some video recordings from my security camera (on the FTP server) every 30 minutes.
I already used: pi@raspberrypi:~$ find path/toMy/Video/recordings/ -type f -mmin +30 -delete
After that, I ran crontab -e, chose the Nano option, and typed:
*/30 * * * * find path/toMy/Video/recordings/ -type f -mmin +30 -delete
Saved it, but it doesn't seem to work. What am I doing wrong?
Make sure you use the full path to find, for example:
% which find
/usr/local/opt/findutils/libexec/gnubin/find
Also, use an absolute path (not a relative path) for your directory.
Then:
*/30 * * * * /usr/local/opt/findutils/libexec/gnubin/find /path/toMy/Video/recordings/ -type f -mmin +30 -delete
To debug:
You can also redirect the output of your crontab entry.
*/30 * * * * find path/toMy/Video/recordings/ -type f -mmin +30 -delete > /<your-path>/cron.out 2>&1
Check for the cron logs in /var/log/
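On a Debian-based system such as Raspbian, cron messages usually end up in the syslog, so a quick way to confirm the job is firing (log path assumed for Raspbian) is:
grep CRON /var/log/syslog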
You should consider saving the output of the cron job to a file for debugging, or simply run the command in a terminal to see the output. To save the output of cron to a file you can try:
*/30 * * * * command >/home/user/cron.log 2>&1

Cron Job to auto delete folder older than 7 days in Centos

I have a folder with large files in CentOS. How do I delete the files older than 30 minutes?
Please suggest your ideas and snippets
It's simple. Just use find and add the following line in crontab:
*/30 * * * * find /path/to/dir -type f -mmin +30 -exec rm -f {} \;
The above command will run every 30 minutes and delete ONLY files older than 30 minutes from the directory /path/to/dir.
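If the find on your system supports the -delete action (GNU findutils, as shipped with CentOS, does), a variant of the same job that avoids forking rm for every file would be:
*/30 * * * * find /path/to/dir -type f -mmin +30 -delete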

Command line to remove oldest backup

I have the following directory with multiple Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip files.
/opt/
/opt/files/
/opt/files/private/*
/opt/files/backup.sh
/opt/files/backup.txt
/opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip
With a daily cronjob 0 0 * * * cd /opt/files/ && ./backup.sh > /opt/files/backup.txt I am currently managing my backups.
As you can imagine, this directory gets bigger and bigger over time. I now would like to create another script (or cronjob if it works with one command) to delete the oldest /opt/files/Backup from $(date +"%d.%m.%Y at %H:%M:%S").zip after 14 days (so that I have 14 recent backups all the time).
It would be great if you could explain your answer.
find /opt/files -name 'Backup*.zip' -mtime +14 -ls
If you are satisfied that the files being matched are the ones to delete, replace -ls with "-exec rm {} \;"
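Once the listing shows only files you want gone, this can be folded into a single daily cron entry; a sketch (the 01:00 schedule, -maxdepth 1, and -delete are assumptions, not part of the answer above):
0 1 * * * find /opt/files -maxdepth 1 -name 'Backup*.zip' -mtime +14 -delete
With a daily run and -mtime +14, this keeps roughly the 14 most recent backups, as the question asks.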

A crontab to move completed uploads from one dir to another?

I'm using the following crontab, once an hour, to move any files with the .mp3 extension from the dir "webupload" to the dir "complete" :
60 * * * * find usr/webupload -type f -maxdepth 1 -name "*.mp3" -exec mv {} usr/webupload/complete \;
The problem is that "webupload" contains lots of partial files being transferred.
I've read about a lot of different ways to achieve this but I think i'm more confused now than I was when I started!
What is the best practice or easiest way to only move the completed uploads?
Many thanks :)
It's going to be hard to tell when a file is completely written unless it is renamed when the download is completed, but you could change your find command and add -mmin +1 so that it only looks for files which were modified more than 1 minute ago (meaning the download has likely completed). Also, you should use / at the beginning of your paths rather than the relative paths you're using:
0 * * * * find /usr/webupload -maxdepth 1 -type f -mmin +1 -name "*.mp3" -exec mv {} /usr/webupload/complete \;
You could obviously make the modification time longer (e.g. 10 minutes: -mmin +10) if you want to be more certain that the file has been downloaded.
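If you control the upload side, the more reliable fix hinted at above is to write to a temporary name and rename the file only once the transfer has finished, since a rename within one filesystem is atomic. A hypothetical sketch (the paths and the dd stand-in for the real transfer are purely illustrative):
#!/bin/sh
# Sketch: write the upload to a ".part" name, then rename when the transfer
# is finished, so only complete files ever carry the .mp3 extension.
tmp="/usr/webupload/track.mp3.part"
final="/usr/webupload/track.mp3"
dd if=/dev/zero of="$tmp" bs=1k count=1   # stands in for the real upload
mv "$tmp" "$final"                        # rename on one filesystem is atomic
The hourly find job matching "*.mp3" then never sees a half-written file.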

Remove log files using cron job

Hi. I want to remove all log files older than 7 days from a folder, but leave all the other files. Can I use the command below? How do I specify that it deletes only the files with the .log extension?
find /path/to/file -mtime +7 -exec rm -f {} \;
Do I need to write this command into some file, or can I just write it in command prompt and have it run automatically every day?
I have no idea how to run a cron job in linux.
Use a wildcard, and just put it in your crontab; use crontab -e to edit your crontab jobs.
See example:
* * * * * find /path/to/*.log -mtime +7 -exec rm -f {} \;
To expand on the answer, here is how to work with your crontab in Linux.
You edit your personal crontab by running crontab -e.
This gets saved to /var/spool/cron/<username>. The file is named after its owner's username, so root's would be /var/spool/cron/root. Everything in the file is run as the owner of the file.
The syntax for crontab is as follows:
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root
HOME=/
# For details see man 4 crontabs
# Example of job definition:
# .---------------- minute (0 - 59)
# |  .------------- hour (0 - 23)
# |  |  .---------- day of month (1 - 31)
# |  |  |  .------- month (1 - 12) OR jan,feb,mar,apr ...
# |  |  |  |  .---- day of week (0 - 6) (Sunday=0 or 7) OR sun,mon,tue,wed,thu,fri,sat
# |  |  |  |  |
# *  *  *  *  *  user-name  command to be executed
When you are editing your own personal crontab, via crontab -e, you leave out the user-name field, because the user is inferred by the filename (see first paragraph).
That being said, your entry should look like this:
0 5 * * * find /path/to/*.log -mtime +7 -delete
This will run every day, at 5:00 AM, system time. I don't think you need it to run any more frequently than daily, given the fact that you are removing files that are 7 days old.
Please don't use the -exec option when the -delete option does exactly what you want to do. -exec forks a new process for every file and is excessively wasteful of system resources.
When you are done, you can use crontab -l to list your personal crontab.
P.S. The default editor on most Linux systems is vi; if you do not know vi, use something simpler like nano by setting your environment variable: export EDITOR=nano
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
To create a cron job, put a file containing the following in the /etc/cron.daily dir:
#!/bin/sh
find /path/to/dir-containing-files -name '*.log' -mtime +7 -exec rm -f {} \;
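Note that run-parts only executes files in /etc/cron.daily that are executable (and, on Debian-based systems, skips names containing a dot), so after creating the script you would also need something like this (cleanup-logs is just an example name):
chmod +x /etc/cron.daily/cleanup-logs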
You should use crontab -e to edit your crontab and schedule the job. It might look something like this:
0 1 * * * /usr/bin/find /path/to/file -name '*.log' -mtime +7 -exec rm -f {} \;
This will recursively remove all .log files in the directory /path/to/file every day at 1am.
Since this is about log files, you should look at logrotate. It runs daily from a system cron job and will rotate logs for you based on rules in the /etc/logrotate.conf file, which usually includes the /etc/logrotate.d directory. So there is no need for crontab or find.
You can also set up your own cron job if you don't have access to add a file to /etc/logrotate.d for your own configuration.
There are plenty of examples in /etc/logrotate.d.
It expects your application to write to a single file. It is not for an application that logs to a different log file each day; an application generally need not do that. If the application keeps the log file open, logrotate can run a postrotate script to tell the application to reopen the log file.
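As a hedged illustration only (the application name, log path, and reopen signal are assumptions, not from the answer above), a drop-in file such as /etc/logrotate.d/myapp might look like:
/var/log/myapp/*.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    postrotate
        # ask the (hypothetical) daemon to reopen its log file
        [ -f /var/run/myapp.pid ] && kill -USR1 "$(cat /var/run/myapp.pid)"
    endscript
}
With daily and rotate 7, this keeps roughly the last seven days of logs, matching the intent of the question without any crontab entry or find.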
You guys are doing it the HARD way. Try using the clear command:
* * * * 0 clear > /home/user/CronLog.txt
where 0 is Sunday (0 or 7 both mean Sunday; 6 would be Saturday). The ">" will clear the log, as opposed to ">>", which appends to the log. If the job needs to run as root (for example, in the system crontab), put "root" before "clear", like this:
* * * * 0 root clear > /home/user/CronLog.txt
After googling around on this particular topic, I found that many people recommend using the -delete option like so:
* * * * * find /path/to/*.log -mtime +7 -delete;
The benefit of this version is that it is easy to remember and it performs better, since -exec spawns a new process for every file that is to be deleted (a batched -exec variant is sketched after the references below).
Here are some references:
https://linuxaria.com/howto/linux-shell-how-to-use-the-exec-option-in-find-with-examples
https://unix.stackexchange.com/questions/167823/find-exec-rm-vs-delete
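If you do need -exec (for example, with a find that lacks -delete, or to run something other than rm), terminating it with + instead of \; passes many file names to a single rm invocation rather than forking one process per file:
find /path/to -name '*.log' -mtime +7 -exec rm -f {} +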
This will delete log files older than 7 days:
* * * * * find /path/to -name '*.log' -mtime +7 -exec rm -f {} \;
This will delete log files older than 30 minutes:
* * * * * find /path/to -name '*.log' -mmin +30 -exec rm -f {} \;
