Cron job to auto delete folders older than 7 days (Linux)

I am having an issue storing my server backups on a storage VPS. My server is not deleting old backup folders, so the storage fills up and the backup fails midway. My backup runs once every week.
Can anyone help me create a cron job script that deletes folders older than 7 days and runs one day before the backup?
Any help appreciated.

For example, a crontab entry that deletes files older than 7 days under /path/to/backup/ every day at 4:02 AM looks like this:
02 4 * * * find /path/to/backup/* -mtime +7 -exec rm {} \;
Before executing rm, please make sure the targets are the intended files. You can check the targets by specifying -ls as the action for find instead:
find /path/to/backup/* -mtime +7 -ls
mtime is the last modification timestamp, so depending on the backup method the results of find may not be the files you expect.
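Note also that rm without -r cannot remove directories, and the question asks about deleting folders. A hedged sketch for removing whole top-level backup directories older than 7 days (the path is a placeholder; verify the matches with -ls first):
# Remove only first-level directories under the backup root that are older than 7 days.
# -mindepth 1 -maxdepth 1 keeps the backup root itself and nested entries out of the match.
find /path/to/backup/ -mindepth 1 -maxdepth 1 -type d -mtime +7 -exec rm -rf {} +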

Related

find command does not find existing file

I create starttime_file and endtime_file every hour, one hour apart.
The purpose of this is to check which files were created during that hour.
But the thing is that I sometimes receive an error, "No such file or directory", even though the file exists.
Could you let me know what the reason for this might be?
starttime_file
endtime_file
find ~/ -type f -newer starttime_file -a ! -newer endtime_file > listfile.txt
When I ran this hourly from crontab, I received a failure message saying "no such file or directory" for a file created on Oct. 30th at 16:15.
test200.txt was created at 16:15, and as far as I can tell I should get the error message only once.
What would be the reason for this?
I really don't know what to try next.
I suspected a permissions issue, but when I look for this file as a normal user, I am able to find it.
Both of the following work fine as a normal user:
ll -rtl | grep -i "test200"
find ~/ -name "test200.txt"
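One possible explanation, offered as an assumption rather than a confirmed diagnosis: cron starts jobs in the crontab owner's home directory with a minimal environment, so relative names like starttime_file may resolve to a different location than in your interactive shell, and find reports "No such file or directory" when it cannot open the -newer reference file. A sketch of the same command with absolute paths (the paths are placeholders):
# Absolute paths make the job independent of cron's working directory.
find "$HOME" -type f -newer "$HOME/starttime_file" ! -newer "$HOME/endtime_file" > "$HOME/listfile.txt"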

mysqldump problem in Crontab and bash file

I have created a crontab entry to back up my DB every 30 minutes:
*/30 * * * * bash /opt/mysqlbackup.sh > /dev/null 2>&1
The crontab works well: every 30 minutes I get my backup from the script below.
#!/bin/sh
find /opt/mysqlbackup -type f -mtime +2 -exec rm {} +
mysqldump --single-transaction --skip-lock-tables --user=myuser --password=mypass mydb | gzip -9 > /opt/mysqlbackup/$(date +%Y-%m-%d-%H.%M)_mydb.sql.gz
But my problem is that the rm step to delete old data isn't working; nothing is ever deleted. Do you know why?
Also, the name of my backup comes out as 2020-02-02-12.12_mydb.sql.gz?
I always have a ? at the end of my file name. Do you know why?
Thank you for your help.
The question mark typically indicates a character that can't be displayed; the fact that it's at the end of a line makes me think that your script has Windows line endings rather than Unix. You can fix that with the dos2unix command:
dos2unix /path/to/script.sh
It's also good practice not to throw around MySQL passwords on the CLI or store them in executable scripts. You can accomplish this by using MySQL Option files, specifically the file that defines user-level options (~/.my.cnf).
This would require us to figure out which user is executing that cronjob, however. My assumption is that you did not make that definition inside the system-level crontab; if you had, you'd actually be trying to execute /opt/mysqlbackup.sh > /dev/null 2>&1 as the user bash. This user most likely doesn't (and shouldn't) exist, so cron would fail to execute the script entirely.
As this is not the case (you say it's executing the mysqldump just fine), this makes me believe you have the definition in a user-level crontab instead. Once we figure out which user that actually is as I asked for in my comment, we can identify the file permissions issue as well as create the aforementioned MySQL options file.
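A minimal sketch of the user-level option file mentioned above, assuming the job runs from that user's crontab (the credentials are placeholders):
# ~/.my.cnf -- make it readable only by its owner: chmod 600 ~/.my.cnf
[client]
user=myuser
password=mypass
With this in place, the script can drop --user and --password from the mysqldump command line entirely.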
Using find with mtime is not the best choice. If for some reason mysqldump stops creating backups, then in two days all backups will be deleted.
You can use my Python script "rotate-archives" for smart backup rotation (https://gitlab.com/k11a/rotate-archives). The script adds the current date at the beginning of the file or directory name, like 2020-12-31_filename.ext, and later uses this date to decide on deletion.
Running the script for your case:
rotate-archives.py test_mode=off age_from-period-amount_for_last_timeslot=0-0-48 archives_dir=/mnt/archives
In this case, 48 new archives will always be saved. Old archives in excess of this number will be deleted.
An example of more flexible archive deletion:
rotate-archives.py test_mode=off age_from-period-amount_for_last_timeslot=7-5,31-14,365-180-5 archives_dir=/mnt/archives
As a result, archives from 7 to 30 days old are kept with a 5-day interval between archives, archives from 31 to 364 days old with a 14-day interval, and archives from 365 days old with a 180-day interval, limited to 5 archives.
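To tie this back to the cron question, a hedged example of a crontab entry that runs the rotation nightly ahead of the backups (the install path is a placeholder):
# Rotate archives every night at 03:30, before the backup window.
30 3 * * * /usr/local/bin/rotate-archives.py test_mode=off age_from-period-amount_for_last_timeslot=0-0-48 archives_dir=/mnt/archives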

moving monthly linux files to another folder script

Sorry, but I'm a beginner at Linux shell scripting.
I'm trying to automatically move files to another folder monthly, on the 26th day of each month (Linux system).
I would appreciate it if somebody could help. Thank you in advance!
I have files with the format xxxxx_181025.txt. I want them to be moved to another folder on the 26th day of each month, but only files generated during the current month (in this case October). I need some help on how to specify that only files of the current month should be moved. Here is what I have so far:
cd /actual_folder
_y='%y'
_m='%m'
_d='%d'
TIMESTAMP=$(date "+$_y$_m$_d")
mv "xxxxx_$TIMESTAMP.txt" "/new_folder/xxxxx_$TIMESTAMP.txt"
find /actual_folder -type f -mtime -26 -exec mv {} /new_folder/ \;
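Note that -mtime -26 matches files modified within the last 26 days, which only approximates "files of the current month". A hedged sketch that instead matches the current month's stamp embedded in the xxxxx_YYMMDD.txt names (the paths and the xxxxx prefix are placeholders):
# Build the current year+month stamp (e.g. 1810 for October 2018)
# and move every file carrying it; ?? matches the two day digits.
stamp=$(date +%y%m)
mv /actual_folder/xxxxx_"$stamp"??.txt /new_folder/
To run this on the 26th of every month, a crontab entry such as the following would work (the script path is a placeholder):
0 0 26 * * /home/user/move_monthly.sh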

I have to write an automation script

I have to back up and delete old log files every month: I will delete files older than 6 months and archive files older than 2 months as zip files.
I am trying to write a script that will automate this every month instead of me doing it manually every time.
I have the UNIX commands for doing it, but I need to put them into a script file which will run automatically on the specified day.
You can schedule a cron job to run daily, executing commands inside a script such as:
find foldername -mtime +120 -name "*.log" -exec gzip {} \;
The above will take care of archiving all files older than 120 days. The pattern in quotes after -name can be modified to suit your requirement, as can the +120.
find foldername -mtime +180 -name "*" -exec rm {} \;
The above will remove all files inside foldername older than 180 days.
For the automation part, you can use crontab to schedule commands (https://en.wikipedia.org/wiki/Cron).
You can add an entry by typing crontab -e and use it to schedule jobs, after putting your UNIX commands into a script.
For example, if you have a /home/test/test.sh file, you can run it every day by adding the line below to your crontab:
0 0 * * * /home/test/test.sh
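Putting the pieces together, a hedged sketch of a complete script; the log directory and the day thresholds are assumptions to adapt to your retention policy:
#!/bin/sh
# Compress logs older than ~2 months, then delete anything older than ~6 months.
find /var/log/myapp -name "*.log" -mtime +60 -exec gzip {} \;
find /var/log/myapp -type f -mtime +180 -exec rm {} \;
Since the task here is monthly rather than daily, a crontab entry for midnight on the 1st of each month could look like:
0 0 1 * * /home/test/test.sh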

deletion of file after 7 days

I have to delete multiple files after 7 days regularly, and the deletion dates and locations are different for each file. Yes, I could set up a cron job for each folder separately, but that would involve many cron jobs (at least 15).
In order to avoid this, I want to create a script which will go to each folder and delete the data.
For example:
-rw-r--r-- 1 csbackup other 20223605295 Jun 12 06:40 IO.tgz
As you can see, IO.tgz was created on 12/06/2015 at 06:40... now I want to delete this file at 17/06/2015 00:00 hours. This is one reason I'm unable to use mtime, as it would delete the file exactly 7*24 hours after creation.
I was thinking of comparing the timestamps of the files; however, the stat utility is not present on my machine, and it's not even allowing me to install it.
Can anyone please guide me to a script which I can use to delete files after n days?
You can make a list of the directories you want to search in a file:
# cat file
/data
/d01
/u01/files/
Now you can loop over that list and remove files older than 7 days from those directories one by one:
while IFS= read -r dir; do
    find "$dir" -type f -mtime +7 -exec rm -f {} +
done < file
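If GNU find is available (which may not be the case on the questioner's platform, given the missing stat utility), -daystart addresses the concern about exact 7*24-hour aging: it measures -mtime from midnight today, so a job scheduled at 00:00 deletes by calendar age. A hedged example with a placeholder directory:
# With -daystart, -mtime +6 matches files last modified 7 or more full days before midnight today.
find /data -type f -daystart -mtime +6 -exec rm -f {} +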
