How do I delete 3-day-old backup files automatically on a Linux server? My backup files look like this:

web_20_10_2022
web_21_10_2022
web_22_10_2022
web_23_10_2022
web_24_10_2022
web_25_10_2022
I need an auto-delete script, run via a cron job, for backup files older than 3 days.
Below are some commands I have tried, but they are not working, so please help me out!
find /home -type f -mtime +3 -exec rm -rf {} +
find /home* -mtime +3 -type f -delete
find /home -mtime +3 -type d -exec rmdir {} ;

Kindly run the command below to find the files older than 3 days.
find /path/to/directory/* -mtime +3 -print
The command above lists the older files in the given directory.
To delete the 3-day-old files, use any of the following:
find /path/to/directory/* -type f -mtime +3 | xargs rm -f
or
find /path/to/directory/* -type f -mtime +3 -exec rm -f {} \;
or
find /path/to/directory/* -type f -mtime +3 -delete
To automate the process:
Copy the code below into a file saved in a directory where you are not performing any housekeeping activity.
#!/bin/bash
find /path/to/directory/* -mtime +3 -delete
Make the file executable and set up a cron job:
crontab -e
To run the script every hour, enter:
00 * * * * /path/to/executablefile
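Putting the steps above together, a minimal sketch of the whole cleanup script (the script name, the default path, and the argument handling are illustrative additions, not from the original answer):

```shell
#!/bin/sh
# cleanup_backups.sh - delete backup files older than 3 days.
# The backup directory is taken from the first argument; the default
# below is a placeholder, adjust it to your setup.
BACKUP_DIR="${1:-/home/backups}"

if [ -d "$BACKUP_DIR" ]; then
    # -type f restricts matches to regular files so directories are
    # left alone; -mtime +3 matches files last modified more than
    # 3 days ago.
    find "$BACKUP_DIR" -type f -mtime +3 -delete
fi
```

Then chmod +x cleanup_backups.sh and add a line such as 00 * * * * /path/to/cleanup_backups.sh via crontab -e.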

Related

log deleted files in linux

I am using the following command to delete files that are older than 10 days, but I also want to store (log) the list of files that are being deleted:
find ./path/delete -type f -name '*' -mtime +10 -exec rm {} \;
If you create a file to log to (touch ./path/to/logfile), just add another -exec to your command. Below is a very basic example, but you can add to it:
find ./path/delete -type f -name '*' -mtime +10 -exec rm {} \; -exec echo {} >> ./path/to/logfile \;
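The inline >> above does work, because the shell strips the redirection off before running find and sends find's whole stdout to the logfile, but it is easy to misread. A variant of the same idea that lets find itself do the printing (directory and log paths are the placeholders from the question, wrapped in illustrative variables):

```shell
#!/bin/sh
# TARGET and LOGFILE are placeholders; point them at your own paths.
TARGET="./path/delete"
LOGFILE="./deleted-files.log"

if [ -d "$TARGET" ]; then
    # find evaluates its expression left to right, so -print writes
    # each path to stdout before -exec rm removes the file; the whole
    # listing is appended to the log in a single redirection.
    find "$TARGET" -type f -mtime +10 -print -exec rm {} \; >> "$LOGFILE"
fi
```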

Delete files in dir but exclude 1 subdir

I have a dir that is full of many htm reports that I keep around for 30 days, deleting old ones via a cron job, but there is one sub-dir I would like to keep longer. This is the line I put in the cron; how do I tell it to leave that one sub-dir alone?
5 0 * * * find /var/www -name "*.htm*" -type f -mtime +30 -exec rm -f {} \;
Any help is greatly appreciated!
Use -prune to prevent going into a directory that matches some conditions.
find /var/www -type d -name 'excluded-directory' -prune -o -name "*.htm*" -type f -mtime +30 -exec rm -f {} \;
In addition to the suggestion above, use the full path to find in cron.
Also consider find's -delete option in place of -exec rm -f {} \;. It is somewhat safer.
-delete
Delete found files and/or directories. Always returns true.
This executes from the current working directory as find recurses
down the tree. It will not attempt to delete a filename with a
"/" character in its pathname relative to "." for security
reasons. Depth-first traversal processing is implied by this
option. The -delete primary will fail to delete a directory if
it is not empty. Following symlinks is incompatible with this
option.
5 0 * * * /usr/bin/find /var/www -type d -name 'excluded-directory' -prune -o -name "*.htm*" -type f -mtime +30 -delete
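One caveat: in GNU find, -delete implies -depth, and -prune has no effect while -depth is active, so the -delete form above may not actually skip the excluded directory (GNU find usually prints a warning about this combination). A sketch that keeps -prune working by sticking with -exec (the script wrapper and usage guard are illustrative additions):

```shell
#!/bin/sh
# prune_old_reports.sh - remove .htm* files older than 30 days while
# leaving 'excluded-directory' alone (names taken from the question).
# Usage: prune_old_reports.sh /var/www
WEBROOT="$1"
[ -n "$WEBROOT" ] || { echo "usage: $0 <webroot>" >&2; exit 0; }

# The branch before -o matches the excluded directory and prunes it,
# so find never descends into it; the branch after -o selects old
# .htm* files and removes them in batches with -exec ... +.
find "$WEBROOT" -type d -name 'excluded-directory' -prune -o \
     -name "*.htm*" -type f -mtime +30 -exec rm -f {} +
```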

Output from script - exec rm

I have a script which deletes files older than 2 days. Usually it works properly, but for some reason it is not working today.
I want an option to get output from the script explaining why the files were not deleted.
Could you tell me, is there such an option?
script:
#!/bin/bash
#script for cleaning logs from files older than two days
dir_name=/home/albert/scripts/files
file_log=/home/albert/scripts/info.log
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log}
You probably want to add a redirection for standard error too, i.e. in the case you want to send it to the same file:
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log} 2>&1
You could use the find option -delete:
-delete - If the removal failed, an error message is issued.
find "${dir_name}" -type f -name '*.log' -mtime +2 -delete >> ${file_log} 2>&1
Example:
$ find /etc -name passwd -delete
find: cannot delete ‘/etc/pam.d/passwd’: Permission denied
find: cannot delete ‘/etc/passwd’: Permission denied

find directory older than 3 days and zip all files in it

Can I find any directories matching a condition like "older than 3 days",
zip them, and then delete the directories?
I have 2 solutions.
zip all directories in 1 zip under working directory
I tried
zip -rm ${WORKDIR}/$(date +%Y%m%d -d "${DAY_TO_ZIP} days ago").zip $(find ${WORKDIR} -daystart -mtime +${DAY_TO_ZIP} -type d ! -name "*.zip")
but this command zips all files, including non-directory files.
1 directory 1 zip same path with a directory
thank you very much
Execute the command below to find all directories older than 3 days and zip all their files:
# find / -mtime +3 -type d -exec zip -r zipfile.zip {} +
-mtime +3 means you are looking for files modified more than 3 days ago.
-mtime -3 means modified less than 3 days ago.
-mtime 3 (with no + or -) means exactly 3 days ago.
Finally, if you want to delete those directories, execute the command below:
# find / -mtime +3 -type d -exec rm -rf {} \;
find ./ -mtime +x -print -exec gzip {} \;
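The "1 directory 1 zip, same path" variant asked for above is not covered by the answers; here is a sketch, assuming GNU find and the zip utility are available. The script name, argument handling, and loop are illustrative additions; the loop handles names with spaces, but not newlines.

```shell
#!/bin/sh
# zip_old_dirs.sh - zip each directory older than DAY_TO_ZIP days
# into a sibling <name>.zip, then remove the original directory.
# Usage: zip_old_dirs.sh <workdir> [days]
WORKDIR="$1"
DAY_TO_ZIP="${2:-3}"
[ -n "$WORKDIR" ] || { echo "usage: $0 <workdir> [days]" >&2; exit 0; }

# -mindepth 1 skips WORKDIR itself; each matching directory is zipped
# individually, and removed only if zip succeeded.
find "$WORKDIR" -mindepth 1 -daystart -mtime +"$DAY_TO_ZIP" -type d |
while IFS= read -r dir; do
    zip -r -q "$dir.zip" "$dir" && rm -rf "$dir"
done
```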

Linux find and delete files but redirect file names to be deleted

Is there a way to write the file names to a file before they are deleted for reference later to check what has been deleted.
find <PATH> -type f -name "<filePattern>" -mtime +1 -delete
Just add a -print expression to the invocation of find:
find <PATH> -type f -name "<filePattern>" -mtime +1 -delete -print > log
I'm not sure if this prints the name before or after the file is unlinked, but it should not matter. I suspect -delete -print unlinks before it prints, while -print -delete will print before it unlinks.
Like William said, you can use -print. However, instead of -print > log, you can also use the -fprint flag.
You'd want something like:
find <PATH> -type f -name "<filePattern>" -mtime +1 -fprint "<pathToLog>" -delete
For instance, I use this in a script:
find . -type d -name .~tmp~ -fprint /var/log/rsync-index-removal.log -delete
You can use -exec and rm -v:
find <PATH> -type f -name "<filePattern>" -mtime +1 -exec rm -v {} \;
rm -v will report what it is deleting.
With something like this you can execute multiple commands in the exec statement: log to a file, remove the file, and whatever else you need.
find <PATH> -type f -name "<filePattern>" -mtime +1 -exec sh -c "echo {} >>mylog; rm -f {}" \;
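One caution about the command above: find substitutes {} textually into the sh -c string, so a filename containing quotes or $( ) can break the command or even inject shell code. Passing the name as a positional parameter avoids that; the script wrapper and variable names below are illustrative.

```shell
#!/bin/sh
# TARGET_DIR and LOGFILE stand in for the <PATH> and log placeholders.
TARGET_DIR="$1"
LOGFILE="${2:-mylog}"
[ -n "$TARGET_DIR" ] || { echo "usage: $0 <dir> [logfile]" >&2; exit 0; }

# '_' fills $0 of the inner shell; find passes each filename as $1
# and the log path as $2, so quoting stays correct even for names
# containing spaces or quotes.
find "$TARGET_DIR" -type f -name "*.log" -mtime +1 \
     -exec sh -c 'echo "$1" >> "$2"; rm -f "$1"' _ {} "$LOGFILE" \;
```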
The following goes in a shell script named removelogs.sh; run it with sh removelogs.sh in a terminal.
This is the text of the removelogs.sh file:
cd /var/log;
date >> /var/log/removedlogs.txt;
find . -maxdepth 4 -type f -name \*log.old -delete -print >> /var/log/removedlogs.txt
. - runs at this location, so make sure you do not run this in the root folder!
-maxdepth - prevents it from getting out of control
-type f - matches only regular files
-name - matches only your filtered names
-print - sends the result to stdout
-delete - deletes said files
>> - appends to the file (a single > would create or overwrite it)
works for me on CENTOS7
