Cron to delete folders older than required time deletes parent folder - cron

I have a cron job that creates folders within the "backup" directory /tmp/backup.
I am looking to have a second job that deletes folders within "backup" which are older than 1 minute, using the job below:
55 19 * * * find /tmp/backup/ -maxdepth 1 -type d -mmin +1 -execdir rm -rf {} \;
But this job deletes the parent directory "backup" too. I am confused about where I am going wrong. Any help is appreciated!

Easy enough to test.
for a in {1..3}; do mkdir -p /tmp/backup/${a}; done
find /tmp/backup/ -maxdepth 1 -type d -mmin +1
This returned
/tmp/backup
/tmp/backup/2
/tmp/backup/1
/tmp/backup/3
But
find /tmp/backup/* -maxdepth 1 -type d -mmin +1
returned
/tmp/backup/2
/tmp/backup/1
/tmp/backup/3
Add an asterisk: the shell then expands /tmp/backup/* to the subdirectories, so find never starts at (or matches) the parent directory itself.
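An alternative that stays inside find itself (rather than relying on the shell glob) is -mindepth 1, which excludes the starting point from the results; a quick sketch in a scratch directory:

```shell
# -mindepth 1 skips the starting point, so the parent "backup"
# directory can never match and never be deleted.
dir=$(mktemp -d)
mkdir -p "$dir"/backup/{1,2,3}
find "$dir"/backup -mindepth 1 -maxdepth 1 -type d
# lists only the three subdirectories, never "$dir"/backup itself
rm -rf "$dir"
```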

Related

How do I automatically delete 3-day-old backup files on my server? My backup files look like:

web_20_10_2022
web_21_10_2022
web_22_10_2022
web_23_10_2022
web_24_10_2022
web_25_10_2022
I need a script, run from a cron job, that automatically deletes backup files older than 3 days from the server.
Below are some commands I have tried, but they are not working, so please help me out!
find /home -type f -mtime +3 -exec rm -rf {} +
find /home* -mtime +3 -type f -delete
find /home -mtime +3 -type d -exec rmdir {} ;
Run the command below to list the files older than 3 days:
find /path/to/directory/* -mtime +3 -print
This prints the matching older files in the given directory.
To delete the 3 day old files:
find /path/to/directory/* -type f -mtime +3 | xargs rm -f
or
find /path/to/directory/* -type f -mtime +3 -exec rm -f {} \;
or
find /path/to/directory/* -type f -mtime +3 -delete
To automate the process:
Copy the code below and save it as a script in a directory where you are not performing any housekeeping activity.
#!/bin/bash
find /path/to/directory/* -mtime +3 -delete
Make the file executable and set up a cron job:
crontab -e
To run the script every hour, enter:
00 * * * * /path/to/executablefile
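Since the backups follow a fixed naming pattern (web_DD_MM_YYYY above), it may be safer to restrict the match with -name so unrelated files are never touched. A sketch, where the directory and the web_ prefix are assumptions to adapt:

```shell
#!/bin/bash
# Hypothetical cleanup: delete only files named web_* that are
# more than 3 days old, without recursing into subdirectories.
backup_dir=/path/to/directory   # adjust to your backup location
find "$backup_dir" -maxdepth 1 -type f -name 'web_*' -mtime +3 -delete
```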

how to move jpg and jpeg files whose size is greater than 10kb [duplicate]

I have some automated downloads in a proprietary linux distro.
They go to a temp scratch disk, and I want to move them to the main RAID array when they're finished. The best way I can see to do this is to check whether a folder's contents have changed in the last minute; if not, it's probably finished downloading and can be moved.
Assuming there could be hundreds of folders (or just one) in this location, and it's all going to the same place, what's the best way to write this?
I can get a list of folder sizes with
du -h directory/name
The folders can contain multiple files anywhere from 1.5mb to 10GB
Temp Loc: /volume2/4TBScratch/Processing
Dest Loc when complete: /volume1/S/00 Landing
EDIT:
Using this:
find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec mv "{}" "/volume1/S/00 Landing" \;
find: `/volume2/4TBScratch/Processing/test': No such file or directory
4.3#
Yet it DOES copy the relevant folders and all files. But the error worries me that something might go wrong in the future... Is it because there are multiple files and it's running the same move command for EACH file or folder in the root folder, but since it moves everything on the first iteration it can't find it on the next ones?
EDIT2:
Using Rsync
4.3# find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec rsync --remove-source-files "{}" "/volume1/S/00 Landing" \;
skipping directory newtest
skipping directory erw
RESOLVED: EDIT3
Resolved with the help in the comments below. Final script looks like this:
find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec rsync -a --remove-source-files "{}" "/volume1/S/00 Landing" \;
find /volume2/4TBScratch/Processing -depth -type d -empty -delete
The rsync line moves folders and files but leaves empty source directories behind; the next command finds those empty folders and removes them.
Thanks all!
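The empty-directory cleanup in that final script can be verified in isolation; a sketch in a scratch directory:

```shell
dir=$(mktemp -d)
mkdir -p "$dir/emptied/sub" "$dir/busy"
touch "$dir/busy/file"
# -depth processes children before parents, so nested empty
# directories are removed bottom-up in a single pass:
find "$dir" -mindepth 1 -depth -type d -empty -delete
# "emptied" (and "sub") are removed; "busy" remains
rm -rf "$dir"
```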
You can use GNU find with the -size test and -exec mv to move matches to the destination directory. (One caveat: find evaluates -size against the directory entry itself, not against the total size of the directory's contents, so -size -10G on a directory is almost always true.) The syntax is
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -size -10G -exec mv "{}" "/volume1/S/00 Landing" \;
Using rsync
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -size -10G -exec rsync --remove-source-files "{}" "/volume1/S/00 Landing" \;
The - sign before the size means less than the given size, in this case 10GB. A note on each of the flags used:
-type d -> match only directories under the source path.
-maxdepth 1 -> look only in the source directory itself, without recursing.
-exec -> execute the command that follows.
Alternatively, if you want to find entries last modified within a certain number of minutes, find has the -mmin test. E.g. -mmin -5 matches files modified within the last five minutes.
So add -mmin -x with whatever x you need, check that the expected directories are listed, and then add the -exec option to move them:
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -mmin -2 -size -10G
Refer to the GNU documentation for finding files according to size on how this works.
Note: the double quotes ("") keep Bash from splitting names containing spaces.
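The -mmin part of the filter is straightforward to verify on its own; a sketch of the "not modified for N minutes, so probably finished" idea, in a scratch directory:

```shell
# List only directories whose mtime is more than 10 minutes old,
# i.e. likely finished downloads.
dir=$(mktemp -d)
mkdir "$dir/old" "$dir/fresh"
touch -d "20 minutes ago" "$dir/old"
find "$dir" -mindepth 1 -maxdepth 1 -type d -mmin +10
# prints only "$dir/old"
rm -rf "$dir"
```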

Delete files in dir but exclude 1 subdir

I have a dir that is full of many htm reports. I keep them around for 30 days and delete old ones via cron, but there is one sub-dir I would like to keep longer. This is the line in my crontab; how do I tell it to leave that one sub-dir alone?
5 0 * * * find /var/www -name "*.htm*" -type f -mtime +30 -exec rm -f {} \;
Any help is greatly appreciated!
Use -prune to prevent going into a directory that matches some conditions.
find /var/www -type d -name 'excluded-directory' -prune -o -name "*.htm*" -type f -mtime +30 -exec rm -f {} \;
In addition to the suggestion above, use the full path to find in the crontab.
Also consider find's -delete action in place of -exec rm -f {} \;. It is somewhat safer.
-delete
Delete found files and/or directories. Always returns true.
This executes from the current working directory as find recurses
down the tree. It will not attempt to delete a filename with a
"/" character in its pathname relative to "." for security
reasons. Depth-first traversal processing is implied by this
option. The -delete primary will fail to delete a directory if
it is not empty. Following symlinks is incompatible with this
option.
5 0 * * * /usr/bin/find /var/www -type d -name 'excluded-directory' -prune -o -name "*.htm*" -type f -mtime +30 -delete
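One caveat with that last crontab line: GNU find's -delete implies -depth, and -prune does nothing when -depth is in effect (find itself prints a warning about this), so files under excluded-directory would still be deleted. With an exclusion, either keep -exec rm -f {} \; as above, or express the exclusion with ! -path, which does work alongside -delete. A sketch in a scratch directory, with hypothetical names:

```shell
dir=$(mktemp -d)
mkdir -p "$dir/reports" "$dir/keep-me"
touch -d "40 days ago" "$dir/reports/a.htm" "$dir/keep-me/b.htm"
# "! -path" composes safely with -delete (unlike -prune, which is
# neutralized because -delete implies -depth):
find "$dir" -type f -name '*.htm*' -mtime +30 ! -path '*/keep-me/*' -delete
# reports/a.htm is deleted; keep-me/b.htm survives
rm -rf "$dir"
```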

find directory older than 3 days and zip all files in it

Can I find any directories older than 3 days, zip them, and then delete the directories?
I have 2 cases:
zip all directories in 1 zip under working directory
I tried
zip -rm ${WORKDIR}/$(date +%Y%m%d -d "${DAY_TO_ZIP} days ago").zip $(find ${WORKDIR} -daystart -mtime +${DAY_TO_ZIP} -type d ! -name "*.zip")
but this command zips all files, including non-directory files.
1 directory 1 zip, in the same path as the directory
thank you very much
Execute the command below to find all directories older than 3 days and zip them:
# find / -mtime +3 -type d -exec zip -r zipfile.zip {} +
-mtime +3 means modified more than 3 days ago.
-mtime -3 means modified less than 3 days ago.
-mtime 3 (with no + or -) means modified exactly 3 days ago.
Finally, to delete all the directories, execute the command below (note -rf: rm -f alone cannot remove directories):
# find / -mtime +3 -type d -exec rm -rf {} \;
find ./ -mtime +x -print -exec gzip {} \;
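For the "1 directory 1 zip" case the question also asks about, one sketch is a loop over find's output. It is shown here with tar, which is typically preinstalled; substitute zip -rm "$d.zip" "$d" for the loop body if you want zip archives. WORKDIR and the 3-day cutoff are placeholders:

```shell
# One archive per old top-level directory, then remove the source.
find "$WORKDIR" -daystart -mindepth 1 -maxdepth 1 -type d -mtime +3 |
while IFS= read -r d; do
    tar -czf "$d.tar.gz" -C "$(dirname "$d")" "$(basename "$d")" && rm -rf "$d"
done
```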

Deleting old files using cron job in Linux

I was setting up a cron job to delete log files older than 1 day; the command is below. I am doing this on an AWS Linux EC2 instance.
find /var/log/tomcat8/ -mindepth 1 -mtime +1 -delete
But I want to exclude .log files from deletion and delete only the files with the .gz extension. Can anybody tell me how to achieve that exclusion with find?
Just look for *.gz files and delete them:
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -delete
Before deleting, list the files to make sure you are removing the correct ones:
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -print
Print all the selected files:
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1
Then delete them:
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -exec rm {} \;
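The exclusion behavior is easy to confirm before touching real logs; a sketch in a scratch directory with hypothetical file names:

```shell
dir=$(mktemp -d)
touch -d "2 days ago" "$dir/app.log" "$dir/app.2022-10-20.log.gz"
# Deletes only the old .gz file; the plain .log is untouched
# because -name '*.gz' never matches it.
find "$dir" -mindepth 1 -name '*.gz' -mtime +1 -delete
rm -rf "$dir"
```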
