Deleting old files using a cron job in Linux

I was setting up a cron job to delete log files older than 1 day. The command to do this is below. I am doing this on an AWS Linux EC2 instance.
find /var/log/tomcat8/ -mindepth 1 -mtime +1 -delete
But what I want is to exclude .log files from deletion and delete only the files with a .gz extension. Can anybody let me know how to achieve that exclusion with the find command?

Just look for *.gz files and delete them.
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -delete
Before deleting, just list the files to make sure you are deleting the correct ones.
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -print

List all selected files as below:
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1
The files above can then be deleted as below:
find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -exec rm {} \;
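For the scheduling part of the question, a minimal sketch of a crontab entry that runs the cleanup once a day at 02:30 (the schedule itself is an assumption; adjust as needed):
30 2 * * * find /var/log/tomcat8/ -mindepth 1 -name '*.gz' -mtime +1 -delete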

Related

How to delete 3-day-old backup files automatically on a server? My backup files look like:

web_20_10_2022
web_21_10_2022
web_22_10_2022
web_23_10_2022
web_24_10_2022
web_25_10_2022
I need a script that automatically deletes backup files older than 3 days from the server, run via a cron job.
Below are some commands I have tried, but they are not working, so please help me out.
find /home -type f -mtime +3 -exec rm -rf {} +
find /home* -mtime +3 -type f -delete
find /home -mtime +3 -type d -exec rmdir {} ;
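One likely reason the last attempt fails: the ; that terminates -exec must be escaped from the shell, otherwise find never sees it and reports a missing argument to -exec. A corrected sketch of that attempt:
find /home -mtime +3 -type d -exec rmdir {} \;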
Kindly run the command below to find the files older than 3 days.
find /path/to/directory/* -mtime +3 -print
This lists the files under the given directory that are older than 3 days.
To delete the 3 day old files:
find /path/to/directory/* -type f -mtime +3 | xargs rm -f
or
find /path/to/directory/* -type f -mtime +3 -exec rm -f {} \;
or
find /path/to/directory/* -type f -mtime +3 -delete
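Note that the xargs variant can break on file names containing spaces or newlines; if that is a concern here, a null-delimited sketch is safer:
find /path/to/directory/* -type f -mtime +3 -print0 | xargs -0 rm -f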
To automate the process:
Copy the code below and save it as a file in a directory where you are not performing any housekeeping activity.
#!/bin/bash
find /path/to/directory/* -mtime +3 -delete
Make the file executable and set up a cron job.
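For example, assuming the script was saved as /path/to/executablefile:
chmod +x /path/to/executablefile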
crontab -e
To run the script every hour, enter
00 * * * * /path/to/executablefile
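Afterwards you can confirm the job was saved by listing the installed entries:
crontab -l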

how to move jpg and jpeg files whose size is greater than 10kb [duplicate]

I have some automated downloads in a proprietary linux distro.
They go to a temp scratch disk, and I want to move them to the main RAID array when they're finished. The best way I can see to do this is to check the folders on the disk to see if the contents have changed in the last minute; if not, the download has probably finished and the folder can be moved.
Assuming there could be hundreds of folders (or just one) in this location, and it's all going to the same place, what's the best way to write this?
I can get a list of folder sizes with
du -h directory/name
The folders can contain multiple files, anywhere from 1.5 MB to 10 GB.
Temp Loc: /volume2/4TBScratch/Processing
Dest Loc when complete: /volume1/S/00 Landing
EDIT:
Using this:
find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec mv "{}" "/volume1/S/00 Landing" \;
find: `/volume2/4TBScratch/Processing/test': No such file or directory
4.3#
Yet it DOES copy the relevant folders and all files. But the error worries me that something might go wrong in the future. Is it because there are multiple files and it's running the same move command for EACH file or folder in the root folder, but since it moves everything on the first iteration it can't find it on the next ones?
EDIT2:
Using Rsync
4.3# find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec rsync --remove-source-files "{}" "/volume1/S/00 Landing" \;
skipping directory newtest
skipping directory erw
RESOLVED: EDIT3
Resolved with the help in the comments below. Final script looks like this:
find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec rsync -a --remove-source-files "{}" "/volume1/S/00 Landing" \;
find /volume2/4TBScratch/Processing -depth -type d -empty -delete
rsync to move folders and files but leaves empty root dir
the next command finds empty folders and removes them.
Thanks all!
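For reference, a sketch wrapping the two resolved commands into one script that cron can call (same paths as above):
#!/bin/bash
# Move directories touched within the last 10 minutes to the array, then delete the empty shells left behind.
find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec rsync -a --remove-source-files "{}" "/volume1/S/00 Landing" \;
find /volume2/4TBScratch/Processing -depth -type d -empty -delete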
You can use GNU find with the -size test to detect files/folders under a certain size, and -exec with mv to move them to the destination directory. The syntax is
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -size -10G -exec mv "{}" "/volume1/S/00 Landing" \;
Using rsync
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -size -10G -exec rsync -a --remove-source-files "{}" "/volume1/S/00 Landing" \;
The size is given with a - sign to indicate less than the stated size, in this case 10 GB. A note on each of the flags used:
-type d -> identify only the folders in the source path.
-maxdepth 1 -> look only at the top level of the source directory, without recursing.
-exec -> execute the command that follows it.
Alternatively, if you want to find files last modified within a certain time (in minutes), find has the -mmin option. E.g. -mmin -5 would return files modified within the last five minutes.
So add it with whatever value of x you need, check that the expected directories are listed, and then attach the -exec option to move them, as sketched after the next command.
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -mmin -2 -size -10G
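Once the listing looks right, the move can be attached in the same way (a sketch keeping this answer's flags):
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -mmin -2 -size -10G -exec mv "{}" "/volume1/S/00 Landing" \;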
Refer to the GNU documentation for finding files according to size on how this works.
Note: the double quotes ("") are added to keep Bash from splitting names that contain spaces.

How to exclude multiple subdirectories (same directory name) when using find command to delete files older than 30 days in a batch file?

Given the Linux directory structure below, how would I skip each excluded directory (/assess) and remove files older than 30 days from the rest of the entire directory structure?
Thanks for your assistance...
/mnt/nfsmountpoint/location1/appliance1
/assess
/discover
/bkup
/mnt/nfsmountpoint/location1/appliance2
/assess
/discover
/bkup
etc...
I cobbled together an answer and proofs:
Run from the /mnt/nfsmountpoint/ directory.
find ./*/* -not -path "*/assess/*" -type f -mtime +30 -delete
Validate:
Does it skip the directory?
find ./*/* -not -path "*/assess/*" -type f -mtime +30 -ls | more
Verify no current month (January 2021) files are included:
find ./*/* -not -path "*/assess/*" -type f -mtime +30 -ls | grep Jan
How much space is made free?
find ./*/* -not -path "*/assess/*" -type f -mtime +30 -print0 | du --files0-from=- -hc | tail -n1
find /path/to/dir -type f -mtime +30 -not -path "*/assess/*" -delete
Find files (-type f) in /path/to/dir and its child directories, select only files modified more than 30 days ago (-mtime +30), and skip anything under an assess directory (-not -path "*/assess/*").
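An alternative is -prune, which stops find from descending into the excluded directories at all. Note that -delete implies -depth, which disables -prune, so the deletion is done with -exec rm instead (a sketch using the mount point from the question):
find /mnt/nfsmountpoint -type d -name assess -prune -o -type f -mtime +30 -exec rm -f {} +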

How to write find command to delete 7 days older files with selected JPGs ?

I use Linux (CentOS), and I need to remove .jpg files that are more than 7 days old, but without deleting the .jpg files in the main directory.
example: find /users/mac/desktop/test/*
will output:
/users/mac/desktop/test/test.jpg
/users/mac/desktop/test/test01
/users/mac/desktop/test/test01/test01.jpg
/users/mac/desktop/test/test02
/users/mac/desktop/test/test02/test02.jpg
But I only need to delete these two .jpg files:
/users/mac/desktop/test/test01/test01.jpg
/users/mac/desktop/test/test02/test02.jpg
find /users/mac/desktop/test -mindepth 2 -type f -name '*.jpg' -mtime +7 -delete
-mindepth 2 ignore the "main directory"
-type f only files
-name '*.jpg' only jpg files
-mtime +7 the modification time of the file is older than 7 days
-delete delete them
Alternatively, start find below the main directory with a shell glob, then apply the same tests:
find /users/mac/desktop/test/*/* -type f -name '*.jpg' -mtime +7 -delete

Delete files older than certain days that not have certain substring in name (Linux)

I have a folder with backup files with names like:
backup_2017_12_01__09_00_01.sql.gz
backup_2017_12_01__10_00_01.sql.gz
...
backup_2017_12_01__19_00_01.sql.gz
backup_2017_12_01__20_00_01.sql.gz
backup_2017_12_02__09_00_01.sql.gz
backup_2017_12_02__10_00_01.sql.gz
...
backup_2017_12_02__19_00_01.sql.gz
backup_2017_12_02__20_00_01.sql.gz
and so on.
I have a cron that should perform the deletion of the files respecting these rules:
delete all files older than 45 days; solved with find . -mtime +45 -exec rm {} \;
delete all files older than 7 days except those with the string __20_ in the name (the last backup in the evening); a command that is based on the last modification time rather than the name would also be fine
Can someone help me on the second point?
Thanks.
find /p/a/t/h \( -mtime +45 -o \( -mtime +7 ! -name '*__20_*' \) \) -delete
If you want, you can be more explicit:
find /p/a/t/h \( -mtime +45 -o \( -mtime +7 -and ! -name '*__20_*' \) \) -delete
Note that you should be more precise with language. This does not delete files that are "older than 45 days": it deletes files based on their modification time, which can be very different from a file's age.
With find's -name test:
find . -type f -name "*.gz" ! -name "*__20_*.gz" -mtime +7 -delete
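With either variant, it is worth previewing the matches first by replacing -delete with -print, for example:
find /p/a/t/h \( -mtime +45 -o \( -mtime +7 ! -name '*__20_*' \) \) -print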
