Script to delete selected files older than 60 days in Linux

I have a requirement to delete files older than 60 days from our growing Linux server. There is a folder for every month, and files are stored in them. I just want to delete files from all the monthly folders EXCEPT the December folder (named 2012_12).
What condition should I put in the script so it does not touch files in folders whose names end in '_12'?

find /path/ -name '*_12' -prune -o -type f -mtime +59 -exec rm -f {} +
(-exec rm is used here rather than -delete, because -delete switches find to depth-first traversal, which disables -prune and would descend into the *_12 folders after all.)
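An equivalent sketch that keeps -delete (the path is a placeholder, as above) skips anything under a directory ending in _12 by matching on the path rather than pruning; -path is supported by both GNU and BSD find:
find /path/ -type f -not -path '*_12/*' -mtime +59 -delete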

Related

find and delete command deletes all files in that folder

Currently, I am using this command to delete all files older than 30 minutes on my Linux server.
sudo find /var/www/html/folder/* -type f -mmin +30 -delete
But it deletes all the files in that folder irrespective of their age. What's wrong with this?
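One way to check what that command is actually matching (a debugging sketch, not from the original thread) is to swap -delete for -print and compare the output against ls -l timestamps; -mmin tests the modification time, so files copied or moved in with their old timestamps preserved will match even if they only just arrived:
sudo find /var/www/html/folder/ -type f -mmin +30 -print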

bash delete older files

I have a requirement to find files that are more than 2 years old and delete them, and not only the files but also the corresponding directories once they are empty. I have written most of the logic; the only thing still pending is: when I delete a particular file from a directory, how can I then delete that directory once it is empty? When I delete the file, the directory's ctime/mtime gets updated accordingly, so how do I target those corresponding older directories and delete them?
Any pointers will be helpful.
Thanks in advance.
I would do something like this:
find /path/to/files* -mtime +730 -delete
-mtime +730 finds files which are older than 730 days.
Please be careful with this kind of command, though: run find /path/to/files* -mtime +730 first and check that these really are the files you want to delete!
Edit:
Now that you have deleted the files from the directories, their modification times have changed, so -mtime +730 won't match them any more.
To delete all empty directories that you have recently altered:
find . -type d -mmin -60 -empty -delete
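If the goal is simply to remove every directory that ends up empty after the purge, regardless of when it was last touched, a variation on the two commands above (the path is a placeholder) is:
find /path/to/files -type f -mtime +730 -delete
find /path/to/files -type d -empty -delete
Because -delete implies depth-first traversal, nested directories that only contained other empty directories are removed as well.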

Exclude directories and delete old backup

I am using the following simple script to take a backup of all of my websites via tar:
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
#Parent backup directory
backup_parent_dir="/backup/httpdocs"
#Create backup directory and set permissions
backup_date=`date +%Y_%m_%d_%H_%M`
backup_dir="${backup_parent_dir}/${backup_date}"
echo "Backup directory: ${backup_dir}"
mkdir -p "${backup_dir}"
chmod 755 "${backup_dir}"
SRCDIR=/var/www/sites #Location of Important Data Directory (Source of backup).
tar -cpzf $backup_dir/$FILENAME $SRCDIR
Now, this is working fine, but there are 2 things I would like to do via the same script:
Can I exclude some folders within the /var/www/sites directory? For example, I don't want the /var/www/sites/abc.com/logs folder to be backed up. Can I define it, and some other sub-directories, within this script?
This script stores every site as a tarball in the specified folder /backup/httpdocs via a cron job that runs nightly. Old tarballs (older than 7 days) I currently have to delete manually, so is there any possibility, through the same script, that when it runs it checks whether any backup older than 7 days exists and deletes it automatically?
EDIT:
Thanks everyone. This is what I am using now; it takes the backup excluding log files and deletes anything older than 7 days:
#!/bin/bash
#START
TIME=`date +%b-%d-%y` # This Command will add date in Backup File Name.
FILENAME=backup-$TIME.tar.gz # Here i define Backup file name format.
# Parent backup directory
backup_parent_dir="/backup/httpdocs"
# Create backup directory and set permissions
backup_date=`date +%Y_%m_%d_%H_%M`
backup_dir="${backup_parent_dir}/${backup_date}"
echo "Backup directory: ${backup_dir}"
mkdir -p "${backup_dir}"
chmod 755 "${backup_dir}"
SRCDIR=/var/www/vhosts # Location of Important Data Directory (Source of backup).
tar -cpzf $backup_dir/$FILENAME $SRCDIR --exclude=$SRCDIR/*/logs
find ${backup_parent_dir} -name '*' -type d -mtime +2 -exec rm -rfv "{}" \;
#END
Exclude from tar:
tar -cpzf $backup_dir/$FILENAME --exclude=/var/www/sites/abc.com/logs $SRCDIR
Find and delete old backups:
find ${backup_parent_dir} -type f -name 'backup-*.tar.gz' -mtime +7 -delete
The filter of find is conservative: selecting names that match backup-*.tar.gz probably makes the -type f (files only) test redundant; I added it just in case you also have directories with such names. The -mtime +7 option should be double-checked on your side, because "older than 7 days" is not precise enough; depending on what you have in mind it may be +6, +7 or +8. Please have a look at the find man page and decide for yourself. Note that the selection of backups to delete is not based on their names but on their date of last modification; if you modify them after they are created, it may not be what you want. Let us know.
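If a calendar-style cutoff is easier to reason about than juggling +6/+7/+8, a sketch using -newermt (assuming a reasonably recent GNU findutils, since this test is not in POSIX find) expresses "not modified within the last 7 days" directly:
find ${backup_parent_dir} -type f -name 'backup-*.tar.gz' ! -newermt '7 days ago' -delete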
Use the --exclude option
tar -cpzf $backup_dir/$FILENAME $SRCDIR --exclude=$SRCDIR/*/logs
Name your backup files with an identifier that is derived from the day of the week. This will ensure the new file for each day will overwrite any existing file.
Day:
%a Day of the week - abbreviated name (Mon)
%A Day of the week - full name (Monday)
%u Day of the week - number (Monday = 1)
%d Day of the month - 2 digits (05)
%e Day of the month - digit preceded by a space ( 5)
%j Day of the year - (001..366)
%w Day of the week - number (Sunday = 0)
From: http://ss64.com/bash/date.html
For example:
TARGET_BASENAME=`date +%u`
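A minimal sketch of how that could slot into the script from the question (reusing its variable names, and writing straight into ${backup_parent_dir} so that, say, this Monday's file overwrites last Monday's):
TARGET_BASENAME=`date +%u`   # 1 = Monday ... 7 = Sunday
FILENAME="backup-${TARGET_BASENAME}.tar.gz"
tar -cpzf "${backup_parent_dir}/${FILENAME}" --exclude="$SRCDIR/*/logs" "$SRCDIR"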
Option 1, when there are not many files/directories to exclude:
tar -cpzf $backup_dir/$FILENAME --exclude=$SRCDIR/dir_ignore --exclude=$SRCDIR/*.log $SRCDIR
Or, if you have many entries to exclude, it is much better to put them in a file:
tar -cpzf $backup_dir/$FILENAME -X /path/to/exclude.txt $SRCDIR
where the /path/to/exclude.txt file looks like:
/var/www/dir_to_ignore
/var/www/*.log
You cannot use variables in the exclude file, but you can use wildcards.
The second question was already answered very well by both of the guys before me; I personally love:
find ${backup_parent_dir} -type f -name 'backup-*.tar.gz' -mtime +7 -delete

Copy N days old files on Linux

Good morning,
I have many files inside directories and subdirectories, and right now I am copying everything inside with:
find /tmp/temp/ -name '*files.csv' -type f -exec cp -u {} /home/dir/Desktop/dir1/ \;
I was wondering if there is any way to copy only files whose modification date is within the last two days. I don't want to copy a file if its modification date is more than 2 days before the current date.
You can use -mtime within your find command:
find /tmp/temp/ -type f -mtime -2 -name '*files.csv' -exec cp -u {} /home/dir/Desktop/dir1/ \;
This would copy only files with a modified time within the last two days of the system time.
-mtime n
File's data was last modified n*24 hours ago
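Because n is truncated to whole 24-hour blocks, the sign matters; a quick reference using the same path and pattern as above:
find /tmp/temp/ -type f -mtime -2 -name '*files.csv'   # modified less than 48 hours ago
find /tmp/temp/ -type f -mtime 2 -name '*files.csv'    # modified between 48 and 72 hours ago
find /tmp/temp/ -type f -mtime +2 -name '*files.csv'   # modified 72 hours ago or more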

In linux shell, How to cp/rm files by time?

In the Linux shell, when I run
ls -al -t
it shows the times of the files.
How can I cp/rm files by time? For example, copy all the files that were created today or yesterday. Thanks a lot.
Depending on what you actually want to do, find provides -atime, -ctime and -mtime options for finding files by access, status-change, or modification time, along with -newer and the minute-granularity -amin/-cmin/-mmin variants. You can combine them with -exec to copy, delete, or whatever else you want to do. For example:
find -maxdepth 1 -mtime +1 -type f -exec cp '{}' backup \;
Will copy all the regular files in the current directory more than 1 day old to the directory backup (assuming the directory backup exists).
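For the "created today or yesterday" part of the question, the same structure works with the test flipped to a negative age (still assuming that backup directory exists):
find -maxdepth 1 -type f -mtime -2 -exec cp '{}' backup \;   # copies files modified within the last 48 hours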
Simple Example
find /path/to/folder/ -mtime 1 -exec rm {} \;   # deletes files last modified between 24 and 48 hours ago ("yesterday")
For more examples, google for "bash find time".
