Bash: delete based on file date stamp - linux

I have a folder with a bunch of files. I need to delete all the files created before July 1st. How do I do that in a bash script?

I think the following should do what you want:
touch -t 201007010000 dummyfile
find /path/to/files -type f ! -newer dummyfile -delete
The first line creates a dummy file whose last-modified time is 1 July 2010. The second line finds all files in /path/to/files whose modification time is not newer than dummyfile, and deletes them.
If you want to double check it is working correctly, then drop the -delete argument and it should just list the files which would be deleted.
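Put together as a small script, with the dry run first (the path and cutoff timestamp are the question's placeholders; using mktemp for the reference file keeps the dummy out of the target directory):

```shell
#!/bin/sh
# Sketch: delete files last modified before 2010-07-01, with a dry run first.
dir=/path/to/files
ref=$(mktemp)                   # reference file outside $dir
touch -t 201007010000 "$ref"    # mtime = 2010-07-01 00:00

# Dry run: list what would be deleted
find "$dir" -type f ! -newer "$ref" -print

# Once the list looks right, delete for real:
# find "$dir" -type f ! -newer "$ref" -delete

rm -f "$ref"
```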

This should work:
find /file/path ! -newermt "Jul 01"
That finds the files you want to delete, so the command to delete them would be:
find /file/path ! -newermt "Jul 01" -type f -print0 | xargs -0 rm


How to get MD5 checksum of newest files in directory?

I use the following command to put the checksums of all the files on my hard drive into a new text file. Is there a way to modify it to only cover files added after a certain date? (For example, August 28th 2021 and onwards.)
find -type f -exec md5sum "{}" + > checklist.txt
Try using xargs with find to apply md5sum to each file that find finds.
For example, this command will give you the md5sum for each file in the current directory modified since Aug 15, 2021:
find . -type f -newermt '2021-08-15' -print0 | xargs -0 md5sum

Delete all files older than 30 days, based on file name as date

I'm new to bash. I have a task to delete all files older than 30 days; I can figure this out based on the file name, which is a date in Y_M_D.ext form, e.g. 2019_04_30.txt.
I know I can list all files with ls in the folder containing the files. I know I can get today's date with $ date and can format it to match the file names with $ date "+%Y_%m_%d".
I know I can delete files using rm.
How do I tie all this together into a bash script that deletes files older than 30 days from today?
In pseudo-Python code I guess it would look like:
for file in folder:
    if date(file.name) is more than 30 days before today:
        delete file
I am by no means a systems administrator, but you could consider a simple shell script along the lines of:
# Generate the date in the proper format
discriminant=$(date -d "30 days ago" "+%Y_%m_%d")
# Find files based on the filename pattern and test against the date.
find . -maxdepth 1 -type f -name "*_*_*.txt" -printf "%P\n" |
while IFS= read -r FILE; do
    if [ "${discriminant}" ">" "${FILE%.*}" ]; then
        echo "${FILE}"
    fi
done
Note that this will probably be considered a "layman" solution by a professional. Maybe this is handled better by awk, which I am unfortunately not accustomed to using.
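Since awk was mentioned: a sketch of the same filename comparison done there (the directory, filename pattern, and GNU date/find options are the same assumptions as in the script above):

```shell
# Sketch: the same filename-based cutoff test, with the comparison in awk.
# Prints the names whose embedded Y_M_D date is older than 30 days ago.
cutoff=$(date -d "30 days ago" "+%Y_%m_%d")    # GNU date
find . -maxdepth 1 -type f -name "*_*_*.txt" -printf "%P\n" |
awk -v cutoff="$cutoff" '
{
    name = $0
    sub(/\.[^.]*$/, "", name)      # strip the extension
    if (name < cutoff) print $0    # string compare works for Y_M_D names
}'
```

Once the printed list looks right, the output can be piped to `xargs -d '\n' rm` to do the actual deletion.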
Here is another solution to delete log files older than 30 days:
#!/bin/bash
# A table that contains the paths of the directories to clean
rep_log=("/etc/var/log" "/test/nginx/log")
echo "Cleaning logs - $(date)."
# Loop over each path provided by rep_log
for element in "${rep_log[@]}"
do
    # Display the directory
    echo "$element"
    nb_log=$(find "$element" -type f -mtime +30 -name "*.log*" | wc -l)
    if [[ $nb_log != 0 ]]
    then
        find "$element" -type f -mtime +30 -delete
        echo "Successful!"
    else
        echo "No log to clean!"
    fi
done
This allows including multiple directories in which to delete files:
rep_log=("/etc/var/log" "/test/nginx/log")
Then we fill the variable: we search (in each directory provided) for files which are older than 30 days and whose name contains .log, and count the number of files:
nb_log=$(find "$element" -type f -mtime +30 -name "*.log*" | wc -l)
We then check whether the result is other than 0 (positive); if yes, we delete:
find "$element" -type f -mtime +30 -delete
To delete files older than X days you can use one of these commands and schedule it in /etc/crontab:
find /PATH/TO/LOG/* -mtime +10 | xargs -d '\n' rm
or
find /PATH/TO/LOG/* -type f -mtime +10 -exec rm -f {} \;
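For the crontab route, a sketch of what the /etc/crontab entry could look like (the path, schedule, and 10-day retention are placeholders):

```shell
# /etc/crontab uses an extra "user" field; this runs daily at 03:00.
# m  h  dom mon dow  user  command
  0  3  *   *   *    root  find /PATH/TO/LOG -type f -mtime +10 -delete
```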

Copy specific named directories which content changed the last 24 hours

I can recursively find and copy all my test directories (with content) of the current directory:
find . -name test ! -path "./my_dest/*" -exec cp -r --parents {} /path/to/my_dest \;
But now I want to copy only those test directories (with content) whose content was changed within the last 24 hours.
What do I have to add to my line above?
Edit: I want the same results as my find line above, but only those entries in which a folder or a file has been changed within the last 24 hours (or some other period).
The line
find . -name test ! -path "./my_dest/*" ! -ctime +0 -exec cp -r --parents {} /path/to/my_dest \;
does not do that! It finds and copies only the test folders whose own directory entry changed, not those in which a contained file changed.
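One way to get there, reusing the question's cp command: have an inner find check each test directory for anything modified in the last 24 hours before copying it (a sketch; the paths are the question's placeholders):

```shell
# Sketch: copy only those "test" directories in which some file or
# subdirectory was modified within the last 24 hours.
src=.
dest=/path/to/my_dest

find "$src" -name test -type d ! -path "$src/my_dest/*" | while IFS= read -r d; do
    # -mtime -1 matches anything modified less than 24 hours ago;
    # -print -quit stops at the first match, so the inner find is cheap
    if [ -n "$(find "$d" -mtime -1 -print -quit)" ]; then
        cp -r --parents "$d" "$dest"
    fi
done
```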
You could also use the rsync command, which is built specifically for this kind of task; see its documentation and manual page.

How do you delete files older than specific date in Linux?

I used the below command to delete files older than a year.
find /path/* -mtime +365 -exec rm -rf {} \;
But now I want to delete all files whose modified time is older than 01 Jan 2014. How do I do this in Linux?
This works for me:
find /path ! -newermt "YYYY-MM-DD HH:MM:SS" | xargs rm -rf
You can touch your timestamp as a file and use that as a reference point:
e.g. for 01-Jan-2014:
touch -t 201401010000 /tmp/2014-Jan-01-0000
find /path -type f ! -newer /tmp/2014-Jan-01-0000 | xargs rm -rf
This works because find has a -newer switch that we're using.
From man find:
-newer file
    File was modified more recently than file. If file is a symbolic link and the -H option or the -L option is in effect, the modification time of the file it points to is always used.
This other answer pollutes the file system, and find itself offers a -delete option, so we don't have to pipe the results to xargs and then issue an rm.
This answer is more efficient:
find /path -type f -not -newermt "YYYY-MM-DD HH:MM:SS" -delete
find ~ -type f -atime +4 | xargs ls -lrt
This lists files last accessed more than 4 days ago, searching from the home directory.

Linux command to check new files in file system

We have a Linux machine and would like to check what new files have been added within a certain date range.
I only have SSH access to this box and it's openSUSE 11.1.
Is there some sort of command that can give me a list of files that have been added to the filesystem between, say, 04/05/2011 and 05/05/2011?
There are a bunch of ways of doing that.
First one:
start_date=201105040000
end_date=201105052359
touch -t ${start_date} start
touch -t ${end_date} end
find /your/path -type f -name '*your*pattern*' -newer start ! -newer end -exec ls -s {} \;
Second one:
find files whose status changed between 20 and 21 days ago:
find . -ctime +20 -ctime -21
find files whose status changed between 2500 and 2800 minutes ago:
find . -cmin +2500 -cmin -2800
Well, you could use find to get a list of all the files that were last-modified in a certain time window, but that isn't quite what you want. I don't think you can tell just from a file's metadata when it came into existence.
Edit: To list the files along with their modification dates, you can pipe the output of find through xargs to run ls -l on all the files, which will show the modification time.
find /somepath -type f ... -print0 | xargs -0 -- ls -l
I misunderstood your question. Depending on what filesystem you are using, it may or may not store creation time.
My understanding is that ext2/3/4 do not store creation time, but modified, changed (status, which is slightly different), and access times are.
Fat32 on the other hand does contain creation timestamps IIRC.
If you are using an ext filesystem, you have two options it seems:
1. Settle for finding all of the files that were modified between two dates (which will include created files, but also files that were just edited). You could do this using find.
2. Create a script/cronjob that will document the contents of your filesystem at some interval, e.g.
find / > filesystem.$(date +%s).log
and then run diffs to see what has been added. This, of course, would prevent you from looking backwards to time before you started making these logs.
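A sketch of that snapshot-and-diff idea (paths and snapshot names are placeholders; sorting lets comm do the comparison):

```shell
# Sketch: record the file list at intervals, then compare two snapshots.
find /somepath -type f | sort > /var/log/fs-snapshot.old
# ... later, after files may have been added ...
find /somepath -type f | sort > /var/log/fs-snapshot.new

# comm -13 prints lines only in the second (new) file: the added files
comm -13 /var/log/fs-snapshot.old /var/log/fs-snapshot.new
```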
You can try one of these:
find -newerct "1 Aug 2013" ! -newerct "1 Sep 2013" -ls
find . -newermt "Jan 1, 2013 23:59:59" ! -newermt "Jan 2, 2016 23:59:59"
find /media/WD/backup/osool/olddata/ -newermt 20120101T1200 -not -newermt 20130101T1400
find . -mtime +1 -mtime -3
find . -mtime +1 -mtime -3 > files_from_yesterday.txt 2>&1
find . -mtime +1 -mtime -3 -ls > files_from_yesterday.txt 2>&1
touch -t 200506011200 first
touch -t 200507121200 last
find / -newer first ! -newer last
#!/bin/bash
for i in $(find Your_Mail_Dir/ -newermt "2011-01-01" ! -newermt "2011-12-31"); do
    mv "$i" /moved_emails_dir/
done
Hope this helps.
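If the mail file names may contain spaces, a word-splitting-safe variant of the same move uses find's -exec instead of a shell for loop (same placeholder paths as above):

```shell
# Sketch: move files modified during 2011 without a shell for loop,
# so names with spaces survive. -t is GNU mv's "target directory" flag.
find Your_Mail_Dir/ -type f -newermt "2011-01-01" ! -newermt "2011-12-31" \
    -exec mv -t /moved_emails_dir/ {} +
```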