How to find directories older than date when the date is encoded to dirname as yyyy-mm-dd - linux

I want to purge old folders with all their content using GNU find. I'm familiar with -mtime, but the timestamps of the directories are usually clobbered by rsync, so they can't be trusted. Luckily, the timestamp is encoded in the directory name as yyyy-mm-dd.
How can I do the same using the directory name instead of the timestamp? A read-efficient solution is preferred.
EDIT:
Corrupted time stamps:
drwxr-xr-x 2 user user 8192 Aug 23 11:12 2016-05-03
drwxr-xr-x 2 user user 8192 Aug 23 11:12 2016-05-04
drwxr-xr-x 2 user user 8192 Aug 23 11:12 2016-05-05
The files inside the dirs have correct time stamps.
idea: purge the files with find -mtime and then (in a second round) purge the empty dirs. Most likely it is not possible to perform both in one round, since -empty would apply to files as well.
idea: fix the timestamps of the directories (according to their names) in one round and then purge everything by find -mtime in another round. But the next regular rsync will corrupt them again, so the cron jobs must be tuned against race conditions.
idea: convert -mtime +150 to yyyy-mm-dd (using date -d "-150 days") and then compare this string with the folder name, as suggested in the answer by #xvan
I ask for help finding the best way.

You may use bash lexicographical compare:
if [[ "2010-01-01" < "2011-02-02" ]]
then echo "yes"
fi
EDIT: It's a bit hard to escape < inside -exec, and substituting {} directly into the bash -c string is unsafe; passing the name as a positional argument worked for me (${1#./} strips find's leading ./):
find . ! -path . -type d -exec bash -c '[[ "${1#./}" < "2010-02-02" ]]' _ {} \; -print
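The question's third idea (derive the cutoff from an age with date -d, then compare names) can be sketched like this. The /backups path and the 150-day age are placeholder assumptions, not values from the question:

```shell
# Sketch of idea 3: delete directories whose yyyy-mm-dd name is older
# than a cutoff derived from an age. /backups and "150 days" are
# placeholders; adjust both to your setup.
cutoff=$(date -d "-150 days" +%F)

find /backups -mindepth 1 -maxdepth 1 -type d -printf '%f\n' |
while IFS= read -r name; do
    # Lexicographic compare is chronological for zero-padded yyyy-mm-dd.
    if [[ "$name" < "$cutoff" ]]; then
        echo rm -rf "/backups/$name"   # drop 'echo' once the list looks right
    fi
done
```

Keeping the comparison on the bare directory name (%f) avoids the leading ./ that trips up the inline -exec comparison.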

Related

Inputing directories/files into a text file and then showing files older than certain date

I'm using "ls -alR" to input directories and files in those directories into another text file.
ls -alR > testalR.txt
text file is created like so:
./test-file_folders/test_folder_1:
total 400
drwx------ 5 "user" "group" "size" May 2 10:30 test_file_1
.... file info ... more file info ....test_file_2
...more files
./test-file_folders/test_folder_2:
total 400
drwx------ 5 "user" "group" "size" Oct 2 11:35 test_file_1
.... file info ... more file info ....test_file_2
...more files
I am trying to show files that have not been accessed since October 2 2018.
I've tried:
sed -n '/Oct 2 00:00/,/Oct 2 23:59/p' ./testalR.txt
..but it produces no results. Is there a better way to display this or even possible?
Sorry, I should have added this to begin with. I know a find -atime variant would be the best option, but we are using a system and a process that are bogged down by the find command. I am looking for alternatives so that find can be avoided and I wouldn't have to walk the directories each time I want to run a search.
Parsing the output of ls is a slippery slope.
Use find:
find . -type f -atime +3 -print
find . -type f -atime +3 -exec ls -uld {} +
Using -print simply returns a list of the filenames. Using -exec ls -uld {} + causes ls to be run on every file returned, giving you the details you may want.
The argument to -atime (or -mtime or -ctime) is in 24-hour steps. The argument can be positive or negative (or zero). Using -atime +3 finds files that were last accessed at least four days ago.
Using -exec ... {} + causes the command in "..." to be executed for every object returned, bundling as many objects (files) as possible at a time. This is much more efficient than forking a process for every file returned, as with:
... -exec ls -uld {} \;
One way to limit your results to a specific date, is to create two reference points (files) like this:
touch -amt 201809302359 f1
touch -amt 201810012359 f2
find . -type f \( -anewer f1 -a ! -anewer f2 \) -exec ls -uld {} +
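A minimal self-contained demonstration of the reference-file window; all paths are throwaway temp files, not anything from the question:

```shell
# Three files with access times before, inside, and after the window.
dir=$(mktemp -d)
touch -amt 201809291200 "$dir/old"   # accessed before the window
touch -amt 201810011000 "$dir/hit"   # accessed inside the window
touch -amt 201810021200 "$dir/new"   # accessed after the window

# The two reference points, kept outside the searched directory so
# they cannot match themselves.
f1=$(mktemp); touch -amt 201809302359 "$f1"   # window start
f2=$(mktemp); touch -amt 201810012359 "$f2"   # window end

# -anewer compares each file's access time against the reference's
# modification time, so only "hit" is listed.
find "$dir" -type f -anewer "$f1" ! -anewer "$f2" -exec ls -uld {} +

rm -rf "$dir" "$f1" "$f2"
```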
Try find:
find /folder -atime +30
where +30 is the age in days. For other parameters, see man find.

Shell script to find recently modified files [duplicate]

E.g., a MySQL server is running on my Ubuntu machine. Some data has been changed during the last 24 hours.
What (Linux) scripts can find the files that have been changed during the last 24 hours?
Please list the file names, file sizes, and modified time.
To find all files modified in the last 24 hours (last full day) in a specific directory and its sub-directories:
find /directory_path -mtime -1 -ls
Should be to your liking
The - before 1 is important: it means anything changed one day or less ago.
A + before 1 would instead mean anything changed at least one day ago, while having nothing before the 1 would mean it was changed exactly one day ago, no more, no less.
Another, more humanist way, is to use -newermt option which understands human-readable time units.
Unlike the -mtime option, which requires the user to read the find documentation to figure out what time units -mtime expects and then convert their duration into those units, which is error-prone and plain user-unfriendly. -mtime was barely acceptable in the 1980s, but in the 21st century it has the convenience and safety of stone-age tools.
Example uses of -newermt option with the same duration expressed in different human-friendly units:
find /<directory> -newermt "-24 hours" -ls
find /<directory> -newermt "1 day ago" -ls
find /<directory> -newermt "yesterday" -ls
You can do that with
find . -mtime 0
From man find:
[The] time since each file was last modified is divided by 24 hours and any remainder is discarded. That means that to
match -mtime 0, a file will have to have a modification in the past which is less than 24 hours ago.
On GNU-compatible systems (i.e. Linux):
find . -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -r | more
This will list files and directories that have been modified in the last 24 hours (-mtime 0). It will list them with the last modified time in a format that is both sortable and human-readable (%T+), followed by the file size (%s), followed by the full filename (%p), each separated by tabs (\t).
2>/dev/null throws away any stderr output, so that error messages don't muddy the waters; sort -r sorts the results by most recently modified first; and | more lists one page of results at a time.
For others who land here in the future (including myself), add a -name option to find specific file types, for instance: find /var -name "*.php" -mtime -1 -ls
This command worked for me
find . -mtime -1 -print
Find the files only; -type f restricts the match to regular files:
find /directory_path -type f -mtime -1 -exec ls -lh {} \;

Delete directories older than X days

So I have looked at every single script on here regarding deleting directories older than 14 days. The script I wrote works with files, but for some reason it is not deleting the directories. Here is my script:
#!/bin/bash
find /TBD/* -mtime +1 | xargs rm -rf
So this code successfully deleted the FILES inside TBD, but it left two directories. I checked the timestamp on them and they are at least 2 days past last modification, specifically Dec 16 16:10, so I can't figure this out. The crontab entry I have running this executes every minute and logs, and the log only shows:
+ /scripts/deletebackups.sh: :2:BASH_XTRACEFD=3xargs rm -rf
+ /scripts/deletebackups.sh: :2: BASH_XTRACEFD=3find /TBD/contents TBD/contents -mtime +1
I used "contents" since the contents are actually people's names on our pxe server. I checked every file and folder INSIDE these two directories and their timestamps are the same as the parent directory, as they should be, but it's still not deleting.
Could it be a permissions thing? I wrote the script using sudo nano deletebackups.sh
When I type ls under TBD in the far left it shows
drwxr-xr-x 3 hscadministrator root 4096 DEC 16 16:10 for each of the two directories that won't delete.
I'm not overly familiar with what all those letters mean.
Other iterations of this code I have already attempted are
find /TBD/* -mtime +1 rm -r {} \;
To delete directories in /TBD older than 1 day:
find /TBD -mtime +1 -type d | xargs rm -f -r
Add -exec and -f to your find:
find /TBD/* -mtime +1 -exec rm -rf {} \;
Note, if you're looking to delete files older than 14 days, you need to change mtime:
-mtime +14
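Putting the pieces together for the 14-day case, one sketch (using the question's /TBD path) that keeps find at the top level, so rm -rf deletes each tree whole and find never tries to descend into a directory it just removed:

```shell
# Remove first-level directories under /TBD last modified more than
# 14 days ago. -mindepth 1 excludes /TBD itself; -maxdepth 1 stops
# find from walking into subtrees that rm -rf already handles.
find /TBD -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +
```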

Find modified files and echo them into a txt file on server root

I need a shell command to show the last modified and new files on the whole server (recursively), echoing them into a txt file in the root.
Has anybody something like this?
I tried
find / - mmtime 30 -printf "%AD %Ar - %p\n" 2> /dev/null | sort -r > /lastmodified.txt
to post the names of all files modified in the last 30 days into a txt file in root, but it shows me only the files of the server itself and not the directories where my websites are uploaded to.
Thank you in advance. I am not an expert, and this is what I found so far. It is relatively urgent, as I need this to fix hacked files, which happened last week.
From http://content.hccfl.edu/pollock/Unix/FindCmd.htm:
find . -mtime 0 # find files modified within the past 24 hours
find . -mtime -1 # find files modified within the past 24 hours
find . -mtime 1 # find files modified between 24 and 48 hours ago
find . -mtime +1 # find files modified more than 48 hours ago
Make sure that you have only one 'm' and a minus sign in -mtime -30, as suggested in choroba's comment, to get the last 30 days; -mtime 30 would match only files from exactly 30 days ago.
You may want to use the -daystart option to get files from the last 30 days counted from midnight rather than from exactly 30*24 hours ago. Use %TD and %Tr instead of %AD and %Ar to get modification times (instead of access times).
The final command would then be:
find / -daystart -mtime -30 -printf "%TD %Tr - %p\n" 2> /dev/null | sort -r > /lastmodified.txt
Note that the sort will break in January, as 12 sorts before 01. If you want the dates to always be in order, use for example the time format %T+ (2012-11-29+21:07:41.0000000000) or %Tu/%Tm/%Td %TH:%TM (12/11/29 21:07).
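For example, assuming GNU find, the same report with the sort-stable %T+ format:

```shell
# ISO-style %T+ timestamps (yyyy-mm-dd+hh:mm:ss...) sort correctly
# across a year boundary, so December no longer outranks January.
find / -daystart -mtime -30 -printf '%T+ %p\n' 2>/dev/null | sort -r > /lastmodified.txt
```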
What about inotify-tools
https://github.com/rvoicilas/inotify-tools/wiki#wiki-getting
http://linux.die.net/man/1/inotifywait
inotifywait example 2
#!/bin/sh
EVENT=$(inotifywait --format '%e' ~/file1)
[ $? != 0 ] && exit
[ "$EVENT" = "MODIFY" ] && echo 'file modified!'
[ "$EVENT" = "DELETE_SELF" ] && echo 'file deleted!'
# etc...

Linux command to check new files in file system

We have a linux machine and we would like to check what new files have been added within a certain date range.
I only have SSH access to this box and it's openSUSE 11.1.
Is there some sort of command that can give me a list of files added to the filesystem between, say, 04/05/2011 and 05/05/2011?
Thanks
Regards
Gabriel
There are a bunch of ways of doing that.
First one:
start_date=201105040000
end_date=201105042359
touch -t ${start_date} start
touch -t ${end_date} end
find /you/path -type f -name '*you*pattern*' -newer start ! -newer end -exec ls -s {} \;
Second one:
find files whose status changed (ctime) between 20 and 21 days ago:
find -ctime +20 -ctime -21
find files whose status changed between 2500 and 2800 minutes ago:
find -cmin +2500 -cmin -2800
And read this topic too.
Well, you could use find to get a list of all the files that were last-modified in a certain time window, but that isn't quite what you want. I don't think you can tell just from a file's metadata when it came into existence.
Edit: To list the files along with their modification dates, you can pipe the output of find through xargs to run ls -l on all the files, which will show the modification time.
find /somepath -type f ... -print0 | xargs -0 -- ls -l
I misunderstood your question. Depending on which filesystem you are using, it may or may not store creation time.
My understanding is that ext2/3/4 do not store creation time, while modification, change (status change, which is slightly different), and access times are stored.
FAT32, on the other hand, does contain creation timestamps, IIRC.
If you are using an ext filesystem, you have two options it seems:
1. Settle for finding all of the files that were modified between two dates (which will include created files, but also files that were just edited). You could do this using find.
2. Create a script/cronjob that will document the contents of your filesystem at some interval, e.g.
find / > filesystem.$(date +%s).log
and then run diffs to see what has been added. This, of course, would prevent you from looking backwards to time before you started making these logs.
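The snapshot-and-diff idea can be sketched like this; the list file names are placeholders, and comm does the comparison since both listings are sorted:

```shell
# Take a sorted listing now (-xdev stays on one filesystem)...
find / -xdev -type f 2>/dev/null | sort > snapshot_old.list
# ...and again later:
find / -xdev -type f 2>/dev/null | sort > snapshot_new.list
# comm -13 prints lines that appear only in the second (newer)
# listing, i.e. files added between the two snapshots:
comm -13 snapshot_old.list snapshot_new.list
```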
You can try one of these:
find -newerct "1 Aug 2013" ! -newerct "1 Sep 2013" -ls
find . -newermt "Jan 1, 2013 23:59:59" ! -newermt "Jan 2, 2016 23:59:59"
find /media/WD/backup/osool/olddata/ -newermt 20120101T1200 -not -newermt 20130101T1400
find . -mtime +1 -mtime -3
find . -mtime +1 -mtime -3 > files_from_yesterday.txt 2>&1
find . -mtime +1 -mtime -3 -ls > files_from_yesterday.txt 2>&1
touch -t 200506011200 first
touch -t 200507121200 last
find / -newer first ! -newer last
#!/bin/bash
for i in $(find Your_Mail_Dir/ -newermt "2011-01-01" ! -newermt "2011-12-31"); do
    mv "$i" /moved_emails_dir/
done
Hope this helps.
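A variant of the same move, assuming GNU mv for the -t option, that side-steps the word-splitting a for loop over find output suffers on names containing whitespace (directory names are the question's placeholders):

```shell
# Move files modified during 2011 without looping over find's output.
find Your_Mail_Dir/ -type f -newermt "2011-01-01" ! -newermt "2011-12-31" \
    -exec mv -t /moved_emails_dir/ {} +
```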
