For example, a MySQL server is running on my Ubuntu machine, and some data was changed during the last 24 hours.
What Linux command or script can find the files that have been changed during the last 24 hours?
Please list the file names, file sizes, and modification times.
To find all files modified in the last 24 hours (last full day) in a particular directory and its sub-directories:
find /directory_path -mtime -1 -ls
Should be to your liking
The - before the 1 is important: it means anything changed one day or less ago.
A + before the 1 would instead mean anything changed at least one day ago, while having nothing before the 1 would mean it was changed exactly one day ago, no more, no less.
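The three forms are easy to verify with a quick experiment; the scratch directory and file names below are invented for the demo, and `touch -d` is GNU coreutils:

```shell
# Scratch demo: files with known ages, then the three -mtime forms.
d=$(mktemp -d) && cd "$d"
touch -d "2 hours ago"  recent      # less than 1 full day old
touch -d "30 hours ago" yesterday   # between 1 and 2 full days old
touch -d "3 days ago"   old         # more than 2 full days old
find . -type f -mtime -1   # matches ./recent
find . -type f -mtime 1    # matches ./yesterday
find . -type f -mtime +1   # matches ./old
```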
Another, more human-friendly way is to use the -newermt option, which understands human-readable time units.
Unlike the -mtime option, which requires the user to read the find documentation to figure out what time units -mtime expects and then convert their duration into those units, which is error-prone and plain user-unfriendly. -mtime was barely acceptable in the 1980s, but in the 21st century -mtime has the convenience and safety of stone-age tools.
Example uses of -newermt option with the same duration expressed in different human-friendly units:
find /<directory> -newermt "-24 hours" -ls
find /<directory> -newermt "1 day ago" -ls
find /<directory> -newermt "yesterday" -ls
You can do that with
find . -mtime 0
From man find:
[The] time since each file was last modified is divided by 24 hours and any remainder is discarded. That means that to
match -mtime 0, a file will have to have a modification in the past which is less than 24 hours ago.
On GNU-compatible systems (i.e. Linux):
find . -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -r | more
This will list files and directories that have been modified in the last 24 hours (-mtime 0). It will list them with the last modified time in a format that is both sortable and human-readable (%T+), followed by the file size (%s), followed by the full filename (%p), each separated by tabs (\t).
2>/dev/null throws away any stderr output, so that error messages don't muddy the waters; sort -r sorts the results by most recently modified first; and | more lists one page of results at a time.
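If you would rather see the biggest recently modified files first, the same pipeline can sort on the size column instead. A minimal sketch with invented scratch files:

```shell
# Sort the -mtime 0 results by size (field 2, numeric, descending)
# instead of by timestamp.
d=$(mktemp -d) && cd "$d"
head -c 100 /dev/zero > big.bin
head -c 10  /dev/zero > small.bin
find . -type f -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -k2,2nr | head
```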
For others who land here in the future (including myself), add a -name option to find specific file types, for instance: find /var -name "*.php" -mtime -1 -ls
This command worked for me
find . -mtime -1 -print
Find the files...
You can restrict the results to regular files with -type f:
find /directory_path -type f -mtime -1 -exec ls -lh {} \;
I have written a script to create an array of files that are modified more than 1 day back, iterate through the array and delete them interactively. Somehow the script is not working.
Below is the code:
#!/bin/bash
#take input from user for directory
read -p "Please enter the directory path from which you want to remove unused files" path
#create an array of files modified 2 days back
readarray -t files < <(find "$path" -maxdepth 0 -type f -mtime +1)
#remove the files iteratively and interactively
for file in "${files[@]}"; do
rm -i "$file"
done
This script isn't deleting anything. I have some files created on 12th Jan and untouched after that, but they're still there.
Can you please mention if something is missing here?
-mtime only considers full days, meaning +1 means "at least 2 days ago". From man find:
-atime n
File was last accessed less than, more than or exactly n*24 hours
ago. When find figures out how many 24-hour periods ago the file was
last accessed, any fractional part is ignored, so to match -atime +1,
a file has to have been accessed at least two days ago.
-mtime n
File's data was last modified less than, more than or exactly n*24
hours ago. See the comments for -atime to understand how rounding
affects the interpretation of file modification times.
You might also want to consider find's -exec instead of storing the names in an array. This will avoid all sorts of problems when your file names contain special characters, such as blanks, newlines, or globbing wildcards:
find "$path" -maxdepth 1 -type f -mmin +1440 -exec rm {} +
(-mmin counts minutes, and a day has 1440 minutes (= 60*24); note also that -maxdepth 0 in your script matches only the starting directory itself, never the files inside it, so -maxdepth 1 is needed)
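For completeness, here is a non-interactive sketch of the find-only approach against a throwaway directory; the directory and file names are invented for the demo, and `touch -d` is GNU coreutils:

```shell
# Delete regular files older than one day (1440 minutes) directly with find.
d=$(mktemp -d)
touch -d "2 days ago" "$d/stale.log"   # older than 1440 minutes
touch "$d/fresh.log"                   # brand new
find "$d" -maxdepth 1 -type f -mmin +1440 -exec rm -- {} +
ls "$d"    # only fresh.log remains
```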
I am trying to delete all files that are older than 7 days. The command is working but not correctly.
find '/files/tem_data/' -mtime +7 -exec rm -rf {} \+
It does delete files but it's not accurate.
ls -Artl | head -n 2
The find command does delete files, but when I run the ls command, files that should have been deleted are still listed. For example, today is November 7th, so find should delete all files dated before November 1st. It does not: it leaves files dated October 30 and 31. How can I delete files that are older than 7 days?
If I run the find command again, say, 3 minutes later, it deletes the October 30 files whose timestamps are 3 minutes after the first run.
From man find:
-atime n
File was last accessed n*24 hours ago. When find figures out how many
24-hour periods ago the file was last accessed, any fractional part is
ignored, so to match -atime +1, a file has to have been accessed at least
two days ago.
This means that your command actually deletes files that were accessed 8 or more days ago.
Since the time now is
$ date
Tue Nov 7 10:29:29 PST 2017
find will therefore only delete files older than:
$ date -d 'now - 8 days'
Mon Oct 30 11:29:05 PDT 2017
In other words, leaving some files from Oct 30 is expected and documented behavior.
To account for find rounding down, simply use -mtime +6 instead.
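The rounding is easy to demonstrate in a scratch directory (the file name is invented; `touch -d` is GNU coreutils). A file 7.5 days old is floored to 7 full days, so `-mtime +7` skips it while `-mtime +6` finds it:

```shell
d=$(mktemp -d)
touch -d "180 hours ago" "$d/f"   # 7.5 days = 180 hours
find "$d" -type f -mtime +7       # prints nothing: floor(7.5) = 7, not > 7
find "$d" -type f -mtime +6       # prints $d/f: 7 > 6
```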
This is not an exact answer, but you can try this as a sample.
find /path/to/ -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' \;
Or for an alternative and also faster command is using exec's + terminator instead of \;:
find /path/to/ -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' +
or
find /path/to/ -type f -mtime +7 -name '*.gz' -delete
find: the Unix command for finding files, directories, links, etc.
/path/to/: the directory to start your search in.
-type f: only find files.
-name '*.gz': only list files whose names end with .gz.
-mtime +7: only consider files with a modification time older than 7 days.
-execdir ... \;: for each result found, run the following command from the directory containing the file.
rm -- '{}': remove the file; the {} part is where the find result gets substituted in, and -- marks the end of the options so that file names starting with a hyphen are not treated as options.
How can I search through a massive amount of data (28TB) to find the largest 10 files in the past 24 hours?
From the current answers below I've tried:
$ find . -type f -mtime -1 -printf "%p %s\n" | sort -k2nr | head -5
This command takes over 24 hours, which defeats the purpose of searching for the most recently modified files in the past 24 hours. Are there any known solutions that are drastically faster than the one above? Monitoring the system will not work either, as there is simply too much to monitor, and doing so could cause performance issues.
something like this?
$ find . -type f -mtime -1 -printf "%p %s\n" | sort -k2nr | head -5
This lists the top 5 files modified in the past 24 hours, sorted by size.
You can use the standard yet very powerful find command like this (start_directory is the directory where the scan starts):
find start_directory -type f -mtime -1 -size +3G
-mtime -1 option: files modified 1 day ago or less
-size +3G option: files larger than 3 GiB
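A small sanity check of combining the time and size filters, in a scratch directory with invented file names (the 4 MiB threshold is arbitrary for the demo):

```shell
d=$(mktemp -d)
head -c 5M /dev/zero > "$d/big"        # 5 MiB of zeros
head -c 1K /dev/zero > "$d/small"      # 1 KiB of zeros
find "$d" -type f -mtime -1 -size +4M  # matches only $d/big
```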
How do I find files based on time information, such as creation, modification, and access times? It is useful to find files before a certain time, after a certain time, and between two times. What command in Linux would I have to use?
I understand that to find setuid files on Linux computers I would have to use:
find / -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
How do I check for files which have been modified in the last 30 minutes? (I created a new file called FILE2.)
Just add -mmin -30 (-mtime counts in days; -mmin counts in minutes). See man find.
The answer to your question is:
find . -mmin -30 -exec ls -l {} \;
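-mmin tests the modification time specifically; here is a quick scratch-directory check of the 30-minute window (file names invented; `touch -d` is GNU coreutils):

```shell
d=$(mktemp -d)
touch -d "10 minutes ago" "$d/new"
touch -d "2 hours ago"    "$d/older"
find "$d" -type f -mmin -30   # matches only $d/new
```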
We have a Linux machine and we would like to check what new files have been added within a certain date range.
I only have SSH access to this box, and it's openSUSE 11.1.
Is there some sort of command that can give me a list of files that have been added to the filesystem between, say, 04/05/2011 and 05/05/2011?
There are a bunch of ways to do that.
First one:
start_date=201105040000
end_date=201105052359
touch -t ${start_date} start
touch -t ${end_date} end
find /you/path -type f -name '*you*pattern*' -newer start ! -newer end -exec ls -s {} \;
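The reference-file trick can be sanity-checked in a scratch directory; the dates and file names here are invented, and the two reference files are excluded from the results by name:

```shell
cd "$(mktemp -d)"
touch -t 201105040000 start
touch -t 201105052359 end
touch -t 201105041200 inside    # falls between the two reference times
touch -t 201104011200 outside   # before the window
find . -type f -newer start ! -newer end ! -name start ! -name end
# prints ./inside
```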
Second one:
find files modified between 20 and 21 days ago (remember that find rounds the age down to whole days, so the range is +19 to -21):
find -mtime +19 -mtime -21
find files modified between 2500 and 2800 minutes ago:
find -mmin +2500 -mmin -2800
And read this topic too.
Well, you could use find to get a list of all the files that were last-modified in a certain time window, but that isn't quite what you want. I don't think you can tell just from a file's metadata when it came into existence.
Edit: To list the files along with their modification dates, you can pipe the output of find through xargs to run ls -l on all the files, which will show the modification time.
find /somepath -type f ... -print0 | xargs -0 -- ls -l
I misunderstood your question. Depending on which filesystem you are using, it may or may not store creation time.
My understanding is that ext2/3/4 do not store creation time, but modification, change (status change, which is slightly different), and access times are stored.
Fat32 on the other hand does contain creation timestamps IIRC.
If you are using an ext filesystem, you have two options it seems:
1. Settle for finding all of the files that were modified between two dates (which will include created files, but also files that were merely edited). You could do this using find.
2. Create a script/cron job that documents the contents of your filesystem at some interval, e.g.
find / > filesystem.$(date +%s).log
and then run diffs to see what has been added. This, of course, would prevent you from looking backwards to time before you started making these logs.
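A minimal sketch of the snapshot-and-diff idea, using throwaway paths invented for the demo (`comm` requires sorted input, hence the `sort`):

```shell
d=$(mktemp -d)
touch "$d/a"
find "$d" -type f | sort > /tmp/fs.before.log
touch "$d/b"                                   # a "new" file appears
find "$d" -type f | sort > /tmp/fs.after.log
comm -13 /tmp/fs.before.log /tmp/fs.after.log  # lines only in the newer snapshot
```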
You can try one of these:
find -newerct "1 Aug 2013" ! -newerct "1 Sep 2013" -ls
find . -newermt "Jan 1, 2013 23:59:59" ! -newermt "Jan 2, 2016 23:59:59"
find /media/WD/backup/osool/olddata/ -newermt 20120101T1200 -not -newermt 20130101T1400
find . -mtime +1 -mtime -3
find . -mtime +1 -mtime -3 > files_from_yesterday.txt 2>&1
find . -mtime +1 -mtime -3 -ls > files_from_yesterday.txt 2>&1
touch -t 200506011200 first
touch -t 200507121200 last
find / -newer first ! -newer last
#!/bin/bash
for i in $(find Your_Mail_Dir/ -newermt "2011-01-01" ! -newermt "2011-12-31"); do
    mv "$i" /moved_emails_dir/
done
Hope this helps.
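Note that looping over a command substitution will mangle file names containing spaces; letting find do the move is safer. A self-contained sketch with invented paths (`mv -t` is GNU coreutils):

```shell
d=$(mktemp -d)
mkdir "$d/in" "$d/moved"
touch -d "2011-06-15" "$d/in/msg 1"   # inside the window, name has a space
touch -d "2014-01-01" "$d/in/msg2"    # outside the window
find "$d/in" -type f -newermt "2011-01-01" ! -newermt "2011-12-31" \
    -exec mv -t "$d/moved" {} +
```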