I use the following command to compute an MD5 checksum for every file on my hard drive and write the results to a new text file. Is there a way to modify it to cover only files added after a certain date (for example, August 28th 2021 and onwards)?
find -type f -exec md5sum "{}" + > checklist.txt
Try using xargs with find to apply md5sum to each file that find finds.
For example, this command will give you the md5sum for each file in the current directory modified since Aug 15, 2021:
find . -type f -newermt '2021-08-15' -print0 | xargs -0 md5sum
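Adapted to the date in your question, and writing to the text file from your original command (a minimal sketch, assuming GNU find):
find . -type f -newermt '2021-08-28' -print0 | xargs -0 md5sum > checklist.txt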
I have a /folder with over half a million files created in the last 10 years. I'm restructuring the process so that in the future there are subfolders based on the year.
For now, I need to back up all files modified within the last year. I tried
zip -r /backup.zip $(find /folder -type f -mtime -365)
but get the error: Argument list too long.
Is there any alternative to get the files compressed and archived?
Zip has an option to read the file list from stdin. Below is from the zip man page:
-@ file lists. If a file list is specified as -@ [Not on MacOS],
zip takes the list of input files from standard input instead of
from the command line. For example,
zip -@ foo
will store the files listed one per line on stdin in foo.zip.
This should do what you need:
find /folder -type f -mtime -365 | zip -@ /backup.zip
Note that I've removed the -r option because it isn't doing anything here - you are already selecting regular files explicitly with the find command (-type f).
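To sanity-check the result afterwards, you can list the archive contents (assuming unzip is available):
unzip -l /backup.zip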
You'll have to switch from passing all the files at once to piping them one at a time to the zip command:
find /folder -type f -mtime -365 | while IFS= read -r FILE; do zip /backup.zip "$FILE"; done
Quoting "$FILE" and using read -r protects filenames that contain spaces or backslashes.
You can also work with the -exec parameter in find, like this:
find /folder -type f -mtime -365 -exec zip -r /backup.zip '{}' \;
(or whatever your command is). For every file found, the given command is executed with the file passed as the last parameter.
Find the files, then execute the zip command on as many of them at a time as possible by terminating -exec with + instead of ;:
find /folder -type f -mtime -365 -exec zip -r /backup.zip '{}' +
What command should I use to sort all the files in a project directory by modification date?
I tried ls -t, but it does not find files in subdirectories. I used find to locate all the files of the specified type, but could not sort the results.
You can use this to find all the files of the type you are looking for, sorted by last-modified date:
find "$directory" -type f -name "*.type" -print0 | xargs -0 stat -c "%y %n" | sort
This looks for all files matching "*.type" and sorts them by their last-modified timestamp.
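If you have GNU find, -printf can emit the timestamp directly, avoiding a separate stat invocation (a sketch using the same "$directory" and "*.type" placeholders; %T@ is seconds since the epoch):
find "$directory" -type f -name "*.type" -printf '%T@ %p\n' | sort -n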
This should be pretty simple, but I am not figuring it out. I have a large code base, more than 4 GB, under Linux. A few header files and XML files are generated during the build (using GNU make). If it matters, the header files are generated from the XML files.
I want to search for a keyword in the header files that were last modified after a given time (my compile start time), and similarly in the XML files, as separate grep queries.
If I run grep over all possible header or XML files it takes a lot of time; I only want to search the ones that were auto-generated. The search also has to be recursive, since there are a lot of directories and sub-directories.
You could use the find command:
find . -mtime 0 -type f
prints a list of all files (-type f) in and below the current directory (.) that were modified in the last 24 hours (-mtime 0; for the last 48h use -mtime -2, for the last 72h -mtime -3, ...). Try
grep "pattern" $(find . -mtime 0 -type f)
To find 'pattern' in all files newer than some_file in the current directory and its sub-directories recursively:
find -newer some_file -type f -exec grep 'pattern' {} +
You could specify the timestamp directly in date -d format with -newermt, and combine it with other find tests, e.g., -name or -mmin.
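For instance, to search only generated headers newer than a given wall-clock time (a sketch; the '*.h' pattern and the timestamp are illustrative placeholders):
find . -type f -name '*.h' -newermt '2013-06-01 09:00' -exec grep 'pattern' {} +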
The file list could also be generated by your build system if find is too slow.
More specific tools such as ack, etags, GCCSense might be used instead of grep.
Use this instead, because if find doesn't return any files, the grep in the command substitution above will sit waiting for input, halting the script:
find . -mtime 0 -type f | xargs grep "pattern"
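If filenames may contain spaces or other special characters, the null-delimited variant is more robust (assuming GNU find and xargs):
find . -mtime 0 -type f -print0 | xargs -0 grep "pattern"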
We have a Linux machine on which we would like to check what new files have been added within a certain date range.
I only have SSH access to this box, and it's openSUSE 11.1.
Is there some sort of command that can give me a list of files that have been added to the filesystem between, say, 04/05/2011 and 05/05/2011?
There are a bunch of ways of doing that.
First one:
start_date=201105040000
end_date=201105042359
touch -t ${start_date} start
touch -t ${end_date} end
find /your/path -type f -name '*your*pattern*' -newer start ! -newer end -exec ls -s {} \;
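With GNU find you can skip the temporary marker files and give the dates directly (a sketch covering the same May 4th window as the touch commands above):
find /your/path -type f -newermt '2011-05-04' ! -newermt '2011-05-05' -ls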
Second one:
find files whose status changed between 20 and 21 days ago (-ctime tracks inode status-change time; use -mtime for content modification; because find truncates ages to whole 24-hour periods, -ctime 20 matches exactly this window, while -ctime +20 -ctime -21 would match nothing):
find -ctime 20
find files whose status changed between 2500 and 2800 minutes ago:
find -cmin +2500 -cmin -2800
Well, you could use find to get a list of all the files that were last-modified in a certain time window, but that isn't quite what you want. I don't think you can tell just from a file's metadata when it came into existence.
Edit: To list the files along with their modification dates, you can pipe the output of find through xargs to run ls -l on all the files, which will show the modification time.
find /somepath -type f ... -print0 | xargs -0 -- ls -l
I misunderstood your question. Depending on what filesystem you are using, it may or may not store creation time.
My understanding is that ext2/3/4 do not store creation time; modified, changed (status change, which is slightly different), and access times are stored.
FAT32, on the other hand, does contain creation timestamps, IIRC.
If you are using an ext filesystem, you have two options it seems:
1. Settle for finding all of the files that were modified between two dates (which will include created files, but also files that were just edited). You could do this using find.
2. Create a script/cronjob that will document the contents of your filesystem at some interval, e.g.
find / > filesystem.$(date +%s).log
and then run diffs to see what has been added. This, of course, would prevent you from looking backwards to time before you started making these logs.
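To see what was added between two snapshots, you could compare the sorted logs; comm -13 prints the lines that appear only in the second file (a sketch; the epoch-stamped filenames follow the naming scheme above and are illustrative):
comm -13 <(sort filesystem.1304460000.log) <(sort filesystem.1304546400.log)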
You can try one of these:
find -newerct "1 Aug 2013" ! -newerct "1 Sep 2013" -ls
find . -newermt "Jan 1, 2013 23:59:59" ! -newermt "Jan 2, 2016 23:59:59"
(-mtime expects an age in days, not an epoch timestamp, so a date range needs -newermt)
find /media/WD/backup/osool/olddata/ -newermt 20120101T1200 -not -newermt 20130101T1400
find . -mtime +1 -mtime -3
find . -mtime +1 -mtime -3 > files_from_yesterday.txt 2>&1
find . -mtime +1 -mtime -3 -ls > files_from_yesterday.txt 2>&1
touch -t 200506011200 first
touch -t 200507121200 last
find / -newer first ! -newer last
#!/bin/bash
find Your_Mail_Dir/ -newermt "2011-01-01" ! -newermt "2011-12-31" | while IFS= read -r i; do
    mv "$i" /moved_emails_dir/
done
I have a folder with a bunch of files. I need to delete all the files created before July 1st. How do I do that in a bash script?
I think the following should do what you want:
touch -t 201007010000 dummyfile
find /path/to/files -type f ! -newer dummyfile -delete
The first line creates a file whose last-modified time is 1st July 2010. The second line finds all files in /path/to/files which have a modification date no newer than the dummy file, and then deletes them.
If you want to double check it is working correctly, then drop the -delete argument and it should just list the files which would be deleted.
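That is, run this first to preview what would be removed (the same find command, minus -delete):
find /path/to/files -type f ! -newer dummyfile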
This should work:
find /file/path ! -newermt "Jul 01"
This finds the files you want to delete; the command to actually delete them would be:
find /file/path ! -newermt "Jul 01" -type f -print0 | xargs -0 rm