Is there any way to find files changed after some date across a whole project's source code? - linux

I am working in one BIG project's source code, and now I want to know which files were modified after some date.
Is there any command or other way to get that?
I have tried
# ls -R -l
but that shows every file with its last-modified date, and I want to filter this output by a given date.
So is there any way to do this on Linux? Is there any tool available for this?

# Set the timestamp on a reference file
touch --date "2011-12-31" /tmp/foo
# Find files newer than 2011/Dec/31, in /some/files
find /some/files -newer /tmp/foo
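With GNU find you can also skip the reference file and give the date directly to -newermt; a sketch of the equivalent one-liner:
# Find files newer than 2011-12-31, in /some/files
find /some/files -newermt 2011-12-31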

Use the find command with the -mtime test.
For example, to list files changed in the last 7 days:
find / -type f -mtime -7
For a finer-grained search you can use the -mmin test. See an example discussed on another SE site: Find All files older than x minutes
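For example (a sketch with an arbitrary 90-minute window; /path/to/project is a placeholder), to list files modified within the last 90 minutes:
find /path/to/project -type f -mmin -90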

You should use find with the -newerXY test, where X and Y select which timestamps are compared:
m – modification time of the file reference
t – reference is interpreted directly as a time
All files modified after 2022-12-01 (inclusive):
find . -type f -newermt 2022-12-01
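To bound the range on both ends, negate a second -newermt; for example (a sketch), files modified from 2022-12-01 up to the start of 2023-01-01:
find . -type f -newermt 2022-12-01 ! -newermt 2023-01-01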

Finding and following symbolic links but without deleting them

The current find command is used to find and delete outdated files and directories; what counts as expired is driven by a given properties file, which specifies the destination and the retention period.
If the properties file says…
"/home/some/working/directory;.;180"
…then we want files and empty subdirectories deleted after 180 days.
The original command is…
"find ${var[0]} -mtime +${var[2]} -delete &"
…but I now need to modify it, because we've discovered it has been deleting symbolic links that existed in the specified sub-directories once they passed the expiration date given in the properties file. The path and expiration-time variables are designated in the properties file (as previously demonstrated).
I have been testing using…
"find -L"
…to follow the symbolic links to make sure this clean up command reaches the destinations, as desired.
I have also been testing using…
"\! -type l"
…to avoid deleting symbolic links, so the command I've been trying is…
"find -L ${var[0]} ! -type l -mtime +${var[2]} -delete &"
…but I haven't achieved the desired results. Please help; I am still new to Linux and my research hasn't led me to the desired answer. Thank you for your time.
Change
\! -type l
to
\! -xtype l
With -L in effect, find reports a symlink's type as the type of its target, so -type l no longer matches the links themselves; -xtype l still tests the link itself. The command becomes:
find -L ${var[0]} \! -xtype l -mtime +${var[2]} -delete &
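To see the difference between the two tests under -L, here is a quick check with a hypothetical /tmp/demo directory (GNU find assumed):
mkdir -p /tmp/demo && touch /tmp/demo/file && ln -s /tmp/demo/file /tmp/demo/link
# With -L, the link reports its target's type, so ! -type l still lists the link:
find -L /tmp/demo ! -type l
# ! -xtype l tests the link itself, so the link is excluded:
find -L /tmp/demo ! -xtype l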

Find all .py files across directory and subdirectories, fetch the last modified date of all and return the max date in a specific format

I have been struggling to do this. I have multiple .py files located in my directory and subdirectories. I want to find and list all of them with their date and time and return the date and time of the last updated .py file in a variable in the format "+%m-%d-%Y %H:%M:%S".
How can I achieve this?
As of now, I've only got as far as showing the stats of the most recently modified file in the terminal. Below is the command I'm using:
find . -path ./ABC -prune -false -o -name '*.py' -exec ls -lat {} + | head -n1
Here I have specifically excluded the ./ABC folder from the search.
The output of the above command is:
-rw-rw----+ 1 owner server 8263 Jul 8 09:09 ./apps/Test.py
I'm looking for a way where I can return the date and time of the last updated file (./apps/Test.py in this case) in a variable with the format like 07-08-2021 09:09:30.
Just find ... -printf. See man find.
find . -printf "%Tm-%Td-%TY %TH:%TM:%TS\t%p\n"
Note that find prints the seconds with a fractional part; you can strip that with sed.
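To get only the newest .py file's timestamp into a variable in the requested format, one approach (a sketch, GNU find assumed; ./ABC excluded as in the question, and the variable name latest is just an example) is to sort on the epoch timestamp %T@ and keep the formatted part:
latest=$(find . -path ./ABC -prune -o -name '*.py' -printf '%T@ %Tm-%Td-%TY %TH:%TM:%TS\n' \
    | sort -n | tail -n 1 | cut -d' ' -f2- | sed 's/\.[0-9]*$//')
echo "$latest"    # e.g. 07-08-2021 09:09:30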

How to identify renamed files on linux?

I am using the 'find' command to identify modified files. But I've noticed that my method only identifies content-modified files and new files. It does not identify files where the only change was a rename. Is there a way to use 'find' to identify renamed files? If not, is there some other linux command that can be used for this?
Here is my current method for identifying changed files going back roughly one month (this method does NOT identify renamed files):
$ touch --date "2017-09-10T16:00:00" ~/Desktop/tmp
$ find ~/Home -newer ~/Desktop/tmp -type f > modified-files
You should replace the -newer test with \( -newer ~/Desktop/tmp -o -cnewer ~/Desktop/tmp \) in order to catch changes to file metadata as well: on Linux, a rename updates the file's inode change time (ctime), which -cnewer compares.
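With the reference file from the question, the full command would look something like this (a sketch):
find ~/Home \( -newer ~/Desktop/tmp -o -cnewer ~/Desktop/tmp \) -type f > modified-files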

Find files created 30 minutes before AND after a given file - UNIX

I am trying to figure out the command to display all files created 30 minutes (as an example) before and after another file was created. So far I have managed to find files newer than that file,
but I cannot work out how to look both before and after the given time.
A command I have used:
find -type f -newer file.txt -cmin -30
This works fine but only does half of what I am trying to do.
Also, I need to modify that to search for setuid files only, which I THINK I can do by adding the -perm -4000 in that command.
Any suggestions?
As far as I know there is no way to find file creation time.
You can go by modification time instead (this will get all files last modified between the 5th and the 8th):
find . -type f -newermt 2012-10-05 ! -newermt 2012-10-08
(for access time, replace newermt with newerat)
-newerXY compares a timestamp of the current file with that of a reference (see man find for more info).
According to man find (on my Debian) there are four possible letters (aside from t, which interprets the reference directly as a time):
a The access time of the file reference
B The birth time of the file reference
c The inode status change time of reference
m The modification time of the file reference
You can also try 'B' (birth time), but it does not work for me and gives an error; I don't know why it is included in the man page.
To compare against another file:
find / -newer file
find / ! -newer file
You can create two temporary files, one with a modification time 30 minutes before the target file's and another 30 minutes after:
touch -d "$(stat -c %y test.txt) - 30 minutes" temp/before_temp
touch -d "$(stat -c %y test.txt) + 30 minutes" temp/after_temp
find / -newer temp/before_temp ! -newer temp/after_temp
touch -d takes a date string, and GNU date syntax understands relative offsets such as "- 30 minutes", so the addition and subtraction are done for you.
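To restrict the result to setuid files, as the question suggests, -perm -4000 can be added (a sketch reusing the temp files above):
find / -type f -perm -4000 -newer temp/before_temp ! -newer temp/after_temp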

grep files based on time stamp

This should be pretty simple, but I am not figuring it out. I have a large code base, more than 4 GB, under Linux. A few header files and XML files are generated during the build (using GNU make). If it matters, the header files are generated from the XML files.
I want to search for a keyword in the header files that were last modified after a given time instant (my compile start time), and similarly for the XML files, but as separate grep queries.
If I run it over all possible header or XML files, it takes a lot of time; I only care about the auto-generated ones. Further, the search has to be recursive, since there are a lot of directories and sub-directories.
You could use the find command:
find . -mtime 0 -type f
prints a list of all files (-type f) in and below the current directory (.) that were modified in the last 24 hours (-mtime 0; 1 would be 48 h, 2 would be 72 h, and so on). Try
grep "pattern" $(find . -mtime 0 -type f)
To find 'pattern' in all files newer than some_file in the current directory and its sub-directories recursively:
find -newer some_file -type f -exec grep 'pattern' {} +
You could specify the timestamp directly in date -d format and use other find tests, e.g., -name or -mmin.
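For instance (a sketch, GNU find assumed, with a made-up compile start time), the timestamp can be handed to -newermt in the same format date -d accepts:
find . -type f -name '*.h' -newermt '2013-06-01 10:30:00' -exec grep 'pattern' {} +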
The file list could also be generated by your build system if find is too slow.
More specific tools, such as ack, etags, or GCCSense, might be used instead of grep.
Use this instead, because if find doesn't return any files, the grep in the command above will keep waiting for input on stdin, halting the script:
find . -mtime 0 -type f | xargs grep "pattern"
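If the generated file names might contain spaces, a null-delimited variant is safer (a sketch, GNU find/xargs assumed):
find . -mtime 0 -type f -print0 | xargs -0 grep "pattern"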
