Find files according to access time using "grep" and "find" commands - Linux

My goal is to find all text files with the extension .log whose last access was more than 24 hours ago and which contain the required text.
Here is what I have already tried:
find / *.log -mtime +1 -print | grep "next" *.log
but this doesn't work.
Question is: how can I reach the goal I have described above? Maybe there is some way to modify my find expression?

The problem with your command is that you are running grep on the output of the find command, which means you are running it on the file names, not the content (actually, since you have *.log at the end, you run it on all *.log files, completely ignoring what your find command found). Also, you need -name in order to filter only the .log files.
You can use the -exec flag of find to execute a command on each of the files that matches your find criteria:
find / -name "*.log" -mtime +1 -exec grep 'next' {} \;
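Since the stated goal was the last access time rather than the modification time, a minimal variant using -atime (a sketch; adjust the path and search string to your case):
# list .log files last accessed more than one full 24-hour period ago
# and print only the names of the ones that contain "next"
find / -name "*.log" -atime +1 -exec grep -l "next" {} +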

Try with xargs:
find / -name "*.log" -mtime +1 | xargs grep "next"
But also, note what the find manual says about the argument to -atime, which also applies to -mtime. That is, your -mtime as specified probably doesn't cover the time period you want:
When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
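If you really mean "more than 24 hours ago" rather than "at least two full days ago", GNU find (the usual find on Linux) has a minute-granularity test as well; a sketch, assuming GNU find:
# 1440 minutes = 24 hours; -amin +1440 matches files last accessed more than 24 hours ago
find / -name "*.log" -amin +1440 -exec grep -l "next" {} +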

Related

What does each part of this cron command do?

find /home/root/public_html/_sess -type f -mtime +3 -name 'sess-*' -execdir rm -- {} \;
I feel like I understand find, but I'm not 100% sure about the rest. I think -type is the file type, with f meaning regular files. -mtime looks like a time setting of some sort, and +3 maybe means that time plus 3? And -execdir rm -- seems to remove the files matched by -name 'sess-*'. But I'm not 100% sure of all the elements within the command and wanted to get clarification.
You can do man find to get information on how Linux find works and all the options you can pass to it.
In this case, the command is using the Linux find utility to search for files in the /home/root/public_html/_sess directory with the following options:
-type f - searches for files of type f, i.e. regular files (not directories, links, etc.)
-mtime +3 - searches for files modified more than 3 days ago (the + is for more than, -3 would be less than 3 days old)
-name 'sess-*' - searches for files whose name matches the glob pattern sess-* (name starts with "sess-")
-execdir <command> {} \; - executes <command> on each file that find finds, from the directory the file was found in; in this case <command> is rm -- to remove the file
So in summary, this job searches for files located in a certain directory, whose names start with a specific string, and which are more than 3 days old, and deletes them.
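If you want to see what the job would remove before trusting it, the same filters can be run with a plain -print instead of the delete action; a sketch:
# preview only: list the files the cron job would delete
find /home/root/public_html/_sess -type f -mtime +3 -name 'sess-*' -print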

Shell script to find recently modified files [duplicate]

E.g., a MySQL server is running on my Ubuntu machine. Some data has been changed during the last 24 hours.
What (Linux) scripts can find the files that have been changed during the last 24 hours?
Please list the file names, file sizes, and modified time.
To find all files modified in the last 24 hours (last full day) in a particular directory and its sub-directories:
find /directory_path -mtime -1 -ls
Should be to your liking
The - before 1 is important - it means anything changed one day or less ago.
A + before 1 would instead mean anything changed more than one day ago, while having nothing before the 1 would have meant it was changed exactly one day ago, no more, no less.
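To see the three forms side by side, here is a sketch against an example directory (/var/log is just a placeholder):
find /var/log -mtime -1 -ls   # modified less than 24 hours ago
find /var/log -mtime 1 -ls    # modified between 24 and 48 hours ago
find /var/log -mtime +1 -ls   # modified more than one full day ago (48 hours or more, fractions are discarded)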
Another, more human-friendly way is to use the -newermt option, which understands human-readable time units.
Unlike the -mtime option, which requires the user to read the find documentation to figure out what time units -mtime expects and then convert their own time units into those, which is error-prone and plain user-unfriendly. -mtime was barely acceptable in the 1980s, but in the 21st century -mtime has the convenience and safety of stone-age tools.
Example uses of -newermt option with the same duration expressed in different human-friendly units:
find /<directory> -newermt "-24 hours" -ls
find /<directory> -newermt "1 day ago" -ls
find /<directory> -newermt "yesterday" -ls
You can do that with
find . -mtime 0
From man find:
[The] time since each file was last modified is divided by 24 hours and any remainder is discarded. That means that to match -mtime 0, a file will have to have a modification in the past which is less than 24 hours ago.
On GNU-compatible systems (i.e. Linux):
find . -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -r | more
This will list files and directories that have been modified in the last 24 hours (-mtime 0). It will list them with the last modified time in a format that is both sortable and human-readable (%T+), followed by the file size (%s), followed by the full filename (%p), each separated by tabs (\t).
2>/dev/null throws away any stderr output, so that error messages don't muddy the waters; sort -r sorts the results by most recently modified first; and | more lists one page of results at a time.
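If the listing is long, a variation restricted to regular files and trimmed to the 20 most recently modified entries may be handier (the head -20 cut-off is arbitrary):
find . -type f -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -r | head -20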
For others who land here in the future (including myself), add a -name option to find specific file types, for instance: find /var -name "*.php" -mtime -1 -ls
This command worked for me
find . -mtime -1 -print
To find the files, you can restrict the match to regular files with -type f:
find /directory_path -type f -mtime -1 -exec ls -lh {} \;

How to delete files and directories older than n days in linux

I have a directory named repository which has a number of files and subdirectories. I want to find the files and directories which have not been modified in the last 14 days so that I can delete them.
I have written this script, but it gives only the directory names:
#!/bin/sh
M2_REPO=/var/lib/jenkins/.m2/repository
echo $M2_REPO
OLDFILES=/var/lib/jenkins/.m2/repository/deleted_artifacts.txt
AGE=14
find "${M2_REPO}" -name '*' -atime +${AGE} -exec dirname {} \; >> ${OLDFILES}
find /path/to/files* -mtime +5 -exec rm {} \;
Note that there are spaces between rm, {}, and \;
Explanation
The first argument is the path to the files. This can be a path, a directory, or a wildcard as in the example above. I would recommend using the full path, and running the command without the -exec rm part first to make sure you are getting the right results.
The second argument, -mtime, is used to specify the number of days old that the file is. If you enter +5, it will find files older than 5 days.
The third argument, -exec, allows you to pass in a command such as rm. The {} \; at the end is required to end the command.
This should work on Ubuntu, Suse, Redhat, or pretty much any version of linux.
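Putting that advice into practice, a two-pass sketch with the same placeholder path and age as above:
# 1. dry run: just list what matches
find /path/to/files* -mtime +5 -print
# 2. same filters, now actually removing
find /path/to/files* -mtime +5 -exec rm {} \;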
You can give find the -delete action to remove the files. Just be careful to put it at the end of the command so that the time filter is applied first.
You can first just list the files that the command finds:
find "${M2_REPO}" -depth -mtime +${AGE} -print
The -depth option makes find do the search depth-first, and it is implied by the -delete action.
If you like the results, change the print to delete:
find "${M2_REPO}" -mtime +${AGE} -delete
I know this is a very old question, but FWIW I solved the problem in two steps: first find and delete files older than N days, then find and delete the empty directories. I tried doing both in one step, but the delete operation updates the modification time of the file's parent directory, and then the (now empty) directory no longer matches the -mtime criteria! Here's the solution with shell variables:
age=14
dir="/tmp/dirty"
find "$dir" -mtime "+$age" -delete && find "$dir" -type d -empty -delete

find files which have been modified in the last 30 minutes in Linux

How do I find files based upon time information, such as creation, modification and access time? It is useful to find files before a certain time, after a certain time, and between two times. What command in Linux would I have to use?
I understand that to find setuid files on Linux computers I would have to use:
find / -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
How do I check for files which have been modified in the last 30 minutes? (I created a new file called FILE2.)
Just add -mmin -30 (-mtime counts days; -mmin counts minutes). See man find.
The answer to your question is:
find . -mmin -30 -exec ls -l {} \;
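For the "before a certain time, after a certain time and between two times" part of the question, GNU find's -newermt test takes a timestamp directly; a sketch with made-up dates:
find . -type f -newermt "2024-01-01" -ls                          # modified after that date
find . -type f ! -newermt "2024-01-01" -ls                        # modified on or before that date
find . -type f -newermt "2024-01-01" ! -newermt "2024-02-01" -ls  # modified between the two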

grep files based on time stamp

This should be pretty simple, but I am not figuring it out. I have a large code base, more than 4 GB, under Linux. A few header files and XML files are generated during the build (using GNU make). If it matters, the header files are generated from the XML files.
I want to search for a keyword in the header files that were last modified after a certain time instant (my compile start time), and similarly for the XML files, but as separate grep queries.
Running it over all possible header or XML files takes a lot of time; I only want the auto-generated ones. Furthermore, the search has to be recursive, since there are a lot of directories and sub-directories.
You could use the find command:
find . -mtime 0 -type f
prints a list of all files (-type f) in and below the current directory (.) that were modified in the last 24 hours (-mtime 0; -mtime 1 would match 24 to 48 hours ago, -mtime 2 48 to 72 hours, and so on). Try
grep "pattern" $(find . -mtime 0 -type f)
To find 'pattern' in all files newer than some_file in the current directory and its sub-directories recursively:
find -newer some_file -type f -exec grep 'pattern' {} +
You could specify the timestamp directly in date -d format and use other find tests e.g., -name, -mmin.
The file list could also be generate by your build system if find is too slow.
More specific tools such as ack, etags, GCCSense might be used instead of grep.
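One way to tie the search to the start of a compile, as the question describes, is to create a marker file just before the build and use -newer afterwards; a sketch in which the marker path, file patterns, and KEYWORD are placeholders:
touch /tmp/build_start   # run this just before starting the build
# ... run the build ...
find . -name '*.h' -newer /tmp/build_start -exec grep -l 'KEYWORD' {} +
find . -name '*.xml' -newer /tmp/build_start -exec grep -l 'KEYWORD' {} +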
Use this instead, because if find doesn't return a file, then the grep "pattern" $(find ...) form leaves grep waiting for input, halting the script.
find . -mtime 0 -type f | xargs grep "pattern"
