Linux/Unix Command Needed for finding files on a particular date [closed] - linux

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 9 years ago.
I need help finding files in a directory which contain a word/string and are from a particular date.
Currently I am using this command:
find . -exec grep -l 'string' {} \;
This command returns all the files in that directory containing that string. I would like to restrict those files to a particular date, for example 12/24/2013.

You can use:
find . -type f -exec grep -q 'string' {} \; -exec ls -l {} \; | grep 'Dec 24'
This searches for files which contain the string string (grep -q suppresses the matching lines themselves), then executes ls -l on only those files, and finally greps out the ones whose listing matches Dec 24.
This works because find applies its tests and actions in order, so each -exec is only reached by files that matched the previous ones.

Maybe this could help you, combined with grep:
find /path/to/find -type d -atime -7
The last parameter is a number of days: -7 here means within the last 7 days, which you can adjust for your particular date. -atime is the file access time, and -type d restricts the search to directories; to find regular files, replace the d with f. Give the path where to search, and finally pipe the result into a grep for the string you want.
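As a sketch of combining both requirements in one command (assuming GNU find, which supports -newermt; the file names, dates, and search string are placeholders):

```shell
# Scratch directory with example files; names and dates are hypothetical.
cd "$(mktemp -d)"
echo 'some string here' > hit.txt
touch -d '2013-12-24 12:00' hit.txt   # modified on the target date
echo 'some string here' > miss.txt    # modified today, should be skipped

# Files modified on 2013-12-24 that contain the string:
find . -type f -newermt 2013-12-24 ! -newermt 2013-12-25 \
    -exec grep -l 'string' {} +
```

The pair of -newermt tests brackets one calendar day, so you compare dates directly instead of grepping the output of ls.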


Rich globbing `ls [G-S]*` in fish shell? [closed]

Closed 5 years ago.
In Bash it is possible to
ls [G-S]*
and it would list all files starting with g-s or G-S.
How is that done in Fish shell?
Fish currently does not support a rich glob syntax. The current thinking is that a glob command should be added, in keeping with the fish goal of doing things via commands rather than magic syntax. See, for example, https://github.com/fish-shell/fish-shell/issues/3681.

The solution is to create a function that filters the results. For example, the ** glob matches all files and directories in and below the CWD. I frequently want just the plain files and want to ignore the .git subdir. So I wrote this function:
function ff --description 'Like ** but only returns plain files.'
    # This also ignores .git directories.
    find . \( -name .git -type d -prune \) -o -type f | sed -n -e '/\/\.git$/n' -e 's/^\.\///p'
end
Which I can then use like this: grep something (ff). You could create a similar function that uses the find -name pattern matching feature or filter the results with string match --regex.
You can use find -iregex "./[G-S].*". Fish is quite limited in this regard.
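A small sketch of that find fallback (assuming GNU find; the file names are made up). Because -iregex matches the whole path case-insensitively, ./[g-s].* behaves like the bash glob [G-S]* for both cases:

```shell
cd "$(mktemp -d)"                  # scratch directory with hypothetical names
touch apple Grape mango Seed zebra
# Matches names whose first letter falls in g-s or G-S:
find . -maxdepth 1 -type f -iregex './[g-s].*'
```

This lists Grape, Seed and mango, but not apple or zebra.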

No such file or directory find command on linux [closed]

Closed 6 years ago.
I've created a script which deletes old backup files from a directory. The command worked fine until a week ago, and nothing has changed in the script or packages, but I am still getting the error below:
root#:# find /var/backups/abc/* -type d -mtime +6
/var/backups/abc/2016-03-09_0321
root#:~# find /var/backups/abc/* -type d -mtime +6 -exec rm -rf {} \;
find: `/var/backups/abc/2016-03-08_0321': No such file or directory
The problem is that this script runs every day from cron, and I get a mail like "find: `/var/backups/abc/2016-03-08_0321': No such file or directory". The files are deleted, but such mails keep arriving from root.
find /var/backups/abc/* -type d -mtime +6 -prune -exec rm -rf {} \;
Here, we use -prune on the directories that we're about to delete, so find will then not try to read their contents.
This happens because, after having returned your directory, find tries to descend into it (to continue its recursive search), and fails because you have just removed it.
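A minimal way to see the fix, in a scratch directory (the paths and ages are made up; assumes GNU touch -d to backdate the directory):

```shell
d=$(mktemp -d)                                   # stand-in for /var/backups
mkdir -p "$d/abc/2016-03-08_0321/sub"
touch -d '30 days ago' "$d/abc/2016-03-08_0321"  # make the directory look old

# With -prune, find does not descend into the directory it is deleting,
# so no "No such file or directory" error is printed:
find "$d"/abc/* -type d -mtime +6 -prune -exec rm -rf {} \;
```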

Find empty files, if found update files with [closed]

Closed 7 years ago.
Can someone with more Linux knowledge answer this for me?
On our web server, we host and run a lot of web scripts.
We control these via datestamp files, so that a script is not run again, or run more than once.
A lot of the files are 0 KB. I wanted to know if there is a quick way in Linux to locate these files and update them.
I have located the files using:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty
I have a long list of files. Can I update these with a simple datestamp format:
i.e.
20150923114046
You can use the -exec option of find:
find /var/www/vhosts/DOMAINNAME.co.uk/httpdocs -name "datestamp.*" -type f -empty \
-exec bash -c 'echo 20150923114046 > "$1"' _ {} \;
(Passing the filename as "$1" instead of substituting {} into the bash script avoids breakage on filenames containing special characters.)
To get the timestamp dynamically, use date:
bash -c 'echo "$(date +%Y%m%d%H%M%S)" > "$1"' _ {}
To use the file's last modified timestamp, add the -r option of date:
bash -c 'echo "$(date +%Y%m%d%H%M%S -r "$1")" > "$1"' _ {}
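A runnable sketch of the whole flow in a scratch directory (file names are placeholders); the filename is passed to the inner shell as "$1" rather than pasted into the script:

```shell
cd "$(mktemp -d)"
touch datestamp.a datestamp.b      # empty stamp files to be filled
echo old > datestamp.c             # non-empty, should be left alone

find . -name 'datestamp.*' -type f -empty \
    -exec sh -c 'date +%Y%m%d%H%M%S > "$1"' _ {} \;
```

Afterwards datestamp.a and datestamp.b contain the current timestamp, while datestamp.c is untouched because -empty filtered it out.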

How to search for a string in entire linux system? [closed]

Closed 7 years ago.
I am looking to search for a string, for example "/uniquexx", across the entire hard drive and find the files where it is referenced. How could I do that? I tried grep and find / but had no luck.
grep -r blablastring /
-r means searching recursively through all subdirectories.
If you want to search only text files, try ack. It's like grep, but defaults to skipping file types it recognizes as binary. It also highlights matches by default, when searching recursively in a directory.
Some answers point to the use of grep alone. What a bare recursive grep actually does is visit every file on the system and search each one. You can narrow the candidates with find first and hand them to grep in batches:
find / -xdev '(' -type f -a -name '*.txt' -a -size -2M -a -mtime -5 ')' -print0 | xargs -0 grep -H "800x600"
read more: How to search text throughout entire file system?
You can try:
grep -r -H "your string" /home/yourdir
-H means the filename containing your string will be shown.
Anyway, if you want to search the WHOLE Linux filesystem, you will need root (sudo) privileges.
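If a whole-filesystem grep is too slow, --include can limit the recursion to certain file names. A sketch in a scratch tree (all names here are hypothetical):

```shell
d=$(mktemp -d)
mkdir -p "$d/sub"
echo '/uniquexx is referenced here' > "$d/sub/conf.txt"
echo 'nothing relevant' > "$d/notes.txt"

# -r recurse, -l print only matching file names, --include filter by name:
grep -rl --include='*.txt' '/uniquexx' "$d"
```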

find -mtime returns wrong listing [closed]

Closed 9 years ago.
When I run this command:
root:/home/mws 0$ ls -lrt `find /home/data/ll_misc_logs/ -mtime +20`
And there are no files meeting the -mtime setting (older than 20 days), it lists the contents of the current directory, /home/mws.
Why?
Is there a way to just return nothing or a message?
When there are no files meeting the mtime setting, the output of find .... expands to ... nothing. In which case, your command becomes ls -lrt, which will always list the current directory.
If there aren't too many files on a typical run, this might work better:
find /home/data/ll_misc_logs -mtime +20 -print0 | xargs -0 -r ls -ltr
But, if you get so many files that xargs decides to split it into multiple invocations, it probably won't do exactly what you want, either.
Which leads me to... What exactly are you trying to do? On the surface, it looks like "show me the old files, in order by modification time", but it's likely part of something bigger that might be solved in a more efficient (and less error-prone) manner...
If you just want a list of files older than 20 days sorted by oldest first:
find /home/data/ll_misc_logs -mtime +20 -exec ls -l --time-style=+%s {} \; | sort -n -k 6
(Note the leading + in --time-style=+%s: GNU ls requires it for custom formats, and field 6 of the listing is then the epoch timestamp that sort orders on.)
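The empty-expansion pitfall from the question can be reproduced and avoided in a scratch directory (assuming GNU xargs, which provides -r):

```shell
cd "$(mktemp -d)"
touch fresh.log      # modified just now, so -mtime +20 matches nothing

# `ls -lrt $(find . -mtime +20)` would list the current directory here,
# because the substitution expands to nothing. xargs -r runs ls only
# when it actually receives input:
find . -mtime +20 -print0 | xargs -0 -r ls -ltr
```

With no matches, this prints nothing at all instead of a directory listing.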
