A few days ago I was reading about the Linux find tool and based on that I issued the following command to see if I have the Python.h file:
find . 'Python.h'
The problem is that all files in current dir and subdirs are returned. Shouldn't I get what I'm looking for?
You left out the parameter specifier -name:
find ./ -name 'Python.h'
find will recurse through all directories in the current directory. If you just want to see whether you have a file in the current directory, use ls:
ls Python.h
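If you do want find's pattern matching but only at the top level, GNU find's -maxdepth option can limit the depth (shown here as an alternative; it is not part of the answer above):
find . -maxdepth 1 -name 'Python.h'   # search only the current directory, don't descend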
Use the -name switch:
find . -name 'Python.h'
Otherwise find takes the name as a location to look at.
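To illustrate the point (assuming GNU find, where every argument before the expression is a starting point):
find . 'Python.h'         # two starting points: . and Python.h
find . -name 'Python.h'   # one starting point (.), filtered by -name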
Spinning off from another thread, Move file that has aged x minutes, this question came up:
How does the find command typically found on Linux search for files in the current directory?
Consider a directory that contains a fairly large number of files. Then:
Firstly, find MY_FILE.txt returns immediately, and secondly, find . -name MY_FILE.txt takes much longer.
I used strace -c to see what happens for both and I learned that the second command invokes a directory scan, which explains why it's slower.
So the first command must be optimized somehow. Can anybody point me to an appropriate resource or give a quick explanation of how this might be implemented?
The syntax for find is find <paths> <expression>, where paths is a list of files and directories to start the search from. find starts from those locations and then recurses (if they're directories).
When you write find . -name MY_FILE.txt it performs a recursive search under the ./ directory. But if you write find MY_FILE.txt then you're telling it to start the search at ./MY_FILE.txt, and so it does:
$ strace -e file find MY_FILE.txt
...
newfstatat(AT_FDCWD, "MY_FILE.txt", 0x556688ecdc68, AT_SYMLINK_NOFOLLOW) = -1 ENOENT (No such file or directory)
...
find: ‘MY_FILE.txt’: No such file or directory
+++ exited with 1 +++
Since the path doesn't exist, it only takes a single system call to determine that there's no such file. It calls newfstatat(), gets a No such file or directory error, and that's that.
In other words, find MY_FILE.txt isn't equivalent to find . -name MY_FILE.txt. Heck, I wouldn't even call it useful because you're not asking it to search. You're just asking it to tell you if MY_FILE.txt exists in the current directory or not. But you could find that out by simply calling ls MY_FILE.txt.
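For instance, either of these plain shell checks answers that question (the test form is just one common scripting alternative):
ls MY_FILE.txt                        # prints the name if it exists, errors otherwise
[ -e MY_FILE.txt ] && echo "exists"   # the same check in a script-friendly form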
Here's the difference:
[~]$ cd /usr
[/usr]$ find . -name sha384sum
./bin/sha384sum
[/usr]$ find sha384sum
find: ‘sha384sum’: No such file or directory
The first one performs a recursive search and finds /usr/bin/sha384sum. The second one doesn't recurse and immediately fails because /usr/sha384sum doesn't exist. It doesn't look any deeper. It's done in a nanosecond.
I have a file named pqr.txt. This file is present in many of the subdirectories, but I want to find only the ones whose path is ../../whatever/whatever/pqr/pqr.txt
I am doing this operation with the find command using the -name option, but how can I add the constraint that it has to find pqr.txt only in pqr directories?
Also, I am passing the name as a command-line argument.
find . -name pqr.txt
If you want to find only the pqr.txt that are in a directory named pqr,
you can use the -path option of find:
find . -path '*/pqr/pqr.txt'
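Since the file name arrives as a command-line argument, a tiny wrapper along these lines should work (a sketch; the script name findinpqr.sh and its argument handling are hypothetical):
#!/bin/sh
# Usage: ./findinpqr.sh pqr.txt
# Searches the current tree for "$1", but only inside directories named pqr.
name="$1"
find . -path "*/pqr/$name"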
This works:
:/# find . -name *gedit*
It finds all matching files on the local disk as well as on a backup disk mounted at /media/backup/root.
However, change directory to /media/backup/root/ and the same command does not work:
:/media/backup/root# find . -name *gedit*
It finds nothing.
Why?
You need quotes around your search term:
:/media/backup/root# find . -name "*gedit*"
The first parameter to find (in this case .) tells find where it should start searching.
So when your current directory is / (the root of the file system), the whole system gets searched, whereas when your current directory is /some/other/path, only that path downward is searched.
If you want to always search the whole file system use
find / -name "*gedit*"
The quotes are important, they prevent the shell from expanding the *.
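You can see what the shell actually hands to find by echoing the same arguments from each directory (a quick illustrative check; the result of the unquoted form depends on what files happen to sit in your current directory):
echo find . -name *gedit*     # shows the command line after shell expansion
echo find . -name "*gedit*"   # the pattern reaches find unchanged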
I am using CentOS Linux release 7.0.1406 on virtual box. I am trying to find the files using find command.
This find command is not giving any response:
find . -name "orm.properties"
My current working directory is /eserver6. File orm.properties is present in /eserver6/share/system/config/cluster, but find command is not able to find the file.
I have tried other combinations like
find . -name "orm.*"
find . -name 'orm*'
These find a few files starting with orm, but not all the files present inside the current working directory.
The command line looks correct and it should find the file. Some reasons why it might fail:
You don't have permission to enter one of the folders in the path to /eserver6/share/system/config/cluster.
You made a typo
The file system is remote and the remote file system behaves oddly
There is a symlink somewhere in the path. By default, find doesn't follow symlinks to avoid recursive loops. Use find -L /eserver6 ... to tell find to look at the target of the link and follow it if it's a folder.
The command
find /eserver6 -name "orm.properties"
should definitely find the file, no matter where you are. If it doesn't, look at the -D debugoptions switch in the man page. You probably want -D stat to see which files find examines and what it sees about them.
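For example, with GNU find the debug flag goes before the starting point (a sketch; the exact diagnostic output depends on your findutils version):
find -D stat /eserver6 -name "orm.properties"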
If your user has an entry in the sudoers file, then it's OK and you can run
sudo find / -name "orm.properties"
Otherwise, ask your admin to add an entry for your user to the sudoers file; then the same command will work.
I do not have a working Linux system to try these commands out with, so I am asking here whether what I am planning to do is correct. (I'm doing this while downloading an ISO over a connection that I think dial-up would beat.)
1, I am trying to find all files with the .log extension in the /var/log directory and sub-directories, writing standard out to logdata.txt and standard error to logerrors.txt
I believe the command would be:
$ find /var/log/ -name *.log 1>logdata.txt 2>/home/username/logs/logerrors.txt
2, Find all files with .conf in the /etc directory. Standard out will go to a file called etcdata and standard error to etcerrors.
$ find /etc -name *.conf 1>etcdata 2>etcerrors
3, Find all files that have been modified in the last 30 minutes in the /var directory. Standard out is to go into vardata and errors into varerrors.
Would that be:
$ find /var -mmin 30 1>vardata 2>varerrors
Are these correct? If not what am I doing wrong?
1, I am trying to find all files with the .log extension in the /var/log directory and sub-directories, writing standard out to logdata.txt and standard error to logerrors.txt
Here you go:
find /var/log/ -name '*.log' >logdata.txt 2>/home/username/logs/logerrors.txt
Notes:
You need to quote '*.log', otherwise the shell will expand it before passing it to find.
No need to write 1>file, >file is enough
2, Find all files with .conf in the /etc directory. standard out will be a file called etcdata and standard error to etcerrors.
As earlier:
find /etc -name \*.conf >etcdata 2>etcerrors
Here I escaped the * another way, for the sake of an example. This is equivalent to '*.conf'.
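All three of these forms hand the same literal pattern to find; double quotes are also fine here because the pattern contains nothing else the shell would expand:
find /etc -name '*.conf'    # single quotes
find /etc -name "*.conf"    # double quotes
find /etc -name \*.conf     # backslash escape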
3, find all files that have been modified in the last 30 minutes in the /var directory. standard out is to go into vardata and errors into varerrors.
find /var -mmin -30 >vardata 2>varerrors
I changed -mmin 30 to -mmin -30. This way it matches files modified within the last 30 minutes. Otherwise it matches only files that were modified exactly 30 minutes ago.
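For reference, the sign controls the comparison (this just restates the -mmin behaviour described above):
find /var -mmin -30   # modified less than 30 minutes ago
find /var -mmin 30    # modified exactly 30 minutes ago
find /var -mmin +30   # modified more than 30 minutes ago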
When using wildcards in the command, you need to make sure that they do not get interpreted by the shell, so it is better to enclose the expression containing wildcards in quotes. Thus, the first one will be:
find /var/log/ -name "*.log" 1>logdata.txt 2>/home/username/logs/logerrors.txt
The same comment applies to the second one, where you should have "*.conf".