How can I find a file within a specific directory name? - linux

So I need to find all files in /home/ with a file name of "options.php".
find . -name "options.php"
When run from /home/, that finds all options.php files; however, I only want the options.php files that sit directly inside a /public_html/ directory.
So in other words, it should ignore all other 'options.php' files found.
e.g., should show these results:
/home/usr1/public_html/options.php
/home/usr2/public_html/options.php
e.g., shouldn't show me:
/home/usr1/public_html/wp-admin/options.php
/home/usr2/public_html/wp-content/plugins/whatever/options.php

You can pass a pattern via the -path option as follows:
find /home/ -path '*/public_html/options.php'
For a more flexible pattern, use -regex, which accepts a regular expression matched against the whole path. But in this particular case -regex has no advantage over -path:
find /home/ -regex '.*/public_html/options.php'
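One case where -regex does become more flexible is alternation, which -path's glob patterns cannot express. As a sketch, assuming GNU find's default emacs-style regex syntax and a hypothetical second directory name "www":
find /home/ -regex '.*/\(public_html\|www\)/options\.php'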

Filter the desired results from the found results with grep.
find . -name "options.php" | grep 'public_html/options.php'

You can limit the depth of find:
find . -maxdepth N
That way it should only find options.php files down to your desired folder depth.
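For the layout in the question, where the files sit at /home/usrN/public_html/options.php (three levels below /home), a concrete call might look like the line below; note that this relies on depth alone, so combining it with -path as in the other answers is more precise:
find /home/ -maxdepth 3 -name options.php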

The ls utility is much better suited for this task:
ls -1 /home/*/public_html/options.php
If you want to process the result list and do not want to have an error message or warning in case no such files are found, then simply redirect the error output of the command:
ls -1 /home/*/public_html/options.php 2>/dev/null
An alternative using the find utility would be:
find /home -path "*/public_html/options.php"
Or, if you want to prevent matches in folders called "public_html" further down in the hierarchy:
find /home -path "/home/*/public_html/options.php"
find /home -maxdepth 3 -path "*/public_html/options.php"

Related

GNU findutils doesn't find file

When searching for .txt files that are in 2 different home directories, only one shows up, depending on the present working directory. Why is this?
/home/bob/1.txt
/home/alice/dir1/2.txt
pwd /tmp
[root@host tmp]# find /home -name *.txt
/home/bob/1.txt
/home/alice/dir1/2.txt
pwd /home/bob
[root@host bob]# find /home -name *.txt
/home/bob/1.txt
Why does searching from within the bob directory only return the one file?
Because when the working directory is /home/bob, the *.txt in the find command is expanded by the shell (to 1.txt) and that is what is passed to find. That is, find /home -name 1.txt. That will find the file in /home/bob, but not the differently named one in /home/alice. It would find /home/alice/1.txt if such a file existed.
On the other hand, when the pattern does not match any file (relative to the working directory) it is passed on as a literal. At least by default -- you should be careful about this, because the pattern would instead be expanded to nothing if the nullglob shell option were in effect and the find command were executed from a location where the pattern didn't match any files.
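A quick way to see the expansion at work, assuming the /home/bob layout above (illustrative only):
cd /home/bob
echo *.txt                    # the shell expands this to: 1.txt
echo find /home -name *.txt   # so find actually receives: find /home -name 1.txt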
If you want to ensure that shell pathname expansion is not applied to the pattern then quote it:
find /home -name '*.txt'
or
find /home -name \*.txt
or ....

Why is find's -exec option including 'non-matched' items?

I'm trying to use find to exclude/filter a few directories from being copied to another backup directory.
My attempts to do so using find's '-exec' option end up copying every processed file instead of only the matches, so I'm quite confused about what the expected behavior should be and would appreciate help gaining a better understanding.
Starting point:
me@computer>ls
AddMonitorsOnEntry MantisCoreFormatting MantisGraph PastePicture XmlImportExport
Make sure find excludes the unwanted 'files' as expected
me@computer>find . -maxdepth 1 -not -regex '.*MantisCoreFormatting\|.*MantisGraph\|.*XmlImportExport'
.
./AddMonitorsOnEntry
./PastePicture
Now to copy those 2 directories to a backup dir:
me@computer>find . -maxdepth 1 -not -regex '.*MantisCoreFormatting\|.*MantisGraph\|.*XmlImportExport' -exec cp -dr '{}' ~/backup \;
Now to see if it worked...
me@computer>cd ~/backup
me@computer>ls
AddMonitorsOnEntry backup MantisCoreFormatting MantisGraph PastePicture XmlImportExport
WTH??
I thought '-exec' only operated on the matches, according to this snippet from the man page: " ...The specified command is run once for each matched file..."
I know there are other ways to accomplish this task, but '-exec' seems to work well enough for the poster here https://unix.stackexchange.com/questions/50612/how-to-combine-2-name-conditions-in-find/50633. I'm looking for help understanding how to make use of "-exec" versus using xargs or something else. Thanks.
Now to copy those 2 directories to a backup dir
You don't have 2 matches. Your command shows 3:
.
./AddMonitorsOnEntry
./PastePicture
. is the current directory, so your cp command copies everything.
Instead of find . you can use find * to skip the current directory ., but still process all the (non-hidden) files/dirs within it.
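A minimal sketch of that variant, assuming the same directory layout and exclusion regex (with explicit starting points, -maxdepth 0 keeps find from descending into them):
find * -maxdepth 0 -not -regex '.*MantisCoreFormatting\|.*MantisGraph\|.*XmlImportExport' -exec cp -dr '{}' ~/backup \;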
Silly of me..
My initial find expression includes the current directory as a result, so any files in the current dir will be operated on by "-exec".
To fix it, I added the current dir to the ones excluded.
me@computer>find . -maxdepth 1 -not -regex '.*MantisCoreFormatting\|.*MantisGraph\|.*XmlImportExport\|\.'
./AddMonitorsOnEntry
./PastePicture

Exclude range of directories in find command

I have a directory called test which has subfolders in a date range like 01, 02, ..., 31. All these subfolders contain .bz2 files. I need to search for all files with the .bz2 extension using the find command, but excluding a particular range of directories. I know about find . -name "*.bz2" -not -path "./01/*", but repeating -not -path "./01/*" would be painful if I wanted to skip 10 directories. So how would I skip the 01..19 subdirectories in my find command?
You can use wildcards in the pattern for the option -not -path:
find ./ -type f -name "*.bz2" -not -path "./0*/*" -not -path "./1*/*"
This will exclude all directories starting with 0 or 1. Or even better:
find ./ -type f -name "*.bz2" -not -path "./[01]*/*"
Firstly, you can help find by using -prune rather than -not -path; that avoids even looking inside the excluded directories.
To your main point - you can build a wildcard for your example (numeric 01 to 19):
find . -path './0[1-9]' -prune -o -path './1[0-9]' -prune -o -print
If your range is less convenient (e.g. 05 to 25) you might want to build the range into a bash variable, then interpolate that into the find command:
a=("-path ./"{05..25}" -prune -o")
find . ${a[*]} -print
(you might want to echo "${a[*]}" or printf '%s\n' ${a[*]} to see how it's working)
For me, the find command as a standalone tool is somewhat cumbersome. Therefore, I always end up using a combination of find just for the recursive file search and grep for the actual exclusion/inclusion. Finally, I hand the results over to a third command that performs the actions, like rm to remove files, for example.
My generic command would look something like this:
find [root-path] | grep (-v)? -E "cond1|cond2|...|condN" | [action-performing-tool]
root-path is where to start the search recursively
the optional -v flag inverts the matching results.
cond1 ... condN are the conditions for the matching. When -v is involved, these are the conditions not to match.
the action-performing-tool does the actual work
For example, say you want to remove all files not matching some conditions in the current directory:
find . -not -name "\." | grep -v -E "cond1|cond2|cond3|...|condN" | xargs rm -rf
As you can see, we search in the current directory, indicated by the dot as root-path; then we invert the matching results, because we want all files not matching our conditions; and finally we pass all files found to rm in order to delete them. I add -rf to recursively/forcibly delete. I used find with -not -name "\." to exclude the current directory itself, which is normally indicated by the dot.
For the actual question: assume we have a directory containing .git and .metadata directories, and we want to exclude them from our search:
find . -not -name "\." | grep -v -E ".git|.metadata" | [action-performing-tool]
Hope that helps!
If you want to exclude a child directory under a parent directory, then this might be useful:
E.g., you have a parent directory "ParentDir" and it has two child directories, "Child1" and "Child2". You want to read files from "Child2" only and skip "Child1". Then this will help.
find ./ParentDir ! -path "./ParentDir/Child1*" -name "*.<extension>"
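For instance, to pick up only .txt files under Child2 while skipping Child1 (.txt here is just a stand-in for whatever extension you need):
find ./ParentDir ! -path "./ParentDir/Child1*" -name "*.txt"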

check if a file is in a folder or its subfolder using linux terminal

I want to check whether a particular file is in a folder or one of its subfolders, using the Linux terminal.
Which should I use for this? I tried the find and grep commands, but they only traverse one folder.
In order to search from your current directory, use
find . -name filename
In order to search from the root directory, use
find / -name filename
If you don't know the file extension, try
find . -name "filename.*"
Also note that the find command only displays files in paths which you have permission to view. If you don't have permission for the path a/b/c, it will just display a message saying that the path can't be searched.
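If those permission messages clutter the output, you can discard them by redirecting standard error, as in the earlier ls example:
find / -name filename 2>/dev/null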
If you want to search by filename, use find:
find /path -name "filename"
example:
find . -name myfile.txt
If you need to find all files containing a specific string, use grep:
grep -r "string" /path
example:
grep -r foobar .
By default, find will traverse all subdirectories, for example:
mkdir level1
mkdir level1/level2
touch level1/level2/file
find . -name "file"
Output:
./level1/level2/file
locate filename
This is the simplest command (it relies on locate's prebuilt index of the filesystem).
I also prefer using a combination of tree and grep. Something like
tree | grep filename
Try
find . -name "filename" -type f
-type f restricts the results to regular files (replace . with the path you want to search).

how to recursively list files from a folder?

'ls dir1/*/*.ext' just lists all the files with one level of nesting. What is the command to recursively list all the files, with any level of nesting, in Linux?
ls -R dir1
Or:
find dir1 -name "*.ext"
The find command is one way to do this:
find dir1 -name *.ext
The -name operator can take a wildcard to match with, but it's important to quote the wildcard expression so that it won't be expanded by your shell before calling into find:
find dir1 -name "*.ext"
The find command has many operators that can do various different tests on the files in the directory, of which -name is just one example. Consult the find manual page for more information.
To list a folder recursively:
ls -R
You could use find:
find .
That command would list everything under the current folder.
