How can I find a directory and a subdirectory and a subdirectory in Linux?

I am new to Linux. I am trying to find a directory structure that looks like this: /org/voltdb/client.
I can find lots of tutorials showing how to find a directory but none showing how to find an entire directory path.
How can I search for a particular directory hierarchy?
This does not work: find / -type d -name "/org/voltdb/client" -ls

Jon Lin's solution does not work, because '/org/voltdb/client' will never appear in the output of find -name 'org' (find prints paths that end at the matched directory, not the hierarchy below it). You should use this instead:
find / -type d -name 'client' | grep /org/voltdb/client

You can use the find command. It's very versatile for searching with patterns. Try:
find / -type d -path '/org/voltdb/client'
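Note that -path compares its pattern against the whole path find prints, so if the hierarchy can sit below some other prefix, a leading wildcard helps (a sketch along the same lines; * also matches the empty string, so a path directly under / is still found):
find / -type d -path '*/org/voltdb/client'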

You could just grep the output:
find / -type d -name 'org' | grep /org/voltdb/client

Related

Replace a file if path of the file contains a string in Linux [duplicate]

I know how to find files using
find . -name "file_name"
But if I am given one part of a path, say "folder1/subfolder2/", how do I get all the full paths that contain this partial path?
Example
partial path: folder1/subfolder2/
desired result:
/bob/folder1/subfolder2/yo/
/sandy/folder1/subfolder2/hi/
Use the -path option:
find . -path '*/folder1/subfolder2/*'
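With the layout from the question, and the search started at the filesystem root, this would print something like (a sketch, assuming those example directories exist):
find / -type d -path '*/folder1/subfolder2/*'
/bob/folder1/subfolder2/yo
/sandy/folder1/subfolder2/hi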
You may do it like below:
find . -path "*folder1/subfolder2" -prune -exec find {} -type f -name file.txt \;
With -prune, find does not descend into a directory once it has matched, so you don't recurse after the first match.
This one worked for me (using bash)
ls -l /**/folder1/subfolder2/**
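Note that recursive ** globbing is disabled by default in bash; it needs the globstar option (bash 4 or later), roughly like this:
shopt -s globstar
ls -ld /**/folder1/subfolder2/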
I came up with this because the other solutions did not work for me. find1 is a shell function:
find1 ()
{
    # print the absolute path of each file under . whose name matches $1
    for file in $(find . -name "$1"); do
        full_path="$PWD/${file#./}"   # strip the leading ./ that find prints
        echo "$full_path"
    done
}
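Usage would then be, for example (hypothetical file name; note the loop splits find's output on whitespace, so paths containing spaces will break it):
find1 "file_name"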
If you don't need to stay POSIX-compliant, at least on Linux you can also use the -regex (and -regextype) options for this purpose.
For instance:
find folder/ -regextype posix-extended -regex "(.*/)?deer/(.*/)?beer"
will match
folder/deer/beer
folder/deer/dir/forest/beer/
folder/forest/deer/dir/forest/beer/
etc.
See the Linux man page for details.

find command with regex to find directories

I have the following Linux find command that takes a base path and then searches for any directories whose name contains 3.7.1.
In my base path /some/folder/path/folder1 I would like to change folder1 into a pattern, something like folder[1-3]. How can I do this?
# working command
find /some/folder/path/folder1/another_folder -type d -name "3.7.1*" -ls
# would like to use some regular expression on folder1-3
find /some/folder/path/folder[1-3]/another_folder -type d -name "3.7.1*" -ls
Error I am seeing: '/folder[1-3]/another_folder/': No such file or directory
That's the correct syntax already:
find /some/folder/path/folder[1-3] -type d -name "3.7.1*" -ls
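The folder[1-3] part is a shell glob: the shell expands it before find runs, so it works only when the pattern is unquoted and at least one matching directory exists (otherwise the literal string is passed through and you get the No such file or directory error above). Alternatively you can let find do the matching itself, a sketch assuming the 3.7.1* directories sit under an another_folder subdirectory as in the question:
find /some/folder/path -path '*/folder[1-3]/another_folder/*' -type d -name "3.7.1*" -ls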

How can I find a file within a specific directory name?

So I need to find all files in /home/ with a file name of "options.php".
find . -name "options.php"
When run in /home, that will find all options.php files; however, I want to find options.php files only when they are in /public_html/.
So in other words, it should ignore all other 'options.php' files found.
e.g., should show these results:
/home/usr1/public_html/options.php
/home/usr2/public_html/options.php
e.g., shouldn't show me:
/home/usr1/public_html/wp-admin/options.php
/home/usr2/public_html/wp-content/plugins/whatever/options.php
You can pass a pattern via -path option as follows:
find /home/ -path '*/public_html/options.php'
For a more flexible pattern use -regex which accepts a regular expression applied on the whole path. But in this particular case -regex has no advantage over -path:
find /home/ -regex '.*/public_html/options.php'
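As an illustration of where -regex does become handier, one pattern can match several alternatives at once (a hypothetical example that also matches a made-up settings.php; GNU find's default -regextype is emacs, where alternation is written \|):
find /home/ -regex '.*/public_html/\(options\|settings\)\.php'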
Filter the desired results from the found results with grep.
find . -name "options.php" | grep 'public_html/options.php'
You can limit the depth of find:
find . -maxdepth N
This way it should only find options.php down to the level of your desired folder.
The ls utility is much better suited for this task:
ls -1 /home/*/public_html/options.php
If you want to process the result list and do not want to have an error message or warning in case no such files are found, then simply redirect the error output of the command:
ls -1 /home/*/public_html/options.php 2>/dev/null
An alternative using the find utility would be:
find /home -path "*/public_html/options.php"
Or, if you want to prevent matches in folders called "public_html" further down in the hierarchy:
find /home -path "/home/*/public_html/options.php"
find /home -maxdepth 3 -path "*/public_html/options.php"

Exclude range of directories in find command

I have a directory called test which has subfolders for a date range, like 01, 02, ..., 31. All of these subfolders contain .bz2 files. I need to search for all files with the .bz2 extension using the find command, but excluding a particular range of directories. I know about find . -name "*.bz2" -not -path "./01/*", but writing a separate -not -path "./01/*" for every directory would be painful if I wanted to skip 10 directories. So how would I skip the 01..19 subdirectories in my find command?
You can use wildcards in the pattern for the option -not -path:
find ./ -type f -name "*.bz2" -not -path "./0*/*" -not -path "./1*/*"
This will exclude all directories starting with 0 or 1. Or even better:
find ./ -type f -name "*.bz2" -not -path "./[01]*/*"
Firstly, you can help find by using -prune rather than -not -path - that will avoid even looking inside the relevant directories.
To your main point - you can build a wildcard for your example (numeric 01 to 19):
find . -path './0[1-9]' -prune -o -path './1[0-9]' -prune -o -print
If your range is less convenient (e.g. 05 to 25) you might want to build the range into a bash variable, then interpolate that into the find command:
a=("-path ./"{05..25}" -prune -o")
find . ${a[*]} -print
(you might want to echo "${a[*]}" or printf '%s\n' ${a[*]} to see how it's working)
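A variant of the same idea builds the array element by element, which keeps the quoting straightforward (a sketch assuming bash and GNU find, reusing the 05..25 range from above and the *.bz2 files from the question):
a=()
for d in {05..25}; do a+=( -path "./$d" -prune -o ); done
find . "${a[@]}" -type f -name '*.bz2' -print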
For me, the find command as a standalone tool is somewhat cumbersome. Therefore, I always end up combining find, used just for the recursive file search, with grep for the actual exclusion/inclusion work. Finally I hand the results over to a third command which performs the action, for example rm to remove files.
My generic command would look something like this:
find [root-path] | grep (-v)? -E "cond1|cond2|...|condN" | [action-performing-tool]
root-path is where to start the search recursively
the -v option inverts the matching results.
cond1 through condN are the conditions for the matching. When -v is involved, these are the conditions not to match.
the action-performing-tool does the actual work
For example you want to remove all files not matching some conditions in the current directory:
find . -not -name "\." | grep -v -E "cond1|cond2|cond3|...|condN" | xargs rm -rf
As you can see, we search in the current directory, indicated by the dot as root-path; then we invert the matching results, because we want all files not matching our conditions; and finally we pass every file found to rm in order to delete it. I add -rf to delete recursively and to force the removal. I used find with -not -name "\." to exclude the current directory itself, which is normally indicated by the dot.
For the actual question: assume we have a directory containing .git and .metadata directories and we want to exclude them from our search:
find . -not -name "\." | grep -v -E ".git|.metadata" | [action-performing-tool]
Hope that helps!
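One caveat with this pipeline is that it breaks on file names containing spaces or newlines; a NUL-delimited variant (a sketch, assuming GNU find, grep and xargs) avoids that:
find . -mindepth 1 -print0 | grep -zvE '\.git|\.metadata' | xargs -0 rm -rf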
If you want to exclude a child directory under a parent directory, then this might be useful:
E.g. you have a parent directory "ParentDir" and it has two child directories, "Child1" and "Child2". You want to read files from "Child2" only and skip "Child1". Then this will help:
find ./ParentDir ! -path "./ParentDir/Child1*" -name "*.<extension>"

Output directory name from linux find command

How do I get the name of a folder from a Linux find command?
I have a command like this:
find /root/wgetlog -name -type d -empty
Which produces the following results:
/root/wgetlog/smil3
/root/wgetlog/smil5
/root/wgetlog/smil4
How do I get just the name of the folder?
Example:
smil3
smil4
smil5
find /root/wgetlog -type d -empty -printf "%f\n"
If all you need is a relative path, then
{ pushd /root/wgetlog/; find . -type d -empty; popd; }
is the approach, especially if you do care about subdirectories of /root/wgetlog/*.
Use basename:
find /root/wgetlog -type d -empty -exec basename {} \;
You don't need -name.
You could also use sed to filter out the leading elements of each path:
$ find /usr/bin -type d
/usr/bin
/usr/bin/multiarch-i386-linux
/usr/bin/multiarch-x86_64-linux
/usr/bin/gda_trml2pdf
/usr/bin/gda_trml2html
...
$ find /usr/bin -type d | sed 's|.*/||'
bin
multiarch-i386-linux
multiarch-x86_64-linux
gda_trml2pdf
gda_trml2html
...
This might be more portable than using the -printf option of find, although that should not be an issue if you stick to Linux.
Disclaimer: this will fail horribly if you have newlines in your file/folder names. On the other hand, this snippet is probably not the only thing that would fail in that case...
