How to list non-empty directories with the find or ls command? - linux

I need to list the non-empty directories under a directory.
I am trying to use
find . ! -type d -empty -print
but it doesn't work.

You're asking for "things that are not directories and are empty", which includes all zero-length files (and would exclude all directories). You want "things that are directories and are not empty", so:
find . -type d ! -empty -print
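As a quick sanity check, with a small throwaway tree (illustrative names; output order may vary):
$ mkdir -p test/full test/empty
$ touch test/full/file
$ find test -type d ! -empty -print
test
test/full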

Related

Delete old files using ls and find command

My folder structure looks like this:
folder1
---tmp
---sub1
folder2
---tmp
---sub2
folder3
---tmp
---sub3
folder4
---tmp
---sub4
I want to delete files that are older than 30 days in all the tmp folders.
List all tmp folders:
ls -d */tmp
Delete all files older than 30 days:
find . -mtime +30 -type f -delete
Could I combine these two steps into one command line?
What you can do is replace the . in your find with the actual directories you want to search in.
find */tmp -mtime +30 -type f -delete
If tmp can be several levels deeper, then you might be interested in
find . -regex '.*/tmp/[^/]+' -mtime +30 -type f -delete
or, similar to the first option, but using the double-star glob pattern (enabled with shopt -s globstar)
find **/tmp -mtime +30 -type f -delete
* Matches any string, including the null string. When the globstar shell option is enabled, and * is used in a pathname expansion context, two adjacent *s used as a single pattern will match all files and zero or more directories and subdirectories. If followed by a /, two adjacent *s will match only directories and subdirectories.
source: man bash
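Remember that globstar is off by default in bash, so enable it first in the shell where you expand the pattern (expansion shown for the example layout above):
$ shopt -s globstar
$ echo **/tmp
folder1/tmp folder2/tmp folder3/tmp folder4/tmp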
Note: you have to be careful though. If you have a directory folder1/tmp/foo/, then the above commands (with the exception of the regex version) will also select files in folder1/tmp/foo, which might not be wanted. In that case you might be interested in the extra option -maxdepth 1.
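For example, to restrict the first command to files sitting directly inside each tmp directory (a sketch for the layout above):
find */tmp -maxdepth 1 -mtime +30 -type f -delete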

Recursively delete all files that do not contain a word in their title

Hi, I want to recursively delete all files in a directory (which contains various folders) that do not contain the word "weight" in their file name. How can I do this?
Type this command first:
find '/path/to/directory' -type f \! -iname '*weight*'
If you are OK with deleting all the files suggested by the command, then you can actually delete them with:
find '/path/to/directory' -type f \! -iname '*weight*' -delete
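If you first want a rough idea of how many files would go, you can count the matches (same hypothetical path as above):
find '/path/to/directory' -type f \! -iname '*weight*' | wc -l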

find files or directories that are 30 or more days old, recursively, BUT starting the search from a specific directory

I have been searching but I cannot find a way to do the following in one line on Linux, so as to find files and directories that are more than 30 days old, starting the recursive search from script_dir:
cd $script_dir
find . -type f -or -type d -mtime +30
If I do not cd into the directory I need to start searching from (and instead use find directly), then, although I specify script_dir in the find command, the recursive search starts from the directory I am currently in and NOT from script_dir and below. I want to do something like the following, so that even if I am currently in a directory other than script_dir, the recursive search starts from script_dir:
find $script_dir -type f -or -type d -mtime +30
Thank you.
In one line, you can do it like this:
cd /path/to/directory && find . -type f -or -type d -mtime +30
That does the search from the specified directory.
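One caveat about the expression itself: in find, the implicit -and binds tighter than -or, so -mtime +30 above applies only to the -type d branch. To apply the age test to both files and directories, group the type tests (a sketch using the script_dir variable from the question):
cd "$script_dir" && find . \( -type f -or -type d \) -mtime +30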

how to exclude all subdirectories of a given directory in the search path of the find command in unix

I need to back up the whole directory hierarchy of our servers, so I need to list all the subdirectories of some of the directories on the server.
The problem is that one of those subdirectories contains tens of thousands of subdirectories (a file with only the names of the subdirectories could take a couple of hundred megabytes, and the respective find command takes very long).
For example, if I have a directory A and one subdirectory A/a that contains tens of thousands of subdirectories, I want to use the find command to list all the subdirectories of A, excluding all the subdirectories of A/a but not excluding A/a itself.
I tried many variations of -prune using the answers in this question to no avail.
Is there a way to use the find command in UNIX to do this?
UPDATE:
The answer by @devnull worked very well, but now I have another problem, so I will refine my question a little. I used the following command:
find /var/www -type d \( ! -wholename "/var/www/web-release-data/*" ! -wholename "/var/www/web-development-data/*" \)
The new problem is that find for some reason is still traversing the whole directory trees of "/var/www/web-release-data/" and "/var/www/web-development-data/", so it's very slow, and I fear it could take hours.
Is there any way to make find completely exclude those directories and not traverse their respective directory hierarchies?
The following should work for you:
find A -type d \( ! -wholename "A/a/*" \)
This would list all subdirectories of A including A/a but excluding subdirectories of A/a.
Example:
$ mkdir -p A/{a..c}/{1..4}
$ find A -type d \( ! -wholename "A/a/*" \)
A
A/c
A/c/4
A/c/2
A/c/3
A/c/1
A/a
A/b
A/b/4
A/b/2
A/b/3
A/b/1
Another solution:
find A \! -path "A/a/*"
If you don't want a as well, use
find A \! -path "A/a/*" -a \! -path "A/a"
Have you tried rsync(1)? It has an option --exclude=PATTERN which might work well here:
rsync -avz --exclude=A/a <source> <target>
Using rsync you wouldn't need to use find(1) at all.
To exclude 2 subdirs:
find . -type d ! -wholename "dir/name/*" -a ! -wholename "dir/name*"
To answer your updated question, you can do
find /var/www \( -wholename "/var/www/web-release-data/*" -o -wholename "/var/www/web-development-data/*" \) -prune -o -type d -print
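For the original A/a example, the same -prune idea boils down to something like this (a sketch):
find A -path "A/a/*" -prune -o -type d -print
This still prints A/a itself, but find only looks at the direct children of A/a in order to prune them and never descends any deeper, which is what keeps it fast on huge subtrees.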

Find Directories With No Files in Unix/Linux

I have a list of directories
/home
/dir1
/dir2
...
/dir100
Some of them have no files in them. How can I use Unix find to list them?
I tried
find . -name "*" -type d -size 0
Doesn't seem to work.
Does your find have the -empty predicate?
You should be able to use find . -type d -empty
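If the directories on your list all sit directly under one parent, you can also keep the search shallow (a sketch, run from that parent directory):
find . -maxdepth 1 -type d -empty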
If you're a zsh user, you can always do this. If you're not, maybe this will convince you:
echo **/*(/^F)
**/* will expand to every child node of the present working directory and the () is a glob qualifier. / restricts matches to directories, and F restricts matches to non-empty ones. Negating it with ^ gives us all empty directories. See the zshexpn man page for more details.
-empty reports empty leaf dirs.
If you want to find empty trees then have a look at:
http://code.google.com/p/fslint/source/browse/trunk/fslint/finded
Note that the script can't be used without the other support scripts,
but you might want to install fslint and use it directly.
You can also use:
find . -type d -links 2
A directory's link count is 2 (its name in the parent plus its own . entry) plus one for each subdirectory, so this matches directories that have no subdirectories; they can still contain regular files.
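A small demonstration, on a filesystem with traditional directory link counts such as ext4 (btrfs reports 1 link for every directory, so this trick does not work there):
$ mkdir -p demo/leaf
$ touch demo/leaf/file
$ find demo -type d -links 2
demo/leaf
As the next answer points out, the leaf directory is reported even though it contains a file.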
The answer by Pimin Konstantin Kefalou prints folders that have only 2 links but still contain other entries (files, symlinks, ...).
The easiest way I have found is:
for directory in $(find . -type d); do
    if [ -z "$(find $directory -maxdepth 1 -type f)" ]; then
        echo "$directory"
    fi
done
If you have names with spaces, use quotes: "$directory".
You can replace . by your reference folder.
I haven't been able to do it with one find instruction.
