recursively find directories without setgid set - linux

In Linux, how do you recursively pull up a list of all directories that do NOT have the setgid bit set?
I know you can do
find . -type d -perm -g+s
to find all the directories that have it set, but it's not obvious to me how to negate this, or whether another tool is more appropriate for this use case.
I've got a rather large directory tree and I'm trying to limit the operations I run on it.

You can simply add \! before an expression in find in order to negate it.
find . -type d \! -perm -g+s
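Since the question mentions wanting to limit the work done on a large tree, a natural follow-up is to act only on the directories that come back; the chmod step here is an assumption about intent, shown only as a sketch:
# dry run: list directories that are missing the setgid bit
find . -type d ! -perm -g+s
# hypothetical follow-up: add the bit only where it is missing
find . -type d ! -perm -g+s -exec chmod g+s {} +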

Related

Using Perl how can I clean up left over directories with no files?

There is a specific directory which is used as a temp/scratch directory by some program.
E.g. /a/b/c/work
Under work multiple hierarchical directories may exist e.g.
/a/b/c/work/
    \d1/
        \d1.1
    \d2
        \d2.2
What I want is to clean up this work directory as there are left over files that take space.
Essentially I need to delete all subdirectories under work whose leaf directories are empty.
So if d1.1 is empty but d2.2 has files, then delete everything under d1 (including d1 itself) but not d2.
What is the cleanest/standard way to do this in Perl?
I thought about using a solution with backticks, e.g. rm -rf, but I figured there could be a better way than coding sequences of ls followed by rm.
Note: just to be clear, I want a solution in Perl as this is not a one-time thing and I don't want to do this manually each time.
If you use the find command this way, you can achieve it:
find /path/to/dir -empty -type d -delete
Where,
-empty Match only empty regular files and empty directories.
-type d Only match directories.
-delete Delete the matched files and directories.
Always put the -delete option at the end of the find command: the find command line is evaluated as an expression, so putting -delete first would make find try to delete everything below the starting points you specified.
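Because -delete implies -depth, nested empty directories are handled in a single pass: children are removed first, so by the time find tests the parent it is empty too. A throw-away recreation of the layout from the question (paths made up) shows this:
mkdir -p /tmp/work/d1/d1.1 /tmp/work/d2/d2.2
touch /tmp/work/d2/d2.2/keep.txt
find /tmp/work -empty -type d -delete
# d1.1 and d1 are gone; d2, d2.2 and the file remain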
To automate this in a shell script, use the following code:
path=$(pwd)
find "$path" -empty -type d -delete
Or you can pass the directory as an argument to the shell script, e.g. myShell.sh /path/to/mydir; in that case the following code will do the work:
path=$1
find "$path" -empty -type d -delete
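Put together as a minimal self-contained script (the name myShell.sh comes from the question; the usage check is merely an added safety assumption):
#!/bin/sh
# myShell.sh -- delete empty directory trees under the given path
# assumes GNU find (for -empty/-delete)
path=${1:?usage: myShell.sh /path/to/dir}
find "$path" -empty -type d -delete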
If you really want to do it in Perl, you can do it as follows:
use strict;
use warnings;
use File::Util;
my $path = '...';
my $fu = File::Util->new();
my @all_dirs = $fu->list_dir($path, '--recurse', '--dirs-only');
my @empty_dirs = grep { not $fu->list_dir($_) } @all_dirs;
There is also a short method:
perl -MFile::Find -e"finddepth(sub{rmdir},'.')"
which is explained very well here.
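For comparison, roughly the same depth-first rmdir idea can be written directly in the shell; this is only a sketch, and note that work itself is removed too if it ends up completely empty:
# depth-first: children are visited before their parents, so empty trees collapse
find /a/b/c/work -depth -type d -exec rmdir {} \; 2>/dev/null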

How do I find files/directories that are executable by anybody?

I'm trying to find all files or directories that can be executed by EITHER user, group, or other. So far I have come up with this
find . -perm -u+x
I do not know how to search for group and other as well. I know that
find . -perm -ugo+x
will search for files/directories that can be executed by all 3 of those (essentially a+x).
I have searched and cannot figure out how to look for any of those 3. One place suggested
find . -perm -u+x, g+x, o+x
but i get the error
find: -perm: u+x,: illegal mode string
any ideas?
Try this:
find . -perm /u=x,g=x,o=x
The POSIX-conformant syntax would be
find . \( -perm -u=x -o -perm -g=x -o -perm -o=x \)
Each of the three -perm primaries checks whether the execute bit is set for user, group, or other (the leading - means "at least these bits are set" rather than an exact permission match); they are joined by -o so that only one has to be true for the entire \(...\) group to be true.
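A quick way to convince yourself that both forms behave the same (the temp directory and file names are made up):
mkdir -p /tmp/permtest && cd /tmp/permtest
touch none uonly gonly oonly
chmod 644 none uonly gonly oonly   # start with no execute bits anywhere
chmod u+x uonly
chmod g+x gonly
chmod o+x oonly
find . -type f -perm /u=x,g=x,o=x                              # GNU syntax
find . -type f \( -perm -u=x -o -perm -g=x -o -perm -o=x \)    # portable form
# both should list ./uonly, ./gonly and ./oonly, but not ./none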

Find all directories without r/x permissions for world/other

I want to find all the directories that are not both readable and executable by 'others'. Or put another way, anything where the permissions for 'other' users are anything except r-x or rwx.
I thought this would work, but I'm off somehow:
find . -type d ! -perm -o+rw
This syntax will work:
find . -type d ! -perm -o+r,o+x
-perm -o+r,o+x is true only when both the read and execute bits are set for 'other', so negating it with ! matches every directory that is missing at least one of them.
Check the examples section of the man page for more info.
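To eyeball what matched, the same expression can be fed straight into ls for a read-only check:
find . -type d ! -perm -o+r,o+x -exec ls -ld {} +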

Find Directories With No Files in Unix/Linux

I have a list of directories
/home
/dir1
/dir2
...
/dir100
Some of them have no files in them. How can I use Unix find to find those?
I tried
find . -name "*" -type d -size 0
Doesn't seem to work.
Does your find have predicate -empty?
You should be able to use find . -type d -empty
If you're a zsh user, you can always do this. If you're not, maybe this will convince you:
echo **/*(/^F)
**/* expands to every file and directory below the present working directory, recursively, and the () is a glob qualifier. / restricts matches to directories, and F restricts matches to non-empty ("full") ones. Negating it with ^ gives us all empty directories. See the zshexpn man page for more details.
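A quick sanity check with a throw-away tree (the zsh line assumes zsh is installed; the find line is the equivalent from the answer above):
mkdir -p /tmp/globtest/full /tmp/globtest/empty
touch /tmp/globtest/full/file.txt
cd /tmp/globtest
zsh -c 'print -rl -- **/*(/^F)'    # prints: empty
find . -type d -empty              # prints: ./empty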
-empty reports empty leaf dirs.
If you want to find empty trees then have a look at:
http://code.google.com/p/fslint/source/browse/trunk/fslint/finded
Note that the script can't be used without the other support scripts, but you might want to install fslint and use it directly.
You can also use:
find . -type d -links 2
A directory's link count is 2 (its entry in the parent plus its own .) plus one for each immediate subdirectory (each subdirectory's .. entry); regular files do not add to it. So -links 2 finds directories with no subdirectories, although they may still contain files.
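A small demonstration of what -links 2 does and does not catch (made-up paths; the trick relies on traditional Unix directory link counts and does not hold on e.g. btrfs, where directories always report a link count of 1):
mkdir -p /tmp/linktest/leaf_with_file /tmp/linktest/has_subdir/child
touch /tmp/linktest/leaf_with_file/data.txt
find /tmp/linktest -type d -links 2
# prints leaf_with_file and child (no subdirectories), even though leaf_with_file contains a file;
# has_subdir is not printed because its child raises its link count to 3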
The answer of Pimin Konstantin Kefalou prints folders with only 2 links, which may still contain regular files.
The easiest way I have found is:
for directory in $(find . -type d); do
    if [ -z "$(find "$directory" -maxdepth 1 -type f)" ]; then
        echo "$directory"
    fi
done
If names contain spaces, the quotes around "$directory" matter (the for loop over $(find ...) will still split such names, though; a more robust variant is sketched below this answer).
You can replace . with your reference folder.
I haven't been able to do it with one find instruction.
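If directory names can contain spaces or newlines, a null-delimited loop avoids the word-splitting problem entirely; this sketch assumes bash (for read -d '') and a find that supports -print0:
find . -type d -print0 | while IFS= read -r -d '' directory; do
    if [ -z "$(find "$directory" -maxdepth 1 -type f)" ]; then
        printf '%s\n' "$directory"
    fi
done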

Find symlinks to certain directory or one of its subdirs

Is there an easy way to show whether there are any symlinks in a specified path pointing to a certain directory or one of its children?
A simple and fast approach, assuming that you have the target as an absolute path (readlink(1) may help with that):
find $PATH -type l -xtype d -lname "$DIR*"
This finds all symlinks (-type l) below $PATH which link to a directory (-xtype d) with a name starting with $DIR.
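For example (the paths here are invented), canonicalise the target first with readlink -f and then search:
dir=$(readlink -f /srv/shared/data)    # absolute, symlink-free form of the target
find /home -type l -lname "$dir*"      # symlinks under /home pointing at it or below it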
Another approach, which is O(n*m) and therefore may take ages and two days:
find $DIR -type d | xargs -n1 find $PATH -lname
The first find lists $DIR and all of its subdirectories, which are then passed (xargs), one at a time (-n1), to a second find that looks below $PATH for symlinks pointing to that path.
To sum things up: find(1) is your friend.
Following up on the answer given by earl:
-xtype does not work on Mac OSX, but can be safely omitted:
find $PATH -type l -lname "$DIR*"
Example (note $HOME rather than ~, since the tilde is not expanded inside quotes and link targets store literal paths):
find ~/ -type l -lname "$HOME/my/sub/folder/*"
Have a look at the findbl (bad links) script in fslint. It might give you some hints:
http://code.google.com/p/fslint/source/browse/trunk/fslint/findbl
