How to find all files which are soft or hard links of other directories or files on Linux?

How can I get a list of all linked files on my system, or under a certain directory? I used to create links, but over time they have become unmanageable. I want a list of all such links under a given directory. Can anyone help?

Finding symlinks is easy:
% find . -type l
Finding hard links is trickier, because every subdirectory of a directory raises that directory's hard link count: that's how subdirectories are linked to their parents in UNIX (via the .. entry in each subdirectory).
If you only want to find linked files (and not directories), this will work:
% find . -type f \! -links 1
This works because a file with additional hard links has a link count > 1, while a file with no extra links has a link count of exactly 1; hence this command finds all files whose link count is not 1.
Alternatively, on newer versions of find you could use:
% find . -type f -links +1
This works for the same reason as above; however, newer versions of find can take +n or -n instead of just a number. This is equivalent to testing for greater than n or less than n, respectively.
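To see why the link count matters, here is a minimal sketch (assuming GNU coreutils stat; the file names a and b are hypothetical):
% touch a
% stat -c '%h %i %n' a        # link count 1: no extra hard links yet
% ln a b                      # create a hard link to the same inode
% stat -c '%h %i %n' a b      # both now show link count 2 and the same inode
% find . -type f -links +1    # matches both a and b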

find / -xdev -samefile filename
This lists every path that refers to the same inode as filename, i.e. all of its hard links. The -xdev option keeps find from descending into other filesystems, which is safe here because hard links cannot span filesystems.

If you have GNU find, you can inspect hard link counts with -printf "%n",
e.g.
find /path -type f -printf "%f/%n/%i\n" | while IFS="/" read -r filename num_hlinks inum
do
  echo "Filename: $filename. Number of hard links: $num_hlinks, inode: $inum"
  # If two or more files share the same inode number, they are hard links
  # to the same file, so you can group the output by $inum to find them.
done
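A minimal sketch of that grouping step, assuming GNU find and sort: print the inode and path of every file with more than one link, then sort so that hard-linked paths land on adjacent lines sharing the same leading inode number.
find /path -type f -links +1 -printf '%i %p\n' | sort -n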

See e.g. here
https://www.gnu.org/software/findutils/manual/html_node/find_html/Hard-Links.html
or combine Alnitak's and amber_linux's answers into
find -L /where/to/search -samefile /some/link/to/file
to find all hard and soft links to a given file.
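For example (the path is hypothetical), this lists every name under /home that is the file itself, a hard link to it, or a symlink resolving to it:
find -L /home -samefile /home/alice/notes.txt 2>/dev/null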

Related

Finding and following symbolic links but without deleting them

We currently use a find command to find and delete expired files and directories. Which data counts as expired is driven by a properties file that specifies the destination path and the retention period.
If the properties file says…
"/home/some/working/directory;.;180"
…then we want files and empty subdirectories deleted after 180 days.
The original command is…
"find ${var[0]} -mtime +${var[2]} -delete &"
…but I now need to modify it, because we've discovered it has been deleting symbolic links that existed in the specified sub-directories once they passed the given expiration age. The path and expiration-time variables are read from the properties file (as shown above).
I have been testing using…
"find -L"
…to follow the symbolic links to make sure this clean up command reaches the destinations, as desired.
I have also been testing using…
"\! -type l"
…to ignore deleting symbolic links, so the command I've been trying is…
"find -L ${var[0]} ! -type l -mtime +${var[2]} -delete &"
…but I haven't achieved the desired results. Help please, I am still fresh to Linux and my research hasn't led me to an answer. Thank you for your time.
Change
\! -type l
to
\! -xtype l
With -L, find follows symbolic links, so -type l never matches a live link (find tests the link's target instead); -xtype l tests the type of the link itself. The full command becomes:
find -L ${var[0]} \! -xtype l -mtime +${var[2]} -delete &
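A quick way to convince yourself of the difference, using a throwaway directory (the /tmp/demo paths are just for illustration):
mkdir -p /tmp/demo/sub && touch /tmp/demo/file
ln -s /tmp/demo/file /tmp/demo/sub/link
find -L /tmp/demo -type l     # prints nothing: the live link is followed
find -L /tmp/demo -xtype l    # prints /tmp/demo/sub/link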

How can I count the number of files with a specific octal permission code without showing them in the shell

I tried using the tree command but couldn't work out how. (I wanted to use tree because I don't want the files to show up, just the number.)
Let's say c is the permission code.
For example, I want to know how many files there are with the permission 751.
Use find with the -perm test; given a bare octal mode it matches only files whose permission bits are exactly that mode.
For example, if you have the octal in $c, then run
find . -perm $c
The usual find options apply—if you only want to find files at the current level without recursing into directories, run
find . -maxdepth 1 -perm $c
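As an aside, -perm has three matching modes, and the bare form used above is the exact match; a rough sketch (the /mode form is GNU find):
find . -perm 751     # permission bits are exactly rwxr-x--x
find . -perm -751    # at least these bits are set (others may be set too)
find . -perm /111    # any execute bit is set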
To count the matching files, make find print a dot for every file and use wc to count the dots. (wc -l would miscount exotic filenames containing newlines, as BenjaminW. has pointed out in the comments; the idea of using wc -c comes from this answer.)
find . -maxdepth 1 -perm $c -printf '.' | wc -c
This will show the number of files without showing the files themselves.
If you're using zsh as your shell, you can do it natively without any external programs:
setopt EXTENDED_GLOB # Just in case it's not already set
c=0751
files=( **/*(#qf$c) )
echo "${#files[#]} files found"
will count all files in the current working directory and subdirectories with those permissions (And gives you all the names in an array in case you want to do something with them later). Read more about zsh glob qualifiers in the documentation.

How to search for files ending/starting/containing a certain letter in terminal?

I have been looking all over the internet for help with this. I want to list all files that start with, end with, or contain a certain letter, but the results I found on the internet do not seem to work for me. I need to use the ls command for this (it's an assignment).
I tried this code from another question:
ls abc* # list all files starting with abc---
ls *abc* # list all files containing --abc--
ls *abc # list all files ending with --abc
but whenever I try any of those it comes back with "ls: cannot access '*abc': No such file or directory"
Use find for finding files:
find /path/to/folder -maxdepth 1 -type f -name 'abc*'
This will give you all regular filenames within /path/to/folder which start with abc.
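As for the error in the question: it comes from the shell, not from ls. The shell expands the pattern first, and if nothing matches, bash passes the literal string *abc through to ls, which then reports "No such file or directory". A quick way to see the expansion step for yourself:
echo *abc*     # prints the matching names, or the literal pattern if none match
ls -d *abc*    # -d stops ls from listing the contents of matching directories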

php script just filled up harddrive with junk how do i find it?

I just ran a php script which filled up my *nix server's hard drive with 15GB of some sort of junk. How do I find the junk so I can delete it? I'm not sure if it's a huge error_doc file or what.
One option is to use the find command.
find / -type f -size +50M
will search downwards from the root directory for regular files larger than 50MB. If you want to limit how many subdirectory levels are searched, you can use the -maxdepth switch.
find / -maxdepth 3 -type f -size +50M
will look for files larger than 50MB, but will only recurse 3 directories down.
This assumes that you know that the files which were created are larger than a certain size, and you can pick them out if they are displayed.
You might also be able to make use of the knowledge that the files were created recently.
find / -type f -mmin -60
should find files which were modified within the past hour (note the minus sign: -mmin 60 alone would match only files modified exactly 60 minutes ago).
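If find alone doesn't narrow it down, a du-based sketch (GNU coreutils assumed) shows where the space went by listing the biggest directories first:
du -xh / 2>/dev/null | sort -rh | head -20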

Find symlinks to certain directory or one of its subdirs

Is there an easy way to show whether there are any symlinks in a specified path pointing to a certain directory or one of its children?
A simple and fast approach, assuming that you have the target as an absolute path (readlink(1) may help with that):
find $PATH -type l -xtype d -lname "$DIR*"
This finds all symlinks (-type l) below $PATH which link to a directory (-xtype d) whose name starts with $DIR. (Here $PATH stands for the search root from the question; reusing the shell's PATH variable name in a real script is risky, since it shadows the command search path.)
Another approach, which is O(n*m) and therefore may take ages and two days:
find $DIR -type d | xargs -n1 find $PATH -lname
The first find lists $DIR and all its subdirectories, which are then passed (xargs), one at a time (-n1), to a second find that looks for symlinks below $PATH pointing to each of them.
To sum things up: find(1) is your friend.
Following up on the answer given by earl:
-xtype does not work on Mac OSX, but can be safely omitted:
find $PATH -type l -lname "$DIR*"
Example:
find ~/ -type l -lname "$HOME/my/sub/folder/*"
Note that the pattern uses $HOME rather than a quoted ~: tilde is not expanded inside quotes, and -lname matches against the literal target stored in the link.
Have a look at the findbl (bad links) script in fslint. It might give you some hints:
http://code.google.com/p/fslint/source/browse/trunk/fslint/findbl
