Group directories which are symbolic links first with ls - linux

I used the --group-directories-first option to display directories at the top.
However, it only takes effect on real directories, not on directories that are symlinks.
For example, in my Downloads directory I keep dozens of symlinked directories to help categorize freshly downloaded files.
How can I group directories that are symlinks first?

You can use the -L option, which dereferences symlinks, to group symlinked directories first as well:
ls -lL --group-directories-first
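For example (a hypothetical setup to illustrate the difference):
mkdir real_dir && ln -s /tmp symlinked_dir && touch file.txt
ls -l --group-directories-first     # symlinked_dir sorts with the files
ls -lL --group-directories-first    # symlinked_dir sorts with the directories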

Related

rsync only certain types of files

I know there has been a huge discussion about this, but I have not found anything this specific.
I'm trying to copy all .key files in the /home// directory.
This does not work:
/usr/bin/rsync -auPA --include="*/*.key" --exclude="*" /home/* /tmp/test
This works, but it copies over unwanted empty directories like /home/uname/Documents:
/usr/bin/rsync -auPA --include="*/" --include="*.key" --exclude="*" /home /tmp/test
Basically, what I need rsync to do is copy only files with the .key extension and create only the folders that contain .key files.
I think you are looking for the -m option. From the man page:
-m, --prune-empty-dirs
This option tells the receiving rsync to get rid of empty directories from the file-list, including nested directories that
have no non-directory children. This is useful for avoiding the creation of a bunch of useless directories when the sending
rsync is recursively scanning a hierarchy of files using include/exclude/filter rules.
Note that the use of transfer rules, such as the --min-size option, does not affect what goes into the file list, and thus
does not leave directories empty, even if none of the files in a directory match the transfer rule.
Because the file-list is actually being pruned, this option also affects what directories get deleted when a delete is active.
However, keep in mind that excluded files and directories can prevent existing items from being deleted due to an exclude both
hiding source files and protecting destination files. See the perishable filter-rule option for how to avoid this.
You can prevent the pruning of certain empty directories from the file-list by using a global "protect" filter. For instance,
this option would ensure that the directory "emptydir" was kept in the file-list:
--filter 'protect emptydir/'
Here's an example that copies all .pdf files in a hierarchy, only creating the necessary destination directories to hold the
.pdf files, and ensures that any superfluous files and directories in the destination are removed (note the hide filter of
non-directories being used instead of an exclude):
rsync -avm --del --include='*.pdf' -f 'hide,! */' src/ dest
If you didn't want to remove superfluous destination files, the more time-honored options of "--include='*/' --exclude='*'"
would work fine in place of the hide-filter (if that is more natural to you).
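Applied to your case, adding -m to your second command should suppress the empty directories (a sketch built from your own command; adjust paths as needed):
/usr/bin/rsync -auPAm --include="*/" --include="*.key" --exclude="*" /home /tmp/test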

Compare files and versions in two directories, one of which has symlinks?

We're moving some utilities from an old project to a new to-be-version-controlled one (on the same disk, under different usernames), and I've been tasked with finding utilities that are part of the old account but not the new one, and to find files that are different between the two accounts. The new account has been active for about a year and people have made (small) changes to some files in the old one without making them in the new one.
These utilities are on a remote server that I only have read access to (except my own home folder).
The old account, call it user1, has all its utilities in the ~user1/bin/ folder, including source, executable, and script for each utility.
The new account, say user2, has been set up in a way that each 'executable/script' in the ~user2/bin/ folder is a symlink to the appropriate file in the subfolder ~user2/src/{utilityname}/, which also contains the source for that executable.
Is there an easier way to compare the two directories than
find ~user1/bin/ -maxdepth 1 -printf '%s, %p\n' | sort -k2 > user1.txt
find -L ~user2/bin/ -maxdepth 1 -printf '%s, %p\n' | sort -k2 > user2.txt
and comparing the results manually to see what's different / missing?
Also, the above commands will only let me compare executables/scripts in the ~user2/bin/ folder, which can be different even if the source code in ~user2/src/{utilityname}/ is the same between user1 and user2 (because of different paths in scripts, for example). Is it possible to search the folder in which a utility's symlink target resides for a source file with the same name, so that I can compare source files between user1 and user2 directly?
I would use a different approach: find files which have no duplicates. I suggest the approach from this thread, especially the part combining fdupes, find, and grep:
fdupes -r backup/ documents/ > dup.txt
find backup/ -type f | grep -Fxvf dup.txt
It may need some adjustments to adapt it to your needs, though; for example, fdupes' -s option to follow symlinks.
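Adapted to your layout, a sketch (assuming fdupes is installed on the server) that lists files under ~user1/bin/ with no identical counterpart under ~user2/src/:
fdupes -r ~user1/bin/ ~user2/src/ > dup.txt
find ~user1/bin/ -type f | grep -Fxvf dup.txt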

The output of ls with -aF option is not clear

When I use the ls command with the -aF option inside any directory, whether it's empty or not, I always get the following:
./ ../
so what does the output mean when I use these two options together with the ls command?
When you use ls, you are reading a directory file, not actually looking in a directory. Every directory listing contains an entry for the present/current directory, as well as its parent directory, just as you would expect to also see listings for sub/child directories in any directory listing.
The -a option for ls merely tells ls to display ALL files, which includes the ./ & ../ entries for the present and parent directories. Note that these dots are merely a shorthand that the shell (bash) uses to represent file paths. In other words, if you were currently in the Desktop directory doing an ls, "./" would really mean ~/Desktop, and "../" would mean "~/", which is itself a symbolic shorthand for your user home directory, probably something like /Users/your_username on macOS (OS X) or /home/your_username on most Linux distributions. Note that those paths could also be written with a forward slash appended at the end and would mean the same thing (e.g., /Users/your_username/ is the same as /Users/your_username), because both are references to directories (directory files).
Use the -A option for ls if you don't want to see ./ & ../ but still want to see (other) hidden files.
Using the -F option causes ls to append an indicator character to each name based on its file type. This is why directories are displayed with a forward slash appended at the end, executables are displayed as executable* (with an asterisk appended), and regular files (e.g., .txt, .png, .dmg) get no appended character.
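For example, in a directory containing a hidden directory, a shell script, and a regular file (hypothetical names), the two options compare like this:
$ ls -aF
./  ../  .config/  notes.txt  script.sh*
$ ls -AF
.config/  notes.txt  script.sh*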

Wget - output directory prefix

Currently I'm trying to use:
wget --user=xxx --password=xxx -r ftp://www.domain.com/htdocs/
But this saves the output files to the current directory in this fashion:
curdir/www.domain.com/htdocs/*
I need it to be:
curdir/*
Is there a way to do this? I only see the directory-prefix option, but I think that just lets me define a directory outside the current one.
With your --directory-prefix option you can combine --no-directories, if you want all your files inside one directory, or --no-host-directories, to keep subdirectories but without a per-host directory.
2.6 Directory Options
‘-nd’
‘--no-directories’
Do not create a hierarchy of directories when retrieving recursively. With this option turned on, all files will get saved to the current directory, without clobbering (if a name shows up more than once, the filenames will get extensions ‘.n’).
‘-nH’
‘--no-host-directories’
Disable generation of host-prefixed directories. By default, invoking Wget with ‘-r http://fly.srk.fer.hr/’ will create a structure of directories beginning with fly.srk.fer.hr/. This option disables such behavior.
‘-P prefix’
‘--directory-prefix=prefix’
Set directory prefix to prefix. The directory prefix is the directory where all other files and subdirectories will be saved to, i.e. the top of the retrieval tree. The default is ‘.’ (the current directory).
(From the wget manual.)
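For your command, a sketch of both variants (note that -nH alone still leaves the leading htdocs/ component; --cut-dirs, also from the manual's Directory Options, removes it):
# flatten everything into the current directory:
wget --user=xxx --password=xxx -r -nd ftp://www.domain.com/htdocs/
# keep the tree, but drop the host directory and the htdocs/ component:
wget --user=xxx --password=xxx -r -nH --cut-dirs=1 ftp://www.domain.com/htdocs/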

Find folders with specific name and no symlink pointing to them

I'm trying to write a shell script under Linux which recursively lists all folders with a certain name that have no symlink pointing to them.
For example, I have:
/home/htdocs/cust1/typo3_src-4.2.11
/home/htdocs/cust2/typo3_src-4.2.12
/home/htdocs/cust3/typo3_src-4.2.12
Now I want to go through all subdirectories of /home/htdocs and find those typo3_* folders that are not pointed to from anywhere.
It should be possible with a shell script or a command, but I have no idea how.
Thanks for your help
Stefan
I don't think any of the common file systems record in a file's inode whether symlinks point to it, so you have to scan all other files to see whether any of them is a symlink to the one in question. If you don't limit the depth of the search, this might take a very long time. If you want to perform that search in /home/htdocs, for example, it would work something like this:
# find the specified folders:
find /home/htdocs -name 'typo3_*' -type d | while IFS= read -r folder; do
    # list all symlinks resolving to $folder (the folder itself is filtered out)
    find -L /home/htdocs -samefile "$folder" | grep -v "$folder\$"
done
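To print only the folders that nothing points to, a minimal extension of the above (a sketch, assuming GNU find):
find /home/htdocs -name 'typo3_*' -type d | while IFS= read -r folder; do
    # print $folder only if no symlink elsewhere resolves to it
    if [ -z "$(find -L /home/htdocs -samefile "$folder" -not -path "$folder")" ]; then
        printf '%s\n' "$folder"
    fi
done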
