Display only files and folders that are symbolic links in tcsh or bash - linux

Basically I want to do the following:
ls -l[+someflags]
(or by some other means) that will only display files that are symbolic links
so the output would look like
lrwxrwxrwx 1 username grp size date-time filename -> somedir
lrwxrwxrwx 1 username grp size date-time filename2 -> somsdfsdf
etc.
For example,
to show only directories I have an alias:
alias lsd 'ls -l | grep ^d'
I wonder how to display only hidden files or only hidden directories?
I have the following solution; however, it doesn't display the output in color :(
ls -ltra | grep '\->'

Find all the symbolic links in a directory:
ls -l `find /usr/bin -maxdepth 1 -type l -print`
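If any of those names contain spaces, the backticks will split them; handing the names to ls via -exec avoids that (a sketch using the same options as above):
find /usr/bin -maxdepth 1 -type l -exec ls -l {} +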
For the listing of hidden files:
ls -ald .*

For only "hidden" folders - dot folders, try:
ls -l .**
Yes, the two asterisks are necessary, otherwise you'll also get . and .. in the results.
For symlinks, well, try the symlinks program:
symlinks -v .
(shows all symlinks under current directory)

ls -l | grep lrw
shows only symlinks (files and directories). Not sure how to get them colorful, though.
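If colour is what's missing, forcing it through the pipe should still work here, since the escape codes wrap the file names rather than the permission field (a sketch assuming GNU ls):
ls -l --color=always | grep lrw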
ls -lad .*
shows only hidden files/directories
ls -l | grep drw
shows directories only.

To display JUST the symlinks and what they link to:
find -P . -type l -exec echo -n "{} -> " \; -exec readlink {} \;
To limit to JUST THIS DIR
find -P . -maxdepth 1 -type l -exec echo -n "{} -> " \; -exec readlink {} \;
Example output (after ln -s /usr/bin moo):
./moo -> /usr/bin

You were almost there with your grep solution; let's focus on getting you COLOR again.
Try this:
ls --color=always -ltra | grep '->'

Improving a little on the accepted answer given by @ChristopheD (couldn't comment on the accepted answer since I don't have enough reputation).
I use an alias
findsymlinks <path> <depth>
where the alias is
alias findsymlinks "find \!:1 -maxdepth \!:2 -type l -print | xargs ls -l --color=auto"
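For example, to list the symlinks directly under /usr/bin with this alias (assuming GNU find/ls for the options used):
findsymlinks /usr/bin 1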

Try the file type flag (-F) and strip the appended @:
ls -F /home/usr/foo | grep "@" | sed 's/@//'

For (t)csh:
ls --color=always -ltra | grep '\->'
(This is simply pbr's answer but with the hyphen escaped.)
Mac OSX
On OSX, ls works differently, so add this to your ~/.cshrc file:
setenv CLICOLOR_FORCE 1 # (equivalent of Linux --color=always)
And then call:
ls -G -ltra | grep '\->' # (-G is equivalent of ls --color)

For bash:
This provides a nice output.
sl=`find -L /path/to/target -xtype l`; for links in $sl; do ls --color=always -ltra $links; done | sed 's/^/ /'
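The same idea without the temporary variable and its word-splitting problems; -exec hands the names straight to ls (a sketch that keeps the original -L/-xtype l selection):
find -L /path/to/target -xtype l -exec ls --color=always -ltra {} +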

Usage: foo $path
Uses current path if none specified.
#!/bin/bash
case "$1" in
  -r)
    find $2 -type l -print | while IFS= read -r line ; do ls -l --color=always "$line"; done
    ;;
  --help)
    echo 'Usage: foo [-r] [$PATH]'
    echo
    echo '-r  Recursive'
    ;;
  *)
    ls --color=always -ltra $1 | grep '\->'
esac
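Assuming the script is saved as foo and made executable, usage might look like this:
chmod +x foo
./foo              # symlinks in the current directory
./foo -r /var/log  # recursive listing under /var/log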

Related

How to grep all files besides the current dir, parent dir and one defined dir?

I have a folder with the following files / folders:
.test
README.md
/dist
/src
I want to grep all files besides dist. So the result should look like:
.test
README.md
/src
When I do
ls -a | grep -v dist
it removes dist, but . and .. are still present. However, I need -a to include the files with a dot prefix.
When I try ls -a | grep -v -e dist -e . -e .. there is no output.
Why does -e . remove all files? How can I do this?
Better to use find with the -not option instead of the error-prone ls | grep:
find . -maxdepth 1 -mindepth 1 -not -name dist
By the way, to fix your attempt: in a regex, . matches any single character, so -e . filters out every line. With the patterns anchored, the correct ls | grep would be:
ls -a | grep -Ev '^(dist|\.\.?)$'
If you use bash, you can do:
shopt -s extglob
echo .[^.]* !(dist)
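A variant of the same idea: with dotglob set, the single !(dist) pattern also covers the dot files (a sketch, bash only; recent bash never returns . or .. from a glob unless they are matched explicitly):
shopt -s extglob dotglob
printf '%s\n' !(dist)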

LINUX Copy the name of the newest folder and paste it in a command [duplicate]

I would like to find the newest sub directory in a directory and save the result to variable in bash.
Something like this:
ls -t /backups | head -1 > $BACKUPDIR
Can anyone help?
BACKUPDIR=$(ls -td /backups/*/ | head -1)
$(...) evaluates the statement in a subshell and returns the output.
There is a simple solution to this using only ls:
BACKUPDIR=$(ls -td /backups/*/ | head -1)
-t orders by time (latest first)
-d lists the matched directories themselves rather than their contents
*/ only lists directories
head -1 returns the first item
I didn't know about */ until I found Listing only directories using ls in bash: An examination.
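Note that */ leaves a trailing slash on the result; if that gets in the way later, a parameter expansion strips it (a small sketch):
BACKUPDIR=$(ls -td /backups/*/ | head -1)
BACKUPDIR=${BACKUPDIR%/}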
This is a pure Bash solution:
topdir=/backups
BACKUPDIR=
# Handle subdirectories beginning with '.', and empty $topdir
shopt -s dotglob nullglob
for file in "$topdir"/* ; do
[[ -L $file || ! -d $file ]] && continue
[[ -z $BACKUPDIR || $file -nt $BACKUPDIR ]] && BACKUPDIR=$file
done
printf 'BACKUPDIR=%q\n' "$BACKUPDIR"
It skips symlinks, including symlinks to directories, which may or may not be the right thing to do. It skips other non-directories. It handles directories whose names contain any characters, including newlines and leading dots.
Well, I think this solution is the most efficient:
path="/my/dir/structure/*"
backupdir=$(find $path -type d -prune | tail -n 1)
Explanation of why this is a little better:
We do not need sub-shells (aside from the one for getting the result into the bash variable).
We do not need a useless -exec ls -d at the end of the find command, it already prints the directory listing.
We can easily alter this, e.g. to exclude certain patterns. For example, if you want the second newest directory, because backup files are first written to a tmp dir in the same path:
backupdir=$(find $path -type d -prune -not -name "*temp_dir" | tail -n 1)
The above solution doesn't take into account files being created and removed in the directory, which can result in the parent directory being returned instead of the newest subdirectory.
The other issue is that this solution assumes the directory contains only other directories and no files being written.
Let's say I create a file called "test.txt" and then run this command again:
echo "test" > test.txt
ls -t /backups | head -1
test.txt
The result is test.txt showing up instead of the last modified directory.
The proposed solution "works", but only in the best-case scenario.
Assuming you have a maximum of 1 directory depth, a better solution is to use:
find /backups/* -type d -prune -exec ls -d {} \; |tail -1
Just swap the "/backups/" portion for your actual path.
If you want to avoid showing an absolute path in a bash script, you could always use something like this:
LOCALPATH=/backups
DIRECTORY=$(cd $LOCALPATH; find * -type d -prune -exec ls -d {} \; |tail -1)
With GNU find you can get a list of directories with modification timestamps, sort that list and output the newest:
find . -mindepth 1 -maxdepth 1 -type d -printf "%T@\t%p\0" | sort -z -n | cut -z -f2- | tail -z -n1
or newline-separated:
find . -mindepth 1 -maxdepth 1 -type d -printf "%T@\t%p\n" | sort -n | cut -f2- | tail -n1
With POSIX find (which does not have -printf) you may, if you have it, run stat to get the file modification timestamp:
find . -mindepth 1 -maxdepth 1 -type d -exec stat -c '%Y %n' {} \; | sort -n | cut -d' ' -f2- | tail -n1
Without stat, a pure shell solution may be used by replacing the [[ bash extension with [, as in the pure Bash answer above.
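A rough sketch of that loop in plain sh (assuming [ supports -nt, which most implementations do even though it is not strictly POSIX):
BACKUPDIR=
for file in /backups/*/; do
  [ -e "$file" ] || continue
  [ -z "$BACKUPDIR" ] || [ "$file" -nt "$BACKUPDIR" ] && BACKUPDIR=$file
done
echo "$BACKUPDIR"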
Your "something like this" was almost a hit:
BACKUPDIR=$(ls -t ./backups | head -1)
Combining what you wrote with what I have learned solved my problem too. Thank you for raising this question.
Note: I run the line above from Git Bash in a Windows environment, in a file called ./something.bash.

Select files by extension using grep

I need to count all the .txt files in the current folder.
I tried ls | grep .txt, but if my folder contains a.txt btxt c.c it selects both a.txt and btxt, and I only want files that end with .txt. I tried various combinations of regexps but with no result.
find may be better than grep in this case, since it is designed for handling file names:
find . -maxdepth 1 -name '*.txt' | wc -l
But if you are very cautious about possibly strange file names:
find . -maxdepth 1 -name '*.txt' -exec echo 1 \; | wc -l
For grep, the character '.' means "any character", so you'll need to escape the dot (and anchoring it at the end of the line makes sure only names ending in .txt match):
ls | grep -e "\.txt$"
Edit: in fact the -e option is not even necessary; this will do the trick:
ls | grep "\.txt$"
If all you need is the number of files with the '.txt' extension in the current directory only, then this will also help:
ls -l *.txt | wc -l
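To avoid parsing ls output altogether, a small bash sketch with a glob and an array gives the same count:
shopt -s nullglob
files=(*.txt)
echo "${#files[@]}"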

Delete files other than those with a particular extension

I have a lot of different types of files in one folder, and I need to delete all of them except the PDF files.
I can display only the PDF files with the command below, but I need to delete everything other than the PDFs:
ls -1 | xargs file | grep 'PDF document,' | sed 's/:.*//'
You could do the following - I've used echo rm instead of rm for safety:
for i in *
do
  [ x"$(file --mime-type -b "$i")" != xapplication/pdf ] && echo rm "$i"
done
The --mime-type -b options to file make the output of file easier to deal with in a script.
$ ls
aa.txt a.pdf bb.cpp b.pdf
$ ls | grep -v .pdf | xargs rm -rf
$ ls
a.pdf b.pdf
:) !
ls |xargs file|awk -F":" '!($2~/PDF document/){print $1}'|xargs rm -rf
Try inverting the grep match:
ls -1 | xargs file | grep -v 'PDF document,' | sed 's/:.*//'
It's rare in my experience to encounter PDF files which don't have a .pdf extension. You don't state why "file" is necessary in the example, but I'd write this as:
# find . -not -name '*.pdf' -delete
Note that this will recurse into subdirectories; use "-maxdepth 1" to limit to the current directory only.
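Spelled out with the depth limit (a sketch; -delete needs GNU or BSD find):
find . -maxdepth 1 -type f -not -name '*.pdf' -delete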

How can I add text to the same line?

I used this command to find mp3 files and write their names to log.txt:
find -name *.mp3 >> log.txt
I want to move the files using the mv command, and I would like to append to the log file so that it shows the path the files have been moved to.
For example if the mp3 files are 1.mp3 and 2.mp3 then the log.txt should look like
1.mp3 >>>> /newfolder/1.mp3
2.mp3 >>>> /newfolder/2.mp3
How can I do that using unix commands? Thank you!
Using only mv:
mv -v *.mp3 tmp/ > log.txt
or using find:
find -name '*.mp3' -exec mv -v {} test/ >> log.txt \;
You should probably use some scripting language like Perl or Python; text processing is rather awkward in the shell.
E.g. in Perl you can just postprocess the output of find, and print out what you did.
#!/usr/bin/perl -w
use strict;
use File::Find;
my @directories_to_search=("/tmp/");
sub wanted {
  print "$File::Find::name >>> newdir/$_\n";
  # do what you want with the file, e.g. invoke commands on it using system()
}
find(\&wanted, @directories_to_search);
Doing it in Perl or similar makes some things easier than in the shell; in particular, handling of funny filenames (embedded spaces, special chars) is easier. Be careful when invoking system() commands, though.
For docs on the File::Find module see http://perldoc.perl.org/File/Find.html .
GNU find
find /path -type f -iname "*.mp3" -printf "%f/%p\n" | while IFS="/" read -r filename path
do
  mv "$path" "$destination"
  echo "$filename >>> $destination/$filename" >> newfile.txt
done
output
$ touch 'test"quotes.txt'
$ ls -ltr
total 0
-rw-r--r-- 1 root root 0 2009-11-20 10:30 test"quotes.txt
$ mkdir temp
$ ls -l temp
total 0
$ find . -type f -iname "*\"*" -printf "%f:%p\n" | while IFS=":" read filename path; do mv "$filename" temp ; done
$ ls -l temp
total 0
-rw-r--r-- 1 root root 0 2009-11-20 11:53 test"quotes.txt
