How to get the name of the executable files in bash with ls - linux

I'm trying to get the names of the executable files using ls -l.
I then tried to keep only the lines of ls -l that contain an x using grep -w x, but the result is not right: some executable files are missing (the .sh ones).
I just need the names of the executable files, not the path, but I don't know how ...
user@user-K53TA:~/Bureau$ ls -l
total 52
-rwxrwxrwx 1 user user 64 oct. 6 21:07 a.sh
-rw-rw-r-- 1 user user 11 sept. 29 21:51 e.txt
-rwxrwxrwx 1 user user 140 sept. 29 23:42 hi.sh
drwxrwxr-x 8 user user 4096 juil. 30 20:47 nerdtree-master
-rw-rw-r-- 1 user user 492 oct. 6 21:07 okk.txt
-rw-rw-r-- 1 user user 1543 oct. 6 21:07 ok.txt
-rw-rw-r-- 1 user user 119 sept. 29 23:27 oo.txt
-rwxrwxr-x 1 user user 8672 sept. 29 21:20 prog
-rw-rw-rw- 1 user user 405 sept. 29 21:23 prog.c
-rw-rw-r-- 1 user user 0 sept. 29 21:58 rev
drwxrwxr-x 3 user user 4096 sept. 29 20:51 sublime
user@user-K53TA:~/Bureau$ ls -l | grep -w x
drwxrwxr-x 8 user user 4096 juil. 30 20:47 nerdtree-master
-rwxrwxr-x 1 user user 8672 sept. 29 21:20 prog
drwxrwxr-x 3 user user 4096 sept. 29 20:51 sublime

Don't parse ls. This can be done with find.
find . -type f -perm /a+x
This finds files with any of the executable bits set: user, group, or other.
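If only the file names are needed, without the leading ./, GNU find can print just the basename; this is a small addition on top of the answer above, limited to the current directory to mirror ls:
find . -maxdepth 1 -type f -perm /a+x -printf '%f\n'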

Use find instead:
find -executable
find -maxdepth 1 -type f -executable
find -maxdepth 1 -type f -executable -ls

One can use a for loop with glob expansion for discovering and manipulating file names. Observe:
#!/bin/sh
for i in *
do
    # Only print discoveries that are executable files
    [ -f "$i" -a -x "$i" ] && printf "%s\n" "$i"
done

Since the accepted answer doesn't use ls at all, here is a way that does. The pattern '^...x' matches lines whose fourth character is x, i.e. entries with the owner-execute bit set (directories included):
ls -l | grep -e '^...x'
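If only the names are wanted, awk can keep just the last field; this is my addition, and it assumes the names contain no spaces (for symlinks the last field would be the link target rather than the name):
ls -l | grep -e '^...x' | awk '{print $NF}'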

Related

`ls -l` for all parent directories

I want to get a list of all directory permissions from current folder to /. For example, for the directory: /var/lib/program/subfolder, I want an output such as:
$ pwd
/var/lib/program/subfolder
$ magic_ls_-l_command somefile
drwxr-xr-x 10 root root 4096 May 15 20:20 var
drwxr-xr-x 10 root root 4096 May 15 20:20 lib
drwxrwxr-x 10 root user 4096 May 16 20:21 program
drwxrwxr-x 10 root user 4096 May 16 20:21 subfolder
-rwxrwxr-- 1 root user 4096 May 16 20:22 somefile
I don't care about the order (from /var to /subfolder or the other way around), the number of hard links or even the date. I just wrote them down to emulate the ls -l output. Also, I don't care how each filename is printed (/var and /lib, var and lib, or /var and /var/lib). I'm just interested in the ownership of each file/directory in the path from the chosen file or pwd to /.
In case I should install some program, I'm under Ubuntu 20.04.
This question has already been answered on superuser.com (I don't know if I can mark a question from one site as a duplicate of one from another). The solution is as simple as writing (assuming I am in the same directory as the target filename):
$ namei -l $(pwd)/somefile ## or `namei -l $(realpath -s somefile)`
Because of -l, it lists basic permissions in long format for each parent directory.
I have to use pwd/realpath because namei doesn't resolve relative paths. If I'm not in the target directory, just write the full path.
I made this small script that does this. I use cd "$1"; pwd to get the current directory so that paths are not canonicalized: if you run magic-ls . while your current directory is /var/lib/postgres, and that is a symlink to /mnt/postgres, you will get /var, /var/lib and /var/lib/postgres, whereas using realpath you would get /mnt and /mnt/postgres.
magic-ls() {
    local current=$(cd "$1"; pwd)
    while [[ $current != '/' ]]; do
        ls -ld "$current"
        current=$(dirname "$current")
    done
}
Here's an example output:
[leodag#desk ~]$ magic-ls
drwx------ 1 leodag leodag 2722 jun 21 13:49 /home/leodag
drwxr-xr-x 1 root root 18 mai 2 2019 /home
By the way it will also work with no argument since cd "" does not change your directory.
Edit: removed realpath from the while check, since that could lead to unexpected results if there was a link to / in the path, and was unneeded.
I wrote a bash script for you. It has some bugs if you have spaces in names; if that bothers you, I'm happy to take change recommendations in the comments.
#!/bin/bash
if [ ! -z "$1" ] && [ -e "$1" ]
then
    path=`realpath -s "$1"`   # read argument as absolute path
else
    path="$PWD"               # no valid argument, so we take pwd
fi

paths=""
while [ "$path" != / ]; do
    paths+=" $path"
    path=`dirname "$path"`
done
paths+=" $path"   # adding / to the path list too

ls -ld $paths
With realpath -s you can get the absolute path without following symlinks. If no argument is given, we use pwd as the file/directory to list.
We append each path to a list. This gives us a better layout in the end, a nice table, because we run ls only once. (A space-safe variant is sketched after the output below.)
Output:
bobafit:~$ magic_ls_-l_command /usr/bin/python3
drwxr-xr-x 21 root root 4096 Jun 20 10:07 /
drwxr-xr-x 14 root root 4096 Sep 5 2019 /usr
drwxr-xr-x 2 root root 110592 Jun 20 10:07 /usr/bin
lrwxrwxrwx 1 root root 9 Apr 7 12:43 /usr/bin/python3 -> python3.8
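As a possible fix for the space issue mentioned above (my own variation, not part of the original script), a bash array avoids relying on word splitting:
#!/bin/bash
# same logic, but collect the paths in an array so names with spaces survive intact
if [ -n "$1" ] && [ -e "$1" ]; then
    path=$(realpath -s "$1")
else
    path="$PWD"
fi

paths=()
while [ "$path" != / ]; do
    paths+=( "$path" )
    path=$(dirname "$path")
done
paths+=( / )

ls -ld "${paths[@]}"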
Just using parameter expansion:
#!/usr/bin/env bash
path="$1"
while test -n "$path"; do
ls -lLd "$path"
path="${path%/*}"
done
Calling method:
bash test.sh /var/lib/program/subfolder/somefile
giving
-rw-r--r-- 1 root root 0 Jun 21 18:49 /var/lib/program/subfolder/somefile
drwxr-xr-x 1 root root 4096 Jun 21 18:49 /var/lib/program/subfolder
drwxr-xr-x 1 root root 4096 Jun 21 18:49 /var/lib/program
drwxr-xr-x 1 root root 4096 Jun 21 18:49 /var/lib
drwxr-xr-x 1 root root 4096 Jun 13 19:24 /var
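For reference, "${path%/*}" removes the shortest suffix matching /*, i.e. the last path component, which is what walks the listing up toward the root on each pass through the loop. A quick illustration:
p=/var/lib/program/subfolder/somefile
echo "${p%/*}"   # /var/lib/program/subfolder
p=${p%/*}
echo "${p%/*}"   # /var/lib/program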
#! /bin/bash
cur=""
IFS="/"
path=`pwd`
for dir in ${path:1}
do
    cur=$cur/$dir
    ls -lhd "$cur"
done
cur=$cur/$1
ls -lhd "$cur"
Terminal Session:
$ pwd
/tmp/dir_underscore/dir space/dir special #!)
$ ls
bash.sh test.txt
$ ./bash.sh test.txt
drwxrwxrwt 28 root root 36K Jun 21 22:45 /tmp
drwxr-xr-x 3 root root 4.0K Jun 21 22:27 /tmp/dir_underscore
drwxr-xr-x 3 root root 4.0K Jun 21 22:28 '/tmp/dir_underscore/dir space'
drwxr-xr-x 2 root root 4.0K Jun 21 22:54 '/tmp/dir_underscore/dir space/dir special #!)'
-rw-r--r-- 1 root root 0 Jun 21 22:29 '/tmp/dir_underscore/dir space/dir special #!)/test.txt'
This should possibly work:
pwd ; ls -lh ; while true ; do cd .. ; pwd ; ls -lh ; [[ "$PWD" == "/" ]] && break ; done
EDIT: I misunderstood the question at first. Try this:
(pwd ; ls -ldh ; while true ; do cd .. ; pwd ; ls -ldh ; [[ "$PWD" == "/" ]] && break ; done ; cd "$START")
EDIT2: fillipe's answer is probably the best, but here's my third and last attempt, which works on both files and directories:
magic_ls() {
    fname="$1"
    while true ; do
        ls -lhd "$fname"
        [[ "$fname" == "/" ]] && break
        fname=$(dirname "$(readlink -f "$fname")")
    done
}
Just my 2 cents. My Mac doesn't have the namei command (perhaps Homebrew has a copy), but I wanted to whip up a quick version that aligns the output in top-down order.
#!/usr/bin/env bash
path="${1%/}"
DIRS=()
while test -n "$path"; do
    DIRS=( "$path" "${DIRS[@]}" )
    path="${path%/*}"
done
ls -ld "${DIRS[@]}"
Example output:
$ lspath $TMPDIR
lrwxr-xr-x# 1 root wheel 11 Oct 5 2018 /var -> private/var
drwxr-xr-x 7 root wheel 224 Jul 16 2020 /var/folders
drwxr-xr-x# 3 root wheel 96 Apr 5 2018 /var/folders/0c
drwxr-xr-x# 5 me staff 160 Apr 5 2018 /var/folders/0c/2_s_qxd11m3d1smzqdrs3qg40000gp
drwx------# 255 me staff 8160 Oct 7 09:18 /var/folders/0c/2_s_qxd11m3d1smzqdrs3qg40000gp/T

How to find out if ls command output is a file or a directory in Bash

The ls command outputs everything contained in the current directory. For example, ls -la will output something like this:
drwxr-xr-x 3 user user 4096 dec 19 17:53 .
drwxr-xr-x 15 user user 4096 dec 19 17:39 ..
drwxrwxr-x 2 user user 4096 dec 19 17:53 tess (directory)
-rw-r--r-- 1 user user 178 dec 18 21:52 file (file)
-rw-r--r-- 1 user user 30 dec 18 21:47 text (file)
And what if I want to know how much space all the files consume? For that I would have to sum $5 from all lines with ls -la | awk '{ sum+=$5 } END{print sum}'. So how can I sum only the sizes of files and leave directories out?
You can use the following:
find . -maxdepth 1 -type f -printf '%s\n' | awk '{s+=$1} END {print s}'
The find command selects all the files in the current directory and outputs their sizes. The awk command sums the integers and outputs the total.
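A pure-shell alternative (my addition, assuming GNU coreutils stat; on BSD/macOS the equivalent format flag is stat -f %z) that loops over the non-hidden entries of the current directory:
total=0
for f in *; do
    # add the size of each regular file (symlinks to regular files are followed)
    [ -f "$f" ] && total=$(( total + $(stat -Lc %s -- "$f") ))
done
echo "$total"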
Don't.
One of the most quoted pages on SO that I've seen is https://unix.stackexchange.com/questions/128985/why-not-parse-ls-and-what-do-to-instead.
That being said and as a hint for further development, ls -l | awk '/^-/{s+=$5} END {print s}' will probably do what you ask.

how to get previous date files and pass ls output to array in gawk

I have log files like the ones below, and I need a script that runs daily, lists them, and then does 2 things:
1- get the previous day's / yesterday's files and transfer them to server x
2- get files older than one day and transfer them to server y
The files are like below, and I am trying the code below, but it is not working.
How can we pass ls -altr output to gawk? Can we build an associative array like below?
array[index]=ls -altr | awk '{print $6,$7,$8}'
This is the code I am trying, to retrieve the previous date's files, but it is not working:
previous_dates=$(date -d "-1 days" '+-%d')
ls -altr |gawk '{if ( $7!=previous_dates ) print $9 }'
-r-------- 1 root root 6291563 Jun 22 14:45 audit.log.4
-r-------- 1 root root 6291619 Jun 24 09:11 audit.log.3
drwxr-xr-x. 14 root root 4096 Jun 26 03:47 ..
-r-------- 1 root root 6291462 Jun 26 04:15 audit.log.2
-r-------- 1 root root 6291513 Jun 27 23:05 audit.log.1
drwxr-x---. 2 root root 4096 Jun 27 23:05 .
-rw------- 1 root root 5843020 Jun 29 14:57 audit.log
To select files modified yesterday, you could use
find . -daystart -type f -mtime 1
and to select older files, you could use
find . -daystart -type f -mtime +1
possibly adding a -name test to select only files like audit.log*, for example. You could then use xargs to process the files, e.g.
find . -daystart -type f -mtime 1 | xargs -n 1 -I{} scp {} user@server
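Putting both requirements together, a sketch along these lines could work; the hostnames, user, and destination paths are placeholders, and -maxdepth 1 plus the audit.log* name filter are my assumptions about the layout:
# yesterday's logs to server x
find . -daystart -maxdepth 1 -type f -name 'audit.log*' -mtime 1 -exec scp {} user@serverx:/dest/path/ \;
# everything older than one day to server y
find . -daystart -maxdepth 1 -type f -name 'audit.log*' -mtime +1 -exec scp {} user@servery:/dest/path/ \;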

Script to remove all directories older than x days but keep certain ones

I'm trying to write a bash script to remove all directories and their files but keep certain ones.
drwxr-xr-x 20 ubuntu admin 4096 Jan 21 17:58 .
drwxr-xr-x 8 ubuntu admin 4096 Nov 21 16:45 ..
drwxr-xr-x 11 ubuntu admin 4096 Jan 9 13:09 1763
drwxr-xr-x 11 ubuntu admin 4096 Jan 16 16:46 1817
drwxr-xr-x 11 ubuntu admin 4096 Jan 16 17:39 1821
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 10:15 1823
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 11:57 1826
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 14:55 1827
drwxr-xr-x 11 ubuntu admin 4096 Jan 19 21:34 1828
drwxr-xr-x 11 ubuntu admin 4096 Jan 20 13:29 1833
drwxr-xr-x 11 ubuntu admin 4096 Jan 20 16:13 1834
drwxr-xr-x 11 ubuntu admin 4096 Jan 21 10:06 1838
drwxr-xr-x 11 ubuntu admin 4096 Jan 21 12:51 1842
drwxr-xr-x 11 ubuntu admin 4096 Jan 21 15:20 1845
drwxr-xr-x 11 ubuntu admin 4096 Jan 22 13:00 1848
drwxr-xr-x 11 ubuntu admin 4096 Nov 24 16:34 217
drwxr-xr-x 11 ubuntu admin 4096 Dec 2 20:44 219
drwxr-xr-x 11 ubuntu admin 4096 Dec 15 16:42 221
drwxr-xr-x 11 ubuntu admin 4096 Dec 16 12:04 225
drwxr-xr-x 2 ubuntu admin 4096 Jan 20 16:10 app-conf
lrwxrwxrwx 1 ubuntu admin 19 Jan 21 17:58 latest -> /opt/qudiniapp/1848
In the example above we'd want to clear out all non-symlinked folders except the app-conf folder.
The plan is to have this triggered by my Ansible deployment script before deployment, so we can keep our server from filling up with builds.
Provided all directories that are to be deleted consist only of numbers, this would be one way to solve it:
cd /tempdir
rm -rf $(find . -type d -name "[0-9]*" | grep -v "$(readlink latest)")
As this is a housekeeping job, you should create a cron job that regularly deletes old directories. The find command would then also check, for example, whether the last modification time is beyond a number of days:
rm -rf $(find . -type d -mtime +20 -name "[0-9]*" | grep -v "$(readlink latest)")
bash script:
#!/bin/bash
find /your/path -type d ! \( -path '*app-conf*' -prune \) -mtime +2 -delete
per man find
-P Never follow symbolic links. This is the default behaviour. When find examines or prints information about a file, and the file is a symbolic link, the information used shall be taken from the properties of the symbolic link itself.
-mtime n File's data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
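Before wiring this into a deployment, a dry run that just prints the matches is a cheap sanity check (my suggestion, not part of the original answer). Note that -delete implies -depth, and per the find man page you cannot usefully combine -prune with -delete, so verify carefully that app-conf really is skipped:
find /your/path -type d ! \( -path '*app-conf*' -prune \) -mtime +2 -print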
This is what I use in my Ansible deployments; I hope it will be helpful for you, as it does almost exactly what you need.
I always remove the oldest release on each deployment if there are >= 5 builds in the "{{ releases_path }}" directory. "{{ releases_path }}" contains directories which are basically (long) Git commit hashes.
- name: Find oldest release to remove
  shell: '[[ $(find "{{ releases_path | quote }}" -maxdepth 1 -mindepth 1 -type d | wc -l) -ge 6 ]] && IFS= read -r -d $"\0" line < <(find "{{ releases_path | quote }}" -maxdepth 1 -mindepth 1 -type d -printf "%T@ %p\0" 2>/dev/null | sort -z -n); file="${line#* }"; echo "$file";'
  args:
    executable: /bin/bash
    chdir: "{{ releases_path }}"
  register: releasetoremove
  changed_when: "releasetoremove.stdout != ''"

- debug: var=releasetoremove

- name: Remove oldest release
  file: path={{ releasetoremove.stdout }} state=absent
  when: releasetoremove|changed
This is what I always have on each server in the releases directory (the last 5 are always kept):
$ ls -lt | cut -c 28-
62 Jan 22 17:42 current -> /srv/releases/2a7b80c82fb1dd658a3356fed7bba9718bc50527
4096 Jan 22 17:41 2a7b80c82fb1dd658a3356fed7bba9718bc50527
4096 Jan 22 15:22 73b1252ab4060833e43849e2e32f57fea6c6cd9b
4096 Jan 22 14:47 9df7f1097909aea69916695194ac41938a0c2e9a
4096 Jan 22 14:16 f6a2862d70f7f26ef75b67168a30fb9ef2202555
4096 Jan 22 13:49 fa89eefc5b2505e153b2e59ed02a23889400c4bf

counting number of directories in a specific directory

How do I count the number of folders in a specific directory? I am using the following command, but it always returns one extra.
find /directory/ -maxdepth 1 -type d -print| wc -l
For example, if I have 3 folders, this command returns 4. If it contains 5 folders, the command returns 6. Why is that?
find is also printing the directory itself:
$ find .vim/ -maxdepth 1 -type d
.vim/
.vim/indent
.vim/colors
.vim/doc
.vim/after
.vim/autoload
.vim/compiler
.vim/plugin
.vim/syntax
.vim/ftplugin
.vim/bundle
.vim/ftdetect
You can instead test the directory's children and do not descend into them at all:
$ find .vim/* -maxdepth 0 -type d
.vim/after
.vim/autoload
.vim/bundle
.vim/colors
.vim/compiler
.vim/doc
.vim/ftdetect
.vim/ftplugin
.vim/indent
.vim/plugin
.vim/syntax
$ find .vim/* -maxdepth 0 -type d | wc -l
11
$ find .vim/ -maxdepth 1 -type d | wc -l
12
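If directory names might contain newlines, counting lines over-counts; a character count is safer. This variant is my addition and relies on GNU find's -printf, which here prints one character per matching directory:
$ find .vim/ -mindepth 1 -maxdepth 1 -type d -printf x | wc -c
11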
You can also use ls:
$ ls -l .vim | grep -c ^d
11
$ ls -l .vim
total 52
drwxrwxr-x 3 anossovp anossovp 4096 Aug 29 2012 after
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 autoload
drwxrwxr-x 13 anossovp anossovp 4096 Aug 29 2012 bundle
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 colors
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 compiler
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 doc
-rw-rw-r-- 1 anossovp anossovp 48 Aug 29 2012 filetype.vim
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftdetect
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftplugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 indent
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 plugin
-rw-rw-r-- 1 anossovp anossovp 2505 Aug 29 2012 README.rst
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 syntax
$ ls -l .vim | grep ^d
drwxrwxr-x 3 anossovp anossovp 4096 Aug 29 2012 after
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 autoload
drwxrwxr-x 13 anossovp anossovp 4096 Aug 29 2012 bundle
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 colors
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 compiler
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 doc
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftdetect
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 ftplugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 indent
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 plugin
drwxrwxr-x 2 anossovp anossovp 4096 Aug 29 2012 syntax
Get a count of only the directories in the current directory:
echo */ | wc
You will get output like 1 309 4594
The second number is the number of directories. A one-liner that prints just that count follows the tree alternative below.
or
tree -L 1 | tail -1
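As promised above, to print only the directory count from the glob, word count does it, assuming no directory names contain whitespace and at least one directory exists (my addition):
echo */ | wc -w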
find . -mindepth 1 -maxdepth 1 -type d | wc -l
For find, -mindepth 1 excludes the starting directory itself from the results
-maxdepth 1 stops find from descending into subdirectories
-type d means directory
And wc -l means count the lines of the input
If you only have directories in the folder and no files this does it:
ls | wc -l
Run stat -c %h folder and subtract 2 from the result. This employs only a single subprocess as opposed to the 2 (or even 3) required by most of the other solutions here (typically find or ls plus wc).
Using sh/bash:
echo $((`stat -c %h folder` - 2))   # 'echo' is a shell builtin
Using csh/tcsh:
@ cnt = `stat -c %h folder` - 2; echo $cnt   # 'echo' is a shell builtin
Explanation: stat -c %h folder prints the number of hardlinks to folder, and each subfolder under folder contains a ../ entry which is a hardlink back to folder. You must subtract 2 because there are two additional hardlinks in the count:
folder's own self-referential ./ entry, and
folder's parent's link to folder
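For illustration, on a filesystem that keeps conventional directory link counts (ext4 and most traditional Unix filesystems do; note that btrfs reports a link count of 1 for every directory, which breaks this trick):
$ mkdir -p demo/{a,b,c}
$ stat -c %h demo
5
$ echo $(( $(stat -c %h demo) - 2 ))
3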
Best way: navigate to your drive and simply execute
ls -lR | grep ^d | wc -l
To find all folders in total, including subdirectories:
find /mount/point -type d | wc -l
...or to find only the folders in the root of the mount point (not including subdirectories):
find /mount/point -maxdepth 1 -type d | wc -l
Cheers!
I think the easiest is
ls -ld images/*/ | wc -l
where images is your target directory. The trailing / in the glob makes it match only directories, -d stops ls from listing their contents, and -l gives the per-line listing that pairs with the familiar wc -l line count.
The number of directories can be found using the command below:
ls -l | grep "^d" | wc -l
Some useful examples:
count files in current dir
/bin/ls -lA | egrep -c '^-'
count dirs in current dir
/bin/ls -lA | egrep -c '^d'
count files and dirs in current dir
/bin/ls -lA | egrep -c '^-|^d'
count files and dirs in in one subdirectory
/bin/ls -lA subdir_name/ | egrep -c '^-|^d'
I have noticed a strange thing (at least in my case): when I tried ls instead of /bin/ls,
the -A parameter ("do not list implied . and ..") did NOT work as expected.
Plain ls still showed ./ and ../, so the count came out wrong (most likely a shell alias for ls is to blame). SOLUTION: /bin/ls instead of ls
To get the number of directories, navigate to the directory and execute
ls -l | grep -c ^d
A pure bash solution:
shopt -s nullglob
dirs=( /path/to/directory/*/ )
echo "There are ${#dirs[#]} (non-hidden) directories"
If you also want to count the hidden directories:
shopt -s nullglob dotglob
dirs=( /path/to/directory/*/ )
echo "There are ${#dirs[#]} directories (including hidden ones)"
Note that this will also count links to directories. If you don't want that, it's a bit more difficult with this method.
Using find:
find /path/to/directory/. -type d \! -name . -prune -exec printf x \; | wc -c
The trick is to output an x to stdout each time a directory is found, and then use wc to count the number of characters. This will count the number of all directories (including hidden ones), excluding links.
The methods presented here are all safe with respect to funny characters that can appear in file names (spaces, newlines, glob characters, etc.).
Using zsh:
a=(*(/N)); echo ${#a}
The N is a nullglob, / makes it match directories, # counts. It will neatly cope with spaces in directory names as well as returning 0 if there are no directories.
The best answer to what you want is
echo `find . -maxdepth 1 -type d | wc -l`-1 | bc
this subtracts one to remove the unwanted '.' directory that find lists (as patel deven mentioned above).
If you want to count subfolders recursively, then just leave off the maxdepth option, so
echo `find . -type d | wc -l`-1 | bc
PS If you find command substitution ugly, subtracting one can be done as a pure stream using sed and bc.
Subtracting one from count:
find . -maxdepth 1 -type d | wc -l | sed 's/$/-1\n/' | bc
or, adding count to minus one:
find . -maxdepth 1 -type d | wc -l | sed 's/^/-1+/' | bc
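If command substitution doesn't bother you, shell arithmetic avoids bc and sed entirely (my addition):
echo $(( $(find . -maxdepth 1 -type d | wc -l) - 1 ))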
Count all files and subfolders, windows style:
dir=/YOUR/PATH;f=$(find $dir -type f | wc -l); d=$(find $dir -mindepth 1 -type d | wc -l); echo "$f Files, $d Folders"
If you want to use regular expressions, then try:
ls -c | grep "^d" | wc -l
If you want to count folders that have similar names like folder01, folder02, folder03, etc., then you can do
ls -l | grep ^d | grep -c folder
Best way to do it:
ls -lA | grep -v total | grep -c ^d
Counting only the lines that start with d gives you the directory count.
