I guess it's pretty simple.
I just want to locate a bash command. For example, I want to know which commands exist that contain the phrase "user".
So the command I'm looking for should either print the locations of all commands containing "user", or just tell me which commands with that name exist. Either would be fine.
I searched here on SO and on Google, but both only turn up talk about the "find" command.
List of executable files or symlinks in your PATH that contain "user":
find $(echo $PATH | tr ':' ' ') -maxdepth 1 \( -type f -or -type l \) -name '*user*' -executable
sample output:
/usr/bin/users
/usr/bin/xdg-user-dir
/usr/bin/xdg-user-dirs-gtk-update
/usr/bin/users-admin
/usr/bin/xdg-user-dirs-update
/bin/fuser
/bin/fusermount
/bin/ntfs-3g.usermap
/usr/sbin/deluser
/usr/sbin/adduser
/usr/sbin/useradd
/usr/sbin/userdel
/usr/sbin/usermod
/usr/sbin/newusers
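One caveat: splitting $PATH with echo and tr breaks if any PATH entry contains spaces. A sketch of a variant that splits on the colons directly, into a bash array:
# Split PATH on ':' into an array; entries with spaces stay intact
IFS=':' read -r -a path_dirs <<< "$PATH"
find "${path_dirs[@]}" -maxdepth 1 \( -type f -o -type l \) -name '*user*' -executable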
It's also a lot faster than wormsparty's variant (no offence :P). The result is almost identical (his returns directories too, AFAIK):
compgen -c | grep -i "user"
compgen [option] [word]
Generate possible completion matches for word according to the options, which may be any option accepted by the complete builtin with the exception of -p and -r, and write the matches to the standard output.
The matches will be generated in the same way as if the programmable completion code had generated them directly from a completion specification with the same flags. If word is specified, only those completions matching word will be displayed.
...
-A action The action may be one of the following to generate a list of possible completions:
alias Alias names. May also be specified as -a.
arrayvar Array variable names.
binding Readline key binding names (see Bindable Readline Commands).
builtin Names of shell builtin commands. May also be specified as -b.
command Command names. May also be specified as -c.
directory Directory names. May also be specified as -d.
disabled Names of disabled shell builtins.
enabled Names of enabled shell builtins.
export Names of exported shell variables. May also be specified as -e.
file File names. May also be specified as -f.
function Names of shell functions.
group Group names. May also be specified as -g.
helptopic Help topics as accepted by the help builtin (see Bash Builtins).
hostname Hostnames, as taken from the file specified by the HOSTFILE shell variable (see Bash Variables).
job Job names, if job control is active. May also be specified as -j.
keyword Shell reserved words. May also be specified as -k.
running Names of running jobs, if job control is active.
service Service names. May also be specified as -s.
setopt Valid arguments for the -o option to the set builtin (see The Set Builtin).
shopt Shell option names as accepted by the shopt builtin (see Bash Builtins).
signal Signal names.
stopped Names of stopped jobs, if job control is active.
user User names. May also be specified as -u.
variable Names of all shell variables. May also be specified as -v.
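Note that compgen -c emits aliases, functions, builtins and keywords as well, not just files in PATH. To see what kind of command each match is, a minimal sketch using the type builtin:
# For each match, print its kind: alias, keyword, function, builtin or file
compgen -c | grep -i "user" | sort -u | while read -r cmd; do
    printf '%-25s %s\n' "$cmd" "$(type -t "$cmd")"
done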
You may want to handle spaces in PATH entries and use a more powerful regex, but this does the trick:
#!/bin/sh
if [ $# -ne 1 ]; then
    echo "Usage: $0 pattern"
    exit 1
fi

for x in $(echo "${PATH}" | sed 's/:/ /g'); do
    for y in "$x"/*; do
        if [ -x "$y" ]; then
            if echo "$y" | grep -q -- "$1"; then
                echo "$y"
            fi
        fi
    done
done
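Assuming you save the script as, say, findcmd.sh (a name I just made up) and make it executable, usage looks like:
chmod +x findcmd.sh
./findcmd.sh user    # prints every executable in your PATH whose path matches "user"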
If you want to find all commands in a directory, on Linux you can use:
find /bin -type f -perm -o+x -name '*z*'
In this example, it will list all executables (programs) in the /bin directory that have a z in their name. If you want to search multiple directories, you can write a script and call find in a loop, once per directory (see the sketch below).
You can combine this with the previous answer to search in all directories on your path:
find $(echo $PATH | tr ':' ' ') -type f -perm -o=x -name '*z*'
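If you prefer the loop approach mentioned above, here is a minimal POSIX sh sketch (it assumes find supports -maxdepth, a GNU/BSD extension):
#!/bin/sh
# Run find once per PATH directory; quoting keeps entries with spaces intact
IFS=':'
for dir in $PATH; do
    [ -d "$dir" ] && find "$dir" -maxdepth 1 -type f -perm -o=x -name '*z*'
done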
How do I print out all the files in the current directory that start with the letter "k"? I also need to count these files.
I tried some methods but I only got errors or wrong outputs. I'm really stuck on this as a newbie in bash.
Try this Shellcheck-clean pure POSIX shell code:
count=0
for file in k*; do
if [ -f "$file" ]; then
printf '%s\n' "$file"
count=$((count+1))
fi
done
printf 'count=%d\n' "$count"
It works correctly (just prints count=0) when run in a directory that contains nothing starting with 'k'.
It doesn't count directories or other non-files (e.g. fifos).
It counts symlinks to files, but not broken symlinks or symlinks to non-files.
It works with 'bash' and 'dash', and should work with any POSIX-compliant shell.
Here is a pure Bash solution.
files=(k*)
printf "%s\n" "${files[#]}"
echo "${#files[#]} files total"
The shell expands the wildcard k* into the array, thus populating it with a list of matching files. We then print out the array's elements, and their count.
The use of an array avoids the various problems with metacharacters in file names (see e.g. https://mywiki.wooledge.org/BashFAQ/020), though the syntax is slightly hard on the eyes.
As remarked by pjh, this will include any matching directories in the count, and fail in odd ways if there are no matches (unless you set nullglob to true). If avoiding directories is important, you basically have to get the directories into a separate array and exclude those.
To repeat what Dominique also said, avoid parsing ls output.
Demo of this and various other candidate solutions:
https://ideone.com/XxwTxB
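A hedged sketch of the nullglob-plus-file-test variant described above:
shopt -s nullglob             # no match -> empty list instead of a literal 'k*'
files=()
for f in k*; do
    [ -f "$f" ] && files+=("$f")
done
(( ${#files[@]} )) && printf '%s\n' "${files[@]}"
echo "${#files[@]} files total"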
To start with: never parse the output of the ls command, but use find instead.
Since find recurses through all subdirectories by default, you might need to limit that using the -maxdepth switch; use the value 1 to stay in the current directory.
To count the number of results, you just count the number of lines in your output, which works because find prints one result per line. Counting lines is done with the wc -l command.
So, this comes down to the following command:
find ./ -maxdepth 1 -type f -name "k*" | wc -l
Have fun!
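One caveat with counting lines: a file name containing a newline is counted more than once. With GNU find you can print one character per match instead (a minimal sketch):
find ./ -maxdepth 1 -type f -name "k*" -printf x | wc -c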
This should work as well:
VAR="k"
COUNT=$(ls -p ${VAR}* | grep -v ":" | wc -w)
echo -e "Total number of files: ${COUNT}\n" 1>&2
echo -e "Files,that begin with ${VAR} are:\n$(ls -p ${VAR}* | grep -v ":" )" 1>&2
Hi… Need a little help here…
I tried to emulate DOS's dir command in Linux using a bash script. Basically it's just a wrapped ls command with some parameters plus summary info. Here's the script:
#!/bin/bash
# default to current folder
if [ -z "$1" ]; then var=.;
else var="$1"; fi
# check file existence
if [ -a "$var" ]; then
# list contents with color, folder first
CMD="ls -lgG $var --color --group-directories-first"; $CMD;
# sum all files size
size=$(ls -lgGp "$var" | grep -v / | awk '{ sum += $3 }; END { print sum }')
if [ "$size" == "" ]; then size="0"; fi
# create summary
if [ -d "$var" ]; then
folder=$(find $var/* -maxdepth 0 -type d | wc -l)
file=$(find $var/* -maxdepth 0 -type f | wc -l)
echo "Found: $folder folders "
echo " $file files $size bytes"
fi
# error message
else
echo "dir: Error \"$var\": No such file or directory"
fi
The problem is that when the argument contains an asterisk (*), the ls inside the script acts differently compared to the same ls command given directly at the prompt. Instead of returning the whole file list, the script only returns the first file. See the video below for the comparison in action. I don't know why it behaves like that.
Does anyone know how to fix it? Thank you.
Video: problem in action
UPDATE:
The problem has been solved. Thank you all for the answers. Now my script works as expected. See the video here: http://i.giphy.com/3o8dp1YLz4fIyCbOAU.gif
The asterisk * is expanded by the shell when it parses the command line. In other words, your script doesn't get a parameter containing an asterisk, it gets a list of files as arguments. Your script only works with $1, the first argument. It should work with "$@" instead.
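A minimal sketch of that change, assuming the script only needs to hand the expanded names through to ls:
# "$@" preserves each expanded filename as one argument
if [ $# -eq 0 ]; then set -- .; fi    # default to the current directory
ls -lgG --color --group-directories-first -- "$@"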
This is because when you retrieve $1 you assume the shell does NOT expand *.
In fact, when * (or another glob) matches, the shell expands it into one word per matching file, and those words are passed as $1, $2, etc.
You're lucky you simply got the first file. Because your script then expands $var unquoted inside a command string, a path containing spaces would additionally be broken at the space and give you an error.
Seriously, read this and especially this. Really.
And please don't do things like
CMD=whatever you get from user input; $CMD;
You are begging for trouble. Don't execute arbitrary strings from the user.
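If you need to build a command up programmatically, the usual safer pattern in bash is an array rather than a string (a sketch, not a drop-in for the original script):
cmd=(ls -lgG --color --group-directories-first -- "$var")
"${cmd[@]}"    # each array element stays exactly one argument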
Both answers above already answer your question, so I'm going to be a bit more verbose.
Your terminal is running the bash interpreter (probably). This is the program which parses your input lines and does things based on that input.
When you enter a line, bash starts the following workflow:
parsing and lexical analysis
expansion
brace expansion
tilde expansion
variable expansion
arithmetic and other substitutions
command substitution
word splitting
filename generation (globbing)
quote removal
Only after all of the above will bash
execute some external command, like ls or dir.sh, etc.,
or carry out some "internal" action for the known keywords and builtins like echo, for, if, etc.
As you can see, the second-to-last step is filename generation (globbing). So, in your case: if test* matches some files, bash expands the wildcard characters (aka does the globbing).
So,
when you enter dir.sh test*,
and test* matches some files,
bash does the expansion first,
and only afterwards executes the command dir.sh with the already expanded filenames,
e.g. the script gets executed (in your case) as: dir.sh test.pas test.swift
BTW, it works exactly the same way for your ls example:
bash expands ls test* to ls test.pas test.swift,
then executes ls with the above two arguments,
and ls prints the result for the two arguments it got.
In other words, ls never even sees the test* argument; whenever possible, bash expands the wildcard characters (* and ?).
Now back to your script: add the following line after the shebang:
echo "the $0 got these arguments: $@"
and you will immediately see the real arguments your script was executed with.
Also, in such cases it is good practice to try executing the script in debug mode, e.g.
bash -x dir.sh test*
and you will see exactly what the script does.
You can also do the same for your current interpreter; just enter into the terminal
set -x
and try running dir.sh test*, and you will see how bash executes the dir.sh command. (To stop debug mode, just enter set +x.)
Everybody is giving you valuable advice which you should definitely follow!
But here is the real answer to your question.
To pass unexpanded arguments to any executable you need to single quote them:
./your_script '*'
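The script then has to expand the pattern itself at the point of use; a sketch of what that might look like (the variable is deliberately left unquoted so globbing happens there):
# the script receives the literal pattern, e.g. '*'
pattern=${1:-.}
ls -lgG --color --group-directories-first -- $pattern    # unquoted on purpose: the glob expands here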
The best solution I have is to use the eval command, in this way:
#!/bin/bash
cmd="some command \"with_quetes_and_asterisk_in_it*\""
echo "$cmd"
eval $cmd
The eval command takes its arguments and evaluates them as a command line, just as the shell itself would.
This solves my problem when I need to call a command with asterisk '*' in it from a script.
This might be a very simple thing for a shell scripting programmer, but I'm pretty new to it. I was trying to execute the command below in a shell script and save the output into a variable:
inputfile=$(ls -ltr *.{PDF,pdf} | head -1 | awk '{print $9}')
The command works fine when I run it from the terminal but fails when executed through a shell script (sh). Why does the command fail? Does it mean that shell scripts don't support the command, or am I doing it wrong? Also, how do I know whether a command will work in a script or not?
Just to give you a glimpse of my requirement: I was trying to get the oldest file from a particular directory (while making sure both upper-case and lower-case extensions are handled). Is there any other way to do this?
The above command will work correctly only if BOTH *.pdf and *.PDF files are present in your current directory.
If you would like to run it in a directory containing only one of those, consider using e.g.:
inputfiles=$(find . -maxdepth 1 -type f \( -name "*.pdf" -or -name "*.PDF" \) | xargs ls -1tr | head -1 )
NOTE: The above command doesn't work with file names containing newlines, or with a long list of found files.
Parsing ls is always a bad idea. You need another strategy.
How about a function that gives you the oldest file among the ones given as arguments? The following works in Bash (adapt to your needs):
get_oldest_file() {
# get oldest file among files given as parameters
# return is in variable get_oldest_file_ret
local oldest f
for f do
[[ -e $f ]] && [[ ! $oldest || $f -ot $oldest ]] && oldest=$f
done
get_oldest_file_ret=$oldest
}
Then just call as:
get_oldest_file *.{PDF,pdf}
echo "oldest file is: $get_oldest_file_ret"
Now, you probably don't want to use brace expansions like this at all. In fact, you very likely want to use the shell options nocaseglob and nullglob:
shopt -s nocaseglob nullglob
get_oldest_file *.pdf
echo "oldest file is: $get_oldest_file_ret"
If you're using a POSIX shell, it's going to be a bit trickier to have the equivalent of nullglob and nocaseglob.
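A hedged sketch of a roughly equivalent POSIX approach; note that the -ot file test is strictly an extension, though dash and most /bin/sh implementations support it:
# Try both spellings of the extension; skip patterns that didn't match anything
set -- *.pdf *.PDF
oldest=
for f in "$@"; do
    [ -e "$f" ] || continue        # an unexpanded literal '*.pdf' fails this test
    if [ -z "$oldest" ] || [ "$f" -ot "$oldest" ]; then
        oldest=$f
    fi
done
echo "oldest file is: $oldest"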
Is perl an option? It's ubiquitous on Unix.
I would suggest:
perl -e 'print ((sort { -M $b <=> -M $a } glob ( "*.{pdf,PDF}" ))[0]);';
Which:
uses glob to fetch all files matching the pattern,
sorts them with -M, which is the relative modification time (in days),
fetches the first element ([0]) off the sort,
and prints that.
As @gniourf_gniourf says, parsing ls is a bad idea: it means leaving globs unquoted and generally not accounting for funny characters in file names.
find is your friend:
#!/bin/sh
get_oldest_pdf() {
#
# echo path of oldest *.pdf (case-insensitive) file in current directory
#
find . -maxdepth 1 -mindepth 1 -iname "*.pdf" -printf '%T@ %p\n' \
    | sort -n \
    | tail -1 \
    | cut -d' ' -f2-
}
whatever=$(get_oldest_pdf)
Notes:
find has numerous ways of formatting the output, including
things like access time and/or write time. I used '%T@ %p\n',
where %T@ is the last write time in UNIX time format, including the fractional part.
That field never contains a space, so it's safe to use as a separator.
The numeric sort and tail get the last item, sorting by the time;
cut then removes the time from the output.
I used the (IMO) much easier to read and maintain pipe notation, with the help of \ line continuations.
The code should run on any POSIX shell (though find -printf is a GNU extension).
You could easily adjust the function to parametrize the pattern,
the time used (access/write), the search depth or the starting directory, as sketched below.
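A sketch of such a parametrized variant (the function and variable names are my own invention):
get_oldest_match() {
    # $1 = case-insensitive name pattern, $2 = directory (default: .)
    find "${2:-.}" -maxdepth 1 -mindepth 1 -iname "$1" -printf '%T@ %p\n' \
        | sort -n \
        | tail -1 \
        | cut -d' ' -f2-
}

oldest_pdf=$(get_oldest_match '*.pdf')
oldest_log=$(get_oldest_match '*.log' /var/log)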
I'm trying to find all files with a specific extension in a directory and its subdirectories with bash (latest Ubuntu LTS release).
This is what's written in a script file:
#!/bin/bash
directory="/home/flip/Desktop"
suffix="in"
browsefolders ()
for i in "$1"/*;
do
echo "dir :$directory"
echo "filename: $i"
# echo ${i#*.}
extension=`echo "$i" | cut -d'.' -f2`
echo "Erweiterung $extension"
if [ -f "$i" ]; then
if [ $extension == $suffix ]; then
echo "$i ends with $in"
else
echo "$i does NOT end with $in"
fi
elif [ -d "$i" ]; then
browsefolders "$i"
fi
done
}
browsefolders "$directory"
Unfortunately, when I start this script in terminal, it says:
[: 29: in: unexpected operator
(with $extension instead of 'in')
What's going on here, where's the error?
Leaving the missing curly brace aside, this:
find "$directory" -type f -name "*.in"
is a bit shorter than that whole thing (and safer - deals with whitespace in filenames and directory names).
Your script is probably failing for entries that don't have a . in their name, making $extension empty.
find {directory} -type f -name '*.extension'
Example: To find all csv files in the current directory and its sub-directories, use:
find . -type f -name '*.csv'
The syntax I use is a bit different from what @Matt suggested:
find $directory -type f -name \*.in
(it's one less keystroke).
Without using find:
du -a $directory | awk '{print $2}' | grep '\.in$'
Though the find command can be useful here, the shell itself provides options to achieve this requirement without any third-party tools. The bash shell provides an extended glob support option with which you can get the file names under recursive paths that match the extensions you want.
The extended option is extglob, which needs to be set using the shopt builtin as below. Options are enabled with the -s flag and disabled with the -u flag. Additionally you could use a couple more options: nullglob, with which an unmatched glob is swept away entirely, replaced with a set of zero words, and globstar, which allows recursing through all the directories.
shopt -s extglob nullglob globstar
Now all you need to do is form the glob expression to include the files of a certain extension, which you can do as below. We use an array to populate the glob results, because when quoted properly and expanded, file names with special characters remain intact and do not get broken by the shell's word-splitting.
For example to list all the *.csv files in the recursive paths
fileList=(**/*.csv)
The option ** is to recurse through the sub-folders and *.csv is glob expansion to include any file of the extensions mentioned. Now for printing the actual files, just do
printf '%s\n' "${fileList[@]}"
Using an array and doing a proper quoted expansion is the right way when used in shell scripts, but for interactive use, you could simply use ls with the glob expression as
ls -1 -- **/*.csv
This can easily be extended to match multiple extensions (similar to adding multiple -name flags in a find command). For example, to get all image files in the recursive paths, i.e. the extensions *.gif, *.png and *.jpg, all you need to do is
ls -1 -- **/+(*.jpg|*.gif|*.png)
This can also be extended to negate results. With the same syntax, one can use the glob to exclude files of certain types. Assume you want to exclude file names with the extensions above; you could do
excludeResults=()
excludeResults=(**/!(*.jpg|*.gif|*.png))
printf '%s\n' "${excludeResults[@]}"
The construct !() negates the match, excluding any of the file extensions listed inside it, and | is an alternation operator, just as in the Extended Regular Expressions library, doing an OR match of the globs.
Note that this extended glob support is not available in the POSIX Bourne shell and is purely specific to recent versions of bash. So if you are considering portability of scripts across POSIX shells and bash, this option wouldn't be right.
find "$PWD" -type f -name "*.in"
There's a { missing after browsefolders ()
All $in should be $suffix
The line with cut gets you only the middle part of front.middle.extension. You should read up in your shell manual on ${varname%%pattern} and friends (see the sketch after these notes).
I assume you do this as an exercise in shell scripting, otherwise the find solution already proposed is the way to go.
To check for proper shell syntax, without running a script, use sh -n scriptname.
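A quick sketch of those parameter expansions:
filename=front.middle.extension
echo "${filename##*.}"    # longest *. match stripped from the front -> "extension"
echo "${filename%.*}"     # shortest .* match stripped from the back -> "front.middle"
echo "${filename%%.*}"    # longest .* match stripped from the back -> "front"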
To find all the pom.xml files in your current directory and print them, you can use:
find . -name 'pom.xml' -print
find "$directory" -type f -name "*.in" | grep "$substring"
for file in "${LOCATION_VAR}"/*.zip
do
echo "$file"
done
Is it possible to get, using Bash, a list of commands starting with a certain string?
I would like to get what is printed when hitting <tab> twice after typing the start of the command and, for example, store it inside a variable.
You should be able to use the compgen command, like so:
compgen -A builtin [YOUR STRING HERE]
For example, "compgen -A builtin l" returns
let
local
logout
You can use other keywords in place of "builtin" to get other types of completion. Builtin gives you shell builtin commands. "File" gives you local filenames, etc.
Here's a list of actions (from the BASH man page for complete, which uses compgen); it is the same action list quoted in full in the compgen excerpt earlier in this section.
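To capture the matches in a variable, as asked, you might do something like this (mapfile needs bash 4+):
mapfile -t matches < <(compgen -A builtin l)
echo "${#matches[@]} matches: ${matches[*]}"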
A fun way to do this is to hit M-* (Meta is usually left Alt).
As an example, type this:
$ lo
Then hit M-*:
$ loadkeys loadunimap local locale localedef locale-gen locate
lockfile-create lockfile-remove lockfile-touch logd logger login
logname logout logprof logrotate logsave look lorder losetup
You can read more about this in man 3 readline; it's a feature of the readline library.
If you want exactly what bash would complete to:
COMPLETIONS=$(compgen -c "$WORD")
compgen completes using the same rules bash uses when tabbing.
JacobM's answer is great. To do it manually, I would use something like this:
echo $PATH | tr : '\n' |
while read p; do
for i in $p/mod*; do
[[ -x "$i" && -f "$i" ]] && echo $i
done
done
The test before the output makes sure only executable, regular files are shown. The above shows all commands starting with mod.
Interesting, I didn't know about compgen. Here's a script I've used to do it, which doesn't filter out non-executables:
#!/bin/bash
echo $PATH | tr ':' '\0' | xargs -0 ls | grep "$@" | sort
Save that script somewhere in your $PATH (I named it findcmd), chmod u+x it, and then use it just like grep, passing your favorite options and pattern:
findcmd ^foo # finds all commands beginning with foo
findcmd -i -E 'ba+r' # finds all commands matching the pattern 'ba+r', case insensitively
Just for fun, another manual variant:
find -L $(echo $PATH | tr ":" " ") -name 'pattern' -type f -perm -001 -print
where pattern specifies the file name pattern you want to use. This will miss commands that are not world-executable but which you have permission to run.
[tested on Mac OS X]
Use the -or and -and flags to build a more comprehensive version of this command:
find -L $(echo $PATH | tr ":" " ") -name 'pattern' -type f \
    \( \
        -perm -001 -or \
        \( -perm -100 -and -user $(whoami) \) \
    \) -print
will pick up files you have permission for by virtue of owning them. I don't see a general way to get all the ones you can execute by virtue of group affiliation without a lot more coding.
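If GNU find is available, its -executable test sidesteps the permission arithmetic entirely; it checks owner, group and world permissions against your effective IDs:
find -L $(echo $PATH | tr ":" " ") -name 'pattern' -type f -executable -print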
Iterate over the $PATH variable and do ls beginningofword* for each directory in the path?
To get exactly equivalent output, you would need to filter out everything but executable files and sort by name (which should be pretty easy with ls flags and the sort command).
What is listed when you hit <tab> are the binary files in your PATH that start with that string. So, if your PATH variable contains:
PATH=/usr/local/bin:/usr/bin:/bin:/usr/games:/usr/lib/java/bin:/usr/lib/java/jre/bin:/usr/lib/qt/bin:/usr/share/texmf/bin:.
Bash will look in each of those directories to show you the suggestions once you hit <tab>. Thus, to get the list of commands starting with "ls" into a variable, you could do:
MYVAR=$(ls /usr/local/bin/ls* /usr/bin/ls* /bin/ls*)
Naturally you could add all the other directories I haven't listed, as sketched below.
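To cover every directory in PATH without listing each one by hand, a small sketch along the same lines:
MYVAR=$(
    IFS=':'
    for dir in $PATH; do
        for f in "$dir"/ls*; do
            [ -x "$f" ] && printf '%s\n' "${f##*/}"
        done
    done | sort -u
)
echo "$MYVAR"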