getopt command partially parsing - linux

I have part of a bash script which is supposed to validate the arguments: if they match, proceed, otherwise exit.
Here is my script:
TEMP=`getopt --options b,t:,h,n,v,z: --longoptions batch,targetdir:,help,notar,verbose,zone: --name 'mysql-backup-start' -- "$@"`
if [ $? -ne 0 ]; then
    echo "Command incorrect"
    exit 1
fi
mysql-backup-start should take the following arguments: -b, -t, -h, -n, -v, -z and the long forms --batch, --targetdir, --help, --notar, --verbose, and --zone. However, if I pass arguments like -nn, -hh, or --tar, it works, and it is not supposed to.
To be more precise: 'mysql-backup-start' should work, 'mysql-backup-start --notar' should work, 'mysql-backup-start --n' should not work, 'mysql-backup-start --targetdir=/home/backup/mysql' should work, 'mysql-backup-start --targetsdir=/home/backup/mysql' should not work, and 'mysql-backup-start --ta=/home/backup/mysql' should not work.

When you pass -nn, getopt simply interprets it as if you had specified -n -n. This is often a valid way to pass arguments: for example, ssh -vvv runs with a much higher verbosity level than ssh -v. In other commands you have options to enable or disable features, and the last option (enable or disable) "wins." This is useful, for example, if the user has defined an alias like alias grep='grep -H' ("Print the file name for each match.") but wants to override it. To do that she could either run command grep or simply revert the option by running grep -h, which is then resolved to grep -H -h.
If you want to check that an option has not been specified more than once (although this is usually not necessary), you should do that later when parsing $TEMP.
--tar should not work: see @tuxuday's comment and man getopt. GNU getopt accepts any unambiguous abbreviation of a long option, so --tar is matched as an abbreviation of --targetdir.
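As mentioned above, duplicate detection (or counting repeats, as with -v) belongs in the loop that parses $TEMP. A minimal sketch, assuming the corrected getopt invocation from the question; the variable names (VERBOSE, NOTAR_SEEN, TARGETDIR, ...) are only illustrative:
eval set -- "$TEMP"
VERBOSE=0
NOTAR_SEEN=0
while true; do
    case "$1" in
        -b|--batch)     BATCH=1; shift ;;
        -t|--targetdir) TARGETDIR=$2; shift 2 ;;
        -h|--help)      HELP=1; shift ;;
        -n|--notar)
            # Reject a repeated --notar (optional; usually unnecessary).
            if [ "$NOTAR_SEEN" -eq 1 ]; then
                echo "--notar given more than once" >&2
                exit 1
            fi
            NOTAR_SEEN=1; shift ;;
        -v|--verbose)   VERBOSE=$((VERBOSE + 1)); shift ;;   # -vv raises verbosity
        -z|--zone)      ZONE=$2; shift 2 ;;
        --) shift; break ;;
        *) echo "Internal error!" >&2; exit 1 ;;
    esac
done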

Related

Dynamically generate command in bash

I want to dynamically generate a pretty long bash command depending on the command line options. Here is what I tried:
CONFIG_PATH=""
#Reading CONFIG_PATH from getopts if supplied
SOME_OPT=""
if [ ! -z "$CONFIG_PATH" ]; then
SOME_OPT="-v -s -cp $CONFIG_PATH"
fi
some_bash_command $SOME_OPT
The point here is that I want to pass zero arguments to some_bash_command if no arguments were passed to the script; if there were some arguments, I want to pass them along.
It works fine, but the problem is that this approach looks rather unnatural to me.
What would be a better yet practical way to do this?
Your approach is more-or-less the standard one; the only significant improvement that I'd recommend is to use an array, so that you can properly quote the arguments. (Otherwise your command can horribly misbehave if any of the arguments happen to include special characters such as spaces or asterisks.)
So:
SOME_OPT=()
if [ ! -z "$CONFIG_PATH" ]; then
    SOME_OPT=(-v -s -cp "$CONFIG_PATH")
fi
some_bash_command "${SOME_OPT[@]}"
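To see what the quoting buys you, here is a small illustration with a hypothetical CONFIG_PATH that contains a space; the quoted array expansion keeps it as a single argument, and an empty array expands to zero arguments rather than one empty string:
CONFIG_PATH="/home/user/my config"   # hypothetical path containing a space
SOME_OPT=(-v -s -cp "$CONFIG_PATH")
printf '<%s>\n' "${SOME_OPT[@]}"     # four arguments; the path stays intact
SOME_OPT=()
set -- "${SOME_OPT[@]}"
echo "$#"                            # 0: an empty array contributes no arguments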

Using history expansion in a bash alias or function

I am trying to make a simple thing to make my teammates' lives easier. They are constantly pasting quoted text into the command line that has been reformatted with "smart" quotes, which breaks the command, i.e. “test” vs. "test".
It's proved surprisingly annoying to do with:
function damn() { !!:gs/“/" }
or:
alias damn='!!:gs/“/"'
Neither seems to work; they keep giving me either the error
-bash: !!:gs/“/" : No such file or directory
or just:
>
I must be missing something obvious here.
History expansion with ! does not work in functions or aliases. According to the bash manual:
History expansion is performed immediately after a complete line is read, before the shell breaks it into words.
You can use the builtin fc command:
[STEP 100] # echo $BASH_VERSION
4.4.19(1)-release
[STEP 101] # alias damn='fc -s “=\" ”=\" '
[STEP 102] # echo “test”
“test”
[STEP 103] # damn
echo "test"
test
[STEP 104] #
For quick reference, the following is the output of help fc.
fc: fc [-e ename] [-lnr] [first] [last] or fc -s [OLD=NEW] [command]
Display or execute commands from the history list.
fc is used to list or edit and re-execute commands from the history list.
FIRST and LAST can be numbers specifying the range, or FIRST can be a
string, which means the most recent command beginning with that
string.
Options:
-e ENAME select which editor to use. Default is FCEDIT, then EDITOR,
then vi
-l list lines instead of editing
-n omit line numbers when listing
-r reverse the order of the lines (newest listed first)
With the `fc -s [OLD=NEW ...] [command]' format, COMMAND is
re-executed after the substitution OLD=NEW is performed.
A useful alias to use with this is r='fc -s', so that typing `r cc'
runs the last command beginning with `cc' and typing `r' re-executes
the last command.
Exit Status:
Returns success or status of executed command; non-zero if an error occurs.
Here is a slightly more general solution using a bash function to wrap the fc call, if you want to do something to the string beyond substitution.
function damn() {
    # Capture the previous command.
    local cmd
    cmd=$(fc -ln -1)
    # Do whatever you want with cmd here
    cmd=$(echo "$cmd" | sed 's/[“”]/"/g')
    # Re-run the command
    eval "$cmd"
}

getopts: unable to identify arguments

This is the script I tried:
#!/bin/bash
while getopts ":u:p:" option; do
    case $option in
        u) USER=$OPTARG;;
        p) PASS=$OPTARG;;
        \?) echo "Invalid Option: $OPTARG"
            exit 1;;
        :) echo "Please provide an argument for $OPTARG!"
           exit 1;;
    esac
done
echo "Username/Password: $USER/$PASS"
If the command for running the script is:
./test9.sh -u test -p -a
Then I am getting an output:
Username/Password: test/-a
-a is an invalid argument, but the script is taking -a as the password. I would like it to display a message "Please enter a password" and exit the script. Please help me fix this.
There are three kinds of parameters: options, option arguments, and positional parameters. If a parameter is an option that requires an argument, then the next parameter will be treated as that argument no matter what. It may start with a dash or even coincide with a valid option; it will still be treated as the option's argument.
If your program wants to reject arguments that start with a dash, you need to program it yourself. Passwords that start with a dash are perfectly legitimate; a program that checks passwords must not reject them.
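If you do decide to reject a dash-prefixed password anyway (to produce the message the question asks for), you could add the check yourself inside the p) branch; a minimal, hypothetical version looks like this:
p)  if [[ $OPTARG == -* ]]; then
        echo "Please enter a password"
        exit 1
    fi
    PASS=$OPTARG;;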
Options that accept optional arguments are extremely confusing and non-standard. getopt in general doesn't support them; there's a GNU extension for that, but don't use it.
TL;DR: there's nothing to fix, your script is fine.
I haven't tested your script, but I think that if you use getopt instead of getopts you'll get the result you expect: an error, because -a is not a valid option.

'less' the file specified by the output of 'which'

The command 'which' shows the path to a command.
The command 'less' opens a file.
How can I 'less' the file given by the output of 'which'?
I don't want to use two commands to do it, like below:
=>which script
/file/to/script/file
=>less /file/to/script/file
This is a use case for command substitution:
less -- "$(which commandname)"
That said, if your shell is bash, consider using type -P instead, which (unlike the external command which) is built into the shell:
less -- "$(type -P commandname)"
Note the quotes: These are important for reliable operation. Without them, the command may not work correctly if the filename contains characters inside IFS (by default, whitespace) or can be evaluated as a glob expression.
The double dashes are likewise there for correctness: Any argument after them is treated as positional (as per POSIX Utility Syntax Guidelines), so even if a filename starting with a dash were to be returned (however unlikely this may be), it ensures that less treats that as a filename rather than as the beginning of a sequence of options or flags.
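For a quick illustration of what the double dashes buy you (using a hypothetical filename that begins with a dash):
touch -- '-notes.txt'    # hypothetical file whose name begins with a dash
less -notes.txt          # without --, less tries to parse the name as a bundle of options
less -- -notes.txt       # with --, less treats it as a filename and opens it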
You may also wish to consider honoring the user's pager selection via the environment variable $PAGER, and using type without -P to look for aliases, shell functions and builtins:
cmdsource() {
    local sourcefile
    if sourcefile="$(type -P -- "$1")"; then
        "${PAGER:-less}" -- "$sourcefile"
    else
        echo "Unable to find source for $1" >&2
        echo "...checking for a shell builtin:" >&2
        type -- "$1"
    fi
}
This defines a function you can run:
cmdsource commandname
You should be able to just pipe it over; try this:
which script | less

Nagios XI: Provide Multiple Arguments to a Command

I have created a custom plugin in order to monitor a parameter using Nagios XI. To execute that plugin remotely I must use:
/usr/local/nagios/libexec/check_nrpe -H [IP_ADDR] -c [PLUGIN_NAME] -a [ARGUMENT]
Having made appropriate changes in nrpe.cfg and /etc/sudoers, I could get correct results.
But, I need to provide multiple arguments to the command. What should be the syntax I must use?
I would make this a comment if I thought anyone could read it. In my command.cfg I had this:
# 'clear_printqueue' event handler command definition
define command{
    command_name    clear_printqueue
    command_line    $USER1$/check_nrpe -H $HOSTADDRESS$ -p 5666 -c clear_printqueue -a "/PrinterName:$ARG1$" "/ServiceState:$SERVICESTATE$" "/StateType:$SERVICESTATETYPE$" "/ServiceAttempt:$SERVICEATTEMPT$" "/MaxServiceAttempts:$MAXSERVICEATTEMPTS$"
}
I only have Nagios Core 3.4.4, but I hope this might help. My ini file on the client contained this:
clear_printqueue = cscript.exe //T:30 //NoLogo scripts\\lib\\wrapper.vbs scripts\\nagiosClear-PrintQueue.vbs "$ARG1$" "$ARG2$" "$ARG3$" "$ARG4$" "$ARG5$"
$ARG#$ gets passed to the script where it runs. In short I just passed the quoted arguments with spaces in between.
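To answer the original question directly: check_nrpe takes multiple arguments as a space-separated list after -a, and the remote command references them as $ARG1$, $ARG2$, and so on. A minimal sketch (the plugin name, host, path, and thresholds below are hypothetical; passing arguments to NRPE requires dont_blame_nrpe=1 in nrpe.cfg):
/usr/local/nagios/libexec/check_nrpe -H 192.168.1.10 -c check_disk_custom -a /var 20% 10%
and on the client, in nrpe.cfg:
command[check_disk_custom]=/usr/local/nagios/libexec/check_disk -p $ARG1$ -w $ARG2$ -c $ARG3$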
