Is there any way to add "help" to a bash script you have written on Linux (Debian)? I mean specifically, so that it can be invoked as yourscript --help or yourscript -h.
It doesn't have to be harder than this.
case $1 in
-[h?] | --help)
cat <<-____HALP
Usage: ${0##*/} [ --help ]
Outputs a friendly help message if you can figure out how.
____HALP
exit 0;;
esac
If you use getopts for option processing, use that to identify the option; but the action is going to look more or less similar (and IMNSHO getopts doesn't really offer anything over a simple while ... shift loop).
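For what it's worth, a minimal sketch of such a while ... shift loop (the -v flag is purely illustrative, added so the loop has something besides --help to consume):
while [ $# -gt 0 ]; do
    case $1 in
        -h|-\?|--help)
            echo "Usage: ${0##*/} [ -v ] [ --help ]"   # or the heredoc shown above
            exit 0;;
        -v|--verbose) verbose=1;;                      # ordinary flag, for illustration only
        --) shift; break;;                             # explicit end of options
        -*) echo "${0##*/}: unknown option: $1" >&2; exit 1;;
        *)  break;;                                    # first operand; stop option parsing
    esac
    shift
done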
getopt
#!/bin/bash
args=$(getopt -n "$(basename "$0")" -o h --longoptions help -- "$@") || exit 1
eval set -- "$args"
while :; do
case $1 in
-h|--help) echo offer help here ; exit ;;
--) shift; break ;;
*) echo "error: $1"; exit 1;;
esac
done
echo "hello world, $*"
There are many ways to do this. Over time I have come to prefer separate usage and help functions. The help is provided in response to a request for either --help or -h and it provides extended help/option information in a heredoc format. The usage function is provided in response to an invalid input. It is short and provides a quick reminder of what the script needs. Both functions take a string as the first argument that allows you to pass an error message to be displayed along with the help or usage. Both also allow you to pass an exit code as the second argument.
The following is an example I pulled from an existing script. You can ignore the specifics of what it does; it is included only to show the structure:
function help {
local ecode=${2:-0}
[[ -n $1 ]] && printf "\n $1\n" >&2
cat >&2 << helpMessage
Usage: ${0##*/} <ofile> <file.c> [ <cflags> ... --log [ \$(<./bldflags)]]
${0##*/} calls 'gcc -Wall -o <ofile> <file.c> <cflags> <\$(<./bldflags)>'
If the file './bldflags' exists in the present directory, its contents are
read into the script as additional flags to pass to gcc. It is intended to
provide a simple way of specifying additional libraries common to the source
files to be built. (e.g. -lssl -lcrypto).
If the -l (--log) option is given, then the compile string and compiler output are
written to a log file in ./log/<ofile>_gcc.log
Options:
-h | --help program help (this file)
-l | --log write compile string and compiler output to ./log/<ofile>_gcc.log
helpMessage
exit $ecode
}
function usage {
local ecode=${2:-0}
[[ -n $1 ]] && printf "\n $1\n" >&2
printf "\n Usage: %s <ofile> <file.c> [ <cflags> ... --log [ \$(<./bldflags)]]\n\n" "${0##*/}"
exit $ecode
}
I generally test for help when looking at all arguments, e.g.:
## test for help and log flags and parse remaining args as cflags
for i in "$@"; do
test "$i" == "-h" || test "$i" == "--help" && help
...
done
Usage is provided in response to an invalid input, e.g.:
[ -f "$1" ] || usage "error: first argument is not a file." 1
They come in handy and I've preferred this approach to getopts.
I am using a KSH script to execute a binary (program) that has the following syntax to execute correctly:
myprog [-v | --verbose (optional)] [input1] [input2]
The program prints nothing & returns exit code 0 (zero) on success. On failure it prints ERROR messages to STDERR & returns exit status > 0. If -v option is specified it prints verbose details to STDOUT both in case of success and failure.
To make this easier to use, reduce the chance of swapped arguments, and give the user control over logging, I invoke this binary from a ksh shell script. The syntax for running the ksh script is:
myshell.sh [-v (optional)] [-a input1] [-b input2]
If -v option is specified, ksh redirects STDOUT to <execution_date_time>_out.log and STDERR to <execution_date_time>_err.log. My ksh script is as follows:
myshell.sh :
#!/bin/ksh
verbopt=""
log=""
arg1=""
arg2=""
dateTime=`date +%y-%m-%d_%H:%M:%S`
while getopts "va:b:" arg
do
case $arg in
v) # verbose output
verbopt="-v"
log="1>${dateTime}_out.log 2>${dateTime}_err.log"
;;
a) # Input 1
arg1=$OPTARG
;;
b) # Input 2
arg2=$OPTARG
;;
*) # usage
echo "USAGE: myshell.sh [-v] [-a input1] [-b input2]"
exit 2
;;
esac
done
if [[ -z $arg1 || -z $arg2 ]]
then
echo "Missing arguments"
exit 2
fi
myprog $verbopt $arg1 $arg2 $log
exit $?
The problem here is that all of the STDERR & STDOUT output is printed on the screen (i.e., no redirection took place), and no *.log files were created after either successful or unsuccessful execution (i.e., exit status 0 or >0 respectively).
Can anyone help me out on this?
Thanks.
Rather than trying to monkey patch redirections into the command line, just redirect the streams when you parse the flags. That is:
while getopts "va:b:" arg
do
case $arg in
v) # verbose output
verbopt="-v"
exec 1>${dateTime}_out.log 2>${dateTime}_err.log
;;
...
You need to be a little careful, since you do some error checking after this and you probably don't want your later error messages going to the *_err.log, but that's fairly trivial to fix. (eg, error check sooner, or do a test -n "$verbopt" && exec > ... after the error check, or similar)
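For example, a sketch of that second variant, reusing the variable names from the question, so the log files are only opened once the arguments are known to be sane:
if [[ -z $arg1 || -z $arg2 ]]
then
    echo "Missing arguments" >&2
    exit 2
fi

# all argument errors have been reported on the terminal; now it is safe to log
if [[ -n $verbopt ]]
then
    exec 1>"${dateTime}_out.log" 2>"${dateTime}_err.log"
fi

myprog $verbopt "$arg1" "$arg2"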
The problem is that the > in the value of $log is never treated as a redirection: it is expanded as ordinary text and passed to myprog as extra arguments.
I'm afraid you will need to use a conditional for this, for example:
cmd="myprog $verbopt $arg1 $arg2"
if [ "$log" ]; then
$cmd 1>${dateTime}_out.log 2>${dateTime}_err.log
else
$cmd
fi
I would use the exec redirection idiom, which runs the rest of the script as if the given redirections had been supplied when it was started:
if need_to_log; then
exec >stdout_file 2>stderr_file
fi
this command will be logged if the above if statement was true
If you need to restore stdout and stderr afterward for the script to do more unlogged things, you can just run the logging part in a subshell:
(
if need_to_log; then
exec >stdout_file 2>stderr_file
fi
this command will be logged if the above if statement was true
)
this command will not be logged regardless
I would also build the command in an array, so you can add things like -v to it without having to have a separate variable for each possible parameter. If the order in which the -a and -b arguments are supplied to myprog doesn't matter, you can just add those to the array instead of having separate variables as well.
You can see my version below. Besides the above changes, I also don't bother getting the timestamp if not logging, since it's unneeded, and send error messages to standard error instead of standard out using the ksh builtin print.
Here's what I put together:
#!/usr/bin/env ksh
# new array syntax requires ksh93+; for older ksh, use this:
# set -A cmd myprog
cmd=(myprog) # build up the command to run in an array
log_flag=0 # nonzero if the command should be logged
input_a= # the two input filenames
input_b=
while getopts 'va:b:' arg; do
case $arg in
v) # verbose output
# older ksh: set -A cmd "${cmd[@]}" -v
cmd+=(-v)
log_flag=1
;;
a) # Input 1
input_a=$OPTARG
;;
b) # Input 2
input_b=$OPTARG
;;
*) # usage
print -u2 "USAGE: $0 [-v] [-a input1] [-b input2]"
exit 2
;;
esac
done
if [[ -z $input_a || -z $input_b ]]; then
print -u2 "$0: Missing arguments"
exit 2
fi
if (( log_flag )); then
timestamp=$(date +%y-%m-%d_%H:%M:%S)
exec >"${timestamp}_out.log" 2>"${timestamp}_err.log"
fi
"${cmd[#]}" "$input_a" "$input_b"
Your timestamp uses the two-digit year (%y); that and the underscore between the components are the only deviations from the ISO 8601 standard, so I would recommend you go ahead and adopt the standard format. That'd be %Y-%m-%dT%H:%M:%S, or, in C libraries with newer versions of strftime, %FT%T.
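For example, either of these should work with GNU date (the %F/%T shorthand is the part that is not universal, so check your date/strftime first):
timestamp=$(date +%Y-%m-%dT%H:%M:%S)   # spelled out
timestamp=$(date +%FT%T)               # same format, newer shorthand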
You could also be a little more clever and make log_flag a string that is either empty or -v, pass that to the command, and test it against the empty string to determine whether or not to open the log files, but I find the logic easier to follow with the simple 0/1 value treated as a Boolean.
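That variant might look something like this (only the lines that differ from the script above are shown; verbose_flag is just an illustrative name):
verbose_flag=                       # empty, or "-v" when -v was given
...
    v) verbose_flag="-v" ;;         # replaces both cmd+=(-v) and log_flag=1
...
if [[ -n $verbose_flag ]]; then
    timestamp=$(date +%Y-%m-%dT%H:%M:%S)
    exec >"${timestamp}_out.log" 2>"${timestamp}_err.log"
fi
"${cmd[@]}" $verbose_flag "$input_a" "$input_b"   # left unquoted so it vanishes when empty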
Take a look at the eval command.
Replace ...
myprog $verbopt $arg1 $arg2 $log
with:
eval myprog $verbopt $arg1 $arg2 $log
I don't know what your myprog does but here's a simple example using eval to run date (valid command) and date xyz (invalid command), redirecting output to log.stdout/log.stderr accordingly:
$ cat logout
log='1>log.stdout 2>log.stderr'
'rm' -rf log.std* > /dev/null 2>&1
echo ""
echo 'eval date ${log}'
eval date ${log}
echo ""
echo "++++++++++++ log.stdout"
cat log.stdout
echo "++++++++++++ log.stderr"
cat log.stderr
echo "++++++++++++"
'rm' -rf log.std* > /dev/null 2>&1
echo ""
echo 'eval date xyz ${log}'
eval date xyz ${log}
echo ""
echo "++++++++++++ log.stdout"
cat log.stdout
echo "++++++++++++ log.stderr"
cat log.stderr
echo "++++++++++++"
Now run the script:
$ logout
eval date ${log}
++++++++++++ log.stdout
Sun Jul 23 15:56:01 CDT 2017
++++++++++++ log.stderr
++++++++++++
eval date xyz ${log}
++++++++++++ log.stdout
++++++++++++ log.stderr
date: invalid date `xyz'
++++++++++++
#!/usr/dt/bin/dtksh
while getopts w:m: option
do
case $option in
w) wflag=1
wval="$OPTARG";;
m) mflag=1
mval="$OPTARG";;
?) printf 'BAD\n' $0
exit 2;;
esac
done
if [ ! -z "$wflag" ]; then
printf "W and -w arg is $wval\n"
fi
if [ ! -z "$mflag" ]; then
printf "M and -m arg is $mval\n"
fi
shift $(($OPTIND - 1))
printf "Remaining arguments are: $* \n"
at $wval <<ENDMARKER
echo $* >> Search_List
tr " " "\n" <Search_List >Usr_List
while true; do
if [ -s Usr_List ]; then
for i in $(cat Usr_List); do
if finger -m | grep $i; then
echo '$i is online' | elm user
sed '/$i/d' <Usr_List >tmplist
mv tmplist Usr_List
fi
done
else
break
fi
done
ENDMARKER
Essentially I want to keep searching through until it is empty. Each time an element of the list is found, it is deleted. Once the list is empty quit.
There are no error messages when I first run the command; the error only shows up in an email containing the output of the at job.
Thanks in advance for any advice
EDIT: The script uses getopts and takes one argument for -w and one for -m, the w value is set as the time for the at job, the m still has to be used. Any arguments after the one for m are sent to a file called Search_List, Search_List is edited and saved as Usr_List. Then in the while loop, while Usr_List is not empty, the script checks the results of finger -m against the names in Usr_List. If a name is found, it is removed from Usr_List. Once Usr_List is empty, the program should stop.
elm is a way to send an email, so elm user sends an email to user.
The error is :
while: Expression syntax
at uses /bin/sh by default.
at now <<ENDMARKER
<code here>
ENDMARKER
All of this executes under /bin/sh, which on some systems can be Bourne Shell (Solaris for example).
You need to figure out what /bin/sh is for your system, then modify things accordingly. Also, read the guarantees about what is and what is not in your "at" environment; I think the problem lies there. You have both UNIX and Linux tags, so I cannot give much more help than that.
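To see what /bin/sh really is on a given machine, something like this is usually enough (output obviously varies by system):
ls -l /bin/sh          # often a symlink, e.g. to dash, bash, or ksh
readlink -f /bin/sh    # on Linux, follows the link chain to the real binary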
You can enable logging -- the way YOU need it -- of the at code chunk:
exec > /tmp/somefile.log 2>&1
Then write debugging messages to stdout or stderr.
Your HEREDOC is being interpolated. Try quoting the delimiter:
at $wval << 'ENDMARKER'
Although (I haven't looked closely) it appears that you want some interpolation. But you definitely do not want it on the line in which you reference $i, so quote that $ if you do not quote the entire heredoc:
if finger -m | grep \$i; then
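The difference is easy to see in isolation: with an unquoted delimiter the shell expands the variable before at (or cat, here) ever sees the text, while a quoted delimiter passes it through untouched.
i=alice
cat <<EOF        # unquoted delimiter: prints "grep alice"
grep $i
EOF
cat <<'EOF'      # quoted delimiter: prints "grep $i"
grep $i
EOF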
You need to pass the -k option to at:
...
at -k $wval <<ENDMARKER
...
at is otherwise defaulting to your login shell which is csh or one of its derivatives.
It turns out that the while command and the if command needed to be combined.
while [[ -s Usr_List ]]; do
......
done
Hi, can someone fix this issue? I am not able to get the output of -p.
#!/bin/bash
args=`getopt c:m:p $*`
if [ $? != 0 -o $# == 0 ]
then
echo 'Usage: -c <current-dir> -m <my dir> -p <argument>'
exit 1
fi
set -- $args
for i
do
case "$i" in
-c) shift;CURRDIR=$1;shift;shift ;;
-m) MYDIR=$1;shift;;
-p) ARGVAL=$OPTARG;;
esac
done
echo "CURRDIR = $CURRDIR"
echo "MYDIR = $MYDIR"
echo "ARGVAL = $ARGVAL"
./1.sh -c "def" -m "ref" -p "ref -k ref"
Expected output
output -c = "def"
-m ="ref"
-p ="ref -k ref"
getopt
args=`getopt c:m:p $*`
You need to add a colon after the p to indicate that -p takes an argument. Also you should change $* to "$@" for better handling of spaces.
args=`getopt c:m:p: "$@"`
You are also mixing up getopt and getopts. $OPTARG is a getopts feature. With plain getopt and set you should simply use $2 and then shift off the argument.
-p) ARGVAL=$2; shift 2;;
At this point you've done as good as you can with getopt. Unfortunately it doesn't handle the multi-word argument to -p no matter what you do. For that, we need to use getopts.
getopts
From getopt and getopts:
Easier to use and generally better than getopt, though of course not available in csh-like shells. You shouldn't be using those anyway.
This works rather differently than "getopt". First, because it's a built-in, you usually won't find a separate man page for it, though "help getopts" may give you what you need.
The old "getopt" is called once, and it modifies the environment as we saw above. The builtin "getopts" is called each time you want to process an argument, and it doesn't change the original arguments .
Using getopts is a lot simpler. Your entire loop can be simplified to this:
while getopts c:m:p: flag
do
case "$flag" in
c) CURRDIR=$OPTARG;;
m) MYDIR=$OPTARG;;
p) ARGVAL=$OPTARG;;
esac
done
No shifting needed, you just read $OPTARG each time to get each option's value.
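With the parsing loop replaced as above and the echo lines kept, the same invocation should report all three values, multi-word -p argument included (output reconstructed by hand, not captured from a run):
./1.sh -c "def" -m "ref" -p "ref -k ref"
CURRDIR = def
MYDIR = ref
ARGVAL = ref -k ref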
I have to use a Unix script to pass arguments:
./Script.sh -c "abc" -d "def" -k "abc -d -c"
where the argument for:
-c = "abc"
-d = "def"
-k = "abc -d -c"
How can I handle options like this in a Unix shell script?
Here is some option handling using getopts:
# -F Final version (do not append date to version)
# -s suffix Add '-suffix' after version number
# -V Print version and exit
# -h Print help and exit
# -j jdcfile JDC file for project - required
# -q Quiet operation
# -v Verbose operation
arg0=$(basename $0 .sh)
usage()
{
echo "Usage: $arg0 [-hqvFV] [-s suffix] -j jdcfile file.msd" 1>&2
exit 1
}
error()
{
echo "$0: $*" 1>&2
exit 1
}
Fflag=
suffix=
jdcfile=
qflag=
vflag=no
while getopts FVhj:qs:v opt
do
case "$opt" in
(F) Fflag="-F";;
(V) echo "Version information";;
(h) echo "Help information";;
(j) jdcfile="$OPTARG";;
(q) qflag="-q";;
(s) suffix="$OPTARG";;
(v) vflag=yes;;
(*) usage;;
esac
done
shift $(($OPTIND - 1))
case $# in
(1) : OK;;
(*) usage;;
esac
if [ -z "$jdcfile" ]
then error "you did not specify which jdcfile to use (-j option)"
fi
The script then continues and does its task based on the options it was given. The shift removes the 'consumed' options, leaving just the file name arguments.
The argument can contain whitespace, so either use the getopts built-in shell command or the GNU enhanced version of the external getopt program.
The getopts option is more portable, because not all systems have the GNU enhanced version of getopt. For example, Linux has the GNU enhanced version, but Mac OS X does not. The original version of getopt does not support whitespace in arguments. Despite this limitation, there is a reason why you might want to use the GNU enhanced version: it supports long option names, which getopts does not.
This is how to use the GNU enhanced getopt with whitespace in the arguments. It is important to use "$@" (use $@ instead of $* and make sure the double quotes are around it) and to eval the whole set command so that whitespace is handled properly.
eval set -- `getopt --long currdir:,dir:,argval:,verbose -o c:d:k:v -- "$@"`
while [ $# -gt 0 ]
do
case "$1" in
-c | --currdir) CURRDIR="$2"; shift;;
-d | --dir) MYDIR="$2"; shift;;
-k | --argval) ARGVAL="$2"; shift;;
-v | --verbose) VERBOSE=yes;;
esac
shift
done
There is a command getopts, and a program getopt, though I'm of the opinion that by the time you need to handle arguments, you've outgrown shell scripting.
I'm not actually sure how getopts works, having never actually used it; check your shell's docs.
getopt splits your arguments into the form flags -- rest, as far as I can tell.
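For the record, with the util-linux (GNU enhanced) getopt the reordered output looks something like this (quoting differs between getopt versions):
getopt -o c:v -- -c foo -v bar baz
# prints:  -c 'foo' -v -- 'bar' 'baz'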
I need a simple busybox sh wrapper which will do:
IF "-Q" PARAMETER IS PROVIDED THEN
acommand ALL PARAMETERS BUT "-Q" 2>&1 1>/dev/null
ELSE
acommand ALL PARAMETERS
FI
Parameters may include spaces.
BTW I want to run the script with busybox sh and it doesn't support arrays.
It's possible to do it all in busybox's ash shell:
#!/bin/sh
for i in "${#}"
do
if [ "$i" = "-Q" ]
then
flagQ=1
else
args="$args \"$i\""
fi
done
if [ "$flagQ" = "1" ]
then
eval acommand "$args" 2>&1 1>/dev/null
else
eval acommand "$args"
fi
This uses bash arrays - but I see from the comments to another answer that the code isn't supposed to run under bash (despite the bash tag originally applied to the question); it is meant to run under the busybox shell.
I'm almost certain it doesn't answer the question because the question is substantially unanswerable given the limitations of busybox. In times past, I have used a custom program I called 'escape' to build up an argument string that can be eval'd to get the original arguments - spaces and all. But that requires support from outside the shell.
This solution only uses 'bash'. I'm not sure it is fully idiomatic bash code, but it works.
#!/bin/bash
i=0
Qflag=0
for arg in "$@"
do
if [ "X$arg" = "X-Q" ]
then Qflag=1
else args[$((i++))]=$arg
fi
done
if [ $Qflag = 1 ]
then exec acommand "${args[@]}" 2>&1 >/dev/null
else exec acommand "${args[@]}"
fi
The first loop builds up an array, args, with the arguments to the script, except that it doesn't add '-Q' to the list; it records its presence in the variable Qflag instead.
The if statement at the end notes whether Qflag was set to 1, and if so, sends the errors from 'acommand' to standard output and sends regular standard output to /dev/null (which is different from the effect if the I/O redirections are reversed - that would send standard output to /dev/null and send standard error to the same place, forcing silence on 'acommand').
The use of 'exec' is a trivial optimization that simplifies exit status handling in this case.
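As an aside, the order of those redirections is easy to get backwards; the two forms below behave quite differently (exec dropped here just to keep the demonstration short):
acommand "${args[@]}" 2>&1 >/dev/null    # stderr to the terminal, stdout discarded
acommand "${args[@]}" >/dev/null 2>&1    # both streams end up in /dev/null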
Tested with 'acommand' that prints its arguments on separate lines:
#!/bin/sh
for arg in "$@"
do echo "$arg"
done
and with command lines such as:
bash wrapper.sh -c -d 'arg with spaces'
which produces the output:
-c
-d
arg with spaces
Obviously, with the I/O redirection in place, there is no output from:
bash wrapper.sh -c -Q -d 'arg with spaces'
However, if you omit the I/O redirection, you get to see the same output.
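As for the 'escape' helper mentioned above: a rough pure-shell approximation, assuming the busybox image includes sed, is to single-quote each argument and rewrite any embedded single quotes before appending it to the string that gets eval'd. Unlike the double-quoting approach, this survives spaces, backticks and other metacharacters; treat it as a sketch, not production code.
#!/bin/sh
flagQ=0
args=
for i in "$@"; do
    if [ "$i" = "-Q" ]; then
        flagQ=1
        continue
    fi
    # turn each embedded ' into '\'' so the argument survives the later eval
    quoted=$(printf "%s" "$i" | sed "s/'/'\\\\''/g")
    args="$args '$quoted'"
done
if [ "$flagQ" = "1" ]; then
    eval "acommand $args" 2>&1 1>/dev/null
else
    eval "acommand $args"
fi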
It's a pity that you need to handle spaces in the arguments; otherwise this might work:
#!/bin/sh
Q=0
ARGS=
while [ $# -ge 1 ]; do
case $1 in
-Q)
Q=1
;;
*)
ARGS="$ARGS $1"
;;
esac
shift
done
if [ $Q -eq 1 ] ; then
acommand $ARGS 2>&1 1>/dev/null
else
acommand $ARGS
fi
EDIT:
So this version handles spaces, at the expense of interpreting back-ticks.
#!/bin/busybox ash
Q=0
ARGS=
while [ $# -ge 1 ]; do
case $1 in
-Q)
Q=1
;;
*)
ARGS="$ARGS \"$1\""
;;
esac
shift
done
if [ "$Q" -eq 1 ] ; then
eval acommand $ARGS 2>&1 1>/dev/null
else
eval acommand $ARGS
fi
I think to have a complete solution you are going to have to code it in C, which will be a bit ugly.