How to make a non-interactive shell script [duplicate] - linux

I want to make a non-interactive shell script, where I can give options at the start of execution and hard-code the action to take for each input the user provides.
For example:
Below should perform some action on target.txt:
user#/root> myPlannedScript -p target.txt
Below should perform some other action on target.txt:
user#/root> myPlannedScript -a target.txt
The cat tool, for example, performs different actions depending on the options given. I want my script to act like this:
:/root> cat --help
Usage: cat [OPTION] [FILE]...
Concatenate FILE(s), or standard input, to standard output.
  -A, --show-all           equivalent to -vET
  -b, --number-nonblank    number nonblank output lines
  -e                       equivalent to -vE
  -E, --show-ends          display $ at end of each line
  -n, --number             number all output lines
  -r, --reversible         use \ to make the output reversible, implies -v
  -s, --squeeze-blank      never more than one single blank line
  -t                       equivalent to -vT
  -T, --show-tabs          display TAB characters as ^I

query.sh
#!/bin/bash
if [ $# -eq 0 ]; then
    echo do one thing
else
    echo do other thing
fi
"query.sh" => do one thing
"query.sh anythingYouPut" => do other thing ;oP
But if you really want a parameter for each action:
#!/bin/bash
if [ -z "$1" ]; then
    echo do nothing
else
    if [ "$1" -eq 1 ]; then
        echo do one thing
    fi
    if [ "$1" -eq 2 ]; then
        echo do other thing
    fi
fi
"query.sh" => do nothing
"query.sh 1" => do one thing
"query.sh 2" => do other thing

Related

About egrep command

How can I create a bash script that accepts a file as a command-line argument and prints all lines that are longer than 12 characters, using the egrep command?
You can use:
egrep '.{13}'
The . will match any character, and the {13} repeats it exactly 13 times, so the pattern matches any line containing at least 13 characters (that is, more than 12). You can put this in a shell script like:
#!/bin/sh
# Make sure the user actually passed an argument. This is useful
# because otherwise grep will try and read from stdin and hang forever.
if [ -z "$1" ]; then
    echo "Filename needed"
    exit 1
fi
egrep '.{13}' "$1"
The $1 refers to the first command-line argument. You can also use $2, $3, etc.; "$@" refers to all of the command-line arguments (useful if you want to run it over multiple files), and $# gives the count of arguments:
egrep '.{13}' "$@"
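For instance, saved as longlines.sh (a name made up here) and made executable, it can then be run over one or more files:

chmod +x longlines.sh
./longlines.sh file1.txt file2.txt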

What is the best way to evaluate two variables representing a single pipeline command in bash?

I have a function produce which determines whether a file is present, and if not it runs the command given in the remaining arguments. This works fine when the command output simply writes to stdout. However, in the command below I pipe the output to a second command and then to a third command before it outputs to stdout. In this scenario the output is written to the file correctly, but the script does not echo the preceding "$@" from the produce function, and the contents of the initial unpopulated outputfile.vcf (which contains header columns and is generated by the pipeline command on execution) are also being written to stdout. Is there a more appropriate way to evaluate $@ > "${curfile}"?
produce() {
    local curfile=$1
    # Remove the first element of the list of passed arguments
    shift
    if [ ! -e "${curfile}" ]; then
        # Run the subsequent command as given in the list of passed arguments
        echo $@
        $@ > "${curfile}"
    fi
}
produce outputfile.vcf samtools view -bq 20 input.bam | samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa - | bcftools call -cNv -
OK, as I mentioned in my comment, the issue seems to relate to the pipe characters, so I had to evaluate the variable using eval and escape the pipe characters. So, in order to ensure the function produce interprets $@ correctly, I fed it the command as follows. Note also that the variables are now all quoted:
produce() {
    local curfile="$1"
    # Remove the first element of the list of passed arguments
    shift
    if [ ! -e "${curfile}" ]; then
        # Run the subsequent command as given in the list of passed arguments
        echo "$@"
        eval "$@ > ${curfile}"
    fi
}
produce outputfile.vcf samtools view -bq 20 input.bam \| samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa - \| bcftools call -cNv -
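Another option (a sketch, not from the thread) is to pass the entire pipeline as a single quoted string, which avoids escaping every pipe character at the call site:

produce() {
    local curfile="$1"
    local cmd="$2"
    if [ ! -e "${curfile}" ]; then
        # The pipeline arrives as one string, so eval sees the pipes
        # and the redirection together when it re-parses the command
        echo "${cmd}"
        eval "${cmd} > ${curfile}"
    fi
}
produce outputfile.vcf 'samtools view -bq 20 input.bam | samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa - | bcftools call -cNv -'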
You can use >> to append to a file. For example:
echo "line 1" >> filename
echo "line 2" >> filename
This will result in a file containing:
line 1
line 2

How to use grep with single brackets?

I was looking at an answer in another thread about which bracket pair to use with if in a bash script. [[ is less surprising and has more features, such as pattern matching (=~), whereas [ and test are built-in and POSIX-compliant, which makes them portable.
Recently, I was attempting to test the result of a grep command and it was failing with [: too many arguments. I was using [, but when I switched to [[ it worked. How would I do such a test with [ in order to maintain portability?
This is the test that failed:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
    echo "slew mode"
else
    echo "not slew mode"
fi
And the test that succeeded:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [[ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]]; then
    echo "slew mode"
else
    echo "not slew mode"
fi
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
This command will certainly fail for multiple matches, because the unquoted grep output undergoes word splitting.
Multiple grep matches are separated by newlines, so the test command becomes something like:
[ match1 match2 match3 ... ]
which doesn't make much sense. You will get a different error message depending on the number of matches grep returns (i.e. the number of arguments handed to the test command [).
For example:
2 matches will give you a unary operator expected error,
3 matches will give you a binary operator expected error, and
more than 3 matches will give you a too many arguments error (or similar) in Bash.
You need to quote variables inside [ to prevent word splitting.
On the other hand, the Bash-specific [[ prevents word splitting of expansions by default. Thus the grep output doesn't get split on newlines and remains a single string, which is a valid single argument for the test.
So the solution is to look only at the exit status of grep:
if grep -E -f slew_pattern /etc/sysconfig/ntpd; then
Or quote the command substitution when capturing the output:
if [ "$(grep -E -f slew_pattern /etc/sysconfig/ntpd)" ]; then
Note:
You don't really need to capture the output here; simply looking at the exit status will suffice.
Additionally, you can suppress grep's printed output with the -q option and its error messages with the -s option.
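Putting that together, a portable version of the test might look like this (a sketch):

if grep -Eqs -f slew_pattern /etc/sysconfig/ntpd; then
    echo "slew mode"
else
    echo "not slew mode"
fi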

Script parameters in Bash

I'm trying to make a shell script which should be used like this:
ocrscript.sh -from /home/kristoffer/test.png -to /home/kristoffer/test.txt
The script will then ocr convert the image file to a text file. Here is what I have come up with so far:
#!/bin/bash
export HOME=/home/kristoffer
/usr/local/bin/abbyyocr9 -rl Swedish -if ???fromvalue??? -of ???tovalue??? 2>&1
But I don't know how to get the -from and -to values. Any ideas on how to do it?
The arguments that you provide to a bash script will appear in the variables $1, $2, $3, and so on, where the number refers to the position of the argument. $0 is the command itself.
The arguments are separated by spaces, so if you provide -from and -to on the command line, they will end up in these variables too. So for this:
./ocrscript.sh -from /home/kristoffer/test.png -to /home/kristoffer/test.txt
You'll get:
$0 # ocrscript.sh
$1 # -from
$2 # /home/kristoffer/test.png
$3 # -to
$4 # /home/kristoffer/test.txt
It might be easier to omit the -from and the -to, like:
ocrscript.sh /home/kristoffer/test.png /home/kristoffer/test.txt
Then you'll have:
$1 # /home/kristoffer/test.png
$2 # /home/kristoffer/test.txt
The downside is that you have to supply the arguments in the right order. There are libraries that make it easier to parse named arguments on the command line, but usually for simple shell scripts the easy way is fine.
Then you can do:
/usr/local/bin/abbyyocr9 -rl Swedish -if "$1" -of "$2" 2>&1
The double quotes around $1 and $2 are not always necessary but are advised, because some strings (for example, ones containing spaces) won't work if you don't put them between double quotes.
If you're not completely attached to using "from" and "to" as your option names, it's fairly easy to implement this using getopts:
while getopts f:t: opts; do
    case ${opts} in
        f) FROM_VAL=${OPTARG} ;;
        t) TO_VAL=${OPTARG} ;;
    esac
done
getopts is a shell builtin that processes command-line arguments and conveniently parses them for you.
f:t: specifies that you're expecting two options that take values (indicated by the colons). Something like f:t:v says that -v will only be interpreted as a flag.
opts is where the current option letter is stored. The case statement is where you process it.
${OPTARG} contains the value following the option. ${FROM_VAL}, for example, will get the value /home/kristoffer/test.png if you run your script like:
ocrscript.sh -f /home/kristoffer/test.png -t /home/kristoffer/test.txt
As the others are suggesting, if this is your first time writing bash scripts you should really read up on some basics. This was just a quick tutorial on how getopts works.
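Combined with the script from the question, the whole thing might look like this (a sketch; the abbyyocr9 invocation is taken from the question):

#!/bin/bash
export HOME=/home/kristoffer
while getopts f:t: opts; do
    case ${opts} in
        f) FROM_VAL=${OPTARG} ;;
        t) TO_VAL=${OPTARG} ;;
    esac
done
# The parsed values replace the ???fromvalue??? / ???tovalue??? placeholders
/usr/local/bin/abbyyocr9 -rl Swedish -if "${FROM_VAL}" -of "${TO_VAL}" 2>&1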
Use the variables "$1", "$2", "$3" and so on to access arguments. To access all of them at once you can use "$@", and to get the count of arguments use $# (which might be useful to check for too few or too many arguments).
I needed to make sure that my scripts are entirely portable between various machines, shells, and even cygwin versions. Further, my colleagues, who were the ones I had to write the scripts for, are programmers, so I ended up using this:
for ((i = 1; i <= $#; i++)); do
    if [ "${!i}" = "-s" ]; then
        ((i++))
        var1="${!i}"
    elif [ "${!i}" = "-log" ]; then
        ((i++))
        logFile="${!i}"
    elif [ "${!i}" = "-x" ]; then
        ((i++))
        var2="${!i}"
    elif [ "${!i}" = "-p" ]; then
        ((i++))
        var3="${!i}"
    elif [ "${!i}" = "-b" ]; then
        ((i++))
        var4="${!i}"
    elif [ "${!i}" = "-l" ]; then
        ((i++))
        var5="${!i}"
    elif [ "${!i}" = "-a" ]; then
        ((i++))
        var6="${!i}"
    fi
done
Rationale: I included a launcher.sh script as well, since the whole operation had several steps which were quasi-independent of each other (I say "quasi" because even though each script could be run on its own, they were usually all run together). Within two days I found out that about half of my colleagues, being programmers and all, were too good to use the launcher file, follow the "usage", or read the HELP which was displayed every time they did something wrong. They were making a mess of the whole thing, running scripts with arguments in the wrong order and complaining that the scripts didn't work properly. Being the choleric I am, I decided to overhaul all my scripts to make sure they are colleague-proof. The code segment above was the first thing.
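An equivalent hand-rolled parser (a sketch, not from the original answer) can use case and shift instead of indirect expansion, which some find easier to extend:

while [ $# -gt 0 ]; do
    case "$1" in
        -s)   var1="$2";    shift ;;   # read the value, skip past it
        -log) logFile="$2"; shift ;;
        -x)   var2="$2";    shift ;;
        -p)   var3="$2";    shift ;;
        -b)   var4="$2";    shift ;;
        -l)   var5="$2";    shift ;;
        -a)   var6="$2";    shift ;;
    esac
    shift  # move past the flag itself
done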
In bash, $1 is the first argument passed to the script, $2 the second, and so on.
/usr/local/bin/abbyyocr9 -rl Swedish -if "$1" -of "$2" 2>&1
So you can use:
./your_script.sh some_source_file.png destination_file.txt
An explanation of the double quotes;
consider three scripts:
# foo.sh
bash bar.sh $1
# foo2.sh
bash bar.sh "$1"
# bar.sh
echo "1-$1" "2-$2"
Now invoke:
$ bash foo.sh "a b"
1-a 2-b
$ bash foo2.sh "a b"
1-a b 2-
When you invoke foo.sh "a b", it invokes bar.sh a b (two arguments), and with foo2.sh "a b" it invokes bar.sh "a b" (one argument). Always keep in mind how parameters are passed and expanded in bash; it will save you a lot of headache.

I keep getting a 'while syntax' error on the output of the at job in unix and I have no idea why

#!/usr/dt/bin/dtksh
while getopts w:m: option
do
    case $option in
        w) wflag=1
           wval="$OPTARG";;
        m) mflag=1
           mval="$OPTARG";;
        ?) printf 'BAD\n' $0
           exit 2;;
    esac
done
if [ ! -z "$wflag" ]; then
    printf "W and -w arg is $wval\n"
fi
if [ ! -z "$mflag" ]; then
    printf "M and -m arg is $mval\n"
fi
shift $(($OPTIND - 1))
printf "Remaining arguments are: $* \n"
at $wval <<ENDMARKER
echo $* >> Search_List
tr " " "\n" <Search_List >Usr_List
while true; do
    if [ -s Usr_List ]; then
        for i in $(cat Usr_List); do
            if finger -m | grep $i; then
                echo '$i is online' | elm user
                sed '/$i/d' <Usr_List >tmplist
                mv tmplist Usr_List
            fi
        done
    else
        break
    fi
done
ENDMARKER
Essentially I want to keep searching through the list until it is empty. Each time an element of the list is found, it is deleted. Once the list is empty, quit.
There are no error messages when I first run the command; the error only shows up in an email containing the output of the at job.
Thanks in advance for any advice.
EDIT: The script uses getopts and takes one argument for -w and one for -m; the w value is set as the time for the at job, and the m value still has to be used. Any arguments after the one for -m are sent to a file called Search_List, which is reformatted and saved as Usr_List. Then, in the while loop, while Usr_List is not empty, the script checks the results of finger -m against the names in Usr_List. If a name is found, it is removed from Usr_List. Once Usr_List is empty, the program should stop.
elm is a way to send an email, so elm user sends an email to user.
The error is:
while: Expression syntax
at uses /bin/sh by default.
at now <<ENDMARKER
<code here>
ENDMARKER
All of this executes under /bin/sh, which on some systems is the Bourne shell (Solaris, for example).
You need to figure out what /bin/sh is on your system, then modify things accordingly. Also, read up on what is and is not guaranteed to be in your at environment; I think the problem lies there. You have both the UNIX and Linux tags, so I cannot give much more specific help than that.
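For example, on many systems /bin/sh is a symlink, so a quick check is:

ls -l /bin/sh    # e.g. /bin/sh -> dash on Debian/Ubuntu, bash on others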
You can enable logging, the way YOU need it, of the at code chunk:
exec > /tmp/somefile.log 2>&1
Then write debugging messages to stdout or stderr.
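For example, placed at the top of the heredoc so everything after it is logged (a sketch; the log path is the arbitrary one from above):

at $wval <<ENDMARKER
exec > /tmp/somefile.log 2>&1
echo "at job started; remaining arguments: $*"
# ... rest of the job goes here ...
ENDMARKER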
Your HEREDOC is being interpolated. Try quoting the delimiter:
at $wval << 'ENDMARKER'
Although (I haven't looked closely) it appears that you want some interpolation. But you definitely do not want it on the lines that reference $i, so escape that $ if you do not quote the entire heredoc:
if finger -m | grep \$i; then
You need to pass the -k option to at:
...
at -k $wval <<ENDMARKER
...
at is otherwise defaulting to your login shell which is csh or one of its derivatives.
It turns out that the while command and the if command needed to be combined:
while [[ -s Usr_List ]]; do
    ......
done
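Put together, the corrected polling loop looks like this (a sketch; if it still lives inside an unquoted heredoc, remember to escape the dollar signs as \$i, as noted above):

while [[ -s Usr_List ]]; do
    for i in $(cat Usr_List); do
        if finger -m | grep "$i"; then
            echo "$i is online" | elm user
            sed "/$i/d" < Usr_List > tmplist
            mv tmplist Usr_List
        fi
    done
done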
