Bash: how to check if an argument is a sed pattern? - linux

I'm working on a bash script that should receive many arguments. Those arguments could be simple, like:
Myscript [-r] [-l|-u] <dir/file names...>
or a sed pattern could be given instead of a simple argument, like:
Myscript <sed pattern> <dir/file names...>
So my question is: how do I check whether a given argument is a sed pattern and not a directory path or filename?
I have done something, but it doesn't work for a long path (dir1/dir2/dir):
while [[ $# -gt 0 ]]
do
    key="$1"
    case $key in
        -ru|-ur)
            config_recusive
            config_upper
            shift
            ;;
        -rl|-lr)
            config_recusive
            config_lower
            shift
            ;;
        -r)
            config_recusive
            shift
            ;;
        -l)
            config_lower
            shift
            ;;
        -u)
            config_upper
            shift
            ;;
        *)
            check="^.*/.*/.*$"
            if [[ $key =~ $check ]] ; then
                config_sed_pattern
            else
                TARGET=$key
            fi
            shift
            ;;
    esac
done
To be more clear, here is an example of my problem. When I run the script like this:
./myscript -u tmp/users/regions
it gets confused and takes the path (tmp/users/regions) as a sed pattern.
I hope I was clear enough.
Waiting for your help :)
Thanks

I think you could try running sed with that pattern on something:
echo a | sed "$pattern" >/dev/null 2>/dev/null
and then check $?:
if [ $? != 0 ]; then # means sed failed
    #...
But be aware that sed may happily accept other strings, such as paths, as valid expressions too.
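A minimal sketch of that idea, wrapped in a helper function (is_sed_pattern is my name for it, not something from the question):
# Hypothetical helper: succeeds if sed accepts $1 as an expression.
is_sed_pattern() {
    printf 'a\n' | sed -e "$1" >/dev/null 2>&1
}

if is_sed_pattern "$key"; then
    config_sed_pattern
else
    TARGET=$key
fi
Note that this only tells you whether sed can parse the argument; plenty of ordinary strings (including some paths) happen to be parseable sed scripts, so it is a heuristic at best.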
By the way, handling arguments with getopt will make things easier.
I don't think the way you designed the argument format is good. Maybe you think that accepting arguments in different forms makes your program look dynamic and awesome, but now you are trying to solve a problem that has nothing to do with your real task, wasting your own time and ours. That's not good. It's better to keep things simple and clear.
In your case, maybe you can add another argument to signal that a sed expression follows:
YourScript -e <sed expr> <others>
That way you will still know what you did when you come back to it two weeks later.
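A rough sketch of that interface, using made-up array names (FLAGS, TARGETS) purely for illustration:
SED_EXPR=''
FLAGS=()
TARGETS=()
while [[ $# -gt 0 ]]; do
    case $1 in
        -e) SED_EXPR=$2; shift 2 ;;                  # the sed expression is explicitly marked
        -r|-l|-u|-rl|-lr|-ru|-ur) FLAGS+=("$1"); shift ;;
        *) TARGETS+=("$1"); shift ;;                 # everything else is a dir/file name
    esac
done
With the expression marked by -e there is nothing left to guess, and a path like tmp/users/regions can never be mistaken for a pattern.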

Related

Script parameters in Bash

I'm trying to make a shell script which should be used like this:
ocrscript.sh -from /home/kristoffer/test.png -to /home/kristoffer/test.txt
The script will then OCR-convert the image file to a text file. Here is what I have come up with so far:
#!/bin/bash
export HOME=/home/kristoffer
/usr/local/bin/abbyyocr9 -rl Swedish -if ???fromvalue??? -of ???tovalue??? 2>&1
But I don't know how to get the -from and -to values. Any ideas on how to do it?
The arguments that you provide to a bash script will appear in the variables $1, $2, $3 and so on, where the number refers to the argument's position. $0 is the command itself.
The arguments are separated by spaces, so if you provide the -from and -to in the command, they will end up in these variables too, so for this:
./ocrscript.sh -from /home/kristoffer/test.png -to /home/kristoffer/test.txt
You'll get:
$0 # ocrscript.sh
$1 # -from
$2 # /home/kristoffer/test.png
$3 # -to
$4 # /home/kristoffer/test.txt
It might be easier to omit the -from and the -to, like:
ocrscript.sh /home/kristoffer/test.png /home/kristoffer/test.txt
Then you'll have:
$1 # /home/kristoffer/test.png
$2 # /home/kristoffer/test.txt
The downside is that you'll have to supply the arguments in the right order. There are libraries that make it easier to parse named arguments on the command line, but for simple shell scripts the positional approach is usually fine.
Then you can do:
/usr/local/bin/abbyyocr9 -rl Swedish -if "$1" -of "$2" 2>&1
The double quotes around the $1 and the $2 are not always necessary but are advised, because some strings won't work if you don't put them between double quotes.
If you're not completely attached to using "from" and "to" as your option names, it's fairly easy to implement this using getopts:
while getopts f:t: opts; do
    case ${opts} in
        f) FROM_VAL=${OPTARG} ;;
        t) TO_VAL=${OPTARG} ;;
    esac
done
getopts is a shell builtin that processes command line arguments and conveniently parses them for you.
f:t: specifies that you're expecting 2 parameters that contain values (indicated by the colon). Something like f:t:v says that -v will only be interpreted as a flag.
opts is where the current parameter is stored. The case statement is where you will process this.
${OPTARG} contains the value following the parameter. ${FROM_VAL} for example will get the value /home/kristoffer/test.png if you ran your script like:
ocrscript.sh -f /home/kristoffer/test.png -t /home/kristoffer/test.txt
As the others are suggesting, if this is your first time writing bash scripts you should really read up on some basics. This was just a quick tutorial on how getopts works.
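Putting the pieces together, a minimal sketch of the whole script could look like this (the usage message and the empty-value checks are my additions, not part of the original answer):
#!/bin/bash
# Parse -f (input image) and -t (output text file) with getopts.
while getopts f:t: opts; do
    case ${opts} in
        f) FROM_VAL=${OPTARG} ;;
        t) TO_VAL=${OPTARG} ;;
        *) echo "Usage: $0 -f <input.png> -t <output.txt>" >&2; exit 1 ;;
    esac
done

if [ -z "$FROM_VAL" ] || [ -z "$TO_VAL" ]; then
    echo "Usage: $0 -f <input.png> -t <output.txt>" >&2
    exit 1
fi

/usr/local/bin/abbyyocr9 -rl Swedish -if "$FROM_VAL" -of "$TO_VAL" 2>&1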
Use the variables "$1", "$2", "$3" and so on to access arguments. To access all of them you can use "$@", or to get the count of arguments $# (might be useful to check for too few or too many arguments).
I needed to make sure that my scripts are entirely portable between various machines, shells, and even cygwin versions. Further, my colleagues, who were the ones I had to write the scripts for, are programmers, so I ended up using this:
for ((i=1; i<=$#; i++)); do
    if [ "${!i}" = "-s" ]; then
        ((i++))
        var1=${!i}
    elif [ "${!i}" = "-log" ]; then
        ((i++))
        logFile=${!i}
    elif [ "${!i}" = "-x" ]; then
        ((i++))
        var2=${!i}
    elif [ "${!i}" = "-p" ]; then
        ((i++))
        var3=${!i}
    elif [ "${!i}" = "-b" ]; then
        ((i++))
        var4=${!i}
    elif [ "${!i}" = "-l" ]; then
        ((i++))
        var5=${!i}
    elif [ "${!i}" = "-a" ]; then
        ((i++))
        var6=${!i}
    fi
done
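An invocation of a script using that loop might look like this (file names invented for the example); the flags can appear in any order:
./myscript.sh -log run.log -s input.dat -p 8080
# afterwards: var1=input.dat  logFile=run.log  var3=8080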
Rationale: I included a launcher.sh script as well, since the whole operation had several steps that were quasi-independent of each other (I say "quasi" because even though each script could be run on its own, they were usually all run together). Within two days I found out that about half of my colleagues, being programmers and all, were too good to use the launcher file, follow the "usage", or read the HELP that was displayed every time they did something wrong, and they were making a mess of the whole thing, running scripts with arguments in the wrong order and complaining that the scripts didn't work properly. Being the choleric I am, I decided to overhaul all my scripts to make sure they are colleague-proof. The code segment above was the first thing.
In bash, $1 is the first argument passed to the script, $2 the second, and so on.
/usr/local/bin/abbyyocr9 -rl Swedish -if "$1" -of "$2" 2>&1
So you can use:
./your_script.sh some_source_file.png destination_file.txt
Explanation of double quotes:
consider three scripts:
# foo.sh
bash bar.sh $1
# foo2.sh
bash bar.sh "$1"
# bar.sh
echo "1-$1" "2-$2"
Now invoke:
$ bash foo.sh "a b"
1-a 2-b
$ bash foo2.sh "a b"
1-a b 2-
When you invoke foo.sh "a b", it invokes bar.sh a b (two arguments), while foo2.sh "a b" invokes bar.sh "a b" (one argument). Always keep in mind how parameters are passed and expanded in bash; it will save you a lot of headache.

I keep getting a 'while syntax' error on the output of the at job in unix and I have no idea why

#!/usr/dt/bin/dtksh
while getopts w:m: option
do
    case $option in
        w) wflag=1
           wval="$OPTARG";;
        m) mflag=1
           mval="$OPTARG";;
        ?) printf 'BAD\n' $0
           exit 2;;
    esac
done
if [ ! -z "$wflag" ]; then
    printf "W and -w arg is $wval\n"
fi
if [ ! -z "$mflag" ]; then
    printf "M and -m arg is $mval\n"
fi
shift $(($OPTIND - 1))
printf "Remaining arguments are: $* \n"
at $wval <<ENDMARKER
echo $* >> Search_List
tr " " "\n" <Search_List >Usr_List
while true; do
    if [ -s Usr_List ]; then
        for i in $(cat Usr_List); do
            if finger -m | grep $i; then
                echo '$i is online' | elm user
                sed '/$i/d' <Usr_List >tmplist
                mv tmplist Usr_List
            fi
        done
    else
        break
    fi
done
ENDMARKER
Essentially I want to keep searching through until it is empty. Each time an element of the list is found, it is deleted. Once the list is empty quit.
There are no error messages when I first run the command, it only shows up in an email containing the output of the at job.
Thanks in advance for any advice
EDIT: The script uses getopts and takes one argument for -w and one for -m, the w value is set as the time for the at job, the m still has to be used. Any arguments after the one for m are sent to a file called Search_List, Search_List is edited and saved as Usr_List. Then in the while loop, while Usr_List is not empty, the script checks the results of finger -m against the names in Usr_List. If a name is found, it is removed from Usr_List. Once Usr_List is empty, the program should stop.
elm is a way to send an email, so elm user sends an email to user.
The error is:
while: Expression syntax
at uses /bin/sh by default.
at now <<ENDMARKER
<code here>
ENDMARKER
All of this executes under /bin/sh, which on some systems can be Bourne Shell (Solaris for example).
You need to figure out what /bin/sh is on your system, then modify things accordingly. Also, read up on what is and is not guaranteed to be in your "at" environment; I think the problem lies there. You have both the unix and linux tags, so I cannot give much more help than that.
You can enable logging -- the way YOU need it -- of the at code chunk:
exec > /tmp/somefile.log 2>&1
Then write debugging messages to stdout or stderr.
Your HEREDOC is being interpolated. Try quoting the delimiter:
at $wval << 'ENDMARKER'
Although (I haven't looked closely) it appears that you want some interpolation. But you definitely do not want it on the line in which you reference $i, so quote that $ if you do not quote the entire heredoc:
if finger -m | grep \$i; then
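A quick way to see the difference, using plain cat instead of at just to show the interpolation:
i=EXPANDED
cat <<ENDMARKER
unquoted delimiter: $i
ENDMARKER
cat <<'ENDMARKER'
quoted delimiter: $i
ENDMARKER
The first heredoc prints "unquoted delimiter: EXPANDED", the second prints "quoted delimiter: $i" untouched, which is what you want for the lines that must reach the at job verbatim.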
You need to pass the -k option to at:
...
at -k $wval <<ENDMARKER
...
at otherwise defaults to your login shell, which is csh or one of its derivatives.
It turns out that the while command and the if command needed to be combined.
while [[ -s Usr_List ]]; do
......
done

Prevent ssh from breaking up shell script parameters

I have a script which is essentially a wrapper around an executable of the same name on a different machine. For the sake of example, I'll wrap printf here. My current script looks like this:
#!/bin/bash
ssh user@hostname.tld. printf "$@"
Unfortunately, this breaks when one of the arguments contains a space; e.g. I'd expect the following commands to give identical output:
~$ ./wrap_printf "%s_%s" "hello world" "1"
hello_world1_
~$ printf "%s_%s" "hello world" "1"
hello world_1
The problem gets even worse when (escaped) newlines are involved. How would I properly escape my arguments here?
Based on the answer from Peter Lyons, but also allow quotes inside arguments:
#!/bin/bash
QUOTE_ARGS=''
for ARG in "$@"
do
    ARG=$(printf "%q" "$ARG")
    QUOTE_ARGS="${QUOTE_ARGS} $ARG"
done
ssh user@hostname.tld. "printf ${QUOTE_ARGS}"
This works for everything I've tested so far, except newlines:
$ /tmp/wrap_printf "[-%s-]" "hello'\$t\""
[-hello'$t"-]
#!/bin/sh
QUOTE_ARGS=''
for ARG in "$@"
do
    QUOTE_ARGS="${QUOTE_ARGS} '${ARG}'"
done
ssh user@hostname.tld. "${QUOTE_ARGS}"
This works for spaces. It doesn't work if the argument has an embedded single quote.
Getting quoting right is pretty hard and doing it in bash (in a general and robust way) almost impossible.
Use Perl:
#!/usr/bin/perl
use Net::OpenSSH;
my $ssh = Net::OpenSSH->new('user@hostname');
$ssh->system('printf', @ARGV);
Based on the answers from Koert and Peter Lyons, here is a wrapper for ssh; I call it "sshsystem". (Also available at https://gist.github.com/4672115)
#!/bin/bash
# quote command in ssh call to prevent remote side from expanding any arguments
# uses bash printf %q for quoting - no idea how compatible this is with other shells.
# http://stackoverflow.com/questions/6592376/prevent-ssh-from-breaking-up-shell-script-parameters
sshargs=()
while (( $# > 0 )); do
    case "$1" in
        -[1246AaCfgKkMNnqsTtVvXxYy])
            # simple argument
            sshargs+=("$1")
            shift
            ;;
        -[bcDeFIiLlmOopRSWw])
            # argument with parameter
            sshargs+=("$1")
            shift
            if (( $# == 0 )); then
                echo "missing second part of long argument" >&2
                exit 99
            fi
            sshargs+=("$1")
            shift
            ;;
        -[bcDeFIiLlmOopRSWw]*)
            # argument with parameter appended without space
            sshargs+=("$1")
            shift
            ;;
        --)
            # end of arguments
            sshargs+=("$1")
            shift
            break
            ;;
        -*)
            echo "unrecognized argument: '$1'" >&2
            exit 99
            ;;
        *)
            # end of arguments
            break
            ;;
    esac
done
# user@host
sshargs+=("$1")
shift
# command - quote
if (( $# > 0 )); then
    # no need to make COMMAND an array - ssh will merge it anyway
    COMMAND=
    while (( $# > 0 )); do
        arg=$(printf "%q" "$1")
        COMMAND="${COMMAND} ${arg}"
        shift
    done
    sshargs+=("${COMMAND}")
fi
exec ssh "${sshargs[@]}"
The easiest and quickest is to just use Bash's Quoting Parameter Transformation: ${parameter@Q}. This can be applied automatically during array expansion with ${array[@]@Q}, but when using the builtin argument array, the name and the brackets are dropped, so it becomes ${@@Q}. Therefore the original script only needs 4 characters added to it to work.
#!/bin/bash
ssh user@hostname.tld. printf "${@@Q}"
Now any escaping will work, even terminal colors like this:
./wrap_printf "%s\e[39m\e[49m\n" $'\e[30m\e[42mBlack on Green' "Just Normal Text"
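The ${parameter@Q} transformation needs a reasonably recent bash (4.4 or later, as far as I know). A tiny demonstration of what it produces:
$ set -- "hello world" '$HOME'
$ printf '%s\n' "${@@Q}"
'hello world'
'$HOME'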

Issue in getopt, Unix shell script

Hi, can someone fix this issue? I am not able to get the output of -p.
#!/bin/bash
args=`getopt c:m:p $*`
if [ $? != 0 -o $# == 0 ]
then
    echo 'Usage: -c <current-dir> -m <my dir> -p <argument>'
    exit 1
fi
set -- $args
for i
do
    case "$i" in
        -c) shift;CURRDIR=$1;shift;shift ;;
        -m) MYDIR=$1;shift;;
        -p) ARGVAL=$OPTARG;;
    esac
done
echo "CURRDIR = $CURRDIR"
echo "MYDIR = $MYDIR"
echo "ARGVAL = $ARGVAL"
./1.sh -c "def" -m "ref" -p "ref -k ref"
Expected output:
-c = "def"
-m = "ref"
-p = "ref -k ref"
getopt
args=`getopt c:m:p $*`
You need to add a colon after the p to indicate that -p takes an argument. Also you should change $* to "$@" for better handling of spaces.
args=`getopt c:m:p: "$@"`
You are also mixing up getopt and getopts. $OPTARG is a getopts feature. With plain getopt and set you should simply use $2 and then shift off the argument.
-p) ARGVAL=$2; shift 2;;
At this point you've done as good as you can with getopt. Unfortunately it doesn't handle the multi-word argument to -p no matter what you do. For that, we need to use getopts.
getopts
From getopt and getopts:
Easier to use and generally better than getopt, though of course not available in csh-like shells. You shouldn't be using those anyway.
This works rather differently than "getopt". First, because it's a built-in, you usually won't find a separate man page for it, though "help getopts" may give you what you need.
The old "getopt" is called once, and it modifies the environment as we saw above. The builtin "getopts" is called each time you want to process an argument, and it doesn't change the original arguments .
Using getopts is a lot simpler. Your entire loop can be simplified to this:
while getopts c:m:p: flag
do
    case "$flag" in
        c) CURRDIR=$OPTARG;;
        m) MYDIR=$OPTARG;;
        p) ARGVAL=$OPTARG;;
    esac
done
No shifting needed, you just read $OPTARG each time to get each option's value.
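Assuming the echo lines at the end of the original script are kept, the run from the question should then produce the expected output:
$ ./1.sh -c "def" -m "ref" -p "ref -k ref"
CURRDIR = def
MYDIR = ref
ARGVAL = ref -k ref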

How to properly handle wildcard expansion in a bash shell script?

#!/bin/bash
hello()
{
    SRC=$1
    DEST=$2
    for IP in `cat /opt/ankit/configs/machine.configs` ; do
        echo $SRC | grep '*' > /dev/null
        if test `echo $?` -eq 0 ; then
            for STAR in $SRC ; do
                echo -en "$IP"
                echo -en "\n\t ARG1=$STAR ARG2=$2\n\n"
            done
        else
            echo -en "$IP"
            echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
        fi
    done
}
hello $1 $2
The above is the shell script to which I provide the source (SRC) and destination (DEST) paths. It worked fine as long as I did not pass a SRC path containing a wildcard '*'. When I run the script with '*.pdf' or '*' as follows:
root@ankit1:~/as_prac# ./test.sh /home/dev/Examples/*.pdf /ankit_test/as
I get the following output:
192.168.1.6
ARG1=/home/dev/Examples/case_Contact.pdf ARG2=/home/dev/Examples/case_howard_county_library.pdf
The DEST is /ankit_test/as, but DEST also gets mangled because of the '*'. The expected answer is:
ARG1=/home/dev/Examples/case_Contact.pdf ARG2=/ankit_test/as
So, if you understand what I am trying to do, please help me solve this bug.
I'll be grateful to you.
Thanks in advance!!!
I need to know exactly how to use '*.pdf' in my program, one file at a time, without disturbing DEST.
Your script needs more work.
Even after escaping the wildcard, you won't get your expected answer. You will get:
ARG1=/home/dev/Examples/*.pdf ARG2=/ankit__test/as
Try the following instead:
for IP in `cat /opt/ankit/configs/machine.configs`
do
    for i in $SRC
    do
        echo -en "$IP"
        echo -en "\n\t ARG1=$i ARG2=$DEST\n\n"
    done
done
Run it like this:
root@ankit1:~/as_prac# ./test.sh "/home/dev/Examples/*.pdf" /ankit__test/as
The shell will expand wildcards unless you escape them, so for example if you have
$ ls
one.pdf two.pdf three.pdf
and run your script as
./test.sh *.pdf /ankit__test/as
it will be the same as
./test.sh one.pdf two.pdf three.pdf /ankit__test/as
which is not what you expect. Doing
./test.sh \*.pdf /ankit__test/as
should work.
If you can, change the order of the parameters passed to your shell script as follows:
./test.sh /ankit_test/as /home/dev/Examples/*.pdf
That would make your life a lot easier since the variable part moves to the end of the line. Then, the following script will do what you want:
#!/bin/bash
hello()
{
    SRC=$1
    DEST=$2
    for IP in `cat /opt/ankit/configs/machine.configs` ; do
        echo -en "$IP"
        echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
    done
}
arg2=$1
shift
while [[ "$1" != "" ]] ; do
    hello $1 $arg2
    shift
done
You are also missing a final "done" to close your outer for loop.
OK, this appears to do what you want:
#!/bin/bash
hello() {
    SRC=$1
    DEST=$2
    while read IP ; do
        for FILE in $SRC; do
            echo -e "$IP"
            echo -e "\tARG1=$FILE ARG2=$DEST\n"
        done
    done < /tmp/machine.configs
}
hello "$1" $2
You still need to escape any wildcard characters when you invoke the script.
The double quotes are necessary when you invoke the hello function; otherwise the mere fact of evaluating $1 causes the wildcard to be expanded, and we don't want that to happen until $SRC is assigned in the function.
Here's what I came up with:
#!/bin/bash
hello()
{
    # DEST will contain the last argument
    eval DEST=\$$#
    while [ $1 != $DEST ]; do
        SRC=$1
        for IP in `cat /opt/ankit/configs/machine.configs`; do
            echo -en "$IP"
            echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
        done
        shift || break
    done
}
hello $*
Instead of passing only two parameters to the hello() function, we'll pass in all the arguments that the script got.
Inside the hello() function, we first assign the final argument to the DEST var. Then we loop through all of the arguments, assigning each one to SRC, and run whatever commands we want using the SRC and DEST arguments. Note that you may want to put quotation marks around $SRC and $DEST in case they contain spaces. We stop looping when SRC is the same as DEST because that means we've hit the final argument (the destination).
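If you would rather avoid eval, the same idea can be written with bash's indirect reference to the last parameter and an array slice. This is my variation, not part of the answer above:
#!/bin/bash
hello()
{
    DEST=${!#}                       # last argument
    for SRC in "${@:1:$#-1}"; do     # every argument before the last one
        while read -r IP; do
            echo -en "$IP"
            echo -en "\n\t ARG1=$SRC ARG2=$DEST\n\n"
        done < /opt/ankit/configs/machine.configs
    done
}
hello "$@"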
For multiple input files using a wildcard such as *.txt, I found this to work perfectly; no escaping required. It behaves just like a native tool such as "ls" or "rm". This wasn't documented anywhere I could find, and since I spent the better part of 3 days figuring it out, I decided to post it for future readers.
Directory contains the following files (output of ls)
file1.txt file2.txt file3.txt
Run script like
$ ./script.sh *.txt
Or even like
$ ./script.sh file{1..3}.txt
The script
#!/bin/bash
# store default IFS, we need to temporarily change this
sfi=$IFS
# set IFS to $'\n' - newline
IFS=$'\n'
if [[ $# -eq 0 ]]
then
    echo "Error: Missing required argument"
    echo
    exit 1
fi
# Put the file glob into an array
file=("$@")
# Now loop through them
for (( i=0 ; i < ${#file[*]} ; i++ ));
do
    if [ -w "${file[$i]}" ]; then
        echo "${file[$i]}" " writable"
    else
        echo "${file[$i]}" " NOT writable"
    fi
done
# Reset IFS to its default value
IFS=$sfi
The output
file1.txt writable
file2.txt writable
file3.txt writable
The key was switching the IFS (Internal Field Separator) temporarily. You have to be sure to store this before switching and then switch it back when you are done with it as demonstrated above.
Now you have a list of expanded files (with their spaces preserved) in the file[] array, which you can then loop through. I like this solution the best; it is the easiest to program for and the easiest for the users.
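For comparison, the same loop can be written without touching IFS at all, as long as every expansion is quoted; a minimal sketch:
#!/bin/bash
if [ $# -eq 0 ]; then
    echo "Error: Missing required argument" >&2
    exit 1
fi

for file in "$@"; do
    if [ -w "$file" ]; then
        echo "$file writable"
    else
        echo "$file NOT writable"
    fi
done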
There's no need to spawn a shell to look at the $? variable, you can evaluate it directly.
It should just be:
if [ $? -eq 0 ]; then
You're running
./test.sh /home/dev/Examples/*.pdf /ankit_test/as
and your interactive shell is expanding the wildcard before the script gets it. You just need to quote the first argument when you launch it, as in
./test.sh "/home/dev/Examples/*.pdf" /ankit_test/as
and then, in your script, quote "$SRC" anywhere you literally want the pattern with the wildcards (i.e., when you do echo $SRC, instead use echo "$SRC") and leave it unquoted when you want the wildcards expanded. Basically, always put quotes around things that might contain shell metacharacters unless you want the metacharacters interpreted. :)
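A one-line illustration of that last point, using the file names from the question (and assuming those two PDFs are the only ones in the directory):
$ SRC="/home/dev/Examples/*.pdf"
$ echo $SRC          # unquoted: the shell expands the glob here
/home/dev/Examples/case_Contact.pdf /home/dev/Examples/case_howard_county_library.pdf
$ echo "$SRC"        # quoted: the pattern is passed through literally
/home/dev/Examples/*.pdf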
