Parameter list with double quotes does not pass through properly in Bash

I have a Bash script that calls another Bash script. The called script does some modification and checking on a few things, shifts, and then passes the rest of the caller's command line through.
In the called script, I have verified that I have everything managed and ready to call. Here's some debug-style code I've put in:
echo $SVN $command $@ > /tmp/shimcmd
bash /tmp/shimcmd
$SVN $command $@
Now, in /tmp/shimcmd you'll see:
svn commit --username=myuser --password=mypass --non-interactive --trust-server-cert -m "Auto Update autocommit Wed Apr 11 17:33:37 CDT 2012"
That is, the built command, all on one line, perfectly fine, including a -m "my string with spaces" portion.
It's perfect. And the "bash /tmp/shimcmd" execution of it works perfectly as well.
But of course I don't want this silly tmp file and such (only used it to debug). The problem is that calling the command directly, instead of via the shim file:
$SVN $command $@
results in the svn command itself NOT receiving the quoted string with spaces--it garbles the '-m "my string with spaces"' parameter and shanks the command as if it was passed as '-m my string with spaces'.
I have tried all manner of crazy escape methods to no avail. Can't believe it's dogging me this badly. Again, by echoing the very same thing ($SVN $command $@) to a file and then executing that file, it's FINE. But calling directly garbles the quoted string. That element alone shanks.
Any ideas?
Dan

Did you try:
eval "$SVN $command $#"
?
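A minimal sketch of why eval helps here (variable names follow the question; printf stands in for svn): without eval, the double quotes inside $command are just literal characters, but eval re-parses the assembled string so they act as quoting again.
cmd='-m "my string with spaces"'
printf '<%s> ' $cmd ; echo        # <-m> <"my> <string> <with> <spaces">
eval "printf '<%s> ' $cmd" ; echo # <-m> <my string with spaces>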

Here's a way to demonstrate the problem:
$ args='-m "foo bar"'
$ printf '<%s> ' $args
<-m> <"foo> <bar">
And here's a way to avoid it:
$ args=( -m "foo bar" )
$ printf '<%s> ' "${args[@]}"
<-m> <foo bar>
In this latter case, args is an array, not a quoted string.
Note, by the way, that it has to be "${args[@]}" with the double quotes, not bare ${args[@]}, to get this behavior (in which string-splitting is avoided in favor of respecting the array entries' boundaries).
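Applied back to the question, a sketch of the array approach ($SVN and the options are inlined from the question's debug output, for illustration only):
SVN=/usr/bin/svn
svn_args=( commit --username=myuser --password=mypass --non-interactive
--trust-server-cert -m "Auto Update autocommit $(date)" )
"$SVN" "${svn_args[@]}"   # -m's message survives as a single argument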

Try this:
echo -n -e $SVN \"$command\" > /tmp/shimcmd
for x in "$#"
do
a=$a" "\"$x\"
done
echo -e " " $a >> /tmp/shimcmd
bash /tmp/shimcmd
or simply
$SVN "$command" "$#"


Bash discards command line arguments when passing to another bash shell

I have a big script (call it test) that, after stripping out the unrelated parts, comes down to just this, with which I can explain my question:
#!/bin/bash
bash -c "$#"
This doesn't work as expected. E.g. ./test echo hi executes only the echo and the argument disappears!
Testing with various inputs, I can see that only $1 is passed to bash -c ... and the rest are discarded.
But if I use a variable like:
#!/bin/bash
cmd="$#"
bash -c "$cmd"
it works as expected for all inputs.
Questions:
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
For example:
bash -c "ls" -l -a hi hello blah
simply runs ls, and the extra arguments -l -a hi hello blah don't result in any errors at all?
(If possible, please refer to the bash grammar where this behaviour is documented).
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
From info bash, on the special parameter @:
($@) Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" ....
Thus, bash -c "$@" is equivalent to bash -c "$1" "$2" .... In the case of the ./test echo hi invocation, the expression is expanded to
bash -c "echo" "hi"
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
Bash actually doesn't discard anything. From man bash:
If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
Thus, for the command bash -c "echo" "hi", Bash passes "hi" as $0 for the "echo" script.
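A quick check at the prompt confirms this (the single quotes keep $0 for the inner shell):
$ bash -c 'echo "$0"' hi
hi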
bash -c "ls" -l -a hi hello blah
simply runs ls, and the extra arguments -l -a hi hello blah don't result in any errors at all?
According to the rules mentioned above, Bash executes "ls" script and passes the following positional parameters to this script:
$0: "-l"
$1: "-a"
$2: "hi"
$3: "hello"
$4: "blah"
Thus, the command actually executes ls, and the positional parameters are unused in the script. You can use them by referencing the positional parameters, e.g.:
$ set -x
$ bash -c "ls \$0 \$1 \$3" -l -a hi hello blah
+ bash -c 'ls $0 $1 $3' -l -a hi hello blah
ls: cannot access hello: No such file or directory
You should be using $* instead of $@ to pass the command line as a string: "$@" expands to multiple quoted arguments, while "$*" combines multiple arguments into a single argument.
#!/bin/bash
bash -c "$*"
The problem with your $@ version is that it executes:
bash -c echo hi
But with $* it executes:
bash -c 'echo hi'
When you use:
cmd="$#"
and use: bash -c "$cmd" it does the same thing for you.
Read: What is the difference between "$@" and "$*" in Bash?
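One subtlety behind the intermediate-variable version: in an assignment, "$@" cannot expand to multiple words, so it is joined with spaces (under the default IFS), just like "$*". A sketch:
set -- echo hi there
cmd="$@"          # assignment context: joined into the single string 'echo hi there'
bash -c "$cmd"    # prints: hi there
bash -c "$*"      # prints: hi there -- equivalent here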

bash passing strings to "gnome-terminal -e"

This question looks like Opening multiple tabs in gnome terminal with complex commands from a cycle, but I am looking for a more generic solution.
I have a C program that calls a script "xvi" with arguments. Each argument is originally enclosed in single quotes (') and each single quote inside an argument is isolated and backslashed (this format is a prerequisite), e.g.:
xvi 'a file' 'let'\''s try another'
The script xvi must launch gnome-terminal with "-e vim args"
With xterm instead of gnome-terminal, this is easy because xterm assumes that "-e" is the last argument and passes all the tail to the shell, so the following is OK:
exec /usr/bin/xterm -e /usr/bin/vim "$@"
For gnome-terminal, "-e" is an option among others, and we need to 'package' the whole command line in one argument. This is what I have done, which is OK: enclose each argument within double quotes (\"arg\") and backslash any double quote within an argument:
cmd="/usr/bin/vim"
while [ "$1" != "" ] ; do
arg=`echo "$1" | sed -e 's/\"/\\\"/g'`
cmd="$cmd \"$arg\""
shift
done
exec gnome-terminal --zoom=0.9 --disable-factory -e "$cmd"
Again, this works fine and I am nearly happy with that.
Question: Is there any nicer solution, avoiding the loop?
Thanks
Untested, but you could probably finagle printf '%q' into doing the job:
exec gnome-terminal --zoom=0.9 --disable-factory -e "$(printf '%q ' "$@")"
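To see what %q produces, an illustrative check at the prompt; the result is one shell-parseable string, which is exactly what gnome-terminal's -e wants:
$ printf '%q ' /usr/bin/vim 'a file' "let's try another"
/usr/bin/vim a\ file let\'s\ try\ another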
I know this thread is old but recently I had a similar need and I created a bash script to launch multiple tabs and run different commands on each of them:
#!/bin/bash
# Array of commands to run in different tabs
commands=(
'tail -f /var/log/apache2/access.log'
'tail -f /var/log/apache2/error.log'
'tail -f /usr/local/var/postgres/server.log'
)
# Build final command with all the tabs to launch
set finalCommand=""
for (( i = 0; i < ${#commands[#]}; i++ )); do
export finalCommand+="--tab -e 'bash -c \"${commands[$i]}\"' "
done
# Run the final command
eval "gnome-terminal "$finalCommand
You just need to add your commands in the array and execute.
Gist link: https://gist.github.com/rollbackpt/b4e17e2f4c23471973e122a50d591602

I keep getting a 'while syntax' error on the output of the at job in unix and I have no idea why

#!/usr/dt/bin/dtksh
while getopts w:m: option
do
case $option in
w) wflag=1
wval="$OPTARG";;
m) mflag=1
mval="$OPTARG";;
?) printf 'BAD\n' $0
exit 2;;
esac
done
if [ ! -z "$wflag" ]; then
printf "W and -w arg is $wval\n"
fi
if [ ! -z "$mflag" ]; then
printf "M and -m arg is $mval\n"
fi
shift $(($OPTIND - 1))
printf "Remaining arguments are: $* \n"
at $wval <<ENDMARKER
echo $* >> Search_List
tr " " "\n" <Search_List >Usr_List
while true; do
if [ -s Usr_List ]; then
for i in $(cat Usr_List); do
if finger -m | grep $i; then
echo '$i is online' | elm user
sed '/$i/d' <Usr_List >tmplist
mv tmplist Usr_List
fi
done
else
break
fi
done
ENDMARKER
Essentially I want to keep searching through until it is empty. Each time an element of the list is found, it is deleted. Once the list is empty quit.
There are no error messages when I first run the command, it only shows up in an email containing the output of the at job.
Thanks in advance for any advice
EDIT: The script uses getopts and takes one argument for -w and one for -m, the w value is set as the time for the at job, the m still has to be used. Any arguments after the one for m are sent to a file called Search_List, Search_List is edited and saved as Usr_List. Then in the while loop, while Usr_List is not empty, the script checks the results of finger -m against the names in Usr_List. If a name is found, it is removed from Usr_List. Once Usr_List is empty, the program should stop.
elm is a way to send an email, so elm user sends an email to user.
The error is :
while: Expression syntax
at uses /bin/sh by default.
at now <<ENDMARKER
<code here>
ENDMARKER
All of this executes under /bin/sh, which on some systems can be Bourne Shell (Solaris for example).
You need to figure out what /bin/sh is for your system, then modify things accordingly. Plus, read the guarantees about what is and what is not in your "at" environment. I think the problem lies there. You have both UNIX and Linux tags, so I cannot give a lot more help than that.
You can enable logging -- the way YOU need it -- of the at code chunk:
exec 2&>1 > /tmp/somefile.log
Then write debugging messages to stdout or stderr.
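For example, a sketch of instrumenting the at job this way (the log path is illustrative; the quoted delimiter keeps the job text literal):
at now <<'ENDMARKER'
exec >/tmp/atjob.log 2>&1
echo "running as $(id -un) with PATH=$PATH"
ENDMARKER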
Your HEREDOC is being interpolated. Try quoting the delimiter:
at $wval << 'ENDMARKER'
Although (I haven't looked closely) it appears that you want some interpolation. But you definitely do not want it on the line in which you reference $i, so quote that $ if you do not quote the entire heredoc:
if finger -m | grep \$i; then
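A minimal illustration of the difference: the first heredoc prints world, the second prints the literal text $name.
name=world
cat <<EOF
$name
EOF
cat <<'EOF'
$name
EOF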
You need to pass the -k option to at:
...
at -k $wval <<ENDMARKER
...
at is otherwise defaulting to your login shell which is csh or one of its derivatives.
It turns out that the while command and the if command needed to be combined.
while [[ -s Usr_List ]]; do
......
done

How can I preserve quotes in printing a bash script's arguments

I am making a bash script that will print and pass complex arguments to another external program.
./script -m root@hostname,root@hostname -o -q -- 'uptime ; uname -a'
How do I print the raw arguments as such:
-m root@hostname,root@hostname -o -q -- 'uptime ; uname -a'
Using $@ and $* removes the single quotes around uptime ; uname -a which could cause undesired results. My script does not need to parse each argument. I just need to print / log the argument string and pass them to another program exactly how they are given.
I know I can escape the quotes with something like "'uptime ; uname -a'" but I cannot guarantee the user will do that.
The quotes are removed before the arguments are passed to your script, so it's too late to preserve them. What you can do is preserve their effect when passing the arguments to the inner command, and reconstruct an equivalent quoted/escaped version of the arguments for printing.
For passing the arguments to the inner command, use "$@" -- with the double quotes, $@ preserves the original word breaks, meaning that the inner command receives exactly the same argument list that your script did.
For printing, you can use the %q format in bash's printf command to reconstruct the quoting. Note that this won't always reconstruct the original quoting, but will construct an equivalent quoted/escaped string. For example, if you passed the argument 'uptime ; uname -a' it might print uptime\ \;\ uname\ -a or "uptime ; uname -a" or any other equivalent (see @William Pursell's answer for similar examples).
Here's an example of using these:
printf "Running command:"
printf " %q" innercmd "$#" # note the space before %q -- this inserts spaces between arguments
printf "\n"
innercmd "$#"
If you have bash version 4.4 or later, you can use the @Q modifier on parameter expansions to add quoting. This tends to prefer single quotes (as opposed to printf %q's preference for escapes). You can combine this with $* to get a reasonable result:
echo "Running command: innercmd ${*#Q}"
innercmd "$#"
Note that $* mashes all arguments together into a single string with whitespace between them, which is normally not useful, but in this case each argument is individually quoted so the result is actually what you (probably) want. (Well, unless you changed IFS, in which case the "whitespace" between arguments will be the first character of $IFS, which may not be what you want.)
Use ${@@Q} for a simple solution. To test, put the lines below in a script bigQ.
#!/bin/bash
line="${##Q}"
echo $line
./bigQ 1 a "4 5" b="6 7 8"
'1' 'a' '4 5' 'b=6 7 8'
If the user invokes your command as:
./script 'foo'
the first argument given to the script is the string foo without the quotes. There is no way for your script to differentiate between that and any of the other methods by which it could get foo as an argument (eg ./script $(echo foo) or ./script foo or ./script "foo" or ./script \f\o""''""o).
If you want to print the argument list as close as possible to what the user probably entered:
#!/bin/bash
chars='[ !"#$&()*,;<>?\^`{|}]'
for arg
do
if [[ $arg == *"'"* ]]
then
arg=\""$arg"\"
elif [[ $arg == *$chars* ]]
then
arg="'$arg'"
fi
allargs+=("$arg") # ${allargs[#]} is to be used only for printing
done
printf '%s\n' "${allargs[*]}"
It's not perfect. An argument like ''\''"' is more difficult to accommodate than is justified.
As someone else already mentioned, when you access the arguments inside your script, it's too late to know which arguments were quoted when it was called. However, you can re-quote the arguments that contain spaces or other special characters that would need quoting when passed as parameters.
Here is a Bash implementation based on Python's shlex.quote(s) that does just that:
function quote() {
declare -a params
for param; do
if [[ -z "${param}" || "${param}" =~ [^A-Za-z0-9_#%+=:,./-] ]]; then
params+=("'${param//\'/\'\"\'\"\'}'")
else
params+=("${param}")
fi
done
echo "${params[*]}"
}
Your example slightly changed to show empty arguments:
$ quote -m root@hostname,root@hostname -o -q -- 'uptime ; uname -a' ''
-m root@hostname,root@hostname -o -q -- 'uptime ; uname -a' ''
In my case, I needed to call a script like script --argument="--arg-inner=1 --arg-inner2".
Unfortunately, none of the solutions above helped in my case.
The definitive solution was:
#!/bin/bash
# Fix given array argument quotation
function quote() {
local QUOTED_ARRAY=()
for ARGUMENT; do
case ${ARGUMENT} in
--*=*)
QUOTED_ARRAY+=( "${ARGUMENT%%=*}=$(printf "%q" "${ARGUMENT#*=}")" )
shift
;;
*)
QUOTED_ARRAY+=( "$(printf " %q" "${ARGUMENT}")" )
;;
esac
done
echo ${QUOTED_ARRAY[@]}
}
ARGUMENTS="$(quote "${#}")"
echo "${ARGUMENTS}"
The result in the case of MacOS is --argument=--arg-inner=1\ --arg-inner2 which is logically the same.
Just separate each argument using quotes and the NUL character:
#! /bin/bash
sender () {
printf '"%s"\0' "$#"
}
receiver () {
readarray -d '' -t args < <(sender "$@")  # -t strips each trailing NUL delimiter
}
receiver "$#"
As commented by Charles Duffy.
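A hypothetical round trip (note that sender, as written, wraps each argument in literal double quotes, so those quotes appear in the array elements; readarray -d needs bash 4.4+):
receiver -m 'uptime ; uname -a' --flag
declare -p args   # three elements, word breaks intact: "-m", "uptime ; uname -a", "--flag" (each with the added quotes)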

How can I store a command in a variable in a shell script?

I would like to store a command to use at a later time in a variable (not the output of the command, but the command itself).
I have a simple script as follows:
command="ls";
echo "Command: $command"; #Output is: Command: ls
b=`$command`;
echo $b; #Output is: public_html REV test... (command worked successfully)
However, when I try something a bit more complicated, it fails. For example, if I make
command="ls | grep -c '^'";
The output is:
Command: ls | grep -c '^'
ls: cannot access |: No such file or directory
ls: cannot access grep: No such file or directory
ls: cannot access '^': No such file or directory
How could I store such a command (with pipes/multiple commands) in a variable for later use?
Use eval:
x="ls | wc"
eval "$x"
y=$(eval "$x")
echo "$y"
Do not use eval! It has a major risk of introducing arbitrary code execution.
BashFAQ-50 - I'm trying to put a command in a variable, but the complex cases always fail.
Put it in an array and expand all the words with double quotes, "${arr[@]}", so that word splitting (IFS) does not break the words apart.
cmdArgs=()
cmdArgs=('date' '+%H:%M:%S')
and inspect the contents of the array. declare -p lets you see the contents of the array with each command parameter in a separate index. If one such argument contains spaces, the quoting used while adding it to the array will prevent it from getting split by word splitting.
declare -p cmdArgs
declare -a cmdArgs='([0]="date" [1]="+%H:%M:%S")'
and execute the commands as
"${cmdArgs[#]}"
23:15:18
Or, altogether, use a bash function to run the command:
cmd() {
date '+%H:%M:%S'
}
and call the function as just
cmd
POSIX sh has no arrays, so the closest you can come is to build up a list of elements in the positional parameters. Here's a POSIX sh way to run a mail program
# POSIX sh
# Usage: sendto subject address [address ...]
sendto() {
subject=$1
shift
first=1
for addr; do
if [ "$first" = 1 ]; then set --; first=0; fi
set -- "$#" --recipient="$addr"
done
if [ "$first" = 1 ]; then
echo "usage: sendto subject address [address ...]"
return 1
fi
MailTool --subject="$subject" "$@"
}
Note that this approach can only handle simple commands: no redirections, pipelines, for/while loops, if statements, etc.
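A hypothetical invocation (MailTool and the addresses are the answer's stand-ins):
sendto "Build failed" alice@example.com bob@example.com
# runs: MailTool --subject="Build failed" --recipient=alice@example.com --recipient=bob@example.com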
Another common use case is when running curl with multiple header fields and payload. You can always define args like below and invoke curl on the expanded array content
curlArgs=('-H' "keyheader: value" '-H' "2ndkeyheader: 2ndvalue")
curl "${curlArgs[#]}"
Another example,
payload='{}'
hostURL='http://google.com'
authToken='someToken'
authHeader='Authorization:Bearer "'"$authToken"'"'
now that variables are defined, use an array to store your command args
curlCMD=(-X POST "$hostURL" --data "$payload" -H "Content-Type:application/json" -H "$authHeader")
and now do a proper quoted expansion
curl "${curlCMD[#]}"
var=$(echo "asdf")
echo $var
# => asdf
Using this method, the command is immediately evaluated and its output is stored.
stored_date=$(date)
echo $stored_date
# => Thu Jan 15 10:57:16 EST 2015
# (wait a few seconds)
echo $stored_date
# => Thu Jan 15 10:57:16 EST 2015
The same with backtick
stored_date=`date`
echo $stored_date
# => Thu Jan 15 11:02:19 EST 2015
# (wait a few seconds)
echo $stored_date
# => Thu Jan 15 11:02:19 EST 2015
Using eval in the $(...) will not make it evaluated later:
stored_date=$(eval "date")
echo $stored_date
# => Thu Jan 15 11:05:30 EST 2015
# (wait a few seconds)
echo $stored_date
# => Thu Jan 15 11:05:30 EST 2015
Using eval, it is evaluated when eval is used:
stored_date="date" # < storing the command itself
echo $(eval "$stored_date")
# => Thu Jan 15 11:07:05 EST 2015
# (wait a few seconds)
echo $(eval "$stored_date")
# => Thu Jan 15 11:07:16 EST 2015
# ^^ Time changed
In the above example, if you need to run a command with arguments, put them in the string you are storing:
stored_date="date -u"
# ...
For Bash scripts this is rarely relevant, but one last note: be careful with eval. Only eval strings you control, never strings coming from an untrusted user or built from untrusted user input.
For bash, store your command like this:
command="ls | grep -c '^'"
Run your command like this:
echo $command | bash
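A variant of the same idea using a here-string; quoting the variable keeps the stored command line intact:
command="ls | grep -c '^'"
bash <<< "$command"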
Not sure why so many answers make it complicated!
use alias name='string to execute'
example:
alias dir='ls -l'
dir
[pretty list of files]
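One caveat worth adding: bash disables alias expansion in non-interactive shells, so inside a script you would need to enable it first, e.g.:
shopt -s expand_aliases
alias dir='ls -l'
dir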
I tried various different methods:
printexec() {
printf -- "\033[1;37m$\033[0m"
printf -- " %q" "$#"
printf -- "\n"
eval -- "$#"
eval -- "$*"
"$#"
"$*"
}
Output:
$ printexec echo -e "foo\n" bar
$ echo -e foo\\n bar
foon bar
foon bar
foo
bar
bash: echo -e foo\n bar: command not found
As you can see, only the third one, "$@", gave the correct result.
I faced this problem with the following command:
awk '{printf "%s[%s]\n", $1, $3}' "input.txt"
I need to build this command dynamically:
The target file name input.txt is dynamic and may contain spaces.
The awk script inside {} braces printf "%s[%s]\n", $1, $3 is dynamic.
Challenge:
Avoid extensive quote-escaping logic if there are many " characters inside the awk script.
Avoid parameter expansion for every $ field variable.
The eval and associative-array solutions shown elsewhere do not work here, due to bash variable expansion and quoting.
Solution:
Build the bash variable dynamically, avoid bash expansions, use a printf template.
# dynamic variables, values change at runtime.
input="input file 1.txt"
awk_script='printf "%s[%s]\n" ,$1 ,$3'
# static command template, preventing double-quote escapes and avoid variable expansions.
awk_command=$(printf "awk '{%s}' \"%s\"\n" "$awk_script" "$input")
echo "awk_command=$awk_command"
awk_command=awk '{printf "%s[%s]\n" ,$1 ,$3}' "input file 1.txt"
Executing variable command:
bash -c "$awk_command"
An alternative that also works:
bash <<< "$awk_command"
As you don't specify any scripting language, I would recommend Tcl, the Tool Command Language, for this kind of purpose.
Then in the first line, add the appropriate shebang:
#!/usr/local/bin/tclsh
with the appropriate location, which you can retrieve with which tclsh.
In tcl scripts, you can call operating system commands with exec.
#!/bin/bash
#Note: this script works only when you use Bash. So, don't remove the first line.
TUNECOUNT=$(ifconfig |grep -c -o tune0) #Some command with "Grep".
echo $TUNECOUNT #This will return 0
#if you don't have tune0 interface.
#Or count of installed tune0 interfaces.
First of all, there are functions for this. But if you prefer variables then your task can be done like this:
$ cmd=ls
$ $cmd # works
file file2 test
$ cmd='ls | grep file'
$ $cmd # does not work
ls: cannot access '|': No such file or directory
ls: cannot access 'grep': No such file or directory
file
$ bash -c $cmd # only ls runs; '|', 'grep', 'file' become $0, $1, $2
file file2 test
$ bash -c "$cmd" # also works
file
file2
$ bash <<< $cmd
file
file2
$ bash <<< "$cmd"
file
file2
Or via a temporary file
$ tmp=$(mktemp)
$ echo "$cmd" > "$tmp"
$ chmod +x "$tmp"
$ "$tmp"
file
file2
$ rm "$tmp"
Be careful when storing a command with X=$(command):
it is still executed, immediately at assignment time, even before X is ever used. To check and confirm this, you can do:
echo test;
X=$(for ((c=0; c<=5; c++)); do
sleep 2;
done);
echo note the 12 seconds elapsed
It is not necessary to store commands in variables even if you need to use them later. Just execute them as per normal. If you store them in variables, you will need some kind of eval statement or will have to invoke an unnecessary shell process to "execute your variable".
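Following that advice, a minimal sketch of the function approach for the question's pipeline:
count_entries() { ls | grep -c '^'; }
count_entries        # run it later, as many times as needed
n=$(count_entries)   # or capture its output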
