What's the point of eval/bash -c as opposed to just evaluating a variable? - linux

Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
? Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?

The third form is not at all like the other two -- but to understand why, we need to go through the order of operations bash follows when interpreting a command, and look at which of those operations each method performs.
Bash Parsing Stages
1. Quote Processing
2. Splitting Into Commands
3. Special Operator Parsing
4. Expansions
5. Word Splitting
6. Globbing
7. Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split on whitespace into separate words
Those words are expanded as globs if they parse as such and have available matches, and finally the command is executed.
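To see all of those stages firing at once, here is a small sketch (the file path /tmp/eval_demo is just for illustration):
s='echo "count: $(( 2 + 3 ))" > /tmp/eval_demo'
eval "$s"              # quotes become syntactic, the arithmetic expands, the redirection is real
cat /tmp/eval_demo     # prints: count: 5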
Using sh -c "$string"
...performs the same steps as eval does, but in a new shell launched as a separate process; thus, changes to variable state, the current directory, etc. expire when that new process exits. (Note, too, that the new shell may be a different interpreter supporting a different language; i.e., sh -c "foo" will not support the same syntax that bash, ksh, zsh, etc. do).
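A brief sketch of that state expiry (the directory and variable names here are arbitrary):
$ bash -c 'cd /tmp && demo=1'
$ pwd
/home/user
$ echo "${demo:-unset}"
unset
The cd and the assignment happened only in the child shell; the parent's directory and variables are untouched.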
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
printf '%s\n' "two words" will thus parse as printf %s\n "two words", as opposed to the usual/expected behavior of printf %s\n two words (with the quotes being consumed by the shell).
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
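For instance (a minimal sketch):
$ s='echo hi > /tmp/out'
$ $s
hi > /tmp/out
The > and /tmp/out are handed to echo as ordinary arguments; no file is created.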

$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash at all in the first place to do this; you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As @CharlesDuffy points out, starting a new bash shell in this way will clear shell variables, but environment variables will be inherited by the spawned shell process.
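A quick way to see that distinction (variable names are arbitrary):
$ foo=shellvar
$ export bar=envvar
$ bash -c 'echo "foo=${foo:-unset} bar=${bar:-unset}"'
foo=unset bar=envvar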
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly and using eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.

There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even clearer if we run set -x first:
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints the literal string $var: parameter expansion (and the steps before it) had already been done by the time that word was produced, and the result is not expanded a second time, so we don't get the value of var as we do with eval.
The shell started by bash -c only inherits exported variables from the parent shell, and since I didn't export var it's not available to the new shell.
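To confirm that last point, exporting var (with tracing still on from set -x) makes it visible to the child shell:
$ export var
+ export var
$ bash -c "$cmd"
+ bash -c 'echo $var'
hello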

Related

Exporting environment variables to both bash and csh using a bash script with functions

I have a bash shell-script with a function which exports an environment variable.
For the sake of argument, let's use the following example:
#!/bin/bash
function my_function()
{
export my_env_var=$1
}
Since the whole purpose is to export the variable to the main shell, I source it.
When the main shell is bash this works fine:
<bash-shell>
> source ~/tmp/my_test.sh
> my_function test
> echo $my_env_var
test
But other customers use csh, and there things start to fail when I use the same command with the same script, since csh does not know functions :-(
<csh-shell>
% source ~/tmp/my_test.sh
Badly placed ()'s
I already tried to wrap it in a wrapper-script:
#!/bin/sh
bash -c 'source ~/tmp/my_test.sh; my_function test'
echo my_env_var = $my_env_var
But my_env_var is not exported in this way:
<csh-shell>
% source ~/tmp/my_test2.sh
my_env_var: Undefined variable.
whereas it is known in the bash shell (as can be seen by changing the 2nd script to):
#!/bin/sh
bash -c 'source ~/tmp/my_test.sh; my_function test; echo my_env_var in bash = $my_env_var'
echo my_env_var = $my_env_var
<csh-shell>
% source ~/tmp/my_test2.sh
my_env_var in bash = test
my_env_var: Undefined variable.
What am I missing / doing wrong so the script exports the variable when it is called from bash and when it is called from csh?
The Bourne shell and csh are not compatible; many commands are different, and csh lacks many features (it doesn't have functions at all). Plus, sooner or later you're going to have someone who uses fish, which is different yet again. The only way to make a non-trivial script work in both is to write it twice.
That said, if you want to set some environment variables then the general strategy is to create a script which outputs the required commands; this can be in any language (shell, Python, C); for example:
#!/bin/sh
# ... do work here ...
var="foo"
# Getting the shell in a cross-platform way isn't too easy. This was only tested
# on Linux. Can add a "-c" or "-f" flag if you need cross-platform support.
shell=$(ps -ho comm $(ps -ho ppid $$))
case "$shell" in
(csh|tcsh) echo "setenv VAR $var" ;;
(fish) echo "set -Ux VAR $var" ;;
(*) echo "export VAR=$var"
esac
And when you run it, it outputs the appropriate commands:
% ./work
export VAR=foo
% tcsh
> ./work
setenv VAR foo
> fish
martin@x270 ~> ./work
set -Ux VAR foo
And to actually set it, eval the output like so:
% eval $(./work)
% echo $VAR
foo
% tcsh
> eval `./work`
> echo $VAR
foo
> fish
martin@x270 ~> eval (./work)
martin@x270 ~> echo $VAR
foo
The downside of this is that informational messages, warnings, etc. will also get eval'd; to solve this make sure to always output these to stderr:
echo >&2 "warning: foo"
If you don't want to run eval, you can also do something slightly more complicated: have the script print plain VAR=foo lines and write small Bourne-shell and csh wrappers to parse them. Either way, "output the variables you want to set, instead of setting them directly" is the general approach for making something work across multiple incompatible shells.
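A rough sketch of that wrapper variant (the script name work-vars is hypothetical, and it assumes one VAR=value pair per line with no spaces or quote characters in the values):
# work-vars prints plain assignments such as  VAR=foo
# Bourne-shell / bash wrapper:
eval "$(./work-vars | sed 's/^/export /')"
# csh wrapper (the trailing semicolons keep the commands separate after eval joins the lines):
eval `./work-vars | sed -e 's/=/ /' -e 's/^/setenv /' -e 's/$/;/'`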

Bash discards command line arguments when passing to another bash shell

I have a big script (call it test) that, after stripping out the unrelated parts, comes down to just this, which I can use to explain my question:
#!/bin/bash
bash -c "$#"
This doesn't work as expected. E.g. ./test echo hi executes only the echo and the argument disappears!
Testing with various inputs I can see that only $1 is passed to bash -c ... and the rest are discarded.
But if I use a variable like:
#!/bin/bash
cmd="$#"
bash -c "$cmd"
it works as expected for all inputs.
Questions:
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
For example:
bash -c "ls" -l -a hi hello blah
simply runs ls and -l -a hi hello blah don't result in any errors at all?
(If possible, please refer to the bash grammar where this behaviour is documented).
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
From info bash, on $@:
($@) Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands
to a separate word. That is, "$@" is equivalent to "$1" "$2" ....
Thus, bash -c "$@" is equivalent to bash -c "$1" "$2" .... In the case of the ./test echo hi invocation, the expression expands to
bash -c "echo" "hi"
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
Bash actually doesn't discard anything. From man bash:
If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
Thus, for the command bash -c "echo" "hi", Bash passes "hi" as $0 for the "echo" script.
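You can confirm this with a one-liner:
$ bash -c 'echo "$0 is the zeroth parameter"' hi
hi is the zeroth parameter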
bash -c "ls" -l -a hi hello blah
simply runs echo and hi hello blah doesn't result in any errors at all?
According to the rules mentioned above, Bash executes the "ls" script and passes the following positional parameters to it:
$0: "-l"
$1: "-a"
$2: "hi"
$3: "hello"
$4: "blah"
Thus, the command actually executes ls, and the positional parameters go unused in the script. You can use them by referencing the positional parameters explicitly, e.g.:
$ set -x
$ bash -c "ls \$0 \$1 \$3" -l -a hi hello blah
+ bash -c 'ls $0 $1 $3' -l -a hi hello blah
ls: cannot access hello: No such file or directory
You should be using $* instead of $@ to pass the command line as a string. "$@" expands to multiple quoted arguments, while "$*" combines multiple arguments into a single argument.
#!/bin/bash
bash -c "$*"
The problem is that with your $@ it executes:
bash -c echo hi
But with $* it executes:
bash -c 'echo hi'
When you use:
cmd="$@"
and then run bash -c "$cmd", it does the same thing, because assigning "$@" to a plain (non-array) variable joins the arguments with spaces, just as "$*" does.
Read: What is the difference between "$@" and "$*" in Bash?
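A short script makes the contrast in this exact situation visible (the script name demo is made up; invoke it as ./demo echo hi there):
#!/bin/bash
bash -c "$*"   # command string is 'echo hi there'; prints: hi there
bash -c "$@"   # command string is just 'echo'; 'hi' and 'there' become $0 and $1; prints a blank line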

bash passing strings to "gnome-terminal -e"

This question looks like Opening multiple tabs in gnome terminal with complex commands from a cycle, but I am looking for a more generic solution.
I have a C program that calls a script "xvi" with arguments. Each argument is originally enclosed within single quotes (') and each quote within an argument is isolated and back-slashed (this format is a prerequisite), e.g.:
xvi 'a file' 'let'\''s try another'
The script xvi must launch gnome-terminal with "-e vim args"
With xterm instead of gnome-terminal this is easy, because xterm assumes that "-e" is the last argument and passes the entire tail to the shell, so the following is OK:
exec /usr/bin/xterm -e /usr/bin/vim "$#"
For gnome-terminal, "-e" is just one option among others, and we need to package the whole command line into one argument. This is what I have done, which works: enclose each argument within double quotes (\"arg\") and backslash any double quote within an argument:
cmd="/usr/bin/vim"
while [ "$1" != "" ] ; do
arg=`echo "$1" | sed -e 's/\"/\\\"/g'`
cmd="$cmd \"$arg\""
shift
done
exec gnome-terminal --zoom=0.9 --disable-factory -e "$cmd"
Again, this works fine and I am nearly happy with that.
Question: Is there any nicer solution, avoiding the loop?
Thanks
Untested, but you could probably finagle printf '%q' into doing the job:
exec gnome-terminal --zoom=0.9 --disable-factory -e "$(printf '%q ' "$@")"
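As a sanity check on what %q produces, you can run it against arguments like the asker's before wiring it into gnome-terminal:
$ printf '%q ' /usr/bin/vim 'a file' 'let'\''s try another'
/usr/bin/vim a\ file let\'s\ try\ another
Each argument comes back with shell-safe escaping, so the resulting single string re-splits into the original words when the child shell parses it.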
I know this thread is old, but recently I had a similar need and created a bash script to launch multiple tabs and run a different command in each of them:
#!/bin/bash
# Array of commands to run in different tabs
commands=(
'tail -f /var/log/apache2/access.log'
'tail -f /var/log/apache2/error.log'
'tail -f /usr/local/var/postgres/server.log'
)
# Build final command with all the tabs to launch
set finalCommand=""
for (( i = 0; i < ${#commands[#]}; i++ )); do
export finalCommand+="--tab -e 'bash -c \"${commands[$i]}\"' "
done
# Run the final command
eval "gnome-terminal "$finalCommand
You just need to add your commands to the array and run the script.
Gist link: https://gist.github.com/rollbackpt/b4e17e2f4c23471973e122a50d591602

Quoting with ssh command with a function call

I need to execute the shell command as follows:
ssh <device> "command"
command is invoked as:
$(typeset); <function_name> \"argument_string\"; cd ...; ls ...
How exactly to quote here? Is this correct?
""$(typeset); <function_name> \"arguement_string\""; cd ...; ls ..."
I am confused with this quoting in shell scripts.
Don't try to do the quoting by hand -- ask the shell to do it for you!
command_array=( function_name "first argument" "second argument" )
printf -v command_str '%q ' "${command_array[@]}"
ssh_str="$(typeset); $command_str"
ssh machine "$ssh_str"
You can then build up command_array as you wish -- using logic to conditionally append values, with only the kind of quoting you'd ordinarily use for those values -- and let printf %q add whatever extra quoting is needed to make the content safe to pass through ssh.
If you're trying to incrementally build up a script, you can do that like so:
remote_script="$(typeset)"$'\n'
safe_append_command() {
local command_str
printf -v command_str '%q ' "$@"
remote_script+="$command_str"$'\n'
}
safe_append_command cp "$file" "$destination"
safe_append_command tar -cf /tmp/foo.tar "${destination%/*}"
# ...etc...
ssh machine "$remote_script"
Note that in this case, all expansions take place locally, when the script is being generated, and shell constructs such as redirection operators cannot be used (except by embedding them in a function you then pass to the remote system with typeset). Doing so means that no data passed to safe_append_command can be treated as code -- foreclosing large classes of potential security holes at the cost of flexibility.
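If you do need a redirection on the remote side, one sketch (the function and file names here are made up) is to keep the redirection inside a function body and ship that function's definition along:
pack() { tar -cf - "$1" > "$2"; }         # the redirection lives inside the function
remote_script="$(typeset -f pack)"$'\n'   # start the script with the function definition
safe_append_command pack /etc/hosts /tmp/hosts.tar
ssh machine "$remote_script"
The call to pack is still quoted safely by safe_append_command; only the vetted function body contains live shell syntax.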
I would use a here document:
ssh machine <<'EOF'
hello() {
echo "hello $1!"
}
hello "world"
EOF
Note that I wrapped the starting EOF in single quotes. Doing so prevents bash from interpreting variables or command substitutions in the local shell.

Parameter list with double quotes does not pass through properly in Bash

I have a Bash script that calls another Bash script. The called script does some modification and checking on a few things, shifts, and then passes the rest of the caller's command line through.
In the called script, I have verified that I have everything managed and ready to call. Here's some debug-style code I've put in:
echo $SVN $command $@ > /tmp/shimcmd
bash /tmp/shimcmd
$SVN $command $@
Now, in /tmp/shimcmd you'll see:
svn commit --username=myuser --password=mypass --non-interactive --trust-server-cert -m "Auto Update autocommit Wed Apr 11 17:33:37 CDT 2012"
That is, the built command, all on one line, perfectly fine, including a -m "my string with spaces" portion.
It's perfect. And the "bash /tmp/shimcmd" execution of it works perfectly as well.
But of course I don't want this silly tmp file and such (I only used it to debug). The problem is that calling the command directly, instead of via the shim file:
$SVN $command $@
results in the svn command itself NOT receiving the quoted string with spaces--it garbles the '-m "my string with spaces"' parameter and shanks the command as if it was passed as '-m my string with spaces'.
I have tried all manner of crazy escape methods to no avail. Can't believe it's dogging me this badly. Again, by echoing the very same thing ($SVN $command $@) to a file and then executing that file, it's FINE. But calling directly garbles the quoted string. That element alone shanks.
Any ideas?
Dan
Did you try:
eval "$SVN $command $#"
?
Here's a way to demonstrate the problem:
$ args='-m "foo bar"'
$ printf '<%s> ' $args
<-m> <"foo> <bar">
And here's a way to avoid it:
$ args=( -m "foo bar" )
$ printf '<%s> ' "${args[#]}"
<-m> <foo bar>
In this latter case, args is an array, not a quoted string.
Note, by the way, that it has to be "${args[@]}", not ${args[@]}, to get this behavior (in which string-splitting is avoided in favor of respecting the array entries' boundaries).
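Applied to the original problem, the array approach might look like this (the option values are placeholders):
cmd=( svn commit --username="$user" --password="$pass" --non-interactive -m "Auto Update autocommit $(date)" )
"${cmd[@]}"
Each array element arrives at svn as exactly one argument, spaces and all, so the -m message survives intact.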
Or this:
echo -n -e $SVN \"$command\" > /tmp/shimcmd
for x in "$#"
do
a=$a" "\"$x\"
done
echo -e " " $a >> /tmp/shimcmd
bash /tmp/shimcmd
or simply
$SVN "$command" "$#"
