Bash: passing strings to "gnome-terminal -e"

This question looks like "Opening multiple tabs in gnome terminal with complex commands from a cycle", but I am looking for a more generic solution.
I have a C program that calls a script "xvi" with arguments. Each argument is originally enclosed within single quotes (') and each single quote inside an argument is closed off and backslash-escaped (this format is a prerequisite), e.g.:
xvi 'a file' 'let'\''s try another'
The script xvi must launch gnome-terminal with "-e vim args"
With xterm instead of gnome-terminal, this is easy because xterm assumes that "-e" is the last argument and passes all the tail to the shell, so the following is OK:
exec /usr/bin/xterm -e /usr/bin/vim "$@"
For gnome-terminal, "-e" is an option among others, and we need to package the whole command line into one argument. This is what I have done, and it works: enclose each argument in double quotes (\"arg\") and backslash any double quote within an argument:
cmd="/usr/bin/vim"
while [ "$1" != "" ] ; do
arg=`echo "$1" | sed -e 's/\"/\\\"/g'`
cmd="$cmd \"$arg\""
shift
done
exec gnome-terminal --zoom=0.9 --disable-factory -e "$cmd"
Again, this works fine and I am nearly happy with that.
Question: Is there any nicer solution, avoiding the loop?
Thanks

Untested, but you could probably finagle printf '%q' into doing the job:
exec gnome-terminal --zoom=0.9 --disable-factory -e "$(printf '%q ' "$@")"

I know this thread is old, but I recently had a similar need and created a bash script to launch multiple tabs and run a different command in each of them:
#!/bin/bash
# Array of commands to run in different tabs
commands=(
    'tail -f /var/log/apache2/access.log'
    'tail -f /var/log/apache2/error.log'
    'tail -f /usr/local/var/postgres/server.log'
)
# Build final command with all the tabs to launch
finalCommand=""
for (( i = 0; i < ${#commands[@]}; i++ )); do
    finalCommand+="--tab -e 'bash -c \"${commands[$i]}\"' "
done
# Run the final command
eval "gnome-terminal $finalCommand"
You just need to add your commands to the array and execute the script.
Gist link: https://gist.github.com/rollbackpt/b4e17e2f4c23471973e122a50d591602


Open new gnome-terminal from scripts and input vars from present script.

#!/bin/bash
Dpath=/home/$USER/Docker/
IP=`sed -n 1p /home/medma/.medmadoc`
DockerMachine=`sed -n 2p /home/$USER/.medmadoc`
DockerPort=`sed -n 5p /home/$USER/.medmadoc`
DockerUser=`sed -n 3p /home/$USER/.medmadoc`
DockerPass=`sed -n 4p /home/$USER/.medmadoc`
if [ ! -d $Dpath ] ; then
    mkdir -p $Dpath
else
    stat=`wget -O ".dockerid" http://$IP/DOCKER-STAT.txt`
    for ids in `cat .dockerid`
    do
        if [ "$ids" == "$DockerMachine" ] ; then
            gnome-terminal -x sh -c 'sshfs -p$DockerPort $DockerUser@$IP:/var/www/html $Dpath ; bash '
            nautilus $Dpath
            zenity --info --text "Mounted $DockerMachine"
            exit
        else
            :
        fi
    done
    zenity --info --text "No Such ID:$DockerMachine"
fi
gnome-terminal -x sh -c 'sshfs -p$DockerPort $DockerUser@$IP:/var/www/html $Dpath ; bash '
This command opens up a new terminal, but the problem is that it does not load variables like $DockerPort, $DockerUser, $IP, and $Dpath from this script.
How do I pass the values of these variables from this script to the newly opened terminal?
Thanks !
As indicated before, you could try to use double quotes instead of single quotes around the sshfs invocation.
Single quotes in Bash are used to delimit verbatim text, in which variables are not expanded. Double quotes, in contrast, allow for variables expansion and command substitution ($(...)) to take place.
If you do use double quotes, beware of unintended side effects (your username may contain a space, a dollar sign, a semicolon, or any other shell-special character). A cleaner approach is to export the variables to the environment before calling gnome-terminal (not forgetting to add double quotes around your variables inside the single quotes), so that your code looks like:
export Docker{Port,User} IP Dpath
gnome-terminal -x sh -c 'sshfs -p"$DockerPort" "$DockerUser@$IP":/var/www/html "$Dpath" ; bash'
You may not want to pollute the environment with variables that will only be used once. If that is the case, instead of exporting them, you can use Bash's declare -p feature to serialize variables before loading them into a new environment (in my opinion, this is the cleanest approach). Here is what it looks like:
set_vars="$(declare -p Docker{Port,User} IP Dpath)"
gnome-terminal -x bash -c "$set_vars;"'sshfs ....'
Using this latter method, the variables are only visible to the shell process that runs the sshfs command, not to gnome-terminal itself nor to any sub-process run thereafter.
PS: you could read all your variables at once from the ~/.medmadoc file by using the following code instead of repeated sed invocations :
for var in IP Docker{Machine,User,Pass,Port}; do
read $var
done < ~/.medmadoc
This code makes use of the read builtin, that reads a line of input into a variable (in its simplest form).
PPS: That stat variable probably won't contain any useful information, since the output of wget was redirected by the -O flag. Perhaps you meant to store the exit status of wget into stat, in which case what you meant was:
wget -O .dockerid ...
stat=$?

Bash discards command line arguments when passing to another bash shell

I have a big script (call it test) that, after stripping out the unrelated parts, comes down to just this, which is enough to explain my question:
#!/bin/bash
bash -c "$@"
This doesn't work as expected. E.g. ./test echo hi executes only the echo and the argument disappears!
Testing with various inputs, I can see that only $1 is passed to bash -c ... and the rest are discarded.
But if I use a variable like:
#!/bin/bash
cmd="$@"
bash -c "$cmd"
it works as expected for all inputs.
Questions:
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
For example:
bash -c "ls" -l -a hi hello blah
simply runs ls, and -l -a hi hello blah don't result in any errors at all?
(If possible, please refer to the bash grammar where this behaviour is documented).
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
From info bash, on the special parameter @:
@
($@) Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" ....
Thus, bash -c "$@" is equivalent to bash -c "$1" "$2" .... In the case of the ./test echo hi invocation, the expression expands to
bash -c "echo" "hi"
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
Bash actually doesn't discard anything. From man bash:
If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
Thus, for the command bash -c "echo" "hi", Bash passes "hi" as $0 for the "echo" script.
bash -c "ls" -l -a hi hello blah
simply runs ls, and -l -a hi hello blah don't result in any errors at all?
According to the rules mentioned above, Bash executes "ls" script and passes the following positional parameters to this script:
$0: "-l"
$1: "-a"
$2: "hi"
$3: "hello"
$4: "blah"
Thus, the command actually executes ls, and the positional parameters are unused in the script. You can use them by referencing the positional parameters, e.g.:
$ set -x
$ bash -c "ls \$0 \$1 \$3" -l -a hi hello blah
+ bash -c 'ls $0 $1 $3' -l -a hi hello blah
ls: cannot access hello: No such file or directory
You should be using $* instead of $@ to pass the command line as a string. "$@" expands to multiple quoted arguments and "$*" combines multiple arguments into a single argument.
#!/bin/bash
bash -c "$*"
The problem is that with your $@ it executes:
bash -c echo hi
But with $* it executes:
bash -c 'echo hi'
When you use:
cmd="$@"
and use: bash -c "$cmd" it does the same thing for you.
Read: What is the difference between “$@” and “$*” in Bash?

What's the point of eval/bash -c as opposed to just evaluating a variable?

Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
? Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?
The third form is not at all like the other two -- but to understand why, we need to go into the order of operations when bash is interpreting a command, and look at which of those are followed when each method is in use.
Bash Parsing Stages
1. Quote Processing
2. Splitting Into Commands
3. Special Operator Parsing
4. Expansions
5. Word Splitting
6. Globbing
7. Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split into separate words on whitespace
Those words are expanded as globs if they parse as globs and have available matches, and finally the command is executed.
Using sh -c "$string"
...performs the same as eval does, but in a new shell launched as a separate process; thus, changes to variable state, current directory, etc. will expire when this new process exits. (Note, too, that that new shell may be a different interpreter supporting a different language; i.e. sh -c "foo" will not support the same syntax that bash, ksh, zsh, etc. do).
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
printf '%s\n' "two words" will thus parse as printf %s\n "two words", as opposed to the usual/expected behavior of printf %s\n two words (with the quotes being consumed by the shell).
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash at all in the first place to do this, you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As @CharlesDuffy points out, starting a new bash shell in this way will clear shell variables, but environment variables will be inherited by the spawned shell process.
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly or doing an eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.
There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even clearer if we do set -x first:
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints $var because parameter expansion and those earlier steps had already been done, and so we don't get the value of var, as we do with eval.
A shell started with bash -c only inherits exported variables from the parent shell, and since I didn't export var, it's not available to the new shell.

Triple nested quotations in shell script

I'm trying to write a shell script that calls another script that then executes a rsync command.
The second script should run in its own terminal, so I use a gnome-terminal -e "..." command. One of the parameters of this script is a string containing the parameters that should be given to rsync. I put those into single quotes.
Up until here, everything worked fine until one of the rsync parameters was a directory path that contained a space. I tried numerous combinations of ',",\",\' but the script either doesn't run at all or only the first part of the path is taken.
Here's a slightly modified version of the code I'm using
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l '\''/media/MyAndroid/Internal storage'\''' "
Within Backup.sh this command is run
rsync $5 "$path"
where the destination $path is calculated from text in Stamp.
How can I achieve these three levels of nested quotations?
These are some questions I looked at just now (I've tried other sources earlier as well):
https://unix.stackexchange.com/questions/23347/wrapping-a-command-that-includes-single-and-double-quotes-for-another-command
how to make nested double quotes survive the bash interpreter?
Using multiple layers of quotes in bash
Nested quotes bash
I was unsuccessful in applying the solutions to my problem.
Here is an example. caller.sh uses gnome-terminal to execute foo.sh, which in turn prints all the arguments and then calls rsync with the first argument.
caller.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh 'long path' arg2 arg3"
foo.sh:
#!/bin/bash
echo $# arguments
for i; do # same as: for i in "$@"; do
echo "$i"
done
rsync "$1" "some other path"
Edit: If $1 contains several parameters to rsync, some of which are long paths, the above won't work, since bash either passes "$1" as one parameter, or $1 as multiple parameters, splitting it without regard to contained quotes.
There is (at least) one workaround, you can trick bash as follows:
caller2.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh '--option1 --option2 \"long path\"' arg2 arg3"
foo2.sh:
#!/bin/bash
rsync_command="rsync $1"
eval "$rsync_command"
This will do the equivalent of typing rsync --option1 --option2 "long path" on the command line.
WARNING: This hack introduces a security vulnerability, $1 can be crafted to execute multiple commands if the user has any influence whatsoever over the string content (e.g. '--option1 --option2 \"long path\"; echo YOU HAVE BEEN OWNED' will run rsync and then execute the echo command).
Did you try escaping the space in the path with "\ " (no quotes)?
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l ''/media/MyAndroid/Internal\ storage''' "

Parameter list with double quotes does not pass through properly in Bash

I have a Bash script that calls another Bash script. The called script does some modification and checking on a few things, shifts, and then passes the rest of the caller's command line through.
In the called script, I have verified that I have everything managed and ready to call. Here's some debug-style code I've put in:
echo $SVN $command $@ > /tmp/shimcmd
bash /tmp/shimcmd
$SVN $command $@
Now, in /tmp/shimcmd you'll see:
svn commit --username=myuser --password=mypass --non-interactive --trust-server-cert -m "Auto Update autocommit Wed Apr 11 17:33:37 CDT 2012"
That is, the built command, all on one line, perfectly fine, including a -m "my string with spaces" portion.
It's perfect. And the "bash /tmp/shimcmd" execution of it works perfectly as well.
But of course I don't want this silly tmp file and such (only used it to debug). The problem is that calling the command directly, instead of via the shim file:
$SVN $command $@
results in the svn command itself NOT receiving the quoted string with spaces--it garbles the '-m "my string with spaces"' parameter and shanks the command as if it was passed as '-m my string with spaces'.
I have tried all manner of crazy escape methods to no avail. Can't believe it's dogging me this badly. Again, by echoing the very same thing ($SVN $command $@) to a file and then executing that file, it's FINE. But calling directly garbles the quoted string. That element alone shanks.
Any ideas?
Dan
Did you try:
eval "$SVN $command $#"
?
Here's a way to demonstrate the problem:
$ args='-m "foo bar"'
$ printf '<%s> ' $args
<-m> <"foo> <bar">
And here's a way to avoid it:
$ args=( -m "foo bar" )
$ printf '<%s> ' "${args[@]}"
<-m> <foo bar>
In this latter case, args is an array, not a quoted string.
Note, by the way, that it has to be "$@", not $@, to get this behavior (in which string-splitting is avoided in favor of respecting the array entries' boundaries).
Try this:
echo -n -e $SVN \"$command\" > /tmp/shimcmd
for x in "$@"
do
    a=$a" "\"$x\"
done
echo -e " " $a >> /tmp/shimcmd
bash /tmp/shimcmd
or simply
$SVN "$command" "$#"
