Why is my shell (sh) variable in command mode not exported? - linux

If I run a shell command like this, my exported variable is not visible:
sh -c "export x=100; echo x is $x"
I would expect it to output "x is 100", but it just says "x is ". If I run this in an interactive shell it works as expected.
My shell version is: GNU bash, version 3.2.51(1)-release (x86_64-suse-linux-gnu)

The $x gets interpreted by the current shell, not the sh shell that you're starting.
Escape it with a backslash:
sh -c "export x=100; echo x is \$x"
Or use single quotes to prevent the shell from interpreting variables:
sh -c 'export x=100; echo x is $x'
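Either way the inner shell now performs the expansion, and you get the output you expected:
$ sh -c 'export x=100; echo x is $x'
x is 100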

Related

Using escape characters inside double quotes in ssh command in bash script

I want to run some commands each time I log in to a remote system. Storing the commands in .bashrc on the remote machine is not an option.
What is the proper way to escape the escape characters inside quotes in a bash script for ssh?
How can I write each command on a new line?
My script
#!/bin/bash
remote_PS1=$'\[\033[01;32m\]\u@\[\033[03;80m\]\h\[\033[00m\]:\[\033[01;34m\]\!:\w\[\033[00m\]\$ '
ssh -t "$#" 'export SYSTEMD_PAGER="";' \
'export $remote_PS1;' \
'echo -e "set nocompatible" > /home/root/.vimrc;' \
'bash -l;'
didn't work.
Escaping escape characters inside double quotes and running them on a remote server is way too complicated for me :)
Instead, I wrote a remoterc file for the remote machine and a small remotessh script.
In remotessh, I first copy remoterc to the remote machine and then run bash interactively with that rc file.
remoterc:
#!/bin/bash
SYSTEMD_PAGER=""
PS1="\[\033[01;32m\]\u#\[\033[03;80m\]\h\[\033[00m\]:\[\033[01;34m\]\!:\w\[\033[00m\]\$ "
echo -e "set nocompatible" > /home/root/.vimrc
remotessh:
#!/bin/bash
scp remoterc "$1":/home/root/
ssh "$1" -t "bash --rcfile remoterc -i"
It works :)
You can use Bash's printf %q.
According to help printf:
%q      quote the argument in a way that can be reused as shell input
See the following example:
$ cat foo.sh
ps1='\[\033[1;31m\]\u:\w \[\033[0m\]\$ '
ps1_quoted=$( printf %q "$ps1" )
ssh -t foo@localhost \
    'export FOO=bar;' \
    "export PS1=$ps1_quoted;" \
    'bash --norc'
Result:

Could not print the temporary variable in the shell command

I was trying to print a temporary variable in the shell command, but I only got four empty lines.
/bin/sh -c "for i in {1..4}; do echo "$i"; done"
My sh version is GNU bash, version 3.2.57(1)-release.
The format follows this tutorial.
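This is the same quoting pitfall as in the question at the top: the inner "$i" closes the outer double quotes, so the calling shell expands $i (which is empty) before sh ever runs, leaving echo with nothing to print. A minimal fix, assuming /bin/sh is really bash so that {1..4} expands, is to single-quote the whole command:
/bin/sh -c 'for i in {1..4}; do echo "$i"; done'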

remote ssh command: first echo output is lost

I'm trying to run several commands on a remote box via a one-line ssh call, specifying them as a semicolon-separated string passed to "bash -c". It works in some cases but not in others. Check this out:
# Note: the "echo 1" output is lost:
bash-3.2$ ssh sandbox bash -c "echo 1; echo 2; echo 3"
2
3
# Note: first echo is ignored again
bash-3.2$ ssh sandbox bash -c "echo 0; echo 1; echo 2; echo 3"
1
2
3
# But when we run other commands (for example "date") then nothing is lost
bash-3.2$ ssh sandbox bash -c "date; date;"
Wed Nov 7 20:27:55 UTC 3018
Wed Nov 7 20:27:55 UTC 3018
What am I missing?
Remote OS: Ubuntu 16.04.5 LTS
Remote ssh: OpenSSH_7.2p2 Ubuntu-4ubuntu2.4, OpenSSL 1.0.2g 1 Mar 2016
Local OS: macOS High Sierra Version 10.13.3
Local ssh: OpenSSH_7.6p1, LibreSSL 2.6.2
Update:
The above example is a heavily simplified picture of what I'm trying to do.
The practical application is to generate a few files on the remote box by echoing into the remote filesystem:
#!/bin/bash
A=a
B=b
C=c
ssh -i ~/.ssh/${REMOTE_FQDN}.pem ${REMOTE_FQDN} sudo bash -c \
"echo $A > /tmp/_a; echo $B > /tmp/_b; echo $C > /tmp/_c;"
After I run the above script and go to the remote box to check the results, I see the following:
root@sandbox:/tmp# for i in `find ./ -name '_*'|sort`; do echo "----- ${i} ----"; cat $i; done
----- ./_a ----
----- ./_b ----
b
----- ./_c ----
c
As you can see, the first "echo" command generated a blank file!
To be clear, there are three shells at work here: your local shell (the one that interprets the ssh command line), the shell that ssh automatically runs for you on the remote side, and the bash you're invoking explicitly.
The reason the 1 "disappears" is that your local shell eats the quotes around the -c argument, and the shell on the other side of ssh then splits the arguments at whitespace. So the remote end sees bash -c echo 1; echo 2; echo 3. In turn, -c gets just echo, which echoes an empty line; 1 becomes that shell's $0, which isn't used. Then the inner bash returns, and the remote shell runs echo 2; echo 3 normally.
Consider this:
$ ssh xxx bash -c "'echo 1'; echo 2; echo 3"
1
2
3
where echo 1 is protected within the ssh arguments, so the 2nd level ssh shell is passed bash -c 'echo 1'; echo 2; echo 3. The innermost 3rd level shell echos 1, and then the 2nd level ssh shell echos 2 and 3.
Here is yet another interesting permutation:
$ ssh xxx bash -c "'echo 1; echo 2; echo 3'"
1
2
3
Here, the inner shell gets all the echo commands, because they are kept grouped by the " in the first shell and by the ' in the second.
In general, shell commands that pass arguments to shells that in turn run shells can be pretty difficult to build. I'd recommend you change your technique a bit to save yourself a lot of effort: instead of passing the shell commands as command-line arguments to ssh, provide them on the standard input of the remote shell. Consider a pipeline like this, which avoids recursive shell interpretation:
$ echo "echo 1; echo 2; echo 3" | ssh -T xxx
1
2
3
(Here, -T just suppresses ssh's complaint about the lack of a pseudoterminal.)
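The same idea stays readable as the command list grows if you feed it through a here-document (a sketch, reusing the xxx host alias from above; the quoted EOF keeps the local shell from expanding anything):
ssh -T xxx <<'EOF'
echo 1
echo 2
echo 3
EOF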
All the arguments to ssh are combined into a single whitespace-separated string that is passed to sh -c on the remote end. This means that
ssh sandbox bash -c "echo 1; echo 2; echo 3"
results in the execution of
sh -c 'bash -c echo 1; echo 2; echo 3'
Note the loss of quotes; ssh got the three arguments bash, -c, and echo 1; echo 2; echo 3 after quote removal. On the remote end, bash -c echo 1 just executes echo, with $0 in the shell set to 1.
The command
ssh sandbox bash -c "date; date;"
is treated the same way, but now the first command contains no whitespace. The result on the remote end is
sh -c 'bash -c date; date;'
which means first a new instance of bash runs the date command, followed by the date command being executed directly by sh.
In general, it's a bad idea to use ssh's implicit concatenation. Always pass the command you want executed as a properly escaped single argument:
ssh sandbox 'bash -c "echo 1; echo 2; echo 3"'
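Applied to the file-generation script from the question, that means quoting the whole remote command as one argument (a sketch; the local shell still expands $A, $B and $C inside the double quotes, and the remote end receives a single command string):
ssh -i ~/.ssh/${REMOTE_FQDN}.pem "${REMOTE_FQDN}" \
    "sudo bash -c 'echo $A > /tmp/_a; echo $B > /tmp/_b; echo $C > /tmp/_c'"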

bash passing strings to "gnome-terminal -e"

This question looks like Opening multiple tabs in gnome terminal with complex commands from a cycle, but I am looking for a more generic solution.
I have a C program that calls a script "xvi" with arguments. Each argument is originally enclosed in single quotes (') and each quote within an argument is isolated and backslashed (this format is a prerequisite), e.g.:
xvi 'a file' 'let'\''s try another'
The script xvi must launch gnome-terminal with "-e vim args"
With xterm instead of gnome-terminal this is easy, because xterm treats "-e" as its last option and passes everything that follows as the command to run, so the following is OK:
exec /usr/bin/xterm -e /usr/bin/vim "$@"
For gnome-terminal, "-e" is an option among others, and we need to package the whole command line into one argument. This is what I have done, which works: enclose each argument in double quotes (\"arg\") and backslash any double quote within an argument:
cmd="/usr/bin/vim"
while [ "$1" != "" ] ; do
arg=`echo "$1" | sed -e 's/\"/\\\"/g'`
cmd="$cmd \"$arg\""
shift
done
exec gnome-terminal --zoom=0.9 --disable-factory -e "$cmd"
Again, this works fine and I am nearly happy with that.
Question: Is there any nicer solution, avoiding the loop?
Thanks
Untested, but you could probably finagle printf '%q' into doing the job:
exec gnome-terminal --zoom=0.9 --disable-factory -e "$(printf '%q ' "$#")"
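To see what %q produces, you can try it on one of the tricky arguments by hand (the exact quoting style may differ slightly between bash versions):
$ printf '%q\n' "let's try another"
let\'s\ try\ another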
I know this thread is old, but I recently had a similar need and created a bash script to launch multiple tabs and run a different command in each of them:
#!/bin/bash
# Array of commands to run in different tabs
commands=(
    'tail -f /var/log/apache2/access.log'
    'tail -f /var/log/apache2/error.log'
    'tail -f /usr/local/var/postgres/server.log'
)

# Build final command with all the tabs to launch
finalCommand=""
for (( i = 0; i < ${#commands[@]}; i++ )); do
    finalCommand+="--tab -e 'bash -c \"${commands[$i]}\"' "
done

# Run the final command
eval "gnome-terminal $finalCommand"
You just need to add your commands in the array and execute.
Gist link: https://gist.github.com/rollbackpt/b4e17e2f4c23471973e122a50d591602

Shell scripting shell inside shell

I would like to connect to different shells (csh, ksh, etc.) and execute commands inside each switched shell.
The following sample program reflects my intention:
#!/bin/bash
echo $SHELL
csh
echo $SHELL
exit
ksh
echo $SHELL
exit
Since I am not well versed in shell scripting, I need a pointer on how to achieve this. Any help would be much appreciated.
If you want to execute only a single command, you can use the -c option:
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
If you want to execute several commands, or even a whole script, in a child shell, you can use the here-document feature of bash and the -s option (read commands from stdin) of the child shells:
#!/bin/bash
echo "this is bash"
csh -s <<- EOF
echo "here go the commands for csh"
echo "and another one..."
EOF
echo "this is bash again"
ksh -s <<- EOF
echo "and now, we're in ksh"
EOF
Note that you can't easily check which shell you are in by echoing $SHELL, because the parent shell expands this variable before the here-document ever reaches the child shell. If you want to be sure that the child shell works, check whether some shell-specific syntax works or not.
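For example, instead of trusting an environment variable, you can ask the system which shell is actually reading the commands (a sketch; assumes a ps that supports -p and -o comm=):
csh -s << 'EOF'
ps -p $$ -o comm=
EOF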
It is possible to use the command line options provided by each shell to run a snippet of code.
For example, for bash use the -c option:
bash -c "$code"
bash -c 'echo hello'
zsh and fish also use the -c option.
Other shells will state the options they use in their man pages.
You need to use the -c command line option if you want to pass commands to a shell at startup:
#!/bin/bash
# We are in bash already ...
echo $SHELL
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
You can pass arbitrarily complex scripts to a shell using the -c option, as in
sh -c 'echo This is the Bourne shell.'
You will save yourself a lot of headaches related to quoting and variable expansion if you wrap the call in a function that reads the script from stdin:
execute_with_ksh()
{
    local script
    script=$(cat)
    ksh -c "${script}"
}

prepare_complicated_script()
{
    # Write shell script on stdout,
    # for instance by cat-ting a here-document.
    cat <<'EOF'
echo ${SHELL}
EOF
}

prepare_complicated_script | execute_with_ksh
The advantage of this method is that it is easy to insert a tee in the pipe, or to break the pipe, to control the script being passed to the shell.
If you want to execute the script on a remote host through ssh, you should consider encoding your script in base64 to transmit it safely to the remote shell.
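A minimal sketch of that base64 idea, building on the functions above (remotehost is a placeholder; the -w0 flag assumes GNU base64):
encoded=$(prepare_complicated_script | base64 -w0)
ssh remotehost "echo '${encoded}' | base64 -d | ksh -s"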
