Unable to use local and remote variables within a heredoc or command over SSH - linux

Below is an example of an SSH script using a heredoc (the actual script is more complex). Is it possible to use both local and remote variables within an SSH heredoc or command?
FILE_NAME is set on the local server to be used on the remote server. REMOTE_PID is set when running on the remote server to be used on the local server. FILE_NAME is recognised in the script; REMOTE_PID is not set.
If EOF is changed to 'EOF', then REMOTE_PID is set and FILE_NAME is not. I don't understand why this is.
Is there a way in which both REMOTE_PID and FILE_NAME can be recognised?
Bash version 2 is being used. The default remote login shell is csh; the local script is bash.
FILE_NAME=/example/pdi.dat
ssh user@host bash << EOF
# run script with output...
REMOTE_PID=$(cat $FILE_NAME)
echo $REMOTE_PID
EOF
echo $REMOTE_PID

You need to escape the $ sign if you don't want the variable to be expanded:
$ x=abc
$ bash <<EOF
> x=def
> echo $x # This expands x before sending it to bash. Bash will see only "echo abc"
> echo \$x # This lets bash perform the expansion. Bash will see "echo $x"
> EOF
abc
def
So in your case:
ssh user@host bash << EOF
# run script with output...
REMOTE_PID=$(cat $FILE_NAME)
echo \$REMOTE_PID
EOF
Or alternatively you can just use a herestring with single quotes:
$ x=abc
$ bash <<< '
> x=def
> echo $x # This will not expand, because we are inside single quotes
> '
def
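Putting the two behaviours together, here is a local-only sketch of the original problem: plain `bash` stands in for `ssh user@host bash`, and a file under `/tmp` replaces the real data file (both are assumptions for illustration). With an unquoted delimiter, `$FILE_NAME` expands in the outer shell while the escaped `\$` forms are left for the inner shell:

```shell
# Local variable, expanded by the outer shell before the heredoc is sent
FILE_NAME=/tmp/pdi_demo.dat
echo 12345 > "$FILE_NAME"

# Unquoted EOF: $FILE_NAME expands here, in the outer shell;
# \$(cat ...) and \$REMOTE_PID are left for the inner shell to expand.
result=$(bash <<EOF
REMOTE_PID=\$(cat $FILE_NAME)
echo \$REMOTE_PID
EOF
)
echo "$result"   # prints 12345
```

Note that REMOTE_PID still only exists inside the inner shell; echoing it sends its value back on stdout, but an `echo $REMOTE_PID` after the heredoc on the local side remains empty, because a child process cannot set variables in its parent.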

remote_user_name=user
instance_ip=127.0.0.1
external=$(ls /home/)
ssh -T -i ${private_key} -l ${remote_user_name} ${instance_ip} << END
internal=\$(ls /home/)
echo "\${internal}"
echo "${external}"
END


Using escape characters inside double quotes in ssh command in bash script

I want to run some commands each time I log in to a remote system. Storing the commands in .bashrc on the remote is not an option.
What is the proper way to escape the escape chars inside of quotes in bash script for ssh?
How can I write each command in new line?
My script
#!/bin/bash
remote_PS1=$'\[\033[01;32m\]\u@\[\033[03;80m\]\h\[\033[00m\]:\[\033[01;34m\]\!:\w\[\033[00m\]\$ '
ssh -t "$@" 'export SYSTEMD_PAGER="";' \
'export $remote_PS1;' \
'echo -e "set nocompatible" > /home/root/.vimrc;' \
'bash -l;'
didn't work.
Escaping escape characters inside double quotes and running them on the remote server was way too complicated for me :)
Instead, I wrote a remoterc file for the remote machine and a small remotessh script.
In remotessh, I first copy remoterc to the remote machine, then run bash interactively with that remoterc file.
remoterc:
#!/bin/bash
SYSTEMD_PAGER=""
PS1="\[\033[01;32m\]\u@\[\033[03;80m\]\h\[\033[00m\]:\[\033[01;34m\]\!:\w\[\033[00m\]\$ "
echo -e "set nocompatible" > /home/root/.vimrc
remotessh:
#!/bin/bash
scp remoterc "$1":/home/root/
ssh "$1" -t "bash --rcfile remoterc -i"
It works :)
You can use Bash's printf %q.
According to help printf:
%q      quote the argument in a way that can be reused as shell input
See the following example:
$ cat foo.sh
ps1='\[\033[1;31m\]\u:\w \[\033[0m\]\$ '
ps1_quoted=$( printf %q "$ps1" )
ssh -t foo@localhost \
'export FOO=bar;' \
"export PS1=$ps1_quoted;" \
'bash --norc'
Result: the remote shell starts with the custom prompt applied.
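The round trip can also be checked locally; this is a sketch in which `bash -c` stands in for the ssh hop that re-parses the command string:

```shell
# printf %q escapes a string so it survives one extra round of shell parsing
ps1='\[\033[1;31m\]\u:\w \[\033[0m\]\$ '
ps1_quoted=$(printf %q "$ps1")

# Re-parse the quoted form in a fresh shell, as the remote side would,
# and confirm it comes back byte-for-byte identical
roundtrip=$(bash -c "printf '%s' $ps1_quoted")
[ "$roundtrip" = "$ps1" ] && echo "round trip ok"
```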

remote ssh command: first echo output is lost

I'm trying to run several commands on a remote box via an ssh one-liner, specifying them as a semicolon-separated string passed to "bash -c". It works in some cases, but not in others. Check this out:
# Note: the "echo 1" output is lost:
bash-3.2$ ssh sandbox bash -c "echo 1; echo 2; echo 3"
2
3
# Note: first echo is ignored again
bash-3.2$ ssh sandbox bash -c "echo 0; echo 1; echo 2; echo 3"
1
2
3
# But when we run other commands (for example "date") then nothing is lost
bash-3.2$ ssh sandbox bash -c "date; date;"
Wed Nov 7 20:27:55 UTC 2018
Wed Nov 7 20:27:55 UTC 2018
What am I missing?
Remote OS: Ubuntu 16.04.5 LTS
Remote ssh: OpenSSH_7.2p2 Ubuntu-4ubuntu2.4, OpenSSL 1.0.2g 1 Mar 2016
Local OS: macOS High Sierra Version 10.13.3
Local ssh: OpenSSH_7.6p1, LibreSSL 2.6.2
Update:
The above example is a heavily simplified picture of what I'm trying to do.
The practical application is to generate a few files on the remote box by echoing into the remote filesystem:
#!/bin/bash
A=a
B=b
C=c
ssh -i ~/.ssh/${REMOTE_FQDN}.pem ${REMOTE_FQDN} sudo bash -c \
"echo $A > /tmp/_a; echo $B > /tmp/_b; echo $C > /tmp/_c;"
After I run the above script and go to remote box to check results I see the following:
root@sandbox:/tmp# for i in `find ./ -name '_*'|sort`; do echo "----- ${i} ----"; cat $i; done
----- ./_a ----
----- ./_b ----
b
----- ./_c ----
c
As you can see, the first "echo" command generated a blank file!
To be clear, there are three shells at work here: your local shell, which interprets the ssh command line; the login shell that ssh automatically runs for you on the remote side; and the bash you're invoking explicitly.
The reason the 1 is "disappearing" is that the shell that interprets the ssh command "eats" the quotes around the -c argument, and then the shell on the other side of ssh splits the arguments at whitespace. So it ends up looking like bash -c echo 1; echo 2; echo 3. In turn, -c just gets echo, which echoes an empty line; 1 becomes the value of that shell's $0, which isn't used. Then the inner bash returns, and the direct ssh shell runs echo 2; echo 3 normally.
Consider this:
$ ssh xxx bash -c "'echo 1'; echo 2; echo 3"
1
2
3
where echo 1 is protected within the ssh arguments, so the 2nd level ssh shell is passed bash -c 'echo 1'; echo 2; echo 3. The innermost 3rd level shell echos 1, and then the 2nd level ssh shell echos 2 and 3.
Here is yet another interesting permutation:
$ ssh xxx bash -c "'echo 1; echo 2; echo 3'"
1
2
3
here, the inner shell gets all the echos as they're kept grouped within the first shell by " and within the second shell by '.
In general, shell scripts that pass arguments to shell scripts that run shell scripts can be pretty difficult to build. I'd recommend you change your technique a bit to save yourself a lot of effort: instead of passing the shell commands as command-line arguments to ssh, provide them through standard input to the remote shell. Consider using a pipeline like this, which avoids recursive shell interpretation:
$ echo "echo 1; echo 2; echo 3" | ssh -T xxx
1
2
3
(Here, the -T just suppresses ssh's complaint about the lack of a pseudo-terminal.)
All the arguments to ssh are combined into a single whitespace-separated string and passed to sh -c on the remote end. This means that
ssh sandbox bash -c "echo 1; echo 2; echo 3"
results in the execution of
sh -c 'bash -c echo 1; echo 2; echo 3'
Note the loss of quotes; ssh got the three arguments bash, -c, and echo 1; echo 2; echo 3 after quote removal. On the remote end, bash -c echo 1 just executes echo, with $0 in the shell set to 1.
The command
ssh sandbox bash -c "date; date;"
is treated the same way, but now the first command contains no whitespace. The result on the remote end is
sh -c 'bash -c date; date;'
which means first a new instance of bash runs the date command, followed by the date command being executed directly by sh.
In general, it's a bad idea to use ssh's implicit concatenation. Always pass the command you want executed as a properly escaped single argument:
ssh sandbox 'bash -c "echo 1; echo 2; echo 3"'
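The argument joining described above can be reproduced locally without ssh at all; in this sketch, `sh -c` stands in for the remote login shell that receives the already-joined, already-unquoted string:

```shell
# What the remote side receives: ssh joins the arguments with spaces
# after the local shell has already stripped the quotes
joined='bash -c echo 1; echo 2; echo 3'

# sh -c runs it: `bash -c echo 1` echoes a blank line (with $0 set to 1),
# then sh itself runs `echo 2` and `echo 3`
out=$(sh -c "$joined")
printf '%s\n' "$out"   # prints a blank line, then 2, then 3 - the 1 is lost
```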

How to pass bash variable to cmd?

How can I pass $line to the cmd command?
#!/bin/bash
while read line
do
sshpass -p "..." ssh -o StrictHostKeyChecking=no -tt windows@172.... -p 333 cmd /c "cd C:\ & download_give_id.exe '$#$line' "
done <apps.txt
Basically, if you want to interpolate a variable into a bash string you need to use double quotes instead of single quotes:
str="${var} foo bar" # works
str='${var} foo bar' # does not work
In the special case that you are running ssh commands inside a while loop, I strongly recommend passing /dev/tty explicitly as input to the command: if the remote command reads from stdin for whatever reason, it will otherwise slurp the stdin of the while loop.
while read line ; do
ssh ... -- "${line} ..." < /dev/tty # Pass tty to stdin
done < input.txt
Note: The above command will work only if the process has been started in a terminal. If the process is not running in a terminal, you need to pass something else as stdin for the inner ssh command.
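The slurping effect is easy to reproduce without ssh. In this sketch, `cat` stands in for the remote command that reads stdin, and `/dev/null` replaces `/dev/tty` in the fix so the example runs non-interactively (both substitutions are assumptions for illustration):

```shell
printf 'one\ntwo\nthree\n' > /tmp/lines_demo.txt

# Inner command reads stdin: it eats the rest of the loop's input,
# so the loop body runs only once
eaten=0
while read -r line; do
    eaten=$((eaten + 1))
    cat > /dev/null            # stands in for ssh reading stdin
done < /tmp/lines_demo.txt

# Inner command's stdin redirected away: all three lines survive
kept=0
while read -r line; do
    kept=$((kept + 1))
    cat < /dev/null > /dev/null
done < /tmp/lines_demo.txt

echo "eaten loop ran $eaten time(s), fixed loop ran $kept times"
```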

Bash script runs one command before previous. I want them one after the other

So part of my script is as follows:
ssh user@$remoteServer "
cd ~/a/b/c/;
echo -e 'blah blah'
sleep 1 # Added this just to make sure it waits.
foo=`grep something xyz.log |sed 's/something//g' |sed 's/something-else//g'`
echo $foo > ~/xyz.list
exit "
In my output I see:
grep: xyz.log: No such file or directory
blah blah
Whereas when I ssh to the server, xyz.log does exist within ~/a/b/c/
Why is the grep statement getting executed before the echo statement?
Can someone please help?
The problem here is that your command in backticks is being run locally, not on the remote end of the SSH connection. Thus, it runs before you've even connected to the remote system at all! (This is true for all expansions that run in double-quotes, so the $foo in echo $foo as well).
Use a quoted heredoc to protect your code against local evaluation:
ssh user@$remoteServer bash -s <<'EOF'
cd ~/a/b/c/;
echo -e 'blah blah'
sleep 1 # Added this just to make sure it waits.
foo=`grep something xyz.log |sed 's/something//g' |sed 's/something-else//g'`
echo $foo > ~/xyz.list
exit
EOF
If you want to pass through a variable from the local side, the easy way is with positional parameters:
printf -v varsStr '%q ' "$varOne" "$varTwo"
ssh "user@$remoteServer" "bash -s $varsStr" <<'EOF'
varOne=$1; varTwo=$2 # set as remote variables
echo "Remote value of varOne is $varOne"
echo "Remote value of varTwo is $varTwo"
EOF
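The positional-parameter pattern above can be exercised locally too; in this sketch, `sh -c` plays the role of the remote login shell that re-parses the joined command string before `bash -s` reads the script from stdin:

```shell
varOne='hello world'
varTwo='$ign "quotes"'

# %q-quote each value so it survives re-parsing on the far side
printf -v varsStr '%q ' "$varOne" "$varTwo"

# sh -c re-parses the string exactly as the remote shell would;
# bash -s then reads the script from stdin with $1/$2 set
out=$(sh -c "bash -s $varsStr" <<'EOF'
echo "one=$1 two=$2"
EOF
)
echo "$out"   # prints: one=hello world two=$ign "quotes"
```

Because the heredoc delimiter is quoted, nothing in the script body expands locally; the values travel only through the %q-protected positional parameters.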
[command server] ------> [remote server]
The better way is to create a shell script on the "remote server" and run it from the "command server", like so:
ssh ${remoteserver} "/bin/bash /foo/foo.sh"
It will solve many problems; the aim is to keep things simple, not complex.

cat in multiple ssh commands does not work

This is probably very basic but unfortunately I have no idea how to google it.
Why doesn't the snippet below work as expected? I mean, how can I make cat point to the remote file?
#!/bin/bash
ssh user@remoteaddress << EOF
mkdir sandpit
cd sandpit
echo "foo" > foo.txt
echo `cat foo.txt` > foo2.txt
EOF
Use it as:
ssh -t -t user@remoteaddress <<'EOF'
mkdir sandpit
cd sandpit
echo "foo" > foo.txt
cat foo.txt > foo2.txt
xargs kill < pid.txt
exit
EOF
Without quotes around the starting EOF, all words in the heredoc are subject to shell expansion, and the backquotes are expanded in your current shell, not on the remote side of the ssh connection.
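The difference is easy to see locally. This sketch mirrors the question, with plain `bash` standing in for ssh and a throwaway temp directory in place of the sandpit (an assumption for illustration); an `X` marker is appended so an empty expansion is visible:

```shell
workdir=$(mktemp -d)
cd "$workdir"

# Unquoted delimiter: `cat foo.txt` runs NOW, in the current shell,
# before the inner shell has created foo.txt, so it expands to nothing
a=$(bash <<EOF
echo "foo" > foo.txt
echo "`cat foo.txt 2>/dev/null`X" > foo2.txt
cat foo2.txt
EOF
)

rm -f foo.txt foo2.txt

# Quoted delimiter: the backquotes reach the inner shell untouched
# and run only after foo.txt exists
b=$(bash <<'EOF'
echo "foo" > foo.txt
echo "`cat foo.txt`X" > foo2.txt
cat foo2.txt
EOF
)
echo "unquoted: $a  quoted: $b"   # prints: unquoted: X  quoted: fooX
```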
