How to pass local variables to ssh scope in bash script? - linux

I am writing a bash script which has to perform some commands on a remote server via ssh. The script has two major parts:
Part 1:
Using local variables $A and $B
Part 2:
Execute commands on remote server as follows:
ssh -T user@servername << 'EOF'
...
Using local variables $A and $B
...
EOF
The problem is that the local variables $A and $B are not available within the scope of the ssh commands on the remote server. As far as I understand, the variables $A and $B outside and within the ssh scope are not the same.
So my question is: how do I pass local variables from the bash script into the ssh scope?
One more note: part 2 is pretty big, so I can't use a one-liner after ssh.
Thanks

The problem has nothing to do with ssh; it is only related to here documents in bash or any other POSIX shell.
The man page for bash says in the here-documents paragraph:
The format of here-documents is:
<<[-]word
here-document
delimiter
No parameter expansion, command substitution, arithmetic expansion, or pathname expansion is performed on word. If any characters in word are quoted, the delimiter is the result of quote removal on word, and the lines in the here-document are not expanded. If word is unquoted, all lines of the here-document are subjected to parameter expansion, command substitution, and arithmetic expansion. In the latter case, the character sequence \<newline> is ignored, and \ must be used to quote the characters \, $, and ` .
As you quote EOF, you explicitly ask the shell not to expand the $A and $B variables.
The most robust way is to not quote EOF and to consistently escape all other special characters in the here document.
For example, if your here document contained something like:
ssh -T user@servername << 'EOF'
for f in /var/log/messages*
do echo "Filename: " $f
done
EOF
you could rewrite it with no quotes around EOF but with a backslash before the $ that must stay literal:
ssh -T user@servername << EOF
for f in /var/log/messages*
do echo "Filename: " \$f
done
EOF
That way all unescaped variables would be interpolated.
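Applied to the original question, a minimal sketch (the values of A and B are just illustrative; only the variable that must be expanded on the remote side gets the backslash):
A="foo"
B="bar"
ssh -T user@servername << EOF
echo "A is $A and B is $B"      # expanded locally before ssh runs
for f in /var/log/messages*
do echo "Filename: " \$f        # \$f is expanded on the remote server
done
EOF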
Alternatively, if the server allows it, you can try to pass the two variables as environment variables.
Say you want to use the names PARAM1 and PARAM2. The sshd_config file on the server should contain the line AcceptEnv PARAM1 PARAM2 because by default and for security reasons no environment variable is accepted.
You can then use:
export PARAM1=$A
export PARAM2=$B
ssh -T -o SendEnv=PARAM1 -o SendEnv=PARAM2 user@servername << 'EOF'
...
Using variables $PARAM1 and $PARAM2
...
EOF

There could be a way to directly tell ssh to use local variables, but I'll answer quickly and mention that you could wrap the ssh call in a script that inserts the variable definitions on the remote side, so they are available to you once you get control of the command prompt.
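A minimal sketch of that wrapper idea, assuming the remote login shell is bash and that A and B are the local variables from the question: emit quoted assignments first, then the unexpanded script body, and feed both to the remote shell on stdin.
{
printf 'A=%q\nB=%q\n' "$A" "$B"    # assignments quoted and expanded locally
cat <<'EOF'
echo "A is $A and B is $B"         # rest of part 2, not expanded locally
EOF
} | ssh -T user@servername bash -s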

Related

Variable is empty when passed to bash [duplicate]

I'm trying to create a script file using string substitution from environment variables, but I also want to prevent some variables from being expanded:
export PLACEHOLDER1="myPlaceholder1Value"
export PLACEHOLDER2="myPlaceholder2Value"
sudo /bin/su -c "cat << EOF > /etc/init.d/my-script
#!/bin/bash
myvariable_1=toto$PLACEHOLDER1
myvariable_2=titi$PLACEHOLDER2
myvariable_final=\"dynamicvar=\${myvariable_1},\${myvariable_2}\"
EOF
"
It results in the following, which is not good, because the variables in myvariable_final are not escaped and get substituted (they end up empty), just like the init script dependency variables ($remote_fs, $syslog, $network, $time) would be:
#!/bin/bash
myvariable_1=totomyPlaceholder1Value
myvariable_2=titimyPlaceholder2Value
myvariable_final="dynamicvar=,"
If I try to put a backslash \ before the dollar signs $, I manage to avoid the substitution, but I get an unwanted backslash \:
export PLACEHOLDER1="myPlaceholder1Value"
export PLACEHOLDER2="myPlaceholder2Value"
sudo /bin/su -c "cat << EOF > /etc/init.d/my-script
#!/bin/bash
myvariable_1=toto$PLACEHOLDER1
myvariable_2=titi$PLACEHOLDER2
myvariable_final=\"dynamicvar=\$\{myvariable_1},\$\{myvariable_2}\"
EOF
"
results in:
#!/bin/bash
myvariable_1=totomyPlaceholder1Value
myvariable_2=titimyPlaceholder2Value
myvariable_final="dynamicvar=$\{myvariable_1},$\{myvariable_2}"
The wanted/expected result would have been:
#!/bin/bash
myvariable_1=totomyPlaceholder1Value
myvariable_2=titimyPlaceholder2Value
myvariable_final="dynamicvar=${myvariable_1},${myvariable_2}"
Solved by putting quotes around the EOF as below and using backslashes to control the escaping where needed:
export PLACEHOLDER1="myPlaceholder1Value"
export PLACEHOLDER2="myPlaceholder2Value"
sudo /bin/su -c "cat << 'EOF' > /etc/init.d/my-script
#!/bin/bash
myvariable_1=toto$PLACEHOLDER1
myvariable_2=titi$PLACEHOLDER2
myvariable_final=\"dynamicvar=\${myvariable_1},\${myvariable_2}\"
EOF
"
Just use 'EOF' to prevent the variable from expanding:
sudo /bin/su -c "cat << 'EOF' > /etc/init.d/my-script
# ^ ^
From man bash:
Here Documents
This type of redirection instructs the shell to read input from the
current source until a line containing only delimiter (with no
trailing blanks) is seen. All of the lines read up to that point are
then used as the standard input for a command.
The format of here-documents is:
<<[-]word
here-document
delimiter
No parameter expansion, command substitution, arithmetic expansion,
or pathname expansion is performed on word. If any characters in word
are quoted, the delimiter is the result of quote removal on word, and
the lines in the here-document are not expanded. If word is
unquoted, all lines of the here-document are subjected to parameter
expansion, command substitution, and arithmetic expansion. In the
latter case, the character sequence \<newline> is ignored, and \
must be used to quote the characters \, $, and `.
When using the su command, put the command itself in single quotes and just escape the $ with a backslash. The placeholder variables have to be set in the command's bash context (here, after su), so you need to do something like:
su -c 'ph="ph"; cat << EOF > script
varinscript=$ph
var=\${var}
EOF'
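For reference, under those quoting rules the generated script file would contain:
varinscript=ph
var=${var}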

linux bash, passing parameters using a variable issue

I am trying to use a variable to store the parameters, here is the simple test:
#!/bin/bash
sed_args="-e \"s/aaaa/bbbb/g\""
echo $sed_args
I expected the output to be
-e "s/aaaa/bbbb/g"
but it gives:
"s/aaaa/bbbb/g"
without the "-e"
I am new to bash, any comment is welcome. Thanks, maybe this is already answered somewhere.
You need an array to construct arguments dynamically:
#!/usr/bin/env bash
sed_args=('-e' 's/aaaa/bbbb/g')
echo "${sed_args[#]}"
When you use the variable without double quotes, it gets word-split by the shell even before echo sees the value(s). Then bash's builtin echo interprets -e as an option for itself (which is normally used to turn on interpretation of backslash escapes).
When you double quote the variable, it won't be split and will be interpreted as a single argument to echo:
echo "$sed_args"
For strings you don't control, it's safer to use printf, as it doesn't take any options after the format string:
printf %s "$string"
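A quick demonstration of the difference; the comments show what each line prints:
sed_args="-e \"s/aaaa/bbbb/g\""
echo $sed_args              # word-split, echo eats -e:  "s/aaaa/bbbb/g"
echo "$sed_args"            # single argument:           -e "s/aaaa/bbbb/g"
printf '%s\n' "$sed_args"   # never parsed as an option: -e "s/aaaa/bbbb/g"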

bash escape exclamation character inside variable with backtick

I have this bash script:
databases=`mysql -h$DBHOST -u$DBUSER -p$DBPASSWORD -e "SHOW DATABASES;" | tr -d "| " | grep -v Database`
and the issue is that the password may contain any character at all. How can I escape $DBPASSWORD in this case, if the password contains '!' and the command is inside backticks? I have no experience with bash scripts, but I've tried with "$DBPASSWORD" and with '$DBPASSWORD' and it doesn't work. Thank you
LATER EDIT: link to script here, line 170 -> https://github.com/Ardakilic/backmeup/blob/master/backmeup.sh
First: The answer from #bishop is spot on: Don't pass passwords on the command line.
Second: Use double quotes for all shell expansions. All of them. Always.
databases=$(mysql -h"$DBHOST" -u"$DBUSER" -p"$DBPASSWORD" -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)
Don't pass the MySQL password on the command line. One, it can be tricky with passwords containing shell meta-characters (as you've discovered). Two, importantly, someone using ps can sniff the password.
Instead, either put the password into the system my.cnf, your user configuration file (eg .mylogin.cnf) or create an on-demand file to hold the password:
function mysql() {
local tmpfile=$(mktemp)
cat > "$tmpfile" <<EOCNF
[client]
password=$DBPASSWORD
EOCNF
command mysql --defaults-extra-file="$tmpfile" -u"$DBUSER" -h"$DBHOST" "$@"   # command bypasses this wrapper function and runs the real mysql binary
rm "$tmpfile"
}
Then you can run it as:
mysql -e "SHOW DATABASES" | tr -d "| " ....
mysql -e "SELECT * FROM table" | grep -v ...
See the MySQL docs on configuration files for further examples.
I sometimes have the same problem when automating activities:
I have a variable containing a string (usually a password) that is set in a config file or passed on the command-line, and that string includes the '!' character.
I need to pass that variable's value to another program, as a command-line argument.
If I pass the variable unquoted, or in double-quotes ("$password"), the shell tries to interpret the '!', which fails.
If I pass the variable in single quotes ('$password'), the variable isn't expanded.
One solution is to construct the full command in a variable and then use eval, for example:
#!/bin/bash
username=myuser
password='my_pass!'
cmd="/usr/bin/someprog -user '$username' -pass '$password'"
eval "$cmd"
Another solution is to write the command to a temporary file and then source the file:
#!/bin/bash
username=myuser
password='my_pass!'
cmd_tmp=$HOME/.tmp.$$
touch $cmd_tmp
chmod 600 $cmd_tmp
cat > $cmd_tmp <<END
/usr/bin/someprog -user '$username' -pass '$password'
END
source $cmd_tmp
rm -f $cmd_tmp
Using eval is simple, but writing a file allows for multiple complex commands.
P.S. Yes, I know that passing passwords on the command-line isn't secure - there is no need for more virtue-signalling comments on that topic.

Bash command line arguments passed to sed via ssh

I am looking to write a simple script to run an SSH command on many hosts simultaneously, where the exact hosts are generated by another script. The problem is that when I run the script with something like sed, it doesn't work properly.
It should run like sshall.sh {anything here} and it will run the {anything here} part on all the nodes in the list.
sshall.sh
#!/bin/bash
NODES=`listNodes | grep "node-[0-9*]" -o`
echo "Connecting to all nodes and running: ${#:1}"
for i in $NODES
do
:
echo "$i : Begin"
echo "----------------------------------------"
ssh -q -o "StrictHostKeyChecking no" $i "${#:1}"
echo "----------------------------------------"
echo "$i : Complete";
echo ""
done
When it is run with something like whoami it works but when I run:
[root@myhost bin]# sshall.sh sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf
Connecting to all nodes and running: sed -i /^somebeginning/ s/$/,appendme/ /etc/myconfig.conf
node-1 : Begin
----------------------------------------
sed: -e expression #1, char 18: missing command
----------------------------------------
node-1 : Complete
node-2 : Begin
----------------------------------------
sed: -e expression #1, char 18: missing command
----------------------------------------
node-2 : Complete
…
Notice that the quotes disappear on the sed command when sent to the remote client.
How do I go about fixing my bash command?
Is there a better way of achieving this?
Substitute an eval-safe quoted version of your command into a heredoc:
#!/bin/bash
# ^^^^- not /bin/sh; printf %q is an extension
# Put your command into a single string, with each argument quoted to be eval-safe
printf -v cmd_q '%q ' "$@"
while IFS= read -r hostname; do
# run bash -s remotely, with that string passed on stdin
ssh -q -o 'StrictHostKeyChecking no' "$hostname" "bash -s" <<EOF
$cmd_q
EOF
done < <(listNodes | grep -o -e "node-[0-9*]")
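Assuming the snippet above replaces the body of sshall.sh, it is invoked exactly as in the question, and the quoted sed expression now arrives intact on every node:
./sshall.sh sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf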
Why this works reliably (and other approaches don't):
printf %q knows how to quote contents to be eval'd by that same shell (so spaces, wildcards, various local quoting methods, etc. will always be supported).
Arguments given to ssh are not passed to the remote command individually!
Instead, they're concatenated into a string passed to sh -c.
However: The output of printf %q is not portable to all POSIX-derived shells! It's guaranteed to be compatible with the same shell locally in use -- ksh will always parse output from printf '%q' in ksh, bash will parse output from printf '%q' in bash, etc; thus, you can't safely pass this string on the remote argument vector, because it's /bin/sh -- not bash -- running there. (If you know your remote /bin/sh is provided by bash, then you can run ssh "$hostname" "$cmd_q" safely, but only under this condition).
bash -s reads the script to run from stdin, meaning that passing your command there -- not on the argument vector -- ensures that it'll be parsed into arguments by the same shell that escaped it to be shell-safe.
You want to pass the entire command -- with all of its arguments, spaces, and quotation marks -- to ssh so it can pass it unchanged to the remote shell for parsing.
One way to do that is to put it all inside single quotation marks. But then you'll also need to make sure the single quotation marks within your command are preserved in the arguments, so the remote shell builds the correct arguments for sed.
sshall.sh 'sed -i '"'"'/^somebeginning/ s/$/,appendme/'"'"' /etc/myconfig.conf'
It looks redundant, but '"'"' is a common Bourne trick to get a single quotation mark into a single-quoted string. The first quote ends single-quoting temporarily, the double-quote-single-quote-double-quote construct appends a single quotation mark, and then the single quotation mark resumes your single-quoted section. So to speak.
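To see what string the outer shell actually ends up passing, you can print the same quoted argument locally:
printf '%s\n' 'sed -i '"'"'/^somebeginning/ s/$/,appendme/'"'"' /etc/myconfig.conf'
# prints: sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf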
Another trick that can be helpful for troubleshooting is to add the -v flag to your ssh flags, which will spit out lots of text, but most importantly it will show you exactly what string it's passing to the remote shell for parsing and execution.
--
All of this is fairly fragile around spaces in your arguments, which you'll need to avoid, since you're relying on shell parsing on the opposite end.
Thinking outside the box: instead of dealing with all the quoting issues and the word-splitting in the wrong places, you could attempt to a) construct the script locally (maybe use a here-document?), b) scp the script to the remote end, then c) invoke it there. This easily allows more complex command sequences, with all the power of shell control constructs etc. Debugging (checking proper quoting) would be a breeze by simply looking at the generated script.
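A minimal sketch of that approach, assuming listNodes behaves as in the question and /tmp/remote_task.sh is just an illustrative path:
# build the script locally (quoted heredoc: nothing is expanded here)
cat > /tmp/remote_task.sh <<'EOF'
#!/bin/bash
sed -i '/^somebeginning/ s/$/,appendme/' /etc/myconfig.conf
EOF
# copy it to each node and run it there
for host in $(listNodes | grep -o "node-[0-9]*"); do
scp -q /tmp/remote_task.sh "$host:/tmp/remote_task.sh"
ssh -q -o "StrictHostKeyChecking no" "$host" 'bash /tmp/remote_task.sh && rm /tmp/remote_task.sh'
done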
I recommend reading the command(s) from the standard input rather than from the command line arguments:
cmd.sh
#!/bin/bash -
# Load server_list with user@host "words" here.
cmd=$(</dev/stdin)
for h in ${server_list[*]}; do
ssh "$h" "$cmd"
done
Usage:
./cmd.sh <<'CMD'
sed -i '/^somebeginning/ s/$/,appendme/' /path/to/file1
# other commands
# here...
CMD
Alternatively, run ./cmd.sh, type the command(s), then press Ctrl-D.
I find the latter variant the most convenient: you don't even need here documents, and there's no need for extra escaping. Just invoke your script, type the commands, and press the shortcut. What could be easier?
Explanations
The problem with your approach is that the quotes are stripped from the arguments by the shell. For example, the argument '/^somebeginning/ s/$/,appendme/' will arrive as the string /^somebeginning/ s/$/,appendme/ (without the single quotes), which is an invalid argument for sed.
Of course, you can escape the command with the built-in printf as suggested in another answer here. But the command becomes much less readable after escaping. For example,
printf %q 'sed -i /^somebeginning/ s/$/,appendme/ /home/ruslan/tmp/file1.txt'
produces
sed\ -i\ /\^somebeginning/\ s/\$/\,appendme/\ /home/ruslan/tmp/file1.txt
which is not very readable and will look ugly if you print it to the screen in order to show progress.
That's why I prefer to read from the standard input and leave the command intact. My script prints the command strings to the screen, and I see them just in the form I have written them.
Note: the for .. in loop iterates over $IFS-separated "words" and is generally not the preferred way to traverse a list. It is usually better to invoke read -r in a while loop with an adjusted $IFS. I have used the for loop for simplicity, as the question is really about invoking the ssh command.
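For reference, a sketch of the same loop written with read -r, assuming server_list.txt holds one user@host per line:
while IFS= read -r h; do
ssh "$h" "$cmd"
done < server_list.txt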
Logging into multiple systems over SSH and using the same (or variations on the same) command is the basic use case behind ansible. The system is not without significant flaws, but for simple use cases is pretty great. If you want a more solid solution without too much faffing about with escaping and looping over hosts, take a look.
Ansible has a 'raw' module which doesn't even require any dependencies on the target hosts, and you might find that a very simple way to achieve this sort of functionality in a way that frees you from the considerations of looping over hosts, handling errors, marshalling the commands, etc and lets you focus on what you're actually trying to achieve.

How to pass local shell script variable to expect?

My question is related to How to pass variables from a shell script to an expect script?
but it's slightly different: apart from passing two runtime shell script variables, I want to pass a variable that is set inside the shell script. For example:
#!/bin/sh
d=`date '+%Y%m%d_%H%M'`
expect -c '
expect "sftp>"
# use $d here (how?)
'
You don't need to pass a date variable into expect. It has a very good date command builtin:
expect -c '
# ...
set d [timestamp -format "%Y%m%d_%H%M"]
something_with $d
# ...
'
If you need to do more complicated date manipulation, read about the Tcl clock command
Another good technique to pass shell variables to expect (without having to do complicated/messy quoting/escaping) is to use the environment: export your shell variables, and then access them with expect's global env array:
export d=$(date ...)
expect -c 'puts "the date is $env(d)"'
This seems the wrong way to do things. You should set up SSH keys (with ssh-keygen and ssh-copy-id), google about this.
Anyway, try this :
#!/bin/sh
d=`date '+%Y%m%d_%H%M'`
expect -c "
something_with $d"
Note the double quotes instead of single quotes.
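Building on that, a hedged sketch of how the shell-expanded $d could be used inside the sftp session (the host and file names are placeholders; any $ meant for expect/Tcl itself, rather than the shell, would need a backslash):
#!/bin/sh
d=`date '+%Y%m%d_%H%M'`
expect -c "
spawn sftp user@servername
expect \"sftp>\"
send \"put backup_$d.tar.gz\r\"
expect \"sftp>\"
send \"bye\r\"
"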
"Double quote" every literal that contains spaces/metacharacters and every expansion: "$var", "$(command "$var")", "${array[#]}", "a & b". Use 'single quotes' for code or literal $'s: 'Costs $5 US', ssh host 'echo "$HOSTNAME"'. See http://mywiki.wooledge.org/Quotes , http://mywiki.wooledge.org/Arguments and http://wiki.bash-hackers.org/syntax/words
