Unable to SSH a script of bash commands via expect - linux

I'm attempting to push a single set of commands to multiple remote hosts. I can use the following on an individual basis:
ssh user@remoteHost "bash -s" <./commands.sh
I can put this into a loop, but then I'm stuck typing in the password n number of times. Disabling the password prompts in the SSH config files is not an option for me.
I've attempted to use expect within a loop, but I'm unable to get it working.
#!/bin/bash
HOSTS="host1 host2"
read -sp "Password: " PASSWORD
for HOST in $HOSTS; do
    expect -c "
    spawn /usr/bin/ssh user@$HOST "bash -s" <./commands.sh
    expect_before {
        "*yes/no*" {send "yes"\r;exp_continue}}
    expect {
        "*Password*" {send $PASSWORD\r;interact}}
    exit"
done
I get the following error:
spawn /usr/bin/ssh root@host1 bash
expect: invalid option -- 's'
usage: expect [-div] [-c cmds] [[-f] cmdfile] [args]
spawn /usr/bin/ssh root@host2 bash
expect: invalid option -- 's'
usage: expect [-div] [-c cmds] [[-f] cmdfile] [args]
Any ideas? It appears as though expect is trying to interpret the bash commands. I'm unsure how to stop this.

Solution:
replace
spawn /usr/bin/ssh user@$HOST "bash -s" <./commands.sh
with
spawn sh -c {ssh root@$HOST 'bash -ls' < /tmp/commands.sh}
Final Code:
#!/bin/bash
HOSTS="host1 host2"
read -sp "Password: " PASSWORD
for HOST in $HOSTS; do
    expect -c "
    spawn sh -c {ssh root@$HOST 'bash -ls' < /tmp/commands.sh}
    expect_before {
        "*yes/no*" {send "yes"\r;exp_continue}}
    expect {
        "*assword*" {send $PASSWORD\r;interact}}
    exit"
done

I'd suggest this:
#!/bin/bash
hosts=(host1 host2)
read -sp "Password: " password
for host in "${hosts[@]}"; do
    env h="$host" p="$password" expect <<'END_EXPECT'
        spawn sh -c "/usr/bin/ssh user@$env(h) 'bash -s' <./commands.sh"
        expect {
            "*yes/no*" {send "yes\r"; exp_continue}
            "*Password*" {send "$env(p)\r"}
        }
        interact
END_EXPECT
done
Notes:
uses lower-case variable names: leave upper-case names for the shell's own use
uses a quoted heredoc to contain the expect code
that lets you use single and double quotes within expect without having to worry about quoting hell in the shell
uses env to pass shell variables to expect via the environment
simplifies your expect statement
The danger with using
expect -c " ...; expect "*Password*" ..."
is that the inner double quotes get matched with the outer quotes, and are removed by the shell. That leaves *Password* as a bare glob that the shell can expand based on the files in your current directory and the shell settings. For example, create a file named "The Password" (with a space) and you'll get an error.
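You can see the quote removal for yourself by echoing the argument the shell would otherwise hand to expect (assuming no file in the current directory happens to match the resulting pattern):
$ echo "expect "*Password*" {send secret\r}"
expect *Password* {send secret\r}
The inner quotes are gone, and *Password* is left unquoted and subject to pathname expansion.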

Related

Bash+Expect script not running properly in crontab

I have a bash script that is supposed to run periodically. The script should connect to a remote SFTP server and get a file from there.
Since this is an SFTP server, I had to use expect with the bash script.
The script runs well when I run it manually but fails when running via crontab.
The problematic function is get_JSON_file().
Please advise...
This is the code:
#!/bin/bash
export xxxxx
export xxxxx
export PATH=xxxxx

check_if_file_is_open(){
    while :
    do
        if ! [[ `lsof | grep file.txt` ]]
        then
            break
        fi
        sleep 1
    done
}

get_JSON_file(){
    /usr/bin/expect -f <(cat << EOF
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
interact
EOF
    )
}

get_JSON_file
check_if_file_is_open
cp file.txt /path/to/destination/folder
Expect's interact works only when stdin is on a tty/pty, but a cron job does not run on a tty/pty. So replace interact with expect eof (or expect -timeout 12345 eof if necessary).
That's a very awkward way to pass expect commands to the expect interpreter. Use a (quoted) heredoc instead, and drop the -f option for expect:
get_JSON_file(){
    /usr/bin/expect <<'EOF'
        spawn sftp -P port user@ip
        expect "Password:"
        send "password\r"
        expect "$ "
        send "get path/to/file/file.json\r"
        send "exit\r"
        expect eof
EOF
}
The most important tip for debugging expect scripts is to enable expect's debug output. While you're working out the kinks, use
expect -d <<'EOF'
and in the crontab, you'd want to redirect stderr to stdout so you get the debugging output
* * * * * /path/to/script.sh 2>&1
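Note that with only 2>&1 the combined output still goes wherever cron normally delivers job output (usually local mail). As a sketch, assuming a hypothetical log path /tmp/get_json.log, you could capture it in a file instead:
* * * * * /path/to/script.sh >> /tmp/get_json.log 2>&1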
To call a function within a shell script, no parentheses should be used.
Your code then becomes:
#!/bin/bash
export xxxxx
export xxxxx
export PATH=xxxxx

function check_if_file_is_open(){
    while :
    do
        if ! [[ `lsof | grep file.txt` ]]
        then
            break
        fi
        sleep 1
    done
}

function get_JSON_file(){
    /usr/bin/expect -f <(cat << EOF
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
interact
EOF
    )
}

get_JSON_file
check_if_file_is_open
cp file.txt /path/to/destination/folder
Create a new script that uses the screen command and add it to the crontab:
new_script.sh
#!/bin/bash
cd script_path
screen -dm -S screen_name ./your_script.sh
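A matching crontab entry for the wrapper might look like this (the path and schedule below are placeholders):
*/5 * * * * /path/to/new_script.sh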

How to pass a variable value to another server using ssh

#!/bin/ksh
CTN=1
ssh -q user@host 'exec bash -s' << 'ENDSSH'
cd abc/def
./scriptname \$CTN
ENDSSH
exit;
However, on the remote server the value of the variable CTN is not getting passed.
Please help.
It should be:
CTN=1
ssh -q user@host 'exec bash -s' << ENDSSH
cd abc/def
./scriptname "$CTN"
ENDSSH
Since you want $CTN to be expanded locally, you must not escape the $ and must not put ENDSSH between single quotes.
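You can see the difference locally with cat instead of ssh: an unquoted delimiter lets the local shell expand $CTN, a quoted one passes it through verbatim:
CTN=1
cat << ENDSSH
./scriptname "$CTN"
ENDSSH
# prints: ./scriptname "1"
cat << 'ENDSSH'
./scriptname "$CTN"
ENDSSH
# prints: ./scriptname "$CTN"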

Run bash script inside the expect script

I'm trying to run my bash script inside an expect script but I'm getting errors.
/usr/bin/expect <<EOD
spawn ssh nginubud@10.123.25.83 $(< try1.sh)
expect "assword:"
send "$reg\r"
expect eof
EOD
I'm trying to do this in expect: ssh nginubud@10.123.25.83 "$(< try1.sh)". That one works, but I need to find a way to run it in an automated way. I don't want to use RSA keys.
The error that I encountered:
spawn ssh nginubud@10.123.25.83 #tats script
invalid command name "echo"
while executing
"echo "Enter Year:""
Also, I can run my expect ssh script, but when I include and try to run my $(< try1.sh) I'm getting "no variable" errors.
You can use ssh user@host bash -c .... For example:
[bash] % cat foo.sh
export CMD=$( printf '%q' "$(< try.sh)" )
expect << EOF
spawn ssh foo@localhost bash -c \$::env(CMD)
expect -nocase password:
send bar\r
expect eof
EOF
[bash] % cat try.sh
echo hello world | tr a-z A-Z
[bash] % bash foo.sh
spawn ssh foo@localhost bash -c echo\ hello\ world\ \|\ tr\ a-z\ A-Z
foo@localhost's password:
HELLO WORLD
[bash] %

shell script for remote connection to another system and executing a bunch of commands in it

I need a shell script that can log in to a remote system so that I can execute a bunch of commands on that system.
I made a script and it actually works:
#!/bin/bash
USERNAME=KRUNAL
IP=10.61.162.241
ssh -l ${USERNAME} ${IP} "pwd "
ssh -l ${USERNAME} ${IP} "ls -la"
ssh -l ${USERNAME} ${IP} ./a.out
The problem I have is that if, for example, I write:
ssh -l ${USERNAME} ${IP} "pwd " # this executes on the remote system
ls -la # this executes on the current system
then every time I need an ssh command to execute something on the remote system.
Is there any way I can run a bunch of commands on the remote system with a one-time login?
You can send as many commands to ssh as you want, provided that you separate them with ; or line breaks. So this should work:
ssh -l ${USERNAME} ${IP} "pwd; ls -la"
@Joao's suggestion works fine; however, it's impractical when writing many lines.
If this is the case you can do:
ssh -l ${USERNAME} ${IP} bash << 'EOF'
cd /some/directory
./a.out
who am i
for i in `seq 1 10`
do
    echo $i
done
EOF
Anything between the first 'EOF' and the final EOF will be executed on the server side.
You can also replace bash with csh or python and write code for that interpreter instead.
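For example, a minimal sketch with python (assuming a python interpreter is installed on the remote host; it reads the program from stdin because stdin is not a terminal):
ssh -l ${USERNAME} ${IP} python << 'EOF'
import platform
print(platform.node())
EOF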
If you want the output of the ssh session to be stored in a file (say, session.log), then replace
ssh -l ${USERNAME} ${IP} bash << 'EOF'
with
ssh -l ${USERNAME} ${IP} bash << 'EOF' > 'session.log'
The rest remains unchanged.

how to escape unusual/unique characters from expect scripts?

In an expect script I can set any command or character to run on the remote machine,
but the sad thing is that expect can't send the characters exactly as they are defined in the expect script.
For example, I want to run this line from the expect script in order to change the IP address from 10.10.10.10 to 1.1.1.1:
expect # {send "perl -i -pe 's/\Q10.10.10.10\E/1.1.1.1/' /etc/hosts\r"}
but when I run the expect script I actually see this line running on the console:
[root@localhost ~]# perl -i -pe 's/Q10.10.10.10E/1.1.1.1/' /etc/hosts
Pay attention that the backslashes before Q and before E have disappeared.
So I wonder how to escape those characters in the expect script,
so that expect will run the same line on the console, as follows:
[root@localhost ~]# perl -i -pe 's/\Q10.10.10.10\E/1.1.1.1/' /etc/hosts
REMARK: putting a backslash "\" before the backslash doesn't help!!!
My script:
#!/bin/ksh
#
expect=`cat << EOF
set timeout -1
spawn ssh 192.9.200.10
expect {
")?" { send "yes\r" ; exp_continue }
word: {send secret1\r}
}
expect # {send "perl -i -pe 's/\\Q10.10.10.10\\E/1.1.1.1/' /etc/hosts\r"}
expect # {send exit\r}
expect eof
EOF`
expect -c "$expect"
RESULTS (after I run my script):
spawn ssh 192.9.200.10
root@192.9.200.10's password:
Last login: Sun Aug 4 22:46:53 2013 from 192.9.200.10
[root@localhost ~]# perl -i -pe 's/Q10.10.10.10E/1.1.1.1/' /etc/hosts
[root@localhost ~]# exit
logout
Connection to 192.9.200.10 closed.
Using different Tcl quotes will work
expect # {
    # send text verbatim here
    send {perl -i -pe 's/\Q10.10.10.10\E/1.1.1.1/' /etc/hosts}
    # interpret backslash sequence as carriage return here
    send "\r"
}
Either escape it with \ or enclose the whole thing in {}
expect # {send "perl -i -pe 's/\\Q10.10.10.10\\E/1.1.1.1/' /etc/hosts\r"}
(Enclosing the entire thing in {} would send \r as those two characters, not as a line terminator, so that is not appropriate here.)
See the manual page about Tcl syntax.
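You can check the two quoting behaviours side by side in an interactive tclsh session (the perl command is just the example string from above):
% puts "perl -i -pe 's/\Q10.10.10.10\E/1.1.1.1/' /etc/hosts"
perl -i -pe 's/Q10.10.10.10E/1.1.1.1/' /etc/hosts
% puts {perl -i -pe 's/\Q10.10.10.10\E/1.1.1.1/' /etc/hosts}
perl -i -pe 's/\Q10.10.10.10\E/1.1.1.1/' /etc/hosts
In double quotes, Tcl drops the backslash from unrecognized escape sequences such as \Q and \E; inside braces, everything passes through verbatim.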
And another note: you could do the same substitution in Tcl itself, as long as you don't need to send the commands over SSH.

Resources