Run a bash script inside an expect script - Linux

I'm trying to run my bash script inside an expect script but I'm getting errors.
/usr/bin/expect <<EOD
spawn ssh nginubud@10.123.25.83 $(< try1.sh)
expect "assword:"
send "$reg\r"
expect eof
EOD
I'm trying to do this in expect: ssh nginubud@10.123.25.83 "$(< try1.sh)". That command works when run manually, but I need a way to run it in an automated way. I don't want to use RSA keys.
Error that I encountered:
spawn ssh nginubud@10.123.25.83 #tats script
invalid command name "echo"
while executing
"echo "Enter Year:""
Also, I can run my expect ssh script on its own, but when I include $(< try1.sh) and try to run it, I get "no variable" errors.

You can use ssh user@host bash -c .... For example:
[bash] % cat foo.sh
export CMD=$( printf '%q' "$(< try.sh)" )
expect << EOF
spawn ssh foo@localhost bash -c \$::env(CMD)
expect -nocase password:
send bar\r
expect eof
EOF
[bash] % cat try.sh
echo hello world | tr a-z A-Z
[bash] % bash foo.sh
spawn ssh foo@localhost bash -c echo\ hello\ world\ \|\ tr\ a-z\ A-Z
foo@localhost's password:
HELLO WORLD
[bash] %
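The printf '%q' step is what makes this work: it shell-quotes the contents of try.sh so the whole script survives being passed to ssh as a single argument, and $::env(CMD) reads that exported variable back inside expect. A quick way to see what %q produces (exact escaping can differ slightly between bash versions):
[bash] % printf '%q\n' "$(< try.sh)"
echo\ hello\ world\ \|\ tr\ a-z\ A-Z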

Related

Bash+Expect script not running properly in crontab

I have a bash script that is supposed to run periodically. The script should connect to a remote SFTP server and get a file from there.
Since this is an SFTP server, I had to use expect with the bash script.
The script runs well when I run it manually, but it fails when run via crontab.
The problematic function is get_JSON_file().
Please advise...
This is the code:
#!/bin/bash
export xxxxx
export xxxxx
export PATH=xxxxx
check_if_file_is_open(){
while :
do
if ! [[ `lsof | grep file.txt` ]]
then
break
fi
sleep 1
done
}
get_JSON_file(){
/usr/bin/expect -f <(cat << EOF
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
interact
EOF
)
}
get_JSON_file
check_if_file_is_open
cp file.txt /path/to/destination/folder
Expect's interact works only when stdin is on a tty/pty, but a cron job does not run on a tty/pty. So replace interact with expect eof (or expect -timeout 12345 eof if necessary).
That's a very awkward way to pass expect commands to the expect interpreter. Use a (quoted) heredoc instead, and drop the -f option for expect:
get_JSON_file(){
/usr/bin/expect <<'EOF'
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
expect eof
EOF
}
The most important tip for debugging expect scripts is to invoke expect's debug output. While you're working out the kinks, use
expect -d <<'EOF'
and in the crontab, you'd want to redirect stderr to stdout so you get the debugging output:
* * * * * /path/to/script.sh 2>&1
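Depending on your cron setup, that output may only be mailed to you. If you'd rather capture it in a file, redirect both streams there instead (the log path below is just an example):
* * * * * /path/to/script.sh >> /tmp/get_json_file.log 2>&1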
To run a function within a shell script, no parentheses should be used.
Your code then becomes:
#!/bin/bash
export xxxxx
export xxxxx
export PATH=xxxxx
function check_if_file_is_open(){
while :
do
if ! [[ `lsof | grep file.txt` ]]
then
break
fi
sleep 1
done
}
function get_JSON_file(){
/usr/bin/expect -f <(cat << EOF
spawn sftp -P port user@ip
expect "Password:"
send "password\r"
expect "$ "
send "get path/to/file/file.json\r"
send "exit\r"
interact
EOF
)
}
get_JSON_file
check_if_file_is_open
cp file.txt /path/to/destination/folder
Create a new script with the screen command and add it to crontab:
new_script.sh
#!/bin/bash
cd script_path
screen -dm -S screen_name ./your_script.sh
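If you go this route, you can check later whether the detached session is still alive and reattach to it to see what your script is doing (screen_name is whatever session name you chose above):
screen -ls            # list running screen sessions
screen -r screen_name # reattach to the detached session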

How to pass a variable value to another server using ssh

#!/bin/ksh
CTN=1
ssh -q user@host 'exec bash -s' << 'ENDSSH'
cd abc/def
./scriptname \$CTN
ENDSSH
exit;
However, on the remote server, the value of the variable CTN is not getting passed.
Please help.
It should be:
CTN=1
ssh -q user@host 'exec bash -s' << ENDSSH
cd abc/def
./scriptname "$CTN"
ENDSSH
Since you want $CTN to be expanded locally, you must not escape the $ and must not put ENDSSH between single quotes.
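You can see the difference without involving ssh at all; here cat stands in for the remote shell and shows exactly what text it would receive (a minimal sketch):
CTN=1
cat << ENDSSH
./scriptname "$CTN"
ENDSSH
# prints: ./scriptname "1"
cat << 'ENDSSH'
./scriptname "$CTN"
ENDSSH
# prints: ./scriptname "$CTN"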

Unable to use local and remote variables within a heredoc or command over SSH

Below is an example of an ssh script using a heredoc (the actual script is more complex). Is it possible to use both local and remote variables within an SSH heredoc or command?
FILE_NAME is set on the local server to be used on the remote server. REMOTE_PID is set when running on the remote server to be used on the local server. FILE_NAME is recognised in the script; REMOTE_PID is not set.
If EOF is changed to 'EOF', then REMOTE_PID is set and FILE_NAME is not. I don't understand why this is.
Is there a way in which both REMOTE_PID and FILE_NAME can be recognised?
Version 2 of bash is being used. The default remote login shell is csh; the local script is bash.
FILE_NAME=/example/pdi.dat
ssh user@host bash << EOF
# run script with output...
REMOTE_PID=$(cat $FILE_NAME)
echo $REMOTE_PID
EOF
echo $REMOTE_PID
You need to escape the $ sign if you don't want the variable to be expanded:
$ x=abc
$ bash <<EOF
> x=def
> echo $x # This expands x before sending it to bash. Bash will see only "echo abc"
> echo \$x # This lets bash perform the expansion. Bash will see "echo $x"
> EOF
abc
def
So in your case:
ssh user@host bash << EOF
# run script with output...
REMOTE_PID=$(cat $FILE_NAME)
echo \$REMOTE_PID
EOF
Or alternatively you can just use a herestring with single quotes:
$ x=abc
$ bash <<< '
> x=def
> echo $x # This will not expand, because we are inside single quotes
> '
def
remote_user_name=user
instance_ip=127.0.0.1
external=$(ls /home/)
ssh -T -i ${private_key} -l ${remote_user_name} ${instance_ip} << END
internal=\$(ls /home/)
echo "\${internal}"
echo "${external}"
END

Unable to SSH a script of bash commands via expect

I'm attempting to push a single set of commands to multiple remote hosts. I can use the following on an individual basis:
ssh user@remoteHost "bash -s" <./commands.sh
I can put this into a loop, but then I'm stuck typing in the password n number of times. Disabling the password prompts in the SSH config files is not an option for me.
I've attempted to use expect within a loop, but I'm unable to get it working.
#!/bin/bash
HOSTS="host1 host2"
read -sp "Password: " PASSWORD
for HOST in $HOSTS; do
expect -c "
spawn /usr/bin/ssh user@$HOST "bash -s" <./commands.sh
expect_before {
"*yes/no*" {send "yes"\r;exp_continue}}
expect {
"*Password*" {send $PASSWORD\r;interact}}
exit"
done
I get the following error:
spawn /usr/bin/ssh root@host1 bash
expect: invalid option -- 's'
usage: expect [-div] [-c cmds] [[-f] cmdfile] [args]
spawn /usr/bin/ssh root@host2 bash
expect: invalid option -- 's'
usage: expect [-div] [-c cmds] [[-f] cmdfile] [args]
Any ideas? It appears as though expect is trying to interpret the bash commands. I'm unsure how to stop this.
Solution:
replace
spawn /usr/bin/ssh user@$HOST "bash -s" <./commands.sh
with
spawn sh -c {ssh root@$HOST 'bash -ls' < /tmp/commands.sh}
Final Code:
#!/bin/bash
HOSTS="host1 host2"
read -sp "Password: " PASSWORD
for HOST in $HOSTS; do
expect -c "
spawn sh -c {ssh root@$HOST 'bash -ls' < /tmp/commands.sh}
expect_before {
"*yes/no*" {send "yes"\r;exp_continue}}
expect {
"*assword*" {send $PASSWORD\r;interact}}
exit"
done
I'd suggest this:
#!/bin/bash
hosts=(host1 host2)
read -sp "Password: " password
for host in "${hosts[@]}"; do
env h="$host" p="$password" expect <<'END_EXPECT'
spawn sh -c "/usr/bin/ssh user@$env(h) 'bash -s' <./commands.sh"
expect {
"*yes/no*" {send "yes\r"; exp_continue}
"*Password*" {send "$env(p)\r"}
}
interact
END_EXPECT
done
notes
uses lower case variable names: leave upper case varnames for the shell's use
uses a quoted heredoc to contain the expect code
that lets you use single and double quotes within expect without having to worry about quoting hell in the shell
uses env to pass shell variables to expect via the environment
simplifies your expect statement
The danger with using
expect -c " ...; expect "*Password*" ..."
is that the inner double quotes get matched with the outer quotes and are removed by the shell. That leaves *Password* as a bare glob that the shell can expand based on the files in your current directory and the shell settings. For example, create a file named "The Password" (with a space) and you'll get an error.
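You can see the quote pairing with a plain echo; the inner double quotes are consumed by the shell before expect is ever invoked:
$ echo "expect "*Password*" {send pw}"
expect *Password* {send pw}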

Shell script to connect to a remote system and execute a bunch of commands on it

I need a shell script that can log in to a remote system so that I can execute a bunch of commands on that system.
I made a script and it's actually working:
#!/bin/bash
USERNAME=KRUNAL
IP=10.61.162.241
ssh -l ${USERNAME} ${IP} "pwd "
ssh -l ${USERNAME} ${IP} "ls -la"
ssh -l ${USERNAME} ${IP} ./a.out
The problem is that if I write, for example,
ssh -l ${USERNAME} ${IP} "pwd " # this executes on the remote system
ls -la # this executes on the current system
then every command I want to run on the remote system needs its own ssh invocation.
Is there any way that I can run a bunch of code on the remote system with a single login?
You can send as many commands to ssh as you want, provided that you separate them with ; or line breaks. So this should work:
ssh -l ${USERNAME} ${IP} "pwd; ls -la"
@Joao's suggestion works fine; however, it's impractical when writing many lines.
If this is the case, you can do
ssh -l ${USERNAME} ${IP} bash << 'EOF'
cd /some/directory
./a.out
who am i
for i in `seq 1 10`
do
echo $i
done
EOF
Anything between 'EOF' and the final EOF will be executed on the server side.
You can also replace bash with csh or python and write code for that interpreter instead.
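For example, assuming python3 is available on the remote host, the same heredoc pattern runs a Python snippet there (a sketch, not from the original answer):
ssh -l ${USERNAME} ${IP} python3 << 'EOF'
import platform
print("running on", platform.node())
EOF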
If you want the output of the ssh session to be stored in a file (say session.log), then replace
ssh -l ${USERNAME} ${IP} bash << 'EOF'
with
ssh -l ${USERNAME} ${IP} bash << 'EOF' > 'session.log'
The rest remains unchanged.
