I need to set up a cron job on multiple servers (all Unix-based). For this, I have created a script (let's refer to it as script "A") which sets up the cron job correctly when executed manually. I have also uploaded this script to all the target servers, with the intention of calling and executing it automatically via another script on another server.
This other server contains the script "B" mentioned above, which logs into all the target servers and calls script A in order to set up the cron job on all of them automatically.
The problem is that script A is never called and the cron job is never set up. Script B is an Expect script; here is the code:
spawn rsync -avz -e ssh $localDir/autodir $username@$ipaddress:$remoteDir
sleep 10
expect {
    "assword:" {
        send "$pwd\r"
        sleep 5
        exp_continue
    }
    expect {
        "#" {
            send "cd /home/$username/autodir/config\r"
            sleep 5
            exp_continue
            expect "#"
            send "./setCron.sh" # NEED TO EXECUTE THIS AT REMOTE SERVER
            sleep 5
            exp_continue
        }
    }
    "yes/no" {
        send "yes\r"
        set timeout -5
        exp_continue
    }
    -re $prompt {
        send "\r"
    }
    timeout {
        exit
    }
    eof {
        exit
    }
}
Basically, I am looking for a general-purpose way to call a script on a remote server without using ssh or public-key authentication, because I cannot use either of those methods due to some limitations. Hence I tried using Expect, but with no success. I could not find a proper solution even after a lot of searching. Any help would be appreciated.
I just discovered the solution while tinkering around. Here's the code:
spawn rsync -avz -e ssh $localDir/autodir $username@$ipaddress:$remoteDir
sleep 10
expect {
    "assword:" {
        send "$pwd\r"
        sleep 5
        exp_continue
    }
    timeout {
        exit
    }
}

spawn ssh $username@$ipaddress
sleep 10
expect {
    "assword:" {
        send "$pwd\r"
        sleep 5
        send "sh /home/$username/autodir/config/setCron.sh\r"
        sleep 5
        exp_continue
    }
    timeout {
        exit
    }
}
This logs into all my servers, syncs the files/directories and sets up the cron job. Works like a charm. I did have to use ssh in the end, though.
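Since the same two spawn blocks have to run against every target server, they can be wrapped in a loop over the host list. Here is a minimal sketch of that idea; the server list, credentials, paths and the prompt pattern below are illustrative placeholders, not part of the original script:

#!/usr/bin/expect -f
# Sketch: sync the autodir tree and run setCron.sh on each target host.
set username  "deploy"          ;# placeholder
set pwd       "secret"          ;# placeholder
set localDir  "/home/deploy"    ;# placeholder
set remoteDir "/home/deploy"    ;# placeholder
set servers   {192.0.2.10 192.0.2.11 192.0.2.12}   ;# placeholder host list

foreach ipaddress $servers {
    # 1. Copy the directory to the target host.
    spawn rsync -avz -e ssh $localDir/autodir $username@$ipaddress:$remoteDir
    expect {
        "assword:" { send "$pwd\r"; exp_continue }
        eof        { }
        timeout    { exit 1 }
    }

    # 2. Log in and run the cron-setup script.
    spawn ssh $username@$ipaddress
    expect {
        "assword:" { send "$pwd\r" }
        timeout    { exit 1 }
    }
    expect -re {[$#] $}                     ;# assumed shell prompt
    send "sh /home/$username/autodir/config/setCron.sh\r"
    expect -re {[$#] $}
    send "exit\r"
    expect eof
}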
I am trying to call a script from another script because I am using "spawn" (from the expect package) to run ssh to another machine. Basically, after executing a comparison routine, my main script (scriptA.sh) calls another script (scriptB.sh). The call works fine, as I can see that the ssh is executed correctly. However, something is wrong, because not all the commands are executed correctly. If I execute those commands manually, they work.
I have an application running on a machine that sometimes freezes and shows a totally white screen. scriptB.sh kills Firefox and re-opens the browser in the background. The commands work fine if I execute them manually:
sudo killall firefox
export DISPLAY=:0.0 (I don't need "sudo" for this)
/usr/local/bin/run_digital_signage_firefox.sh & (this command must run without "sudo")
This is a very small excerpt from my scriptA.sh (I execute this script like this: sudo ./scriptA.sh):
#!/bin/bash
...something....
export username
export password
export IP
sh /home/mydirectory/scriptB.sh
...something....
and this is my scriptB.sh (I pass the username, password and IP using "export" in my scriptA.sh):
#!/bin/bash
/usr/bin/expect << EOF
spawn ssh test@$IP "sudo killall firefox && export DISPLAY=:0.0 && /usr/local/bin/run_digital_signage_firefox.sh&"
sleep 3
expect "*?ame:*" {
    send "$username\r"
    sleep 2
    expect "*?assword:*"
    send "$password\r"
    sleep 2
    expect "\r"
    sleep 2
}
expect "*?(yes/no)*" {
    send "yes\r"
    sleep 2
    expect "*?ame:*"
    send "$username\r"
    sleep 2
    expect "*?assword:*"
    send "$password\r"
    expect "\r"
    sleep 2
}
EOF
When scriptA.sh calls scriptB.sh, it seems to kill the Firefox browser but not to start it again with "/usr/local/bin/run_digital_signage_firefox.sh &". I must run that command without sudo, but even if I add "sudo /usr/local/bin/run_digital_signage_firefox.sh &" to the ssh command in scriptB.sh, it does not work either. It seems the script executes neither "/usr/local/bin/run_digital_signage_firefox.sh &" nor "sudo /usr/local/bin/run_digital_signage_firefox.sh &". I need that last part of the command to run so that the Firefox browser opens again.
Both scripts have rwx permissions in ugo.
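One thing worth checking (this is an assumption on my part, not something stated in the question): a process put in the background over a non-interactive ssh session is normally killed as soon as the session closes, and "&&" stops the chain if the sudo killall step returns non-zero. A sketch of a restructured spawn line that detaches the browser script from the ssh session with nohup and uses ";" instead of "&&":

# Sketch only: kill Firefox under sudo, then start the signage script detached
# from the ssh session so it survives after ssh exits. The use of nohup and the
# redirections are assumptions; the paths and user are taken from the question.
spawn ssh test@$IP "sudo killall firefox; DISPLAY=:0.0 nohup /usr/local/bin/run_digital_signage_firefox.sh >/dev/null 2>&1 &"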
I wrote a bash script to send files from my laptop to a server. I used the 'scp' command and put it in the script so that I can bypass entering a password every time I run it.
expect <<EOF
spawn scp -P 1111 -o StrictHostKeyChecking=no -r /Users/Desktop/sync_mac user@192.111.111.101:/home/folder
expect "password:"
send "11111\r"
expect eof
EOF
However, the problem is that when I run the bash script in the terminal, it seems to work at first but then suddenly stops sending files without any warning. (This happens especially when sending a large number of files or large files; it works fine for a small number of small files.)
Thanks for your help
The default timeout for expect is 10 seconds, so "expect eof" waits at most 10 seconds, which may not be enough for many files, as you mentioned.
To fix this, you can "set timeout -1" before the "expect eof", or just use "expect -timeout -1 eof".
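Applied to the snippet from the question, that looks like this:

expect <<EOF
spawn scp -P 1111 -o StrictHostKeyChecking=no -r /Users/Desktop/sync_mac user@192.111.111.101:/home/folder
expect "password:"
send "11111\r"
set timeout -1
expect eof
EOF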
I had to do this a while back, but for multiple servers, and the list of servers would change from night to night. I discovered that scp would sometimes ask to add the host to the list of known hosts first, or print other messages I had to code handling for. One of those prompts could be what your script is getting stuck on.
This is an excerpt of what ultimately worked, if that helps:
expect -c "
spawn ssh-copy-id -i /x/home/$USER/.ssh/id_rsa.pub $USER#$HOST
expect {
\"password:\" {
send \"$PASS\n\"
expect {
\"expecting.\" { }
timeout {exit 1}
\"again.\" {exit 1}
}
}
\"yes/no)?\" {
send \"yes\n\"
expect \"password:\" {
send \"$PASS\n\"
expect {
\"expecting.\" { }
timeout {exit 1}
\"again.\" {exit 1}
}
}
}
}
I have a shell script which uses expect to launch an ssh session to another server and list a directory. It works fine, but my question is: what is the best way to handle remote prompts after logging in? For example, here is what I have so far:
# Wait for the prompt on a remote ssh server
-re "\[%|>|\$|#\] $" {
    send "ls -1t /home/user/\r"
    expect {
        "*not found" {
            puts "\nDirectory not found\n"
            exp_continue
        }
        timeout { puts "\ntimeout happened\n" }
        -re "\[%|>|\$|#\] $" {
            puts "Exiting..."
            send "exit\r"
            return
        }
    }
}
So, after a successful login the expect script waits for the prompt before sending the ls -1t /home/user/ command. The prompt can be different depending on how the ssh destination is set up. So far I'm checking for
-re "\[%|>|\$|#\] $"
prompts handled so far....
%
>
$
#
I would like to make this as generic as possible, so that I know the destination ssh server is ready to receive the ls -1t /home/user/ command.
Is there a better way to do this, or should I add more cases to the regex in my expect script?
I've also tried -re ". $", but this doesn't work.
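For what it's worth, one detail of the current pattern: inside a bracket expression the "|" characters are literal, so the regex also matches a "|" character. A plain character class covers the same four prompts a little more cleanly. A minimal sketch (the trailing space anchored at the end of the buffer is an assumption about how the remote prompts end):

# Matches a prompt ending in %, >, $ or # followed by a space at the end of the buffer.
set prompt_re {[%>$#] $}
expect {
    -re $prompt_re {
        send "ls -1t /home/user/\r"
    }
    timeout { puts "\ntimeout happened\n" }
}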
I am trying to execute commands on a remote UNIX host using send and expect over ssh, but even though the script logs in to the server successfully, it does not execute the commands.
#!/usr/bin/expect
set timeout 60
spawn ssh xxxx@xxxxxx
expect "yes/no" {
    send "yes\r"
    expect "*?assword" { send "xxxxxx\r" }
} "*?assword" { send "xxxxxxx\r" }
expect "$ "
#sleep 5
send "ps -aef \r"
Output
[xxxxx@xxxxxx Scripts]$ ./TestExpect.sh
spawn ssh xxxxx@xxxxxx
xxxxxx@xxxxxx's password:
Last login: Wed May 9 02:05:47 2018 from xxxxxxxxx
Kickstarted on 2015-05-12
[xxxxx@xxxxx ~]$ [xxxxxx@xxxxx Scripts]$
The prompt looks like this:
[aacdd123@linprod345 ~]$
The issue may be that you are not expecting anything after sending ps -aef, so the spawned process exits before the output is printed.
Try adding a few more commands after sending ps -aef:
send "ps -aef\r"
expect $prompt
send "echo hello\r"
expect $prompt
Try looking into the expect_out buffers too, which will give you the captured streams.
puts $expect_out(buffer)
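Putting that together with the script from the question, a minimal working sketch might look like this (the prompt pattern, the trailing exit and the placeholder credentials are assumptions, not taken from the original):

#!/usr/bin/expect
set timeout 60
# Assumed prompt pattern: a shell prompt ending in "$ ".
set prompt {\$ $}
spawn ssh xxxx@xxxxxx
expect {
    "yes/no"    { send "yes\r"; exp_continue }
    "*?assword" { send "xxxxxx\r" }
}
# Wait for the shell prompt before sending the command.
expect -re $prompt
send "ps -aef\r"
# Expect the prompt again so the output is read back before the script exits.
expect -re $prompt
send "exit\r"
expect eof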
I am really new to using expect, and a bit confused regarding passing commands to an expect script, so please bear with me... I have searched numerous forums, but cannot seem to find an example of an expect script that uses the read command to get user input.
In my Korn shell script, I call an expect script (expectssh.exp) to log in via ssh to another host and get user input regarding that host's network configuration (network interface card number and subnet mask information). I pass four arguments to the expect script: the remote host IP address, the username, the password, and the list of commands to run. My expect script is below:
#!/usr/bin/expect
# Usage: expectssh <host> <ssh user> <ssh password> <script>
set timeout 60
set prompt "(%|#|\\$) $"
set commands [lindex $argv 3]

spawn ssh [lindex $argv 1]@[lindex $argv 0]
expect {
    "*assword:" {
        send -- "[lindex $argv 2]\r"
        expect -re "$prompt"
        send -- "$commands\r"
    }
    "you sure you want to continue connecting" {
        send -- "yes\r"
        expect "*assword:"
        send -- "[lindex $argv 2]\r"
        expect -re "$prompt"
        send -- "$commands\r"
    }
    timeout {
        exit
    }
}
The script runs well, except that when it gets to the 'read' command, the script does not continue or exit after the user presses enter. It just hangs.
The commands I pass to the expect script and its call are as follows:
SCRIPT='hostname > response.txt;netstat -rn;read net_card?"What is the network interface card number? " >> response.txt; read net_mask?"What is the subnet mask? " >> response.txt'
/usr/bin/expect ./expectssh.exp $hostip $usr $pswd "$SCRIPT"
Any suggestions on how I can pass the read command through my expect script without it hanging?
On a side note because I know it will come up - I am not allowed to do key-based automatic SSH login. I have to prompt for a username and password, which is done from the Korn shell script that calls this expect script.
Thanks for any suggestions and help you can provide!
For anyone interested, I was able to get the read command to work for user input by doing a few things:
(1) Putting it within an -re $prompt block instead of appending send -- "$commands\r" after the password entry.
(2) Hard coding the commands into the script rather than passing them in.
(3) Following the command with an interact statement so that the next send command isn't entered before the user responds.
My expect block now looks like this:
expect {
    -re "(.*)assword:" {
        send -s "$pswd\r"
        exp_continue
    }
    "denied, please try again" {
        send_user "Invalid password or account.\n"
        exit 5
    }
    "incorrect" {
        send_user "Invalid password or account.\n"
        exit 5
    }
    "you sure you want to continue connecting" {
        send -s "yes\r"
        exp_continue
    }
    -re $prompt {
        set timeout -1
        send -- "hostname > partnerinit\r"
        expect -exact "hostname > partnerinit\r"
        send -s "netstat -rn\r"
        expect -re "$prompt"
        send -- "read n_card?'Enter the network interface card number for this server (i.e. eth0): '\r"
        interact "\r" return
        send -- "\r"
        send -- "echo \$n_card >> partnerinit\r"
        send -- "msk=\$(cat /etc/sysconfig/network-scripts/ifcfg-\$n_card | grep NETMASK)\r"
        send -- "msk=\$(echo \${msk#NETMASK=})\r"
        send -- "echo \$msk >> partnerinit\r"
        send -- "cat partnerinit\r"
        set retval 0
    }
    timeout {
        send_user "Connection to host $host timed out.\n"
        exit 10
    }
    eof {
        send_user "Connection to host $host failed.\n"
        exit 1
    }
}
I also updated the script to automatically determine the subnet mask based on the network interface card number entered by the user. It was brought to my attention that determining the network interface card automatically would be very difficult on a box with multiple interface cards, so it is better to start small, have the user enter it, and fine-tune/automate it later once the overall script is working.
Now I'm working on modifying this to scp my partnerinit file back to my local host and to return meaningful exit statuses from each of the expect conditions.
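For the scp step, a minimal sketch of pulling the partnerinit file back from inside another expect block might look like this (the host/user argv positions, the local destination, and the assumption that partnerinit sits in the remote home directory are placeholders based on the earlier script, not a confirmed implementation):

# Sketch: copy partnerinit back to the local host after the remote work is done.
spawn scp [lindex $argv 1]@[lindex $argv 0]:partnerinit ./partnerinit
expect {
    -re "(.*)assword:" { send -s "$pswd\r"; exp_continue }
    "you sure you want to continue connecting" { send -s "yes\r"; exp_continue }
    eof { }
    timeout {
        send_user "scp of partnerinit timed out.\n"
        exit 10
    }
}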