Calling a script from another script doesn't execute all the commands over SSH - Linux

I am trying to call a script from another script because I am using "spawn" (from the expect package) to execute an SSH to another machine. Basically, after executing a comparison and some routine work, my main script (scriptA.sh) calls another script (scriptB.sh). The call works fine, as I can see that the SSH is executed correctly. However, something looks wrong, as not all the commands are executed correctly. If I execute those commands manually, everything works.
I have an application running on a machine that sometimes freezes and shows a totally white screen. scriptB.sh kills Firefox and re-opens the browser in the background. The commands work fine if I execute them manually:
sudo killall firefox
export DISPLAY=:0.0 (I don't need "sudo" for this)
/usr/local/bin/run_digital_signage_firefox.sh& (This command must run without "sudo")
This is a very short summary of my scriptA.sh (I execute the script like this: sudo ./scriptA.sh):
#!/bin/bash
...something....
export username
export password
export IP
sh /home/mydirectory/scriptB.sh
...something....
and this is my scriptB.sh (I pass the username, password and IP to it via "export" in scriptA.sh):
#!/bin/bash
/usr/bin/expect << EOF
spawn ssh test@$IP "sudo killall firefox && export DISPLAY=:0.0 && /usr/local/bin/run_digital_signage_firefox.sh&"
sleep 3
expect "*?ame:*" {
send "$username\r"
sleep 2
expect "*?assword:*"
send "$password\r"
sleep 2
expect "\r"
sleep 2
}
expect "*?(yes/no)*" {
send "yes\r"
sleep 2
expect "*?ame:*"
send "$username\r"
sleep 2
expect "*?assword:*"
send "$password\r"
expect "\r"
sleep 2
}
EOF
When scriptA.sh calls scriptB.sh, it manages to kill the Firefox browser but not to start it again with "/usr/local/bin/run_digital_signage_firefox.sh&". I must run this command without sudo, but even if I try "sudo /usr/local/bin/run_digital_signage_firefox.sh&" in the SSH command in scriptB.sh, it does not work either. It seems that the script executes neither "/usr/local/bin/run_digital_signage_firefox.sh&" nor "sudo /usr/local/bin/run_digital_signage_firefox.sh&". I need that last part of the command to run so that the Firefox browser opens again.
Both scripts have rwx permissions in ugo.
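A hedged sketch of one likely fix (not from the original post): when ssh runs a remote command string, anything backgrounded with & is killed once the remote shell exits, so the launcher needs nohup plus redirected streams to survive (the same point the nohup answer further down makes). Also, && after killall means nothing further runs when no firefox process existed, so ; is more robust. Assuming the user, paths, and passwordless sudo implied by the question, the spawn line in scriptB.sh would become:
spawn ssh test@$IP "sudo killall firefox; export DISPLAY=:0.0; nohup /usr/local/bin/run_digital_signage_firefox.sh < /dev/null > /tmp/signage.log 2>&1 &"
Here /tmp/signage.log is only an illustrative path; redirecting to /dev/null works as well.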

Related

Pass two passwords to su commands

I am trying to run shell commands through PHP scripts.
I want to first su into a user then run a sudo command. I tried:
echo mypassword | (su -c "sudo reboot" user2)
It doesn't work because it requires two passwords and I only passed one. I checked many other posts; the solution below doesn't work for me because I don't have the sudo password for the current user. I need to change the user first and then do a sudo. Can I get some help?
echo mypassword | sudo -s ... doesn't work...
I know this is bad practice. I just need it to restart the server, as port 22 was accidentally closed and I can't SSH into the server to do any operations.
This is a ONE-time use to reboot the server from the PHP side, as I am not able to reach the admin to reboot the server right now. I fully understand the disadvantages... Please DO NOT point out the disadvantages.
Since it's a one-time thing, try the following. It spawns a sudo environment (for the current user) in which sudo reboot is called.
#! /bin/bash
read -sp "pass? " pass
expect 2>&1 <<-EOF
spawn sudo reboot
expect "*: " { send "${pass}\r" }
expect eof
catch wait result
exit [lindex \$result 3]
EOF
exit $?
You can call it as follows to automate stuff.
$ ./test.sh <<-EOF
notreallymypassword
EOF
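(This works because read -sp takes the password from standard input, which the here-document supplies.)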
Note: Although I think it works, I haven't tested it yet.

Bash script to respond to console output?

I'm currently trying to run a bash script on startup to automatically install Squid; however, the command I'm running requires input.
Currently the script I have is:
#!/bin/sh
PROXY_USER=user1
PROXY_PASS=password1
wget https://raw.githubusercontent.com/hidden-refuge/spi/master/spi && bash spi -rhel7 && rm spi
#After i run this command it asks "Enter username"
#followed by "Enter password" and "Renter password"
echo $PROXY_USER
echo $PROXY_PASS
echo $PROXY_PASS
echo yes
However I am unable to get the input working, and the script fails to create a username and password. I'm running CentOS 7.
You are calling tools that act in interactive mode, so, as dani-gehtdichnixan mentioned at (passing arguments to an interactive program non interactively), you can use the expect utilities.
Install expect on Debian:
apt-get install expect
Create a script called spi-install.exp, which could look like this:
#!/usr/bin/env expect
set user username
set pass your-pass
spawn bash spi -rhel7
expect "Enter username"
send "$user\r"
expect "Enter password"
send "$pass\r"
expect "Renter password"
send "$pass\r"
expect eof
Then call it from your main bash script:
#!/bin/bash
wget https://raw.githubusercontent.com/hidden-refuge/spi/master/spi && ./spi-install.exp && rm spi
Expect is used to automate control of interactive applications such as Telnet, FTP, passwd, fsck, rlogin, tip, SSH, and others. Expect uses pseudo terminals (Unix) or emulates a console (Windows), starts the target program, and then communicates with it, just as a human would, via the terminal or console interface. Tk, another Tcl extension, can be used to provide a GUI.
https://en.wikipedia.org/wiki/Expect
References:
[1] passing arguments to an interactive program non interactively
[2] https://askubuntu.com/questions/307067/how-to-execute-sudo-commands-with-expect-send-commands-in-bash-script
[3] https://superuser.com/questions/488713/what-is-the-meaning-of-spawn-linux-shell-commands-centos6
Try just passing the values to bash's stdin:
#!/bin/sh
PROXY_USER=user1
PROXY_PASS=password1
if wget https://raw.githubusercontent.com/hidden-refuge/spi/master/spi; then
printf "%s\n" "$PROXY_USER" "$PROXY_PASS" "$PROXY_PASS" yes | bash spi -rhel7
rm spi
fi
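The same idea reads a bit more clearly with a here-document, replacing the printf line inside the if block; this makes the same assumption as the pipeline, namely that spi takes all four answers from standard input:
bash spi -rhel7 <<ANSWERS
$PROXY_USER
$PROXY_PASS
$PROXY_PASS
yes
ANSWERS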

Connect via SSH and run a program after switching user

I want to create, inside a bash script, a function that connects via SSH to another server (using RSA keys), does a switch user (entering the password), starts a program, and then exits with the exit code of the started program.
Below is a test I'm doing:
#! /usr/bin/expect
set timeout 120
spawn ssh user1@10.211.55.24
expect ".*user1"
sleep 3
send "whoami\r"
send "/bin/su hdfs\r"
expect "*?assword:"
send "hdfs\r"
expect "$"
send "whomai\r"
send "exit\r"
It works up to the switch user: I can switch to the hdfs user, but the following commands are not sent (whoami). The $ prompt is correct. Furthermore, I'm not able to get the exit code of the command (in this example, the whoami command).
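The question is shown here without an answer, so here is a hedged sketch of the usual approach (assuming key-based ssh login and prompts ending in "$ ", both matching the snippet above): wait for the new prompt after su before sending anything else, since a bare expect "$" can match earlier output and let the next send fire too soon, and echo the command's status so expect can capture and propagate it:
#!/usr/bin/expect
set timeout 120
spawn ssh user1@10.211.55.24
expect -re {\$ $}                ;# wait for user1's prompt
send "/bin/su hdfs\r"
expect "*?assword:"
send "hdfs\r"
expect -re {\$ $}                ;# wait for the hdfs prompt before sending more
send "whoami; echo RC=\$?\r"
expect -re {RC=(\d+)}            ;# capture the numeric exit status
set rc $expect_out(1,string)
send "exit\r"
expect -re {\$ $}
send "exit\r"
expect eof
exit $rc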

pipe timely commands to ssh

I am trying to pipe commands to an open SSH session. The commands are generated by a script that analyzes the results and sends the next commands accordingly.
I do not want to put all the commands in a script on the remote host and just run that script, because I am also interested in the status of the SSH process: sending the commands locally allows me to "test" whether the SSH connection is alive and to get the appropriate return code from the SSH process.
I tried using something along these lines:
$ mkfifo /tmp/commands
$ ssh -t remote </tmp/commands
And from another term:
$ echo "command" >> /tmp/commands
Problem: SSH tells me that no pseudo-tty will be opened for stdin, and closes the connection as soon as "command" terminates.
I tried another approach:
$ ssh -t remote <<EOF
$(echo "command"; while true; do sleep 10; echo "command"; done)
EOF
But then, nothing is flushed to ssh until EOF is reached (in my case, never).
Do any of you have a solution?
Stop closing /tmp/commands before you're done with it. When you close the pipe, ssh stops reading from it.
exec 7> /tmp/commands # open once
echo foo >&7 # write multiple times
echo bar >&7
exec 7>&- # close once
You can additionally use ssh -tt to force ssh to open a tty on the remote.
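Putting the two together, a sketch of the whole flow ("remote" and the command strings are placeholders from the question):
mkfifo /tmp/commands
ssh -tt remote < /tmp/commands &   # start the reader first: opening the fifo for writing blocks until a reader exists
ssh_pid=$!
exec 7> /tmp/commands              # open the write end once
echo "command" >&7                 # write as many times as the script decides
echo "another command" >&7
exec 7>&-                          # close once; ssh sees EOF and exits
wait "$ssh_pid"                    # ssh's return code shows whether the connection stayed alive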

Use SSH to start a background process on a remote server, and exit session

I am using SSH to start a background process on a remote server. This is what I have at the moment:
ssh remote_user@server.com "nohup process &"
This works, in that the process does start. But the SSH session itself does not end until I hit Ctrl-C.
When I hit Ctrl-C, the remote process continues to run in the background.
I would like to place the ssh command in a script that I can run locally, so I would like the ssh session to exit automatically once the remote process has started.
Is there a way to make this happen?
The "-f" option to ssh tells ssh to run the remote command in the background and to return immediately. E.g.,
ssh -f user@host "echo foo; sleep 5; echo bar"
If you type the above, you will get your shell prompt back immediately; you will then see "foo" output, and five seconds later "bar". In the meantime, you could have been using the shell.
When using nohup, make sure you also redirect stdin, stdout and stderr:
ssh user@server 'DISPLAY=:0 nohup xeyes < /dev/null > std.out 2> std.err &'
In this way you will be completely detached from the remote process. Be careful with using ssh -f user@host..., since that only puts the ssh process in the background on the calling side. You can verify this by running ps aux | grep ssh on the calling machine, which will show that the ssh call is still active, just put in the background.
In my example above I use DISPLAY=:0 since xeyes is an X11 program and I want it started on the remote machine.
You could use screen to run your process there, detach from the screen with Ctrl-a :detach, and exit your current session without problems. Then you can reconnect over SSH and attach to that screen again to continue your task or check whether it has finished.
Or you can send the command to an already running screen. Your local script should look like this:
ssh remote_user@server.com
screen -dmS new_screen sh
screen -S new_screen -p 0 -X stuff $'nohup process \n'
exit
For more info see this tutorial
Well, this question is almost 10 years old, but I recently had to launch a very long script (taking several hours to complete) on a remote server, and I found a way using the crontab.
If you can edit your user's crontab on the remote server, connect to the server with ssh, edit the crontab, and add an entry that will start your script the next minute. Let's say it's 15:03. Add this line:
4 15 * * * /path/to/your/script.sh
Save your crontab and wait a minute for the script to be launched. Then edit your crontab again to remove this entry.
You can then safely exit ssh, and even shut down your computer, while the script is running.
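For convenience, the add-and-remove can also be scripted rather than edited by hand; a sketch using the standard crontab -l / crontab - round-trip (this assumes the script path appears in no other crontab entry):
(crontab -l 2>/dev/null; echo "4 15 * * * /path/to/your/script.sh") | crontab -
# later, once the script has started, remove just that entry:
crontab -l | grep -vF "/path/to/your/script.sh" | crontab -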
