Bash Automatic Exit/Log out the SSH Session - linux

I have a script that I want to launch upon a successful SSH session.
So far I have this working by placing the path to my script and the script name in the .bash_profile file.
The script that I have written is text menu driven with multiple choices for the user.
One of the options is quit/exit, which (once selected) needs to exit the script and log the user out of the SSH session.
Is this possible?
I've not been able to achieve this as yet. Exiting the script is easy enough, but no matter what I try I always end up at a command-line prompt, which in this instance we are trying to avoid.
Thanks In Advance,
Dan.

Since feature requests to mark a comment as an answer remain declined, I copy the solution from the comments here.
Use ssh-keys with a forced command, or investigate the ForceCommand option in the sshd_config file (that would be for all users though). – Carlos Campderrós
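For the record, minimal sketches of each route; /home/dan/menu.sh stands in for the actual script path:

# In ~/.bash_profile: exec replaces the login shell with the menu,
# so quitting the menu ends the SSH session
exec /home/dan/menu.sh

# Per key, in ~/.ssh/authorized_keys (key material elided):
command="/home/dan/menu.sh" ssh-rsa AAAA...

# For all users, in /etc/ssh/sshd_config:
ForceCommand /home/dan/menu.sh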

Related

Python - (SSH/Telnet) connection with multiple commands

I need to run a shell script that interacts with a couple of servers/routers on a daily schedule. The script will simply log in to the remote end and run a command. Up to here, I hear you saying it is easy. The real problem is that I need to analyze the output of the first command and, based on that output, construct the second command. This is also doable, but the situation here is that I do not want to log in and log out of the box for every command.
What I need is to log in once, keep the session alive, run the first command, take the output, analyze it, run the second command, and only later log out and close the connection.
Correct me if I am wrong, but expect is not an option here. I want to ask for your suggestions.
Which language, and which module of that language, can I use to meet this requirement?
The environment is not pure SSH, so I need something general that works for both SSH and Telnet.
Thanks in advance
Since you mentioned Python, pexpect should do the job pretty well:
https://pexpect.readthedocs.org/en/latest/overview.html

Automatically pass a value to a script menu for automation's sake in Bash/KSH

I am trying to make a small script and cron it in order to automate a task. Said script runs another script that has already been created, grabs the output, emails it to specified recipients, and cleans up the output. I've almost got it down; however, I am running into one major issue. The script that mine runs has a menu at the outset. That is to say, running the script by itself manually, I would have to select option 1 in order to get the output I want (the only other option, 2, is quit).
How can I automatically enter (or simulate entering) the value 1 into the other script, so it does not hang waiting for user input when run from a cron job?
Is there a sane way to do this?
Thanks in Advance.
You could try something as simple as using yes | command if answering yes is all that is needed. Otherwise you probably want to use expect to drive the imaginary keyboard for you.
http://expect.sourceforge.net/
Using autoexpect to record your session is a convenient way to come up with rough draft expect scripts as well.
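A minimal sketch of both suggestions, assuming the menu script lives at /path/to/menu.sh (hypothetical) and that a single "1" plus Enter is all it wants; note that yes can repeat any string, not just "y":

printf '1\n' | /path/to/menu.sh > /tmp/menu.out   # feed the one menu choice
yes 1 | /path/to/menu.sh > /tmp/menu.out          # keep answering "1" if it re-prompts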

How to track file creation and modification

We have put together a Perl script that looks at the argument passed to it, checks whether it is creating or modifying a file, and then saves that information in a MySQL database so that it is easily accessible later. Here is the interesting part: how do I make this Perl script run before every command typed in the terminal? I need to make this foolproof so people don't forget to run it.
Sorry I didn't formulate this question properly. What I want to do is prepend the script to each command, so that each command runs like "./run.pl ls", for example. That way I can track file changes if the command is mv, or if it creates an output file, for example. The script pretty much takes care of that; I just don't know how to run it seamlessly for the user.
I am running Ubuntu Server with the bash shell.
Thanks
If I understood correctly, you need to execute a function before running every command, something similar to preexec and precmd in zsh.
Unfortunately, bash doesn't have native support for this, but you can do it using the DEBUG trap.
There is sample code applying this method, and further useful information, available online.
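A minimal sketch of the DEBUG-trap method, for ~/.bashrc; the logger path is hypothetical, and the trap fires before every simple command (it logs the command, it does not replace it):

# $BASH_COMMAND holds the command about to be executed
trap '/usr/local/bin/run.pl "$BASH_COMMAND"' DEBUG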
You can modify the ~/.bashrc file and launch your script there. Do note that each user would (and should) still have the privilege to modify this file, potentially removing the script invocation.
The /etc/bash.bashrc file is system-wide and only changeable by root.
These .bashrcs are executed when a new instance of bash is created (e.g. new terminal).
It is not the same as sh, the system shell, which is dash on Ubuntu systems.

Shell script - input redirection when prompted by the shell script multiple times

We have a shell script that expects multiple user inputs to be entered when prompted, e.g.:
At first it may ask for the operation to be performed. When that answer is given, it may ask for a username, then a password, and so on. We want to automate this task by providing the inputs using file redirection, i.e.
script < input
The input file has all the answers to the questions the script may ask. However, it is not working: the shell script reads only the first line of the input file. What do I need to change, or use, to make this work?
What you can use is the program expect. You create a script for it that tells it when to give what input to some command it executes. This way you can automate exactly the kind of thing you're struggling with.
More info on Google and here:
http://www.linuxjournal.com/article/3065
man page: http://linux.die.net/man/1/expect
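A minimal sketch driving expect from a shell here-document; the prompts and answers are hypothetical placeholders for whatever your script actually asks:

expect <<'EOF'
spawn ./script
expect "operation:" ; send "connect\r"
expect "username:"  ; send "admin\r"
expect "password:"  ; send "secret\r"
expect eof
EOF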
You say 'it only reads the first line of input.'
So you have to kill the script?
Is there any output? (error messages especially)?
Are you redirecting STDERR to /dev/null or else where? If so, remove that.
Here is the highest-probability helper: modify the top-level script and add set -vx on the 2nd line. Then you'll be able to see what was processed, where it stopped, and possibly formulate theories about why it is not processing data.
Any chance that the input file was created in a Windows environment and the CR/LF pairs are messing up the expected input?
I hope this helps.
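If CR/LF turns out to be the culprit, a quick check and fix (file names hypothetical):

cat -v input | head              # stray carriage returns show up as ^M
tr -d '\r' < input > input.unix  # strip them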
Thanks all for commenting and answering. I tried expect and that did not work, so I will describe what worked for us. Here was our workflow:
1. At the Linux prompt, type the command; it was connect() in our case.
2. Once that command is given, the script asks for the command's parameters, like port number, server, etc.; we had to provide those manually.
3. Then we are again presented with a prompt for another input.
We were able to provide the first command, connect(), at the prompt using file redirection, but passing the parameters was the issue. The solution we found was to provide the parameters inside the parentheses of connect() itself, so our input file for redirection contains a single connect() call with the parameters already filled in. This worked for us.

Run a command in a shell and keep running the command when you close the session

I am using PuTTY to connect to a remote server. What I want to know is whether there is any way to enter my commands and allow them to keep running after I close the session with PuTTY. The reason for this is that I do not want to keep the computer on all the time. Is there any way to do this?
Update with the solution
For my question as presented, the best solution is to use one of the commands provided, such as nohup, because you do not have to install any additional software. But if you face the same problem, install screen and use it. It is amazing.
I have selected Norman Ramsey's answer as the favourite because it proposes several solutions, using both commands and screen. But please check the other answers, especially the one by PEZ; it gives an insight into what screen is able to do.
screen! It's the best thing since sliced bread. (Yeah, I know others have already suggested it, but it's so good the whole world should join in and suggest it too.)
screen is like, like, ummmm ... like using VNC or the like to connect to a GUI desktop, but for command shell windows. You can have several shell "windows" open at once in the same screen session. You can do stuff like:
Start a screen session using "screen -dR" (get used to using -dR)
run some commands in one window
press CTRL-A, C to create a new window; open a file there in vim
press CTRL-A,0 to go back to the first window and issue some command on the file you just edited
CTRL-A, 1 to go back to your vim session
CTRL-A, C for yet another window and maybe do "sudo su -" (because you just happen to need a full root shell)
CTRL-A, 0 and start a background process
CTRL-A, C to create yet a new window, "tail -f" the log for that background process
CTRL-A, d to disconnect your screen then CTRL-D to disconnect from the server
Go on vacation for three weeks
Log on to the server again and issue "screen -dR" to connect to your existing screen session
check the log in the fourth window with CTRL-A, 3 (it's like you've been there watching it all the time)
CTRL-A, 1 to pick up that vim session again
I guess you're starting to get the picture now? =)
It's like magic. I've been using screen for longer than I can remember and I'm still totally amazed with how bloody great it is.
EDIT: Just want to mention there's now also tmux. Very much like screen, but has some unique features, splitting the windows being the most prominent one.
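For the curious, the same detach/reattach cycle in tmux looks like this (the session name is hypothetical):

$ tmux new -s work        # start a named session
(do your work, then press CTRL-B, D to detach)
$ tmux attach -t work     # reattach later, even from a new SSH login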
nohup, disown, and screen are all good but screen is the best because unlike the other two, screen allows you to disconnect from the remote server, keep everything running, and then reconnect later to see what is happening. With nohup and disown you can't resume interacting.
Try using GNU Screen. It allows you to have several shells open at once. And you can disconnect from those running shells (i.e. close the session with PuTTY) and they will keep doing their thing.
What you are looking for is nohup.
See the wiki link for how to use it.
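A minimal nohup sketch; the script name and log file are hypothetical:

nohup ./long_job.sh > job.log 2>&1 &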
screen is the best.
Try:
screen -dmS "MyTail" tail -f /var/log/syslog
This starts the command in the background.
Use screen -ls to list sessions, and screen -r MyTail to enter the session.
If more users need access to the same session, use screen -rx MyTail, and two or more users can share the session.
If you can't use screen (because, for instance, your SSH session is being programmatically driven), you can also use daemonize to run the program as a daemon.
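For what it's worth, daemonize wants the absolute path of the program to run; the path here is hypothetical:

daemonize /usr/local/bin/long_job.sh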
One way that works well for me is at.
at works like cron, but for a one-time job. I used it today to download a large file without having to keep my session alive.
for example:
$ at 23:55
at> wget http://file.to.download.com/bigfile.iso
at> ^D
You pass at a time (in the future) and it gives you a prompt. You enter the commands you want to run at that time and hit CTRL-D. You can exit your session, and it will run the commands at the specified time.
Wikipedia has more info on at.
./command & disown
ssh localhost && ./command && exit
