Redirecting Terminal Output of Running Process - linux

I am connecting to a server over SSH and running a Minecraft server there with
java -jar server.jar
Now I want to be able to close my SSH session without stopping the Minecraft server, and if I need to type commands to the server or read its output, I want to be able to reconnect with SSH and get the input and output back in my terminal window.
Is this possible? I've read about redirecting the output to a file, but that won't solve my problem :/

Use screen to detach from and reattach to the session.
https://www.digitalocean.com/community/articles/how-to-set-up-a-minecraft-server-on-linux
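For example, a minimal sketch (the session name minecraft is just a placeholder):
screen -S minecraft
java -jar server.jar
Detach with ctrl+a d; the server keeps running after you log out. When you reconnect over SSH, reattach and get the console back with:
screen -r minecraft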

Related

Using local system as ssh client and server

I am using my local system to learn SSH, and what I am trying to do is execute a command on the remote server.
I have the SSH server running on terminal1 and the client on terminal2.
I used the following command on terminal2:
ssh user1@127.0.0.1 echo Display this.
but it echoes on terminal2. How would I know if the command actually worked if it's not displaying on terminal1?
Thank you.
It worked correctly. It SSH'd into the server, executed the command, and returned the stdout of that command back to you.
SSH gains access to the server, but not necessarily to any TTYs active on it. You would have to jump through some hoops to send text to a specific TTY, such as your terminal1.
A better test would be:
ssh user1@127.0.0.1 'touch ~/testfile'
Then you can check on your server (which is localhost) to see whether testfile was created in your user1 home folder. If it was, then the connection and the command succeeded.
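If you really do want the text to appear on terminal1, one rough sketch is to write to that terminal's TTY device directly (run tty on terminal1 first to find it; /dev/pts/0 below is only an assumption):
ssh user1@127.0.0.1 'echo Display this. > /dev/pts/0'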

Execute script on remote host - output given in local host

I am trying to execute two scripts which are available as .sh files on a remote host and have 755 permissions.
I call them from the client host as below:
REMOTE_HOST="host1"
BOUNCE_SCRIPT="
/code/sys/${ENV}/comp/1/${ENV}/scripts/unix/stopScript.sh ${ENV};
/code/sys/${ENV}/comp/1/${ENV}/scripts/unix/startScript.sh ${ENV};
"
ssh ${REMOTE_HOST} "${BOUNCE_SCRIPT}"
The above lines are in a script on the local host.
When I run the script on the local host, the first command on the remote host, i.e. stopScript.sh, executes correctly: it kills the running process it was intended to kill without any error.
However, the output of the second script, startScript.sh, gets printed to the local host window, but the process it is intended to start does not start on the remote host.
Can anyone please let me know:
Is this the correct way to execute a script on a remote host?
Should I see the output of a script running on the remote host locally as well, i.e. in the window of the local host?
Thanks
You could try the -n flag for ssh:
ssh -n $REMOTE_HOST "$BOUNCE_SCRIPT" >> $LOG
The man page (http://unixhelp.ed.ac.uk/CGI/man-cgi?ssh+1) has further information. The following is a snippet:
-n      Redirects stdin from /dev/null (actually, prevents reading from stdin).
Prefacing your startScript.sh line with nohup may help. Often, commands you execute remotely die when your SSH session ends; nohup allows the process to live on after the session has ended. It would be helpful to know whether your process is not starting at all, or whether it starts and then dies.
I think cyber-monk is right: you should launch the process with nohup to create a new, independent process. Also check whether your stop script is killing the right process (it may be killing the new one as well).
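For example, a sketch of the bounce script with nohup on the start command and its output redirected on the remote side (the log path /tmp/startScript.log is just a placeholder):
BOUNCE_SCRIPT="
/code/sys/${ENV}/comp/1/${ENV}/scripts/unix/stopScript.sh ${ENV};
nohup /code/sys/${ENV}/comp/1/${ENV}/scripts/unix/startScript.sh ${ENV} > /tmp/startScript.log 2>&1 &
"
ssh -n ${REMOTE_HOST} "${BOUNCE_SCRIPT}"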

SSH Persistent Connection Timeout

I set up an SSH tunnel using a bash script, and the tunnel is configured as a shared persistent connection (control master).
At the end of my script, I invoke a close command against the tunnel and delete the .ssh/config file, so that neither it nor subsequent SSH tunnels started manually by a user remain open.
The question is this: what is the best way to make sure the tunnel is closed if someone hits Ctrl+C or the script crashes for some reason in the middle, before it invokes the close command and deletes the config file? I was going to add a timeout to the control master, but I cannot determine what I need to use based on my reading of the ssh_config man page.
Try to use trap:
#!/bin/bash

on_sigint() {
    echo "this function is called on ctrl+c"
}

trap "on_sigint" SIGINT SIGTERM

echo start
# Do what you want
...
echo stop
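Applied to your tunnel case, a sketch might look like this (the control socket path, host name, and cleanup steps are assumptions based on your description; ssh -O exit asks the master connection to close):
#!/bin/bash
CTL_SOCKET=~/.ssh/ctl-socket    # assumed ControlPath used when the tunnel was opened

cleanup() {
    ssh -S "$CTL_SOCKET" -O exit remote-host 2>/dev/null
    rm -f ~/.ssh/config
}

trap cleanup SIGINT SIGTERM EXIT

# ... rest of the script ...
Separately, the ControlPersist option in ssh_config accepts a timeout (for example, ControlPersist 600), after which an idle master connection closes on its own; that may be the setting you were looking for.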

Killing ssh session kills running process

I'm connecting to my Ubuntu server using SSH. I start an encoding program with a command. However, it seems that the encoding process is killed when my SSH session closes (which happened because I started it from a laptop which went to sleep). Is there a way to avoid this (of course, preventing my laptop from sleeping is not a permanent solution)?
Run your command with nohup or use screen.
nohup is better when your program generates logging output, because the output is forwarded to a file that you can check later; with screen you can detach the SSH session and restore your workspace when you log in again. For encoding I would use nohup. It is simpler, and since you probably run the job in the background anyway, you don't really need detaching.
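For example (the script and log file names are placeholders):
nohup ./encode.sh > encode.log 2>&1 &
The job keeps running after the SSH session ends, and you can follow its logging output later with tail -f encode.log.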
screen is the best option for you.
screen -S some_name
then run your command. Detach with: ctrl+a d
Next time, attach it with:
screen -rd some_name
You can have several screens running. To show the list of them:
screen -ls
Install "screen" on your ubuntu server, that way you can start any program in your session, disconnect the output of the program from your current session and exit.
Later when you connect again to your server, you can restore the program which will continue running and see its progress.

How to log all incoming messages of the SSH Secure Shell client?

I am using the SSH Secure Shell client, a nice tool to connect to the server.
However, I am wondering whether it is possible to log all of the messages coming from a program that I run via the SSH Secure Shell client. For example, I run ./test and my program prints debug lines as it runs. How can I log those debug lines to a txt file for analysis?
Have you tried:
./test > log.txt
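If you also want to watch the output live while it is being logged, piping through tee should work, and redirecting stderr as well catches any debug lines that go there:
./test 2>&1 | tee log.txt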
