Attaching to the output of a running process - Linux

A process has been started remotely through an SSH session. The output stream (text) displays fine through SSH. I would like to display the results locally without interrupting the running process.
Is there a way to attach to a running process and 'piggyback' a stream?
A Linux-only solution is acceptable.
Thanks!

Use reptyr:
reptyr is a utility for taking an existing running program and
attaching it to a new terminal. Started a long-running process over
ssh, but have to leave and don't want to interrupt it? Just start a
screen, use reptyr to grab it, and then kill the ssh session and head
on home.
Or retty:
retty is a tiny tool that lets you attach processes running on other
terminals.
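A minimal reptyr workflow might look like this (a sketch; "app.py" stands in for your program's name and "rescue" is an arbitrary screen session name):

```shell
# Find the PID of the running process on the remote host.
pid=$(pgrep -f app.py | head -n1)

screen -S rescue      # start a new screen session
reptyr "$pid"         # inside the session: grab the running process

# Detach with Ctrl-a d, close the SSH connection, and later:
screen -r rescue      # reattach; the process is still there
```

The process's stdout now goes to the screen session, which survives SSH disconnects.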

Related

What happens to a command running on a server if you lose your Internet connection?

Sorry for the noob question. I'm running a shell script on my webserver to download 900 or so files. However, I'm doing this through a Terminal shell on my laptop and I'm about to lose my current Internet connection.
Does the script on the server continue running until it finishes? Or does it stop because it's no longer providing feedback to my terminal on the disconnected laptop?
By default, your interactive shell on the web server will send your script a HUP signal as soon as it terminates. You can prevent this by calling your script like this:
<scriptname> & disown
This way it will be run in the background (&) and removed from the shell's job list (disown), so the shell will not send it a HUP signal.
Note that the disown may not be necessary in some cases.
It will die because the parent process (the terminal) has died. However, using utilities such as screen or nohup (among others), it is possible to have the process continue even after the terminal dies, and you can even background a process and disown it after it's already running; see this Q for more.
There's a ton of further reading on this subject you may want to peruse.
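As a sketch, the two common patterns look like this (the script name is a placeholder):

```shell
# Immune from the start: ignore SIGHUP and log output to a file.
nohup ./myscript.sh > script.log 2>&1 &

# Rescue a script that is already running in the foreground:
#   Ctrl-z        suspend it
#   bg            resume it in the background
#   disown %1     remove it from the job list so no SIGHUP is sent
```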

why does node.js process terminates after ssh session?

I am deploying my node.js program on a remote machine (Ubuntu 14.04) and the program terminates after the ssh session ends, even if I deploy it as a background process:
node app.js &
I understand that using forever can solve this, which I have tried and it pretty much works. There is already a thread here that describes good solutions, and many other threads describe good tools and solutions for it.
But I would like to understand: why does the node.js process stop in the first place, even if it runs as a background process?
Since you are connecting through SSH, all the processes you start belong to that session.
Unless protected by a command like
nohup
(short for "no hang up"), all the processes that belong to your SSH session will die with the session.
It's like logging in as a user, opening Chrome, and logging out: the Chrome process is terminated once its owner logs out.
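To illustrate, any of the following keep the Node process alive after logout (a sketch; `app.js` is your program):

```shell
# Option 1: nohup makes the process ignore the SIGHUP sent at logout.
nohup node app.js > app.log 2>&1 &

# Option 2: background it, then remove it from the shell's job table
# so the shell never forwards SIGHUP to it.
node app.js &
disown

# Option 3: run it under a supervisor (forever, pm2, systemd, ...).
```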

terminate matlab running through nohup and access the workspace variables

I am using nohup to run a matlab script in the background on a Linux server. The last part of the script has commands to save the workspace variables to disk. In my case, I do not want to wait for completion of the script. Is it possible to terminate the running script midway and access/save the variables existing at that particular stage to disk?
Using the command nohup, you do not have a nice way to communicate with your program after the terminal is closed. What can help is the terminal multiplexer screen (take a look at man screen). Even after the session has been detached from the terminal, you are still able to get control over your program back.
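A minimal screen-based workflow might look like this (a sketch; the session name and script name are placeholders):

```shell
screen -S mlrun                   # start a named screen session
matlab -nodisplay -r "myscript"   # run the script inside it
# Detach with Ctrl-a d; log out; later, log back in and:
screen -r mlrun                   # reattach to the live session
# Now you can Ctrl-c the script at any point and, at the MATLAB
# prompt, save the current workspace yourself:  save('partial.mat')
```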

Close and reopen ssh connection without losing current process

If I open an ssh connection and start a long-running process, is there any way to close the ssh connection, and not only keep the process running, but be able to later ssh back in again, and "reattach" the process to the terminal?
I am able to do the following:
Ctrl-z
bg
disown
And that lets me keep the process running after I leave my ssh session, but I am not able to "reown" the job later; is there a way to do this? The real-world scenario is that I'd like to start a process at work, drive home, then log back in and check on it/interact with it.
I know that tmux is able to handle things like this, but I am often forgetful, or I just don't know ahead of time what process will be long-running and what won't, so I don't always remember to start the process from within tmux.
There are several ways to accomplish this. I used to use screen, and that was a roundabout way of doing it. But check out mosh, built just for this: http://mosh.mit.edu/
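For the "forgot to start it inside a multiplexer" case, reptyr can also adopt the process after the fact (a sketch; 12345 stands in for your process's PID):

```shell
# After Ctrl-z / bg / disown, the job keeps running but is orphaned.
# Back home, SSH in, start a multiplexer, and grab the process:
tmux new -s work      # new tmux session
reptyr 12345          # inside tmux: reattach the orphaned PID
# Detach with Ctrl-b d; from now on tmux owns the process and it
# survives any further disconnects.
```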

How to run Derby as a background process in Linux

I am using derby on a remote Ubuntu 12.04 server. The standard derby commands are all working correctly and I am able to open my databases and access them via ij. I need to be able to start and stop the server from the terminal while logging in and out between commands. The problem is that I cannot find a way to run the server as a background process. The closest I have come is: nohup java -jar $DERBY_HOME/lib/derbyrun.jar server start & > ~/dblog.txt which works except that it requires I hit [enter] before returning to the command line. I am aware of the daemon package but I am uncertain whether it will allow me to then stop the server. What would be helpful is an explanation of how Tomcat manages it, since that is my app server.
Derby is just a Java application. Any technique you wish to use to run Java applications in the background (/etc/init.d, job control in your shell, etc.) will work fine for Derby.
You can use commands like kill or killall to kill your background process. Use the jobs command to see the list of processes you've sent to the background. You can also bring one back to the foreground with fg %n (where n is the job number) and kill it with Ctrl-C.
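As a sketch: in the question's command the redirection comes after the `&`, so `> ~/dblog.txt` is parsed as a separate command; putting it before the `&` gives a clean background start (paths follow the question's layout):

```shell
# Start the Derby network server detached, logging to a file.
nohup java -jar "$DERBY_HOME/lib/derbyrun.jar" server start \
    > ~/dblog.txt 2>&1 &
echo $! > ~/derby.pid     # keep the PID for a later kill

# Stop it from a later login, either cleanly:
java -jar "$DERBY_HOME/lib/derbyrun.jar" server shutdown
# or, if the clean shutdown fails, with the saved PID:
kill "$(cat ~/derby.pid)"
```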
