Trying to persist process through SSH is failing [closed] - linux

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I have a long-running python program that I'm trying to run on a remote server.
I've looked at "How to keep processes running after ending ssh session?", "How to start process via SSH, so it keeps running?", "Run a persistent process via ssh", and a few other topics, but they don't seem to help.
I've tried running the Python process with screen (by detaching a screen containing a background process) and with nohup, but in both cases, when I exit the ssh session, the ssh session hangs. (I'm not sure if this matters, but the session is run with X11 forwarding, since the Python program creates some graphics.)
The ssh process hangs even if I redirect stdin, stdout, and stderr from/to /dev/null.
Killing the ssh session kills the python process. When I kill the ssh, the following error message is printed on the remote server: g_dbus_connection_real_closed: Remote peer vanished with error: Underlying GIOStream returned 0 bytes on an async read (g-io-error-quark, 0). Exiting.
Furthermore, I don't actually want to redirect stdout or stderr to /dev/null, since I want to redirect them to a log file. So I didn't try running the python process as a daemon. (Perhaps it's bad that the logging is sent to stdout, I guess...)
What should I do, so that I can: (1) keep my process running after logging out, (2) redirect stdout/stderr to a log file?
(One thing that "worked" was suspending the ssh process [after it hangs] and then resuming it in the background, but what if I want to shut off my computer?)

The X11 connection is indeed the problem. Screen takes care of keeping stdin/stdout/stderr connected, and it also protects the program from the HUP signal. However, it does not keep a virtual X server for it to write graphics on.
So the question is: what do you want with the graphics? If your program really needs to output them, you need to set up a (virtual) X server which it can continue to reach even after the connection is lost. You can connect to this virtual server with vnc, just like you can connect to your screen session.
Or you can make the program more suitable for running in the background, which means it must not use the X server. In that case, you probably want to output some files which you can then turn into graphics with a separate program when you want to see them.
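A sketch of the virtual-server approach described above, assuming the Xvfb and x11vnc packages are installed (package names vary by distribution) and with myscript.py standing in for your own program:

```
# Start a virtual X server on display :1 (no physical screen required)
Xvfb :1 -screen 0 1280x1024x24 &

# Run the graphical program against the virtual display, detached and logged
DISPLAY=:1 nohup python myscript.py > myscript.log 2>&1 &

# Optionally expose display :1 over VNC to inspect the graphics later
x11vnc -display :1 -bg -nopw
```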

I thought sshd creates a new session-leader bash for its connection, so if you put your program in the background and redirect its stdout/stderr ( >log 2>&1 ), then even if you lose the connection, the running bash will keep your program under control.
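A minimal sketch of that recipe, with a short sh -c command standing in for the real program:

```shell
# Run the "long job" with nohup so it ignores SIGHUP, and redirect both
# stdout and stderr to a log file so nothing stays tied to the terminal.
nohup sh -c 'echo "job started"; sleep 1; echo "job finished"' > job.log 2>&1 &
JOB_PID=$!
wait "$JOB_PID"   # only for this demo; in real use you would simply log out
cat job.log       # prints: job started / job finished
```

In bash you can additionally run disown after backgrounding, which removes the job from the shell's job table so it is never sent SIGHUP at all.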

Related

Keep Go script running on Linux server after ssh connection is closed [duplicate]

This question already has answers here:
How to prevent a background process from being stopped after closing SSH client in Linux [closed]
(20 answers)
How to make a program continue to run after log out from ssh? [duplicate]
(6 answers)
Closed 1 year ago.
I have a Go script that sets up an HTTP server and responds to any incoming requests. I want this script to run on a Linux server. However, I can't keep the ssh connection active forever, and when it disconnects the script also stops, as expected. I know that NodeJS has 'forever' to fix this issue. I was wondering if something similar exists for Golang, as I was not able to find anything. Any other tips on how to fix this are also very welcome.
There are mainly three options here: the nohup command, the screen command, and byobu, an upgraded version of screen. Having looked at all three, I actually prefer byobu, because it is the most powerful: a more capable version of screen with a friendlier interface.
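For reference, a typical screen workflow looks roughly like this (byobu follows the same model with friendlier defaults; long_job.sh is a placeholder for your own command):

```
screen -S build        # start a named session
./long_job.sh          # run the job inside it
# press Ctrl-a d       # detach; the job keeps running on the server
# ...log out, log back in later...
screen -r build        # reattach to the detached session
```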

Attaching to the output of a running process

A process has been started remotely through an SSH session. Its output stream (text) displays fine through SSH. I would like to display the results locally without interrupting the running process.
Is there a way to attach to a running process and 'piggyback' a stream?
A Linux-only solution is acceptable.
Thanks!
Use reptyr:
reptyr is a utility for taking an existing running program and attaching it to a new terminal. Started a long-running process over ssh, but have to leave and don't want to interrupt it? Just start a screen, use reptyr to grab it, and then kill the ssh session and head on home.
Or retty:
retty is a tiny tool that lets you attach processes running on other terminals.
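The reptyr workflow from its README looks roughly like this (PID 8042 is a made-up example; on some distributions you may first need to relax the kernel's Yama ptrace restriction for reptyr to be allowed to attach):

```
# On the server, in a fresh screen session:
screen -S rescue
reptyr 8042            # attach the running process (PID 8042) to this terminal
# press Ctrl-a d to detach; the process now lives inside screen
```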

How to let a command continue running even though the ssh connection is lost? [duplicate]

This question already has answers here:
How to prevent a background process from being stopped after closing SSH client in Linux [closed]
(20 answers)
Closed 5 years ago.
Compiling something on the server is a heavily time-consuming job, but my VPS's ssh connection is unstable: the connection drops after about 10 minutes. How can I let my command continue running when the ssh connection is lost?
If you have an unstable connection, the screen(1) command is your best solution. It keeps the terminal session alive when you get disconnected and lets you log back in and reconnect to it, preserving the screen state and whatever else you had running. It may not be installed by default on your Linux distribution (it's not on Ubuntu), but it is available in any package system.
There are useful tutorials in various places -- a web search for linux screen command gives many pointers.
A quick Google search pulled up the following 3 SO posts that should help:
Linux: Prevent a background process from being stopped after closing SSH client
Getting ssh to execute a command in the background on target machine
Use SSH to start a background process on a remote server, and exit session
TL;DR - use nohup
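Besides nohup, the Linux setsid utility gives a similar effect by running the command in a brand-new session, fully detached from the SSH session's controlling terminal; a small sketch with echo standing in for a real job:

```shell
# setsid places the command in a new session, so it no longer belongs to
# the SSH session's process group and is not signalled when that closes.
setsid sh -c 'echo detached' > setsid.log 2>&1
sleep 1                # give the detached child time to finish
cat setsid.log         # prints: detached
```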

Why does a node.js process terminate after the ssh session?

I am deploying my node.js program on a remote machine (Ubuntu 14.04), and the program terminates when the ssh session ends, even if I deploy it as a background process.
node app.js &
I understand that using forever can solve this, which I have tried, and it pretty much works. There is already a thread here that describes good solutions to it, and there are many other threads describing good tools and solutions for it.
But I would like to understand: why does the node.js process stop in the first place, even when it runs as a background process?
Since you are connecting through SSH, all the processes belong to that session. Unless protected by a command like nohup ("no hang up" for short), all the processes that belong to your SSH session will die with the session.
It's like logging in as a user, opening Chrome, and logging out: the Chrome process is released once its owner logs out.
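The SIGHUP behaviour described above can be demonstrated directly: a child shell that ignores HUP (which is essentially what nohup arranges) survives a signal that would otherwise terminate it.

```shell
# The child installs an empty trap for SIGHUP, then writes a marker file.
sh -c 'trap "" HUP; sleep 2; echo survived > hup.log' &
CHILD=$!
sleep 1
kill -HUP "$CHILD"     # with the default handler this would kill the child
wait "$CHILD"
cat hup.log            # prints: survived
```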

Why does Amazon kill the process even though I make it run forever? [closed]

Closed. This question needs debugging details. It is not currently accepting answers.
Edit the question to include desired behavior, a specific problem or error, and the shortest code necessary to reproduce the problem. This will help others answer the question.
Closed 4 days ago.
Every time I turn off Putty after I write:
sudo node server.js & (after getting the process started)
a few seconds later the website shuts down, and then I see that the process has been killed... What can be the reason?
OK, I need to add some information: it happens only after I fill in a nodemailer form and send it to server.js; then the process is killed and everything shuts down. The form is the trigger, not anything else!
When the SSH session ends (disconnects) it kills the process.
It sounds like you need forever or nohup or my personal favorite PM2.
Forever: https://github.com/foreverjs/forever
PM2: https://github.com/Unitech/pm2
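For instance, the basic PM2 workflow looks roughly like this (assuming pm2 was installed globally with npm install -g pm2):

```
pm2 start server.js    # launch the app as a managed background process
pm2 logs server        # tail its stdout/stderr
pm2 startup            # optionally generate an init script so it survives reboots
```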
It's most likely an issue with your server.js (or mailer) code. I've been running a node program as a background process on ec2 for quite a while...
--------------------------- edit ---------------------------
You would need to use nohup or another longer-term service to keep the program running after logging off.
I would suggest you look into something like screen. It is a terminal multiplexer that allows you to detach a shell while keeping a process running in it. That avoids messy log files and the like, plus the session can easily be reattached.
