Using screen in a bash script - Linux

I'm running a game server on a remote server where I use a detached screen instance to leave it running.
I'm now creating a script that can be used to shut down the server, back up all the vital files and start it up again; however, I'm having a few difficulties dealing with the screen.
I assumed that I could just switch into the detached screen in the script (after the server had already been shut down) by calling screen -r in the script.
But that doesn't seem to work because if I run the script from outside screen it just launches the server in that session.
screen -r
cd ~/servers/StarMade/
sh StarMade-dedicated-server-linux.sh
screen -d
This is what I thought would do the trick, but it doesn't. Maybe somebody can help me out here. I'm not a bash expert; in fact, this is probably my first bash script that doesn't include "Hello World". Thanks.

Your script, as in your example, will get executed by your shell, not the one in the screen. You need to tell the running screen to read a file and execute it - that's what the -X option is for.
Try
tempfile=$(mktemp)
cat > $tempfile <<EOF
cd ~/servers/StarMade/
sh StarMade-dedicated-server-linux.sh
EOF
screen -X readbuf $tempfile
screen -X paste .
rm -f $tempfile
You can leave screen running in a 2nd terminal session to see what happens.
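If the session was started with a name (an assumption here; e.g. screen -dmS starmade), you can skip the paste buffer and send the keystrokes directly with stuff. A minimal sketch, where "starmade" is a hypothetical session name:
# assumes the server session was created with: screen -dmS starmade
screen -S starmade -p 0 -X stuff "cd ~/servers/StarMade/$(printf '\r')"
screen -S starmade -p 0 -X stuff "sh StarMade-dedicated-server-linux.sh$(printf '\r')"
The $(printf '\r') appends the Enter key, so the shell inside the session actually runs each line.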

Related

Stop script | Spigot | Reattach to a detached screen

I have been trying to set up a Minecraft server for a while and I have a little problem with a stop script. My startup script (which I want to mention, just in case it matters) looks like this:
#!/bin/sh
screen -S server java -Xms2G -Xmx2G -XX:+UseConcMarkSweepGC -jar spigot.jar
And my stop script:
#!/bin/sh
screen -r server stop
Now, the problem is that this script just attaches to the screen without executing that command. I don't want to attach to the screen and then type the "stop" command to stop the server myself. I want a script which can reattach to my detached screen and execute that command for me. Can you find a solution, please? Oh, and I know it's not important, but I want to mention that I am using Spigot 1.12.2.
Maybe you can send the stop command into the session from your stop script instead of reattaching. screen -r won't run a script for you; use -X stuff to type the command into the session named in your startup script:
#!/bin/sh
screen -S server -p 0 -X stuff "stop$(printf '\r')"
The $(printf '\r') sends the Enter key, so the server console actually executes stop.
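If you also want the script to wait until the server has really gone down, a sketch like this could work (the polling loop is an assumption, not part of the original answer; it relies on the session disappearing once java exits):
#!/bin/sh
# send "stop" plus Enter into the session named "server"
screen -S server -p 0 -X stuff "stop$(printf '\r')"
# wait until the session no longer shows up in the session list
while screen -ls | grep -q '[.]server'; do
    sleep 1
done
echo "server stopped"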

Script runs fully when run manually but misses commands when run from another script

I have 2 scripts that I'm testing to automate starting services on my server however they behave weirdly.
The first script is
#!/bin/sh
screen -dmS Test_Screen
sleep 1
sudo sh cd.sh
echo "finished"
which runs perfectly. However, the script it calls does not, and it is as follows:
#!/bin/sh
screen -S Test_Screen -X stuff "cd /home/Test"
sleep 1
screen -S Test_Screen -X eval "stuff \015"
sleep 1
echo "Complete"
The second script runs perfectly if I run it from the command line and will cd into the directory within the screen. However, if it is run from the first script it will not cd into the correct directory within the screen, but it will still print "Complete".
I'm using CentOS 6.7 and the latest version of GNU screen.
Any ideas?
This seems to be a problem with session nesting.
In your first script you create a session named Test_Screen.
In your second script the -S parameter selects a session of that name, but because the call goes through sudo it runs as a different user and may not reach your session; this can cause screen to exit without the cd ever arriving.
You could move the cd command in front of the sudo sh cd.sh call and remove those screen calls from the second script, leaving only
stuff \015
echo "Complete"
Using the correct screen flags should also work.
#!/bin/sh
screen -dr Test_Screen -X stuff "cd /home/Test"
sleep 1
screen -dr Test_Screen -X eval "stuff \015"
sleep 1
echo "Complete"
For a more modern alternative to screen, have a look at tmux.
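For comparison, a rough tmux equivalent of the second script might look like this (a sketch; the session name and path are taken from the question):
#!/bin/sh
# create the session detached if it doesn't exist yet
tmux new-session -d -s Test_Screen 2>/dev/null
# type the command into the session and press Enter
tmux send-keys -t Test_Screen 'cd /home/Test' Enter
echo "Complete"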
OK, so this turned out really weird. After posting I tried a couple of things in a CentOS 6.7 Hyper-V test environment and got the exact same issue. However, later in the day we ended up changing service provider and upgrading to CentOS 7 in the process. I am not sure why, but since the update the script runs perfectly, and I was able to merge the two scripts into one to make it more efficient. If anyone knows why the update fixed it, feel free to let me know.

create screen session that doesn't terminate with the program

I'm working on a startup script that is initiated from rc.local. I start up several programs with
screen -d -m my-prog
and that works great. However, if one of the programs has problems and exits, so does the session. I'd like to be able to have the session stick around so I can attach to it and see the output from the program before it crashed.
Is there a way to do this? I thought about
screen -d -m bash -c my-prog
But again, if my-prog terminates then so does bash and then so does screen.
You can follow the answer at https://unix.stackexchange.com/questions/47271/prevent-gnu-screen-from-terminating-session-once-executed-script-ends
They suggest something like your second attempt, but instead of using bash to invoke the command (which terminates with the command, as you noted), invoke bash after the command finishes, like:
screen -dmS session_name sh -c 'my-prog; exec bash'
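With this, when my-prog exits or crashes, the session stays alive running bash, so you can attach and scroll back through the program's output:
# reattach to the surviving session to inspect the output
screen -r session_name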

How do I pass a command to a screen session?

I'm writing a Linux shell script in which I need to start a new screen session, run a node.js server in the screen, then detach from the screen so that my server runs in the background.
Right now, these are the commands I run manually to do this:
screen
node server.js
[detach screen]
However, I need a way to automate this via the script, and if I just use the above commands in a shell script, it creates the screen and gets stuck there. How can I pass the "node server.js" command to the screen command?
EDIT:
Based on the suggested answer I have a script that works, except that I need to manually create a screen and detach from it before I run it. I tried adding screen -d -m as the first line to create a detached screen, but the script hangs after that line.
tempfile=$(mktemp)
indices=`tail -1 debug.log`
cat > $tempfile <<EOF
node server $indices
EOF
screen -X readbuf $tempfile
screen -X paste .
rm -f $tempfile
How can I create and detach a screen with the script?
This didn't work either:
screen
screen -d
It is as simple as this:
screen -md node server.js
This requires the command to keep running (as node server.js does); otherwise the screen session ends as soon as the command exits.
To optionally also set a name for the session (e.g. "session-name"):
screen -mdS session-name node server.js
You can then attach to the screen with:
screen -rd session-name
If you want to redirect all output to a file, you can do like this:
screen -mdS session-name bash -c 'node server.js &> output.log'
You can then monitor the output with e.g.:
tail -f output.log
You can list your running screens with:
screen -ls
or
screen -list
Example: Start a Python3 web server in a detached screen
Start a Python3 web server listening on port 8000 that serves files in the current directory, in a named
detached screen:
screen -mdS my-web-server python3 -mhttp.server
Or, with logging to a file:
screen -mdS my-web-server bash -c 'python3 -mhttp.server &> output.log'
For Python 2.x it looks like this:
screen -mdS my-web-server bash -c 'python -mSimpleHTTPServer &> output.log'
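When you are done, you can kill a detached session without attaching to it; quit is a standard screen command:
screen -S my-web-server -X quit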
EDIT: Try this:
tempfile=$(mktemp)
cat > $tempfile <<EOF
node server.js
EOF
screen -dmS SessionName
screen -S SessionName -X readbuf $tempfile
screen -S SessionName -X paste .
rm -f $tempfile
It should create the temp file, create a detached screen called SessionName, run the command inside it, then delete the temp file. Starting the session with -dm keeps the script from hanging on an attached screen. Let me know if that works.

SSH: guarding stdout against disconnect

My server deployment script triggers a long-running process through SSH, like so:
ssh host 'install.sh'
Since my internet connection at home is not the best, I can sometimes be disconnected while the install.sh is running. (This is easily simulated by closing the terminal window.) I would really like for the install.sh script to keep running in those cases, so that I don't end up with interrupted apt-get processes and similar nuisances.
The reason why install.sh gets killed seems to be that stdout and stderr are closed when the SSH session is yanked, so writing to them fails. (It's not an issue of SIGHUP, by the way -- using nohup makes no difference.) If I put touch ~/1 && echo this fails && touch ~/2 into install.sh, only ~/1 is created.
So running ssh host 'install.sh &> install.out' solves the problem, but then I lose any "live" progress and error output.
So my question is: What's an easy/idiomatic way to run a process through SSH so that it doesn't crash if SSH dies, but so that I can still see the output as it runs?
Solutions I have tried:
When I run things manually, I use screen for cases like this, but I don't think it will be of much help here because I need to run install.sh automatically from a shell script. Screen seems to be made for interactive use (it complains "Must be connected to a terminal.").
Using install.sh 2>&1 | tee install.out didn't help either (silly of me to think it might).
You can redirect stdout/stderr into install.out and then tail -f it. The following snippet actually works:
touch install.out &&    # create the file first, so tail does not bark (race condition)
(install.sh < /dev/null &> install.out &
tail --pid "$!" -F install.out)    # tail follows the log and exits when install.sh ($!) terminates
But surely there must be a less awkward way to do the same thing?
Try using screen:
screen ./install.sh
If your ssh session gets interrupted, you can simply reattach to the session via another ssh connection:
screen -x
You can provide a terminal to your ssh session using the -t switch:
ssh -t server screen ./install.sh
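Combining the two, a deployment run might look like this (the session name "deploy" and host "host" are just examples):
# start install.sh inside a named screen session on the server
ssh -t host 'screen -S deploy ./install.sh'
# if the connection drops, reconnect and resume watching the output
ssh -t host 'screen -rd deploy'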
Try
install.sh 2>&1 | tee install.out
if the only issue is not getting stderr. You didn't say exactly why the tee wasn't acceptable. You may need the other nohup/stdin tweaks.
