Run a command from a script directly to another process - linux

I'm currently working on a script which interacts with another process.
In case it is relevant, the process in question is a Simdebug console. I want to exit it properly, because killing the process itself creates a lock file (.lck).
The Simdebug console waits for input and closes on receiving quit, then q, then n, each followed by an Enter keypress to validate the command.
I managed to send some commands to the Simdebug using
echo quit > /proc/< PID >/fd/1
But it only prints the result of the echo, and I can't find how to send an Enter keypress, only newlines ('\n').
I also can't manage to send a quit command that would execute directly in the Simdebug rather than in the terminal it was sent from.
My question would be resolved if either of these two points is answered:
Is it possible to simulate a validating Enter keypress, as in:
Term 1 : echo ifconfig ; echo < enter keypress>
which would then execute whatever is in the read buffer of Term 2?
Is there a way to directly execute a command in another process, as in:
Term 1 : < unknown syntax > pwd
Term 2 < shows pwd of term2 not term1>
and which would work not only from terminal to terminal, but also with an already-open process reading its input?

This is actually a hard thing to do. If you send characters to the /proc/self/fd/0 device (or a similar stdin device link) of a process under a different master terminal, they will just appear on the output side of that process's master terminal.
With tools like expect, pdip, or screen you can send anything you want to a process encapsulated in their pseudoterminals, as if it came from its master terminal. But if a process is already running, it will already have its own terminal.
You may be in luck if your console can be persuaded to hand over its controlling terminal with reptyr.
For example, if your console has process ID 999999 (and you have screen and reptyr installed, and maybe did something to appease SELinux or AppArmor/Yama protections):
screen -dmS automateconsole                                # start a detached helper session
screen -S automateconsole -p 0 -X stuff 'reptyr 999999^M'  # steal the console's terminal
screen -S automateconsole -p 0 -X stuff 'quit^M'           # begin the console's quit sequence
sleep 1s
screen -S automateconsole -p 0 -X stuff 'q^M'
sleep 1s
screen -S automateconsole -p 0 -X stuff 'n^M'
sleep 1s
screen -S automateconsole -p 0 -X stuff 'exit^M'           # close the helper shell
But note:
You should probably clean up whatever program initialized the console.
On Ubuntu at least, I could not reptyr processes belonging to other SSH sessions.
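If reptyr fails with a permission error, the Yama ptrace restriction is a common cause; it can be relaxed until the next reboot (root required, and note this is a system-wide security setting):
# Allow ptrace attaches to non-child processes (Yama ptrace_scope).
echo 0 | sudo tee /proc/sys/kernel/yama/ptrace_scope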
https://github.com/nelhage/reptyr
http://theterminallife.com/sending-commands-into-a-screen-session/

Related

Keep attached screen session alive and waiting for more commands to enter from detached screens

I have an idea for a script that I'm trying to bring to life. I want one main process running that keeps track of subprocesses running in the background. I decided to go with screen for my implementation: I can run main.sh in attached mode, with the subcommands all running detached, and I can send text to main.sh once each sub-process has hit its point. However, I want the main script to keep running until all sub-processes finish (accomplished), while updating what it prints out to the user. I currently have the main process doing:
while screen -list | grep -q process-1 || screen -list | grep -q process-2 || screen -list | grep -q process-3; do
sleep 1
done
which works perfectly as the main loop, but any data I send to that process just prints out as text, not as a command. Is there a way I can keep a screen session alive and receiving more commands/variables?
I currently plan on sending data from the sub-processes like screen -S main -X stuff "PROCESS_1=FINISHED", and the main will keep trying to grab the variable PROCESS_1 to get its status.
I can't just take the result of the sub-process either, since I plan on at least one of the commands running continuously, but I still want to let the main process know when it has hit a certain point. I was also tinkering with the idea of using file descriptors, but I had issues getting that working in detached mode.
Is this possible and I just haven't used/found the right command? Do I need to somehow use another screen as a kind of data layer, so that the main layer just prints out to the user?
For completeness, here is the current setup:
start.sh
#!/bin/bash
screen -S process-1 -dm ./process-1.sh
screen -S process-2 -dm ./process-2.sh
screen -S main -m ./main.sh
main.sh
#!/bin/bash
while screen -list | grep -q process-1 || screen -list | grep -q process-2; do
sleep 1
done
process-1.sh
#!/bin/bash
count=0
while [[ count -ne 5 ]]; do
((count+=1))
sleep 5
done
screen -S main -X stuff "PROCESS_1=FINISHED"
process-2.sh
#!/bin/bash
count=0
while [[ count -ne 3 ]]; do
((count+=1))
sleep 5
done
screen -S main -X stuff "PROCESS_2=FINISHED"
Thanks for any advice you can provide.
EDIT:
Something along these lines is the end goal: like Vue.js, a constant screen that updates whenever changes are made (or, in this case, whenever processes end).
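Since stuff only pastes text into the window's input, one possible approach (a sketch, assuming main.sh stays attached and each sub-process terminates its message with \n, e.g. screen -S main -X stuff $'PROCESS_1=FINISHED\n') is to have main.sh read its own stdin with a timeout and treat each stuffed line as a status update:
#!/bin/bash
# main.sh (sketch) - consume stuffed text as stdin lines.
declare -A status=( [PROCESS_1]=RUNNING [PROCESS_2]=RUNNING )
while [[ ${status[PROCESS_1]} != FINISHED || ${status[PROCESS_2]} != FINISHED ]]; do
    clear                                  # redraw the status display
    for p in PROCESS_1 PROCESS_2; do
        echo "$p: ${status[$p]}"
    done
    # Wait up to one second for a stuffed line such as PROCESS_1=FINISHED.
    if IFS= read -r -t 1 line; then
        status[${line%%=*}]=${line#*=}
    fi
done
echo "All processes finished."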

Automate SSH and screen commands

I have a PHP script with a web interface which is used to provide input into a process which I am automating. In this process, I am attempting to include using SSH and screen to run some commands (all sent through PHP's exec).
I am currently using the following, which writes the echo command to the screen but isn't executing it. I've tried adding the -ne option to echo and adding \n or ^M to the end. I've also tried changing around the types of quotes I'm using, but I'm having trouble getting the code to execute (sending Enter).
ssh -t -t myUser#myDomain.com 'screen -r -d -S -X myScreen stuff "echo -ne sent from script"' 2>&1
How do I go about getting that code to execute?
This took much longer to figure out than I would like to admit. The key was adding a $ before the string sent to stuff and then using \n inside it to send the Enter press.
I added an extra screen -list; to demonstrate how to send multiple commands before entering the screen. I also added an extra echo to demonstrate how to send multiple commands to the screen.
ssh -t -t myUser#myDomain.com "screen -list; screen -r -d -X -S myScreen stuff $'echo here; echo here\n'" 2>&1
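The $'...' form is bash's ANSI-C quoting: the shell itself replaces \n with a real newline before screen ever sees the string, which is what makes stuff press Enter. A minimal illustration of the difference (session name as above):
# Plain double quotes: bash passes the two characters '\' and 'n' through to screen.
screen -X -S myScreen stuff "echo hello\n"
# ANSI-C quoting: bash substitutes a literal newline, so the pasted text
# ends with an Enter press and the command executes.
screen -X -S myScreen stuff $'echo hello\n'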

Run commands in screen after creating one per bash

I have the following bash file which should create a screen, go to a directory and then start a node script:
screen -S shared       # 1
cd /home/nodejsapp     # 2
node start.js app.js   # 3
The problem is that after executing 1, I indeed see the screen 'shared', but 2 and 3 execute in the previous terminal, not in the screen 'shared'.
How can I make commands 2 and 3 execute inside that screen?
You may create a detached screen and then send commands to it. For example:
screen -d -m -S shared
screen -S shared -X -p 0 stuff $'cd /home/nodejsapp\n'
screen -S shared -X -p 0 stuff $'node start.js app.js\n'
If you need to attach to the screen session afterwards, then you can add one more line:
screen -S shared -r
See screen's manual for more details:
screen options
screen commands
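Put together as the single bash file the question describes (same session name and paths as above, attach step optional):
#!/bin/bash
# Create the session detached, queue both commands, then attach to watch.
screen -d -m -S shared
screen -S shared -X -p 0 stuff $'cd /home/nodejsapp\n'
screen -S shared -X -p 0 stuff $'node start.js app.js\n'
screen -S shared -r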
You could run a "server" as the program within screen, which reads commands to execute from the pseudoterminal that the tty program identifies. For instance, as I'm writing this, tty says (inside screen):
/dev/pts/2
and I can write to it by
date >/dev/pts/2
On the server side, the script would read line-by-line in a loop from the same device. (On some other systems, there are differently-named devices for each side of the pseudoterminal.)
That only needs a server script which starts by getting the output of tty and writing it to a file that a corresponding client knows about; the client then reads commands (whether from the keyboard or a file) and writes them to the server via the pty device.
That's doable with just a couple of shell scripts (though a little lengthier than the usual answer here).
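Here is a rough sketch of such a server/client pair; note it uses a named pipe as the command channel instead of the pty device itself (a deliberate swap, so clients don't compete with the keyboard for the terminal's input), and the path /tmp/screen-server.fifo is just a hypothetical convention:
#!/bin/bash
# server.sh (sketch) - run inside the screen window; executes lines
# that clients write to the pipe.
pipe=/tmp/screen-server.fifo
[ -p "$pipe" ] || mkfifo "$pipe"
while true; do
    # Opening the FIFO blocks until a client connects; read until it closes.
    while IFS= read -r cmd; do
        eval "$cmd"        # run the received line in this window
    done < "$pipe"
done
The client side is then a single redirection; its output appears in the screen window:
echo 'date' > /tmp/screen-server.fifo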

How to stop ffmpeg remotely?

I'm running ffmpeg on another machine for screen capture. I'd like to be able to stop it recording remotely. FFmpeg requires that q be pressed to stop encoding, as it has to do some finalization to finish the file cleanly. I know I could kill it with kill/killall; however, this can lead to corrupt videos.
Press [q] to stop encoding
I can't find anything on Google specifically for this, but there is some suggestion that echoing into /proc/<pid>/fd/0 will work.
I've tried this, but it does not stop ffmpeg. The q is, however, shown in the terminal in which ffmpeg is running.
echo -n q > /proc/16837/fd/0
So how can I send a character to another existing process in such a way that it is as if it were typed locally? Or is there another way of remotely stopping ffmpeg cleanly?
Here's a neat trick I discovered when I was faced with this problem: Make an empty file (it doesn't have to be a named pipe or anything), then write 'q' to it when it's time to stop recording.
$ touch stop
$ <./stop ffmpeg -i ... output.ext >/dev/null 2>>Capture.log &
$ # ...wait until it's time to stop recording...
$ echo 'q' > stop
FFmpeg stops as though it got 'q' from the terminal STDIN.
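Since the goal is to stop it remotely, the same file can be written over SSH (host and path here are placeholders):
$ ssh user@capture-host "echo q > /path/to/stop"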
Newer versions of ffmpeg don't use 'q' anymore, at least on Ubuntu Oneiric; instead, they say to press Ctrl+C to stop them. So with a newer version you can simply use 'killall -INT' to send SIGINT instead of SIGTERM, and they should exit cleanly.
Elaborating on the answer from sashoalm, I have tested both scenarios, and here are the results:
My experiments show that doing
killall --user $USER --ignore-case --signal INT ffmpeg
produces the following on the console where ffmpeg was running:
Exiting normally, received signal 2.
While doing
killall --user $USER --ignore-case --signal SIGTERM ffmpeg
produces
Exiting normally, received signal 15.
So it looks like ffmpeg exits cleanly on both signals.
System: Debian GNU/Linux 9 (stretch), 2020-02-28
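And because the original question is about stopping ffmpeg remotely, either signal can simply be sent over SSH (the host name is a placeholder):
# Ask ffmpeg on the capture machine to finish cleanly (SIGINT = Ctrl+C).
ssh user@capture-host 'killall --ignore-case --signal INT ffmpeg'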
You can also try using expect to automate the execution and stopping of the program. You would have to start it inside some virtual shell like screen, tmux, or byobu, and then start ffmpeg inside of it. This way you can later reattach to the virtual shell and send the "q" keypress.
Locally or remotely, start a virtual shell session, let's say with screen. Name the session with the -S option, e.g. screen -S recvideo. Then you can start ffmpeg as you like. Optionally, detach from this session with Ctrl+a d.
Connect to the machine where ffmpeg is running inside the screen (or tmux or whatever) and reconnect to it with screen -d -RR recvideo, then send the "q".
To do that from inside a script you can then use expect, like:
prompt="> "
expect << EOF
set timeout 20
spawn screen -S recvideo
expect "$prompt"
send -- "ffmpeg xxxxx\r"
set timeout 1
expect eof
EOF
Then, at another moment or point in the script, or in another script entirely, you recover it:
expect << EOF
set timeout 30
spawn screen -d -RR recvideo
expect "$prompt"
send -- "q"
expect "$prompt"
send -- "exit\r"
expect eof
EOF
You can also automate the whole ssh session with expect, passing a sequence of commands and "expects" to do what you want.
The question has already been answered for Linux, but it came up when I was looking for the Windows equivalent, so I'll add that to the answers:
In PowerShell, you start the process like this:
$((Start-Process ffmpeg -passthru -argument "FFMPEG_ARGS").ID)
This returns the PID of the ffmpeg process, which you can store in a variable or echo; you then send the Windows equivalent of SIGINT (Ctrl+C) using taskkill:
taskkill /pid FFMPEG_PID
I tried Stop-Process (which is what comes up when searching how to do this on Google), but it actually kills the process. (And yes, taskkill doesn't kill it; it gently asks the process to stop... good naming :D)

How to stop a screen process in linux?

I am running a script on a remote server. I ran the script in screen; however, I need to stop it before it completes, since I need to update the script. I can easily detach from screen, but is there a way to kill a screen session?
CTRL+a and then 'k' will kill a screen session.
There are a couple of 'screen' ways to kill a specific screen session from the command line (non-interactively).
1) send a 'quit' command:
screen -X -S "sessionname" quit
2) send a Ctrl-C to a screen session running a script:
screen -X -S "sessionname" stuff "^C"
In both cases, you would need to use 'screen -ls' to find the session name of the screen session you want to kill ... if there is only one screen session running, you won't need to specify the -S "sessionname" parameter.
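For example, with a hypothetical session named 12345.mysession, screen -ls shows the name to pass to -S:
$ screen -ls
There is a screen on:
        12345.mysession (Detached)
$ screen -X -S 12345.mysession quit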
I used this to quit hundreds of erroneous screen sessions created by a buggy command:
for s in $(screen -ls|grep -o -P "1\d+.tty"); do screen -X -S $s quit; done;
where grep -o -P "1\d+.tty" extracts the session names using the Perl-style regex "1\d+.tty", which matches all sessions that start with the number 1, continue with more digits (\d), and end with .tty.
Warning:
You should test with the following command first, to check that you get exactly the list of sessions you want before applying the command above. This avoids quitting unwanted sessions:
for s in $(screen -ls|grep -o -P "1\d+.tty"); do echo $s; done;
I always do this echo test whenever the list in a for loop is not obvious, for example one generated by a sub-command in a $() expansion.
The previous answers didn't work for me in a PuTTY terminal on Windows connected to an Amazon SSH server, but this one does:
screen -S yourscreentitlehere -X stuff $'\003'
references:
Sending ctrl-c to specific screen session
$'\003' is Ctrl+C: http://donsnotes.com/tech/charsets/ascii.html
stuff is https://www.gnu.org/software/screen/manual/screen.html#Paste
I am using PuTTY, and it seems I am already attached to the screens and can neither open nor close them. Every time I type "exit", I just close the PuTTY window. Here is the terminal output:
>>screen -r
21063.unlimited (11/08/20 15:45:19) (Attached)
24054.cure6 (11/08/20 09:46:13) (Attached)
There is no screen to be resumed.
and
screen -S 21063.unlimited -X stuff $'\003'
does not do anything.
I found that something as simple as the following line works perfectly:
screen -x 21063.unlimited
It sends me back into the screen, and from there "exit" works.
Note that it is a lower-case -x, which attaches even if the session is already attached elsewhere.
