How to wait for user input in a terminal called with -e option? [duplicate] - linux

This question already has answers here:
Prevent Gnome Terminal From Exiting After Execution [duplicate]
(4 answers)
Closed 9 years ago.
I'm trying to open gnome-terminal (though I think the same applies to any x-terminal-emulator) with a command provided via the -e option, like gnome-terminal -e 'ls'. The terminal closes as soon as the command finishes, so I need a way to wait for user input, so that I can read the result and then close the window with an Enter press.
I tried gnome-terminal -e 'ls; read -p "..."' and it works if I run ls; read -p "..." in an already opened terminal, but a terminal launched with the -e option still closes immediately.
So is there any way to keep the terminal open until some user input is provided while using -e option?

Spawn a shell: the -e option runs the command directly, not through a shell, so shell syntax like ; and builtins like read only work if you wrap the command in bash -c.
xterm -e bash -c 'ls; read -p "Press any key ..."'
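The same approach should work with gnome-terminal itself; note that recent gnome-terminal versions deprecate -e in favour of -- for passing a command line, and that read -p waits for a full line (Enter) while read -n1 returns after a single keypress. A small sketch:
# Keep the window open until Enter is pressed (newer gnome-terminal syntax).
gnome-terminal -- bash -c 'ls; read -p "Press Enter to close..."'
# Or close on any single keypress instead.
gnome-terminal -- bash -c 'ls; read -n1 -p "Press any key to close..."'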


SSH the output to different terminals [duplicate]

This question already has answers here:
how do i start commands in new terminals in BASH script
(2 answers)
Closed 17 days ago.
I am using a for loop to SSH to multiple hosts:
#!/usr/bin/bash
bandit=$(cat /root/Desktop/bandit.txt)
for host in {1..2}
do
    echo "inside the loop"
    ssh bandit$host@$bandit -p 2220
    echo "After the loop"
done
#ssh bandit0@bandit.labs.overthewire.org -p 2220
bandit.txt has the following content " bandit.labs.overthewire.org"
I am getting the SSH prompts, but only one at a time: for example, first I get the "bandit1" login prompt, and only after closing the "bandit1" session do I get the second SSH session, for "bandit2".
I would like to get two different terminals for each SSH session.
But there is no such thing as a "terminal window" in bash (well, there is a tty, yours; but you can't just open a new window. Bash is not aware that it is running inside a specific program that emulates the behaviour of a terminal in a GUI window).
So it can't be as easy as you might think.
Of course, you can choose a terminal emulator, and run it yourself.
For example
for host in {1..2}
do
    xterm -e ssh bandit$host@$bandit -p 2220 &
done
may be what you are looking for, if you have the xterm program installed.
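Put together with your original script, it might look roughly like this (a sketch, assuming xterm is installed and bandit.txt holds only the hostname):
#!/usr/bin/bash
# Read the target host from the file, stripping any surrounding whitespace.
bandit=$(tr -d '[:space:]' < /root/Desktop/bandit.txt)

for host in {1..2}
do
    # Launch each SSH session in its own xterm window, in the background,
    # so the loop does not block on one session before starting the next.
    xterm -e ssh "bandit$host@$bandit" -p 2220 &
done

# Optionally wait until all the terminal windows are closed before exiting.
wait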
Maybe with some additional error checking, something like this:
scr=( /dev/stdout
      $(ps -fu $USERNAME |
        awk '$4~/^pty/{lst[$4]} END{for (pty in lst) print pty}') )
for host in {1..2}; do echo ssh bandit$host@etc... >> /dev/${scr[$host]}; done
There are a lot of variations and kinks to work out, though. tty or pty? What if there's no other window open? etc... But with luck it will give you something to work from.

Shell Script - Run shell commands in parallel in different bash sessions [duplicate]

This question already has answers here:
How do you run multiple programs in parallel from a bash script?
(19 answers)
Closed 2 years ago.
I have a requirement to run database restore commands in parallel from shell scripts. Both the commands should run in different bash sessions in parallel.
The following are the commands I need to run.
sudo su - $user -c "db2 RESTORE DATABASE ${SDBP} FROM '/dbnfs/db2main/backups/${DB2DBP}' TAKEN AT $TIMESTAMPP ON '/data1/DB2/tablespaces/${DB2DBP}' , '/data2/DB2/tablespaces/${DB2DBP}' DBPATH ON '/home/db2inst1' INTO ${DB2DBP} NEWLOGPATH '/data1/activelogs/${DB2DBP}' without rolling forward without prompting 2>&1"
sudo su - $user -c "db2 RESTORE DATABASE ${SDBS} FROM '/dbnfs/db2main/backups/${DB2DBS}' TAKEN AT $TIMESTAMPS ON '/data1/DB2/tablespaces/${DB2DBS}' , '/data2/DB2/tablespaces/${DB2DBS}' DBPATH ON '/home/db2inst1' INTO ${DB2DBS} NEWLOGPATH '/data2/activelogs/${DB2DBS}' without rolling forward without prompting 2>&1"
Let me know how to achieve it.
Since you want different bash sessions (perhaps because the commands are long-running), the screen command might be of interest to you.
You can create a new named screen (session); let's call it restore1 for the first command:
screen -S restore1
This will create a new screen. In this screen you can run your first command. Once the command starts running, you can "detach" from it (Ctrl-a d).
Create another named screen called restore2 for SDBS:
screen -S restore2
Run the second command here, then detach from it. You can check the screens (sessions) by:
screen -list
You can reattach to any of the screens with screen -r <screen_name>, to check on the status of that command. Example:
screen -r restore1
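If you would rather script it than attach interactively, screen -dmS starts a detached session that runs a command directly. A sketch, where restore_primary.sh and restore_secondary.sh are hypothetical wrappers around the two sudo su - ... db2 RESTORE commands above:
#!/bin/bash
# Start each restore in its own detached screen session.
# restore_primary.sh / restore_secondary.sh are placeholders for the
# two restore command lines shown in the question.
screen -dmS restore1 bash -c '/path/to/restore_primary.sh'
screen -dmS restore2 bash -c '/path/to/restore_secondary.sh'

# Check on them later with screen -list, then screen -r restore1
screen -list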

Process killed after closing terminal SSH [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 5 years ago.
I'm trying to clear the Facebook cache on my server every 2 seconds, so I logged in over SSH and ran this command:
while true; do sleep 2; curl -F id="http://twd.ma" -F scrape=true -F access_token='token' -F appID=appID https://graph.facebook.com; done &
Everything worked fine and the cache started being cleared every 2 seconds. However, when I close the SSH terminal the cache stops being cleared, and I think the process is killed. What should I do?
Your command stops executing because the shell is lost when you log out. The '&' only means that the command runs in the background for as long as the shell is still alive.
You can do the following:
Write your script into a file, e.g. clearcache.sh, and omit the '&':
#!/bin/bash
while true; do
    sleep 2
    curl -F id="http://twd.ma" -F scrape=true -F access_token='token' -F appID=appID https://graph.facebook.com
done
Write the path to your script into /etc/rc.local
/path/to/clearcache.sh > /dev/null 2>&1 &
The '> /dev/null 2>&1' means that all output your script produces is discarded.
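On Debian-style systems the resulting /etc/rc.local might look roughly like this (a sketch; the exact header varies by distribution, and /path/to/clearcache.sh is a placeholder):
#!/bin/sh -e
# rc.local - executed at the end of boot.
# Launch the cache-clearing loop in the background, discarding its output.
/path/to/clearcache.sh > /dev/null 2>&1 &

exit 0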
If screen is available to you then you can start a screen session by running screen, run your commands, then press ctrl-a ctrl-d to detach the session.
When you log in later you can issue screen -r to reconnect to the detached session.
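In command form, that workflow might look like this (a sketch, reusing the clearcache.sh script from above):
screen                  # start a new screen session
./clearcache.sh         # run the loop inside it
# press Ctrl-a then d to detach; the loop keeps running after you log out

screen -r               # later, from a new SSH login, reattach to check on it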

How to use output redirection with sudo? [duplicate]

This question already has answers here:
How do I use sudo to redirect output to a location I don't have permission to write to? [closed]
(15 answers)
Closed 5 years ago.
I wish to write text from stdin to a file that is owned by root.
I'm doing this programmatically across an SSH connection - essentially from a script - so using a text editor is out of the question. There is no terminal. The process is entirely automated.
Given that [a] root elevation can be obtained via sudo, and [b] files can be written to using cat with redirection, I assumed the following would work:
ssh user@host sudo cat >the-file
Unfortunately, the redirection is applied to sudo, not to cat. How can I apply redirection to cat in this example?
The normal pattern to do this is:
ssh user@host 'cat | sudo tee the-file'
tee redirects output to a file and can be run with sudo.
If you want to mimic >> where the file is appended-to, use sudo tee -a.
You'll also need to be sure to quote your command as above, so that the pipe isn't interpreted by the local shell.
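For example, appending a single line to a root-owned file over SSH might look like this (a sketch; /etc/example.conf is just a placeholder path):
# Append a line to a root-owned remote file; tee's own stdout is discarded.
echo 'some setting' | ssh user@host 'sudo tee -a /etc/example.conf > /dev/null'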
Edit
The tee command is part of POSIX, so you can rely on it existing.
To redirect Standard Error as well:
ssh user@host 'some_command 2>&1 | sudo tee output-file'

Run Multiple Commands on Same SSH Session in Bash? [duplicate]

This question already has answers here:
What is the cleanest way to ssh and run multiple commands in Bash?
(14 answers)
Closed 3 years ago.
I would like to run several commands on one SSH session. For example, my script right now has something like the following:
ssh "machine A" do-thing-1
ssh "machine B" do-thing-2
ssh "machine A" do-thing-3
Having to SSH to A again in the third line is wasting a lot of time. How do I execute this without having to SSH again? Is this possible?
If the ssh to A does not consume its standard input, you can easily make it wait for input. Perhaps something like this.
ssh B 'sleep 5; do-thing-2; echo done.' | ssh A 'do-thing-1; read done; do-thing-3'
The arbitrary sleep to allow do-thing-1 to happen first is obviously a wart and a potential race condition.
A somewhat simpler and more robust solution is to use the ControlMaster feature to create a reusable ssh session instead.
cm=/tmp/cm-$UID-$RANDOM$RANDOM$RANDOM
ssh -M -S "$cm" -N A &
session=$!
ssh -S "$cm" A do-thing-1
ssh B do-thing-2
ssh -S "$cm" A do-thing-3
kill "$session"
wait "$session"
See https://en.wikibooks.org/wiki/OpenSSH/Cookbook/Multiplexing for more.
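If you do this often, the same multiplexing can be set up once in ~/.ssh/config instead of on the command line; a sketch using standard OpenSSH options (the 10-minute ControlPersist value is just an example):
# ~/.ssh/config
Host A
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 10m
With this in place, the first ssh A creates the master connection and subsequent ssh A commands reuse it automatically, with no -M or -S flags needed.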
You can use screen
$: screen
$: do-thing-1
Ctrl-a then Ctrl-d detaches from this screen,
$: screen
$: do-thing-2
Ctrl-a then Ctrl-d detaches from this screen,
$: screen
$: do-thing-3
Ctrl-a then Ctrl-d detaches from this screen,
list all `screen` sessions,
$: screen -ls
Reattach to a screen by id,
$: screen -r <Screen ID>
