Shell script while loop only runs once [duplicate]

This question already has answers here:
ssh breaks out of while-loop in bash [duplicate]
(2 answers)
Closed 8 years ago.
I want to get the time from all my servers with ssh, but my script only loops once and then exits. What's the reason?
servers.txt
192.168.1.123
192.168.1.124
192.168.1.125
192.168.1.126
gettime.sh
#!/bin/bash
while read host
do
ssh root@${host} date
done < "servers.txt"
OUTPUT
Tue Feb 3 09:56:54 CST 2015

This happens because ssh reads the standard input you intended for your loop, consuming the remaining lines of servers.txt. You can redirect its input from /dev/null (or use ssh -n) to prevent it:
#!/bin/bash
while read host
do
ssh root@${host} date < /dev/null
done < "servers.txt"
The same thing tends to happen with ffmpeg and mplayer, and can be solved the same way.
PS: This and other issues are automatically caught by shellcheck:
In yourscript line 4:
ssh root@${host} date
^-- SC2095: Add < /dev/null to prevent ssh from swallowing stdin.
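You can see the stdin-swallowing effect without any servers. In this minimal local sketch, `cat` stands in for ssh (both read stdin until EOF, which is exactly how ssh eats the rest of the loop's input); the host list and variable names are made up for illustration:

```shell
#!/bin/sh
# 'cat' plays the role of ssh: it consumes the loop's remaining stdin.
printf '192.168.1.123\n192.168.1.124\n192.168.1.125\n' > hosts.txt

broken=0
while read host; do
  broken=$((broken + 1))
  cat > /dev/null              # swallows the remaining hosts, like ssh
done < hosts.txt

fixed=0
while read host; do
  fixed=$((fixed + 1))
  cat < /dev/null > /dev/null  # stdin redirected away; loop sees every host
done < hosts.txt

echo "broken loop ran $broken time(s); fixed loop ran $fixed time(s)"
# prints: broken loop ran 1 time(s); fixed loop ran 3 time(s)
rm -f hosts.txt
```

The broken loop runs exactly once, because `cat` drains lines 2 and 3 before the next `read` gets a chance.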


SSH the output to different terminals [duplicate]

This question already has answers here:
how do i start commands in new terminals in BASH script
(2 answers)
Closed 17 days ago.
I am using a for loop to SSH into multiple hosts:
#!/usr/bin/bash
bandit=$(cat /root/Desktop/bandit.txt)
for host in {1..2}
do
echo "inside the loop"
ssh bandit$host@$bandit -p 2220
echo "After the loop"
done
#ssh bandit0@bandit.labs.overthewire.org -p 2220
bandit.txt has the following content " bandit.labs.overthewire.org"
I am getting the SSH prompts, but only one at a time: first I get the "bandit1" host login prompt, and only after closing the "bandit1" session do I get the second SSH session.
I would like to get two different terminals for each SSH session.
But there is no such thing as a "terminal window" in bash. (There is a tty, yours; but you can't just open a new window. Bash is not aware that it is running inside a specific program that emulates the behavior of a terminal in a GUI window.)
So it can't be as easy as you might think.
Of course, you can choose a terminal emulator, and run it yourself.
For example
for host in {1..2}
do
    xterm -e ssh bandit$host@$bandit -p 2220 &
done
may be what you are looking for, if you have the xterm program installed.
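As a dry run, the loop below only prints the xterm command it would launch for each host (remove the `echo` wrapping to actually open the windows); the hostname is the one from the question's bandit.txt:

```shell
#!/bin/sh
# Dry-run sketch: show the per-host xterm command instead of running it.
bandit=bandit.labs.overthewire.org
for host in 1 2; do
  cmd="xterm -e ssh bandit$host@$bandit -p 2220 &"
  echo "$cmd"   # drop this echo (and the quoting) to really launch xterm
done
```

The trailing `&` is what lets all the windows open at once instead of one after another.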
Maybe, with some additional error checking, something like this -
scr=( stdout
      $(ps -fu "$USER" |
        awk '$6 ~ /^pts/ { lst[$6] } END { for (pty in lst) print pty }') )
for host in {1..2}; do echo ssh bandit$host@etc... >> /dev/"${scr[$host]}"; done
There are a lot of variations and kinks to work out though. tty or pty? What if there's no other window open? etc... But with luck it will give you something to work from.

Taking sequentially output of multi ssh with bash scripting

I wrote a bash script and I have no way to install pssh. However, I need to run multiple ssh commands in parallel. That part works, but when I run the ssh commands, their outputs get mixed up because more than one ssh is writing at the same time; I want to see each remote machine's output sequentially.
#!/bin/bash
pid_list=""
while read -r list
do
ssh user@$list 'commands' &
c_pid=$!
pid_list="$pid_list $c_pid"
done < list.txt
for pid in $pid_list
do
wait $pid
done
What should I add to the code to take the output unmixed?
The most obvious way to me would be to write the outputs in a file and cat the files at the end:
#!/bin/bash
me=$$
pid_list=""
while read -r list
do
ssh user@$list 'hostname; sleep $((RANDOM%5)); hostname; sleep $((RANDOM%5))' > /tmp/log-${me}-$list &
c_pid=$!
pid_list="$pid_list $c_pid"
done < list.txt
for pid in $pid_list
do
wait $pid
done
cat /tmp/log-${me}-*
rm /tmp/log-${me}-* 2> /dev/null
I didn't handle stderr because that wasn't in your question. Nor did I address the order of output because that isn't specified either. Nor is whether the output should appear as each host finishes. If you want those aspects covered, please improve your question.
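The same pattern can be checked locally without ssh. In this sketch, `sh -c` stands in for the remote ssh command, the three host names are invented placeholders for the entries in list.txt, and each job gets its own temp file that is only printed after every job has finished:

```shell
#!/bin/sh
# One temp file per background job; print them in order after 'wait'.
tmpdir=$(mktemp -d)
pids=""
for host in alpha beta gamma; do           # stand-ins for list.txt entries
  sh -c "echo begin-$host; echo end-$host" > "$tmpdir/$host" &
  pids="$pids $!"
done
for pid in $pids; do wait "$pid"; done     # let every job finish first
result=$(cat "$tmpdir/alpha" "$tmpdir/beta" "$tmpdir/gamma")
echo "$result"
rm -rf "$tmpdir"
```

Each host's begin/end lines stay together in the output, no matter how the jobs interleaved while running, because nothing is printed until all the files are complete.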

'nohup 2>&1 >out' vs 'nohup >out 2>&1' [duplicate]

This question already has answers here:
Whats the difference between redirections "1>/dev/null 2>&1" and " 2>&1 1>/dev/null"?
(3 answers)
Closed 2 years ago.
For me, the following two bash commands produce different results:
# returns instantly
ssh localhost 'nohup sleep 5 >out 2>&1 &'
# returns after 5 seconds
ssh localhost 'nohup sleep 5 2>&1 >out &'
It's surprising because the following two bash commands produce the same result:
# both return instantly
nohup sleep 5 >out 2>&1 &
nohup sleep 5 2>&1 >out &
Why?
This form:
command >out 2>&1
... redirects standard output to the file, then points standard error at the same place standard output currently points. Both streams end up in the file.
This form:
command 2>&1 >out
... first points standard error at the current standard output (the terminal, or here the ssh channel), and only then redirects standard output to the file. Standard error never reaches the file.
So, at your terminal, you are backgrounding the task and the prompt returns immediately. When using ssh, in the first case, there is no output to display because everything has been sent to a file. In the other case, standard out can still display information that the application tries to send to standard error.
My expectation is that ssh returns immediately in the first case because there will never be any output. In the second case, there is an open stream that may return data for you to see.
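You can verify the ordering rule locally, with no ssh involved. This sketch uses a made-up helper `emit` that writes one line to stdout and one to stderr, then applies both redirection orders:

```shell
#!/bin/sh
# emit writes 'out' to stdout and 'err' to stderr.
emit() { echo out; echo err >&2; }

emit > f1 2>&1                   # stdout -> f1, then stderr -> same place: f1 gets both lines
leaked=$( { emit 2>&1 > f2; } )  # stderr -> current stdout FIRST, then stdout -> f2:
                                 # 'err' escapes the file and is captured here instead
f1c=$(cat f1)                    # out + err
f2c=$(cat f2)                    # out only
echo "f2 got: $f2c; leaked to stdout: $leaked"
rm -f f1 f2
```

In the second form, `err` is exactly the data that ssh would still be holding the connection open to deliver.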

Running Vagrant SSH causes BASH loop to terminate prematurely [duplicate]

This question already has answers here:
ssh breaks out of while-loop in bash [duplicate]
(2 answers)
Closed 7 years ago.
I have a bash script that fetches running selenium nodes, grabs their ID, and SSHs into them to perform configuration tasks.
#!/bin/bash
# retrieve list of running vagrant machines, filter them to selenium nodes, and
# process provisioning for each
vagrant global-status --prune | grep "selenium_node" | while read -ra line ; do
echo -e "\033[32mBEGINNING ${line[1]} PROVISIONING\033[37m"
# adding this statement causes the loop to exit after 1 iteration
vagrant ssh ${line[0]} -- "
echo 'it runs'
"
echo -e "\033[32mEND ${line[1]} PROVISION\033[37m"
done
My problem is that running vagrant ssh causes the loop to terminate after the first iteration. I confirmed this by removing 'vagrant ssh' and the results were that both the BEGINNING and END echo commands ran successfully for every iteration (in my case - two iterations).
What's stranger is that the loop DOES complete its first iteration (as evinced by the END echo line completing); it just doesn't run any further iterations.
Also, I've confirmed that it's not just neglecting to show the output from the other iterations. It never performs any operations on the other machines.
ssh - including vagrant ssh - consumes standard input, so if you run it inside a while read loop, it won't leave anything for the next loop iteration to read.
You can fix that by either telling ssh not to read standard input (ssh -n) or by using a different construct than while read. In this case, since vagrant ssh doesn't support the -n option, I suggest running it with its input redirected from /dev/null:
</dev/null vagrant ssh ${line[0]} -- "
echo 'it runs'
"

How to wait for user input in a terminal called with -e option? [duplicate]

This question already has answers here:
Prevent Gnome Terminal From Exiting After Execution [duplicate]
(4 answers)
Closed 9 years ago.
I'm trying to open gnome-terminal (though I think it would be related to any x-terminal-emulator) with a command provided using -e option, like gnome-terminal -e 'ls'. The terminal is closed as soon as the command is done working, so I need a way to wait for user input to be able to read the result and then finally close the window with Enter press.
I tried gnome-terminal -e 'ls; read -p "..."' and it works if I run ls; read -p "..." in an already opened terminal, but a terminal called with -e option keeps getting closed.
So is there any way to keep the terminal open until some user input is provided while using -e option?
The command passed with -e is executed directly, not through a shell, so shell syntax such as ; and the read builtin are never interpreted. Spawn a shell explicitly:
xterm -e bash -c 'ls; read -p "Press any key ..."'
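The difference is easy to demonstrate outside a terminal emulator. Here `env`, which execs its argument directly, stands in for the emulator's -e launching, while `sh -c` plays the role of the `bash -c` wrapper (the commands and strings are invented for illustration):

```shell
#!/bin/sh
# Direct exec treats the whole string as one program name and fails;
# a shell wrapper parses ';' and builtins as shell code.
direct_rc=0
env 'echo hi; echo there' 2>/dev/null || direct_rc=$?  # no program has that name
wrapped=$(sh -c 'echo hi; echo there')                 # a shell interprets it
echo "direct exit code: $direct_rc"
echo "wrapped output: $wrapped"
```

That failed direct exec is why the window "keeps getting closed": the launched command errors out immediately, so the terminal has nothing left to display.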
