script to read a file with IP addresses and login - linux

I have a file named "hosts.txt" listing the IP addresses of two systems.
~] cat hosts.txt
10.1.1.10
10.1.1.20
Using the script below I am trying to log in to each system, check the status of a service, and print the output for each system. The script prompts me to log in, but it does not continue on to execute the /opt/agent.sh status command. Can someone please help fix this script?
#!/bin/bash
for HOST in `cat hosts.txt`
do
ssh root@$HOST
STATUS=`/opt/agent.sh status | awk 'NR==1{print $3 $4}'`
echo $STATUS
if [ $STATUS! == "isrunning" ]; then
echo "$host == FAIL"
else
echo "$host == PASS"
fi
done

Your script does not continue until the ssh command completes, which does not happen until the interactive shell on $HOST that you started with ssh exits. Instead, you want to execute a script on $HOST.
(Also, note the correct way to iterate over the contents of hosts.txt.)
#!/bin/bash
while read HOST; do
    if ssh root@$HOST '
        STATUS=$(/opt/agent.sh status | awk '\''NR==1{print $3 $4}'\'')
        [ "$STATUS" = "isrunning" ]
    '; then
        echo "$HOST == PASS"
    else
        echo "$HOST == FAIL"
    fi
done < hosts.txt
The remote script simply exits with the result of comparing $STATUS to "isrunning". An if statement on the local host outputs a string based on that result (which is the result of the ssh command itself). This saves the trouble of having to pass the value of $HOST to the remote host, simplifying the quoting required for the remote script.
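As a quick illustration of that behaviour (assuming passwordless ssh to a host reachable as somehost, an illustrative name), ssh's own exit status is whatever the remote command exited with:
$ ssh somehost 'exit 3'; echo $?
3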

Related

how to check if a bash function exists on a remote computer

I read various answers on similar topics, but I still can't solve my problem. Namely, on the remote computer I have a .bashrc file with a bunch of custom-made functions. I would like to check whether a given function exists in that file. I should add that the script constantly reports that the specified function exists on the remote computer even when it does not. This is what I have done so far:
echo "Enter IP addres of the remote PC [def host#XX.XX.XX.XX]"
read ip
ip=${ip:-host@XX.XX.XX.XX}
$(ssh $ip "[ '$(type -t $1)' = function ]")
if [ $? -eq 0 ]; then
echo "function exist"
else
echo 'function doesnt exist'
fi
$(...) is expanded locally inside double quotes. Research the difference between single and double quotes.
the_function_you_want_to_check=something
ssh "$ip" '[ "$(type -t "'$the_function_you_want_to_check'")" = function ]'
Do not use $?. Just:
if ssh stuff...; then
echo yes
else
echo no
fi
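To see the local-versus-remote expansion difference in isolation (again with an illustrative host called somehost and passwordless ssh; the hostnames are made up), compare:
$ hostname
laptop
$ ssh somehost "echo $(hostname)"   # $(hostname) expands locally, before ssh runs
laptop
$ ssh somehost 'echo $(hostname)'   # sent literally, expanded on the remote side
somehost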
Thank you for your prompt response. Please note that $1 is actually the first parameter of the bash function that I run on my local computer. Now, the change you suggested reports that there is no function on the remote computer even though it exists. The more complete function that I run on the local machine is:
appendFunction_to_remotePC(){
echo "Enter the IP addres of the PC [def host#XX.XX.XX.XX]"
read ip
ip=${ip:-host@XX.XX.XX.XX}
if ssh "$ip" '[ "$(type -t "'$1'")" = function ]'; then
echo yes
else
echo no
fi
}
I call the function on the local computer in the usual way:
$ appendFunction_to_remotePC "test"
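One pattern that tends to behave more predictably here is to source the file explicitly in the remote command and then ask bash whether $1 names a function with declare -F. This is only a sketch: it assumes the functions really are defined in ~/.bashrc on the remote machine, and many .bashrc files bail out early in non-interactive shells, which would defeat it:
appendFunction_to_remotePC(){
    echo "Enter the IP address of the PC [def host@XX.XX.XX.XX]"
    read ip
    ip=${ip:-host@XX.XX.XX.XX}
    # source ~/.bashrc explicitly (a non-interactive ssh shell may not read it),
    # then ask the remote bash whether $1 names a function
    if ssh "$ip" "source ~/.bashrc >/dev/null 2>&1; declare -F $1 >/dev/null"; then
        echo yes
    else
        echo no
    fi
}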

SSH Remote command exit code

I know there are lots of discussions about it, but I need your help with ssh remote command exit codes. I have this code:
(scan is a script which scans for viruses in the given file)
for i in $FILES
do
RET_CODE=$(ssh $SSH_OPT $HOST "scan $i; echo $?")
if [ $? -eq 0 ]; then
SOME_CODE
The scan works and it returns either 0, 1 for errors, or 2 if a virus is found. But somehow my return code is always 0, even if I scan a virus.
Here is set -x output:
++ ssh -i /home/USER/.ssh/id host 'scan Downloads/eicar.com; echo 0'
+ RET_CODE='File Downloads/eicar.com: VIRUS: Virus found.
code of the Eicar-Test-Signature virus
0'
Here is the output if I run those commands on the "remote" machine without ssh:
[user@ws ~]$ scan eicar.com; echo $?
File eicar.com: VIRUS: Virus found.
code of the Eicar-Test-Signature virus
2
I just want to have the return code; I don't need all the other output of scan.
UPDATE: It seems like echo is the problem.
The reason your ssh is always returning 0 is that the final echo command is always succeeding! If you want the return code from scan, either remove the echo or assign it to a variable and use exit. On my system:
$ ssh host 'false'
$ echo $?
1
$ ssh host 'false; echo $?'
1
$ echo $?
0
$ ssh host 'false; ret=$?; echo $ret; exit $ret'
1
$ echo $?
1
ssh returns the exit status of the entire pipeline that it runs - in this case, that's the exit status of echo $?.
What you want to do is simply use the ssh result directly (since you say that you don't want any of the output):
for i in $FILES
do
if ssh $SSH_OPT $HOST "scan $i >/dev/lull 2>&1"
then
SOME_CODE
If you really feel you must print the return code, you can do that without affecting the overall result by using an EXIT trap:
for i in $FILES
do
if ssh $SSH_OPT $HOST "trap 'echo \$?' EXIT; scan $i >/dev/lull 2>&1"
then
SOME_CODE
Demo:
$ ssh $host "trap 'echo \$?' EXIT; true"; echo $?
0
0
$ ssh $host "trap 'echo \$?' EXIT; false"; echo $?
1
1
BTW, I recommend you avoid uppercase variable names in your scripts - those are normally used for environment variables that change the behaviour of programs.
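If SOME_CODE needs to distinguish between scan's three possible results (0, 1 for errors, 2 for a virus), another option is to save ssh's exit status into a variable right after the call and branch on it. A sketch, reusing the $SSH_OPT, $HOST and $FILES from the question; note that ssh itself reports connection failures with status 255:
for i in $FILES
do
    ssh $SSH_OPT $HOST "scan $i >/dev/null 2>&1"
    ret=$?   # scan's exit status, propagated by ssh
    case $ret in
        0) echo "$i: clean" ;;
        1) echo "$i: scan error" >&2 ;;
        2) echo "$i: VIRUS FOUND" >&2 ;;
        *) echo "$i: ssh failed (status $ret)" >&2 ;;
    esac
done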

Unable to access array values outside of function in shell script [duplicate]

Please explain to me why the very last echo statement is blank? I expect that XCODE is incremented in the while loop to a value of 1:
#!/bin/bash
OUTPUT="name1 ip ip status" # normally output of another command with multi line output
if [ -z "$OUTPUT" ]
then
echo "Status WARN: No messages from SMcli"
exit $STATE_WARNING
else
echo "$OUTPUT"|while read NAME IP1 IP2 STATUS
do
if [ "$STATUS" != "Optimal" ]
then
echo "CRIT: $NAME - $STATUS"
echo $((++XCODE))
else
echo "OK: $NAME - $STATUS"
fi
done
fi
echo $XCODE
I've tried using the following statement instead of the ++XCODE method
XCODE=`expr $XCODE + 1`
and it too won't print outside of the while statement. I think I'm missing something about variable scope here, but the ol' man page isn't showing it to me.
Because you're piping into the while loop, a sub-shell is created to run the while loop.
Now this child process has its own copy of the environment and can't pass any
variables back to its parent (as in any unix process).
Therefore you'll need to restructure so that you're not piping into the loop.
Alternatively you could run the loop in a function, for example, and echo the value you
want returned from the sub-process, as sketched below.
http://tldp.org/LDP/abs/html/subshells.html#SUBSHELL
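A sketch of that function-based approach, reusing the OUTPUT variable from the question: the pipe still creates a sub-process, but that sub-process echoes its final count on stdout and the caller captures it with command substitution (the status messages go to stderr so they are not captured):
count_non_optimal() {
    echo "$OUTPUT" | {
        xcode=0
        while read NAME IP1 IP2 STATUS
        do
            if [ "$STATUS" != "Optimal" ]
            then
                echo "CRIT: $NAME - $STATUS" >&2
                xcode=$((xcode + 1))
            else
                echo "OK: $NAME - $STATUS" >&2
            fi
        done
        echo "$xcode"   # last thing the sub-process prints on stdout
    }
}

XCODE=$(count_non_optimal)
echo $XCODE   # 1 for the sample OUTPUT above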
The problem is that processes put together with a pipe are executed in subshells (and therefore have their own environment). Whatever happens within the while does not affect anything outside of the pipe.
Your specific example can be solved by rewriting the pipe to
while ... do ... done <<< "$OUTPUT"
or perhaps
while ... do ... done < <(echo "$OUTPUT")
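Applied to the script in the question, the first of those forms might look like this (a sketch; $STATE_WARNING is the same variable used in the question). The here-string feeds the loop without a pipe, so the while runs in the current shell and XCODE survives it:
#!/bin/bash
OUTPUT="name1 ip ip status"   # normally output of another command
XCODE=0
if [ -z "$OUTPUT" ]
then
    echo "Status WARN: No messages from SMcli"
    exit $STATE_WARNING
else
    while read NAME IP1 IP2 STATUS
    do
        if [ "$STATUS" != "Optimal" ]
        then
            echo "CRIT: $NAME - $STATUS"
            XCODE=$((XCODE + 1))
        else
            echo "OK: $NAME - $STATUS"
        fi
    done <<< "$OUTPUT"
fi
echo $XCODE   # now prints 1, because the loop ran in the current shell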
This should work as well (because echo and while are in the same subshell):
#!/bin/bash
cat /tmp/randomFile | (while read line
do
LINE="$LINE $line"
done && echo $LINE )
One more option:
#!/bin/bash
cat /some/file | while read line
do
var="abc"
echo $var | xsel -i -p # redirect stdin to the X primary selection
done
var=$(xsel -o -p) # redirect back to stdout
echo $var
EDIT:
Here, xsel is a requirement (install it).
Alternatively, you can use xclip:
xclip -i -selection clipboard
instead of
xsel -i -p
I got around this when I was making my own little du:
ls -l | sed '/total/d ; s/ */\t/g' | cut -f 5 |
( SUM=0; while read SIZE; do SUM=$(($SUM+$SIZE)); done; echo "$(($SUM/1024/1024/1024))GB" )
The point is that I make a subshell with ( ) containing my SUM variable and the while, but I pipe into the whole ( ) instead of into the while itself, which avoids the gotcha.
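Applied to the question's counter, that same grouping trick would look something like this (a sketch; the final echo has to live inside the same parentheses as the loop, and the value is gone once the subshell exits):
echo "$OUTPUT" | ( XCODE=0
    while read NAME IP1 IP2 STATUS
    do
        [ "$STATUS" != "Optimal" ] && XCODE=$((XCODE + 1))
    done
    echo "$XCODE" )   # prints 1, but only from inside the subshell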
#!/bin/bash
OUTPUT="name1 ip ip status"
+export XCODE=0;
if [ -z "$OUTPUT" ]
----
echo "CRIT: $NAME - $STATUS"
- echo $((++XCODE))
+ export XCODE=$(( $XCODE + 1 ))
else
echo $XCODE
see if those changes help
Another option is to output the results into a file from the subshell and then read it back in the parent shell, something like:
#!/bin/bash
EXPORTFILE=/tmp/exportfile${RANDOM}
cat /tmp/randomFile | while read line
do
LINE="$LINE $line"
echo $LINE > $EXPORTFILE
done
LINE=$(cat $EXPORTFILE)

Bash script runs one command before the previous. I want them one after the other

So part of my script is as follows:
ssh user#$remoteServer "
cd ~/a/b/c/;
echo -e 'blah blah'
sleep 1 # Added this just to make sure it waits.
foo=`grep something xyz.log |sed 's/something//g' |sed 's/something-else//g'`
echo $foo > ~/xyz.list
exit "
In my output I see:
grep: xyz.log: No such file or directory
blah blah
Whereas when I ssh to the server, xyz.log does exist within ~/a/b/c/
Why is the grep statement getting executed before the echo statement?
Can someone please help?
The problem here is that your command in backticks is being run locally, not on the remote end of the SSH connection. Thus, it runs before you've even connected to the remote system at all! (This is true for all expansions inside double quotes, so the $foo in echo $foo is expanded locally as well.)
Use a quoted heredoc to protect your code against local evaluation:
ssh user@$remoteServer bash -s <<'EOF'
cd ~/a/b/c/;
echo -e 'blah blah'
sleep 1 # Added this just to make sure it waits.
foo=`grep something xyz.log |sed 's/something//g' |sed 's/something-else//g'`
echo $foo > ~/xyz.list
exit
EOF
If you want to pass through a variable from the local side, the easy way is with positional parameters:
printf -v varsStr '%q ' "$varOne" "$varTwo"
ssh "user#$remoteServer" "bash -s $varsStr" <<'EOF'
varOne=$1; varTwo=$2 # set as remote variables
echo "Remote value of varOne is $varOne"
echo "Remote value of varTwo is $varTwo"
EOF
[command server] ------> [remote server]
A better way is to create a shell script on the remote server and run it from the command server, such as:
ssh ${remoteserver} "/bin/bash /foo/foo.sh"
It will solve many problems; the aim is to make things simple, not complex.
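For instance, /foo/foo.sh on the remote server might simply contain the commands from the question (a sketch; since everything runs remotely, none of the local-expansion problems apply):
#!/bin/bash
# /foo/foo.sh, stored on the remote server
cd ~/a/b/c/ || exit 1
echo -e 'blah blah'
foo=$(grep something xyz.log | sed 's/something//g' | sed 's/something-else//g')
echo "$foo" > ~/xyz.list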

grep statement in bash

I am using a for loop to connect to a list of servers and perform some simple commands. If the server is not accessible, then stderr is written to a file. I then grep that file for the server name. It seems relatively simple, but for some reason it isn't working. For troubleshooting purposes I have narrowed my server list to two servers and only run simple commands.
for i in $(cat serverlist)
do
nexec -i $i hostname 2>>errorlog.txt
if grep -q $i errorlog.txt; then echo "error accessing" $i
else echo "was able to connect to" $i
fi
done
So in the serverlist I have defined two incorrect hosts for troubleshooting purposes. Nexec tries to connect to each and perform the hostname command. If it is unable to connect, an error message is printed to errorlog.txt,
e.g.,
nexec: Error accessing host test1
Since both servers are incorrectly specified I am not able to connect to either. Again for troubleshooting purposes.
When grep runs the first time against $i, which is the first server in the list, it doesn't find any matches in errorlog.txt. However, it should. If I cat the file instead of grepping it, the entry is there.
I am actually doing this in BladeLogic, so the rules are a bit different. It should still work.
while read -r i <&3; do
nexec -i "$i" hostname 2>>"errorlog.$i.txt" || {
echo "nexec for $i exited with status $?" >&2
continue
}
# check for case where it claimed success but actually failed
# if nexec is written correctly, you don't need any of this logic
# ...and can completely remove the rest of the loop.
if grep -q -e "$i" "errorlog.$i.txt"; then
echo "error accessing $i" >&2
else
echo "was able to connect to $i" >&2
fi
done 3<serverlist
# and combine all the individual logs into one file:
cat errorlog.*.txt >errorlog.txt && rm -f -- errorlog.*.txt
Not familiar with nexec, but I imagine something like this is what you are looking for:
for i in $(cat serverlist)
do
if [ ! "$(nexec -i $i hostname)" ]
then echo "error accessing" $i
else echo "was able to connect to" $i
fi
done
