checking information of different computers in a network using a bash script - linux

I am trying to write a bash script that reads a file (nodeNames) containing the IP addresses of the computers in a cluster network, sshes into each of them, and collects some basic information, namely: hostname, host IP address, load average, and the process using the most memory. It should append all of this information to a file, separating the fields with commas. All of the computers have the same user and password. This is my code so far, but it isn't working; please help:
egrep -ve '^#|^$'nodeNames | while read a
do
ssh $a "$#" &
output1=`hostname`
#This will display the server's IP address
output2=`hostname -i`
#This will output the server's load average
output3=`uptime | grep -oP '(?<=average:).*'| tr -d ','`
#This outputs memory Information
output4=`ps aux --sort=-%mem | awk 'NR<=1{print $0}'`
#This concatenates all output into a single line of text written to a file
echo "$output1, $output2, $output3, $output4" | tee clusterNodeInfo
done

You need to understand what is executed on which computer. The shell script you start is executed on your host A, but you want information from host B. ssh $a "$#" & will not suddenly make all the following commands execute on the remote host B. Therefore, the
output1=`hostname`
will be executed on host A and output1 will have the hostname of host A.
You may also want to put the tee outside the loop, or use tee -a, to prevent overwriting your output file.
For bash, use $(...) instead of backticks.
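One reason to prefer $( ) over backticks is that it nests cleanly, whereas the inner backticks would have to be escaped. A quick local check:

```shell
# $( ) nests without escaping; with backticks the inner command
# would have to be written as \`...\`.
outer=$(echo "$(echo nested) works")
echo "$outer"   # -> nested works
```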
So, that would make your script:
egrep -v '^#|^$' nodeNames | while read -r a
do
output1=$(ssh "$a" hostname)
#This will display the server's IP address
output2=$(ssh "$a" hostname -i)
#This will output the server's load average
output3=$(ssh "$a" "uptime | grep -oP '(?<=average:).*' | tr -d ','")
#This outputs the top memory consumer (NR==2 is the first process line after
#the ps header; note that a literal $0 inside the double quotes would be
#expanded by the local shell, so it is avoided here)
output4=$(ssh "$a" "ps aux --sort=-%mem | awk 'NR==2'")
#This concatenates all output into a single line written to the output file
echo "$output1, $output2, $output3, $output4" | tee -a clusterNodeInfo
done
(have not tested it, but it should be something like this)
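As a side note, the load-average extraction is the most fragile piece of the pipeline above; it can be sanity-checked locally against a canned uptime line, without any ssh (the sample line below is made up):

```shell
# Extract the three load averages from a sample uptime line
# (requires GNU grep for the -P lookbehind):
line=' 10:30:01 up 5 days,  3:02,  2 users,  load average: 0.15, 0.21, 0.30'
echo "$line" | grep -oP '(?<=average:).*' | tr -d ','
# -> " 0.15 0.21 0.30" (note the leading space kept by the lookbehind)
```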

Related

Assign the output of the jps -vl command to a variable in a shell script

I need to assign the output of a command to a variable. The command I tried is:
#!/bin/bash
JAVA_PROCESSES=`jps -vl | grep -v 'sun.tools.jps.Jps' | grep -v 'hudson.remoting.jnlp.Main' | grep -v grep`
NUMBER_OF_JAVA_PROCESSES=`echo $JAVA_PROCESSES | wc -l`
echo $NUMBER_OF_JAVA_PROCESSES
echo $JAVA_PROCESSES
When I try the above, all of the grepped java processes are assigned to the JAVA_PROCESSES variable on one line; the processes are not separated by newlines. Therefore $NUMBER_OF_JAVA_PROCESSES always gives 1 for me.
$NUMBER_OF_JAVA_PROCESSES also shows 1 even when no processes are assigned to JAVA_PROCESSES, because of the empty line in $JAVA_PROCESSES.
Please suggest a way to assign the grepped processes separated by newlines.
If the main thing you want to know is whether or not you got any at all, you could just test whether the variable is empty:
java_procs=$(jps -vl | grep -v 'sun.tools.jps.Jps' | grep -v 'hudson.remoting.jnlp.Main' | grep -v grep)
if [ -z "$java_procs" ]; then
echo "No processes"
fi
Also, we can simplify the filtering by using extended regex, so a single grep suffices:
java_procs=$(jps -vl | grep -Ev 'sun.tools.jps.Jps|hudson.remoting.jnlp.Main|grep')
Assuming none of the lines output by jps can contain linebreaks themselves, we could get the count after that if we need it:
num_procs=$(printf '%s\n' "$java_procs" | wc -l)
The main problem you were running into is that you weren't quoting your variable, so echo $JAVA_PROCESSES was expanded and then subject to word splitting, and your newlines were "eaten" by the shell. You'd always have only one line: a space-separated list of all the words in your JAVA_PROCESSES variable. To protect against word splitting, quote the variable, as I did in my code above.
echo will also always add a line break at the end, which is good sometimes, and not so good sometimes, but you should be aware of it happening (that's why you would always get a count of 1 even when there were no processes).
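The newline-eating behaviour is easy to reproduce locally without jps (the three fake process lines below are made up):

```shell
# Three fake "process" lines joined by newlines:
procs=$(printf 'proc-one\nproc-two\nproc-three')
echo $procs | wc -l     # unquoted: word splitting eats the newlines -> 1
echo "$procs" | wc -l   # quoted: the newlines survive -> 3
```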

How to list the machine name along with running programs list in Linux?

I have a shell script to check whether Firefox is running on my Linux machine, like:
ps -ef|grep firefox
This will list all the instances of Firefox running on my machine, showing their PIDs, so that I can kill them manually. My question is: is it possible to display the machine name in this list as well? If there are multiple instances, each line should also contain the machine name (or IP). In my shell script, I did something like:
hostname
ps -ef|grep firefox
which prints the hostname once, with the instances listed below it one by one. How can I print the machine name (or IP) along with each line?
Like this:
ps -ef | egrep '[/ ]firefox' | sed "s/^/$(hostname -s) : /"
This will do it:
ps -ef | grep [f]irefox | xargs -I{} echo "$(hostname) {}"
Notice the brackets around the 'f' in firefox. This prevents the grep command itself from showing up in the results.
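The prefixing idea in the sed answer can be tried without a running Firefox by feeding sed fixed lines in place of the ps output (the fallback hostname string is only there in case hostname is unavailable):

```shell
# Prefix each input line with the short hostname, as in the sed answer:
host=$(hostname -s 2>/dev/null || echo localhost)
printf '1234 firefox\n5678 firefox\n' | sed "s/^/$host : /"
```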

how to set an expect variable with output of shell command

I want to set a variable b in an expect file. Initially I ssh to a machine through this script; on that machine I want to fetch a value and set the expect variable using the following command:
set b [exec `cat /home/a |grep "work"|awk -F '=' '{print $2}'`]
send_user "$b"
The file /home/a has the following structure:
home=10.10.10.1
work=10.20.10.1
I am trying to use the variable b after printing it, but after the ssh the script gives:
can't read "2": no such variable
while executing
If I put this output in a file named temp on that machine and try:
set b [exec cat ./temp]
then also it gives:
cat: ./temp: No such file or directory
If I do send "cat ./temp" it prints the correct output.
Please let me know where I am going wrong.
Single quotes are not a quoting mechanism in Tcl, so brace your awk expressions:
% set b [exec cat /home/a | grep "work" | awk -F {=} {{print $2}}]
10.20.10.1
Reference: Frequently Made Mistakes in Tcl
Assuming you spawned an ssh session, or something similar, send "cat ./temp\r" shows you the file on the remote host, and exec cat ./temp shows you the file on the local host. exec is a plain Tcl command.
Capturing the output of a send command is a bit of a pain, because you have to parse the actual command and the next prompt out of the output. You need to do something like:
# don't need 3 commands when 1 can do all the work
send {awk -F= '/work/ {print $2}' /home/a}
send "\r"
expect -re "awk\[^\n]+\n(.+)\r\n$PROMPT" # where "$PROMPT" is a regex that
# matches your shell prompt.
set command_output $expect_out(1,string)
send_user "$command_output\n"
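The combined awk one-liner sent above can be checked locally in a shell, using a copy of the /home/a file from the question (written to a temporary path here):

```shell
# Recreate the two-line file from the question and extract the "work" value:
tmpfile=$(mktemp)
printf 'home=10.10.10.1\nwork=10.20.10.1\n' > "$tmpfile"
awk -F= '/work/ {print $2}' "$tmpfile"   # -> 10.20.10.1
rm -f "$tmpfile"
```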

Mail output with Bash Script

I ssh from host A to a few hosts (only one listed below right now) using the SSH key I generated, then go to a specific file, grep for a specific word with yesterday's date, and email the output to myself.
It is sending an email, but the email contains the command rather than the command's output.
#!/bin/bash
HOST="XXXXXXXXXXXXXXXXXX, XXXXXXXXXXXXX"
DATE=$(date -d "yesterday")
INVALID=' cat /xxx/xxx/xxxxx | grep 'WORD' | sed 's/$/.\n/g' | grep "$DATE"'
COUNT=$(echo "$INVALID" | wc -c)
for x in $HOSTS
do
ssh BLA@"$x" $COUNT
if [ "$COUNT" -gt 1 ];
then
EMAILTEXT=""
if [ "$COUNT" -gt 1 ];
then
EMAILTEXT="$INVALID"
fi
fi
done | echo -e "$EMAILTEXT" | mail XXXXXXXXXXX.com
This isn't properly an attempt to answer your question, but I think you should be aware of some fundamental problems with your code.
INVALID=' cat /xxx/xxx/xxxxx | grep 'WORD' | sed 's/$/.\n/g' | grep "$DATE"'
This assigns a simple string to the variable INVALID. Because of quoting issues, s/$/.\n/g is not quoted at all, and will probably be mangled by the shell. (You cannot nest single quotes -- the first single-quoted string extends from the first quote to the next one, and then WORD is outside of any quotes, followed by the next single-quoted string, etc.)
If your intent is to execute this as a command at this point, you are looking for a command substitution; with the multiple layers of uselessness peeled off, perhaps something like
INVALID=$(sed -n -e '/WORD/!d' -e "/$DATE/s/$/./p" /xxx/xxx/xxxx)
which looks for a line matching WORD and $DATE and prints the match with a dot appended at the end -- I believe that's what your code boils down to, but without further insights into what this code is supposed to do, it's impossible to tell if this is what you actually need.
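The difference between storing a string and running it can be seen with a harmless local stand-in for the pipeline (the echo command here is just a hypothetical placeholder):

```shell
cmd='echo hi'
stored="$cmd"     # plain assignment: just the text "echo hi"
ran=$($cmd)       # command substitution actually runs it
echo "$stored"    # -> echo hi
echo "$ran"       # -> hi
```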
COUNT=$(echo "$INVALID" | wc -c)
This assigns a number to $COUNT. With your static definition of INVALID, the number will always be 62; but I guess that's not actually what you want here.
for x in $HOSTS
do
ssh BLA@"$x" $COUNT
This attempts to execute that number as a command on a number of remote hosts (note, too, that the loop is over HOSTS while the variable containing the hosts is named just HOST). This cannot possibly be useful, unless you have a battery of commands named as natural numbers which do something useful on these remote hosts; but it's safe to assume that that is not what is supposed to be going on here (and if it were, it would absolutely be necessary to explain it in your question).
if [ "$COUNT" -gt 1 ];
then
EMAILTEXT=""
if [ "$COUNT" -gt 1 ];
then
EMAILTEXT="$INVALID"
fi
fi
So EMAILTEXT is either an empty string or the value of INVALID. You assigned it to be a static string above, which is probably the source of your immediate question. But even if it was somehow assigned to a command on the local host, why do you need to visit remote hosts and execute something there? Or is your intent actually to execute the command on each remote host and obtain the output?
done | echo -e "$EMAILTEXT" | mail XXXXXXXXXXX.com
Piping into echo makes no sense at all, because it does not read its standard input. You should probably just have a newline after done; though a possibly more useful arrangement would be to have your loop produce output which we then pipe to mail.
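That echo does not read its standard input is easy to verify locally:

```shell
# echo ignores its standard input entirely; cat reads it:
printf 'from the pipe\n' | echo 'from echo'   # prints only "from echo"
printf 'from the pipe\n' | cat                # prints "from the pipe"
```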
Purely speculatively, perhaps something like the following is what you actually want.
for host in $HOSTS; do
ssh BLA@"$host" sed -n -e '/WORD/!d' -e "/$DATE/s/$/./p" /xxx/xxx/xxxx |
grep . || echo INVALID
done | mail XXXXXXXXXXX.com
If you want to check that there is strictly more than one line of output (which is what the -gt 1 suggests) then this may need to be a little bit more complicated.
Your command substitution is not working. You should read up on how command substitution works, but here are the problem lines:
COUNT=$(echo "$INVALID" | wc -c)
[...]
ssh BLA@"$x" $COUNT
should be:
COUNT_CMD="'${INVALID} | wc -c'"
[...]
COUNT=$(ssh BLA@"$x" $COUNT_CMD)
This inserts the value of $INVALID into the string, and puts the whole thing in single quotes. The single quotes are necessary for the ssh call so the pipes aren't evaluated in the script but on the remote host. (COUNT is changed to COUNT_CMD for readability/clarity.)
EDIT:
I misread the question and have corrected my answer.

Is it possible to pipe the output of a command from a server to a local machine?

I have a series of functionally identical servers provided by my school that run various OS and hardware configurations. For the most part, I can use five of these interchangeably. Unfortunately, other students tend to bunch up on some machines, and it's a pain to find one that isn't bogged down.
What I want to do is ssh into a machine and run the command:
w | wc -l
to get a rough estimate of the load on that server, and use that information to select the least impacted one. A sort of client-side load balancer.
Is there a way to do this or achieve the same result?
I'd put this in your .bashrc file:
function choose_host(){
hosts="host1 ... hostn"
for host in $hosts
do
echo $(ssh $host 'w|wc -l') $host
done | sort -n | head -1 | awk '{print $2}'
}
function ssh_host(){
ssh $(choose_host)
}
choose_host should give you the one you're looking for. This is absolutely overkill, but I was feeling playful :D
sort orders the output according to the result of w|wc -l (use sort -n so the counts compare numerically), then head -1 takes the first line and awk prints just the hostname!
You can call ssh_host and it should log you in automatically.
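One detail worth double-checking in a function like choose_host: plain sort compares lexicographically, so a host with 10 logins would sort before one with 9. sort -n compares numerically:

```shell
# Lexicographic vs numeric ordering of "count host" lines:
printf '9 hostA\n10 hostB\n' | sort | head -1     # -> 10 hostB (wrong pick)
printf '9 hostA\n10 hostB\n' | sort -n | head -1  # -> 9 hostA  (right pick)
```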
You can use the pdsh command from your desktop, which runs the specified command on the set of machines you specify and returns the results. That way you can find the least loaded one without sshing into every single machine to run w | wc -l.
Yes. See e.g.:
ssh me@host "ls /etc | sort" | wc -l
The part inside "" is done remotely. The part afterwards is local.
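The remote/local split can be mimicked locally, with bash -c standing in for the ssh session (no remote host needed):

```shell
# The quoted part runs in the child shell (the "remote" side);
# wc -l runs in the current shell (the "local" side):
bash -c 'printf "b\na\nc\n" | sort' | wc -l   # -> 3
```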
