After bash script completion, change variable and restart script - linux

I have a working script at the start of which I enter start parameters (server IP, user login and root login), and every time I need to run this script against another server, I have to edit the script to change the server IP variable.
How can I change this so that I can enter a bunch of IP addresses in a variable (maybe some array), and when the script finishes with the first IP it moves on to the second, and so on to the end of the IP list?
Script example:
#!/bin/bash
serv_address="xxx.xx.xx.xxx"
"here goes some script body"

You do indeed want an array, which you can then iterate over with a loop.
serv_address=(xxx.xx.xx.xxx yyy.yy.yyy.yy)
for address in "${serv_address[@]}"; do
    if ! ping -c 1 "$address"; then
        echo "$address is not available" >&2
        continue
    fi
    # Do some stuff here if the address did respond.
done

Use a text file for storing the IPs, like:
$ cat ip.txt
xxx.xx.xx.xxx
xxx.xx.xx.xxx
xxx.xx.xx.xxx
xxx.xx.xx.xxx
then modify your script:
#!/bin/bash
while read -r ip
do
    # some command on "$ip"
done < ip.txt  # your text file fed to the while loop here
Or use a bash array:
declare ip_array=( xxx.xx.xx.xxx xxx.xx.xx.xxx xxx.xx.xx.xxx )
for ip in "${ip_array[@]}"
do
    # something with "$ip"
done
Both give you the flexibility to add or delete IP addresses later.
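Another option, if you'd rather not hard-code the list at all, is to pass the IPs as command-line arguments. A minimal sketch (the `process_all` helper name is made up for illustration; the `echo` stands in for the real script body):

```shell
#!/bin/bash
# Hypothetical helper: loop over whatever IPs are passed on the command line.
process_all() {
    local ip
    for ip in "$@"; do
        echo "processing $ip"   # replace with the real script body
    done
}

# Usage: ./script.sh xxx.xx.xx.xxx yyy.yy.yyy.yy
process_all "$@"
```

This way adding a server is just a change to the command line, not an edit to the script.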

Related

redirect the ouput of command into a socket on linux

I'm using netcat to connect to a server.
The problem is that I want to send some non-printable characters to the server.
I wanted to achieve this with a command redirection in Linux.
Let's say this is the command: nc hostname port
When I checked the file descriptors of the nc command in the folder /proc/$(pidof nc)/fd, I saw that there was another fd, number 3, which concerns the socket: 3 -> socket:[1675643]
The problem is that I wanted to redirect the output of, let's say, echo -ne '\xff\x0f\xab\xde' > ./3 to the socket.
I couldn't do so, and the output is: bash: ./3: No such device or address
One cannot output something to a socket which is opened only by another process.
In order to first use interactive input/output and afterwards send the echo string, you can do:
(cat; echo -ne '\xff\x0f\xab\xde')|nc hostname port
(press the EOF character Ctrl-D to end your input and start the echo).
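As a quick local sanity check that the escape sequence really produces four raw bytes (and not the literal backslash text), you can count them; this assumes bash, whose builtin printf supports \xHH escapes and is generally preferable to echo -ne:

```shell
#!/bin/bash
# printf emits the raw bytes; piping them into nc would send them over
# the socket, e.g.: { cat; printf '\xff\x0f\xab\xde'; } | nc hostname port
nbytes=$(printf '\xff\x0f\xab\xde' | wc -c | tr -d ' ')
echo "$nbytes"
```

If this prints 4, the bytes are being generated correctly and any remaining problem is on the nc side.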

Transfer variables in ssh and get answer back

I want to write a bash script that connects to a server, pings another server from it, gets the IP from the ping command, sends that info back to the PC that ran the script, and at the end connects to the server that was pinged.
My script:
#!/bin/bash
echo "Script to connect to server"
# ip_p is used to connect to the first server; on that server I want to use the ping command
ip_p=XYZ.XYZ.XYZ.XYZ
user='username for the servers'
# ip_d - in this variable I want to save the IP of the pinged server
ssh -t $user@$ip_p ip_d="ip_d=ping -c1 $1.domain | sed -nE 's/^PING[^(]+\(([^)]+)\).*/\1/p' && exit "
echo "start"
echo $ip_d
echo "stop"
ssh -t $user@$ip_d
How I want it to work (the domain I want to check is test.nyslay.pl):
1. connect to the server whose IP and username were defined in the script
2. ping the server (the ".nyslay.pl" part is always the same, but "test" should be read from the first argument of the script run)
3. get the IP of the domain from the previous point
4. transfer the IP from point 2 to the local machine on which the script is run
5. connect to the server whose IP we got from point 2
Use command substitution:
ip_d=$(ssh -t $user@$ip_p "ping -c1 $1.domain | sed -nE 's/^PING[^(]+\(([^)]+)\).*/\1/p'")
The output of ping (through sed) comes to the local machine via ssh, whose output is captured in the local variable ip_d.
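The sed expression does the heavy lifting here: it captures whatever is between the first pair of parentheses on ping's "PING ..." line. A quick local check against a made-up ping line (no ssh needed; the IP is just an example value):

```shell
#!/bin/bash
# Sample first line of ping output (the IP here is a made-up example).
line='PING test.nyslay.pl (93.184.216.34) 56(84) bytes of data.'
ip_d=$(printf '%s\n' "$line" | sed -nE 's/^PING[^(]+\(([^)]+)\).*/\1/p')
echo "$ip_d"   # → 93.184.216.34
```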

Passing Arguments to Running Bash Script

I have a bash script that takes a list of IP Addresses, and pings them every 15 seconds to test connectivity. Some of these IP Addresses are servers and computers as to which I have the ability to control. I would like to be able to do something of the following:
Run The Bash File
It pings non-controlled IP Addresses
It will list the controlled Computers
When a computer turns off, it sends my script a response saying it turned off
The script outputs accordingly
I have the code all set up that pings these computers every 15 seconds and displays the result. What I want to achieve is to NOT ping my controlled computers; they will send a command to the bash script instead. I know this can be done by writing a file and reading that file, but I would like a way that changes the display AS IT HAPPENS. Would mkfifo be a viable option?
Yes, mkfifo is ok for this task. For instance, this:
mkfifo ./commandlist
while read f < ./commandlist; do
    # Actions here
    echo "$f"
done
will wait until a new line can be read from the FIFO commandlist, read it into $f, and execute the body.
From the outside, write to the FIFO with:
echo 42 > ./commandlist
But why not let the remote server call this script, perhaps via SSH or even CGI? You can set up a /notify-disconnect CGI script with no parameters and get the IP address of the peer from the REMOTE_ADDR environment variable.
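Here is a self-contained demonstration of that FIFO handshake; the writer is backgrounded because opening a FIFO for writing blocks until a reader opens the other end (the paths are throwaway temp files, not part of the original setup):

```shell
#!/bin/bash
dir=$(mktemp -d)
mkfifo "$dir/commandlist"

# Writer side (normally a completely different process):
echo 42 > "$dir/commandlist" &

# Reader side: blocks until a line arrives, then proceeds immediately.
read f < "$dir/commandlist"
echo "got: $f"   # prints "got: 42"

rm -r "$dir"
```

Because the reader unblocks the moment the line is written, the display updates as it happens, with no polling of a regular file.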

Executing two commands at a remote server and storing the output on the local server using shell scripting

I'm trying to clean up my shell script. I am trying to read the list of servers from a text file "serverlist" and ssh to each server to get the outcome of a command and store it in a file "result_segfault". I managed to do that; however, for each server I wanted to list the name of the server with the output below it:
Example :
-----------------
Servername
-----------------
Output of the command
----------------
Servername2
------------------
Output of the command
This is my code
#!/bin/bash
for HOSTNAME in $(cat serverlist);
do
    SCRIPT="cat /var/log/messages | grep 'segfault at 0000000000000098'"
    for HOSTNAME in ${HOSTNAME} ;
    do
        ssh ${HOSTNAME} "${SCRIPT}" >"result_segfault"
    done;
done
I didn't know how to add the server name and the separator.
{
    echo "----------------"
    echo "$HOSTNAME"
    echo "----------------"
    ssh "$HOSTNAME" "${SCRIPT}"
} >>"result_segfault"
The braces {...} group statements together so that their stdout can be collected with a single redirect. I used ">>" in place of ">" for the redirect because it appeared from your sample that you wanted to append together the results from each host rather than overwrite it each time a new host is queried.
With various stylistic fixes:
#!/bin/bash
while read -r HOSTNAME
do
    # -n stops ssh from consuming the rest of serverlist on stdin;
    # the outer quotes keep the grep pattern intact on the remote side
    ssh -n "${HOSTNAME}" "grep 'segfault at 0000000000000098' /var/log/messages"
done <serverlist >result_segfault
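If you want the while-read loop and the separator headers combined, a small helper keeps the loop body readable. In this sketch, `report` is a made-up helper name and `echo` stands in for the real `ssh` call so the example is runnable anywhere:

```shell
#!/bin/bash
# Hypothetical helper: print a separator header for a host, then run the
# given command, so the whole thing can be redirected to one file.
report() {
    host=$1; shift
    printf -- '----------------\n%s\n----------------\n' "$host"
    "$@"
}

# In the real script the call would look like:
#   report "$HOSTNAME" ssh -n "$HOSTNAME" "grep 'segfault at 0000000000000098' /var/log/messages"
report server1 echo "pretend grep output for server1"
```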

Parallel SSH with Custom Parameters to Each Host

There are plenty of threads and documentation about parallel ssh, but I can't find anything on passing custom parameters to each host. Using pssh as an example, the hosts file is defined as:
111.111.111.111
222.222.222.222
However, I want to pass custom parameters to each host via a shell script, like this:
111.111.111.111 param1a param1b ...
222.222.222.222 param2a param2b ...
Or, better, the hosts and parameters would be split between 2 files.
Because this isn't common, is this misuse of parallel ssh? Should I just create many ssh processes from my script? How should I approach this?
You could use GNU parallel.
Suppose you have a file argfile:
111.111.111.111 param1a param1b ...
222.222.222.222 param2a param2b ...
Then running:
parallel --colsep ' ' ssh {1} prog {2} {3} ... :::: argfile
would run prog on each host with the corresponding parameters. It is important that the number of parameters be the same for each host.
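For the "hosts and parameters split between two files" variant, GNU parallel can link the two input sources (`parallel --link ssh {1} prog {2} :::: hosts.txt params.txt`), or you can pair the lines yourself with `paste`. A pure-shell sketch, with `echo` standing in for the actual ssh invocation and throwaway temp files for the two inputs:

```shell
#!/bin/bash
hosts=$(mktemp); params=$(mktemp)
printf '%s\n' 111.111.111.111 222.222.222.222 > "$hosts"
printf '%s\n' 'param1a param1b' 'param2a param2b' > "$params"

# paste joins line N of each file; read splits the host off the front
# and leaves the rest of the line in $args.
cmds=$(paste "$hosts" "$params" | while read -r host args; do
    echo ssh "$host" prog $args   # echo stands in for launching ssh (add & for parallelism)
done)
echo "$cmds"

rm -f "$hosts" "$params"
```

Appending `&` to each real ssh call and ending with `wait` would give the parallel behavior.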
Here is a solution that you can use after tailoring it to suit your needs:
#!/bin/bash
# filename: example.sh
# usage: ./example.sh <par1> <par2> <par3> ... <par6>

# set your IP addresses
traf1=1.1.1.1
traf2=2.2.2.2
traf3=3.3.3.3

# Set some custom parameters for your scripts and use them as you wish.
# In this example, I use the first 6 command-line parameters passed when running example.sh.
echo "Fetching data from traffic server 1..."
ssh -T $traf1 -l username "/export/home/path/to/script.sh $1 $2" 1>traf1.txt 2>/dev/null &
echo "Fetching data from traffic server 2..."
ssh -T $traf2 -l username "/export/home/path/to/script.sh $3 $4" 1>traf2.txt 2>/dev/null &
echo "Fetching data from traffic server 3..."
ssh -T $traf3 -l username "/export/home/path/to/script.sh $5 $6" 1>traf3.txt 2>/dev/null &

# Your script will block on this line, and will only continue when all
# 3 remotely executed scripts have completed.
wait
Keep in mind that the above requires that you set up passwordless login between the machines; otherwise the solution will break by prompting for password input.
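One refinement worth considering: a bare `wait` discards the individual exit statuses, so you cannot tell which remote script failed. Waiting on each PID separately preserves them; in this sketch, subshells with `sleep`/`exit` stand in for the three backgrounded ssh commands:

```shell
#!/bin/bash
# Stand-ins for the backgrounded ssh commands (one succeeds, one fails).
( sleep 0.1; exit 0 ) & pid1=$!
( sleep 0.1; exit 3 ) & pid2=$!

# wait <pid> returns that job's exit status.
wait "$pid1"; status1=$?
wait "$pid2"; status2=$?
echo "job1=$status1 job2=$status2"   # → job1=0 job2=3
```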
If you can use Perl:
use Net::OpenSSH::Parallel;
use Data::Dumper;

my $pssh = Net::OpenSSH::Parallel->new;
$pssh->add_host('111.111.111.111');
$pssh->add_host('222.222.222.222');
$pssh->push('111.111.111.111', $cmd, $param11, $param12);
$pssh->push('222.222.222.222', $cmd, $param21, $param22);
$pssh->run;
if (my %errors = $pssh->get_errors) {
    print STDERR "ssh errors:\n", Dumper \%errors;
}