Collecting system data from Red Hat Linux

I am planning to write a small program. The input should be the IP, username, and password for a Linux machine, and it should give me the system details of that machine as output.
I am planning to write this in shell, using RSH for the login. I am in no way asking for a solution, but could you please point me towards other options I have? I am not really comfortable with shell scripts.
Thanks in advance

I had the same requirement, and what I do is:
First, write a script that will be executed on the target host (T), something like this:
> cat check_server.sh
#!/usr/bin/env bash
# executed on the target host
all_cmd=(
    "uname -a"
    "lscpu"
    "free -m"
)
function _check {
    for one_cmd in "${all_cmd[@]}"; do
        echo -e "\n\n$one_cmd" >> /tmp/server_info.txt
        eval "$one_cmd" >> /tmp/server_info.txt
    done
}
_check  # don't forget to actually call the function
Then execute it on the target and copy back the result, like this:
_cmd=`base64 -w0 check_server.sh`
ssh $user@$ip "echo $_cmd | base64 -d | bash"
scp $user@$ip:/tmp/server_info.txt ./
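If you don't need the base64 armor (it mainly shields the script from remote quoting issues), a plain stdin pipe does the same job; a minimal sketch:
ssh $user@$ip 'bash -s' < check_server.sh   # feed the local script to a remote bash
scp $user@$ip:/tmp/server_info.txt ./       # copy the result back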


Running bash script over SSH [duplicate]

I have to run a local shell script (Windows/Linux) on a remote machine.
I have SSH configured on both machine A and machine B. My script is on machine A, and it will run some of my code on a remote machine, machine B.
The local and remote computers can be either Windows or Unix-based systems.
Is there a way to do this using plink/ssh?
If Machine A is a Windows box, you can use Plink (part of PuTTY) with the -m parameter, and it will execute the local script on the remote server.
plink root@MachineB -m local_script.sh
If Machine A is a Unix-based system, you can use:
ssh root@MachineB 'bash -s' < local_script.sh
You shouldn't have to copy the script to the remote server to run it.
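If the local script takes arguments, bash -s accepts positional parameters after it; a small sketch (the argument values are made up):
ssh root@MachineB 'bash -s -- --verbose /tmp' < local_script.sh
# on the remote side, $1 is --verbose and $2 is /tmp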
This is an old question, and Jason's answer works fine, but I would like to add this:
ssh user@host <<'ENDSSH'
#commands to run on remote host
ENDSSH
This can also be used with su and commands that require user input. (Note the quoted heredoc delimiter.)
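To see what quoting the delimiter changes, a quick sketch:
ssh user@host <<'ENDSSH'
echo "$HOME"    # quoted delimiter: expands on the remote host, prints the remote home
ENDSSH
ssh user@host <<ENDSSH
echo "$HOME"    # unquoted delimiter: the local shell expands this before ssh runs, prints the local home
ENDSSH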
Since this answer keeps getting traffic, I'll add even more info on this wonderful use of the heredoc:
You can nest commands with this syntax, and it's the only way nesting seems to work (in a sane way).
ssh user@host <<'ENDSSH'
#commands to run on remote host
ssh user@host2 <<'END2'
# Another bunch of commands on another host
wall <<'ENDWALL'
Error: Out of cheese
ENDWALL
ftp ftp.example.com <<'ENDFTP'
test
test
ls
ENDFTP
END2
ENDSSH
You can actually have a conversation with some services like telnet, ftp, etc., but remember that the heredoc just sends stdin as text; it doesn't wait for a response between lines.
I just found out that you can indent the insides with tabs if you use <<-END!
ssh user@host <<-'ENDSSH'
#commands to run on remote host
ssh user@host2 <<-'END2'
# Another bunch of commands on another host
wall <<-'ENDWALL'
Error: Out of cheese
ENDWALL
ftp ftp.example.com <<-'ENDFTP'
test
test
ls
ENDFTP
END2
ENDSSH
(I think this should work)
Also see
http://tldp.org/LDP/abs/html/here-docs.html
Also, don't forget to escape variables if you want to pick them up from the destination host.
This has caught me out in the past.
For example:
user@host> ssh user2@host2 "echo \$HOME"
prints out /home/user2
while
user@host> ssh user2@host2 "echo $HOME"
prints out /home/user
Another example:
user@host> ssh user2@host2 "echo hello world | awk '{print \$1}'"
prints out "hello" correctly.
This is an extension to YarekT's answer, combining inline remote commands with passing ENV variables from the local machine to the remote host so you can parameterize your scripts on the remote side:
ssh user@host ARG1=$ARG1 ARG2=$ARG2 'bash -s' <<'ENDSSH'
# commands to run on remote host
echo $ARG1 $ARG2
ENDSSH
I found this exceptionally helpful by keeping it all in one script so it's very readable and maintainable.
Why does this work? ssh supports the following syntax:
ssh user@host remote_command
In bash we can specify environment variables to define prior to running a command on a single line like so:
ENV_VAR_1='value1' ENV_VAR_2='value2' bash -c 'echo $ENV_VAR_1 $ENV_VAR_2'
That makes it easy to define variables prior to running a command. In this case echo is the command we're running; everything before it defines environment variables.
So we combine those two features and YarekT's answer to get:
ssh user@host ARG1=$ARG1 ARG2=$ARG2 'bash -s' <<'ENDSSH'...
In this case we are setting ARG1 and ARG2 to local values and sending everything after user@host as the remote_command. When the remote machine executes the command, ARG1 and ARG2 are set to the local values, thanks to local command-line evaluation, which defines those environment variables on the remote server and then executes bash -s using them. Voila.
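A concrete run, with made-up values, might look like this:
ARG1="hello" ARG2="world"
ssh user@host ARG1="$ARG1" ARG2="$ARG2" 'bash -s' <<'ENDSSH'
echo $ARG1 $ARG2    # prints: hello world
ENDSSH
Note that the remote shell re-parses the assignments, so this simple form is only safe for values without spaces or shell metacharacters.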
<hostA_shell_prompt>$ ssh user@hostB "ls -la"
That will prompt you for a password, unless you have copied your hostA user's public key to the authorized_keys file in the ~/.ssh directory of the user's home on hostB. That allows passwordless authentication (if it is accepted as an auth method in the SSH server's configuration).
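Setting that up is usually just a matter of (assuming OpenSSH's ssh-copy-id is available):
ssh-keygen -t ed25519     # generate a key pair if you don't already have one
ssh-copy-id user@hostB    # append your public key to ~/.ssh/authorized_keys on hostB
ssh user@hostB "ls -la"   # now runs without a password prompt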
I've started using Fabric for more sophisticated operations. Fabric requires Python and a couple of other dependencies, but only on the client machine; the server need only be an SSH server. I find this tool to be much more powerful than shell scripts handed off to SSH, and well worth the trouble of setting up (particularly if you enjoy programming in Python). Fabric handles running scripts on multiple hosts (or hosts of certain roles), helps facilitate idempotent operations (such as adding a line to a config script, but not if it's already there), and allows construction of more complex logic (such as the Python language can provide).
cat ./script.sh | ssh <user>@<host>
chmod +x script.sh
ssh -i key-file root@111.222.3.444 < ./script.sh
Try running ssh user@remote sh ./script.unx.
Assuming you mean you want to do this automatically from a "local" machine, without manually logging into the "remote" machine, you should look into a TCL extension known as Expect, it is designed precisely for this sort of situation. I've also provided a link to a script for logging-in/interacting via SSH.
https://www.nist.gov/services-resources/software/expect
http://bash.cyberciti.biz/security/expect-ssh-login-script/
ssh user@hostname ". ~/.bashrc; cd path-to-file/; . filename.sh"
It is highly recommended to source the environment file (.bashrc/.bash_profile/.profile) before running anything on the remote host, because the target and source hosts' environment variables may differ.
I use this one to run a shell script on a remote machine (tested on /bin/bash):
ssh deploy@host . /home/deploy/path/to/script.sh
If you want to execute a command like this:
temp=`ls -a`
echo $temp
the backticks inside a double-quoted command would be expanded by the local shell and cause errors.
Wrapping the commands in single quotes solves the problem, because nothing is evaluated until it reaches the remote host:
ssh user@host '
temp=`ls -a`
echo $temp
'
If the script is short and meant to be embedded inside your script, and you are running under bash with bash also available on the remote side, you may use declare to transfer the local context to the remote host. Define the variables and functions containing the state that will be transferred to the remote. Define a function that will be executed on the remote side. Then, inside a here document read by bash -s, you can use declare -p to transfer the variable values and declare -f to transfer the function definitions.
Because declare takes care of the quoting and will be parsed by the remote bash, the variables are properly quoted and the functions are properly transferred. You may just write the script locally; usually I do one long function with the work I need to do on the remote side. The context has to be hand-picked, but the following method is "good enough" for any short script and is safe: it should properly handle all corner cases.
somevar="spaces or other special characters"
somevar2="!@#$%^"
another_func() {
    mkdir -p "$1"
}
work() {
    another_func "$somevar"
    touch "$somevar"/"$somevar2"
}
ssh user@server 'bash -s' <<EOT
$(declare -p somevar somevar2) # transfer variable values
$(declare -f work another_func) # transfer function definitions
work # call the function
EOT
The answer here (https://stackoverflow.com/a/2732991/4752883) works great if you're trying to run a script on a remote Linux machine using plink or ssh. It will work if the script has multiple lines on Linux.
However, if you are trying to run a batch script located on a local Linux/Windows machine, your remote machine is Windows, and the script consists of multiple lines, then
plink root@MachineB -m local_script.bat
won't work.
Only the first line of the script will be executed. This is probably a limitation of plink.
Solution 1:
To run a multiline batch script (especially if it's relatively simple, consisting of a few lines):
If your original batch script is as follows:
cd C:\Users\ipython_user\Desktop
python filename.py
you can combine the lines using the "&&" separator, as suggested here (https://stackoverflow.com/a/8055390/4752883), in your local_script.bat file:
cd C:\Users\ipython_user\Desktop && python filename.py
After this change, you can then run the script as pointed out by @JasonR.Coombs here (https://stackoverflow.com/a/2732991/4752883) with:
plink root@MachineB -m local_script.bat
Solution 2:
If your batch script is relatively complicated, it may be better to use a batch script which encapsulates the plink command as well, as pointed out here by @Martin (https://stackoverflow.com/a/32196999/4752883):
rem Open tunnel in the background
start plink.exe -ssh [username]@[hostname] -L 3307:127.0.0.1:3306 -i "[SSH key]" -N
rem Wait a second to let Plink establish the tunnel
timeout /t 1
rem Run the task using the tunnel
"C:\Program Files\R\R-3.2.1\bin\x64\R.exe" CMD BATCH qidash.R
rem Kill the tunnel
taskkill /im plink.exe
This Expect script SSHes into a target remote machine and runs a command there. Don't forget to install Expect before running it (on macOS: brew install expect).
#!/usr/bin/expect
set username "enterusernamehere"
set password "enterpasswordhere"
set hosts "enteripaddressofhosthere"
spawn ssh $username@$hosts
expect "$username@$hosts's password:"
send -- "$password\n"
expect "$"
send -- "somecommand on target remote machine here\n"
sleep 5
expect "$"
send -- "exit\n"
You can use runoverssh:
sudo apt install runoverssh
runoverssh -s localscript.sh user host1 host2 host3...
-s runs a local script remotely
Useful flags:
-g use a global password for all hosts (single password prompt)
-n use SSH instead of sshpass, useful for public-key authentication
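For example, combining those flags (the username and hostnames are made up):
runoverssh -g -s localscript.sh deploy web1 web2 web3    # one password prompt, then the script runs on all three hosts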
If it's a single script, the solutions above are fine.
But I would set up Ansible to do the job. It works in the same way (Ansible uses ssh to execute the scripts on the remote machine, for both Unix and Windows).
It will be more structured and maintainable.
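A minimal ad-hoc sketch using Ansible's script module (the inventory file name here is hypothetical):
ansible all -i hosts.ini -m script -a "./localscript.sh"    # transfers the local script to each host and runs it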
It is unclear whether the local script uses locally set variables, functions, or aliases.
If it does, this should work:
myscript.sh:
#!/bin/bash
myalias $myvar
myfunction $myvar
It uses $myvar, myfunction, and myalias. Let us assume they are set locally and not on the remote machine.
Make a bash function that contains the script:
eval "myfun() { $(cat myscript.sh); }"
Set variable, function, and alias:
myvar=works
alias myalias='echo This alias'
myfunction() { echo This function "$#"; }
And "export" myfun, myfunction, myvar, and myalias to server using env_parallel from GNU Parallel:
env_parallel -S server -N0 --nonall myfun ::: dummy
Extending the answer from @cglotr: to write an inline command, use printf. It is useful for simple commands, and it supports multiline input via the '\n' escape.
Example:
printf "cd /to/path/your/remote/machine/log \n tail -n 100 Server.log" | ssh <user>@<host> 'bash -s'
Don't forget to add bash -s.
There is another approach: you can copy your script to the host with the scp command and then execute it easily.
First, copy the script over to Machine B using scp
[user@machineA]$ scp /path/to/script user@machineB:/home/user/path
Then, just run the script
[user@machineA]$ ssh user@machineB "/home/user/path/script"
This will work if you have given executable permission to the script.
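If the execute bit didn't survive the copy, you can set it remotely first (a small sketch):
ssh user@machineB "chmod +x /home/user/path/script"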

Invocation command using SSH failing?

As per a project requirement, I need to check the contents of a zip file generated on a remote machine. This entire activity is done using automation framework suites written in shell scripts. I am performing the above activity using the ssh command to execute an unzip command with the -l and -q switches. But this command is failing and shows the error messages below.
[SOMEUSER@MACHINE_IP Function]$ ./TESTS.sh
ssh SOMEUSER@MACHINE_IP unzip -l -q SOME_PATH/20130409060734*.zip | grep -i XML | wc -l
unzip: cannot find or open SOME_PATH/20130409060734*.zip, SOME_PATH/20130409060734*.zip.zip or SOME_PATH/20130409060734*.zip.ZIP.
No zipfiles found.
0
The same command works properly when I run it manually. I really have no idea why it fails whenever it is executed via the shell script.
[SOMEUSER@MACHINE_IP Function]$ ssh SOMEUSER@MACHINE_IP unzip -l -q SOME_PATH/20130409060734*.zip | grep -i XML | wc -l
2
Kindly help me to resolve that issue.
Thanks in Advance,
Priyank Shah
When you run the command from your local machine, the asterisk character is expanded on your local machine before it is passed on to your remote ssh command. So your command expects to find SOME_PATH/20130409060734*.zip files on your local machine and insert them into your ssh command to be passed to the other machine, whereas you (I'm assuming) mean the SOME_PATH/20130409060734*.zip files on the remote machine.
To fix that, precede the * character with a backslash (\) and see if it helps. In some shells the escape character might be defined differently; if yours is one of them, you need to find that escape character and use it instead. Also, use quotes around the commands being passed to the other server. Your command line should look something like this, in my opinion:
ssh SOMEUSER@MACHINE_IP "/usr/bin/unzip -l -q SOME_PATH/20130409060734\*.zip | grep -i XML | wc -l"
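A quick way to see the difference (the paths are made up):
ssh user@host ls /tmp/*.zip      # unquoted: the local shell expands the glob if it matches anything locally
ssh user@host "ls /tmp/*.zip"    # quoted: the pattern reaches the remote shell intact
ssh user@host ls /tmp/\*.zip     # escaped: same effect as quoting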
Hope this helps

Execute Remote Perl Script on a Windows Box from a Linux Box

We recently got SSH set up on our Windows boxes so we could eliminate the need for disc mounts on our Linux machines. We are using Pentaho, and I am writing a shell script that will, from a Linux box, SSH into the Windows box and execute a Perl script.
I have been able to SSH into the Windows box and switch to the directory that holds the Perl scripts I need to execute; I just can't figure out how to actually execute them.
This is what I have:
#!/bin/sh
ssh -t xxxxx@xxxxx "cd /path/to/script/ /path/to/perl.exe HelloWorld.pl"
I have also tried:
#!/bin/sh
ssh -t xxxxx@xxxxx "cd /path/to/directory/with/perl/script" \
"/path/to/perl.exe HelloWorld.pl"
Both attempts result in a short delay and then a "disconnected from xxxxx", and the Perl does not run. I can do all of these steps manually through a shell but can't seem to get them working in script form. As a note, the only way I've been able to execute the Perl scripts is if the shell is in the directory the Perl script is in.
You need to either use a semicolon to end your statements or execute everything as one statement.
try the following:
ssh xxxxx#xxxxx "cd /path/to/script/; /path/to/perl.exe HelloWorld.pl"
or:
ssh xxxxx#xxxxx "/path/to/perl.exe /path/to/script/HelloWorld.pl"
In the Windows command shell, you can use && like in a Unix shell. If you expect the first command to succeed,
ssh -t xxxxx@xxxxx "cd /path/to/script/ && /path/to/perl.exe HelloWorld.pl"
will work.

using a variable in a BASH command?

I have 20 machines, each running a process. The machines are named:
["machine1", "machine2", ...., "machine20"]
To inspect how the process is doing on machine1, I issue the following command from a remote machine:
ssh machine1 cat log.txt
For machine2, I issue the following command:
ssh machine2 cat log.txt
Similarly, for machine20, I issue the following command:
ssh machine20 cat log.txt
Is there a bash command that will allow me to view the output from all machines using one command?
If the machines are nicely numbered like in your example:
for i in {1..20} ; do ssh machine$i cat log.txt; done
If you have the list of machines in a file, you can use:
cat machinesList.txt | xargs -i ssh {} cat log.txt
You could store all your machine names in an array or text file, and loop through it.
declare -a machineList=('host1' 'host2' 'otherHost') # and more...
for machine in "${machineList[@]}"
do
ssh $machine cat log.txt
done
I assume your machines aren't literally named 'machine1', 'machine2', etc.
Some links:
bash Array Tutorial
GNU Bash Array Documentation
Use a loop?
for i in {1..20}
do
ssh machine$i cat log.txt
done
But note that you're running cat within a remote shell session, not the current one, so this might not quite work as you expect. Try it and see.
Put your hosts in a file and use a while loop as shown below. Note the -n flag on ssh: without it, ssh would read from stdin and consume the rest of your hosts file.
while read host; do ssh -n $host cat log.txt; done < hosts-file
Alternatively you can use PSSH:
pssh -h hosts-file -i "cat log.txt"
I would recommend using a program called Shmux. Despite the name, it works really well. I've used it with more than 100 machines with good results. It also gracefully handles machine failures for you, which can be a disadvantage of the bash for-loop approach.
I think the coolest thing about this program is the ability to run your commands in multiple threads, which allows the commands to run on all 20 machines in parallel.
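If you'd rather avoid an extra tool, plain bash can get similar parallelism with background jobs; a minimal sketch (output goes to one file per host to keep it from interleaving):
for i in {1..20}; do
ssh -n machine$i cat log.txt > log.machine$i &    # each ssh runs in the background
done
wait    # block until all background sessions have finished
cat log.machine*    # view the combined output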
Aside from the suggestions for using a loop, you might want to take a look at tools, like pssh or dsh, designed for running commands on multiple clients.

How do I execute a Perl program on a remote machine?

I wrote a Perl program to capture a live data stream from a tail command on a Linux machine using the following command in the console:
tail -f xyz.log | myperl.pl
It works fine. But now I have to execute this Perl program on a different machine because the log file is on that different machine. Can anyone tell me how I can do it?
You could say
ssh remotemachine tail -f xyz.log | myperl.pl
I suppose, or maybe mount the remote log directories locally onto your administrative machine and do the processing there.
Or you could even say
ssh remotemachine "bash -c 'tail -f xyz.log | myperl.pl'"
in order to run the whole pipeline on the remote machine (useful if your script produces output files and you want them on the remote machine). Note that the quoting matters here: without it, the pipe would be interpreted by your local shell instead of the remote bash.
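As for mounting the logs locally, a sketch with sshfs (assuming sshfs is installed; the remote path is hypothetical):
mkdir -p ~/remote-logs
sshfs remotemachine:/var/log ~/remote-logs    # mount the remote directory over SSH
tail -f ~/remote-logs/xyz.log | myperl.pl     # process the stream locally
fusermount -u ~/remote-logs                   # unmount when finished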
