Using Linux commands on files across multiple servers

I am new to Linux as a whole and so far I have not found a solution to this that isn't clumsy at best. I come from a Windows background, so I am accustomed to running commands on one server that access text files on multiple systems in the same domain.
Example of what is processed in Windows:
find "Some text" \\ServerName01\c$\inetpub\*.log
find "Some text" \\ServerName02\c$\inetpub\*.log
find "Some text" \\ServerName03\c$\inetpub\*.log
Example of what I would LIKE to do in Linux:
sed 's/SomeText/OtherText/p' //ServerName01/var/opt/somefolder/*.log
sed 's/SomeText/OtherText/p' //ServerName02/var/opt/somefolder/*.log
sed 's/SomeText/OtherText/p' //ServerName03/var/opt/somefolder/*.log
What is the best way to do the above in Linux, or is it even possible?
Thanks!

See the pssh and pscp suite; it lets you run commands on a bunch of remote servers: http://www.theether.org/pssh/

pssh or cssh would work.
pssh provides a number of commands for executing against a group of computers, using SSH. It's most useful for operating on clusters of homogeneously-configured hosts.
http://www.ubuntugeek.com/execute-commands-simultaneously-on-multiple-servers-using-psshcluster-sshmultixterm.html
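For the question's sed example, a minimal sketch with pssh might look like this (assuming a hosts.txt file listing one server per line and SSH key authentication; on some distributions the command is installed as parallel-ssh):
# -h names the host file, -i prints each host's output inline as it completes
pssh -h hosts.txt -i "sed -n 's/SomeText/OtherText/p' /var/opt/somefolder/*.log"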

There are a number of ways to do it:
Via an NFS/FUSE mount: mount the remote log directories on one system and you can do the same thing as on Windows (which transparently mounts remote filesystems via "\\" paths).
Via ssh (that would be my preferred solution):
cat serverlist | xargs -I {} ssh {} "grep \"some text\" yourfilepaths"
which works best if you use SSH key pairs.
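For the sed substitution from the question, a plain loop works too. A sketch, assuming serverlist holds one hostname per line and that you want sed -i to edit the files in place on each server:
while read -r server; do
    # -n keeps ssh from swallowing the rest of serverlist on stdin
    ssh -n "$server" "sed -i 's/SomeText/OtherText/g' /var/opt/somefolder/*.log"
done < serverlist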

Related

Searching text in a file remotely

I have log files on Linux servers, and I'm working on Windows.
I'm using FileZilla to log in to the Linux server and searching for specific text or strings by opening the log file.
I want to automate this process with a Windows batch file; I tried the following:
@echo off
cls
set /p string="Enter the string: "
echo open xx.xx.xx.xx 21 > ftpc.dat
echo xxxxxxxx>> ftpc.dat
echo xxxxxxxx>> ftpc.dat
echo bin >> ftpc.dat
echo grep '%string%' /PATH IS HERE/log.log >> ftpc.dat
ftp -s:ftpc.dat
I'm just new to this, and I want ideas on how to automate this search process: a search tool that takes any text, finds a specific file on the Linux server, and shows the matching lines along with 15 lines before/after each match.
Do I need to write a bash script, or can I do this with a basic batch file like the one above to show or output the results?
If you have a number of Linux servers to watch, it might be worth installing something like rsyslog or logstash. It's a big topic, but those might be good starting points in your research.
Other things to google: elasticsearch, kibana ... and their alternatives.
You cannot run grep using FTP.
So either:
Use FTP to download whole file and grep/search it locally.
Or (as you seem to have SSH access too) use a command-line SSH client to execute grep on the server. On Windows, you can use Plink (which comes with PuTTY):
plink -pw password user@example.com grep '%string%' /remote/path/log.log
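Since the question asks for 15 lines of context around each match, grep's -C option covers that directly (the host and path here are placeholders):
plink -pw password user@example.com grep -C 15 '%string%' /remote/path/log.log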

Writing to multiple remote terminals over SSH

Let's say I'm SSHing to multiple remote machines. I'd like to send commands to all of them through one interface. I was thinking of opening a named pipe whose output would be piped to each machine I SSH into; that way, if I echo "ls -l" > namedpipe, the command would run on each machine. I tried this, but it didn't work. Any suggestions on how I can have one terminal from which I could interact with all the remote machines?
GNU Parallel is the way to go. There are lots of examples on SO and elsewhere, also using named pipes when needed. Other tools are mentioned and compared in the parallel manpage.
As to your example, what you want can be done as simply as
parallel ssh {} "ls -l" ::: user1@machine1 user2@machine2 ...
Some Linux distributions come with a configuration file (usually /etc/parallel/config) where the option --tollef is set by default. If this is your case and you don't want to change it, you must use -- instead of ::: in the first example above, or, alternatively, use the --gnu option to override --tollef.
Equivalently, if you create a file called remotelist containing
user1@machine1
user2@machine2
you can issue:
parallel -a remotelist ssh {} "ls -l"
or, as noted in a comment,
parallel --nonall --slf remotelist ls -l
the --slf option (short for --sshloginfile) allows stuffing more information in the remotelist file: comments, number of processors to use on each remote host, and the like.
You might also consider the --tag option, which prepends each output line with the name of the host it originates from.
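Putting those together, a sketch that greps the same pattern on every host and labels each output line with its origin (the log path is a placeholder):
parallel --nonall --tag --slf remotelist "grep 'some text' /var/opt/somefolder/*.log"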
There are plenty of tools available that could help you do that. Some worth checking out are:
pssh
Ansible Ad-Hoc module
They both work over SSH.
If you still want to use a custom script with an SSH client, you could do something like:
while read -r host
do
    # -n keeps ssh from consuming the rest of server.list on stdin
    ssh -n "user@$host" <command to execute> <arg>
done < server.list
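With Ansible, the equivalent ad-hoc call might look like the sketch below (hosts is an inventory file you would write yourself, and all targets every host in it; the shell module is used rather than command because command does not expand globs):
ansible all -i hosts -m shell -a "grep 'some text' /var/opt/somefolder/*.log"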

Search Linux log files using Mac terminal or script

I want to search my access logs for traffic going to a directory on a Linux server. I have a Mac, and I know I can do this from Terminal, but I can't find an example of how to do it.
Do you have the log locally on your Mac, or are you asking how you would use a Mac to administer a Linux OS?
Open Terminal, and either change directory to the one that contains the log or adjust the command to suit. Depending on what you want to search for, you'd use grep:
grep -i "dir name" logfile.log
It's a fairly broad question, so I don't know exactly what you want to search for.
If the log is remote, you'd open Terminal and SSH to the server that has the log, then do something similar to the above. Or, if you want to do it in one command, you can combine it with ssh:
ssh user@linux "grep something /var/log/apache/access.log"

Transfer files in Linux (Fedora 12) by giving the password at the command prompt

I am fully aware that this question has been asked many times, but I haven't been able to find any solution that satisfies my requirement.
Task: I need to transfer files from machine A to machine B and remotely execute scripts on machine B. Due to my constraints I cannot use ssh-keygen, the expect utility, or any other utility that requires installing packages. To transfer the file I need to give a password, and I want to give the password in the URL, as this will run inside a bash script and requires no user interaction.
My investigation: I thought of using scp, but realised it's not possible to give the password on the command line. So I'm wondering if there is any other alternative, such as rsync.
below is a small attempt:
#!/bin/bash
PATH=/usr/bin:/usr/sbin:/bin:/sbin
USER="bob"
# note: RSYNC_PASSWORD is only honoured by rsync when talking to an
# rsync daemon (host::module or rsync://); it is ignored over ssh
export RSYNC_PASSWORD="blue"
MACHINE_B="192.168.200.2"
if ping -c 1 -W 1 "$MACHINE_B"
then
    echo "There is machine B as well"
    echo "Checking to transfer file to machine B"
    rsync lol.sh "$MACHINE_B:/home/bob/"
fi
Thanks and regards,
Sam
I have tried the various options mentioned above, but unfortunately none of them works in my case. Still, I would like to thank everyone for helping me; I have certainly learned a few new things, especially about rsync.
In my case, I had to rely on SSH keys to make it work.
From the rsync man page:
Some modules on the remote daemon may require authentication. If so, you will receive a password prompt when you connect. You can avoid the password prompt by setting the environment variable RSYNC_PASSWORD to the password you want to use or using the --password-file option. This may be useful when scripting rsync.
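So a password-in-environment transfer only works against an rsync daemon. A minimal sketch, assuming machine B runs rsyncd and exports a module named home (the module name is hypothetical; it would come from the daemon's rsyncd.conf):
# double-colon syntax talks to the rsync daemon, so RSYNC_PASSWORD applies;
# over plain ssh transport the variable is ignored
export RSYNC_PASSWORD="blue"
rsync lol.sh bob@192.168.200.2::home/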

iLO3: Multiple SSH commands

Is there a way to run multiple commands in HP's Integrated Lights-Out 3 system via SSH? I can log in to iLO and run commands line by line, but I need to create a small shell script that connects to iLO and runs some commands one by one.
This is the line I use to get information about the iLO version:
/usr/bin/ssh -i dsa_key administrator@<iLO-IP> "version"
Now, how can I do something like this?
/usr/bin/ssh -i dsa_key administrator@<iLO-IP> "version" "show /map1 license" "start /system1"
This doesn't work, because iLO treats it all as one command. But I need something that logs in to iLO, runs these commands, and then exits. Running them one after the other takes too much time, because every iLO SSH login takes ~5-6 seconds (5 commands = 5*5 seconds...).
I've also tried to separate the commands directly in iLO after a manual login, but there is no way to put multiple commands on one line; a command seems to be terminated by pressing Return.
iLO-SSH Version is: SM-CLP Version 1.0
The following solutions did NOT work:
/usr/bin/ssh -i dsa_key administrator@<iLO-IP> "version; show /map1 license; start /system1"
/usr/bin/ssh -i dsa_key administrator@<iLO-IP> "version && show /map1 license && start /system1"
This Python module is for HP iLO management; check it out:
http://pypi.python.org/pypi/python-hpilo/
Try putting your commands in a file (named theFile in this example):
version
show /map1 license
start /system1
Then:
ssh -i dsa_key administrator@iLO-IP < theFile
Semicolons and such won't work because you're using the iLO shell on the other side, not a normal *nix shell. So above I redirect the file, with newlines intact, as if you were typing all that into the session by hand. I hope it works.
You are trying to treat iLO like it's a normal shell, but it's really HP's dopey interface.
That being said, the easiest way is to put all the commands in a file and then pipe it to ssh (sending all of the newline characters):
echo -e "version\nshow /map1 license\nstart /system1" | /usr/bin/ssh -i dsa_key administrator@<iLO-IP>
It's a messy workaround, but you might fancy using expect. Your expect script would look something like this:
# Make an ssh connection
spawn ssh -i dsa_key administrator@<iLO-IP>
# Wait for command prompt to appear
expect "$"
# Send your first command
send "version\r"
# Wait for command prompt to appear
expect "$"
# Send your second command
send "show /map1 license\r"
# Etc...
On the bright side, it's guaranteed to work. On the darker side, it's a pretty clumsy workaround, very prone to breaking if something doesn't go the way it should (for example, if the command-prompt character appears in the version output, or something like that).
I'm in the same situation and wish to avoid running a lot of plink commands. I've seen that you can pass a file with the -m option, but apparently it executes only one command at a time :-(
plink -ssh Administrator@AddressIP -pw password -m test.txt
What's the purpose of the file? Is there a special format for it?
My current text file looks like this:
set /map1/oemhp_dircfg1 oemhp_usercntxt1=CN=TEST
set /map1/oemhp_dircfg1 oemhp_usercntxt2=CN=TEST2
...
Is there a solution to execute these two commands ?
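One workaround worth trying, mirroring the file-redirection answers above (untested against iLO, so treat it as an assumption): skip -m and feed the file to plink on standard input, so each line is typed into the session as if by hand:
plink -ssh Administrator@AddressIP -pw password < test.txt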
I had similar issues and ended up using the "RIBCL over HTTPS" interface to the iLO. This has advantages in that it is much more responsive than logging in/out over ssh.
Using curl or another command-line HTTP client, try:
USERNAME=<YOUR_ILO_USERNAME>
PASSWORD=<YOUR_ILO_PASSWORD>
ILO_URL=https://<YOUR_ILO_IP>/ribcl
curl -k -X POST -d "<RIBCL VERSION=\"2.0\">
<LOGIN USER_LOGIN=\"${USERNAME}\" PASSWORD=\"${PASSWORD}\">
<RIB_INFO MODE=\"read\">
<GET_FW_VERSION/>
<GET_ALL_LICENSES/>
</RIB_INFO>
<SERVER_INFO MODE=\"write\">
<SET_HOST_POWER HOST_POWER=\"Yes\"/>
</SERVER_INFO>
</LOGIN>
</RIBCL>" ${ILO_URL}
The formatting isn't exactly the same, but if you have the ability to access the iLO via HTTPS instead of only ssh, this may give you some flexibility.
More details on the various RIBCL commands and options may be found at HP iLO 3 Scripting Guide (PDF).
