Searching text in a file remotely - Linux

I have log files on Linux servers, and I'm working on Windows.
I'm using FileZilla to log in to the Linux server and search for specific text or strings by opening the log file.
I want to automate this process with a batch file on Windows. I tried the below:
@echo off
cls
set /p string="Enter the string: "
echo open xx.xx.xx.xx 21 > ftpc.dat
echo xxxxxxxx>> ftpc.dat
echo xxxxxxxx>> ftpc.dat
echo bin >> ftpc.dat
echo grep '%string%' /PATH IS HERE/log.log >> ftpc.dat
ftp -s:ftpc.dat
I'm just new to this, and I want ideas on how to automate this search process. I'd like a search tool that takes any text, finds a specific file on the Linux server, and shows the matching lines with 15 lines of context before and after each match.
Do I need to write bash scripts, or can I do this with a basic batch script like the one above to show or output the results?

If you have a number of Linux servers to watch, it might be worth installing something like rsyslog or logstash. It's a big topic, but those might be good starting points in your research.
Other things to google: elasticsearch, kibana ... and their alternatives.

You cannot run grep using FTP.
So either:
Use FTP to download the whole file and grep/search it locally.
Or (as you seem to have SSH access too) use a command-line SSH client to execute grep on the server. On Windows, you can use Plink (which comes with PuTTY):
plink -pw password user@example.com grep '%string%' /remote/path/log.log
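Putting it together with your original prompt, a minimal batch sketch (untested; the host, credentials, and path are placeholders, and plink.exe is assumed to be on PATH). grep -C 15 prints 15 lines of context before and after each match, which is what you asked for:
@echo off
set /p string="Enter the string: "
rem run grep on the server and capture the output locally
plink -batch -pw yourpassword youruser@xx.xx.xx.xx "grep -C 15 '%string%' '/PATH IS HERE/log.log'" > results.txt
type results.txt
For the download-first option, you'd instead fetch the file with an ftp -s: script (like the one you already have, minus the grep line) and search the local copy with findstr.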

Related

Python - read from a remote logfile that is updated frequently

I have a logfile that is written constantly on a remote networking device (F5 BIG-IP). I have a Linux hopping station from which I can fetch that log file and parse it. I did find a solution that would implement a "tail -f", but I cannot use nohup or similar to keep my script running after I log out. What I can do is run a cronjob and copy over the file every 5 minutes, say. I can process the file I downloaded, but the next time I copy it, it will contain a lot of common data, so how do I process only what is new? Any help or suggestions are welcome!
Two possible (non-Python) solutions for your problem. If you want to keep a script running on your machine after logout, check nohup in combination with &, like:
nohup my_program > /dev/null &
On a Linux machine you can extract the difference between the two files with
grep -Fxv -f old.txt new.txt > dif.txt
This might be slow if the file is large. The dif.txt file will only contain the new stuff and can be inspected by your program. There also might be a solution involving diff.

Can PSFTP execute loops?

I've searched a lot on the internet but haven't been able to find any useful info on this yet. Does PSFTP not allow you to run conditionals and loops like 'IF' and 'WHILE' at all?
If it does, please let me know the syntax; I'm tired of banging my head against it. Annoyingly, PuTTY allows these commands but psftp doesn't seem to, even though both are from the same family. I really hope there is a solution to this!
PSFTP isn't a language. It's just an SFTP client. SFTP itself is just a protocol for moving files between computers. If you have SFTP set up on the remote computer then it suggests that you have SSH running (since SFTP generally comes bundled with the SSH server install).
You can do a test in a bash shell script, for instance, to see if the file exists on the remote server, then execute your psftp command based on the result. Something like:
#!/bin/bash
# test if file exists on remote system
fileExists=$(ssh user@yourothercomputer "test -f /tmp/foo && echo 'true' || echo 'false'")
if $fileExists; then
    psftp <whatever>
fi
You can stick that whole mess in a loop or whatevs. What's happening here is that we are sending a command test -f /tmp/foo && echo 'true' || echo 'false' to the remote computer to execute. The stdout of the command is returned and stored in the variable fileExists. Then we just test it.
If you are on Windows, you could convert this to a batch script and use plink.exe to send the command, kind of like they do here. Or maybe just plop Cygwin on your computer with an SSH and SFTP client and use what's above.
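A rough batch sketch of that conversion (untested; the user, host, and test file are placeholders, and key-based auth is assumed so plink doesn't prompt). plink exits with the remote command's status, so test -f drives errorlevel directly:
@echo off
rem run the existence test on the remote machine
plink -batch user@yourothercomputer "test -f /tmp/foo"
if %errorlevel% equ 0 (
    rem commands.txt is a hypothetical psftp batch script
    psftp user@yourothercomputer -b commands.txt
)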
The big take-away here is that you will need a separate scripting environment to do the loop and run psftp based on a test.

Search Linux log files using Mac terminal or script

I want to search my access logs for traffic going to a directory on a Linux server. I have a Mac, and I know I can do this from my terminal, but I can't find an example of how to do it.
Do you have the log locally on your Mac? Or are you asking how you would use a Mac to administer a Linux OS?
Open terminal, either change directory to the dir that contains the log or adjust the command to suit. Depending on what you want to search for, you'd use grep.
grep -i "dir name" logfile.log
It's a bit of a broad question, so I don't know exactly what you want to search for.
If it's remote, you'd open Terminal, ssh to the server that has the log, and do something similar to the above. Or, if you're asking how you would do it in one command, you can use ssh for that too:
ssh user@linux "grep something /var/log/apache/access.log"

How do I make my .bat file run a Linux command on a remote Linux server

Below is my current .bat content. I run it from the Windows cmd prompt. It connects to the remote Linux server and prompts me for a password, but after I enter the password and log in as remotehost, the Linux server won't run my ls command. Please help.
@echo off
ssh remotehost@10.1.1.10
ls
You really should do man ssh, as this is explained there (and you could also do an internet search to get an answer).
But, to answer your question anyway: you should put all the commands you want to run on the remote machine on the same line as the actual ssh command. For example, to list a directory and filter the listing for names containing "foo", do: ssh <user>@<host> 'ls|grep foo'.
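Applied to the script in the question, a minimal sketch (assuming the same ssh client the original .bat already calls):
@echo off
rem run ls on the remote machine and print its output locally
ssh remotehost@10.1.1.10 "ls"
Note that from cmd.exe you want double quotes around the remote command; cmd doesn't treat single quotes as quoting, so characters like | would otherwise be interpreted by the local shell.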
I hinted that it is possible to have the code in a batch file in my comment to @Sami Laine. This is what it would look like:
#echo off
setlocal
:: Run the end of this file in remote computer
more +8 %0 | plink user@remote.compu.ter "tr -d '\r'| bash"
endlocal
exit /b 0
:: remote bash stuff to be bootstrapped
pwd
ls -h
I'm using plink, because that's what I have installed, but it should work with most flavors of ssh too. It also works with ksh and zsh, and probably with tcsh, csh, etc. This can sometimes be useful, and the same technique can be used for a lot of things. Be careful with the +8 offset value: it has to point at the right line.

Using Linux commands on files across multiple servers

I am new to Linux as a whole, and so far I have not found a solution to this that isn't clumsy at best. I have a Windows background, so I am accustomed to running commands on one server that access text files on multiple systems in the same domain.
An example of what I run on Windows:
find "Some text" \\ServerName01\c$\inetpub\*.log
find "Some text" \\ServerName02\c$\inetpub\*.log
find "Some text" \\ServerName03\c$\inetpub\*.log
Example of what I would LIKE to do in Linux:
sed 's/SomeText/OtherText/p' //ServerName01/var/opt/somefolder/*.log
sed 's/SomeText/OtherText/p' //ServerName02/var/opt/somefolder/*.log
sed 's/SomeText/OtherText/p' //ServerName03/var/opt/somefolder/*.log
What is the best way to do the above in Linux, or is it even possible?
Thanks!
See the pssh and pscp suite; you can run commands on a bunch of remote servers: http://www.theether.org/pssh/
pssh or cssh would work
pssh provides a number of commands for executing against a group of computers, using SSH. It's most useful for operating on clusters of homogeneously-configured hosts.
http://www.ubuntugeek.com/execute-commands-simultaneously-on-multiple-servers-using-psshcluster-sshmultixterm.html
There are a lot of ways to do it:
Via an NFS/FUSE mount: mount the log directories on one system and you can do the same thing as on Windows (which automatically mounts remote filesystems via "\\").
Use ssh (that would be my preferred solution):
cat serverlist | xargs -i ssh {} "grep \"some text\" yourfilepaths"
which works best if you use SSH key pairs.
