AWK taking action on a running output in bash - linux

I need to use the output of a command, which is:
nmap -n -Pn -sS --vv <*IPs*>
The output of this command is divided into two parts: first a discovery phase and then a port scan, the two being separated by the very first line that says
Nmap scan report for <FirstIP>
What I need is the output of this first part (which, by the way, is the fastest one), piped to further commands (awk, grep or whatever) to filter out only the IP addresses. The intention is to stop the running command exactly when the line "Nmap scan report for <*FirstIP*>" appears on the shell: first, because I don't need the other part, and second, because the other part takes too much time!
I found a very close solution here, but it didn't work: it executes both commands (nmap and awk), but nothing is printed to stdout in the shell.
I would be looking for something like this:
nmap -n -Pn -sS --vv <*IPs*> | awk '/Not shown/ {system("pkill nmap")}' | awk '/^Discovered/{print $6}'
But obviously this doesn't work.
Any ideas?

Most flavors of awk buffer their output. There's no option to change that in gawk, but if you use mawk, you can give it the -Winteractive option, which does not buffer.
Incidentally, you are running two awks, but you only need one:
nmap -n -Pn -sS --vv <*IPs*> | mawk -Winteractive '/Not shown/ {system("pkill nmap")} /^Discovered/{print $6}'
In awk, every rule whose pattern matches runs its associated action, so both can go in one script. (Although I love awk, this use case might be a better fit for expect.)

Given this command line:
left-side | right-side
The problems you have are:
Output going to a pipe is normally block-buffered by the writing command, so you may not see "Not shown" in the right-side command until after the left-side command has finished running, and
You want the left-side command to stop running as soon as the right-side command sees "Not shown"
For the first problem, use stdbuf or similar.
For the second: when the right-side command exits, the left-side command will receive SIGPIPE the next time it writes to the pipe, so you don't need to do anything else; what you want to happen will simply happen when you exit the right-side command.
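A quick way to see that behaviour in isolation (nothing here is specific to nmap; yes and head are just stand-ins for the two sides of the pipe):
yes | head -n 1    # 'yes' would run forever, but it is killed once 'head' exits and closes the pipe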
So your command line would be something like:
stdbuf -oL nmap -n -Pn -sS --vv <*IPs*> |
awk '/Not shown/{exit} /^Discovered/{print $6}'
I don't know whether you meant to use "Not shown" or "Nmap scan report for" in your sample code. Use whichever string you want awk to exit at.

Related

How to highlight certain words over terminal always from all commands

I need to highlight certain keywords like "fail, failed, error, fatal, missing" over my terminal.
I need this with the output of ALL the commands, not any specific command. I assume I need to tweak my bashrc file for this.
To color I can use:
<input coming to terminal>|grep -P --color=auto 'fail|failed|error|fatal|missing|$'
I tried the following command but it didn't help:
tail -f $(tty) |grep -P --color=auto 'fail|failed|error|fatal|missing|$' &
[1]+ Stopped(SIGTTIN) tail -f $(tty) | grep -P --color=auto 'fail|failed|error|fatal|missing|$'
I searched SO for answers but could not find any question which provides the desired answer.
I don't think there's really an elegant way to do this using the shell. Ideally, you'd get a terminal emulator with this kind of keyword highlighting built in. You can get some of the way by piping the output of bash through a filter that adds ANSI colour escapes. Here is a sed script that replaces "fail" with (red)fail(normal):
s/fail/\x1B[31m&\x1B[0m/
t done
:done
Run bash with its output piped through sed like this:
$ bash | sed -f color.sed
This mechanism is not without problems, but it works in some cases. Usually it's better just to collect up the output you want, and then pipe it through sed, rather than working directly with the bash output.
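If piping individual commands is acceptable, here is a minimal sketch of a helper you could put in .bashrc, built on the grep idea above (the function name hlwords is just an illustrative choice, and -P assumes GNU grep):
hlwords() {
    # highlight the keywords; the trailing |$ keeps non-matching lines visible
    grep --line-buffered -P --color=auto 'fail|failed|error|fatal|missing|$'
}
# usage example: make install 2>&1 | hlwords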

referencing stdout in a command that has been piped into

I want to make a simple dmenu command that reads a file of commands and names. It takes the names and displays them using dmenu, then takes dmenu's output and runs the associated command by looking it up in the file again.
I got to the point where dmenu displays the names, but I don't really know where to go from there. Learning bash is a really daunting task to me and I don't really know where to start with this seemingly simple script/command.
Here is the file:
Pushbullet
google-chrome-stable --app=https://www.pushbullet.com
Steam
steam
Chrome
google-chrome-stable
Libre Office
libreoffice
Transmission
transmission-qt
Audio Control Panel
sudo pavucontrol & bluberry
and here is what I have so far for my command:
awk 'NR % 2 != 0' /home/rocco/programlist | dmenu | ??(grep -l "stdout" /home/rocco/programlist....)
My thinking was that I could somehow pipe the name of the application into grep or awk, get its line number, add one, and pipe that line into sh.
Thanks
I have no experience with dmenu, but if I understand how it works correctly, this should do what you want. Wrapping a command in $(…) captures its output, which we can store in a variable and pass on to another command.
#!/bin/bash
plist="/home/rocco/programlist"
# pipe every second line to dmenu
selected=$(awk 'NR % 2 != 0' "$plist" | dmenu)
# search for the selected item, get the command after it
cmd=$(grep -A1 "$selected" "$plist" | tail -n 1)
# run the command
$cmd
Worth mentioning a mistake in your question: dmenu writes to stdout (standard output), but the next program in line would be reading stdin (standard input). In any case, grep takes its pattern as an argument, not on standard input, which is why I've saved the selection to a variable instead of trying to pipe it somewhere.
Assuming you have programlist.txt in the working directory you can use:
awk 'NR%2 !=0' programlist.txt |dmenu |awk '{system("grep --no-group-separator -A 1 '"'"'"$0"'"'"' programlist.txt");}' |awk '{if(NR==2){system($0);}}'
Note the quoting of the $0 in the first awk invocation. This is necessary to handle names with spaces in them, like "Libre Office".
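If the nested quoting is hard to follow, here is a sketch of an equivalent approach that passes the dmenu selection into a single awk program via -v (it assumes the selected name matches a whole line of programlist.txt exactly):
selected=$(awk 'NR % 2 != 0' programlist.txt | dmenu)
# find the line equal to the selection, read the next line, and run it
awk -v name="$selected" '$0 == name { getline; system($0); exit }' programlist.txt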

Continuous grep, output at same spot on console

I use
tail -f file | grep pattern
all the time for continuous grep.
However, is there a way I can make grep print its output at the same spot, say at the top of the screen, so that the screen doesn't scroll all the time?
My case is something like this: tail -f log_file | grep Status -A 2 will show the current status and what changed it to that status. The problem is the screen scrolls and it becomes annoying. I'd rather have the output stuck on the first 3 lines in the screen.
Thank you!
You could use the watch command, which repeatedly executes the same command while keeping the output at the same position on the screen. The process might eat some more CPU or memory, though:
watch "tail file | grep pattern"
By default watch executes the command every 2 seconds. You can adjust that down to 0.1 seconds using:
watch -n 0.1
NOTE
As noted by @etanReisner: this is not exactly the same as tail -f. tail -f reacts immediately when something is added to your logfile, whereas watch only notices the change when it next runs, i.e. every 2 (or 0.1) seconds.
Assuming you are using a vt100 compatible emulator...
This command will tail a file, pipe it into grep, read the result a line at a time and display each line in reverse video on the top line of the screen:
TOSL=$(tput sc;tput cup 0 0;tput rev;tput el)
FROMSL=$(tput sgr0; tput rc)
tail -f file | grep --line-buffered pattern | while IFS= read -r line
do
    echo -n "$TOSL${line}$FROMSL"
done
It assumes your output appears a line at a time. If you want more than one line, you can read more than a line, but you have to decide how you want to buffer the output. You could also use the csr terminfo command to set up an entire separate scrolling region instead of just having one line.
Here is the scrolling region version with a ten line status area at the top:
TOSL=$(tput sc; tput csr 0 10; tput cup 10 0;tput rev;tput el)
FROMSL=$(tput sgr0; tput rc;tput csr 10 50;tput rc)
tail -f file | grep --line-buffered pattern | while IFS= read -r line
do
    echo -n "$TOSL${line}
$FROMSL"
done
Note that your display may get corrupted from time to time, since output from your main shell and from your background task can get mixed up.
Simply replace the newlines with carriage returns.
tail -f file | grep --line-buffered whatever | tr '\012' '\015'
The line buffering is to avoid jumpy output; see http://mywiki.wooledge.org/BashFAQ/009
This is quick and dirty. As noted in comments, this will leave the previous contents of the line underneath, so a shorter line will not completely overlay a longer line. You could add some control codes to address that, but then you might as well use Curses for the formatting too, like in rghome's answer.
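One way to follow the "add some control codes" suggestion is to clear the rest of the line after overwriting it, so a shorter line does not leave residue from a longer one. A rough sketch (file and pattern are placeholders, as above):
clr=$(tput el)    # clear from cursor to end of line
tail -f file | grep --line-buffered pattern | while IFS= read -r line
do
    printf '\r%s%s' "$line" "$clr"
done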

search with in Data displayed as a result of tail Operation?

I am working on a Java EE application whose logs are generated on a Linux server.
I have used the command tail -f -n -10000 MyLog
It displayed the last 10000 lines of that log file.
Then I pressed Ctrl+C in PuTTY to stop the log output from updating (as I feared it might be updated with new requests and I would lose my data).
In the displayed result, how can I search for a particular keyword? (I tried typing /<string> to search, but it's not working.)
Pipe your output to a pager.
tail -f -n LINE_CNT LOG_FILE | less
then you can use
/SEARCH_STRING
Two ways:
tail -n 10000 MyLog| grep -i "search phrase"
tail -f -n 10000 MyLog | less
The 2nd method will allow you to search with /. It will only search down but you can press g to go back to the top.
Edit: On testing, it seems method 2 doesn't work all that well... if you hit the end of the file it will freeze until you Ctrl+C the tail command.
You need to redirect the output from tail into a search utility (e.g. grep). You could do this in two steps: save the output to a file, then search in the file; or in one go: pipe the output to the search utility.
To see what goes into the file (so you can hit Ctrl+C) you can use the tee command, which duplicates the output to the screen and to a file:
tail -f -n -10000 MyLog | tee <filename>
Then search within the file.
If you want to pipe the result into the search utility, you can use the same trick as above, but use your search program instead of tee
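For example (the snapshot filename here is only illustrative), the following keeps a copy of everything on disk while grep searches it on the fly:
tail -f -n -10000 MyLog | tee mylog_snapshot.txt | grep -i "keyword"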
Controlling terminal output on the fly
While running any command in a terminal such as PuTTY, you can use Ctrl-S and Ctrl-Q to stop and restart output to the terminal.
Excluding lines using grep
If you want to exclude lines that contain a specific pattern, use grep -v. The following would remove all lines that contain the string INFO:
tail -f logfile | grep -v INFO
Show lines that do not contain the words INFO or DEBUG
tail -f logfile | grep -v -E 'INFO|DEBUG'
Finally, the MOTHER AND FATHER of all tailing tools is xtail.pl
If you have Perl on your host, xtail.pl is a very nice tool to learn; in a nutshell, you can use it to tail multiple files. Very handy.
You can just open the file with the less command:
less logfile_name
Once the file is open you can use this guide here.
Tip: I suggest first using G to go to the end of the file and then using a backward search (?pattern).
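Roughly, the session looks like this ("keyword" is just a placeholder; the lines starting with # are keystrokes typed inside less, not shell commands):
less logfile_name
# G           jump to the end of the file
# ?keyword    search backwards for "keyword"
# n / N       repeat the search in the same / opposite direction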

Colour highlighting output based on regex in shell

I'd like to know if I can colour highlight the output of a shell command that matches certain strings.
For example, if I run myCommand, with the output below:
> myCommand
DEBUG foo bar
INFO bla bla
ERROR yak yak
I'd like all lines matching ^ERROR\s.* to be highlighted red.
Similarly, I'd like the same highlighting to be applied to the output of grep, less etc...
EDIT: I probably should mention that ideally I'd like to enable this feature globally via a 'profile' option in my .bashrc.
There is an answer on superuser.com:
your-command | grep -E --color 'pattern|$'
or
your-command | grep --color 'pattern\|$'
This will "match your pattern or the end-of-line on each line. Only the pattern is highlighted..."
You can use programs such as:
spc (Supercat)
grc (Generic Colouriser)
highlight
histring
pygmentize
grep --color
You can do something like this, but the commands won't see a tty (some will refuse to run or behave differently or do weird things):
exec > >(histring -fEi error) # Bash
If you want to enable this globally, you'll want a terminal feature, not a process that you pipe output into, because a pipe would be disruptive to some commands (two problems are that stdout and stderr would appear out of order and buffered, and that some commands just behave differently when outputting to a terminal).
I don't know of any “conventional” terminal with this feature. It's easily done in Emacs, in a term buffer: configure font-lock-keywords for term-mode.
However, you should think carefully whether you really want that feature all the time. What if the command has its own colors (e.g. grep --color, ls --color)? Maybe it would be better to define a short alias to a colorizer command and run myCommand 2>&1|c when you want to colorize myCommand's output. You could also alias some specific always-colorize commands.
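For example, c could be a small function along these lines (a sketch built on the grep trick shown above; adjust the pattern to taste):
c() { grep --color=always -E 'ERROR|$'; }
# usage example: myCommand 2>&1 | c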
Note that the return status of a pipeline is its last command, so if you run myCommand | c, you'll get the status of c, not myCommand. Here's a bash wrapper that avoids this problem, which you can use as w myCommand:
w () {
    "$@" | c
    return "${PIPESTATUS[0]}"
}
You could try (maybe needs a bit more escaping):
BLUE="$(tput setaf 4)"
BLACK="$(tput sgr0)"
command | sed "s/^ERROR /${BLUE}ERROR ${BLACK}/g"
Try
tail -f yourfile.log | egrep --color 'DEBUG|'
where DEBUG is the text you want to highlight.
You can use the hl command available on GitHub:
git clone http://github.com/mbornet-hl/hl
Then:
myCommand | hl -r '^ERROR.*'
You can use the $HOME/.hl.cfg configuration file to simplify the command line.
hl is written in C (source is available).
You can use up to 42 different colors of text.
Use awk.
COLORIZE_AWK_COMMAND='{ print $0 }'
if [ -n "$COLORIZE" ]; then
    COLORIZE_AWK_COMMAND='
        /pattern1/ { printf "\033[1;30m" }
        /pattern2/ { printf "\033[1;31m" }
        // { print $0 "\033[0m"; }'
fi
then later you can pipe your output
... | awk "$COLORIZE_AWK_COMMAND"
printf is used in the patterns so we don't print a newline, just set the color.
You could probably enable it for specific commands using aliases and user-defined shell functions without too much trouble. If you're coloring errors, I assume you want to process stderr. Since stderr is unbuffered, you would probably want to line-buffer it by sending it through a fifo.
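As a rough sketch of that idea, using bash process substitution rather than an explicit fifo, and the grep pattern from the answers above (myCommand is the placeholder from the question), something like this colors only the error stream:
myCommand 2> >(grep --line-buffered --color=always -E 'ERROR|$' >&2)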
