Grepping folder names but excluding "#domain.com" - Linux

Is it possible to exclude a domain from a grep? What I have tried below doesn't seem to work.
ls -l /var/www/folder | grep -E -o --exclude-dir="#somedomain.com" --color "\b[a-zA-Z0-9.-]+#[a-zA-Z0-9.-]+\.[a-zA-Z0-9.-]+\b">>test.txt

How about this:
ls -l /var/www/folder | grep -v "#somedomain.com"
Test case:
$ mkdir -p /tmp/test && cd $_
$ touch {a,b,c,d}#domain.com
$ touch {e,f}#somedomain.com
$ ls
a#domain.com b#domain.com c#domain.com d#domain.com e#somedomain.com f#somedomain.com
$ ls -1 | grep -v "#somedomain.com"
a#domain.com
b#domain.com
c#domain.com
d#domain.com
Here is what the man page says for -v:
-v, --invert-match
Invert the sense of matching, to select non-matching lines. (-v is specified by POSIX.)

ls -l /var/www/folder | grep --invert-match "#somedomain.com" | grep -E -o --color "\b[a-zA-Z0-9.-]+#[a-zA-Z0-9.-]+\.[a-zA-Z0-9.-]+\b" >> test.txt
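As for why the original --exclude-dir attempt did nothing (my understanding of GNU grep's behaviour): --exclude-dir only applies when grep walks directories itself with -r/-R; when grep reads from a pipe, as here, the option is silently ignored, which is why -v is the right tool.

```shell
# --exclude-dir has no effect on piped input: the "excluded" name
# still comes through, so it must be filtered out with -v instead.
printf 'a#somedomain.com\n' | grep --exclude-dir='#somedomain.com' 'somedomain'
```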

Related

remove file in Cron task

I have the following command line that I run on a Debian machine:
ls -d mypath/filename* -tp | grep -v '/$' | tail -n +1 | xargs rm -f
Result: OK (the file is removed)
But when I run the same command periodically from a cron job, it doesn't work (the file is not removed).
I checked the logs, and there are no errors or warnings.
What I tried :
* * * * * bash -c "ls -d mypath/filename* -tp | grep -v '/$' | tail -n +1 | xargs rm -f"
Any idea?
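One common cause (an assumption, since the question doesn't show it): cron runs jobs from your home directory with a minimal environment, so the relative mypath/filename* glob may not resolve to anything. A sketch of the same pipeline anchored to an absolute path, which behaves the same no matter where cron starts it:

```shell
# Same cleanup pipeline, but with an absolute path so it works
# regardless of the working directory (the directory here is a
# stand-in for the real mypath).
dir=$(mktemp -d)
touch "$dir/filename1" "$dir/filename2"
ls -dtp "$dir"/filename* | grep -v '/$' | tail -n +1 | xargs rm -f
```

In the crontab entry, that means spelling out the full path to mypath instead of relying on cron's working directory.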

Linux: using the grep command

I know that ps -ef | grep test | grep -v grep | wc -l lists the number of test processes. Now I want to count the test processes belonging to the user forme. Is the following right?
ps -ef | grep test|grep -x forme| grep -v grep |wc -l
For a start, grep test | grep -v grep can be replaced with grep '[t]est'. See here for an explanation.
Secondly, if you want to limit the processes to a single user, that's what the -u option to ps is for:
ps -fu forme | grep '[t]est' | wc -l
And, finally, grep already has a -c option to count lines, so you can ditch the wc part of the pipeline:
ps -fu forme | grep -c '[t]est'
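Since the linked explanation isn't included here, a quick demonstration of why the '[t]est' trick works: the pattern [t]est matches the literal string "test", but grep's own command line contains "[t]est", which the pattern does not match, so the grep process no longer finds itself in the ps output.

```shell
# The first line stands in for grep's own entry in ps output; it
# contains "[t]est" but not "test", so only the second line matches.
printf 'grep [t]est\nrunning test now\n' | grep -c '[t]est'
# prints 1
```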

How to change the dpkg-query -L <package-name> view?

I use dpkg-query -L <package-name> to list all the files belonging to a specific package.
The result is a lot of files with their directory:
/.
/usr
/usr/bin
/usr/bin/tree
/usr/share
/usr/share/doc
/usr/share/doc/tree
/usr/share/doc/tree/TODO
/usr/share/doc/tree/copyright
/usr/share/doc/tree/README.gz
/usr/share/doc/tree/changelog.Debian.gz
/usr/share/man
/usr/share/man/man1
/usr/share/man/man1/tree.1.gz
Because the output is so long, I usually use the command like this:
dpkg-query -L tree > tree.txt
My question is: how can I change the view of the list in tree.txt to make it more human readable, with the output grouped by directory prefix?
(The rpm -ql <package-name> command serves the same purpose on CentOS.)
If you are only interested in the folders:
dpkg -L <pkg-name> | xargs -I{} dirname {} | sort -u
A version which groups files by directory could be done using a shell function:
lspkg() {
    pkg="${1}"
    if [ -t 1 ] ; then
        color_dir=$'\x1b\x5b34;1m'
        color_link=$'\x1b\x5b36m'
        color_end=$'\x1b\x5b0m'
    fi
    LANG=C dpkg -L "${pkg}" \
        | awk -F: '{printf "%s%s\n",$NF,(NF>1?" (pkg set link)":"")}' \
        | sort \
        | while read -r file ; do
            if [ -d "${file}" ] ; then
                echo "${color_dir}${file}${color_end}"
            elif /bin/grep -q 'set link)' <<< "$file" ; then
                echo "  - ${color_link}${file}${color_end}"
            else
                echo "  - ${file}"
            fi
        done
}
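A smaller, dependency-free variant of the same grouping idea, without colours (group_by_dir is a name I made up; it just indents each path from stdin under its parent directory):

```shell
# Group a list of paths by parent directory: print each directory
# once as a header, then indent the entries that live under it.
group_by_dir() {
    sort | awk '{
        d = $0
        sub(/\/[^\/]*$/, "", d)                    # strip last path component
        if (d == "") d = "/"
        if (d != prev) { print d ":"; prev = d }   # new directory header
        print "  " $0
    }'
}
# usage: dpkg -L tree | group_by_dir
```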

Perl Script to Grep Directory For String and Print

I would like to create a Perl or bash script that reads keyboard input into a variable, performs a fixed-string recursive grep of the current directory (which is full of Snort logs), and then runs tcpdump on each matched file, greps its output, and prints the matching lines to the terminal. Does anyone have a good idea of how this should work?
Here is an example of the methodology I want from the script:
step 1: Read keyboard input and assign it to variable named string.
step 2 command: grep -Fr "$string"
step 2 output: snort.log.1470609906 matches
step 3 command: tcpdump -r snort.log.1470609906 | grep -F "$string" -C 10
step 3 output:
Snort log
Here's some bash code that does that:
s="google.com"
grep -Frl "$s" |
while IFS= read -r x; do
    tcpdump -r "$x" | grep -F "$s" -C10
done
I don't know about Perl, but you can do it easily enough just in shell:
str="google.com"
find . -type f -name 'snort.log.*' -exec grep -FlZ "$str" {} + |
xargs -0 -I {} sh -c 'tcpdump -r "{}" | grep -F '"$str"' -C10'
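To fold in step 1 (reading keyboard input), the whole thing can be wrapped in a small function. This is only a sketch, and snortgrep is a name I made up:

```shell
# Prompt for a search string, find Snort logs containing it as a
# fixed string, then run tcpdump over each match and show the
# surrounding context lines.
snortgrep() {
    printf 'Search string: ' >&2
    IFS= read -r string
    grep -Frl "$string" . | while IFS= read -r log; do
        tcpdump -r "$log" | grep -F -C10 "$string"
    done
}
```

Run snortgrep in the directory of logs and type the string at the prompt.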

Bash - pipe multiple grep and print output

I am writing a shell script which will grep a document for certain words and then display the found words in colour output.
echo $(egrep -wi --color=always 'error|exception' $logFile)
now I want to combine this grep with another one to exclude a few results
For this I want to pipe above command to a grep command to exclude certain patterns
grep -vi '<status>error</status>'
For some reason this fails when I try to execute the command
echo $(egrep -wi --color=always 'error|exception' $logFile | $(grep -v '<STATUS>ERROR</STATUS>') )
or even if I try
echo $(egrep -wi --color=always 'error|exception' $logFile | grep -v '<STATUS>ERROR</STATUS>')
What am I doing wrong? Why is this failing?
The problem seems to appear only with the combination of egrep, --color=always, and -i:
egrep -wi --color=always 'error|exception' /tmp/log.log | grep -v '<STATUS>ERROR</STATUS>'
doesn't work but
egrep -w --color=always 'error|exception' /tmp/log.log | grep -v '<STATUS>ERROR</STATUS>'
and
egrep -wi --color 'error|exception' /tmp/log.log | grep -v '<STATUS>ERROR</STATUS>'
and
grep -wi --color=always 'error|exception' /tmp/log.log | grep -v '<STATUS>ERROR</STATUS>'
do work.
I can't say for certain why, but the likely culprit is that with -i the first grep also matches (and therefore colourizes) the <STATUS>ERROR</STATUS> lines: the inserted ANSI escape codes mean the second grep's literal pattern no longer matches those lines, so -v cannot filter them out.
In a shell script:
result=$(grep -Ewi --color=always 'error|exception' /tmp/log.log | grep -v '<STATUS>ERROR</STATUS>')
echo "$result"
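A small demonstration of the suspected cause described above (my reading, not from the original answer): with -i and --color=always, grep selects the <STATUS>ERROR</STATUS> line and wraps the match in escape codes, so a later literal match on that line fails.

```shell
# The colourized output has escape codes inserted inside "ERROR",
# so a literal pattern on the original line no longer matches it.
line='<STATUS>ERROR</STATUS>'
colored=$(printf '%s\n' "$line" | grep -Ewi --color=always 'error|exception')
printf '%s\n' "$colored" | cat -v          # escape codes are now visible
printf '%s\n' "$colored" | grep -v "$line" # -v fails to drop the line
```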
