My server is not responding to grep - Linux

I am trying to use this command on my server
grep -lr --include=*.php "eval(base64_decode" /path/to/webroot
Absolutely nothing happens, no response from the server.
Can anyone help me out?
I am not an experienced Linux user.

The GNU folks messed up when they gave grep arguments to recursively search for files. Forget you ever heard of -r or --include and rewrite your script to use find to find the files and grep to Globally search for a Regular Expression and Print (g/re/p) the result from each file (see the huge clues in the tool names?). For example:
find /path/to/webroot -name '*.php' -print0 |
xargs -0 grep -l 'eval(base64_decode'
If that still gives you an issue, then step 1 in debugging it is to run the find on its own and see if it produces a list of files. If so, then step 2 is to run the grep alone on one of the files output by find. If you can't figure it out from that, let us know.
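Concretely, those two debugging steps might look like this (the file name in the second command is only a placeholder; substitute a path that the first command actually printed):
find /path/to/webroot -name '*.php'
grep -l 'eval(base64_decode' /path/to/webroot/some/file.php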

Related

Why is 'grep -l' not matching all files containing a certain string?

From time to time I face a very weird behavior with find + grep commands. I am asking this because I haven't found anything related to this.
At my work I often have to search through a large number of logs, looking for a certain string.
Due to its excellent performance I rely heavily on the command grep -l for this.
I use commands like this:
find . -type f -name "*log*" -exec grep -l STRING {} \; 2>/dev/null
I also have a multi-threaded program that uses find + grep -l in parallel.
The problem is that sometimes some files are not found during the search, even though they contain the string I am interested in. Then, when I execute the same command a second time, the search works and shows me all the files I am interested in.
This seems to be a very intermittent issue and I have no idea what I should check.
Any idea what could cause that? Could it be a problem with find setting parameters for the grep command? Is it a grep problem? Could it be related to the large number of files we search at a given time?
Thanks.

Unix/Bash: Redirect results of find command so files are used as input for other command

I've got a directory structure that contains many different files named foo.sql. I want to be able to cd into this directory & issue a command like the following:
find . -name "foo.sql" -exec mysql -uUserName -pUserPasswd < {} \;
where {} is the relative path to each foo.sql file. Basically, I want:
mysql -uUserName -pUserPasswd < path/to/foo.sql
to be run once for each foo.sql file under my subdirectory. I've tried Google & it hasn't been much help. Ideally this would be part of a UNIX shell script.
Thanks in advance, & sorry if it's been asked before.
The -exec option doesn't run a shell, so it can't process shell operators like redirection. Try this:
find . -name "foo.sql" -exec cat {} + | mysql -uUserName -pUserPasswd
cat {} will write the contents of all the files to the pipe, which will then be read by mysql.
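If you really did need a separate mysql invocation per file, as the question literally asks, you would have to start a shell yourself so something can interpret the redirection (a sketch along the same lines, untested):
find . -name "foo.sql" -exec sh -c 'mysql -uUserName -pUserPasswd < "$1"' sh {} \;
The cat approach above is usually simpler, though.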
Or, just to point out another approach:
find . | xargs cat | mysql etcetera
xargs is a generic pipe operation roughly equivalent to find's '-exec'. It has some advantages, some disadvantages, depending on what you're doing. I tend to use it because I'm often filtering the list of found files in an earlier pipeline stage before operating on them.
There are also other ways of assembling such command lines. One nice thing about Unix's generic toolkits is that there are usually multiple solutions, each with its own tradeoffs.
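For instance, the xargs sketch above, filled in with the -name filter and credentials from the question, would look roughly like this (it assumes no file names contain spaces or newlines):
find . -name "foo.sql" | xargs cat | mysql -uUserName -pUserPasswd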

grep recursion - inconsistencies

I'm looking for a particular string (mainly in .c files) recursively from the root directory.
When I use this, I get a list back almost immediately.
grep -rl "F_capture" .
However, if I try to speed things up by just searching .c files:
grep -r --include=*.c "F_capture" .
I end up with a slew of recursive directory warnings like this:
grep: warning: ./sys/block/fd0/device/bus/drivers/i8042/i8042/serio1/input:event1/subsystem/event3/device/bus/drivers/pcnet32/0000:00:03.0/subsystem: recursive directory loop
When I try suppressing the warnings using the -s option, I don't get the warnings, but I don't get anything back either - it seems like it's going off into never-never land.
grep -rsl --include="*.c" "F_capture" .
So I guess my question is: why does the first grep return something almost immediately, while the other forms, where I'm targeting a specific type of file, seem to hang? I would think the targeted search would be faster.
You can try this:
find . -type f -name "*.c" -print | xargs grep -l "F_capture"
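If any of the paths might contain spaces, a null-delimited variant of the same pipeline (the same trick used in the find/xargs answer at the top of this page) should be safer:
find . -type f -name "*.c" -print0 | xargs -0 grep -l "F_capture"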
Try the following command:
grep "F_capture" `find -type f -name "*.c"`

Find in Linux combined with a search to return a particular line

I'm trying to return a particular line from files found from this search:
find . -name "database.php"
Each of these files contains a database name, next to a php variable like $dbname=
I've been trying to use -exec to execute a grep search on this file with no success
-exec "grep {\}\ dbname"
Can anyone provide me with some understanding of how to accomplish this task?
I'm running CentOS 5, and there are about 100 database.php files stored in subdirectories on my server.
Thanks
Jason
You have the arguments to grep inverted, and you need them as separate arguments:
find . -name "database.php" -exec grep '$dbname' /dev/null {} +
The presence of /dev/null ensures that the file name(s) that match are listed as well as the lines that match.
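Alternatively, if you have GNU grep, the -H option forces the matching file name to be printed even when only a single file ends up on the command line, so it can stand in for the /dev/null trick:
find . -name "database.php" -exec grep -H '$dbname' {} +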
I think this will do it. Not sure if you need to make any adjustments for CentOS.
find . -name "database.php" -exec grep dbname {} \;
I worked it out using xargs
find . -name "database.php" -print | xargs grep \'database\'\=\> > list_of_databases
Feel free to post a better way if you find one (or want some rep for a good answer)
I tend to habitually avoid find because I've never learned how to use it properly, so the way I'd accomplish your task would be:
grep dbname **/database.php
Edit: This command won't be viable in all cases because it can potentially generate a very long argument list, whereas find executes its command on found files one by one like xargs. And, as I noted in my comment, it's possibly not very portable. But it's damn short ;)
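One caveat worth adding: in bash, ** only recurses into subdirectories when the globstar shell option is enabled (available since bash 4); without it, ** behaves like a single *. So the full recipe would be:
shopt -s globstar
grep dbname **/database.php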

Exclude .svn directories from grep [duplicate]

This question already has answers here:
How can I exclude directories from grep -R?
(14 answers)
Closed 6 years ago.
When I grep my Subversion working copy directory, the results include a lot of files from the .svn directories. Is it possible to recursively grep a directory, but exclude all results from .svn directories?
If you have GNU Grep, it should work like this:
grep --exclude-dir=".svn"
If you happen to be on a Unix system without GNU grep, try the following:
grep -R "whatever you like" * | grep -v "\.svn/*"
For grep >=2.5.1a
You can put this into your environment (e.g. .bashrc)
export GREP_OPTIONS='--exclude-dir=".svn"'
PS: thanks to Adrinan, who pointed out the extra quotes in my version; without them:
export GREP_OPTIONS='--exclude-dir=.svn'
PPS: This env option is marked for deprecation: https://www.gnu.org/software/grep/manual/html_node/Environment-Variables.html "As this causes problems when writing portable scripts, this feature will be removed in a future release of grep, and grep warns if it is used. Please use an alias or script instead."
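Following that advice, the equivalent alias would be something like:
alias grep='grep --exclude-dir=.svn'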
If you use ack (a 'better grep') it will handle this automatically (and do a lot of other clever things too!). It's well worth checking out.
psychoschlumpf is correct, but it only works if you have the latest version of grep. Earlier versions do not have the --exclude-dir option. However, if you have a very large codebase, double-grepping can take forever. Drop this in your .bashrc for a portable .svn-less grep:
alias sgrep='find . -path "*/.svn" -prune -o -print0 | xargs -0 grep'
Now you can do this:
sgrep some_var
... and get expected results.
Of course, if you're an insane person like me who just has to use the same .bashrc everywhere, you could spend 4 hours writing an overcomplicated bash function to put there instead. Or, you could just wait for an insane person like me to post it online:
http://gist.github.com/573928
grep --exclude-dir=".svn"
works because the name ".svn" is rather unique. But this might fail on a more generalized name.
grep --exclude-dir="work"
is not bulletproof: if you have "/home/user/work" and "/home/user/stuff/work", it will skip both. It is not possible to define something like "/*/work/*" to restrict the exclusion to only the former folder.
As far as I could experiment, in GNU grep the simple --exclude won't exclude directories.
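If you really do need to skip only one specific directory, a find-based prune (the same pattern used in the sgrep aliases elsewhere on this page) can match on the full path; a sketch using the example paths above:
find /home/user -path /home/user/work -prune -o -type f -print0 | xargs -0 grep "whatever you like"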
On my GNU grep 2.5, --exclude-dirs is not a valid option. As an alternative, this worked well for me:
grep --exclude="*.svn-base"
This should be a better solution than excluding all lines which contain .svn/ since it wouldn't accidentally filter out such lines in a real file.
Two greps will do the trick:
The first grep will get everything.
The second grep will use the output of the first grep as input (via piping). By using the -v flag, grep will select the lines which DON'T match the search terms. Voila. You are left with all the output from the first grep which does not contain .svn in the file path.
-v, --invert-match
Invert the sense of matching, to select non-matching lines.
grep the_text_you_want_to_search_for * | grep -v .svn
I tried double-grepping on my huge code base and it took forever, so I got this solution with the help of my co-worker.
Pruning is much faster, as it stops find from descending into those directories at all, whereas 'grep -v' processes everything and only excludes results from the display:
find . -name .svn -prune -o -type f -print0 | xargs -0 egrep 'YOUR STRING'
You can also alias this command in your .bashrc as
alias sgrep='find . \( -name .svn -o -name build \) -prune -o -type f -print0 | xargs -0 egrep '
Now simply use
sgrep 'whatever'
Another option, albeit one that may not be perceived as an acceptable answer, is to clone the repo into git and use git grep.
Rarely, I run into svn repositories that are so massive that it's just impractical to clone via git-svn. In these rare cases I use a double-grep solution, svngrep, but as many answers here indicate, this can be slow on large repositories and can wrongly filter out '.svn' occurrences that aren't directories. I would argue those would be extremely seldom, though...
Also regarding slow performance of multiple greps, once you've used something like git, pretty much everything seems slow in svn!
One last thing: my variation of svngrep passes through colorization; beware, the implementation is ugly! Roughly: grep -rn "$what" $where | egrep -v "$ignore" | grep --color "$what"
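As a rough illustration only (this is not the actual svngrep source; the function name and the ignore pattern are placeholders), such a wrapper might look like:
svngrep() {
    what=$1
    where=${2:-.}
    ignore='\.svn/'
    grep -rn "$what" "$where" | egrep -v "$ignore" | grep --color "$what"
}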
For grep version 2.5.1 you can add multiple --exclude items to filter out the .svn files.
$ grep -V | grep grep
grep (GNU grep) 2.5.1
GREP_OPTIONS="--exclude=*.svn-base --exclude=entries --exclude=all-wcprops" grep -l -R whatever ./
I think the --exclude option for recursive searches is what you are looking for.
