Trying to write my first compound Linux query and running into some gaps in knowledge.
The idea is to find all files that are either .doc or .txt, and also search their contents for the text clown.
So I started off with searching from the root as such.
$find /
Then I added the wildcard for the filename.
$find / -name '*.doc'...uhh oh
First question: how do I specify "or"? Is it with pipe |, double pipe ||, or something else? And do I need to repeat the -name parameter, like this?
$find / -name '*.doc' || -name '*.txt'
Second question: do I add the grep for the string after, or before, or...?
$find / -name '*.doc' || -name '*.txt' grep -H 'clown' {} \
Finally is there a place where I can validate syntax / run like SQLFiddle?
TIA
'Or' in find is -o.
You have to specify the -name test again, though. So something like:
find / -name '*.doc' -o -name '*.txt'
(Quote the patterns so the shell doesn't expand them before find sees them.)
You can simply put your grep command in front, so long as you encase your find command in backticks (note this breaks on filenames containing spaces):
grep 'whatever' `find / -name '*.doc' -o -name '*.txt'`
You want something like this:
find / \( -name \*.doc -o -name \*.txt \) -exec grep clown {} \; -print
You specify "or" with -o inside \( \), you run grep via -exec, and you can validate the syntax in any bash shell.
Try:
(find ./ -name "*.txt" -print0 2>/dev/null ; find ./ -name "*.doc" -print0 2>/dev/null) | xargs -0 grep clown
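For a quick way to see the grouped form in action, the sketch below builds a throwaway directory and runs the combined find + grep; the file names and contents are illustrative, not from the question:

```shell
# Build a tiny fixture tree, then run the grouped find + grep.
tmp=$(mktemp -d)
printf 'a clown appears\n' > "$tmp/a.txt"
printf 'no match here\n'   > "$tmp/b.doc"
printf 'clown again\n'     > "$tmp/c.log"   # wrong extension, must be skipped

# The \( ... \) grouping makes -exec apply to both -name tests.
find "$tmp" \( -name '*.doc' -o -name '*.txt' \) -exec grep -l clown {} \;
```

grep -l prints just the names of matching files; drop -l to see the matching lines themselves.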
I'm looking for a proper way to list all filenames (without extension) matching a specific extension list, recursively in a specific folder, with some exclusion patterns, and then export that to a file.
Currently I'm doing the following, which works properly:
ls -R --ignore={"Sample","Sample.*","sample.*","*_sample.*","*.sample.*","*-sample.*","*.sample-*","*-sample-*","*trailer]*"} "$filesSource" | grep -E '\.mkv$|\.mp4$|\.avi$' | sed -e 's/\(.*\)/\L\1/' | sort >> "$listFile"
Thanks to ShellChecker, I have a feedback on this line and I don't know how to do that properly!
Thanks for your help!
Why don't you try the find command? Something like:
find YOUR_PATH -type f \( -name "*.FIRST_EXTENSION" -o -name "*.SECOND_EXTENSION" \) | grep -v SOME_EXCLUSION | awk -F. '{print $(NF-1)}' | sort > SOME_FILE
(The \( \) grouping matters: without it, -type f applies only to the first -name test.)
Note: the awk part only works if the filenames contain a single "." character, the one before the extension; otherwise you need to modify it a little.
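If the filenames may contain more than one dot, shell parameter expansion strips only the final extension and avoids the awk caveat entirely; a small sketch of that alternative (the sample name is made up):

```shell
# "${name%.*}" removes the shortest trailing ".something", so extra
# dots earlier in the name survive.
name="my.movie.2019.mkv"
base="${name%.*}"
echo "$base"   # my.movie.2019
```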
If you are searching just on filenames, then you can use:
I split the command line in multiple lines:
$ find /path/to/folder -type f \( \( -name '*.ext1' -or -name '*.ext2' -or -name '*.ext3' \) -and -not \( -name '*excl1*' -or -name 'excl2*' \) \) -print
This will do:
/path/to/folder: the folder you are searching
-type f : you are searching for files in the above folder which satisfy
\(: open the conditional test
\( -name '*.ext1' -or -name '*.ext2' -or -name '*.ext3' \): who have one of the three listed extensions (with a conditional or)
-and -not \( -name '*excl1*' -or -name 'excl2*' \): if the above condition matches, it also checks (-and) that neither of the patterns *excl1* or excl2* matches (-not)
\) close the main conditional test
-print perform the action to print the found paths.
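The expression above can be checked quickly on a throwaway folder; a minimal sketch with placeholder extensions and exclusion patterns:

```shell
# Fixture: two files that should match, one caught by the exclusion.
tmp=$(mktemp -d)
touch "$tmp/keep.ext1" "$tmp/keep.ext2" "$tmp/skip_excl1_file.ext1"

find "$tmp" -type f \( \( -name '*.ext1' -o -name '*.ext2' \) \
    -a -not -name '*excl1*' \) -print | sort
```

Only the two keep.* files are printed; the file matching *excl1* is filtered out by the -not side of the test.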
How should this be fixed? I am following a tutorial but I receive this error:
$ find ~/Desktop -name “*.jpg” -o -name “*.gif” -o -name “*.png” -print0 | xargs -0 mv –target-directory ~/Pictures
mv: cannot stat `–target-directory': No such file or directory
*I am interested in how to perform this command using xargs!
Using find and exec
$ find ~/Desktop -name "*.jpg" -exec mv '{}' /tmp/target/ \; -or -name "*.gif" -exec mv '{}' /tmp/target/ \; -or -name "*.png" -exec mv '{}' /tmp/target/ \;
Using xargs (note this version breaks on filenames containing whitespace; use -print0 with xargs -0 for those)
$ find ~/Desktop -name "*.jpg" -or -name "*.gif" -or -name "*.png" | xargs -I SRCFILE mv SRCFILE /tmp/target/
You don't need to use xargs, find can execute commands on the matches:
find ~/Desktop \( -name '*.jpg' -o -name '*.gif' -o -name '*.png' \) -exec mv \{\} ~/Pictures \;
You can give a command after -exec, terminated by the escaped semicolon \;. The \{\} is replaced with the matching file name. The \( \) grouping is needed so that -exec applies to all three -name tests, not just the last one.
From man find:
-exec command ;
Execute command; true if 0 status is returned. All following arguments to find are taken to be arguments to the command until an argument consisting of ';' is encountered. The string '{}' is replaced by the current file name being processed everywhere it occurs in the arguments to the command, not just in arguments where it is alone, as in some versions of find. Both of these constructions might need to be escaped (with a '\') or quoted to protect them from expansion by the shell. See the EXAMPLES section for examples of the use of the -exec option. The specified command is run once for each matched file. The command is executed in the starting directory. There are unavoidable security problems surrounding use of the -exec action; you should use the -execdir option instead.
Notice that the semicolon must be escaped (and {} quoted or escaped in some shells).
Also, the `–target-directory` in your error message starts with an en-dash; it should be --target-directory (two ASCII hyphens), or just -t. The curly quotes “*.jpg” copied from the tutorial should likewise be plain ASCII quotes: '*.jpg'.
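Putting those fixes together, an xargs version with plain ASCII quotes and dashes might look like the sketch below; the directories are throwaway placeholders, and --target-directory is a GNU mv option:

```shell
# Fixture directories stand in for ~/Desktop and ~/Pictures.
src=$(mktemp -d); dst=$(mktemp -d)
touch "$src/a.jpg" "$src/b c.png"   # a name with a space, on purpose

# Group the -name tests, and use -print0 / xargs -0 so the space survives.
find "$src" \( -name '*.jpg' -o -name '*.gif' -o -name '*.png' \) -print0 |
    xargs -0 mv --target-directory="$dst"
```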
I need to find all the iplanets on one server and I was thinking to use this command:
find / -type d -name 'https-*' | uniq
But at the same time I need to ignore some directories/files. I've been trying to use !, but it doesn't always work. I have a command like this:
find / -type d -name 'https-*' ! -name 'https-admserv*' ! -name 'conf_bk*' ! -name 'alias*' ! -name '*db*' ! -name 'ClassCache*' | uniq
I need to ignore all of that: the directories admserv, conf_bk, alias and tmp, and the files *.db*.
Basically I need find this:
/opt/mw/iplanet/https-daniel.com
/opt/https-daniel1.com
/apps/https-daniel2.com
I only need to find the directory name. How can I ignore all the other stuff?
Use -prune to keep from recursing into directories:
find / -type d \( -name 'https-admserv*' -o -name 'conf_bk*' -o -name 'alias*' -o -name 'tmp' \) -prune -o -type d -name 'https-*' -print
There's no need to ignore any files. You're only selecting https-* directories, so everything else is ignored.
And there's no need to pipe to uniq, since find never produces duplicates.
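A quick sketch of how -prune behaves, on a throwaway tree modeled after the question; note that the https-* directory nested under the pruned one is never even visited:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/https-daniel.com" \
         "$tmp/https-admserv1/https-nested"   # nested match inside a pruned dir

# The pruned branch evaluates to true, so -o skips the -print side for it,
# and find never descends into it.
find "$tmp" -type d -name 'https-admserv*' -prune -o \
    -type d -name 'https-*' -print
```

Only the https-daniel.com directory is printed.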
Someone created directories with names like source.c. I am doing a find over all the directories in a tree. I do want find to search in the source.c directory, but I do not want source.c to be passed to the grep I am doing on what is found.
How can I make find not pass directory names to grep? Here is what my command line looks like:
find sources* \( -name "*.h" -o -name "*.cpp" -o -name "*.c" \) -exec grep -Hi -e "ThingToFind" {} \;
Add -a -type f to your find command. This will force find to only output files, not directories. (It will still search directories):
find sources* \( -name "*.h" -o -name "*.cpp" -o -name "*.c" \) -a -type f -exec grep -Hi -e "ThingToFind" {} \;
If I have a list of filenames in a text file that I want to exclude when I run find, how can I do that? For example, I want to do something like:
find /dir -name "*.gz" -exclude_from skip_files
and get all the .gz files in /dir except for the files listed in skip_files. But find has no -exclude_from flag. How can I skip all the files in skip_files?
I don't think find has an option like this, you could build a command using printf and your exclude list:
find /dir -name "*.gz" $(printf "! -name %s " $(cat skip_files))
Which is the same as doing:
find /dir -name "*.gz" ! -name first_skip ! -name second_skip .... etc
Alternatively you can pipe from find into grep:
find /dir -name "*.gz" | grep -vFf skip_files
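One caveat with the grep pipe: -F still matches substrings anywhere in the line, so a short entry in skip_files can knock out longer paths that merely contain it; adding -x restricts matching to whole lines. A sketch with made-up names:

```shell
tmp=$(mktemp -d)
touch "$tmp/a.gz" "$tmp/data.gz"
printf '%s\n' "$tmp/a.gz" > "$tmp/skip"   # skip list holds full paths

# -x: a line is excluded only if it equals a skip entry exactly.
find "$tmp" -name '*.gz' | grep -vxFf "$tmp/skip"
```

Only data.gz survives the filter; a.gz is excluded by its exact-match skip entry.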
This is what I usually do to exclude some files from the result (in this case I looked for all text files but wasn't interested in a bunch of Valgrind memcheck reports we have here and there):
find . -type f -name '*.txt' ! -name '*mem*.txt'
It seems to work.
You can try something like:
find /dir \( -name "*.gz" ! -name skip_file1 ! -name skip_file2 ... \)
find /var/www/test/ -type f \( -iname "*.*" ! -iname "*.php" ! -iname "*.jpg" ! -iname "*.png" \)
The above command gives a list of all files excluding those with .php, .jpg and .png extensions. This command works for me in PuTTY.
Josh Jolly's grep solution works, but has O(N**2) complexity, making it too slow for long lists. If the lists are sorted first (O(N*log(N)) complexity), you can use comm, which has O(N) complexity:
find /dir -name '*.gz' |sort >everything_sorted
sort skip_files >skip_files_sorted
comm -23 everything_sorted skip_files_sorted | xargs . . . etc
See man comm on your system for details.
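A minimal sketch of the comm step with made-up lists; comm -23 keeps the lines unique to the first (sorted) file:

```shell
tmp=$(mktemp -d)
printf '%s\n' a.gz b.gz c.gz | sort > "$tmp/everything_sorted"
printf '%s\n' b.gz           | sort > "$tmp/skip_files_sorted"

# -2 suppresses lines only in the skip list, -3 suppresses lines in both,
# leaving just the entries that are not being skipped.
comm -23 "$tmp/everything_sorted" "$tmp/skip_files_sorted"
```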
This solution will go through all files (not exactly excluding from the find command), but will produce an output skipping files from a list of exclusions.
I found that useful while running a time-consuming command (find /dir -exec md5sum {} \;).
You can create a shell script to handle the skipping logic and run commands on the files found (make it executable with chmod, replace echo with other commands):
$ cat skip_file.sh
#!/bin/bash
found=$(grep "^$1$" files_to_skip.txt)
if [ -z "$found" ]; then
  # run your command on the file
  echo "$1"
fi
Create a file with the list of files to skip named files_to_skip.txt (on the dir you are running from).
Then use find using it:
find /dir -name "*.gz" -exec ./skip_file.sh {} \;
This should work:
find * -name "*.gz" $(printf "! -path %s " $(<skip_files.txt))
Working out
Assuming skip_files has a filename on each line, you can get the list of filenames via $(<skip_files.txt). E.g. echo $(<skip_files.txt) should print them all out.
For each filename you want to have a ! -path filename expression. To build this, use $(printf "! -path %s " $(<skip_files.txt))
Then, put it together with a filter on -name "*.gz"
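To see what the printf expansion actually hands to find, you can echo the assembled expression first; a sketch with hypothetical skip entries (this relies on word splitting, so it breaks if a listed path contains spaces):

```shell
tmp=$(mktemp -d)
printf '%s\n' './skip1.gz' './skip2.gz' > "$tmp/skip_files.txt"

# Each line of the skip list becomes one "! -path NAME" pair.
expr=$(printf '! -path %s ' $(cat "$tmp/skip_files.txt"))
echo "$expr"
```

The assembled string reads "! -path ./skip1.gz ! -path ./skip2.gz", which is exactly what the find command receives after expansion.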