Grep a string recursively in all .htaccess files - linux

How can I grep for a certain string recursively in all .htaccess files?
grep -r -n -H -I 'string' .htaccess
doesn't seem to work.
I'm on a GNU Linux system.

cd to a directory above the folders that contain the .htaccess files, then run:
$ find . -name ".htaccess" -exec grep -n -H -I 'string' {} \;

Use the --include option:
grep -rnHI 'pattern' --include=.htaccess .

You can also tell find to look only for regular files:
$ find /usr/local -type f -name ".htaccess" -exec grep -nHI 'pattern' {} \;
You can specify where the search should begin. In this example, find will look in all directories and subdirectories under /usr/local.
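If your find supports the {} + terminator (GNU and any POSIX-2008 find does), a small variation of the same command, shown here as a sketch with the same placeholder pattern, hands many files to a single grep instead of spawning one grep per file:
find /usr/local -type f -name ".htaccess" -exec grep -nHI 'pattern' {} +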

Related

How to ignore directories and certain patterns when supplying command-line arguments

I'm a newbie at Linux, and I am wondering if there's a 'one-liner' command that allows me to link everything in a directory to another directory, but ignoring subdirectories and certain wildcards from the source directory.
Let's be more specific...let's say I want to link everything in /foo to /bar/tmp as in...
ln -s /foo/* /bar/tmp/.
...but I want to:
ignore any subdirectories in /foo
ignore any files matching the wildcard runscript*
Any suggestions on how to do this?
You could use find like this
find /foo -maxdepth 1 -type f ! -name 'runscript*' -exec ln -s {} /bar/tmp/ \;
Something like this could do the trick:
cd /bar/tmp
find /foo -maxdepth 1 -a -type f -a \! -name 'runscript*' |
while IFS= read -r file; do
    ln -s "$file"
done
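If the filenames might contain leading whitespace or newlines, a null-delimited variant of the same idea is more robust (a sketch assuming GNU find and bash):
cd /bar/tmp
find /foo -maxdepth 1 -type f ! -name 'runscript*' -print0 |
while IFS= read -r -d '' file; do
    ln -s "$file"
done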

Linux script Loop

I want to create a loop in a Linux script that will go through the folders in one directory, copy the photos to one folder, and overwrite photos that have the same name. Can anyone point me in the right direction?
find /path/to/source -type f -exec cp -f {} /path/to/destination \;
Would that work? Keep in mind that it will overwrite files without asking.
If you want it to confirm with you before overwriting, use the -i flag (interactive mode) with the cp command.
find /path/to/source -type f | xargs -I {} file {} | grep <JPEG or image type> | cut -d ":" -f1 | xargs -I {} cp -rf {} /path/to/destination
With this you can fine-tune your copy by selecting only the image type.
Actually, you don't need to loop through folders to find the photos in a script; the find command will do that job for you.
Try using find with xargs and cp:
find source_dir -type f -iname '*.jpg' -print0 | xargs -0 -I {} cp -f {} dest_dir
Replace *.jpg with the format of your photo files (e.g. *.png, etc.).
Note the use of cp's -f option, since you want to overwrite photos with the same name.
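If you specifically want the loop form you asked about, here is a minimal sketch (assuming bash 4+ for globstar, and that the photos are *.jpg under a hypothetical /path/to/source):
#!/bin/bash
shopt -s globstar nullglob
for photo in /path/to/source/**/*.jpg; do
    cp -f "$photo" /path/to/destination/   # -f overwrites files with the same name
done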

rename folders with case-sensitive names

I want to rename a folder, matching its name case-insensitively. For example:
mv "foldername1" "foldername2"
This command gives an error, because the name of my folder isn't "foldername1", it is "FolderName1".
How can I use mv to rename the folder without having to match its case exactly?
ls | grep -i foldername1 | xargs -I {} mv {} foldername2
Warning: you would not want to use this when there are multiple candidates (e.g. if both foldername1 and Foldername1 are present).
Use the find command:
find . -iname foldername1 -exec mv '{}' foldername2 \;
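If you want to be sure that only a single match gets renamed, you can also resolve the real name first (a sketch assuming GNU find and at most one candidate in the current directory):
actual=$(find . -maxdepth 1 -iname 'foldername1' -print -quit)   # first case-insensitive match only
[ -n "$actual" ] && mv "$actual" foldername2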

How to list specific type of files in recursive directories in shell?

How can we find specific types of files, i.e. doc and pdf files, present in nested directories?
The command I tried:
$ ls -R | grep .doc
but if there is a file named alok.doc.txt, the command will display that too, which is obviously not what I want. What command should I use instead?
If you are more comfortable with ls and grep, you can do what you want using a regular expression in the grep command (the ending '$' character indicates that .doc must be at the end of the line, which excludes "file.doc.txt"):
ls -R |grep "\.doc$"
There is more information about using grep with regular expressions in the man page.
The output of ls is mainly intended to be read by humans. For automated processing, you should use the more powerful find command:
find /path -type f \( -iname "*.doc" -o -iname "*.pdf" \)
Or, if you have bash 4.0+:
#!/bin/bash
shopt -s globstar
shopt -s nullglob
for file in **/*.{pdf,doc}; do
    echo "$file"
done
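If the extensions may also appear in upper case (.PDF, .DOC), the same script can be made case-insensitive with bash's nocaseglob option (a sketch; the behaviour is otherwise unchanged):
#!/bin/bash
shopt -s globstar nullglob nocaseglob
for file in **/*.{pdf,doc}; do
    echo "$file"
done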
find . | grep "\.doc$"
This will show the path as well.
Some other methods that can be used (note that these only look in the current directory, not recursively):
echo *.{pdf,docx,jpeg}
stat -c %n * | grep 'pdf\|docx\|jpeg'
We had a similar question. We wanted a list - with paths - of all the config files in the etc directory. This worked:
find /etc -type f \( -iname "*.conf" \)
It gives a nice list of all the .conf files with their paths. The output looks like:
/etc/conf/server.conf
But we wanted to DO something with ALL those files, like grepping them for a word or setting. So we use
find /etc -type f \( -iname "*.conf" \) -print0 | xargs -0 grep -Hi "ServerName"
to grep ALL the config files in /etc that contain a setting like "ServerName". The output looks like:
/etc/conf/server.conf: ServerName "default-118_11_170_172"
Hope you find it useful.
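With GNU grep the same search also works without find, using --include to limit the recursion to *.conf files (a sketch with the same example setting name):
grep -rHi --include='*.conf' 'ServerName' /etc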
Similarly, if you prefer using the wildcard character * (rather than the regex suggestions above), you can just use ls with both the -l flag, to list one file per line (like grep), and the -R flag, as you had. Then you can specify the files you want to search for with *.doc.
That is, either
ls -l -R *.doc
or, if you want the files listed on fewer lines,
ls -R *.doc
If you have files with extensions that don't match the file type, you could use the file utility.
find $PWD -type f -exec file -N \{\} \; | grep "PDF document" | awk -F: '{print $1}'
Instead of $PWD you can use the directory you want to start the search in. file even prints out the PDF version.
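A variant of the same idea keys off file's MIME type instead of its free-text description (a sketch; like the command above, it assumes there are no colons in the file names, since the colon is used as the field separator):
find "$PWD" -type f -exec file -N --mime-type {} + | grep 'application/pdf$' | cut -d: -f1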

Linux command for removing all ~ files

What command can I use in Linux to check if there is a file in a given directory (or its subdirectories) that has a ~ at the end of its name?
For example, if I'm in a directory called t which contains many subdirectories, I would like to remove all files that end with a ~.
Watch out for filenames with spaces in them!
find ./ -name "*~" -type f -print0 | xargs -0 rm
With GNU find:
find /path -type f -name "*~" -exec rm {} +
or
find /path -type f -name "*~" -delete
find ./ -name '*~' -print0 | xargs -0 rm -f
Here find will search the directory ./ and all subdirectories, filtering for filenames that match the glob '*~' and printing them (with proper quoting, courtesy of alberge). The results are passed to xargs, which appends them to rm -f and runs the resulting command. You can use multiple paths, and there are many other filters available (just read man find).
You can use a find, grep, rm combination, something like:
find | grep "~" | xargs rm -f
Probably others have better ideas :)
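Since rm through a pipe is easy to get wrong (the bare "~" above matches a tilde anywhere in the path, not just at the end), it is worth doing a dry run first; a cautious sketch with GNU find:
find . -type f -name '*~' -print    # dry run: review what would be removed
find . -type f -name '*~' -delete   # then actually delete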
