list all xml files within a directory and subdirectory - linux

I want to list all .xml files in a directory and its subdirectories. I tried ls -LR but was not able to filter out files other than .xml.
I want something like ls -LR | grep *.xml .
Thanks!

You can use the find command:
find . -type f -name '*.xml'
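If you then need to feed the matches to another command, a null-delimited pipeline is the safest way to handle names containing spaces or newlines (a minimal sketch, assuming GNU find and xargs):
find . -type f -name '*.xml' -print0 | xargs -0 ls -l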

Since you tagged the question with bash:
shopt -s extglob globstar
ls !(exclude.this.dir)/**/*.xml

Related

List through all files and find a specific file location

I want to go through the .xml files in a folder. For that I did:
find . -name *.xml
The result is a list of file locations. For each one I want to cat the file and grep for the text test; if test is present, print the file location, otherwise skip it.
To begin with I tried:
find . -name *.xml | xargs cat * | grep test
but this prints the matching line, not the file location. I tried the -b and -l options with grep to get the file location, but it doesn't work.
And cat only prints the file at the given location; it doesn't descend into subdirectories.
Try this:
find . -name '*.xml' -exec grep -l test {} +
This will execute grep -l test on all files found by find; quoting the pattern keeps the shell from expanding *.xml before find sees it.
You can use the globstar shell option to enable subdirectory globbing:
shopt -s globstar
grep -l 'test' **/*.xml
When globstar is enabled, ** matches "all files and zero or more subdirectories" (see the manual).
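If you would rather not change options in your interactive shell, a subshell keeps them local (a small sketch running the same search as above):
( shopt -s globstar; grep -l 'test' **/*.xml )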

How do I selectively create symbolic links to specific files in another directory in LINUX?

I'm not exactly sure how to go about doing this, but I need to create symbolic links for certain files in one directory and place the symbolic links in another directory.
For instance, I want to link all files in the directory bar1 that have the word "foo" in their names and do not have the extension ".cc", and place the symbolic links in a directory bar2.
I was wondering if there was a single-line command that could accomplish this in Linux bash.
Assuming you are in a directory that contains directories bar1 and bar2:
find bar1 -name '*foo*' -not -type d -not -name '*.cc' -exec ln -s "$PWD"/{} bar2/ \;
Try this:
cd bar1
find . -maxdepth 1 -name '*foo*' -not -name '*.cc' -exec echo ln -s "$PWD"/{} ../bar2 \;
Once you are satisfied with the dry run, remove echo from the command and run it for real.
This is easily handled with extended globbing:
shopt -s extglob
cd bar2
ln -s ../bar1/foo!(*.cc) .
If you really want it all on one line, just use the command separator:
shopt -s extglob; cd bar2; ln -s ../bar1/foo!(*.cc) .
The two examples are identical, but the first is much easier to read.
This technically doesn't count as a one-line answer... but it can be pasted in a single go and should do what you are looking for.
list=$(ls | grep foo | grep -v '\.cc$'); for file in $list; do ln -s "$PWD/$file" /bar2/; done
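If extglob is not available, a plain glob loop avoids parsing ls output and copes with spaces in file names (a sketch, assuming you run it from the directory that contains bar1 and bar2):
for f in bar1/*foo*; do
  [[ -d $f || $f == *.cc ]] && continue   # skip directories and .cc files
  ln -s "$PWD/$f" bar2/
done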

bash shell: how to rename files

I just want to rename all the *.a files to *.a.b in the current directory and its subdirectories. How do I do that in a shell script?
find . -type f -name '*.a' -print0 | xargs -0 -IZZ mv ZZ ZZ.b
This should handle filenames with spaces and/or newlines. It also doesn't rename directories (the other find-based solution would). If you want it to be case-insensitive, use -iname instead of -name.
Ruby (1.9+)
$ ruby -e 'Dir["**/*.a"].each{|x|File.file?x && File.rename(x,"#{x}.b")}'
In a shell script (at least Bash 4)
shopt -s globstar
shopt -s nullglob
for file in **/*.a
do
echo mv "${file}" "${file}.b"
done
Remove the echo once the output looks correct.
With the Perl-based rename utility, rename 's/\.a$/.a.b/' <files> does the job for the given files. Doing it recursively will just take a bit of looping (or use *, */*, */*/*, */*/*/*, etc. for the files).
Try the script below (note that it will break on filenames containing whitespace):
for file in $(find . -name '*.a'); do mv "$file" "$file.b"; done
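For a variant that is safe with any file name (spaces, newlines), find can hand the matches straight to a small inline shell (a sketch using -exec ... +):
find . -type f -name '*.a' -exec sh -c 'for f; do mv "$f" "$f.b"; done' _ {} +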

How to list specific type of files in recursive directories in shell?

How can we find specific types of files, i.e. .doc and .pdf files, present in nested directories?
The command I tried:
$ ls -R | grep .doc
but if there is a file named alok.doc.txt, the command will display that too, which is obviously not what I want. What command should I use instead?
If you are more comfortable with ls and grep, you can do what you want with a regular expression in the grep command (the trailing '$' anchors .doc to the end of the line, which excludes "file.doc.txt"):
ls -R | grep "\.doc$"
There is more information about using grep with regular expressions in the man page.
The output of ls is mainly intended to be read by humans. For automated processing, you should use the more powerful find command:
find /path -type f \( -iname "*.doc" -o -iname "*.pdf" \)
If you have Bash 4.0+:
#!/bin/bash
shopt -s globstar
shopt -s nullglob
for file in **/*.{pdf,doc}
do
echo "$file"
done
find . | grep "\.doc$"
This will show the path as well.
Some of the other methods that can be used:
echo *.{pdf,docx,jpeg}
stat -c %n * | grep 'pdf\|docx\|jpeg'
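Note that these two only look at the current directory and match the extension anywhere in the name; a sketch that makes the same idea recursive and matches only at the end of the name (assuming Bash 4+):
shopt -s globstar nullglob
printf '%s\n' **/*.{pdf,docx,jpeg}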
We had a similar question. We wanted a list - with paths - of all the config files in the /etc directory. This worked:
find /etc -type f \( -iname "*.conf" \)
It gives a nice list of all the .conf files with their paths. Output looks like:
/etc/conf/server.conf
But we wanted to DO something with ALL those files, like grep them for a word or setting. So we use
find /etc -type f \( -iname "*.conf" \) -print0 | xargs -0 grep -Hi "ServerName"
to grep ALL the config files in /etc for a setting like "ServerName". Output looks like:
/etc/conf/server.conf: ServerName "default-118_11_170_172"
Hope you find it useful.
Sid
Similarly, if you prefer using the wildcard character * (rather than the regex suggestions), you can just use ls with the -l flag to list one file per line (like grep) and the -R flag like you had, then specify the files you want with *.doc.
I.e. either
ls -l -R *.doc
or, if you want the files listed on fewer lines,
ls -R *.doc
If you have files with extensions that don't match the file type, you could use the file utility.
find "$PWD" -type f -exec file -N {} \; | grep "PDF document" | awk -F: '{print $1}'
Instead of $PWD you can use the directory you want to start the search in. file even prints out the PDF version.
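A variant that keys on the MIME type rather than the human-readable description (a sketch; file --mime-type reports e.g. application/pdf):
find . -type f -exec sh -c 'file --brief --mime-type "$1" | grep -q application/pdf && printf "%s\n" "$1"' _ {} \;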

Grep a string recursively in all .htaccess files

How can I grep for a certain string recursively in all .htaccess files?
grep -r -n -H -I 'string' .htaccess
doesn't seem to work.
I'm on a GNU Linux system.
cd to a directory above the ones that contain the .htaccess files, then:
$ find . -name ".htaccess" -exec grep -n -H -I 'string' {} \;
Use the --include option:
grep -rnHI 'pattern' --include=.htaccess .
You can also tell find to look for regular files only:
$ find /usr/local -type f -name ".htaccess" -exec grep -nHI 'pattern' {} \;
You can choose where the search begins; in this example, find looks into all directories and subdirectories under /usr/local.
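For large trees it can be faster to run grep once over all the files found rather than once per file (a sketch, null-delimited to be safe with unusual names):
find . -type f -name .htaccess -print0 | xargs -0 grep -nHI 'string'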
