How to get the number of files with a specific extension in a directory and its subdirectories on the Linux terminal? - linux

The question is self-explanatory.
I tried the following command, which I found somewhere on the internet, but it shows the number only for the immediate directory and not its subdirectories.
ls -lR ./*.jpg | wc -l
I am searching for all the files with the extension ".jpg" in the current folder and its subdirectories.

find . -type f -name '*.jpg' | wc -l
Find all the files (-type f) whose names match '*.jpg', then count them with wc.
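If you also want to catch upper- or mixed-case extensions such as .JPG, make the match case-insensitive (a sketch, assuming your find supports -iname, as GNU and BSD find do):
find . -type f -iname '*.jpg' | wc -l
-iname works like -name but ignores case when matching.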

It's a job for find:
find -name "*.jpg" | wc -l

Related

How do I retrieve all files in Linux for a given directory level? I want the absolute path as well as the filename

I'm trying to retrieve all files for a given directory level. The files retrieved should also show the absolute path.
git ls-files | cut -f1 | uniq | grep '.xml'
I expect the results to be a list of files in the directory level/depth 1.
Using find: the -maxdepth flag set to 1 limits the search to the top level, -type f lists only files, -iname matches names case-insensitively, -exec performs some action on the matched files, and readlink -f prints the full path of each file.
find . -maxdepth 1 -type f -iname "*.xml" -exec readlink -f {} \;
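An alternative sketch that avoids running readlink once per file: if you hand find an absolute starting point, the paths it prints are already absolute (here assuming $PWD is the directory of interest; unlike readlink -f, the paths are not canonicalized through symlinks):
find "$PWD" -maxdepth 1 -type f -iname "*.xml"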

How do I find the number of all .txt files in a directory and all subdirectories using specifically the find command and the wc command?

So far I have this:
find -name ".txt"
I'm not quite sure how to use wc to find out the exact number of files. When using the command above, all the .txt files show up, but I need the exact number of files with the .txt extension. Please don't suggest using other commands as I'd like to specifically use find and wc. Thanks
Try:
find . -name '*.txt' | wc -l
The -l option to wc tells it to return just the number of lines.
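For instance, wc -l simply counts newline-terminated lines, whatever they contain:
$ printf 'a\nb\nc\n' | wc -l
3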
Improvement (requires GNU find)
The above will give the wrong number if any .txt file name contains a newline character. This will work correctly with any file names:
find . -iname '*.txt' -printf '1\n' | wc -l
-printf '1\n' tells find to print just the line 1 for each file name found. This avoids problems with file names having difficult characters.
Example
Let's create two .txt files, one with a newline in its name:
$ mkdir -p dir1/dir2
$ touch dir1/dir2/a.txt $'dir1/dir2/b\nc.txt'
Now, let's run the find command:
$ find . -name '*.txt'
./dir1/dir2/b?c.txt
./dir1/dir2/a.txt
To count the files:
$ find . -name '*.txt' | wc -l
3
As you can see, the answer is off by one. The improved version, however, works correctly:
$ find . -iname '*.txt' -printf '1\n' | wc -l
2
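If your find lacks -printf (for example BSD/macOS find) but has -print0, a similar newline-safe count is possible by counting NUL separators instead of lines; this sketch only adds tr to the find-and-wc pipeline:
$ find . -name '*.txt' -print0 | tr -cd '\0' | wc -c
2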
find -type f -name "*.h" -mtime +10 -print | wc -l
This worked out; note that the -mtime +10 here additionally restricts the count to files modified more than 10 days ago.

Count only visible files in directory

I'm having a problem with a hidden file in my directory. If I use $(find . -type f | wc -l) it shows 8 files, which counts the hidden file too; there should be only 7 files.
Is there anything that could count only visible files?
Ignore the names that start with . by saying:
find . ! -name '.*' -type f | wc -l
From the man page:
! expression
-not expression
This is the unary NOT operator. It evaluates to true if the
expression is false.
If you have filenames with newlines, then you can do the following using GNU find (as suggested by gniourf gniourf in the comments):
find . -maxdepth 1 ! -name '.*' -type f -printf 'x' | wc -c
find . -type f -not -path '*/\.*' | wc -l
-not -path allows you to ignore any path component starting with . (hidden files, as well as files inside hidden directories)
Exclude all files starting with a dot (.):
find ./ ! -name '\.*' -type f | wc -l
! simply negates the search
If that doesn't work, then try this dirty-looking solution:
ls -lR | egrep '^(-|l|c|b|p|P|D|C|M|n|s)' | wc -l
This lists all types of files there, excluding directories.
You can look up the file type characters that ls uses in the Linux documentation.
Drop -R if you want to look only in the current directory.
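To see how the approaches above differ: the first command below counts files whose own name does not start with a dot but still descends into hidden directories, while the second also skips anything living under a hidden directory (a side-by-side sketch):
find . -type f ! -name '.*' | wc -l
find . -type f -not -path '*/.*' | wc -l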

Counting files in a directory in Linux

Q2. Write a script that takes a directory name as a command-line argument and displays the attributes of various files in it, e.g.
Regular Files
Total No of files
No of directories
Files allowing write permissions
Files allowing read permissions
Files allowing execute permissions
File having size 0
Hidden files in directory
I am working on a Linux shell script.
What I have done is:
find DIR_NAME -type f -print | wc -l
To count all files (including subdirs):
find /home/vivek -type f -print| wc -l
To count all dirs including subdirs:
find . -type d -print | wc -l
To only count files in given dir only (no subdir):
find /dest -maxdepth 1 -type f -print| wc -l
To only count dirs in given dir only (no subdir):
find /path/to/foo -maxdepth 1 -type d -print| wc -l
All your questions can be answered by looking into man find:
Regular Files: -type f
Total No of files: no option necessary
No of directories: -type d
Files allowing write permissions: -perm /u+w,g+w or some variation
Files allowing read permissions: -perm /u+r,g+r
Files allowing execute permissions: -perm /u+x,g+x
File having size 0: -size 0
Hidden files in directory: -name '.*'
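Putting those options together, a minimal script sketch for the assignment could look like the following (assuming bash, and GNU find for the -perm / syntax; the script name and labels are illustrative):
#!/bin/bash
# Usage: ./filestats.sh DIRECTORY
# Print simple counts of file attributes under the given directory.
dir=${1:?usage: $0 DIRECTORY}

echo "Regular files:    $(find "$dir" -type f | wc -l)"
echo "Total entries:    $(find "$dir" | wc -l)"
echo "Directories:      $(find "$dir" -type d | wc -l)"
echo "Writable files:   $(find "$dir" -type f -perm /u+w,g+w | wc -l)"
echo "Readable files:   $(find "$dir" -type f -perm /u+r,g+r | wc -l)"
echo "Executable files: $(find "$dir" -type f -perm /u+x,g+x | wc -l)"
echo "Empty files:      $(find "$dir" -type f -size 0 | wc -l)"
echo "Hidden files:     $(find "$dir" -type f -name '.*' | wc -l)"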

How can I "clip" the output of the find command?

I executed the following command:
find / -type f -name fs-type -exec svnlook tree {} \; |egrep "/$"
The result was
svnlook: Can't open file '/var/lib/svn/repos/b1me/products/payone/generic/code/core/db/fs-type/format': Not a directory
svnlook: Can't open file '/var/lib/svn/repos/b1me/products/payone/generic/code/fees/db/fs-type/format': Not a directory
Maybe I should make the find command give me the path without db/fs-type/format; in other words, I should clip the output of find. How can I do this?
First you can give
find ... -not -path "*/db/*"
to find.
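As a generic illustration of the idea (the names here are hypothetical): to find .log files while skipping everything under any tmp directory, you could write:
find . -type f -name '*.log' -not -path '*/tmp/*'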
This is what you're looking for:
find Subversion -type d -name db -exec svnlook tree {}/.. \; | egrep "/$"
Your command was failing because svnlook expects a directory argument, not a file.
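Applied to the repositories from the error messages above, the same idea would look roughly like this (a sketch; the starting path is only inferred from that output):
find /var/lib/svn/repos -type d -name db -exec svnlook tree {}/.. \; | egrep "/$"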
