Bash command for finding the size of all files of a particular filetype in a directory in Ubuntu/Linux

I have a folder which contains several file types, say .html, .php, .txt, etc., and it also has subfolders. The subfolders may contain any of the file types mentioned above.
Question 1: I want to find the size of all the files of type '.html' that are in both the root directory and its subdirectories.
Question 2: I want to find the size of all the files of type '.html' that are only in the root directory, not in the subfolders.
I searched the internet, but all I could find were commands like df -h, du -sh, etc.
Are there any bash commands for the above questions? Any bash scripts?

You can use the find command for that.
#### Find the files recursively
find . -type f -iname "*.html"
#### Find the files in the root directory only
find . -maxdepth 1 -type f -iname "*.html"
Then, in order to get their size, you can use the -exec option like this:
find . -type f -iname "*.html" -exec ls -lha {} \;
And if you really only need the file size (I mean, without all the other stuff that ls prints):
find . -type f -iname "*.html" -exec stat -c "%s" {} \;
Explanation:
-iname matches file names case-insensitively
-maxdepth limits how deep find descends into subdirectories (1 means only the starting folder)
-exec runs an arbitrary command on each found path, where {} is replaced by the path of the file
-type f restricts matches to regular files (in Linux, a directory is also a file)
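Since the question is ultimately about size, you may also want the combined size of every match. A minimal sketch building on the stat command above (assumes GNU stat and awk; the awk program simply sums the byte counts):
# Total size in bytes of all .html files, recursively
find . -type f -iname "*.html" -exec stat -c "%s" {} + | awk '{sum += $1} END {print sum}'
For a human-readable total, find . -type f -iname "*.html" -exec du -ch {} + | tail -n 1 also works, though du may print more than one "total" line if the argument list gets split across several invocations.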

Related

Copy recursive files of all the subdirectories

I want to copy all the log files from a directory that does not itself contain log files but has subdirectories that do. Those subdirectories contain further subdirectories, so I need something recursive.
I tried
cp -R *.log /destination
But it doesn't work because the first directory does not contain log files. The answer can also be a bash loop.
find /path/to/logdir -type f -name "*.log" | xargs -I {} cp {} /path/to/destinationdir
Explanation:
find searches recursively
-type f tells find to look only for regular files
-name specifies the file-name pattern
xargs builds and runs a command from the paths it reads on standard input
-I {} defines {} as the argument-substitution placeholder
Another version without xargs:
find /path/to/logdir -type f -name '*.log' -exec cp '{}' /path/to/destinationdir \;
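If any of the log file names contain spaces or other unusual characters, a null-delimited variant of the xargs version is safer. A minimal sketch, assuming GNU find and xargs:
# -print0/-0 keep paths with spaces or newlines intact
find /path/to/logdir -type f -name '*.log' -print0 | xargs -0 -I {} cp {} /path/to/destinationdir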

linux: find common files from two directories with single command

Dir1: [anyName]-test/target/surefire-reports/*.xml
Dir2: target/surefire-reports/*.xml
The Jenkins shell command I came up with:
sh "jar -cMvf Test.zip target/surefire-reports/*.xml *-test/target/surefire-reports/*.xml "
Only one of the directories exists (dir1 or dir2), so the shell step always fails with "no such file or directory".
Is there a better way to look for the xml files in a single command, without failing (maybe with some regular expression)? Thanks!
With GNU find:
find . -type f -regex '\./\([^/]*-test/\)?target/surefire-reports/[^/]*\.xml' \
    -exec jar -cMvf Test.zip {} +
The -regex test matches the full path of your regular (-type f) files. This will only add *.xml files from the surefire-reports directories themselves, not from their subdirectories. If you want to include subdirectories, replace [^/]*\.xml with .*\.xml.
Alternative using glob patterns:
find target/surefire-reports *-test/target/surefire-reports -maxdepth 1 -type f \
    -name '*.xml' -exec jar -cMvf Test.zip {} +
If you want to include subdirectories, remove -maxdepth 1.
Run both commands from the parent directory of target (your project dir).
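Since only one of the two directories exists at any time, the glob variant above will still print an error for the missing one. A minimal sketch that first filters out directories that do not exist (directory names taken from the question; everything else is just an illustration):
# Keep only the report directories that are actually present
dirs=()
for d in target/surefire-reports *-test/target/surefire-reports; do
    [ -d "$d" ] && dirs+=("$d")
done
# Only run find (and jar) if at least one directory was found
[ ${#dirs[@]} -gt 0 ] && find "${dirs[@]}" -maxdepth 1 -type f -name '*.xml' -exec jar -cMvf Test.zip {} +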
Try:
cd <TOP DIRECTORY>
find . -type f -name "*unitTest.xml" -print | xargs jar -cMvf Test.zip
I put "*unitTest.xml since in Dir2 the J is in caps.
This way it will capture the JUnitTtest.xml files only if they exist.
So the results of the find command are used as arguments to the jar command. This is done by xargs. find does not care if the file is there or not, so no error.
Tested on bash.

Find a file and export the full path to a list

I'm looking for a way to search for certain files named "XYHello.pdf" or "BDHello.pdf", so basically "*Hello.pdf", in a directory with subfolders, and to export the found files, including the path to each file, to a text file.
So at the end I would have a list of all found files, including their paths. I spontaneously thought of the Linux find command.
find . -type f -iname "*Hello.pdf"
But the problem is that I need the full path to each file in the list.
find $PWD -type f -iname "*Hello.pdf"
or
find . -type f -iname "*Hello.pdf" -exec realpath {} \;

Get list of files that contain given text within directory given by pattern

I want to get a list of files that contain a given text within my file system. Furthermore, only files located in a directory matching a given pattern should be considered.
So let's say I have a number of directories called myDir within my file system, as shown here:
/usr
    /myDir
/tmp
    /myDir
    /anotherDir
Now I want to get all the files within those directories that contain the text.
So basically I need to perform these steps:
loop over all directories named myDir on the whole file system
for every directory in that list, get the files that contain the search string
What I tried so far is find /etc /opt /tmp /usr /var -iname myDir -type d -exec ls -exec grep -l "SearchString" {} \;
However, this doesn't work, as the results of find are directories, which I cannot use directly as input for grep. I assume I have to do one step in between the find and the grep but can't figure out how to do this.
I think I got it and will show you a little script that achieves what I need:
for i in $(find / -type d -iname myDir); do
    for j in $(find "$i" -type f); do
        grep -l "SearchString" "$j"
    done
done
This will give me all the files that contain the SearchString and are located in any of the folders named myDir.
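For what it's worth, grep can do the recursion itself once find has located the myDir directories, which avoids the intermediate loops. A minimal sketch, assuming GNU grep with the -r option:
# -r recurses into each matched directory, -l lists only the file names
find / -type d -iname myDir -exec grep -rl "SearchString" {} +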

Linux: how to look for files with a certain extension in hierarchy and execute command whenever one is found?

I have a directory hierarchy, whose names do not follow a pattern. E.g.
parent
bcgegec
hfiwehfiuwe
huiwwuifegeufg
whegwgefyfeg
hfeohfeiofe
chidchuehugfe
dedewdewf
tegtgetg
gtgetgtg
and so on.
Inside some of these directories there is a file with the "gr" extension. I need to find each such file, cd to its directory, and execute the "gnuplot" command with the .gr file as its argument. I tried the following to nest two find commands, but the {} of the inner one does not work as I need: the outer find should iterate over every directory, and the inner find should look for the presence of the .gr file.
find $parentDir -type d -exec sh -c '(cd {} && find . -maxdepth 1 -name *.gr -exec /usr/bin/gnuplot {} \;)' \;
Perhaps this is what you are looking for:
find . -type f -name "*.gr" -execdir /usr/bin/gnuplot {} \;
Read through man find for other useful information.
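If your find lacks -execdir, or you prefer an explicit loop, the same effect can be had with a shell loop that changes into each file's directory before calling gnuplot. A minimal sketch, assuming bash and GNU find's -print0:
# Run gnuplot from the directory that contains each .gr file
find "$parentDir" -type f -name '*.gr' -print0 |
while IFS= read -r -d '' f; do
    (cd "$(dirname "$f")" && /usr/bin/gnuplot "$(basename "$f")")
done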
