How to dump a list of files from subfolders with their full paths in Linux - linux

I am not able to get the command (or group of commands) to do the following operation in Linux:
I have a main project_folder with some 10 sub_folders containing files of different extensions (for example *.cpp, *.txt, *.y, *.py, etc.). I want to make a list of only the REQUIRE_*.txt files from all SUB-FOLDERS, with their complete paths, and dump it into a text file.
For example:
result_dump.txt should include:
user/project_folder/sub_folder0/a0.txt
user/project_folder/sub_folder0/a6.txt
user/project_folder/sub_folder0/a11.txt
user/project_folder/sub_folder0/a12.txt
user/project_folder/sub_folder1/a1.txt
user/project_folder/sub_folder1/a13.txt
user/project_folder/sub_folder2/a14.txt
user/project_folder/sub_folder2/a15.txt
user/project_folder/sub_folder2/a2.txt
user/project_folder/sub_folder3/a3.txt
user/project_folder/sub_folder4/a4.txt
user/project_folder/sub_folder5/a5.txt
--
--
--
If I use the command below, then I get information on all files, which is not my intention:
find $(pwd) -maxdepth 1 -type f -not -path '*/\.*' | sort
Note: Please also let me know how I can dump that result into a text file!

Run the command below from the user's home directory:
find project_folder/ -maxdepth 3 -type f -name 'REQUIRE_*.txt'
Edit
You can then redirect the output to a file by adding > result_dump.txt after the command.
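For example, the full command in one line (a minimal sketch, reusing the result_dump.txt name from the question) would be:
find project_folder/ -maxdepth 3 -type f -name 'REQUIRE_*.txt' > result_dump.txt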

Related

How to search for files ending/starting/containing a certain letter in terminal?

I have been looking all over the internet to help me with this. I want to list all files that start/end/contain a certain letter but the results I found on the internet do not seem to work for me. I need to use the ls command for this (assignment).
I tried this code from another question:
ls abc*   # list all files starting with abc
ls *abc*  # list all files containing abc
ls *abc   # list all files ending with abc
but whenever I try any of those, it comes back with "ls: cannot access '*abc': No such file or directory"
Use find for finding files:
find /path/to/folder -maxdepth 1 -type f -name 'abc*'
This will give you all regular filenames within /path/to/folder which start with abc. (The error you saw happens because when a glob like *abc matches nothing, the shell passes the pattern to ls literally, and no file named *abc exists.)
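If you also need the "ending" and "containing" cases from the question, the same idea works with the other wildcard placements (a sketch; adjust the path and pattern to your own):
find /path/to/folder -maxdepth 1 -type f -name '*abc'    # names ending with abc
find /path/to/folder -maxdepth 1 -type f -name '*abc*'   # names containing abc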

How to script to go to the deepest folder directory from parent directory to execute a command?

I am trying to automate converting .vmx to .ova with ovftool; these .vmx files are generated from ghettoVCB. So I am writing a script to get the conversion automation working.
My question is: how do I write a shell script that goes through each directory under a parent_directory and executes a command in each directory? Or could I move everything from the deepest folders up to parent_directory? (That approach may be time-consuming.)
The directory structure is as follows:
parent_directory/automation/automation-2016-04-18_19-08-32/automation.vmx
parent_directory/bulls/bulls-2016-04-18_18-28-57/bulls.vmx
Here is another view of the same structure:
parent_directory
  automation
    automation-2016-04-18_19-08-32
      automation.vmx
  bulls
    bulls-2016-04-18_18-28-57
      bulls.vmx
The names of the subfolders under parent_directory do not follow a pattern; they could be anything.
The folder "automation-2016-04-18_19-08-32" is the name of the subfolder plus a date and time.
Approach 1
move everything from the deepest folder to parent_directory
This command will search subdirectories of the current directory, find all .vmx files, and move them to the current directory:
find . -type f -name '*.vmx' -exec mv {} . \;
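Note that plain mv will silently overwrite files that share a name. A variant that skips existing targets instead (assuming GNU mv, which supports -n) would be:
find . -type f -name '*.vmx' -exec mv -n {} . \;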
Approach 2
write a shell script that goes through each directory from a parent_directory and executes a command in each directory
The following command will find every subdirectory of the current directory and execute command for each one (with -execdir, the command is run from the directory containing each match):
find . -type d -execdir command \;
If you want to run command in every directory that contains .vmx files and supply the names of those .vmx files as arguments to command, then run:
find . -type f -name '*.vmx' -execdir command {} +
Alternatively, suppose we want to run the command once for each .vmx file that we find:
find . -type f -name '*.vmx' -execdir command {} \;
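For the ovftool case from the question, one hedged sketch (assuming ovftool accepts a source .vmx and a target .ova path, and that a POSIX sh is available) is to derive the output name from each found file:
find . -type f -name '*.vmx' -execdir sh -c 'ovftool "$1" "${1%.vmx}.ova"' _ {} \;
Here ${1%.vmx}.ova strips the .vmx suffix and appends .ova, so each .ova lands next to its source .vmx.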

Bash command for finding the size of all files with a particular filetype in a directory in Ubuntu

I have a folder which contains several file types, say .html, .php, .txt, etc., and it also has sub-folders. The sub-folders may contain any of the file types mentioned above.
Question 1: I want to find the size of all the .html files in both the root directory and its sub-directories.
Question 2: I want to find the size of all the .html files that are only in the root directory, not in the sub-folders.
I surfed the internet but all I am able to find are commands like df -h, du -sh, etc.
Are there any bash commands for the above questions? Any bash scripts?
You can use the find command for that.
# Find the files recursively
find . -type f -iname "*.html"
# Find the files in the top-level directory only
find . -maxdepth 1 -type f -iname "*.html"
Then, in order to get their size, you can use the -exec option like this:
find . -type f -iname "*.html" -exec ls -lha {} \;
And if you really only need the file size (I mean, without all the other stuff that ls prints):
find . -type f -iname "*.html" -exec stat -c "%s" {} \;
Explanation:
-iname matches filenames case-insensitively
-maxdepth limits how deep find descends into subdirectories (1 means only the immediate folder)
-exec executes an arbitrary command using the found paths, where {} represents the path of the file
-type restricts the type of file matched (a directory is also a file in Linux; f means regular files)
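If what you actually want is the combined size of all matching .html files (Question 1), one sketch, assuming GNU find with -printf, is to sum the byte sizes with awk:
find . -type f -iname "*.html" -printf '%s\n' | awk '{s+=$1} END {print s " bytes"}'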

Check if a file is in a folder or its subfolders using the Linux terminal

I want to check whether a particular file is in a folder or one of its sub-folders, using the Linux terminal.
Which should I use for this? I have tried the find and grep commands, but they traverse only one folder.
In order to search from your current directory, use
find . -name filename
In order to search from root directory use
find / -name filename
If you don't know the file extension, try
find . -name 'filename.*'
Also note that the find command only displays files in paths you have permission to view. If you don't have permission for the a/b/c path, it will just display a message saying that the path can't be searched.
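If you need a yes/no answer inside a script, one minimal sketch (assuming GNU find, which supports -quit to stop at the first match) is to test whether find prints anything:
if [ -n "$(find . -name 'filename' -print -quit)" ]; then
    echo "found"
else
    echo "not found"
fi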
If you want to search by filename, use find:
find /path -name "filename"
example:
find . -name myfile.txt
If you need to find all files containing a specific string, use grep:
grep -r "string" /path
example:
grep -r foobar .
By default, find will traverse all subdirectories, for example:
mkdir level1
mkdir level1/level2
touch level1/level2/file
find . -name "file"
Output:
./level1/level2/file
locate filename
This is the simplest command; note that locate searches a prebuilt database (updated by updatedb), so very recently created files may not appear.
I also prefer using a combination of tree and grep. Something like
tree | grep filename
Try
find . -name "filename" -type f
-type f restricts the matches to regular files, and . is the directory to search from (replace it with your path).

Search recursively for files in a parent directory in Linux

I am trying to list all the files in a parent directory and its subdirectories. However, I am running this command from another location. So, at first, I need to traverse to the directory (from where I want to run this command).
Please note that I am using the find command instead of ls because I also want to list down the absolute path for each file in front of it. This is not possible with the ls command.
Here is what I am doing:
cd ../../../;cd level1_dir1;find $(pwd) . -name *.* -printf "%TY-%Tm-%Td\t%p\n"
This command does not show any output.
Here is the directory structure:
level1_dir1
This has multiple subdirectories:
  level2_dir1
  level2_dir2
  ....
  level2_dir10
Each of these subdirectories again has subdirectories and files inside it.
However, if I now do:
cd ../../../;cd level1_dir1/level2_dir1;find $(pwd) . -name *.* -printf "%TY-%Tm-%Td\t%p\n"
it will do the recursion properly for all the subdirectories in level2_dir1 and show output like:
date level1_dir1/level2_dir1/path/to/file/filename
So, I wanted to print this out for all the level2 directories by using a wildcard:
cd ../../../;cd level1_dir1/*;find $(pwd) . -name *.* -printf "%TY-%Tm-%Td\t%p\n"
but it prints the results only for the first directory in level2 (that is, level2_dir1).
How can I make it list the files for all the subdirectories?
Thanks.
How about this?
find ../../../level1_dir1 -printf "%TY-%Tm-%Td\t%p\n"
If you want all the files, you don't even need -name in the find command. If you want to see only the files and not the directories, just add -type f before -printf.
Hope this helps...
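If you do want to keep a -name filter, quote the pattern so the shell does not expand *.* against the current directory before find ever sees it (a likely reason the original command printed nothing):
find ../../../level1_dir1 -name '*.*' -printf "%TY-%Tm-%Td\t%p\n"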
