There are some .app files in the folder, such as
folder_A/1.app
        /2.app
        /subF/3.app
        /3.txt
I want to use the ls command to check whether there are any .app files under folder_A. I can use ls -R folder_A to list all the files under folder_A and its subfolder subF, but on macOS an .app file is itself a directory, so ls also lists all the files contained in 1.app, 2.app, and so on.
For example, 1.app contains some .png and .txt files; ls -R folder_A then returns all those .png and .txt files rather than 1.app itself. But I want to list all the .app files under folder_A and its subfolders without listing the files inside each .app.
The trick is to use the right tool for the job.
find folder_A -name '*.app'
The find command is better suited for traversing a directory hierarchy.
find folder_A -name '*.app'
find can easily be used to search for specific files:
find folder_name -name "*.app" -print
folder_name can be an absolute path or a relative path.
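If you also want find not to descend into the matched .app bundles themselves (so, for example, a helper .app nested inside 1.app would not be listed), you can prune them; a minimal sketch:
# -prune stops find from recursing into a directory once it matches;
# -print still reports the bundle itself
find folder_A -name '*.app' -prune -print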
I have multiple files in different folders with the same name and extension. For example, there are 460 folders and each folder has one file named snps.vcf. I want to copy/move these files to one folder and later do some analysis on them.
I have tried:
find -type f -name "*.vcf" -exec cp {} /home/AWAN/try ';'
but this code overwrites the files and only one file remains there in the end.
I have tried rename, but I don't know how to select multiple files with find and then rename them. Even with mmv I couldn't find a solution.
You need to write an external script and pass it to -exec.
Your script may use mktemp to generate a random file name. Example:
mktemp /your/directory/try-XXX
The XXX part will be replaced by mktemp with a different value for each call.
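A minimal sketch of such a helper script, assuming the target directory is /home/AWAN/try and the script is saved as copy_unique.sh (both names are just placeholders):
#!/bin/sh
# copy_unique.sh -- copy one file to the target directory under a unique name
dest=$(mktemp /home/AWAN/try/snps-XXXXXX) || exit 1  # new name for each call
cp "$1" "$dest"
After chmod +x copy_unique.sh, pass it to find:
find . -type f -name "*.vcf" -exec ./copy_unique.sh {} ';'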
I have a bunch of files in separate folders, and all of the folders are in one directory.
/var/www/folder1/file1.txt
/var/www/folder1/file2.txt
/var/www/folder1/file3.txt
/var/www/folder2/file4.jpg
/var/www/folder2/file5.jpg
/var/www/folder2/file6.jpg
/var/www/folder3/file7.pdf
/var/www/folder3/file8.doc
/var/www/folder3/file9.gif
I need everything inside of the folders that are inside of /var/www/ to be copied to another directory (say, /var/my-directory/), but not the actual folders. Based on the example above, I need /var/my-directory/ to look as follows:
/var/my-directory/file1.txt
/var/my-directory/file2.txt
/var/my-directory/file3.txt
/var/my-directory/file4.jpg
/var/my-directory/file5.jpg
/var/my-directory/file6.jpg
/var/my-directory/file7.pdf
/var/my-directory/file8.doc
/var/my-directory/file9.gif
I can't seem to figure out the command to do this. I've tried the following:
sudo cp -R /var/www/./. /var/my-directory/
But, that still copies all of the folders.
Is there any way to do what I'm trying to do?
Use find.
find /var/www/ -type f -exec cp '{}' /var/my-directory/ \;
The trick is -type f, which selects only files.
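If your cp is from GNU coreutils, you can also copy the files in batches instead of starting one cp per file; a sketch of the same copy using -t (target directory) with -exec … +:
find /var/www/ -type f -exec cp -t /var/my-directory/ '{}' +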
I know these commands:
To find path of file:
find . -name filename
To compare two files:
meld path_of_file_in_one_dir path_of_file_in_second_dir
I have to compare many files in different subdirectories of a directory. Every time, I first have to find the path of a file and then use that path with meld, and I have to do this for each file.
It would be much easier if I could just give the file names and the root directory to meld.
How can I combine the meld and find commands so that I can apply them to each file?
If your directory structure is like this, say:
test_folder
    test1
        file_to_cmp
    test2
        file_to_cmp
    test3
        file_to_cmp
then you can run the following:
meld $(find . -name "file*")
or
meld $(find . -name "file_to_cmp")
This will compare the three files.
As for comparing more than three files at once, I doubt meld can do it.
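If you instead need to compare the same file name across two separate trees, one pair at a time, a loop like this sketch works (dir1 and dir2 are placeholder directory names):
find dir1 -type f -name "file_to_cmp" | while read -r f; do
    # build the counterpart path by swapping the dir1 prefix for dir2,
    # then open one meld window per pair (meld blocks until it is closed)
    meld "$f" "dir2/${f#dir1/}"
done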
I currently have a number of drives mounted under "/media/". I want to recursively scan all the mounted drives looking for files with a specific extension "*.foo". Once found, I want to symlink these files into a directory elsewhere. One requirement is that I keep the basename of the file the same when creating the symlink. I wasn’t able to come up with an easy solution using "find -exec" on my own. Is there an easy way to do this?
find /media/ -name '*.foo' | xargs ln -s -t DIRECTORYYOUWANTLINKSIN
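Note that piping into xargs breaks on file names that contain spaces; a sketch that avoids this, assuming GNU ln (DIRECTORYYOUWANTLINKSIN is the same placeholder as above):
find /media/ -name '*.foo' -exec ln -s -t DIRECTORYYOUWANTLINKSIN '{}' +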
I'm trying to write a shell script under Linux that lists all folders (recursively) with a certain name and no symlink pointing to them.
For example, I have:
/home/htdocs/cust1/typo3_src-4.2.11
/home/htdocs/cust2/typo3_src-4.2.12
/home/htdocs/cust3/typo3_src-4.2.12
Now I want to go through all subdirectories of /home/htdocs and find those typo3_* folders that are not pointed to by any symlink.
This should be possible with a shell script or a command, but I have no idea how.
Thanks for your help,
Stefan
I don't think any of the common file systems record in a file's inode whether symlinks point to it, so you would have to scan all other files to see whether any of them is a symlink to this one. If you don't limit the depth of the search, this might take a very long time. If you want to perform that search in /home/htdocs, for example, it would work something like this:
# find the specified folders:
find /home/htdocs -name 'typo3_*' -type d | while read -r folder; do
    # list all symlinks pointing to $folder
    find -L /home/htdocs -samefile "$folder" | grep -v "$folder\$"
done
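Building on that, a sketch that prints only the typo3_* folders that no symlink under /home/htdocs points to (the paths are the ones from the question):
find /home/htdocs -name 'typo3_*' -type d | while read -r folder; do
    # with -L, anything -samefile finds besides the folder itself is a symlink to it
    links=$(find -L /home/htdocs -samefile "$folder" -not -path "$folder")
    [ -z "$links" ] && echo "$folder"
done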