First of all... I'm a newbie in Linux! haha
I'm trying to list all the files and directories under a main directory, but I need to exclude some of the entries.
Example (all files in /var/www/html):
index.php
Images
Images/1.jpg
Images/2.jpg
Images/3.jpg
Includes
Includes/db.php
Includes/security.php
The records that I want to exclude are the bare directory entries (Images and Includes); I had marked them in bold/strong.
Now I'm using this command:
find /var/www/html/ -mindepth 1 -printf '%P\n'
I appreciate any help. Regards!
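Since the bare directory entries are what should disappear, one option (a sketch, assuming GNU find, which the -printf flag already implies) is to restrict the output to regular files:

```shell
# -type f keeps only regular files, so bare directory entries such as
# "Images" and "Includes" are no longer listed; %P strips the leading
# /var/www/html/ prefix from each result
find /var/www/html/ -mindepth 1 -type f -printf '%P\n'
```

If symlinks or other file types should stay, `! -type d` instead of `-type f` drops only the directories.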
Related
I am confused about how to copy a specific type of file from a folder structure into a single flat folder on a Linux machine.
This is how the source folder structure is:
Folder_X
    file1.type
    file2.nottype
    Folder_Y
        file3.type
        file4.nottype
        Folder_P
            file5.type
            file6.nottype
            Folder_A
                file7.type
                file8.nottype
    Folder_Z
        file9.type
        file10.nottype
So when I do find . -iname "*.type" in Folder_X, I get the following output:
./file1.type
./Folder_Y/file3.type
./Folder_Y/Folder_P/file5.type
./Folder_Y/Folder_P/Folder_A/file7.type
./Folder_Z/file9.type
I want to copy these .type extension files to another location, into a single folder, like this:
/some/another/location/Folder_I
file1.type
file3.type
file5.type
file7.type
file9.type
Any help is appreciated... Thank you for your time.
Try the command below. The find command has an -exec option.
Ref: https://man7.org/linux/man-pages/man1/find.1.html
find . -iname "*.type" -exec cp {} /some/another/location/Folder_I \;
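One caveat with flattening a tree this way: if two source folders contain files with the same name, the later copy silently overwrites the earlier one. Assuming GNU cp, -n keeps the first copy instead:

```shell
# -n (no-clobber) skips the copy when a file of the same name is
# already present in the destination folder
find . -iname "*.type" -exec cp -n {} /some/another/location/Folder_I \;
```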
Imagine the directory below, which holds thousands of files. I want to search the current directory only for file names that contain both "D." and "0329". I tried this, but it searches all directories, not just the current one, and the two searches don't work together:
find . -name "*D.*" | find . -name "*0329*"
File Names
A.20180329
A.20180327
B.20180329
B.20180321
C.20180321
D.20180329
D.20180329
D.20180327
D.20180321
E.20180321
E.20180321
The command below should work if it's Linux:
ls -lRt * | grep D | grep 0329
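For the record, the piped version fails because the second find ignores its standard input. A single find invocation can combine both patterns (tests on one command line are ANDed), and -maxdepth 1 keeps it to the current directory:

```shell
# both -name tests must match the same file name; -maxdepth 1
# stops find from descending into subdirectories
find . -maxdepth 1 -name "*D.*" -name "*0329*"
```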
I feel like I'm so close to the answer looking through the forums here... I'm successfully using the following command:
sudo find . -mindepth 1 -maxdepth 4 -type d -print0 >structure.txt
to get a text file of our directory structure. The issue is that I need to prune out any directories whose names are integers. We have THOUSANDS of numbered sub-directories that our app creates. Example:
Blue\clientfiles
Blue\clientfiles\saturn
Blue\clientfiles\saturn\moon
Blue\clientfiles\saturn\moon\33
Blue\clientfiles\saturn\moon\34
Blue\clientfiles\saturn\moon\35
Blue\documents
Blue\documents\1001
Blue\documents\1002
Blue\documents\1003
Blue\ftp
Blue\ftp\consumed
Blue\ftp\consumed\202
Blue\ftp\consumed\203
Blue\ftp\consumed\204
Blue\system
Blue\system\007
Blue\system\008
As you can see, part of the problem is that the depth varies... not just in the "Blue" directory; "Red" might have different depths as well. The only constant is that I do not need ANY of the numbered directories. I can't figure out the proper -prune syntax to exclude the numbered dirs.
Any insight would be appreciated!! ~R
This is rather a unix.stackexchange question. Try excluding all directories whose names end with a digit by filtering the find output:
find . -mindepth 1 -maxdepth 4 -type d | egrep -v "[0-9]$"
(This filters text lines, so it is not compatible with -print0.)
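Since the question asked for -prune specifically, here is a sketch of a pure-find version, assuming GNU find and that "numbered" means the directory name consists only of digits:

```shell
# any directory whose name is all digits is pruned (the directory and
# everything under it is skipped); the remaining directories are printed
find . -mindepth 1 -maxdepth 4 -type d \
     -regextype posix-extended -regex '.*/[0-9]+' -prune \
  -o -type d -print > structure.txt
```

Unlike the egrep filter, this also works with -print0, since no line-oriented pipe is involved.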
I am trying to write a shell script that loops through all the directories under a parent directory and skips the ones that contain an empty folder "I_AM_Already_Processed" at leaf level.
The parent directory is provided as input to the shell script:
./selectiveIteration.sh /Employee
The structure under the parent directory is shown below (the Employee directory contains data split on a yearly -> monthly -> daily -> hourly basis):
/Employee/alerts/output/2014/10/08/HOURS/Actual_Files
The shell script needs to find out which directories have not already been processed. For example, consider three hours of data for date 10/08/2014:
1. /USD/alerts/output/2014/10/08/2(hourly_directory)/Actual_file +
directory_with_name(I_AM_Already_Processed)
2. /USD/alerts/output/2014/10/08/3(hourly_directory)/Actual_file +
directory_with_name(I_AM_Already_Processed)
3. /USD/alerts/output/2014/10/08/4(hourly_directory)/Actual_file
In the above example, leaf directories 2 and 3 (the hourly directories) are already processed, as they contain the folder named "I_AM_Already_Processed", whereas directory 4 is not.
So the shell script should skip directories 2 and 3 but should process directory 4 (print it in the output).
Research/work I did:
So far I have been able to iterate through the directory structure and visit all folders/files from the root to the leaf level, but I am not sure how to check for a specific folder and skip the directory if it is present. (I was able to get this far after referring to a few tutorials and older posts on Stack Overflow.)
I am a newbie to shell scripting; this is my first time writing a shell script, so if this is too basic a question, please excuse me. Trying to learn.
Any suggestion is welcome. Thanks in advance.
To check whether some_directory has already been processed, just do something like
find some_directory -type d -links 2 -name 'I_AM_Already_Processed'
which will print the marker directory's path if it has been processed, or nothing if it hasn't. Note that -links 2 tests whether the directory is a leaf (its link count covers only its entry in the parent and its own ".", with no ".." links from subdirectories). See this answer for more information.
So in a script, you could do
#!/bin/bash
directory_list=(/dir1 /dir2)
for dir in "${directory_list[@]}"; do
    if [[ -n $(find "$dir" -type d -links 2 -name 'I_AM_Already_Processed' -print -quit) ]]; then
        echo 'Has been processed'
    else
        echo 'Has not been processed'
    fi
done
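Putting the pieces together, a minimal sketch of the whole scan (the I_AM_Already_Processed name comes from the question; the rest is an assumption): on a conventional filesystem, every leaf directory that is not itself the marker folder is an hourly directory without the marker, i.e. not yet processed.

```shell
#!/bin/bash
# print every not-yet-processed leaf directory under the parent
# directory given as the first argument (default /Employee)
parent=${1:-/Employee}

# -links 2 selects leaf directories; excluding the marker folder by
# name leaves exactly the hourly directories still to be processed
find "$parent" -type d -links 2 ! -name 'I_AM_Already_Processed' -print
```

This relies on classic hard-link counts (each subdirectory adds one link to its parent), which holds on ext4 and most traditional filesystems but not on all (btrfs, for example, reports 1 for directories).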
As I have a serious server performance warning when installing drupal-commons (this is an installation profile), I now want to reduce the server load.
Why? When trying to install Drupal Commons I get a message: Too-many-files-open, it says!
Well, Drupal & its modules (ab)use too many files! A maximum of 50,000 files and maybe 5,000 directories is their limit, and that is all they back up.
So my question: how can I get rid of all those silly translation files, and of the unnecessary subdivisions for tiny bits of info?
Background: I would expect that file_exists() during the installation (or bootstrap cycle) is the most expensive built-in PHP function, measured as the total time spent calling it across all invocations in a single request.
So now I am trying to get rid of all the overhead - especially the translation files (the so-called .po files) - and the unnecessary files contained in drupal-commons 6.x-2.3, in order to get it running on my server.
How do I search for all those .po files recursively - with grep, I guess?
Note: I do not know where they are!
linux-vi17:/home/martin/web_technik/drupal/commons_3_jan_12/commons-6.x-2.3/commons-6.x-2.3 # ls
CHANGELOG.txt ._.htaccess install.php modules themes
._CHANGELOG.txt ._includes INSTALL.txt ._profiles ._update.php
COMMONS_RELEASE_NOTES.txt includes ._INSTALL.txt profiles update.php
._COMMONS_RELEASE_NOTES.txt ._index.php LICENSE.txt ._robots.txt UPGRADE.txt
COPYRIGHT.txt index.php ._LICENSE.txt robots.txt ._UPGRADE.txt
._COPYRIGHT.txt INSTALL.mysql.txt MAINTAINERS.txt ._scripts ._xmlrpc.php
._cron.php ._INSTALL.mysql.txt ._MAINTAINERS.txt scripts xmlrpc.php
cron.php INSTALL.pgsql.txt ._misc ._sites
.directory ._INSTALL.pgsql.txt misc sites
.htaccess ._install.php ._modules ._themes
linux-vi17:/home/martin/web_technik/drupal/commons_3_jan_12/commons-6.x-2.3/commons-6.x-2.3 # grep .po
Anyway, I want to remove all .po files with one bash command - is this possible?
But wait: first of all I want to find all the files and list them,
since then I know what I am erasing (/removing).
Well - all language translations in Drupal are named with .po -
how do I find them - with grep?
How do I list them - and subsequently, how do I erase them!?
Update:
I did the search with
find . -type f -name "*.po"
Well, I found approx. 930 files.
Afterwards I removed them all with
6.x-2.3 # find . -type f -name "*.po" -exec rm -f {} \;
A final search with
find . -type f -name "*.po"
gave no results back, so every .po file was erased!
Many, many thanks for the hints.
Greetings
zero
If you want to find all files named *.po in a directory named /some/directory, you can use find:
find /some/directory -type f -name "*.po"
If you want to delete them all in one go (you do have backups, don't you?), then append an action to this command:
find /some/directory -type f -name "*.po" -exec rm -f {} \;
Replace /some/directory with the appropriate value and you should be set.
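With GNU find, the -exec rm can also be replaced by the built-in -delete action, which avoids spawning one rm process per file; running the plain -print version first makes a cheap dry run:

```shell
# dry run: list exactly what would be removed
find /some/directory -type f -name "*.po" -print

# same match, but deleted in-process by find itself (GNU find)
find /some/directory -type f -name "*.po" -delete
```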
The "too many open files" issue isn't normally because there are too many files in the filesystem, but because there is a limit on the number of files an application or user can have open at one time. This issue has been covered on the Drupal forums; for example, see this thread to solve it more permanently/nicely:
http://drupal.org/node/474152
A few more links about open files:
http://www.cyberciti.biz/tips/linux-procfs-file-descriptors.html
http://blog.thecodingmachine.com/content/solving-too-many-open-files-exception-red5-or-any-other-application
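For completeness, the per-process limit that triggers the error can be inspected with the shell built-in ulimit; the 4096 below is only an example value, not a recommendation:

```shell
# show the current soft limit on open file descriptors
ulimit -n

# to raise it for the current shell session (must stay within the
# hard limit reported by `ulimit -Hn`), one would run e.g.:
#   ulimit -n 4096
```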