How to get all files and directories owned by a user in Linux

I am trying to find all the directories and files owned by a user with the following command:
find / -type d -user greg | grep -v proc
It works fine sometimes and hangs at other times. Are there any performance issues associated with it, or is there a better way to do this?

To keep find out of /proc, use -prune. It's better than filtering entries out with grep -v because find never descends into /proc at all.
find / -path /proc -prune -o -type d -user greg -print
Read -o as "or". If the path is /proc, prune it, i.e. don't go in there. Otherwise, match directories owned by greg. (If you want files too then get rid of the -type d test.)
When you use -prune, you also have to use -print to print matches. -print's normally implied, but using -prune changes that.
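If you want files as well as directories, and there are other virtual filesystems you'd rather skip, here is a sketch along the same lines (the extra -path /sys and -path /dev tests are assumptions about a typical Linux layout, not part of the original answer):
find / \( -path /proc -o -path /sys -o -path /dev \) -prune -o -user greg -print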

Related

List files but exclude certain directories in Unix

I am attempting to list files in a folder and its subfolders recursively, but I want to avoid going into one folder because its contents are duplicates.
This is the command I run but it doesn't do anything.
ls -lR /opt/elk/data/syslogs | grep -v .log. | grep --exclude-dir="cam" * > /tmp/logs.log
Are there any changes I can make to this?
Thanks.
Options to different versions of find vary greatly, but you may try:
find /opt/elk/data/syslogs -name cam -prune -o -print
On RHEL, you probably have gnu find, and if you want file size and modification time, you might try:
find /opt/elk/data/syslogs -name cam -prune -o -printf "%p %s %t\n"
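If, as in the question, you only want regular files and want the listing written to a file, a hedged variant of the same command (the -type f test and the redirection to /tmp/logs.log are additions for illustration):
find /opt/elk/data/syslogs -name cam -prune -o -type f -printf "%p %s %t\n" > /tmp/logs.log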

Find files which have been modified in the last 30 minutes in Linux

How do I find files based on time information, such as creation, modification, and access times? It is useful to find files before a certain time, after a certain time, and between two times. What command in Linux would I have to use?
I understand that to find setuid files on Linux I would have to use:
find / -xdev \( -perm -4000 \) -type f -print0 | xargs -0 ls -l
How do I check for files which have been modified in the last 30 minutes? (I created a new file called FILE2.)
Just add -mmin -30 (-mtime is measured in days; -mmin is measured in minutes). See man find.
The answer to your question is:
find . -mmin -30 -exec ls -l {} \;
(Use -cmin instead of -mmin if you want the inode change time rather than the modification time.)
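For the other cases the question mentions, a few hedged sketches (GNU find; the day counts and dates are placeholders):
find . -mtime +7                                       # modified more than 7 days ago ("before")
find . -mtime -7                                       # modified within the last 7 days ("after")
find . -mtime +7 -mtime -30                            # modified between 7 and 30 days ago
find . -newermt "2024-01-01" ! -newermt "2024-02-01"   # modified between two dates (GNU -newermt)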

How to exclude all subdirectories of a given directory in the search path of the find command in Unix

I need to back up the whole directory hierarchy of our servers, so I need to list all the subdirectories of some of the directories on a server.
The problem is that one of those subdirectories contains tens of thousands of subdirectories (a file containing only the names of the subdirectories could take a couple of hundred megabytes, and the corresponding find command takes very long).
For example, if I have a directory A and one subdirectory A/a that contains tens of thousands of subdirectories, I want to use the find command to list all the subdirectories of A, excluding all the subdirectories of A/a but not excluding A/a itself.
I tried many variations of -prune using the answers to this question, to no avail.
Is there a way to use the find command in UNIX to do this?
UPDATE:
The answer by @devnull worked very well, but now I have another problem, so I will refine my question a little.
I used the following command:
find /var/www -type d \( ! -wholename "/var/www/web-release-data/*" ! -wholename "/var/www/web-development-data/*" \)
The new problem is that find is still traversing the whole directory tree under "/var/www/web-release-data/" and "/var/www/web-development-data/", so it's very slow, and I fear it could take hours.
Is there any way to make find exclude those directories completely and not traverse their directory hierarchies at all?
The following should work for you:
find A -type d \( ! -wholename "A/a/*" \)
This would list all subdirectories of A including A/a but excluding subdirectories of A/a.
Example:
$ mkdir -p A/{a..c}/{1..4}
$ find A -type d \( ! -wholename "A/a/*" \)
A
A/c
A/c/4
A/c/2
A/c/3
A/c/1
A/a
A/b
A/b/4
A/b/2
A/b/3
A/b/1
Another solution:
find A \! -path "A/a/*"
If you don't want A/a itself either, use
find A \! -path "A/a/*" -a \! -path "A/a"
Have you tried rsync(1)? It has an option --exclude=PATTERN which might work well here:
rsync -avz --exclude=A/a <source> <target>
Using rsync, you wouldn't need find(1) at all.
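A hedged sketch of how that might look for the A/a example above (-n makes rsync do a dry run so you can preview what would be copied; the leading / anchors the pattern at the source root, and the trailing /* excludes a's contents while keeping a itself; the paths are placeholders):
rsync -avzn --exclude='/a/*' A/ /backup/A/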
To exclude 2 subdirs:
find . -type d ! -wholename "dir/name/*" -a ! -wholename "dir/name*"
To answer your updated question: group the two -wholename tests and prune the directories themselves (not just their contents), so find never descends into them:
find /var/www \( -wholename "/var/www/web-release-data" -o -wholename "/var/www/web-development-data" \) -prune -o -type d -print

Remove files from a lot of directories - Linux

How can I remove all .txt files present in several directories?
Dir1/
    Dir11/123.txt
    Dir12/456.txt
    Dir13/test.txt
    Dir14/manifest.txt
In my example I want to run the remove command from Dir1.
I know the Linux rm command, but I don't know how to make it work in my case.
P.S.: I'm using Ubuntu.
To do what you want recursively, find is the most commonly used tool for this. Combined with the -delete switch, you can do it in a single command, with no need for -exec (and the process forks it entails) as in other answers in this thread:
find Dir1 -type f -name "*.txt" -delete
If you use bash 4, you can also do:
( shopt -s globstar; rm Dir1/**/*.txt )
We don't need to enter subdirectories of subdirectories here; everything is one level down, so there's no need for find. I think this is what you're looking for: rm */*.txt
Before you run this, try echo */*.txt to see whether the correct files would be removed.
Using find would be useful if you want to search subfolders of subfolders, etc.
There is no Dir1 in the current folder, so don't use find Dir1 .... If you run find from inside Dir1, this will work:
find . -type f -name "*.txt" -delete
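If your find doesn't support -delete (it's a GNU/BSD extension rather than POSIX), a couple of hedged alternatives: the -exec form is POSIX, while the -print0 | xargs -0 pipeline is a GNU idiom that survives filenames with spaces:
find . -type f -name "*.txt" -exec rm -f {} +
find . -type f -name "*.txt" -print0 | xargs -0 rm -f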

Find in Linux combined with a search to return a particular line

I'm trying to return a particular line from files found from this search:
find . -name "database.php"
Each of these files contains a database name, next to a PHP variable like $dbname=.
I've been trying to use -exec to run a grep search on these files, with no success:
-exec "grep {\}\ dbname"
Can anyone provide me with some understanding of how to accomplish this task?
I'm running CentOS 5, and there are about 100 database.php files stored in subdirectories on my server.
Thanks
Jason
You have the arguments to grep inverted, and you need them as separate arguments:
find . -name "database.php" -exec grep '$dbname' /dev/null {} +
The presence of /dev/null ensures that the file name(s) that match are listed as well as the lines that match.
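With GNU grep, the -H option forces the filename prefix without the /dev/null trick; a hedged equivalent (the single quotes keep the shell from expanding $dbname):
find . -name "database.php" -exec grep -H '$dbname' {} +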
I think this will do it. Not sure if you need to make any adjustments for CentOS.
find . -name "database.php" -exec grep dbname {} \;
I worked it out using xargs
find . -name "database.php" -print | xargs grep \'database\'\=\> > list_of_databases
Feel free to post a better way if you find one (or want some rep for a good answer).
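A hedged hardening of that same pipeline: the -print0/-0 pair keeps filenames with spaces or quotes from breaking the xargs invocation (the pattern is the same string, just quoted instead of backslash-escaped):
find . -name "database.php" -print0 | xargs -0 grep "'database'=>" > list_of_databases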
I tend to habitually avoid find because I've never learned how to use it properly, so the way I'd accomplish your task would be:
grep dbname **/database.php
Edit: This command won't be viable in all cases because it can potentially generate a very long argument list, whereas find runs its command on the files it finds without hitting that limit, much as xargs does. And, as I noted in my comment, it's possibly not very portable. But it's damn short ;)
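Note that the ** glob above only recurses when bash's globstar option is enabled (it is off by default in bash 4 and later; zsh supports ** out of the box). A minimal sketch:
shopt -s globstar
grep dbname **/database.php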
