Give out parent folder name if not containing a certain file - linux

I am looking for a Linux terminal command to print the name of each parent folder that does not contain a certain file.
By now I use the following command:
find . -type d -exec test -e '{}'/recon-all.done \; -print | wc -l
which gives me the number of folders that contain the file.
The file recon-all.done would be in /subject/../../recon-all.done, and I would need every single "subject" name that does not contain the recon-all.done file.

Loop through the directories, test for the existence of the file, and print the directory if the test fails.
for subject in */; do
    if ! [ -e "${subject}scripts/recon-all.done" ]; then
        echo "$subject"
    fi
done
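A quick way to sanity-check the loop, using a throwaway layout (the subj_a/subj_b names are invented for the demo); here the non-matching directories are collected into a variable instead of printed directly:

```shell
# subj_a has scripts/recon-all.done, subj_b does not
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p subj_a/scripts subj_b/scripts
touch subj_a/scripts/recon-all.done

missing=""
for subject in */; do
    if ! [ -e "${subject}scripts/recon-all.done" ]; then
        missing="$missing$subject"
    fi
done
echo "$missing"   # subj_b/
```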

Your command:
find . -type d -exec test -e '{}'/recon-all.done \; -print | wc -l
almost does the job; we just need to:
Remove the | wc -l so it shows the directory paths which do not contain the recon-all file.
Now we can negate the -exec test by adding a ! like so:
find . -type d \! -exec test -e '{}'/recon-all.done \; -print
This way find will show each folder name that does not contain the recon-all file.
Note: based on your comment on Barmar's answer, I've added a -maxdepth 1 in the example below to prevent deeper directories from being checked.
Small example from my local machine:
$ /tmp/test$ tree
.
├── a
│   └── test.xt
├── b
├── c
│   └── test.xt
└── x
    ├── a
    │   └── test.xt
    └── b

6 directories, 3 files
$ /tmp/test$ find . -maxdepth 1 -type d \! -exec test -e '{}/test.xt' \; -print
.
./b
./x
$ /tmp/test$

Related

Change permission of a directory and subdirectories with xargs and chmod commands in linux

I have a list of directories in the current directory named with their permission codes (for example: 552, 700, 777). I want to get the permission code from the name of each directory and apply it to the directory and all the files it contains.
I tried with the xargs command:
find . -name "[0-9][0-9][0-9]" -type d | xargs chmod -R [0-9][0-9][0-9]
The problem with this command is that it takes the first directory and changes the permission code of all the directories to it.
├── 555
│   └── logs
│       ├── 01.log
│       ├── 02.log
│       ├── 03.log
│       ├── 04.log
│       ├── 05.log
│       ├── 06.log
│       └── 07.log
├── 700
│   └── data
│       └── data1.data
What I want: for the 555 directory, change all subfiles and subdirectories to permission code 555, and for the second directory, change all subfiles and subdirectories to permission code 700.
What my command does: it changes all other files and subdirectories to the permission code of the first directory, 555.
Try
find . -name "[0-9][0-9][0-9]" -type d | sed 's#\./\(.*\)#\1#' | xargs -I{} chmod -R {} {}
The find is the same as yours.
The sed is added to remove the ./ from the directory name; find returns ./700, ./555, ...
xargs with -I uses {} to reuse what it received in the command, so it runs "chmod -R DIRNAME DIRNAME", i.e. chmod -R 700 700 and so on.
In your attempt, xargs chmod -R [0-9][0-9][0-9], there is nothing to link the [0-9] in the find to the [0-9] in xargs.
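A minimal end-to-end sketch of that pipeline against a throwaway layout modeled on the question (GNU stat is assumed for the permission check):

```shell
# Throwaway layout: each top-level dir is named after the mode it should get
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p 555/logs 700/data
touch 555/logs/01.log 700/data/data1.data

# The answer's pipeline: strip the leading ./ and chmod each dir to its own name
find . -name "[0-9][0-9][0-9]" -type d | sed 's#\./\(.*\)#\1#' | xargs -I{} chmod -R {} {}

# Verify with GNU stat: files inherited the mode of their top-level directory
perm_logs=$(stat -c '%a' 555/logs/01.log)
perm_data=$(stat -c '%a' 700/data/data1.data)
echo "$perm_logs $perm_data"   # 555 700
```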
Without xargs
find . -type d -regextype sed -regex ".*/[0-9]\{3\}$"| awk -F"/" '{print "chmod -R " $NF,$0}'|sh
find:
find . -type d -name '[0-7][0-7][0-7]' \
-exec sh -c 'for i do chmod -R "${i##*/}" "$i"; done' _ {} +
or a bash loop:
shopt -s globstar
for i in **/[0-7][0-7][0-7]/; do
    i=${i%/}
    chmod -R "${i##*/}" "$i"
done
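Both variants rely on shell parameter expansion to pull the permission code out of the path; a tiny illustration on an invented path:

```shell
i=backups/2024/700/     # invented path, as matched by the glob above
i=${i%/}                # drop the trailing slash -> backups/2024/700
mode=${i##*/}           # strip everything up to the last '/' -> 700
echo "$mode $i"         # 700 backups/2024/700
```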

Find directories where a text is found in a specific file

How can I find the directories where a text is found in a specific file? E.g. I want to get all the directories in "/var/www/" that contain the text "foo-bundle" in the composer.json file. I have a command that already does it:
find ./ -maxdepth 2 -type f -print | grep -i 'composer.json' | xargs grep -i '"foo-bundle"'
However I want to make an sh script that gets all those directories and do things with them. Any idea?
find
Your current command is almost there; instead of using xargs with grep, let's:
Move the grep to an -exec
Use xargs to pass the result to dirname to show only the parent folder
find ./ -maxdepth 2 -type f -exec grep -l "foo-bundle" {} /dev/null \; | xargs dirname
If you only want to search for composer.json files, we can include the -iname option like so:
find ./ -maxdepth 2 -type f -iname '*composer.json' -exec grep -l "foo-bundle" {} /dev/null \; | xargs dirname
If the | xargs dirname doesn't give enough data, we can extend it so we can loop over the results of find using a while read like so:
find ./ -maxdepth 2 -type f -iname '*composer.json' -exec grep -l "foo-bundle" {} /dev/null \; | while read -r line ; do
    parent="$(dirname "${line%%:*}")"
    echo "$parent"
done
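A self-contained sketch of that while read loop against an invented two-directory layout, where only test/b/composer.json mentions foo-bundle:

```shell
# Invented layout: only test/b/composer.json contains the search term
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p test/a test/b
echo '{"require": {"other-bundle": "*"}}' > test/a/composer.json
echo '{"require": {"foo-bundle": "*"}}'  > test/b/composer.json

# grep -l prints matching file names; dirname reduces them to their folder
dirs=$(find ./test -maxdepth 2 -type f -iname '*composer.json' \
        -exec grep -l "foo-bundle" {} /dev/null \; | while read -r line; do
    dirname "${line%%:*}"
done)
echo "$dirs"   # ./test/b
```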
grep
We can use grep to search for all files containing a specific text.
After looping over each line, we can:
Remove everything after the : to get the file path
Use dirname to get the parent folder path
Consider this file setup, where /tmp/test/b/composer.json contains foo-bundle:
➜ /tmp tree
.
├── test
│   ├── a
│   │   └── composer.json
│   └── b
│       └── composer.json
└── test.sh
When running the following test.sh:
#!/bin/bash
grep -rw '/tmp/test' --include '*composer.json' -e 'foo-bundle' | while read -r line ; do
    parent="$(dirname "${line%:*}")"
    echo "$parent"
done
The result is as expected, the path to folder b:
/tmp/test/b
In order to find all files, containing a particular piece of text, you can use:
find ./ -maxdepth 2 -type f -exec grep -l "composer.json" {} /dev/null \;
The result is a list of filenames. Now all you need to do is to get a way to launch the command dirname on all of them. (I tried using a simple pipe, but that would have been too easy :-) )
Thanks to @0stone0 for leading the way. I finally got it with:
#!/bin/sh
find /var/www -maxdepth 2 -type f -print | grep -i 'composer.json' | xargs grep -i 'foo-bundle' | while read -r line ; do
    parent="$(dirname "${line%%:*}")"
    echo "$parent"
done

How to get occurrences of word in all files? But with count of the words per directory instead of single number

I would like to get the count of a given word in all the files, but per directory instead of a single total. I can get the word count with a simple grep foo error*.log | wc -l by going to a specific directory. I would like to get the word count per directory when the directory structure is like below.
Directory tree
.
├── dir1
│   ├── error1.log
│   └── error2.log
├── dir2
│   ├── error_123.log
│   └── error_234.log
└── dir3
    ├── error_12345.log
    └── error_23554.log
Update: The following command can be used on AIX:
#!/bin/bash
for name in /path/to/folder/* ; do
    if [ ! -d "${name}" ] ; then
        continue
    fi
    # See: https://unix.stackexchange.com/a/398414/45365
    count="$(cat "${name}"/error*.log | tr '[:space:]' '[\n*]' | grep -c 'SEARCH')"
    printf "%s %s\n" "${name}" "${count}"
done
On GNU/Linux, with GNU findutils and GNU grep:
find /path/to/folder -maxdepth 1 -type d \
    -printf "%p " -exec bash -c 'grep -ro "SEARCH" "$1" | wc -l' _ {} \;
Replace SEARCH by the actual search term.
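A runnable sketch of the GNU find/grep variant against an invented layout, with -mindepth 1 added so the top-level directory itself is not counted and the output sorted for stability:

```shell
# Invented layout: dir1 has two SEARCH hits, dir2 has one
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p dir1 dir2
printf 'SEARCH\nSEARCH\n' > dir1/error1.log
printf 'SEARCH\n' > dir2/error2.log

# grep -ro prints one line per match; wc -l turns that into a per-dir count
counts=$(find . -mindepth 1 -maxdepth 1 -type d \
    -printf "%p " -exec bash -c 'grep -ro "SEARCH" "$1" | wc -l' _ {} \; | sort)
echo "$counts"
# ./dir1 2
# ./dir2 1
```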

How to pass directory name from find to grep through xargs?

.
├── AAA
│   └── 01.txt
├── AAA_X
│   └── 03.txt
├── BBB
│   └── 02.txt
└── BBB_X
    └── 04.txt
$ find . -not -name \*_X -type d -print0 | xargs -0 -n1 -I {} grep 'Hello' {}/\*.txt
grep: ./*.txt: No such file or directory
grep: ./AAA/*.txt: No such file or directory << Why failed here?
grep: ./BBB/*.txt: No such file or directory
$ grep 'Hello' AAA/*.txt
Hello
Question> How can I pass the directory names to grep from find with xargs?
The problem is that xargs doesn't execute the command through a shell, so the glob {}/\*.txt is passed to grep literally instead of being expanded.
You should use -name '*.txt' to get the files directly in the find command. To exclude the *_X directories, you can use -prune:
find . -type d -name '*_X' -prune -o -name '*.txt' -exec grep 'Hello' {} +
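A quick sketch showing the -prune approach on a layout like the one in the question (file contents invented so the non-excluded files match):

```shell
# Layout from the question; only non-*_X directories should be searched
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p AAA AAA_X BBB BBB_X
echo 'Hello' > AAA/01.txt
echo 'Hello' > AAA_X/03.txt
echo 'Hello' > BBB/02.txt
echo 'Hello' > BBB_X/04.txt

# -prune stops find from descending into *_X; grep -l lists matching files
matches=$(find . -type d -name '*_X' -prune -o -name '*.txt' \
    -exec grep -l 'Hello' {} + | sort)
echo "$matches"
# ./AAA/01.txt
# ./BBB/02.txt
```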
Why not use --exclude with grep?
grep --exclude='*_X/*.txt' Hello */*.txt

Copy files and preserving directory structure

Here's what I have to do: Find all files which are in the directory src (or in its subdirectories) and have str in their name and copy them to dest preserving the subdirectory structure. For example I have the directory dir1 which contains foo.txt and the subdirectory subdir which also contains foo.txt. After running my script (with str=txt and dest=dir2) dir2 should countain foo.txt and subdir/foo.txt. So far I have come up with this code:
while read -r line; do
    cp --parents "$line" "$dest"
done <<< "$(find "$src" -name "*$str*")"
which almost does the job, except that it creates dir1 inside dir2, so the desired files end up in dir2/dir1. I also tried the -exec option of find but didn't get better results.
IIUC, this can be done with find ... -exec. Let's say we have the following directory:
$ tree
.
└── src
    ├── dir1
    │   └── yet_another_file_src
    └── file_src

2 directories, 2 files
We can copy all files that contain *src* to /tmp/copy-here like this:
$ find . -type f -name "*src*" -exec sh -c 'echo mkdir -p /tmp/copy-here/$(dirname {})' \; -exec sh -c 'echo cp {} /tmp/copy-here/$(dirname {})' \;
mkdir -p /tmp/copy-here/./src
cp ./src/file_src /tmp/copy-here/./src
mkdir -p /tmp/copy-here/./src/dir1
cp ./src/dir1/yet_another_file_src /tmp/copy-here/./src/dir1
Notice that I used echo instead of really running this command -
read the output and make sure that this is what you want to
achieve. If you're sure that this would be what you want just remove
echo like this:
$ find . -type f -name "*src*" -exec sh -c 'mkdir -p /tmp/copy-here/$(dirname {})' \; -exec sh -c 'cp {} /tmp/copy-here/$(dirname {})' \;
$ tree /tmp/copy-here
/tmp/copy-here
└── src
    ├── dir1
    │   └── yet_another_file_src
    └── file_src

2 directories, 2 files
EDIT:
And of course, you can always use rsync:
$ rsync -avz --include "*/" --include="*src*" --exclude="*" "$PWD" /tmp/copy-here
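As a footnote to the asker's original complaint (dir1 ending up inside dir2): a sketch of a cp --parents workaround, cd-ing into the source directory first so the copied paths are relative to it (directory names invented):

```shell
# Invented layout: dir1 is the source, dir2 the destination
workdir=$(mktemp -d)
cd "$workdir"
mkdir -p dir1/subdir dir2
touch dir1/foo.txt dir1/subdir/foo.txt

# cd into the source so cp --parents rebuilds paths relative to dir1,
# avoiding the extra dir1 level inside dir2
(cd dir1 && find . -type f -name "*txt*" -exec cp --parents {} ../dir2 \;)

ls dir2/subdir   # foo.txt
```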
