How to pass directory name from find to grep through xargs? - linux

.
├── AAA
│   └── 01.txt
├── AAA_X
│   └── 03.txt
├── BBB
│   └── 02.txt
└── BBB_X
    └── 04.txt
$ find . -not -name \*_X -type d -print0 | xargs -0 -n1 -I {} grep 'Hello' {}/\*.txt
grep: ./*.txt: No such file or directory
grep: ./AAA/*.txt: No such file or directory   << Why does it fail here?
grep: ./BBB/*.txt: No such file or directory
$ grep 'Hello' AAA/*.txt
Hello
Question> How can I pass the directory names to grep from find with xargs?

The problem is that xargs doesn't execute the command through a shell, so the glob {}/*.txt is passed to grep literally and is never expanded.
A better approach is to match the files directly in find with -name '*.txt'. To exclude the *_X directories, use -prune:
find . -type d -name '*_X' -prune -o -name '*.txt' -exec grep 'Hello' {} +
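If you do want to keep the xargs pipeline, the fix is to insert a shell so the glob gets expanded per directory. A minimal, self-contained sketch (the demo/ layout below is hypothetical, mirroring the tree in the question):

```shell
# xargs runs grep directly, so "{}/*.txt" is never glob-expanded --
# routing the command through "sh -c" gives each directory its own
# shell, which expands the glob.
mkdir -p demo/AAA demo/AAA_X demo/BBB
echo 'Hello' > demo/AAA/01.txt
echo 'Hello' > demo/AAA_X/03.txt
echo 'Hello' > demo/BBB/02.txt

# The directory name is passed as a positional parameter ($1), so it
# survives spaces; grep -H forces the filename prefix in the output.
find demo -mindepth 1 -not -name '*_X' -type d -print0 |
    xargs -0 -I {} sh -c 'grep -H "Hello" "$1"/*.txt' _ {}
```

This prints demo/AAA/01.txt:Hello and demo/BBB/02.txt:Hello, and skips the *_X directories.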

Why not use --exclude with grep?
grep --exclude='*_X/*.txt' 'Hello' */*.txt
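A quick check of the --exclude form (GNU grep assumed; the glob is quoted so grep, not the shell, applies it — grep matches --exclude patterns against name suffixes of each command-line file). The ex/ layout is a hypothetical stand-in for the question's tree:

```shell
# Hypothetical reproduction of the AAA/AAA_X/BBB layout.
mkdir -p ex/AAA ex/AAA_X ex/BBB
echo 'Hello' > ex/AAA/01.txt
echo 'Hello' > ex/AAA_X/03.txt
echo 'Hello' > ex/BBB/02.txt

# The quoted glob is applied by grep itself: AAA_X/03.txt is a name
# suffix matching '*_X/*.txt', so that file is skipped.
grep --exclude='*_X/*.txt' 'Hello' ex/*/*.txt
```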

Related

Change permission of a directory and subdirectories with xargs and chmod commands in linux

I have a list of directories in the current directory named with their permission codes (for example: 552, 700, 777). I want to take the permission code from each directory's name and apply it to the directory and all the files it contains.
I tried with the xargs command :
find . -name "[0-9][0-9][0-9]" -type d | xargs chmod -R [0-9][0-9][0-9]
The problem with this command is that it takes the first directory and changes the permission code of all the directories to it.
├── 555
│   └── logs
│       ├── 01.log
│       ├── 02.log
│       ├── 03.log
│       ├── 04.log
│       ├── 05.log
│       ├── 06.log
│       └── 07.log
├── 700
│   └── data
│       └── data1.data
What I want: for the 555 directory, change all its subdirectories and files to permission code 555, and for the 700 directory, change everything under it to permission code 700.
What my command does: it changes all the other files and subdirectories to the permission code of the first directory, 555.
Try
find . -name "[0-9][0-9][0-9]" -type d | sed 's#\./\(.*\)#\1#' | xargs -I{} chmod -R {} {}
The find is the same as yours.
The sed is added to remove the ./ prefix from the directory name, since find returns ./700, ./555, ..., and ./700 is not a valid mode argument for chmod.
xargs with -I uses {} to substitute what it received into the command, so it runs "chmod -R DIRNAME DIRNAME" — that is, chmod -R 700 700, and so on.
In your attempt, xargs chmod -R [0-9][0-9][0-9], there is nothing linking the [0-9] pattern in the find to the [0-9] pattern given to xargs: the shell expands the unquoted pattern against the current directory, so the command becomes something like chmod -R 555 700 ..., which applies the first code as the mode to everything else.
Without xargs
find . -type d -regextype sed -regex ".*/[0-9]\{3\}$"| awk -F"/" '{print "chmod -R " $NF,$0}'|sh
find:
find . -type d -name '[0-7][0-7][0-7]' \
    -exec sh -c 'for i do chmod -R "${i##*/}" "$i"; done' _ {} +
or a bash loop:
shopt -s globstar
for i in **/[0-7][0-7][0-7]/; do
    i=${i%/}
    chmod -R "${i##*/}" "$i"
done
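A quick end-to-end check of the idea (hypothetical perms/ layout; stat -c is GNU coreutils). Each mode-named directory gets its own mode, taken from its basename, applied recursively:

```shell
# Hypothetical stand-in for the 555/ and 700/ layout in the question.
mkdir -p perms/555/logs perms/700/data
touch perms/555/logs/01.log perms/700/data/data1.data

# ${i##*/} strips everything up to the last slash, leaving the
# mode-named basename, which doubles as the chmod mode.
for i in perms/[0-7][0-7][0-7]; do
    chmod -R "${i##*/}" "$i"
done

# GNU stat: print the octal mode next to each path to verify.
stat -c '%a %n' perms/555 perms/700/data/data1.data
```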

Find directories where a text is found in a specific file

How can I find the directories where a text is found in a specific file? E.g. I want to get all the directories in "/var/www/" that contain the text "foo-bundle" in the composer.json file. I have a command that already does it:
find ./ -maxdepth 2 -type f -print | grep -i 'composer.json' | xargs grep -i '"foo-bundle"'
However, I want to make a sh script that gets all those directories and does things with them. Any idea?
find
Your current command is almost there. Instead of using xargs with grep, let's:
Move the grep to an -exec
Use xargs to pass the result to dirname to show only the parent folder
find ./ -maxdepth 2 -type f -exec grep -l "foo-bundle" {} /dev/null \; | xargs dirname
If you only want to search for composer.json files, we can include the -iname option like so:
find ./ -maxdepth 2 -type f -iname '*composer.json' -exec grep -l "foo-bundle" {} /dev/null \; | xargs dirname
If the | xargs dirname doesn't give enough data, we can extend it so we can loop over the results of find using a while read like so:
find ./ -maxdepth 2 -type f -iname '*composer.json' -exec grep -l "foo-bundle" {} /dev/null \; | while read -r line ; do
    parent="$(dirname "${line%%:*}")"
    echo "$parent"
done
grep
We can use grep to search for all files containing a specific text.
After looping over each line, we can:
Strip everything from the first : onward to get the file path
Use dirname to get the parent folder path
Consider this file setup, where /tmp/test/b/composer.json contains foo-bundle:
➜ /tmp tree
.
├── test
│   ├── a
│   │   └── composer.json
│   └── b
│       └── composer.json
└── test.sh
When running the following test.sh:
#!/bin/bash
grep -rw '/tmp/test' --include '*composer.json' -e 'foo-bundle' | while read -r line ; do
    parent="$(dirname "${line%:*}")"
    echo "$parent"
done
The result is as expected, the path to folder b:
/tmp/test/b
In order to find all files containing a particular piece of text, you can use:
find ./ -maxdepth 2 -type f -exec grep -l "composer.json" {} /dev/null \;
The result is a list of filenames. Now all you need to do is to get a way to launch the command dirname on all of them. (I tried using a simple pipe, but that would have been too easy :-) )
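One way to finish the job (a sketch; the www/ layout and file contents are hypothetical): let grep -l print the matching composer.json paths, run dirname on each line with xargs -I, and de-duplicate the parents with sort -u.

```shell
# Hypothetical layout: only www/b/composer.json mentions foo-bundle.
mkdir -p www/a www/b
printf '{"require": {"foo-bundle": "^1.0"}}\n' > www/b/composer.json
printf '{"require": {}}\n' > www/a/composer.json

# grep -l prints one matching filename per line; xargs -I runs dirname
# once per line (so paths with spaces survive); sort -u collapses
# directories that contain several matching files.
find www -maxdepth 2 -type f -name 'composer.json' \
    -exec grep -l 'foo-bundle' {} + | xargs -I{} dirname {} | sort -u
```

This prints www/b, the parent directory of the only matching file.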
Thanks to @0stone0 for leading the way. I finally got it with:
#!/bin/sh
find /var/www -maxdepth 2 -type f -print | grep -i 'composer.json' | xargs grep -i 'foo-bundle' | while read -r line ; do
    parent="$(dirname "${line%%:*}")"
    echo "$parent"
done

Give out parent folder name if not containing a certain file

I am looking for a Linux terminal command that prints the name of each parent folder that does not contain a certain file:
By now I use the following command:
find . -type d -exec test -e '{}'/recon-all.done \; -print| wc -l
Which gives me the number of folders that contain the file.
The file recon-all.done would be in /subject/../../recon-all.done and I would need every single "subject" name which does not contain the recon-all.done file.
Loop through the directories, test for the existence of the file, and print the directory if the test fails.
for subject in */; do
    if ! [ -e "${subject}scripts/recon-all.done" ]
    then echo "$subject"
    fi
done
Your command:
find . -type d -exec test -e '{}'/recon-all.done \; -print| wc -l
Almost does the job; we just need to
Remove the | wc -l to show each directory path which does not contain the recon-all file
Now, we can negate the -exec test by adding a ! like so:
find . -type d \! -exec test -e '{}'/recon-all.done \; -print
This way find will show each folder name if it does not contain the recon-all file
Note: based on your comment on Barmar's answer, I've added a -maxdepth 1 to prevent deeper directories from being checked.
Small example from my local machine:
$ /tmp/test$ tree
.
├── a
│   └── test.xt
├── b
├── c
│   └── test.xt
└── x
    ├── a
    │   └── test.xt
    └── b
6 directories, 3 files
$ /tmp/test$ find . -maxdepth 1 -type d \! -exec test -e '{}/test.xt' \; -print
.
./b
./x
$ /tmp/test$

How to get occurrences of word in all files? But with count of the words per directory instead of single number

I would like to get the count of a given word in all the files, but per directory instead of a single total. I can get the word count for one directory by going into it and running grep foo error*.log | wc -l. I would like the count per directory when the directory structure looks like the one below.
Directory tree
.
├── dir1
│   ├── error1.log
│   └── error2.log
├── dir2
│   ├── error_123.log
│   └── error_234.log
└── dir3
    ├── error_12345.log
    └── error_23554.log
Update: The following command can be used on AIX:
#!/bin/bash
for name in /path/to/folder/* ; do
    if [ ! -d "${name}" ] ; then
        continue
    fi
    # See: https://unix.stackexchange.com/a/398414/45365
    count="$(cat "${name}"/error*.log | tr '[:space:]' '[\n*]' | grep -c 'SEARCH')"
    printf "%s %s\n" "${name}" "${count}"
done
On GNU/Linux, with GNU findutils and GNU grep:
find /path/to/folder -maxdepth 1 -type d \
    -printf "%p " -exec bash -c 'grep -ro "SEARCH" "$1" | wc -l' _ {} \;
Replace SEARCH by the actual search term.
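A runnable sketch of the per-directory form (the logs/ layout and file contents are hypothetical): grep -ro prints one line per match, so wc -l yields the occurrence count rather than the matching-line count.

```shell
# Hypothetical layout: dir1 has three occurrences, dir2 has one.
mkdir -p logs/dir1 logs/dir2
printf 'SEARCH here\nSEARCH twice SEARCH\n' > logs/dir1/error1.log
printf 'SEARCH once\n' > logs/dir2/error_123.log

# For each first-level directory, print "<dir> <count>". The directory
# is passed as $1 so names with spaces are handled safely.
find logs -mindepth 1 -maxdepth 1 -type d \
    -exec sh -c 'printf "%s %s\n" "$1" "$(grep -ro SEARCH "$1" | wc -l)"' _ {} \;
```

This prints "logs/dir1 3" and "logs/dir2 1" (directory order may vary).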

Pinpoint archive file from a list of archive files where the target file is zipped

I have a directory structure like this -
./Archive1
./Archive1/Archive2
./Archive1/Archive3
In each directory there are many tar files. Say for example (don't go with the name, they are just for example) -
Archive1
├── Archive2
│   ├── tenth.tar.gz
│   └── third.tar.gz
├── Archive3
│   ├── fourth.tar.gz
│   └── sixth.tar.gz
├── fifth.tar.gz
├── first.tar.gz
└── second.tar.gz
Now I have a file, file.txt, that could reside in any of the tar files. I need a command that tells me which tar file contains my input file (file.txt) and also gives the absolute path of that tar file.
So for example, if file.txt is in sixth.tar.gz, the output will be sixth.tar.gz and ./Archive1/Archive3/.
Currently I have this command, but its drawback is that it lists all the tar files:
find . -maxdepth 3 -type f -name "*.tar.gz" -printf "[%f]\n" -exec tar -tf {} \; | grep -iE "[\[]|file.txt"
For each tar file, you can run tar | grep, and if there is a match, print the tar file's name. One way to do this is by running a shell command for each tar file. For a small number of files, and if performance is not too important, this might be good enough and it's fairly straightforward.
find . -maxdepth 3 -type f -name "*.tar.gz" -exec sh -c 'tar -tf "$1" | grep -iq "file.txt" && echo "$1"' _ {} \;
So for example, if file.txt is in sixth.tar.gz, the output will be ./Archive1/Archive3/sixth.tar.gz.
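An end-to-end sketch of that approach (archive names taken from the question; the archive contents are hypothetical): build two small archives, then ask each one whether it contains file.txt.

```shell
# Hypothetical setup: only sixth.tar.gz contains file.txt.
mkdir -p Archive1/Archive3
touch file.txt other.txt
tar -czf Archive1/first.tar.gz other.txt
tar -czf Archive1/Archive3/sixth.tar.gz file.txt

# For each archive, list its members; grep -qx matches the exact entry
# name, and echo prints the archive path only on a hit.
find . -maxdepth 3 -type f -name '*.tar.gz' \
    -exec sh -c 'tar -tf "$1" | grep -qx "file.txt" && echo "$1"' _ {} \;
```

Only ./Archive1/Archive3/sixth.tar.gz is printed.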
