Linux shell script for finding and listing all files with write permission in a directory tree

Here is the code that I have so far:
echo $(pwd > adress)
var=$(head -1 adress)
rm adress
found=0 #Flag
fileshow()
{
cd $1
for i in *
do
if [ -d $i ]
then
continue
elif [ -w $i ]
then
echo $i
found=1
fi
done
cd ..
}
fileshow $1
if [ $found -eq 0 ]
then
clear
echo "$(tput setaf 1)There arent any executable files !!!$(tput sgr0)"
fi
It's working, but it finds files only in the current directory.
I was told that I need to use some kind of recursive method to loop through all the sub-directories, but I don't know how to do it.
So if anyone can help me I will be very grateful.
Thanks!

The effect of your script is to find the files below the current working directory that are not directories and are writable by the current user. This can be achieved with the command:
find ./ -type f -writable
The advantage of using -type f is that it also excludes symbolic links and other special kinds of file, if that's what you want. If you want all files that are not directories (as suggested by your script), then you can use:
find ./ ! -type d -writable
If you want to sort these files (added question, assuming lexicographic ascending order), you can use sort:
find ./ -type f -writable | sort
If you want to use these sorted filenames for something else, the canonical pattern would be (to handle filenames with embedded newlines and other seldom-used characters):
while read -r -d $'\0'; do
echo "File '$REPLY' is an ordinary file and is writable"
done < <(find ./ -type f -writable -print0 | sort -z)
If you're using a very old version of find that does not support the handy -writable predicate (added to v.4.3 in 2005), then you only have file permissions to go on. You then have to be clear about what you mean by “writable” in the specific context (writable to whom?), and you can replace the -writable predicate with the -perm predicates described in #gregb's answer. If you decide that you mean “writable by anyone” you could use -perm /u=w,g=w,o=w or -perm /222, but there's actually no way of getting all the benefits of -writable just using permissions. Also note that the + form of permission tests to -perm is deprecated and should no longer be used; the / form should be used instead.
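For instance, combined with the ! -type d test from above, the "writable by anyone" interpretation would look something like this (a sketch only; unlike -writable, it cannot account for file ownership or ACLs):
find ./ ! -type d -perm /222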

You could use find:
find /path/to/directory/ -type f -perm -o=w
Where the -o=w implies that each file has the "other write-permission" set.
or,
find /path/to/directory/ -type f -perm /u+w,g+w,o+w
Where /u+w,g+w,o+w implies that each file either has user, group, or other write-permissions set.

Related

linux command line recursively check directories for at least 1 file with the same name as the directory

I have a directory containing a large number of directories. Each directory contains some files and in some cases another directory.
parent_directory
  sub_dir_1
    sub_dir_1.txt
    sub_dir_1_1.txt
  sub_dir_2
    sub_dir_2.txt
    sub_dir_2_1.txt
  sub_dir_3
    sub_dir_3.txt
    sub_dir_3_1.txt
  sub_dir_4
    sub_dir_4.txt
    sub_dir_4_1.txt
  sub_dir_5
    sub_dir_5.txt
    sub_dir_5_1.txt
I need to check that each sub_dir contains at least one file with the exact same name. I don't need to check any further down if there are sub-directories within the sub_dirs.
I was thinking of using for d in ./*/ ; do (command here); done but I don't know how to get access to the sub_dir name inside the for loop.
for d in ./*/ ;
do
(if directory does not contain 1 file that is the same name as the directory then echo directory name );
done
What is the best way to do this or is there a simpler way?
From the parent directory,
find -maxdepth 1 -type d -printf "%f\n" |
xargs -I {} find {} -maxdepth 1 -type f -name {}.txt
will give you the name/name.txt pair for every directory that does contain the file. Compare that with the full list of directory names to find the missing ones.
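One way to do that comparison is with comm, e.g. (a sketch, assuming GNU find and names without newlines; it prints the directories that lack a matching name.txt):
comm -23 \
  <(find . -mindepth 1 -maxdepth 1 -type d -printf "%f\n" | sort) \
  <(find . -mindepth 1 -maxdepth 1 -type d -printf "%f\n" | xargs -I {} find {} -maxdepth 1 -type f -name {}.txt -printf "%h\n" | sort)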
UPDATE
This might be simpler: instead of scanning, you can check directly whether the file exists or not.
for f in $(find -maxdepth 1 -type d -printf "%f\n");
do if [ ! -e "$f/$f.txt" ];
then echo "$f not found";
fi; done
Maybe I don't understand fully, but
find . -print | grep -P '/(.*?)/\1\.txt'
this will print any file which is inside a directory of the same name, e.g.:
./a/b/b.txt
./a/c/d/d.txt
etc...
Similarly
find . -print | sed -n '/\(.*\)\/\1\.txt/p'
And this
find . -print | grep -P '/(.*?)/\1\.'
will list all files regardless of the extension in same-named dirs.
You can craft other regexes following the backreference logic.
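For example (a sketch), to restrict the match to directories directly under the current one:
find . -print | grep -P '^\./([^/]+)/\1\.txt$'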

How to rename directory and subdirectories recursively in linux?

Let's say I have 200 directories with a variable hierarchy of sub-directories. How can I rename a directory and its sub-directories using the mv command, with find or any sort of combination?
for dir in ./*/; do (i=1; cd "$dir" && for dir in ./*; do printf -v dest %s_%02d "$dir" "$((i++))"; echo mv "$dir" "$dest"; done); done
This works for two levels of sub-directories; is there a cleaner way to do it for an arbitrary hierarchy? Any other one-line command suggestions/solutions are welcome.
I had a specific task - to replace non-ASCII symbols and square brackets, in directories and in files as well. It works fine.
First, exactly my case, as a working example:
find . -depth -execdir rename -v 's/([^\x00-\x7F]+)|([\[\]]+)/\_/g' {} \;
or, separately, for non-ASCII and brackets:
find . -depth -execdir rename -v 's/[^\x00-\x7F]+/\_/g' {} \;
find . -depth -execdir rename -v 's/[\[\]]+/\_/g' {} \;
If we'd like to work only with directories, add -type d (after the -depth option)
Now, in a more generalized view:
find . -depth [-type d] [-type f] -execdir rename [-v] 's/.../.../g' '{}' \;
Here we can control whether dirs and/or files are processed, and the verbosity. Quotes around {} may or may not be needed on your machine (the backslash before ; serves the same escaping purpose and may likewise be replaced with quotes).
You have two options when you want to do recursive operations on files/directories:
Option 1 : Find
while IFS= read -r -d '' subd;do
#do your stuff here with var $subd
done < <(find . -type d -print0)
In this case we use find to return only dirs, using -type d.
We can ask find to return only files using -type f, or not specify any type at all, in which case both directories and files will be returned.
We also use the find option -print0 to force null separation of the results and thus ensure correct name handling in case names include special characters like spaces, etc.
Testing:
$ while IFS= read -r -d '' s;do echo "$s";done < <(find . -type d -print0)
.
./dir1
./dir1/sub1
./dir1/sub1/subsub1
./dir1/sub1/subsub1/subsubsub1
./dir2
./dir2/sub2
Option 2 : Using Bash globstar option
shopt -s globstar
for subd in **/ ; do
#Do your stuff here with the $subd directories
done
In this case, the for loop will match all subdirs under the current working directory (the **/ pattern).
You can also ask bash to return both files and folders using for sub in ** and then branch on the type inside the loop:
for sub in ** ; do
  if [[ -d "$sub" ]]; then
    #actions for folders
  elif [[ -e "$sub" ]]; then
    #actions for files
  else
    #do something else
  fi
done
Folders Test:
$ shopt -s globstar
$ for i in **/ ;do echo "$i";done
dir1/
dir1/sub1/
dir1/sub1/subsub1/
dir1/sub1/subsub1/subsubsub1/
dir2/
dir2/sub2/
In your small script, just by enabling shopt -s globstar and changing your for to for dir in **/; do, it seems to work as you expect.

Linux rename files as dirname

I've got lots of files like this:
./1/wwuhw.mp3
./2/nweiewe.mp3
./3/iwqjoiw.mp3
./4/ncionw.MP3
./5/joiwqfm.wmv
./6/jqoifiew.WMV
How can I rename them like this in Linux Bash:
./1/1.mp3
./2/2.mp3
./3/3.mp3
./4/4.MP3
./5/5.wmv
./6/6.WMV
Try this,
for i in */*; do mv $i $(dirname $i)/$(dirname $i).${i##*.}; done
The for loop iterates over each file matched by */*, and the mv statement renames each one in turn.
Something like this should do the job:
for i in */*; do
echo mv "${i}" "${i%/*}/${i%/*}.${i##*.}"
done
See e.g. here for what these cryptic parameter expansions (like ${i%/*}) mean in bash.
The script above will only print the commands in the console, without invoking them. Once you are sure you want to proceed, you can remove the echo statement and let it run.
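To make the expansions concrete, take the first sample file from the question; with the */* glob it shows up as i=1/wwuhw.mp3, and then:
# ${i%/*}  -> 1      (shortest /* suffix removed: the directory part)
# ${i##*.} -> mp3    (longest *. prefix removed: the extension)
# so the printed command is: mv 1/wwuhw.mp3 1/1.mp3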
If you don't mind using an external tool, then rnm can do this pretty easily:
rnm -ns '/pd0/./e/' */*
/pd0/ is the immediate parent directory, /pd1/ is the directory before that and so forth.
-ns means name string, and /pd/ and /e/ are name string rules which expand to the parent directory and the file extension respectively.
The general format of the /pd/ rule is /pd<digit>-<digit>-<delim>/, for example, a rule like /pd0-2-_/ will construct dir0_dir1_dir2 from a directory structure of dir2/dir1/dir0
More examples can be found here.
The for loop method, as outlined in some of the other answers, would suffice and work great for most cases where you need to rename every file in a directory to the first parent's directory name. My particular case called for a bit more granularity, where I only wanted to rename a subset of the files in a directory and assert that the operand was, in fact, an actual file, not an empty directory, symbolic link, etc. Using find can achieve exactly what you want in addition to the added ability to apply filtration and processing to the file inputs and outputs.
#####################################
# Same effect as using a `for` loop #
#####################################
#
# -mindepth 2 : ensures that the file has a parent directory.
# -type f : ensures that we are working with a `regular file` (not directory, symlink, etc.).
find . -mindepth 2 -type f -exec bash -c 'file="{}"; dir="$(dirname $file)"; mv "$file" "$dir/${dir##*/}.${file##*.}"' \;
#########################
# Additional filtration #
#########################
# mp3 ONLY (case insensitive)
find . -mindepth 2 -type f -iname "*.mp3" -exec bash -c 'file="{}"; dir="$(dirname $file)"; mv "$file" "$dir/${dir##*/}.${file##*.}"' \;
# mp3 OR mp4 ONLY (case insensitive)
find . -mindepth 2 -type f \( -iname "*.mp3" -or -iname "*.mp4" \) -exec bash -c 'file="{}"; dir="$(dirname $file)"; mv "$file" "$dir/${dir##*/}.${file##*.}"' \;

Bash - how to exclude directory with find command and how to get full path with find?

So I have the code down below, and I'm running into a few problems with it.
I'm having trouble excluding the directories being outputted by
find ${1-.}
It is giving me the directories too instead of only names; I've tried different methods such as -prune etc.
I'm having trouble with deleting the empty files
The data given to me by
EMPTY_FILE=$(find ${1-.} -size 0)
Does not give me the correct path
Here is the output for that
TestFolder/TestFile
in this case I can't just do:
rm TestFolder/TestFile
as it is an invalid path, since it needs to be ./TestFolder/TestFile.
How would I add on the ./, or is there a way to get the full path?
#!/bin/bash
echo "Here are all the files in the directory specified\n"
find ${1-.}
EMPTY_FILE=$(find ${1-.} -size 0)
echo "Here are the list of empty files\n"
echo "$EMPTY_FILE \n"
echo "Do you want to delete those empty files?(yes/no)"
read text
if [ "$text" == "yes" ]; then $(rm -- $EMPTY_FILE); fi
Any help is appreciated!
You want this:
#!/bin/bash
echo -e "Here are all the files in the directory specified\n"
# Use -printf "%f\n" to print the filename without leading directories
# Use -type f to restrict find to files
find "${1-.}" -type f -printf " %f\n"
echo -e "Here are the list of empty files\n"
# Again, use -printf "%f\n"
find "${1-.}" -type f -size 0 -printf " %f\n"
echo -e "Do you want to delete those empty files?(yes/no)"
read answer
# Delete files using the `-delete` option
[ "$answer" = "yes" ] && find "${1-.}" -type f -size 0 -delete
Also note that I've quoted "${1-.}" at every occurrence. Since it is user input, you can't rely on it being well-formed. Even if it is a valid path, it might still contain problematic characters, like spaces.
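For example, an invocation like the following (hypothetical script name) only behaves correctly because of that quoting:
./emptyfiles.sh "/tmp/dir with spaces"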
I'm having trouble excluding the directories being outputted by
find ${1-.}
It is giving me the directories too instead of only names
You are looking for the -type test. To instruct find to report only regular files, you could say
find ${1-.} -type f
That's probably what you really want, but what you actually asked (to exclude only directories) would be
find ${1-.} -not -type d
Excluding only directories will list symbolic links and special files, too.
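If, for example, you also wanted to skip symbolic links while still listing other non-directory files, you could stack the type tests (a sketch):
find ${1-.} -not -type d -not -type l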
in this case I can't just do:
rm TestFolder/TestFile
As it is invalid path; since it needs ./TestFolder/TestFile
Nonsense. ./TestFolder/TestFile means exactly the same thing as TestFolder/TestFile.
In any event, find does print paths starting at the specified starting path(s).
I have a feeling that I'm missing something from your question, but if all you need to do is exclude directories, just tell find to only look for files:
find . -type f -size 0 -delete
And then adjust that to suit your script. Hope this helps.
Use -size 0 -type f.
rm with no options will not delete directories. Your claim that rm needs ./ is wrong anyway.

A bash script to run a program for directories that do not have a certain file

I need a Bash script to execute a program for all directories that do not have a specific file, and to create the output file in the same directory. This program needs an input file which exists in every directory with the name *.DNA.fasta. Suppose I have the following directories, which may also contain sub-directories:
dir1/a.protein.fasta
dir2/b.protein.fasta
dir3/anyfile
dir4/x.orf.fasta
I have started by finding the directories that don't have that specific file, whose name matches *.protein.fasta.
In this case I want dir3 and dir4 to be listed (since they do not contain *.protein.fasta).
I have tried this code:
find . -maxdepth 1 -type d \! -exec test -e '{}/*protein.fasta' \; -print
but it seems I missed something; it does not work.
Also, I do not know how to proceed with the rest of the task.
This is a tricky one.
I can't think of a good solution. But here's a solution, nevertheless. Note that this is guaranteed not to work if your directory or file names contain newlines, and it's not guaranteed to work if they contain other special characters. (I've only tested with the samples in your question.)
Also, I haven't included a -maxdepth because you said you need to search subdirectories too.
#!/bin/bash
# Create an associative array
declare -A excludes
# Build an associative array of directories containing the file
while read line; do
excludes[$(dirname "$line")]=1
echo "excluded: $(dirname "$line")" >&2
done <<EOT
$(find . -name "*protein.fasta" -print)
EOT
# Walk through all directories, print only those not in array
find . -type d \
| while read line ; do
if [[ ! ${excludes[$line]} ]]; then
echo "$line"
fi
done
For me, this returns:
.
./dir3
./dir4
All of which are directories that do not contain a file matching *.protein.fasta. Of course, you can replace the last echo "$line" with whatever you need to do with these directories.
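For instance, sticking with the naming from the question (hypothetical; Programfoo and the output filename stand in for whatever you actually need), the echo could become:
Programfoo "$line"/*.DNA.fasta > "$line/output.txt"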
Alternately:
If what you're really looking for is just the list of top-level directories that do not contain the matching file in any subdirectory, the following bash one-liner may be sufficient:
for i in *; do test -d "$i" && ( find "$i" -name '*protein.fasta' | grep -q . || echo "$i" ); done
#!/bin/bash
for dir in *; do
test -d "$dir" && ( find "$dir" -name '*protein.fasta' | grep -q . || Programfoo "$dir/$dir.DNA.fasta");
done
