Command script recognizes files as directories - linux

The following code should count the number of elements that each directory contains. It does count correctly, but it also treats every element inside the current directory as a directory.
I don't know how to skip the elements that are not directories. How could I do it?
Code is here: http://pastebin.com/9R4eB4Xn
termlog.txt:
https://justpaste.it/tgsl
As you may see, some files like .jpg or .zip are recognized as directories.

Your echo "Element is a directory" is between the if and the then. Move it after the then:
for i in *
do
    if [ ! -f "$i" ] && [ -d "$i" ]
    then
        echo "Element is a directory"
        FILES=`ls -l "$i" | wc -l`    # List the content of the "$i" directory
                                      # and count the number of lines
        FILES2=`expr $FILES - 1`      # Subtract one because one line is
                                      # occupied by the "total" blocks line
        echo "$i: $FILES2"            # Show the name of the directory and
                                      # the number of entries it has
    fi
done

for i in $(find DIRECTORY -maxdepth 2 -type d); do echo "$i: $(ls -1 "$i" | wc -l)"; done
If you are only interested in the current directory, replace DIRECTORY with a dot (.).
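A variant that avoids parsing ls (and the off-by-one for its "total" line) is to let find do the counting. A minimal sketch, still assuming no newlines in file names:
for i in */; do                                          # a trailing slash matches directories only
    count=$(find "$i" -mindepth 1 -maxdepth 1 | wc -l)   # entries directly inside "$i"
    echo "${i%/}: $count"
done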

Related

Displaying file content with bash scripting

I am trying to write a bash script (display) that will allow me to access a directory, list the files, and then display the content of all of the files. So far I am able to access the directory and list the files.
#!/bin/bash
#Check for folder name
if [ "$#" -ne 1 ]; then
    echo " Usage: count [folder name]"
    exit 1
fi
#Check if it is a directory
if [ ! -d "$1" ]; then
    echo "Not a valid directory"
    exit 2
fi
#Look at the directory
target=$1
echo "In Folder: $target"
for entry in `ls $target`; do
    echo $entry
done
So if I use the command ./display [directory], it will list the files. I want to display the contents of all of the files as well, but I am stuck. Any help would be appreciated, thanks!
Use find to find files. Use less to display files interactively or cat otherwise.
find "$target" -type f -exec less {} \;
I think a loop similar to your "look at the directory" loop would suffice, but using the cat command instead of ls.
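A minimal sketch of that idea (globbing avoids the word-splitting pitfalls of looping over ls output):
for entry in "$target"/*; do
    if [ -f "$entry" ]; then
        echo "$entry"    # print the file name
        cat "$entry"     # then display its contents
    fi
done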

How to log the compressed zip files when using a function in shell script

I am working on a shell script that should validate .gz files in multiple folders on Linux: gzip a file if it is not already zipped, and if the file is already zipped, purge it, under the following condition:
a) All these files in the folders have *.log.*.gz as their extension.
So I was using functions and the find command to achieve this.
The script seems to be working fine, but it is not logging information about the files it zips; it only spools information about the already-zipped files in each folder to the log. Is this the correct way to use functions?
#!/bin/bash
DIR_PATH="/var/log"
LOG="/tmp/test.log"
VARLOG_PATH=("$DIR_PATH"/{Kevin,John,Robin,Pavan})

fun_zip_log(){
    for i in `find "$i" -type f \( -name "*.log.20*" 2>/dev/null \) `; do
        gzip "$i"
    done >> $LOG
}

fun_purge_log(){
    for i in `find "$i" -type f \( -name "log.20*" 2>/dev/null \) `; do
        rm -f "$i"
    done >> $LOG
}

validate_zip(){
    for file in $i/*.gz
    do
        if ! [ -f "$file" ];
        then
            echo "$file is getting zipped" >> $LOG
            fun_zip_log "$i"
        else
            echo "$file is already zipped" >> $LOG
            fun_purge_log "$i"
        fi
    done
}

#MainBlock
for i in "${VARLOG_PATH[@]}"
do
    if [ -d "$i" ] && [ "$(ls -A "$i" | wc -l)" -gt 0 ]; then
        echo "Searching for files in directory : "$i" " >> $LOG
        validate_zip "$i"
    else
        echo "No files exist in directory : "$i" " >> $LOG
    fi
done
exit
exit
####LOG FILE###
Searching for files in directory : /var/log/Kevin
[*.gz] is getting zipped.
Searching for files in directory : /var/log/John
/var/log/John/instrumentation.log.2018-06-20.gz is already zipped
/var/log/John/instrumentation.log.2018-06-21.gz is already zipped
No files exist in directory : /var/log/Robin
Searching for files in directory : /var/log/Pavan
[*.gz] is getting zipped.
Your code is very muddled and confusing. For example in this:
fun_purge_log(){
    for i in `find "$i" -type f \( -name "log.20*" 2>/dev/null \) `; do
        rm -f "$i"
    done >> $LOG
}
for file in $i/*.gz
do
    ...
    fun_purge_log "$i"
In the calling code you're looping, setting a variable file, but then passing the directory "$i" to your function to go and find the files all over again.
Within fun_purge_log() you're ignoring the argument being passed in and using the global variable i both as the directory argument to find and to loop through the list of files output by find - why not pick a new variable name and use some local variables?
I can't imagine what you think 2>/dev/null is going to do in \( -name "*log.20*" 2>/dev/null \). (The shell applies that redirection to the whole find command no matter where it appears on the line, so burying it inside the parentheses only obscures the intent.)
You're trying to append something to $LOG but you aren't printing anything to append to it.
Run your code through shellcheck (e.g. at shellcheck.net); read http://mywiki.wooledge.org/BashFAQ/001, https://mywiki.wooledge.org/Quotes and https://mywiki.wooledge.org/ParsingLs; and really just THINK about what each line of your code is doing. Correct the issues yourself and then let us know if you still have a problem. Oh, and by convention, and to avoid clashing with other variables, don't use all capitals for non-exported variable names; and lastly, use $(command) instead of `command`.
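To illustrate those points, a cleaned-up purge function might look roughly like this (a sketch only, assuming the intent is to log each file as it is removed; log is the lower-cased variable name suggested above):
fun_purge_log() {
    local dir=$1    # use the function's argument, not the global loop variable
    # -print writes each matched name to stdout, so the redirection
    # actually has something to append to the log
    find "$dir" -type f -name "*.log.20*" -print -exec rm -f {} \; >> "$log"
}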

How to prefix folders and files within?

I'm stuck looking for a one-liner to add a prefix to all subfolder names and file names in a directory, e.g. "AAA" in the examples below:
/folder/AAAfile.txt
/folder/AAAread/AAAdoc.txt
/folder/AAAread/AAAfinished/AAAread.txt
I've tried using xargs and find, but can't get them to go recursively through the subdirectories and their contents. Any suggestions?
James
You could use something like this:
find . -mindepth 1 | sort -r | xargs -l -I {} bash -c 'mv "$1" "${1%/*}/AAA${1##*/}"' _ {}
Tested with your folder structure, executed from the root of the tree (the directory that holds file.txt). The sort -r matters: in reverse order the deeper paths come first, so a directory is only renamed after everything inside it has been.
The following script should meet your need (run it from inside your folder directory):
for i in `ls -R`; do
    dname=`dirname "$i"`
    fname=AAA`basename "$i"`
    if [ -f "$i" ]
    then
        mv "$i" "$dname/$fname"
    fi
    # this could be merged with the previous condition, but it has been kept
    # separate just to avoid an invalid-directory warning
    if [ -d "$i" ]
    then
        mv "$i" "$dname/$fname"
    fi
done
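Parsing ls -R becomes fragile once subdirectories are involved (the listing contains directory headers, and names appear without their paths). A depth-first find is one alternative (a sketch; GNU find's -execdir runs the command from each entry's parent directory and hands the name over as ./name, hence the ${1#./} strip):
find . -mindepth 1 -depth -execdir sh -c 'f=${1#./}; mv -- "$f" "AAA$f"' _ {} \;
Because of -depth, a directory's contents are visited before the directory itself, so nothing is renamed out from under the traversal.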

Verify if a file was created shell

I have to write a shell script that will monitor some folders given in the command and give a message if a certain file will be created inside them (the name of the file will be read from keyboard).
Can anyone tell me why this is not working?
#!/bin/sh
f=`read filename`
isIn=0
for dir in $*
do
    if [ ! -d $dir ]
    then
        echo $dir is not a directory.
    fi
    for i in `find $dir -type f`
    do
        if [ $f=$i ]
        then
            echo The file $f already exists.
            isIn=1
            break
        fi
    done
    if [ $isIn -eq 0 ]
    then
        sleep 20
        isIn=0
        for i in `find $dir -type f`
        do
            if [ $f=$i ]
            then
                echo The file was created\!
                isIn=1
                break
            fi
        done
    fi
    if [ $isIn -eq 0 ]
    then
        echo The file was not created\!
    fi
done
The idea I used was to take all the files from the directory and verify that the file isn't already there.
If it is - show the message and move on to the next directory.
If not, I 'wait'. If, in the time I waited, that certain file was created, it will have appeared in the list of all files, and I check for it.
My problem is that no matter what file I read from the keyboard, I get the message "The file already exists." without it telling me the name of the file.
Replace f=`read filename` with the correct usage, read f, or read -p "filename: " f to print a prompt first.
find $dir -type f prints the full file name including the directory path. Since you want just the basename, replace both lines
for i in `find $dir -type f`
with
for i in `find $dir -type f -printf '%f\n'`
(-printf is a GNU find extension).
Each operator and operand in [ ] must be a separate argument. Without the spaces, [ sees the single non-empty string $f=$i, which always tests true - that is why every file name appears to exist already. Thus replace both lines
if [ $f=$i ]
with
if [ "$f" = "$i" ]

A bash script to run a program for directories that do not have a certain file

I need a Bash script to execute a program for all directories that do not have a specific file, and to create the output file in that same directory. This program needs an input file, which exists in every directory under the name *.DNA.fasta. Suppose I have the following directories, which may also contain subdirectories:
dir1/a.protein.fasta
dir2/b.protein.fasta
dir3/anyfile
dir4/x.orf.fasta
I have started by finding the directories that don't have that specific file, whose name matches *.protein.fasta.
In this case I want dir3 and dir4 to be listed (since they do not contain *.protein.fasta).
I have tried this code:
find . -maxdepth 1 -type d \! -exec test -e '{}/*protein.fasta' \; -print
but it seems I missed something; it does not work.
I also do not know how to proceed with the rest of the task.
This is a tricky one. (Your attempt fails because no shell is involved in -exec: test -e never expands the *, so it checks for a file literally named *protein.fasta.)
I can't think of a good solution. But here's a solution, nevertheless. Note that this is guaranteed not to work if your directory or file names contain newlines, and it's not guaranteed to work if they contain other special characters. (I've only tested with the samples in your question.)
Also, I haven't included a -maxdepth because you said you need to search subdirectories too.
#!/bin/bash

# Create an associative array
declare -A excludes

# Build an associative array of directories containing the file
while read line; do
    excludes[$(dirname "$line")]=1
    echo "excluded: $(dirname "$line")" >&2
done <<EOT
$(find . -name "*protein.fasta" -print)
EOT

# Walk through all directories, print only those not in the array
find . -type d \
    | while read line ; do
        if [[ ! ${excludes[$line]} ]]; then
            echo "$line"
        fi
    done
For me, this returns:
.
./dir3
./dir4
All of which are directories that do not contain a file matching *.protein.fasta. Of course, you can replace the last echo "$line" with whatever you need to do with these directories.
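For instance, replacing that echo with the program invocation from the question might look like this (a sketch; Programfoo stands in for whatever program you actually run, as in the final script below):
find . -type d \
    | while read line ; do
        if [[ ! ${excludes[$line]} ]]; then
            Programfoo "$line"/*.DNA.fasta    # run the program on that directory's input file
        fi
    done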
Alternately:
If what you're really looking for is just the list of top-level directories that do not contain the matching file in any subdirectory, the following bash one-liner may be sufficient:
for i in *; do test -d "$i" && ( find "$i" -name '*protein.fasta' | grep -q . || echo "$i" ); done
Expanded into script form, invoking the program from the question:
#!/bin/bash
for dir in *; do
    test -d "$dir" && ( find "$dir" -name '*protein.fasta' | grep -q . || Programfoo "$dir/$dir.DNA.fasta" );
done
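For completeness, the original test -e attempt can also be salvaged by handing the glob to a shell, which does expand it (a sketch; top-level directories only, as in the attempt from the question):
find . -maxdepth 1 -type d ! -exec sh -c '
    # exit 0 (success) if at least one *protein.fasta exists directly in $1
    for f in "$1"/*protein.fasta; do [ -e "$f" ] && exit 0; done
    exit 1' _ {} \; -print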
