Why does checking if a file exists in shell always return false? - linux

I created a cron job using bash to delete files older than 3 days, but the check on the age of the files with -mtime +3 ... &> /dev/null always comes out false. Here's the script:
now=$(date)
# create log file
file_names=('*_takrib_golive.gz' '*_takrib_golive_filestore.tar.gz')
touch /media/nfs/backup/backup_delete.log
echo "Date: $now" >> /media/nfs/backup/backup_delete.log
for filename in "${file_names[@]}"
do
    echo $filename
    if ls /media/nfs/backup/${filename} &> /dev/null
    then
        echo "backup files exist"
        if find /media/nfs/backup -maxdepth 1 -mtime +3 -name ${filename} -ls &> /dev/null
        then
            echo "The following backup file was deleted" >> /media/nfs/backup/backup_delete.log
            find /media/nfs/backup -maxdepth 1 -mtime +3 -name ${filename} -delete
        else
            echo "There are no ${filename} files older than 3 days in /media/nfs/backup" &>> /media/nfs/backup/backup_delete.log
        fi
    else
        echo "No ${filename} files found in /media/nfs/backup" >> /media/backup/backup_delete.log
    fi
done
exit 0
The inner check, if find /media/nfs/backup -maxdepth 1 -mtime +3 -name ${filename} -ls &> /dev/null, always goes to the else branch, even though files older than 3 days are in the directory.

You are not quoting the -name argument, so the shell expands the wildcard to the name of a file which already exists before find ever sees it.
I would refactor this rather extensively anyway: don't parse ls output, and simplify things by making it more obvious when to quote and when not to.
Untested, but hopefully vaguely useful still:
#!/bin/bash
backup=/media/nfs/backup
backuplog=$backup/backup_delete.log
# No need to touch the log if you write to the file anyway
# (%c prints the full locale date and time; %C would only print the century)
date +"Date: %c" >> "$backuplog"
# Avoid using a variable; just loop over the stems
for stem in takrib_golive takrib_golive_filestore.tar
do
    # echo $filename
    # Avoid parsing ls; instead, loop over the glob matches
    for filename in "$backup"/*_"$stem".gz; do
        pattern="*_$stem.gz"
        if [ -e "$filename" ]; then
            echo "backup files exist"
            # find exits 0 even when nothing matches, so also check
            # that it actually printed (and deleted) something
            if files=$(find "$backup" -maxdepth 1 -mtime +3 -name "$pattern" -print -delete) &&
                [ -n "$files" ]
            then
                echo "The following backup file was deleted" >> "$backuplog"
                echo "$files" >> "$backuplog"
            else
                echo "There are no $pattern files older than 3 days in $backup" >> "$backuplog"
            fi
        else
            echo "No $pattern files found in $backup" >> "$backuplog"
        fi
        # Either way, we can break the loop after one iteration
        break
    done
done
# No need to explicitly exit 0
# no need to explicitly exit 0
The for + if [ -e ... ] arrangement is slightly clumsy, but that's how you check whether a wildcard matched any files: if the wildcard did not match, the [ -e test sees a file name that is literally the wildcard pattern itself, and fails.
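Alternatively (a minimal sketch, assuming bash): shopt -s nullglob makes an unmatched glob expand to nothing, so the loop body simply never runs and the [ -e ] test becomes unnecessary:
shopt -s nullglob
for filename in "$backup"/*_"$stem".gz; do
    # this body only runs when at least one file actually matched
    echo "backup files exist"
done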

Related

How to log the compressed zip files when using a function in shell script

I am working on a shell script that should validate .gz files in multiple folders on Linux: gzip a file if it is not zipped yet, and purge it if it is already zipped, with the following condition:
a) All these files in the folders have *.log..gz as their extension
So I was using functions and the find command to achieve this.
The script seems to be working fine, but it is not logging information about the files it zips to the log file; it only logs the already-zipped files in the folder. Is this the correct way to use functions?
#!/bin/bash
DIR_PATH="/var/log"
LOG="/tmp/test.log"
VARLOG_PATH=("$DIR_PATH"{"Kevin","John","Robin","Pavan"})

fun_zip_log(){
    for i in `find "$i" -type f \( -name "*.log.20*" 2>/dev/null \) `; do
        gzip "$i"
    done >> $LOG
}

fun_purge_log(){
    for i in `find "$i" -type f \( -name "log.20*" 2>/dev/null \) `; do
        rm -f "$i"
    done >> $LOG
}

validate_zip(){
    for file in $i/*.gz
    do
        if ! [ -f "$file" ]; then
            echo "$file is getting zipped" >> $LOG
            fun_zip_log "$i"
        else
            echo "$file is already zipped" >> $LOG
            fun_purge_log "$i"
        fi
    done
}

#MainBlock
for i in "${VARLOG_PATH[@]}"
do
    if [ -d "$i" ] && [ "$(ls -A "$i" | wc -l)" -gt 0 ]; then
        echo "Searching for files in directory : "$i" " >> $LOG
        validate_zip "$i"
    else
        echo "No files exist in directory : "$i" " >> $LOG
    fi
done
exit
####LOG FILE###
Searching for files in directory : /var/log/Kevin
[*.gz] is getting zipped.
Searching for files in directory : /var/log/John
/var/log/John/instrumentation.log.2018-06-20.gz is already zipped
/var/log/John/instrumentation.log.2018-06-21.gz is already zipped
No files exist in directory : /var/log/Robin
Searching for files in directory : /var/log/Pavan
[*.gz] is getting zipped.
Your code is very muddled and confusing. For example in this:
fun_purge_log(){
    for i in `find "$i" -type f \( -name "log.20*" 2>/dev/null \) `; do
        rm -f "$i"
    done >> $LOG
}
for file in $i/*.gz
do
...
fun_purge_log "$i"
In the calling code you're looping and setting a variable file, but then you pass the directory "$i" to your function, which then tries to find the files all over again.
Within fun_purge_log() you're ignoring the argument being passed in and then using the global variable i as both the directory argument to find and also to loop through the list of files output by find - why not pick a new variable name and use some local variables?
I can't imagine what you think 2>/dev/null is going to do in \( -name "*log.20*" 2>/dev/null \).
You're trying to append something to $LOG but you aren't printing anything to append to it.
Run your code through shellcheck (e.g. at shellcheck.net), read http://mywiki.wooledge.org/BashFAQ/001, https://mywiki.wooledge.org/Quotes and https://mywiki.wooledge.org/ParsingLs, and really just THINK about what each line of your code is doing. Correct the issues yourself and then let us know if you still have a problem. Also, by convention and to avoid clashing with other variables, don't use all capitals for non-exported variable names, and use $(command) instead of `command`.
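To make those points concrete, here is a minimal sketch of the suggested pattern, not a drop-in fix: the directory names and log path come from the question, while the function and variable names (zip_new_logs, dir, log) are hypothetical. It passes the directory as an argument, uses lowercase local variables, lets find do the matching, and prints what it does so the log redirection has something to append (only the gzip half is shown):
#!/bin/bash
log="/tmp/test.log"
varlog_path=(/var/log/{Kevin,John,Robin,Pavan})

zip_new_logs(){
    local dir=$1                 # use the argument, not the global $i
    # -print logs each matching file; -exec then gzips the same files
    find "$dir" -type f -name "*.log.20*" ! -name "*.gz" -print -exec gzip {} +
}

for dir in "${varlog_path[@]}"; do
    if [ -d "$dir" ]; then
        echo "Searching for files in directory : $dir"
        zip_new_logs "$dir"
    else
        echo "No such directory : $dir"
    fi
done >> "$log"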

How to delete files from an array after a check?

I have this code:
#!/bin/bash
path="/home/asdf"
dateminusoneday=$(date +%m --date='-1 month')
date=$(date +"%Y-$dateminusoneday-%d")
list=$(find /home/asdf | grep -P '\d{4}\-\d{2}\-\d{2}' -o)
listArray=($list)
for i in "${listArray[@]}"
do
    echo $i
    if [[ $i < $date ]]; then
        echo "delete file"
    else
        echo "no need delete this file"
    fi
done
I need to delete the files whose date is older than that date, but I can't get it to work.
What would be the best way to do this?
Thanks, all.
From your code I see that you are trying to delete files older than one month. If I am not mistaken, and you can accept that (1 month) == (30 days), you can use a one-liner such as:
find "$path" -mtime +30 -delete
If you want exactly 1 month (not 30 days) you can use:
#!/bin/bash
path="/home/asdf"
number_of_days=$((($(date '+%s')-$(date -d '1 month ago' '+%s'))/86400))
find "$path" -mtime +$number_of_days -delete

How to get echo to print only deleted file paths?

I'm trying to write a script, to be run from cron, that creates a daily mysqldump in a directory and also checks all the backups in that directory, removing any that are older than 7 days.
My functions work correctly; it's just my last echo command that is not doing what I want. This is what I have so far:
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > $SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
    find "$filepath" -mtime +7 -type f -delete
    echo "$filepath has been deleted."
done
exit
So the backup creation and removal of old files both work. But my problem is that echo "$filepath has been deleted." is printing all files in the directory instead of just the files older than 7 days that were deleted. Where am I going wrong here?
EDIT (Full solution):
This is the full solution that wound up working for me using everyone's advice from the answers and comments. This works for cron jobs. I had to specify the main function's output filepath because the files were being created in the root directory instead of the path specified in Argument $1.
Thank you everyone for the help! The if statement also checks whether or not $1 is the specified directory I want files to be deleted in.
#Variables
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > /path/to/db-backups/directory/$SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
    if [[ $1 = "/path/to/db-backups/directory" ]]; then
        find "$filepath" -mtime +7 -type f -delete -exec sh -c 'printf "%s has been deleted.\n" "$@"' _ {} +
    fi
done
exit
You can merge the echo into the find:
find "$filepath" -mtime +7 -type f -delete -exec echo '{}' "has been deleted." \;
The -delete action is essentially a shortcut for -exec rm '{}' \; and all the actions are run in the sequence you specify them in.
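Note that the order of the actions matters; a minimal sketch (GNU find assumed) of the two possible orderings:
# prints only the paths that -delete actually removed
find "$filepath" -mtime +7 -type f -delete -print

# prints what is about to be removed, whether or not the deletion then succeeds
find "$filepath" -mtime +7 -type f -print -delete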

Command script recognizes files as directories

The following code should count the number of elements that a directory contains; it does that correctly, but it also treats every element inside the current directory as a directory.
I don't know how to avoid showing the elements that are not directories. How could I do it?
Code is here: http://pastebin.com/9R4eB4Xn
termlog.txt:
https://justpaste.it/tgsl
As you may see, some files like .jpg or .zip are recognized as directories.
Your echo "Element is a directory" is between the if and the then. Move it after the then:
for i in *
do
    if [ ! -f "$i" ] && [ -d "$i" ]
    then
        echo "Element is a directory"
        FILES=`ls -l "$i" | wc -l`   # List the content of the "$i" directory
                                     # and count the number of lines
        FILES2=`expr $FILES - 1`     # Subtract one because one line is
                                     # occupied by the "total" blocks line
        echo "$i: $FILES2"           # Show the name of the directory and
                                     # the number of entries it has
    fi
done
for i in `find DIRECTORY -maxdepth 2 -type d`; do echo "$i: `ls -1 $i | wc -l`"; done
If you are only interested in the current directory, replace DIRECTORY with a dot (.).
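A minimal sketch of an alternative that avoids parsing ls output altogether (bash with nullglob assumed): count glob matches in an array.
shopt -s nullglob                # an unmatched glob expands to nothing
for i in */; do                  # the trailing slash restricts the glob to directories
    entries=("$i"*)              # array of entries inside "$i" (dotfiles excluded, like ls)
    echo "${i%/}: ${#entries[@]}"
done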

How to display true if find is not empty

I am very new to bash; I just started learning it last week. I am trying to search for a file name.
How can I display a message if the file is found?
This is what I have, but it keeps saying 'no':
echo ' [Enter] a file name '
read findFile
if [[ -n $(find /$HOME -type f -name "findFile") ]]
then
echo 'yes'
else
echo 'no'
fi
A few issues:
Use var= or read var when defining a variable, but $var when using it.
There is no reason to keep searching after finding a file, so do something like the below, where find will -quit after finding a single file and return its path as a result of the -print:
#!/bin/bash
echo ' [Enter] a file name '
read findFile
if [[ -f $(find "$HOME" -type f -name "$findFile" -print -quit) ]]; then
echo 'yes'
else
echo 'no'
fi
Note that the option -quit will work on GNU and FreeBSD operating systems (which means this will work in most cases), but for example, you will need to change it to -exit on NetBSD.
You can see this answer from Unix/Linux StackExchange for details on this option.
Also note, per Adaephon's comment, that although the / is not needed in front of $HOME, it's not wrong and the files will still be found.
Use wc to count the number of lines in the find output:
if [ "$(find "$HOME" -type f -name "$findFile" 2> /dev/null | wc -l)" -gt 0 ]; then
echo 'yes'
else
echo 'no'
fi
The 2> /dev/null part hides possible error messages (for example, permission errors from directories find cannot read).
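Another option (a sketch, not from the original answers) is to test whether find produced any output at all with grep -q ., which succeeds as soon as one line appears, so nothing needs to be counted:
if find "$HOME" -type f -name "$findFile" 2> /dev/null | grep -q .; then
    echo 'yes'
else
    echo 'no'
fi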
