Shell script to loop and delete - linux

Could someone help me with this? I have the folder structure shown below. I want to loop through every folder inside backuptest and delete all the folders except the one named after today's date, and I want it to run as a cron job.

Use find for this:
today="$(date +%Y-%m-%d)"
find /path/to/backuptest/Server* -mindepth 1 -maxdepth 1 -type d -not -name "$today" -exec rm -R {} \;
Edit
To restrict the deletion to directories whose names look like dates (and so avoid deleting anything else), use something like
find /path/to/backuptest/Server* -mindepth 1 -maxdepth 1 -type d -regex ".*2016-[0-1]*[0-9]-[0-3][0-9]$" -not -name "$today"
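Since the question asks for a cron job, here is a minimal sketch of how the command above might be scheduled; the wrapper script name and the 01:00 run time are assumptions, not part of the original answer:
#!/bin/bash
# /usr/local/bin/clean_backuptest.sh -- hypothetical wrapper around the find above
today="$(date +%Y-%m-%d)"
find /path/to/backuptest/Server* -mindepth 1 -maxdepth 1 -type d -not -name "$today" -exec rm -R {} \;

# crontab entry (crontab -e): run the wrapper every night at 01:00
0 1 * * * /usr/local/bin/clean_backuptest.sh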

You can get today's date in whatever format you require via the date command. For example,
TODAY=$(date +%Y-%m-%d)
You can loop over the subfolders you want with a simple wildcard match:
for d in /path/to/backuptest/*/*; do
    # ...
done
You can strip the directory portion from a file name with the basename command:
name=$(basename path/to/file)
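(As an aside, plain parameter expansion gives the same result without spawning a subshell:)
name=${d##*/}    # strips everything up to and including the last slash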
You can glue that together into something like this:
#!/bin/bash
TODAY=$(date +%Y-%m-%d)
for d in /path/to/backuptest/*/*; do
    test "$(basename "$d")" = "$TODAY" || rm -rf "$d"
done
Update:
If you don't actually want to purge all subfolders except today's, but only those matching some particular name pattern, then one way to accomplish that is to insert that pattern into the glob in the for command. Note that the +( ) form used below is an extended glob and needs shopt -s extglob. For example, here
shopt -s extglob    # enable extended globs such as +( )
for d in /path/to/backuptest/*/+([0-9])-+([0-9])-+([0-9]); do
    test "$(basename "$d")" = "$TODAY" || rm -rf "$d"
done
the only files / directories considered for deletion are those whose names consist of three nonempty, hyphen-separated strings of decimal digits. One could write patterns that more precisely match date string format if one preferred, but it does get messier the more discriminating you want the pattern to be.
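For instance, sticking to plain globs (no extglob needed), a slightly stricter sketch for names like 2016-05-31 (four digits, two digits, two digits) could look like this:
for d in /path/to/backuptest/*/[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]; do
    test "$(basename "$d")" = "$TODAY" || rm -rf "$d"
done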

You can do it with find:
date=$(date +%Y-%m-%d)
find backuptest -type d -not -name "$date" -not -name "backuptest" -not -name "Server*" -exec rm -rf {} \;
This:
find backuptest -type d -not -name "$date" -not -name "backuptest" -not -name "Server*"
will look for directories whose names differ from:
backuptest
Server*
$date -> current date
and remove them with:
rm -rf

Related

Ubuntu - Remove dir and ignore filetypes

I'm trying to create a cronjob for Ubuntu where:
all empty dir's should be removed
if the dir is not empty then it should be removed if the only filetypes are txt or csv files
Currently I have:
find /path -depth -exec rmdir {} \; 2>/dev/null
What do I need to add to delete the folders that contain only txt or csv files?
I don't want to delete all txt and csv files everywhere, just those folders which contain no other filetypes.
Additional example:
Dir1
    SubDir1
        SubSubDir1
            File.txt
            File.csv
SubDir2
    SubSubDir2
        File.xml
SubSubDir1 should be deleted. Since SubDir1 and Dir1 are then empty, they should be deleted as well.
SubSubDir2 contains another filetype and should not be deleted.
You could count how many entries in a folder are neither csv nor txt with something like:
find "$d" -maxdepth 1 -not -iname '*.csv' -a -not -iname '*.txt' | wc -l
If the folder is empty or contains only txt and csv files, it will print 1, because find always lists the starting directory "$d" itself.
And to list folders depth-first, so that children are processed before their parents:
find /path -depth -type d
All in all, you may be able to achieve what you want with:
while read -r d
do
    if [ $(find "$d" -maxdepth 1 -not -iname '*.csv' -a -not -iname '*.txt' | wc -l) -eq 1 ]
    then
        rm -rf "$d"
    fi
done < <(find /path -depth -type d)
But I also advocate a check somewhere so your cron doesn’t wipe your storage without your consent.
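For example, a dry run that only prints what would be removed (the same loop with rm -rf swapped for echo) lets you verify the selection before enabling the cron job:
while read -r d
do
    if [ $(find "$d" -maxdepth 1 -not -iname '*.csv' -a -not -iname '*.txt' | wc -l) -eq 1 ]
    then
        echo "would remove: $d"    # swap back to rm -rf "$d" once the list looks right
    fi
done < <(find /path -depth -type d)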

How to clean up folders efficiently using shell script

I am using a directory structure with various folders. There are new files created daily in some of them.
I have created some programs to clean up the directories, but I would like to use a shell script to make it more efficient.
Therefore I would like to store an "archiving.properties" file in every folder that needs to be cleaned up. The properties file should contain the following variables
file_pattern="*.xml"
days_to_keep=2
Now my clean up routine should:
find all properties files
delete all files that match the file name pattern (file_pattern) and that are older than the defined number of days (days_to_keep) in the directory where the properties file was found.
So my question is how can I do this in the most efficient way?
find . -type f -name "archiving.properties" -print
find . -type f -name "<file_pattern>" -mtime +<days_to_keep> -delete
Currently I am trying the following in a single folder. It prints out the command correctly, but it is not executed.
#!/bin/bash
. archiving.properties
find . -type f -name "*.xml" -mtime +1 -exec rm -rf {} \;
echo " find . -type f -name \"${file_pattern}\" -mtime +${days_to_keep} -exec rm -rf {} \;"
Result is: find . -type f -name "*.xml" -mtime +1 -exec rm -rf {} \;
Thanks for your help in advance.
I ended up with this final result:
echo "start deleting files in " $(pwd) " ... "
#filename of the properties
properties="clean_up.properties"
#find all properties files
for prop in $(find . -type f -name "$properties"); do
    #init variables (defaults, overridden by the properties file)
    file_pattern="*._html"
    days_to_keep=14
    #load the variables from the properties file
    . "$prop"
    #define the folder of the properties file
    folder=${prop%/$properties}
    #remove all files matching the parameters in the folder where the properties were found
    echo ">>> find $folder -type f -name \"${file_pattern}\" -mtime +${days_to_keep} -exec rm -f {} \;"
    find "$folder" -type f -name "${file_pattern}" -mtime +"${days_to_keep}" -exec rm -f {} \;
done
echo "... done"

Running a script based on hostname within a group variable

I'm new to shell/bash and I'm trying to write a function to clear logs on my Oracle servers. The environment I'm working in has to keep all logging fully enabled, but the issue we have is the volumes filling up and not allowing services to restart. I'm trying to create a script to run as a cron job that searches directories depending on which group the host is a part of (each group has slightly different paths and names).
I've got the script going through VMORDER, which cycles through the groups listed. I want it to pull the host name. Is there a way for me to say "if this VM belongs to a group (e.g. GP1, GP2, etc.) then run that group's part of the script"?
Thanks for any help you can provide :).
#!/bin/bash
SCRIPTDIR=[SCRIPT DIR]
GP1="vm01 vm02"
GP2="vm03 vm04 vm05"
GP3="vm06 vm07 vm08"
VMORDER="GP1 GP2 GP3"
##DIRECTORY PATHS
VCIE_DIRECTORY=[DIRECTORY]
##FILE EXCLUSION LISTING
access_log='access.log'
admin_server='AdminServer.log'
admin_service='adminservice.log'
app_ms_1='app_ms*.log'
app_ms_2='app_ms*.out'
app_wm_1='app_ms*.log'
app_wm_2='app_ms*.out'
audit_recorder_log='DefaultAuditRecorder.log'
jms_log='jms*.log'
osb_log='osb_domain.log'
diagnostic_log='diagnostic.log'
HNAME=$( hostname | cut -d'.' -f1 | tr '[:lower:]' '[:upper:]' )
find_log_rotation(){
    for i in ${VMORDER}
    do
        clear_logs "${i}"
    done
}
clear_logs(){
    ##GP1
    if [ "$HNAME" = GP1 ]; then
        find -P $VCIE_DIRECTORY/app_ms{1..4}/logs/ -type f -not -name "$app_ms_1" -not -name "$app_ms_2" -not -name "$access_log" -not -name "$audit_recorder_log" -not -name "$jms_log" -mtime 1
    fi
    ##GP2
    if [ "$HNAME" = GP2 ]; then
        find -P $VCIE_DIRECTORY/app_wm{1..4}/logs/ -type f -not -name "$app_wm_1" -not -name "$app_wm_2" -not -name "$access_log" -not -name "$audit_recorder_log" -not -name "$jms_log" -mtime 1
    fi
    ##GP3
    if [ "$HNAME" = GP3 ]; then
        find -P $VCIE_DIRECTORY/AdminServer/logs/ -type f -not -name "$admin_server" -not -name "$access_log" -not -name "$admin_service" -not -name "$osb_log" -mtime 1
    fi
}
Short answer:
If the hostname is something like "vm04" and you do not tr it to uppercase, you could use:
if [[ "${GP1}" = *${HNAME}* ]]; then
The double brackets enable bash's extended test syntax; the part to the right of the = sign is treated as a pattern.
Do not put quotes around the pattern, or it would be matched as a plain literal string.
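A minimal, self-contained illustration of that membership test (the group contents and hostname below are made-up values):
#!/bin/bash
GP1="vm01 vm02"
GP2="vm03 vm04 vm05"
HNAME="vm04"                        # lower case, as assumed above
if [[ "${GP1}" = *${HNAME}* ]]; then
    echo "${HNAME} is in GP1"
elif [[ "${GP2}" = *${HNAME}* ]]; then
    echo "${HNAME} is in GP2"       # this branch fires for vm04
fi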
Long answer
Do you really have to look for different files on different servers? If, for example, there are no app_ms_1 files under AdminServer anyway, selecting the files to skip becomes much simpler:
SKIP="${app_ms_1}|${app_ms_2}"
SKIP="${SKIP}|${access_log}|${audit_recorder_log}|${jms_log}"
SKIP="${SKIP}|${app_wm_1}|${app_wm_2}"
SKIP="${SKIP}|${admin_server}|${osb_domain}"
find ${VCIE_DIRECTORY}/*/logs -type f -mtime 1 | egrep -v "${SKIP}" | while read -r file; do
    echo Something with "${file}"
done
First make sure the code above returns the correct files (should SKIP also contain ${app_ms_3}? Are the wildcards handled correctly?).
Do you need to use the HNAME?
Then you might want to rewrite your code:
if [[ "${GP1}" = *${HNAME}* ]]; then
SKIP="${app_ms_1}|${app_ms_2}"
SKIP="${SKIP}|${access_log}|${audit_recorder_log}|${jms_log}"
STARTDIR=${VCIE_DIRECTORY}/app_ms{1..4}/logs/"
fi
# Something like this also for GP2 and GP3
find ${STARTDIR} -type f -mtime 1 | egrep -v "${SKIP}" | xargs rm

Selective Sub Directory Deleting

So far I have this script.
My folder structure for now is /root/test/.
Inside test a folder gets created each month, named May, Jun, July and so on, based on (date +%B).
I want the script to delete all subdirectories except the one that matches this month's (date +%B), keeping that directory's contents.
Currently it deletes everything apart from the matching subdirectory itself, but May ends up completely empty. Any ideas?
#!/bin/bash
LinkDest=/root/test
m_date=$(date +%B)
find $LinkDest/ -not -name May -xdev -depth -mindepth 1 -exec rm -Rf {} \;
You can use:
find $LinkDest/ -not -path "*$m_date*" -xdev -depth -mindepth 1 -exec rm -Rf '{}' \;
Try running the find without the -exec first, to see what is going to be removed. The problem is that -name only matches the file or directory name itself (the last path component), so the files inside May are not protected by -not -name May. You need -path:
find -not -path "*/$m_date" -not -name $m_date
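To see the difference with the question's layout (the file name backup.tar here is made up for illustration):
m_date=$(date +%B)
# -name only compares the last path component, so May's contents still match and get listed:
find /root/test -mindepth 1 -not -name "$m_date"       # includes /root/test/May/backup.tar
# -path compares the whole path, so May and everything inside it are skipped:
find /root/test -mindepth 1 -not -path "*/$m_date*"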

Run command from variables in shell script

I wrote this piece of code to scan a directory for files newer than a reference file while excluding specific subdirectories.
#!/bin/bash
dateMarker="date.marker"
fileDate=$(date +%Y%m%d)
excludedDirs=('./foo/bar' './foo/baz' './bar/baz')
excludedDirsNum=${#excludedDirs[@]}
for (( i=0; i < $excludedDirsNum; i++)); do
    myExcludes=${myExcludes}" ! -wholename '"${excludedDirs[${i}]}"*'"
done
find ./*/ -type f -newer $dateMarker $myExcludes > ${fileDate}.changed.files
However, the excludes are just being ignored. When I echo $myExcludes it looks just fine, and the script behaves exactly as intended if I replace $myExcludes in the last line with the output of that echo command. I guess it's some kind of quoting/escaping error, but I haven't been able to track it down.
It seems to be a quoting problem; try using arrays:
#!/bin/bash
dateMarker=date.marker
fileDate=$(date +%Y%m%d)
excludedDirs=('./foo/bar' './foo/baz' './bar/baz')
args=(find ./*/ -type f -newer "$dateMarker")
for dir in "${excludedDirs[#]}"
do
args+=('!' -wholename "$dir")
done
"${args[#]}" > "$fileDate.changed.files"
Maybe you also need -prune:
args=(find ./*/)
for dir in "${excludedDirs[#]}"
do
args+=('(' -wholename "$dir" -prune ')' -o)
done
args+=('(' -type f -newer "$dateMarker" -print ')')
You need myExcludes to evaluate to something like this:
\( -name foo/bar -o -name foo/baz -o -name bar/baz \)
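For completeness, here is a sketch of how such a grouped expression can be built safely, reusing the array idea from the answer above and the question's -wholename tests; the trailing * and the variable names are taken from the question's script, and this is an illustration rather than the answer's own code:
group=()
for dir in "${excludedDirs[@]}"; do
    [ ${#group[@]} -gt 0 ] && group+=(-o)    # separate the alternatives with -o
    group+=(-wholename "$dir*")
done
find ./*/ -type f -newer "$dateMarker" -not '(' "${group[@]}" ')' > "$fileDate.changed.files"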
