How to create a shell script for creating logs on a Linux server based on hostname

All,
I have a requirement that if the hostname of the server starts with tm1 or dm1, the script should create gz format log archives; if the hostname starts with pc1, it should create bz2 format archives.
I have created a generic shell script to create tar archives of the log files:
#!/bin/bash
#START
TIME=$(date +%Y-%m-%d)
FILENAME=logsbackup-$TIME.tar.gz
SRCDIR=/var/log/
DESDIR=/var/
find $SRCDIR -mtime +1 | xargs tar -cpzf $DESDIR/$FILENAME
#END
How can I implement the above-mentioned changes in my script?

You can use a condition like this:
#!/bin/bash
#START
TIME=$(date +%Y-%m-%d)
FILENAME=logsbackup-$TIME.tar
SRCDIR=/var/log/
DESDIR=/var/
host=$(hostname)
if [[ $host == @(tm1|dm1)* ]]; then
echo "creating gz format"
find $SRCDIR -mtime +1 -print0 | xargs -0 tar -cpzf $DESDIR/$FILENAME.gz
elif [[ $host == pc1* ]]; then
echo "creating bz2 format"
find $SRCDIR -mtime +1 -print0 | xargs -0 tar -cjf $DESDIR/$FILENAME.bz2
fi
# END
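Depending on your shell and bash version, the @(tm1|dm1) extended pattern may need shopt -s extglob to be recognized. A plain case statement avoids that and also works under /bin/sh; a minimal sketch reusing the same variables:
case $host in
    tm1*|dm1*)
        echo "creating gz format"
        find "$SRCDIR" -mtime +1 -print0 | xargs -0 tar -cpzf "$DESDIR/$FILENAME.gz"
        ;;
    pc1*)
        echo "creating bz2 format"
        find "$SRCDIR" -mtime +1 -print0 | xargs -0 tar -cjf "$DESDIR/$FILENAME.bz2"
        ;;
esac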

Related

Why does checking if a file exists in shell always return false?

I created a cron job using bash to delete files older than 3 days, but when I check the age of the files with -mtime +3 &> /dev/null it is always false. Here's the script:
now=$(date)
# create log file
file_names=('*_takrib_golive.gz' '*_takrib_golive_filestore.tar.gz')
touch /media/nfs/backup/backup_delete.log
echo "Date: $now" >> /media/nfs/backup/backup_delete.log
for filename in "${file_names[@]}"
do
echo $filename
if ls /media/nfs/backup/${filename} &> /dev/null
then
echo "backup files exist"
if find /media/nfs/backup -maxdepth 1 -mtime +3 -name ${filename} -ls &> /dev/null
then
echo "The following backup file was deleted" >> /media/nfs/backup/backup_delete.log
find /media/nfs/backup -maxdepth 1 -mtime +3 -name ${filename} -delete
else
echo "There are no ${filename} files older than 3 days in /media/nfs/backup" &>> /media/nfs/backup/backup_delete.log
fi
else
echo "No ${filename} files found in /media/nfs/backup" >> /media/backup/backup_delete.log
fi
done
exit 0
The if find /media/nfs/backup -maxdepth 1 -mtime +3 -name ${filename} -ls &> /dev/null check always goes to the else branch, even though files older than 3 days are in the directory.
You are not quoting the -name argument, so the shell expands the wildcard to the name of a file which already exists before find ever sees it.
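So the immediate fix is simply to quote the pattern in both find calls, for example:
if find /media/nfs/backup -maxdepth 1 -mtime +3 -name "${filename}" -ls &> /dev/null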
I would refactor this rather extensively anyway. Don't parse ls output and perhaps simplify this by making it more obvious when to quote and when not to.
Untested, but hopefully vaguely useful still:
#!/bin/bash
backup=/media/nfs/backup
backuplog=$backup/backup_delete.log
# no need to touch if you write to the file anyway
date +"Date: %C" >> "$backuplog"
# Avoid using a variable, just loop over the stems
for stem in takrib_golive takrib_golive_filestore.tar
do
# echo $filename
# Avoid parsing ls; instead, loop over matches
for filename in "$backup"/*_"$stem".gz; do
pattern="*_$stem.gz"
if [ -e "$filename" ]; then
echo "backup files exist"
if files=$(find "$backup" -maxdepth 1 -mtime +3 -name "$pattern" -print -delete) && [ -n "$files" ]
then
echo "The following backup file was deleted" >> "$backuplog"
echo "$files" >> "$backuplog"
else
echo "There are no $pattern files older than 3 days in $backup" >> "$backuplog"
fi
else
echo "No $pattern files found in $backup" >> "$backuplog"
fi
# Either way, we can break the loop after one iteration
break
done
done
# no need to explicitly exit 0
The for + if [ -e ... ] arrangement is slightly clumsy, but that's how you check whether a wildcard matched any files. If the wildcard did not match anything, if [ -e ends up checking for a file whose name is literally the wildcard expression itself, and fails.
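In bash specifically, nullglob is another common way to handle this: with it set, an unmatched wildcard expands to nothing, so you can test the match with an array instead. A minimal sketch, using the same $backup and $stem as above:
shopt -s nullglob
matches=( "$backup"/*_"$stem".gz )
if [ "${#matches[@]}" -gt 0 ]; then
    echo "backup files exist"
fi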

Need guidance with a bash script to check log files in a certain directory for a certain string

I would like to preface this by saying I am a complete noob with scripting. So I have a situation where I need to manually look for a phone number that could live in one of hundreds of files.
So the logs live in the following directory.
/actlogs/sbclogger_archive
The log files are in directories numbered 01-31 inside that directory, and all the files are gzipped.
Inside of those numbered directories are tons of files but the only ones I want to search are "sipd.logthenthedate.gz" and "sipmsg.logthenthedate.gz".
So I need to look in all the files in the following directory.
"/actlogs/sbclogger_archive"
Which has 31 directories labeled "01-31"
Then in each of 01-31 there are hundreds of files; the only ones I want to look at are "sipd.logthenthedate.gz" and "sipmsg.logthenthedate.gz".
The script I am using is below, please let me know what I could do to make this work.
#!/bin/bash
read -p "Enter a phone number: " text
read -p "Enter directory of log file's, Hint it should be /actlogs/sbclogger_archive: " directory
#arr=( $(find $directory -type f -exec grep -l "$text" {} \; | sort -r) )
#find $directory -type f -exec grep -qe "$text" {} \; -exec bash -c '
file=$(find $directory -type f -name 'sipd.log*' -exec grep -qe "$text" {} \; -exec bash -c 'select f; do echo $f; break; done' find-sh {} +;)
if [ -z "$file" ]; then
echo "No matches found."
else
echo "select tool:"
tools=("nano" "less" "vim" "quit")
select tool in "${tools[@]}"
do
case $tool in
"quit")
break
;;
*)
$tool $file
break
;;
esac
done
fi
This would give you the list of files matching:
find \( -name 'sipd.log[0-9]*.gz' -o -name 'sipmsg.log[0-9]*.gz' \) \
-exec sh -c 'gunzip -c {}| grep -m1 -q 888333' \; -print
./18/sipd.log20200118.gz
./7/sipd.log20200107.gz
Note: -m1 tells grep to stop after the first match; since you only need the file name in this case, that's enough.
If you have zgrep, you can shorten it to:
find \( -name 'sipd.log[0-9]*.gz' -o -name 'sipmsg.log[0-9]*.gz' \) \
-exec zgrep -l '888333' {} \;
./18/sipd.log20200118.gz
./7/sipd.log20200107.gz
Also, some of the tools you are suggesting do not support gzip files (nano and some variants of less, for example), in which case you might need to decompress the file and compress it again when done.
And you might want to consider a loop if you want to be able to "quit"; feeding the whole file list to the tool doesn't make sense.
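A rough, untested sketch of that idea, assuming $text holds the number and $tool the editor chosen in your script:
files=$(find . \( -name 'sipd.log[0-9]*.gz' -o -name 'sipmsg.log[0-9]*.gz' \) \
    -exec sh -c 'gunzip -c "$1" | grep -m1 -q "$2"' find-sh {} "$text" \; -print)
for f in $files; do           # unquoted on purpose: these log names contain no spaces
    gunzip "$f"               # decompress so nano/vim can open the file
    "$tool" "${f%.gz}"
    gzip "${f%.gz}"           # compress it again when done
done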
Note: AFAIK zgrep doesn't do recursion:
DESCRIPTION
Zgrep invokes grep on compressed or gzipped files. These grep options will cause zgrep to terminate with an error code: (-[drRzZ]|--di*|--exc*|--inc*|--rec*|--nu*). All other options specified are passed directly to grep. If no file is specified, then the standard input is decompressed if necessary and fed to grep. Otherwise the given files are uncompressed if necessary and fed to grep.
so zgrep -rl "$text" "$directory" or zgrep -rl --include 'sipd.log*.gz' "$text" {01..31} won't work unless you have a special zgrep.
As you must unzip before using your tool, I would divide the problem into two blocks.
First, I would expand the paths you need (looking under <directory> for the phone number <text>), and then iterate to apply the tool (because some tools like vim or nano cannot be piped into).
Try something like this:
#!/bin/bash
#...
# text/directory input stuff
#...
tmpdir=$(mktemp -d)
trap 'rm -rf ${tmpdir}' EXIT
while IFS= read -r file; do
unzipped=${tmpdir}/$(basename "${file}" .gz)
gunzip -c "${file}" > "${unzipped}"
${tool} "${unzipped}"
done < <(zgrep -lw "${text}" "${directory}"/{01..31}/{sipd.logthenthedate.gz,sipmsg.logthenthedate.gz} 2>/dev/null)
Above is the inverted form proposed by Charles Duffy, following this Bash FAQ.
If you prefer to iterate an array, you could build in this way:
# shellcheck disable=SC2207
files=( $(zgrep -lw "${text}" "${directory}"/{01..31}/{sipd.logthenthedate.gz,sipmsg.logthenthedate.gz} 2>/dev/null) )
for file in "${files[@]}"; do
# etc.
since in our particular case the files to match have no spaces in their names, the ShellCheck warning is not so important (hence hidden above).
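If you are on bash 4 or newer, mapfile avoids the SC2207 warning altogether; a sketch:
mapfile -t files < <(zgrep -lw "${text}" "${directory}"/{01..31}/{sipd.logthenthedate.gz,sipmsg.logthenthedate.gz} 2>/dev/null)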
BRs

How to get echo to print only deleted file paths?

I'm trying to write a script, to be run from cron, that creates mysqldumps daily in a directory and also checks all the backups in that directory and removes any older than 7 days.
So my functions work correctly; it's just my last echo command that is not doing what I want it to. This is what I have so far:
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > $SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
find "$filepath" -mtime +7 -type f -delete
echo "$filepath has been deleted."
done
exit
So the backup creation and removal of old files both work. But my problem is that echo "$filepath has been deleted." is printing all files in the directory instead of just the files older than 7 days that were deleted. Where am I going wrong here?
EDIT (Full solution):
This is the full solution that wound up working for me using everyone's advice from the answers and comments. This works for cron jobs. I had to specify the main function's output filepath because the files were being created in the root directory instead of the path specified in Argument $1.
Thank you everyone for the help! The if statement also checks whether or not $1 is the specified directory I want files to be deleted in.
#Variables
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > /path/to/db-backups/directory/$SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
if [[ $1 = "/path/to/db-backups/directory" ]]; then
find "$filepath" -mtime +7 -type f -delete -exec sh -c 'printf "%s has been deleted.\n" "$#"' _ {} +
fi
done
exit
You can merge the echo into the find:
find "$filepath" -mtime +7 -type f -delete -exec echo '{}' "has been deleted." \;
The -delete option is just a shortcut for -exec rm '{}' \; and all the -exec commands are run in the sequence you specify them in.
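With GNU find you could also fold the message into a single -printf action instead of spawning echo once per file; a sketch:
find "$filepath" -mtime +7 -type f -printf "%p has been deleted.\n" -delete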

Copy two most recent files to another directory using bash script

I'm trying to create a bash script to create daily backups of a MySQL db & a web directory. It should tar them, then copy the two most recent .tar.gz files to a weekly directory on day 0 of each week, to a monthly directory on day 1 of each month, and to a year directory on day 1 of each year.
I'm having issues trying to get the 'copy the two most recent files' part to work.
What I've got so far (I used the script from https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script as a base):
#!/bin/sh
# https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Local Source
SOURCE=/path/to/source
# Create directories etc here
DIR=/path/to/backups
# Local Destination
DESTINATION=/path/to/network/share
# Direct all output to logfile found here
#LOG=$$.log
#exec > $LOG 2>&1
# Database Backup User
DATABASE='wordpress'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'
# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
DOW=$(date '+%u')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
#LATEST=$(ls -t | head -1)
#LATEST_DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | tail -3)
#DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
#DAILY=$(ls -1tr $DIR/tmp/daily/ | tail -2 )
DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
# Direct all output to logfile found here
# LOG=$DIR/logs/$$.log
# exec > $LOG 2>&1
# Make Temporary Folder
if [ ! -d "$DIR/tmp" ]; then
mkdir "$DIR/tmp"
echo 'Created tmp directory...'
fi
# Make Daily Folder
if [ ! -d "$DIR/tmp/daily" ]; then
mkdir "$DIR/tmp/weekly"
echo 'Created daily directory...'
fi
# Make Weekly Folder
if [ ! -d "$DIR/tmp/weekly" ]; then
mkdir "$DIR/tmp/weekly"
echo 'Created weekly directory...'
fi
# Make Folder For Current Year
if [ ! -d "$DIR/tmp/${YEAR}" ]; then
mkdir "$DIR/tmp/${YEAR}"
echo 'Directory for current year created...'
fi
# Make Folder For Current Month
if [ ! -d "$DIR/tmp/${YEAR}/$MONTH" ]; then
mkdir "$DIR/tmp/${YEAR}/$MONTH"
echo 'Directory for current month created...'
fi
# Make The Daily Backup
tar -zcvf $DIR/tmp/daily/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/tmp/database.sql
tar -zcvf $DIR/tmp/daily/${NOW}_database.tar.gz $DIR/tmp/database.sql
rm -rf $DIR/tmp/database.sql
echo 'Made daily backup...'
# Check whether it's Sunday (0), if so, then copy most recent daily backup to weekly dir.
if [ $DOW -eq 2 ] ; then
cp $DAILY $DIR/tmp/weekly/
fi
echo 'Made weekly backup...'
# Check whether it's the first day of the year then copy two most recent daily backups to $YEAR folder
if [ $DAY_OF_YEAR -eq 146 ] ; then
cp $DAILY $DIR/tmp/${YEAR}/
fi
echo 'Made annual backup...'
# Check if it's the first day of the month, if so, copy the latest daily backups to the monthly folder
if [ $DAY_OF_MONTH -eq 26 ] ; then
cp $DAILY $DIR/tmp/${YEAR}/${MONTH}/
fi
echo 'Made monthly backup...'
# Merge The Backup To The Local Destination's Backup Folder
# cp -rf $DIR/tmp/* $DESTINATION
# Delete The Temporary Folder
# rm -rf $DIR/tmp
# Delete daily backups older than 7 days
# find $DESTINATION -mtime +7 -exec rm {} \;
echo 'Backup complete. Log can be found under $DIR/logs/.'
I've commented out some parts for now whilst I'm trying to get this working, and I've set the day/month/year checks to today's values so I can see files being copied. I've also left in my commented-out previous attempts at the $DAILY variable.
The issue I'm getting is that upon executing the script, it returns the following:
./backup-rotation-script.sh
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made weekly backup...
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made annual backup...
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made monthly backup...
Backup complete. Log can be found under /path/to/backups/logs/.
But when I check /path/to/backups/tmp/daily/ the files are there and it's clearly seeing them because it's returning the file names in the error.
From what I can gather, it's because $DAILY (find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2) is returning two results on one line? I'm assuming the easiest way to get this to work would probably be to create a for loop that copies the two results over to the weekly/monthly/yearly directories?
I tried adding variations on:
for file in `ls -1t /path/to/backups/tmp/daily/ | head -n2`
do
cp $file /path/to/backups/tmp/weekly/
done
But it didn't go so well. :S
Ideally, I'd also like it to report if it fails but I'm not that far yet. :)
Any help would be much appreciated!
Nevermind! Figured it out.
I removed the 'daily' variable entirely and used the following for the copy instead:
find $DIR/tmp/daily/ -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/tmp/weekly/
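That works because the file names begin with a sortable timestamp. If you ever need to pick the newest files by modification time rather than by name, a GNU find variation along these lines should do it:
find $DIR/tmp/daily/ -type f -printf "%T@ %p\n" | sort -rn | head -n 2 | cut -d' ' -f2- | xargs -I{} cp {} $DIR/tmp/weekly/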
So script now looks like:
#!/bin/sh
# Original script: https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Edited/hacked/chopped/stuff by Khaito
# Redirect all script output to log file located in log directory with date in name.
exec 3>&1 4>&2
trap 'exec 2>&4 1>&3' 0 1 2 3 RETURN
exec 1>/path/to/logs/$(date +"%Y-%m-%d-%H%M")_intranet.log 2>&1
# Local Source
SOURCE=/path/to/source
# Create directories etc here
LOCAL=/path/to/backups
DIR=/path/to/backups/intranet
DIRD=/path/to/backups/intranet/daily
DIRW=/path/to/backups/intranet/weekly
DIRM=/path/to/backups/intranet/monthly
# Local Destination
DESTINATION=/path/to/network/share
# Database Backup User
DATABASE='dbname'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'
# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
DOW=$(date '+%u')
YEARMONTH=$(date +"%Y-%m-%B")
# Make Intranet Folder
if [ ! -d "$LOCAL/intranet" ]; then
mkdir "$LOCAL/intranet"
echo 'Intranet directory created...'
fi
# Make Daily Folder
if [ ! -d "$DIR/daily" ]; then
mkdir "$DIR/daily"
echo 'Daily directory created...'
fi
# Make Weekly Folder
if [ ! -d "$DIR/weekly" ]; then
mkdir "$DIR/weekly"
echo 'Weekly directory created...'
fi
# Make Folder For Current Month
if [ ! -d "$DIR/monthly" ]; then
mkdir "$DIR/monthly"
echo 'Monthly directory created...'
fi
# Make Folder For Current Year
if [ ! -d "$DIR/${YEAR}" ]; then
mkdir "$DIR/${YEAR}"
echo 'Directory for current year created...'
fi
# Tar the intranet files then dump the db, tar it then remove the original dump file.
tar -cvzf $DIRD/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/database.sql
tar -cvzf $DIRD/${NOW}_database.tar.gz $DIR/database.sql
rm -rf $DIR/database.sql
echo 'Made daily backup...'
# Check if it's Sunday (0), if so, copy the two most recent daily files to the weekly folder.
if [ $DOW -eq 0 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRW
fi
echo 'Made weekly backup...'
# Check if it's the first day of the month, if so, copy the two most recent daily files to the monthly folder
if [ $DAY_OF_MONTH -eq 1 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRM
fi
echo 'Made monthly backup...'
# Check if it's the first day of the year, if so, copy the two most recent daily files to the current year folder
if [ $DAY_OF_YEAR -eq 1 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/${YEAR}/
fi
echo 'Made annual backup...'
# Rsync the new files to the network share for backup to tape
rsync -hvrPt $DIR/* $DESTINATION
# Delete local backups
# find $DIRD -mtime +8 -exec rm {} \;
# find $DIRW -mtime +15 -exec rm {} \;
# find $DIRM -mtime +2 -exec rm {} \;
# find $DIR/${YEAR} -mtime +2 -exec rm {} \;
# Delete daily backups older than 7 days on network share
# find $INTRANETDESTINATION/daily -mtime +8 -exec rm {} \;
# Delete weekly backups older than 31 days on network share
# find $INTRANETDESTINATION/weekly -mtime +32 -exec rm {} \;
# Delete monthly backups older than 365 days on network share
# find $INTRANETDESTINATION/monthly -mtime +366 -exec rm {} \;
echo 'Backup complete. Log can be found under /path/to/logs/.'

Bash script file size

I have got a script like this:
#!/bin/sh
cd /home/gamesimport/
ls -t games*.xml | tail -n+2 | xargs rm
mv games*.xml games_ok.xml
It just deletes old games*.xml files and renames the latest games.xml file, but I would also like to change the name if the games.xml file is larger than 1 MB. How would I do that?
FILESIZE=$(stat -c%s games_ok.xml)
MAX=1048576
if [ $FILESIZE -ge $MAX ]; then
#do something else
fi
should work
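For example, the "#do something else" branch could rename the oversized file (the target name here is only an illustration):
mv games_ok.xml games_ok_large.xml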
Simply use find:
find some/where -name games\*.xml -size +1M -exec mv {} {}.big \;
