Copy two most recent files to another directory using bash script - linux

I'm trying to create a bash script that makes daily backups of a MySQL db and a web directory. It should tar them, then copy the two most recent .tar.gz files to a weekly directory on day 0 of each week, to a monthly directory on day 1 of each month, and to a year directory on day 1 of each year.
I'm having issues trying to get the 'copy the two most recent files' part to work.
What I've got so far (I used the script from https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script as a base):
#!/bin/sh
# https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Local Source
SOURCE=/path/to/source
# Create directories etc here
DIR=/path/to/backups
# Local Destination
DESTINATION=/path/to/network/share
# Direct all output to logfile found here
#LOG=$$.log
#exec > $LOG 2>&1
# Database Backup User
DATABASE='wordpress'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'
# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
DOW=$(date '+%u')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
#LATEST=$(ls -t | head -1)
#LATEST_DAILY=$(find $DIR/tmp/daily/ -name '*.tar.gz' | sort -n | tail -3)
#DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
#DAILY=$(ls -1tr $DIR/tmp/daily/ | tail -2 )
DAILY=$(find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2)
# Direct all output to logfile found here
# LOG=$DIR/logs/$$.log
# exec > $LOG 2>&1
# Make Temporary Folder
if [ ! -d "$DIR/tmp" ]; then
mkdir "$DIR/tmp"
echo 'Created tmp directory...'
fi
# Make Daily Folder
if [ ! -d "$DIR/tmp/daily" ]; then
mkdir "$DIR/tmp/weekly"
echo 'Created daily directory...'
fi
# Make Weekly Folder
if [ ! -d "$DIR/tmp/weekly" ]; then
mkdir "$DIR/tmp/weekly"
echo 'Created weekly directory...'
fi
# Make Folder For Current Year
if [ ! -d "$DIR/tmp/${YEAR}" ]; then
mkdir "$DIR/tmp/${YEAR}"
echo 'Directory for current year created...'
fi
# Make Folder For Current Month
if [ ! -d "$DIR/tmp/${YEAR}/$MONTH" ]; then
mkdir "$DIR/tmp/${YEAR}/$MONTH"
echo 'Directory for current month created...'
fi
# Make The Daily Backup
tar -zcvf $DIR/tmp/daily/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/tmp/database.sql
tar -zcvf $DIR/tmp/daily/${NOW}_database.tar.gz $DIR/tmp/database.sql
rm -rf $DIR/tmp/database.sql
echo 'Made daily backup...'
# Check whether it's Sunday (0), if so, then copy most recent daily backup to weekly dir.
if [ $DOW -eq 2 ] ; then
cp $DAILY $DIR/tmp/weekly/
fi
echo 'Made weekly backup...'
# Check whether it's the first day of the year then copy two most recent daily backups to $YEAR folder
if [ $DAY_OF_YEAR -eq 146 ] ; then
cp $DAILY $DIR/tmp/${YEAR}/
fi
echo 'Made annual backup...'
# Check if it's the first day of the month, if so, copy the latest daily backups to the monthly folder
if [ $DAY_OF_MONTH -eq 26 ] ; then
cp $DAILY $DIR/tmp/${YEAR}/${MONTH}/
fi
echo 'Made monthly backup...'
# Merge The Backup To The Local Destination's Backup Folder
# cp -rf $DIR/tmp/* $DESTINATION
# Delete The Temporary Folder
# rm -rf $DIR/tmp
# Delete daily backups older than 7 days
# find $DESTINATION -mtime +7 -exec rm {} \;
echo "Backup complete. Log can be found under $DIR/logs/."
I've commented out some parts for now whilst I'm trying to get this working, and I've set the day/month/year checks to today's values so I can see files being copied. I've also left in my commented-out previous attempts at the $DAILY variable.
The issue I'm getting is that upon executing the script, it returns the following:
./backup-rotation-script.sh
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made weekly backup...
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made annual backup...
cp: cannot stat `2015-05-26-1152_files.tar.gz': No such file or directory
cp: cannot stat `2015-05-26-1152_database.tar.gz': No such file or directory
Made monthly backup...
Backup complete. Log can be found under /path/to/backups/logs/.
But when I check /path/to/backups/tmp/daily/ the files are there and it's clearly seeing them because it's returning the file names in the error.
From what I can gather, it's because $DAILY (find $DIR/tmp/daily/ -name *.tar.gz | sort -n | head -2) is returning two results on one line? I'm assuming the easiest way to get this to work would probably be to create a for loop that copies the two results over to the weekly/monthly/yearly directories?
I tried adding variations on:
for file in `ls -1t /path/to/backups/tmp/daily/ | head -n2`
do
cp $file /path/to/backups/tmp/weekly/
done
But it didn't go so well. :S
Ideally, I'd also like it to report if it fails but I'm not that far yet. :)
Any help would be much appreciated!

Nevermind! Figured it out.
I removed the 'daily' variable entirely and used the following for the copy instead:
find $DIR/tmp/daily/ -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/tmp/weekly/
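A slightly more defensive variant of that line (my own sketch, not what I actually used) sorts on the file's modification time instead of on the file name and only matches .tar.gz archives; it assumes GNU find and that file names contain no newlines:
# Sketch: copy the two newest .tar.gz archives by mtime (GNU find assumed).
find "$DIR/tmp/daily/" -maxdepth 1 -type f -name '*.tar.gz' -printf '%T@\t%p\n' \
  | sort -rn \
  | head -n 2 \
  | cut -f2- \
  | xargs -I{} cp -- {} "$DIR/tmp/weekly/"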
So script now looks like:
#!/bin/sh
# Original script: https://github.com/dlabey/Simple-Linux-Bash-Rotating-Backup-Script
# Edited/hacked/chopped/stuff by Khaito
# Redirect all script output to log file located in log directory with date in name.
exec 3>&1 4>&2
trap 'exec 2>&4 1>&3' 0 1 2 3 RETURN
exec 1>/path/to/logs/$(date +"%Y-%m-%d-%H%M")_intranet.log 2>&1
# Local Source
SOURCE=/path/to/source
# Create directories etc here
LOCAL=/path/to/backups
DIR=/path/to/backups/intranet
DIRD=/path/to/backups/intranet/daily
DIRW=/path/to/backups/intranet/weekly
DIRM=/path/to/backups/intranet/monthly
# Local Destination
DESTINATION=/path/to/network/share
# Database Backup User
DATABASE='dbname'
DATABASE_USER='dbuser'
DATABASE_PASSWORD='password'
DATABASE_HOST='localhost'
# DO NOT EDIT ANYTHING BELOW THIS
# Date Variables
DAY_OF_YEAR=$(date '+%j')
DAY_OF_MONTH=$(date '+%d')
DAY_OF_WEEK_RAW=$(date '+%w')
WEEK_OF_YEAR=$(date '+%W')
DAY_OF_WEEK=$((DAY_OF_WEEK_RAW + 1))
DAY=$(date '+%a')
NOW=$(date +"%Y-%m-%d-%H%M")
MONTH=$(date '+%m')
YEAR=$(date '+%Y')
DOW=$(date '+%u')
YEARMONTH=$(date +"%Y-%m-%B")
# Make Intranet Folder
if [ ! -d "$LOCAL/intranet" ]; then
mkdir "$LOCAL/intranet"
echo 'Intranet directory created...'
fi
# Make Daily Folder
if [ ! -d "$DIR/daily" ]; then
mkdir "$DIR/daily"
echo 'Daily directory created...'
fi
# Make Weekly Folder
if [ ! -d "$DIR/weekly" ]; then
mkdir "$DIR/weekly"
echo 'Weekly directory created...'
fi
# Make Folder For Current Month
if [ ! -d "$DIR/monthly" ]; then
mkdir "$DIR/monthly"
echo 'Monthly directory created...'
fi
# Make Folder For Current Year
if [ ! -d "$DIR/${YEAR}" ]; then
mkdir "$DIR/${YEAR}"
echo 'Directory for current year created...'
fi
# Tar the intranet files then dump the db, tar it then remove the original dump file.
tar -cvzf $DIRD/${NOW}_files.tar.gz $SOURCE
mysqldump -h $DATABASE_HOST -u $DATABASE_USER -p$DATABASE_PASSWORD $DATABASE > $DIR/database.sql
tar -cvzf $DIRD/${NOW}_database.tar.gz $DIR/database.sql
rm -rf $DIR/database.sql
echo 'Made daily backup...'
# Check if it's Sunday (%u gives 7 for Sunday), if so, copy the two most recent daily files to the weekly folder.
if [ $DOW -eq 7 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRW
fi
echo 'Made weekly backup...'
# Check if it's the first day of the month, if so, copy the two most recent daily files to the monthly folder
if [ $DAY_OF_MONTH -eq 1 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIRM
fi
echo 'Made monthly backup...'
# Check if it's the first day of the year, if so, copy the two most recent daily files to the current year folder
if [ $DAY_OF_YEAR -eq 1 ] ; then
find $DIRD -type f -printf "%p\n" | sort -rn | head -n 2 | xargs -I{} cp {} $DIR/${YEAR}/
fi
echo 'Made annual backup...'
# Rsync the new files to the network share for backup to tape
rsync -hvrPt $DIR/* $DESTINATION
# Delete local backups
# find $DIRD -mtime +8 -exec rm {} \;
# find $DIRW -mtime +15 -exec rm {} \;
# find $DIRM -mtime +2 -exec rm {} \;
# find $DIR/${YEAR} -mtime +2 -exec rm {} \;
# Delete daily backups older than 7 days on network share
# find $INTRANETDESTINATION/daily -mtime +8 -exec rm {} \;
# Delete weekly backups older than 31 days on network share
# find $INTRANETDESTINATION/weekly -mtime +32 -exec rm {} \;
# Delete monthly backups older than 365 days on network share
# find $INTRANETDESTINATION/monthly -mtime +366 -exec rm {} \;
echo 'Backup complete. Log can be found under /path/to/logs/.'
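The failure reporting mentioned earlier in the question isn't implemented yet. A minimal sketch (assuming mail(1) is configured on the host; admin@example.com is a placeholder address) would be to check the exit status of the critical steps and abort with a message:
# Sketch only: abort and report on the first failing step.
fail() {
    echo "BACKUP FAILED: $1" >&2
    # mail(1) is assumed to be set up; swap in any other alerting mechanism.
    echo "Backup step failed on $(hostname): $1" | mail -s "Backup failure" admin@example.com
    exit 1
}
tar -cvzf "$DIRD/${NOW}_files.tar.gz" "$SOURCE" || fail "tar of $SOURCE"
mysqldump -h "$DATABASE_HOST" -u "$DATABASE_USER" -p"$DATABASE_PASSWORD" "$DATABASE" > "$DIR/database.sql" || fail "mysqldump of $DATABASE"
rsync -hvrPt "$DIR"/ "$DESTINATION" || fail "rsync to $DESTINATION"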

Related

How to compress multiple folders separately into tar.gz archives in another folder

How can I compress files in this scenario?
I have a folder structure like this:
User1:
/home/user1/websites/website1
/home/user1/websites/website2
User2:
/home/user2/websites/website1
/home/user2/websites/website2
/home/user2/websites/website3
And I now need to do backups like this:
Folders for backups per user:
/backup/date/websites/user1/
/backup/date/websites/user2/
And I need a backup tar in each user directory, separately per website, like this:
/backup/date/websites/user1/website1.tar.gz
/backup/date/websites/user1/website2.tar.gz
/backup/date/websites/user2/website1.tar.gz
/backup/date/websites/user2/website2.tar.gz
/backup/date/websites/user2/website3.tar.gz
I have a script like this one that does half of the work:
#VARIABLES
BKP_DATE=`date +"%F"`
BKP_DATETIME=`date +"%H-%M"`
#BACKUPS FOLDERS
BKP_DEST=/backup/users
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=`cd /home && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)"`
#Creating directories per user name
for i in $usersdirectories; do
mkdir -p $BACKUP_DIR/$i/websites
done
But as you can see, I don't know how to tar these into separate archives. In my half-finished script I have done the following:
Create folder structure for backup by datetime (/backup/users/day/hour-minutes)
Create folder structure for backup by users names (/backup/users/day/hour-minutes/user1)
Thanks to all users who try to help me!
I will try to complete your script, but I can't debug it because your environment is hard to reproduce. In the future, it would be better to provide a minimal reproducible example.
#VARIABLES
BKP_DATE=$(date +"%F")
BKP_DATETIME=$(date +"%H-%M")
#BACKUPS FOLDERS
BKP_DEST=/backup/users
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=$(cd /home && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)")
#Creating directories per user name
for i in $usersdirectories; do
for w in /home/$i/websites/*; do
ws=$(basename $w)
mkdir -p $BACKUP_DIR/$i/websites/$ws
tar -czvf $BACKUP_DIR/$i/websites/$ws.tar.gz /home/$i/websites/$ws
done
done
I assume there are no blanks inside the directory names (website1, etc.).
I also replaced the deprecated backtick operators in your code with $(...).
I wish I could comment to ask for more clarification but I will attempt to help you.
#!/bin/bash
# example for user1
archive_path=/backup/data/websites/user1
file_path=/home/user1/websites/*
for i in $file_path
do
tar czf $archive_path/$(basename $i).tar.gz $i
done
Okay, after a small fix, the version from Pierre is working now. If you want, you can adjust it to your needs.
#VARIABLES
BKP_DATE=$(date +"%F")
BKP_DATETIME=$(date +"%H-%M")
#BACKUPS FOLDERS
BKP_DEST=/backup
BKP_DEST_DATE=$BKP_DEST/$BKP_DATE
BKP_DEST_TIME=$BKP_DEST_DATE/$BKP_DATETIME
BACKUP_DIR=$BKP_DEST_TIME
#NUMBER OF DAYS TO KEEP ARCHIVES IN BACKUP DIRECTORY
KEEP_DAYS=7
#Create folders
mkdir -p $BKP_DEST_DATE
mkdir -p $BKP_DEST_TIME
mkdir -p $BACKUP_DIR
#DELETE FILES OLDER THAN {*} DAYS IN BACKUP SERVER DIRECTORY
#echo 'Deleting backup folder older than '${KEEP_DAYS}' days'
find $BKP_DEST/* -type d -ctime +${KEEP_DAYS} -exec rm -rf {} \;
#Do backups
#List only names available users data directories
usersdirectories=$(cd /home/user && find * -maxdepth 0 -type d | grep -Ev "(tmp|root)")
#Creating directories per user name
for i in $usersdirectories; do
for w in /home/user/$i/*; do
#echo ok
ws=$(basename $w)
echo mkdir -p $BACKUP_DIR/$i
echo tar -czvf $BACKUP_DIR/$i/$ws.tar.gz /home/user/$i/$ws
done
done

Linux: Search for old files.... copy the oldest ones to a location... (+- Verify copy)... then delete them

I need help with file handling on Raspbian Stretch Lite running on a Raspberry Pi Zero (fresh install, updated).
The following script is run periodically as a cron job:
partition=/dev/root
imagedir=/etc/opt/kerberosio/capture/
if [[ $(df -h | grep $partition | head -1 | awk -F' ' '{ print $5/1 }' | tr ['%'] ["0"]) -gt 90 ]];
then
echo "Cleaning disk"
find $imagedir -type f | sort | head -n 100 | xargs -r rm -rf;
fi;
Essentially when the SD card is >90% full the oldest 100 files in a directory are deleted.
I want to add some functionality:
1) Copy the 100 oldest files to a NAS drive mounted on the file system and
2) Verify successful copy and
3) Delete the files that were copied.
I have found the following command, which may be helpful in modifying the script above:
find /data/machinery/capture/ -type f -name '*.*' -mtime +1 -exec mv {} /data/machinery/nas/ \;
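One way to approach this (an untested sketch; /mnt/nas/capture is a placeholder for wherever the NAS is mounted, and it assumes file names contain no newlines) is to list the 100 oldest files by modification time, copy each one, verify the copy with cmp, and delete the original only if the comparison succeeds:
#!/bin/bash
# Sketch only: move the 100 oldest capture files to the NAS, verifying each copy first.
imagedir=/etc/opt/kerberosio/capture/
nasdir=/mnt/nas/capture   # placeholder mount point
find "$imagedir" -type f -printf '%T@\t%p\n' \
  | sort -n \
  | head -n 100 \
  | cut -f2- \
  | while IFS= read -r file; do
      if cp -p -- "$file" "$nasdir/" && cmp -s -- "$file" "$nasdir/$(basename "$file")"; then
          rm -- "$file"            # delete only after the copy has been verified
      else
          echo "Copy or verification failed for $file" >&2
      fi
  done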

How to get echo to print only deleted file paths?

I'm trying to write a script, to be run from cron, that creates a mysqldump daily in a directory, checks all the backups in that directory, and removes any older than 7 days.
So my functions work correctly, it's just my last echo command that is not doing what I want it to. This is what I have so far:
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > $SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
find "$filepath" -mtime +7 -type f -delete
echo "$filepath has been deleted."
done
exit
So the backup creation and removal of old files both work. But my problem is that echo "$filepath has been deleted." prints all files in the directory instead of just the files older than 7 days that were deleted. Where am I going wrong here?
EDIT (Full solution):
This is the full solution that wound up working for me using everyone's advice from the answers and comments. This works for cron jobs. I had to specify the main function's output filepath because the files were being created in the root directory instead of the path specified in Argument $1.
Thank you everyone for the help! The if statement also checks whether or not $1 is the specified directory I want files to be deleted in.
#Variables
DBNAME=database
DATE=`date +\%Y-\%m-\%d_\%H\%M`
SQLFILE=$DBNAME-${DATE}.sql
curr_dir=$1
#MAIN
mysqldump -u root -ppassword --databases $DBNAME > /path/to/db-backups/directory/$SQLFILE
echo "$SQLFILE has been successfully created."
#Remove files older than 7 days
for filepath in "$curr_dir"*
do
if [[ $1 = "/path/to/db-backups/directory" ]]; then
find "$filepath" -mtime +7 -type f -delete -exec sh -c 'printf "%s has been deleted.\n" "$#"' _ {} +
fi
done
exit
You can merge the echo into the find:
find "$filepath" -mtime +7 -type f -delete -exec echo '{}' "has been deleted." \;
The -delete option is just a shortcut for -exec rm '{}' \; and all the -exec commands are run in the sequence you specify them in.
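Another option, for what it's worth (a variant I'm adding here, not from the answer above), is to let find do the reporting itself; expressions are evaluated left to right, so placing -print after -delete prints each path only once the delete has succeeded, and the single command replaces the surrounding loop:
# Prints each path only after it has been successfully deleted (no per-file shell needed).
find "$curr_dir" -mtime +7 -type f -delete -print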

How to create shell script for creating logs on linux server based on hostname

All,
I have a requirement that if the hostname of the server starts with tm1 or dm1, the script should create gz format log archives, and if the hostname starts with pc1, it should create bz2 format logs.
I have created a generic shell script to create tar files of log files :
#!/bin/bash
#START
TIME=$(date +%Y%-m%-d)
FILENAME=logsbackup-$TIME.tar.gz
SRCDIR=/var/log/
DESDIR=/var/
find $SRCDIR -mtime +1 | xargs tar -cpzf $DESDIR/$FILENAME
#END
How can I implement the above-mentioned changes in my script?
You can use a condition like this:
#!/bin/bash
#START
TIME=$(date +%Y%-m%-d)
FILENAME=logsbackup-$TIME.tar
SRCDIR=/var/log/
DESDIR=/var/
host=$(hostname)
if [[ $host == @(tm1|dm1)* ]]; then
echo "creating gz format"
find $SRCDIR -mtime +1 -print0 | xargs -0 tar -cpzf $DESDIR/$FILENAME.gz
elif [[ $host == pc1* ]]; then
echo "creating bz2 format"
find $SRCDIR -mtime +1 -print0 | xargs -0 tar -cjf $DESDIR/$FILENAME.bz2
fi
# END
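For what it's worth, the same hostname dispatch can also be written with a case statement (just an alternative sketch), which avoids extended glob patterns entirely:
case $(hostname) in
    tm1*|dm1*)
        echo "creating gz format"
        find "$SRCDIR" -mtime +1 -print0 | xargs -0 tar -cpzf "$DESDIR/$FILENAME.gz"
        ;;
    pc1*)
        echo "creating bz2 format"
        find "$SRCDIR" -mtime +1 -print0 | xargs -0 tar -cpjf "$DESDIR/$FILENAME.bz2"
        ;;
esac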

Delete all files in a directory, except those listed matching specific criteria

I need to automate a clean-up of a Linux-based FTP server that only holds backup files.
In our "/var/DATA" directory is a collection of directories. Any directory here used for backup begins with "DEV". In each "DEVxxx*" directory are the actual backup files, plus any user files that may have been needed in the course of maintenance on these devices.
We only want to retain the following files - anything else found in these "DEVxxx*" directories is to be deleted:
The newest two backups: ls -t1 | grep -E -m2 '^[[:digit:]]{6}_Config'
The newest backup done on the first of the month: ls -t1 | grep -E -m1 '^[[:digit:]]{4}01_Config'
Any file that was modified less than 30 days ago: find -mtime -30
Our good configuration file: ls verification_cfg
Anything that doesn't match the above should be deleted.
How can we script this?
I'm guessing a BASH script can do this, and that we can create a cron job to run daily to perform the task.
Something like this perhaps?
{ ls -t1 | grep -E -m2 '^[[:digit:]]{6}_Config' ;
ls -t1 | grep -E -m1 '^[[:digit:]]{4}01_Config' ;
find -mtime -30 ;
ls -1 verification_cfg ;
} | rsync -a --include-from=- --exclude='*' /var/DATA/ /var/DATA.bak/
rm -rf /var/DATA
mv /var/DATA.bak /var/DATA
For what it's worth, here is the bash script I created to accomplish my task. Comments are welcome.
#!/bin/bash
# This script follows these rules:
#
# - Only process directories beginning with "DEV"
# - Do not process directories within the device directory
# - Keep files that match the following criteria:
# - Keep the two newest automated backups
# - Keep the six newest automated backups generated on the first of the month
# - Keep any file that is less than 30 days old
# - Keep the file "verification_cfg"
#
# - An automated backup file is identified as eight digits, followed by "_Config"
# e.g. 20120329_Config
# Remember the current directory
CurDir=`pwd`
# FTP home directory
DatDir='/var/DATA/'
cd $DatDir
# Only process directories beginning with "DEV"
for i in `find . -maxdepth 1 -type d | egrep '\.\/DEV' | sort` ; do
cd $DatDir
echo Doing "$i"
cd $i
# Set the GROUP EXECUTE bit on all files
find . -type f -exec chmod g+x {} \;
# Find the two newest automated config backups
for j in `ls -t1 | egrep -m2 ^[0-9]{8}_Config$` ; do
chmod g-x $j
done
# Find the six newest automated config backups generated on the first of the month
for j in `ls -t1 | egrep -m6 ^[0-9]{6}01_Config$` ; do
chmod g-x $j
done
# Find all files that are less than 30 days old
for j in `find -mtime -30 -type f` ; do
chmod g-x $j
done
# Find the "verification_cfg" file
for j in `find -name verification_cfg` ; do
chmod g-x $j
done
# Remove any files that still have the GROUP EXECUTE bit set
find . -type f -perm -g=x -exec rm -f {} \;
done
# Back to the users current directory
cd $CurDir
