How to check if a file was created within the last 10 days in a shell script - Linux

I have a bunch of log files which are named according to their creation dates. For example, if a log file is created on 12 March 2018, its name is log-2018-03-12.log.
Here is what I want to do: starting from today's date, I want to check the names of my log files and zip the ones created in the last 10 days.
Here is my code, which zips all log files in a specific directory:
#!/bin/bash
# current time
now=$(date +"%F")
backupfile="backup-$now"
scripthome=/opt/MyStore/action_scripts/deneme/
tendaysbefore=$(date -d "$now - 10 days" '+%F')
for file in "$scripthome"/*
do
    find "$(basename "$file")" | zip -R "$backupfile.zip" "log-2*.log"
done
But I want to zip only the last 10 days' log files, not all of them, and I want to repeat this every 10 days. Also, after creating the zip file, I want to delete the old log files.
In other words, I am trying to write a log-backup script. Can you help me, please?
Thank you very much!

#!/bin/bash
END=10
for ((i=1; i<=END; i++)); do
    file="log-$(date -d "$i days ago" +%F).log"
    echo "$file"
done
The above script prints the file names for the last 10 days. Inside the loop you can then do whatever you need, such as checking whether each file exists and adding it to a zip archive.
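A minimal sketch building on that loop, assuming GNU date and that the logs live in the current directory; zip creates the archive on the first add and appends on later ones:
#!/bin/bash
backupfile="backup-$(date +%F).zip"
for i in $(seq 1 10); do
    file="log-$(date -d "$i days ago" +%F).log"
    # add the file only if it actually exists
    if [[ -f $file ]]; then
        zip -q "$backupfile" "$file"
    fi
done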
Edit:
The following code may be useful for your requirement:
#!/bin/bash
# current time
now=$(date +"%F")
backupfile="backup-$now"
scripthome=/home/bhanu/opt/MyStore/action_scripts/deneme/
tendaysbefore=$(date -d "$now - 10 days" '+%F')
# -t keeps files modified on or after $tendaysbefore; -tt keeps only
# files modified before $now, so this archives the last 10 days of logs
zip -r -tt "$now" -t "$tendaysbefore" "$backupfile.zip" "$scripthome"/log-*.log > add.log 2>&1
# drop archive entries last modified before $tendaysbefore
zip "$backupfile.zip" -d "*" -tt "$tendaysbefore" > delete.log 2>&1
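Note that zip's -t/-tt options filter on the files' modification times, not on the dates embedded in their names, so they only match the names if the logs have not been modified since creation. For the deletion step you asked about, GNU find can remove logs older than 10 days (again by modification time); a minimal sketch, assuming the same directory layout:
find "$scripthome" -name 'log-*.log' -mtime +10 -delete
To repeat the job every 10 days, a crontab entry such as 0 3 */10 * * /path/to/backup.sh (the script path is a placeholder) runs it on the 1st, 11th, 21st and 31st of each month, which is as close as plain cron gets to a strict 10-day cycle.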

Related

List files from a directory between given hours using a Linux command

There are a few errors we need to mitigate, which can be done only once I am able to find the list of files that were created within a given two-hour window.
My file naming pattern is
App_ErrorFile1401_01_11_YYYYMMDDHHMMSS_1234_123456.csv.gz
I need to find the files from between 9 AM and 11 AM of yesterday. Having found the list of files, I will then FTP them to a given server IP.
The FTP part we can do easily, but I am not able to find a pattern with which I can select only those two hours' files for FTP. I don't have much experience with grep and regex patterns, and after going through the net for more than an hour I have yet to figure out how to build my statement.
I can use find/grep. I am on RHEL.
Help would be greatly appreciated.
You can try the following snippet if you can use bash:
#!/bin/bash
# define the time window: yesterday, 09:00:00 to 11:00:00
yesterday=$(date --date="yesterday" +"%Y%m%d")
start="${yesterday}090000"
stop="${yesterday}110000"
find . -name "App_ErrorFile*.csv.gz" | \
while read -r file; do
    # split the name on '_'; field 4 holds the YYYYMMDDHHMMSS timestamp
    IFS=_ read -r -a arr <<< "$file"
    timestamp="${arr[4]}"
    if [[ $timestamp -ge $start && $timestamp -le $stop ]]; then
        echo "$file"
    fi
done
You have to start it from the directory in which the files are located.
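If the modification times of the files match the timestamps in their names, GNU find (standard on RHEL) can express the two-hour window directly with -newermt; a minimal sketch:
find . -name 'App_ErrorFile*.csv.gz' \
    -newermt "yesterday 09:00" ! -newermt "yesterday 11:00" -print
The name-parsing loop above remains the safer choice if the files may have been copied or touched since they were created.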

Script to check the change of crontab using diff

I need a script that takes a copy of the current crontab to a file, then every day takes another copy and compares the two using the diff command; if they do not match, it needs to send an alert mail. Can anyone please help me with this?
Currently I'm using the script below, but the issue is that it sends alerts even when the changes made to the crontab are legitimate. I want to compare the contents using the diff command, so this script does not suit my requirement.
#!/bin/sh
export smtp=smtprelay.intra.coriant.com:25
CROND=/home/ssx00001
ALERT=redmine#coriant.com
checkf=last.crontab.check
if [ -f "$checkf" ]
then
    find "$CROND" -type f -newer "$checkf" | while read tabfile
    do
        echo "Crontab file for Redmine has changed" | mail -s "Crontab changed" "$ALERT"
    done
fi
touch "$checkf"
#!/bin/sh
export smtp=smtprelay.intra.coriant.com:25
ALERT=redmine#coriant.com
crontab -l > /home/ssx00001/y.txt
cat /home/ssx00001/y.txt
diff /home/ssx00001/x.txt /home/ssx00001/y.txt > /home/ssx00001/z.txt
ab=$(wc -l < /home/ssx00001/z.txt)
echo "$ab"
if [ "$ab" -ne 0 ]; then
    echo "Crontab for Redmine has changed" | mail -s "Crontab modified" "$ALERT"
fi
(/home/ssx00001 is the path in which the files are stored.)
Also create a file x.txt in /home/ssx00001 which contains the data of the current cron jobs.
The problem you have is that the diff command requires two files to compare. You cannot check for changes in a file without saving an old version of the file to check against; the crontab command does not do this for you.
Your best bet is to write a wrapper around the crontab command which saves a copy of the original crontab file, runs crontab to edit and install the new file, and then runs diff against the copy you saved.
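A minimal sketch of such a wrapper, assuming a snapshot file in /home/ssx00001 (the snapshot path is an assumption; the alert address is taken from your script):
#!/bin/sh
# wrapper around crontab: snapshot first, then diff after the change
SNAP=/home/ssx00001/crontab.snapshot
ALERT=redmine#coriant.com
crontab -l > "$SNAP" 2>/dev/null   # save the current crontab
crontab "$@"                       # run the real crontab command
if ! crontab -l | diff -q "$SNAP" - >/dev/null 2>&1; then
    crontab -l | diff "$SNAP" - | mail -s "Crontab modified" "$ALERT"
fi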

Script to look at files in a directory

I am writing a script that shows all the files in a directory named "Trash". The script will then prompt the user for which file he wants to "undelete" and send it back to its original directory. Currently I am having a problem with the for statement, but I am also not sure how to have the user input which file, or how to move it back to its original directory. Here is what I have thus far:
PATH=/home/user/Trash
for files in $PATH
do
echo "$files deleted on $(date -r $files)"
done
echo "Enter the filename to undelete from the above list:"
Actual Output:
./undelete.sh: line 6: date: command not found
/home/user/Trash deleted on
Enter the filename to undelete from the above list:
Expected Output:
file1 deleted on Thu Jan 23 18:47:50 CST 2014
file2 deleted on Thu Jan 23 18:49:00 CST 2014
Enter the filename to undelete from the above list:
So I am currently having two problems. First, instead of reading out the files in the directory, it is giving $files the value of PATH; second, the echo command in the do block is not processing correctly. I have changed it around all kinds of different ways but can't get it to work properly.
You're making many mistakes in your script, but the biggest of all is setting the reserved variable PATH, which messes up the standard executable search path and causes errors like date: command not found.
In general, avoid all-caps variable names in your scripts.
To give you a start, you can use a script like this:
trash=/home/user/Trash
restore=$HOME/restored/
mkdir -p "$restore" 2>/dev/null
for file in "$trash"/*
do
    read -p "Do you want to keep $file (y/n): " yn
    [[ "$yn" == [yY] ]] && mv "$file" "$restore"
done
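If you want the list-then-prompt flow from your question, bash's select builtin provides a numbered menu. A sketch, again moving files to a fixed directory: Trash does not record where a file came from, so restoring to the exact original directory would require saving that path at deletion time.
#!/bin/bash
trash=/home/user/Trash
restore=$HOME/restored
mkdir -p "$restore"
# show the deletion (modification) times first
for f in "$trash"/*; do
    echo "$(basename "$f") deleted on $(date -r "$f")"
done
# numbered menu; an invalid choice leaves $f empty
PS3="Enter the number of the file to undelete: "
select f in "$trash"/*; do
    [[ -n $f ]] && mv "$f" "$restore" && echo "Restored $(basename "$f")"
    break
done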

How to read a date from the user in Linux and use it in svn log

I am new to shell scripting.
I want to write a shell script that reads a date from the terminal in the format yyyy-mm-dd and uses it as the start date for listing the revision changes made in an svn repository over a particular date range.
My script is:
echo "Date"
echo "Enter year"
read Y
echo "Enter month"
read M
echo "Enter Date"
read D
D1=`expr $D + 3`
svn log -r {$Y-$M-$D}:{$Y-$M-$D1} http://svn.abc.com/svn/trunk.
This is the script I have written.
I know I should use the date utility rather than reading Y, M and D separately.
I also want to list only the revision numbers, not the full log messages that svn log shows.
I would use the date utility for verification:
while true; do
read -p "Enter a date (YYYY-mm-dd): " user_date
if date=$(date -d "$user_date" +%F); then
# user date was ok
break
fi
done
date3=$(date -d "$date + 3 days" +%F)
svn log -r "$date:$date3" ...
You really need to use date, especially to add 3 days: you don't want to end up with "2014-02-30".
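For the second part of the question, listing only revision numbers: svn log -q suppresses the log messages and prints one header line per revision (r123 | author | date), so the numbers can be extracted with awk. A sketch, using the repository URL from the question:
svn log -q -r "$date:$date3" http://svn.abc.com/svn/trunk | \
    awk '/^r[0-9]/ { print $1 }'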

Shell script: Count files, delete 'X' oldest file

I am new to scripting. Currently I have a script that backs up a directory every day to a file server, and it deletes the oldest file outside of 14 days. My issue is that I need it to count the actual files and delete the 14th-oldest one. When going by days, if the file server or the host is down for a few days or longer, then once it is back up the script will delete several days' worth of backups, or even all of them, depending on the downtime. I want it to always keep 14 days' worth of backups.
I tried searching around and could only find solutions that delete by date, like what I have now.
Thank you for the help/advice!
My code is below; sorry, it's my first attempt at scripting:
#! /bin/sh
# Check for the file. If not found, the connection to the file server is down!
if [ -f /backup/connection ]
then
    echo "File Server is connected!"
    # Directory to be backed up.
    backup_source="/var/www/html/moin-1.9.7"
    # Backup directory.
    backup_destination="/backup"
    # Current date, used to name the files.
    date=$(date '+%m%d%y')
    # Name of the archive.
    filename="$date.tgz"
    echo "Backing up directory"
    # Create the backup of backup_source and place it in backup_destination.
    tar -cvpzf "$backup_destination/$filename" "$backup_source"
    echo "Backup Finished!"
    # Find backups older than 13 days and delete them.
    find /backup -type f -ctime +13 -exec rm -f {} \;
else
    echo "File Server is NOT connected! Date:$(date '+%m-%d-%y') Time:$(date '+%H:%M:%S')" > /user/Desktop/error/$(date '+%m-%d-%y')
fi
Something along these lines might work:
ls -1t /path/to/directory/ | head -n 14 | tail -n 1
In the ls command, -1 lists just the file names (nothing else) and -t sorts them in chronological order (newest first). Piping through head takes just the first 14 names from the ls output, and tail -n 1 then takes the last one of those. This should give you the 14th-newest file.
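Building on that, everything beyond the 14 newest can be removed in one pipeline; a sketch, assuming the archive names contain no spaces or newlines (true for the %m%d%y.tgz names above):
ls -1t /backup/*.tgz | tail -n +15 | xargs -r rm -f
tail -n +15 passes through everything from the 15th line onward, i.e. all but the 14 newest, and xargs -r skips the rm entirely when there is nothing to delete.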
Here is another suggestion. The following script simply enumerates the backups, which eases the task of keeping track of the last n backups. If you need to know the actual creation date, you can check the file metadata, e.g. using stat.
#!/bin/sh
set -e

backup_source='somedir'
backup_destination='backup'
retain=14
filename="backup-$retain.tgz"

check_fileserver() {
    # abort the whole run if the file server is unreachable
    nc -z -w 5 file.server.net 80 2>/dev/null || exit 1
}

backup_advance() {
    # drop the oldest backup, then shift backup-(n-1) to backup-n
    if [ -f "$backup_destination/$filename" ]; then
        echo "removing $filename"
        rm "$backup_destination/$filename"
    fi
    for i in $(seq "$retain" -1 2); do
        file_to="backup-$i.tgz"
        file_from="backup-$(($i - 1)).tgz"
        if [ -f "$backup_destination/$file_from" ]; then
            echo "moving $backup_destination/$file_from to $backup_destination/$file_to"
            mv "$backup_destination/$file_from" "$backup_destination/$file_to"
        fi
    done
}

do_backup() {
    tar czf "$backup_destination/backup-1.tgz" "$backup_source"
}

check_fileserver
backup_advance
do_backup
exit 0
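To run it every day, a crontab entry along these lines would work (the script path is a placeholder):
0 2 * * * /usr/local/bin/backup.sh
Since the script exits early via check_fileserver when the file server is unreachable, a missed day simply leaves the existing 14 backups untouched instead of deleting them.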
