Capture the log hourly from daily log file - linux

I have a log file that rotates daily (let's call it the main log file). I need to capture the last hour of logs, up to now, and store it in an hourly log file.
Example:
When I execute the script at 02/06/2017 03:41:35, I need the logs between 02:41:35 and 03:41:35.
Main log file:
[02/06/2017][00:12:41][58162][3690952448][000000000000000000000000
[02/06/2017][00:12:41][58162][3690952448][000000000000000000000000
----------
[02/06/2017][03:41:35][57732][3674167040][000000000000000000000000
[02/06/2017][03:41:35][57732][3674167040][000000000000000000000000
Hourly log file:
[02/06/2017][02:41:35][58162][3690952448][000000000000000000000000
[02/06/2017][02:41:35][58162][3690952448][000000000000000000000000
----------
[02/06/2017][03:41:35][57732][3674167040][000000000000000000000000
[02/06/2017][03:41:35][57732][3674167040][000000000000000000000000
I have executed the commands below, but instead of capturing the hourly log they capture all of the logs:
echo $(date -d'now-1 hours' +"[%d/%m/%Y][%H:%M:%S]") | cat mainlog.log
echo $(date --date='1 hours ago' +"[%d/%m/%Y][%H:%M:%S]") | cat mainlog.log

You are piping the output of echo into cat, but that won't filter anything: cat ignores the input coming through the pipe (stdin) because it is reading from the file mainlog.log.
You could do this instead (note the change in date format - I have removed the %M and %S parts):
grep -F "$(date -d'now-1 hours' +'[%d/%m/%Y][%H:]')" mainlog.log
grep -F "$(date -d'1 hours ago' +'[%d/%m/%Y][%H:]')" mainlog.log
Here, grep would look for the string returned by the date command in your log file.
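Note that matching on the [%H:] prefix of one hour ago selects the whole previous clock hour, not a window ending at the current minute. If you need a true sliding window (02:41:35 to 03:41:35 in the example), a sketch like the following could work, assuming GNU awk (for mktime) and the [DD/MM/YYYY][HH:MM:SS] prefix shown in the sample log:

```shell
# A sketch of a true sliding window: parse each line's bracketed timestamp
# and keep lines from the last hour. Assumes GNU awk (mktime) and the
# [DD/MM/YYYY][HH:MM:SS] prefix shown in the sample log.
since=$(date -d '1 hour ago' +%s)
awk -v since="$since" -F'[][/:]+' '{
    # After splitting on "[", "]", "/" and ":", the fields are:
    # $2=day $3=month $4=year $5=hour $6=minute $7=second
    t = mktime($4 " " $3 " " $2 " " $5 " " $6 " " $7)
    if (t >= since) print
}' mainlog.log
```

This avoids the edge case where the window spans two clock hours (e.g. 02:41 to 03:41), which a plain prefix match cannot express.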


log console output where the same line is being updated

I'm running an application that on start updates the same line:
1 of 1,000,000
200 of 1,000,000
300 of 1,000,000
But the above is on a single line.
Every time the cursor is updated I'd like to write it to a log file so that I can observe the duration between updates.
This command seems to work for lines that were output to the console before the number starts incrementing on the same line, but it does not log the number updates.
command |& tee >(ts "%d-%m-%y %H_%M_%S" > play.log)
Is there a trick to log the screen state on each update along with a timestamp?
You could keep track of changes to a file by using tail, and if you would like to prefix the contents with custom text, you could use xargs. For example:
$ tail -F file | xargs -I# date +"%d-%m-%y %H_%M_%S --> #"
Or pass the output of your command:
$ echo "foo" | xargs -I# date +"%d-%m-%y %H_%M_%S --> #"
11-08-18 15_53_53 --> foo
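Since the application rewrites a single line using carriage returns, the updates never become separate lines that tail or ts can timestamp. One sketch, assuming the output really is \r-delimited, is to translate \r to \n first and then timestamp each piece ("command" is a placeholder for the actual application):

```shell
# Translate carriage returns to newlines so each in-place update becomes
# its own line, then prefix every line with a timestamp.
command |& tr '\r' '\n' | while IFS= read -r line; do
    printf '%s --> %s\n' "$(date +'%d-%m-%y %H_%M_%S')" "$line"
done | tee play.log
```

One caveat: the application may buffer its output when not writing to a terminal, in which case the updates arrive in bursts rather than one at a time.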

bash loop file echo to each file in the directory

I searched for a while and tried it myself, but I have been unable to get this sorted so far. My folder contains these files:
1.txt, 2.txt, 3.txt, 4.txt, 5.txt, 6.txt
I want to print each file's modified time and echo the timestamp into the file:
#!/bin/bash
thedate= `ls | xargs stat -s | grep -o "st_mtime=[0-9]*" | sed "s/st_mtime=//g"` #get file modified time
files= $(ls | grep -Ev '(5.txt|6.txt)$') #exclud 5 and 6 text file
for i in $thedate; do
echo $i >> $files
done
I want to insert each timestamp into each file, but I am getting an "ambiguous redirect" error. Am I doing it incorrectly? Thanks
In this case, files is a "list" of files, so you probably want to add another loop to handle them one by one.
Your description is slightly confusing but, if your intent is to append the last modification date of each file to that file, you can do something like:
for fspec in [1-4].txt ; do
stat -c %y ${fspec} >>${fspec}
done
Note I've used stat -c %y to get the modification time, such as 2017-02-09 12:21:22.848349503 +0800. I'm not sure which variant of stat you're using, but mine doesn't have a -s option. You can still use your option; you just have to ensure it's run on each file in turn, probably something like this (in the for loop above):
stat -s ${fspec} | grep -o "st_mtime=[0-9]*" | sed "s/st_mtime=//g" >>${fspec}
You cannot redirect the output to several files at once, as in > $files.
To process several files you need something like:
#!/bin/bash
for f in ./[0-4].txt ; do
# get file modified time (in seconds)
thedate="$(stat --printf='%Y\n' "$f")"
echo "$thedate" >> "$f"
done
If you want a human-readable time format, replace %Y with %y:
thedate="$(stat --printf='%y\n' "$f")"

Get last 30 minutes from log file

I have a log file that contains logs as follows:
1486307866.155 240207 68.146.231.80 TCP_MISS/200 790 CONNECT clients1.google.com:443 - DIRECT/172.217.6.238 -
1486307866.155 is the time in Unix format, which corresponds to 2017-02-05 07:17:46 (format: Y-m-d H:i:s).
I need a Unix command that gives me the logs from the last 30 minutes in the following format, discarding any details that I don't need:
2017-02-05 07:17:46|68.146.231.80|clients1.google.com:443
Using GNU date and GNU awk you can achieve what you want:
awk -v bt=$(date "+%s" -d "30 minutes ago") '$1 > bt {printf("%s|%s|%s\n", strftime("%F %T",$1), $3, $7)} ' yourfile
Explanation:
the date command date "+%s" -d "30 minutes ago" gets the timestamp from 30 minutes ago
the date command is replaced with its output via the command substitution feature $( ... )
the awk option -v passes that timestamp as variable named bt into the awk script
the script prints only those lines from the file having a value in column one ($1) larger than bt in your desired format
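To check the cutoff logic without waiting for real traffic, the same filter can be exercised on synthetic data (a sketch; GNU awk's strftime is assumed, and the log fields other than the timestamp, IP, and host are stand-ins):

```shell
# Build a two-line sample log: one entry from 1 minute ago (inside the
# window) and one from 2 hours ago (outside it), then apply the filter.
now=$(date +%s)
printf '%s x 1.2.3.4 TCP_MISS/200 790 CONNECT host:443 -\n' \
    "$((now - 60))" "$((now - 7200))" > sample.log
awk -v bt="$(date -d '30 minutes ago' +%s)" \
    '$1 > bt {printf("%s|%s|%s\n", strftime("%F %T", $1), $3, $7)}' sample.log
```

Only the entry from one minute ago should survive the filter.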

Append the output as new line in shell script

I have a log file log.txt which has data in the below format:
Name=abc Date=20140710
Name=xyz Date=20140715
Name=pqr Date=20140810
and so on.
I am fetching the data based on today's date and appending it to a log file on a new line:
today=$(date --date "+1 week" +%Y%m%d)
grep $today log.txt $'\r' >> append_file.txt
But when I run the script, it gives me an error like:
: No such file or directory
Also, in append_file.txt, it stores the data as:
log.txt:Name=abc Date=20140710
Ideally it should keep only the data, i.e.:
Name=abc Date=20140710
Actually, my end objective is to mail the content of append_file.txt, and I want the data line by line, like this:
Name=abc Date=20140710
Name=mno Date=20140710
At present, it mails the data on a single line: Name=abc Date=20140710 Name=mno Date=20140710
Any suggestions?
Your output looks like:
log.txt:Name=abc Date=20140710
This is because grep thinks you're giving it more than one file to work with; when searching multiple files, grep prefixes each match with the filename.
The problem is $'\r' in this line:
grep $today log.txt $'\r' >> append_file.txt
Replace it by:
grep $today log.txt >> append_file.txt
or, if you need to insert \r at the end of each line:
grep $today log.txt | sed -e 's/$/\r/g' >> append_file.txt

Shell Script to check the folder created date against the current date/time

I am preparing a shell script: if a folder's created date is equal to the current date/time, then it needs to call another script.
My requirement is that the script check only the folder's created date against the current date/time, not the date/time of the files inside the folder.
Thanks in advance
You can have a look at the stat command and the change time of the folder. The change time gives the last date when the metadata of the folder was changed (stackexchange link). Timestamps of permission changes are also included in this timestamp, so this may not be exactly what you need.
For the current time, you can use the date command. You can compare timestamps between the stat and the date command if you print the output in seconds since the epoch.
stat --format="%Z" /path/to/folder
date +%s
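The comparison itself might look like the following sketch, which reduces both timestamps to a calendar date; /path/to/folder and the called script are placeholders, and GNU stat and date are assumed:

```shell
# Convert the folder's change time (seconds since the epoch) to a calendar
# date and test it against today's date. The path and the script to call
# are placeholders.
folder=/path/to/folder
ctime=$(stat --format=%Z "$folder")
if [ "$(date -d "@$ctime" +%F)" = "$(date +%F)" ]; then
    echo "folder changed today"
    # ./another_script.sh
fi
```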
Suppose TestToday is your directory name.
Please replace TestToday with your directory name in the script below.
Test.sh:
ls -lrt | grep ^d | grep TestToday | grep "`date +%b`" | awk -v var="`date +%d | bc`" '$7==var {print $NF}' > DIR_NAME
DIR_EXISTS=`cat DIR_NAME`
#echo $DIR_EXISTS
if [ "$DIR_EXISTS" == "TestToday" ];then
echo "Calling Shell Script Here."
else
echo "Directory not created or updated today."
fi
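Parsing ls output is fragile (it breaks on unusual file names and locale-dependent date columns). An alternative sketch uses GNU find's -newermt test, with TestToday as in the question:

```shell
# Ask find for a directory named TestToday whose mtime is newer than
# midnight today; grep -q . succeeds only if find printed a match.
# Assumes GNU find (-newermt).
if find . -maxdepth 1 -type d -name TestToday -newermt "$(date +%F)" | grep -q .; then
    echo "Calling Shell Script Here."
else
    echo "Directory not created or updated today."
fi
```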
