I'm reading a timestamp from a file and I'd like to copy the value to the START_DATE variable, and add 1 second to that value.
export START_DATE=`cat ${WINDOW_END_FILE}`
Timestamp format
2019-04-03-23.59.59
In the end, I'd like the date to be
2019-04-04-00.00.00
Convert the date to epoch time, then add 1 (this uses BSD/macOS date syntax with -j -f):
fmt='%Y-%m-%d-%H.%M.%S'
date -j -f %s $(( $(date -j -f "$fmt" 2019-04-03-23.59.59 +%s) + 1 )) +"$fmt"
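GNU date (Linux) does not have -j/-f, but the same epoch-plus-one idea can be adapted. A rough sketch, assuming GNU coreutils date and bash (the variable names are illustrative):
fmt='%Y-%m-%d-%H.%M.%S'
ts='2019-04-03-23.59.59'
d=${ts%-*}            # date part: 2019-04-03
t=${ts##*-}           # time part: 23.59.59
epoch=$(date -d "$d ${t//./:}" +%s)   # GNU date can parse "2019-04-03 23:59:59"
date -d "@$((epoch + 1))" +"$fmt"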
OK, here goes. This isn't pretty, but it's fun:
If I understood correctly, let's just say START_DATE=2019-04-03-23.59.59
# Pull the date section
cal=$(echo "$START_DATE" | cut -d - -f 1-3)
# Pull the time section
time=$(echo "$START_DATE" | cut -d - -f 4 | sed 's/\./:/g')
# Use GNU date for the conversion and the 1-second addition
date -d "$cal $time + 1 second" +"%Y-%m-%d-%H.%M.%S"
Output:
2019-04-04-00.00.00
Using GNU awk:
$ gawk '{
gsub(/[^0-9]/," ") # make the format mktime-friendly
print strftime("%F-%H.%M.%S",mktime($0)+1) # to epoch, add one, reformat back
}' file # a timestamp from a file
2019-04-04-00.00.00 # output this time
mktime turns datespec (YYYY MM DD HH MM SS [DST]) into the number of seconds since the epoch. strftime formats that timestamp using the given format string.
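To tie this back to the original question, the same one-liner can feed the START_DATE variable directly (a hypothetical sketch; assumes GNU awk is installed as gawk and that WINDOW_END_FILE, from the question, holds the timestamp):
export START_DATE=$(gawk '{ gsub(/[^0-9]/," "); print strftime("%F-%H.%M.%S", mktime($0)+1) }' "${WINDOW_END_FILE}")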
Related
I have a log file with lines like this:
2022-08-13 19:15:17.170 INFO 550034 --- [ scheduling-3] org.hibernate.SQL_SLOW : SlowQuery: 11387 milliseconds. SQL:
I need a grep command to count the SlowQuery entries from the last hour that took more than 10000 ms.
I've tried
grep "SQL_SLOW" app.log | wc -l
but I can't add the two conditions:
the timestamp (the first 19 characters) must be within the last hour (between the current time minus one hour and now)
the query time must be more than 10000 ms (11387 ms in the example)
grep is the wrong tool for the job; trying to get the date and time conditions to match via RE is just wrong.
How about awk?
echo '2022-08-13 19:15:17.170 INFO 550034 --- [ scheduling-3] org.hibernate.SQL_SLOW : SlowQuery: 11387 milliseconds. SQL:' \
| awk -v now="$(date "+%s")" '
/SlowQuery:/ {
  tmp = "\"" $1 " " $2 "\""
  cmd = "date -d " tmp " +%s"
  cmd | getline ts
  close(cmd)
  t = gensub(/.*SlowQuery: ([0-9]+) milliseconds.*/, "\\1", "g", $0)
  if (now - ts < 3600 && t > 10000) {
    print $0
  }
}'
Brief explanation: first we capture the current time in seconds since the epoch (-v now="$(date "+%s")").
Then, for each log line containing SlowQuery:, we convert its timestamp to seconds since the epoch by shelling out to date (cmd = "date -d " tmp " +%s"; cmd | getline ts) and store the result in the variable ts.
We extract the time taken with gensub (a GNU awk extension): t = gensub(/.*SlowQuery: ([0-9]+) milliseconds.*/, "\\1", "g", $0), and store it in t.
The last step is to check whether less than an hour has passed and the query was slower than 10000 ms: if (now - ts < 3600 && t > 10000) { print $0 }. If both conditions are true, the line is printed.
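Since the original question asked for a count rather than the matching lines themselves, the same approach can be adapted to count in an END block. A sketch based on the answer above (assumes GNU awk for gensub, GNU date, and the app.log file from the question):
awk -v now="$(date +%s)" '
/SlowQuery:/ {
  cmd = "date -d \"" $1 " " $2 "\" +%s"   # convert the line timestamp to epoch seconds
  cmd | getline ts
  close(cmd)
  ms = gensub(/.*SlowQuery: ([0-9]+) milliseconds.*/, "\\1", "g")
  if (now - ts < 3600 && ms > 10000) count++
}
END { print count+0 }
' app.log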
Someone helped me find this solution, but it has some caveats: the logs matched will not be exactly the last hour, but the current (incomplete) hour plus the previous hour (so, depending on the current time, anywhere from 1 to 2 hours).
grep "SQL_SLOW" "app.log" \
| grep "^$(date -d '1 hour ago' '+%Y-%m-%d %H')" \
| grep -P "SlowQuery: \d{5,} milliseconds" \
| wc -l
But the awk solution looks better.
So I have a date in this format: 2019-10-19 23:55:42.797 and I want the millisecond part to be rounded to the nearest second, so the output should look something like this: 2019-10-19 23:55:43
I have tried
date -d "2019-10-19 23:55:42.797" "+%Y-%m-%d %H:%M:%S"
but it's giving me output like 2019-10-19 23:55:42
How should I do this in Linux bash shell?
This can be done with a single GNU awk (gawk) call, since mktime and strftime are gawk extensions:
s='2020-12-31 23:59:59.501'
awk -F. 'gsub(/[-:]/, " ", $1) {
dt = mktime($1)
if ($2 >= 500) dt++
print strftime("%F %X", dt)
}' <<< "$s"
2021-01-01 00:00:00
The behaviour you observe is as expected. The format specifiers represent the actual quantity without rounding. Imagine rounding were included and you had the time "2019-10-19 23:55:42.797" but were not interested in seconds, with the format "%F %H:%M": should that show "2019-10-19 23:55" or "2019-10-19 23:56"? Going even further, imagine you had the time "2020-12-31 23:59:59.501" with the format "%F %T": should it show "2021-01-01 00:00:00" or "2020-12-31 23:59:59"? While we all want 2020 to finish as soon as possible, the latter still remains the correct time representation.
Rounding of times is only relevant when you look at time differences, not at absolute times. Hence, I strongly recommend not implementing any rounding and just using the output that date provides.
However, if, for whatever reason, you actually need to round the time to the nearest second, you can do this:
epoch_ms=$(date -d "2019-10-19 23:55:42.797" "+%s%3N")
epoch=$(( (epoch_ms + 500)/1000 ))
date -d "@$epoch" "+%F %T"
Or in a single line:
date -d "@$(( ( $(date -d "2019-10-19 23:55:42.797" "+%s%3N") + 500 )/1000 ))" "+%F %T"
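As a quick sanity check against the edge case discussed above, the same one-liner applied to "2020-12-31 23:59:59.501" (a usage sketch, assuming GNU date):
date -d "@$(( ( $(date -d "2020-12-31 23:59:59.501" "+%s%3N") + 500 )/1000 ))" "+%F %T"
2021-01-01 00:00:00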
I am trying to parse a log and get the lines between two timestamps. I tried a sed approach like the one below, but I'm facing issues with the regex.
Log pattern:
IP - - [20/Apr/2018:14:25:37 +0000] "GET / HTTP/1.1" 301 3936 "-" "
IP - - [20/Apr/2018:14:44:08 +0000]
----------------------------------
IP- - [20/Apr/2018:20:43:46 +0000]
I need to get the lines between 14:25 and 20:43 on 20th April, as the log contains other dates as well.
Tried this:
sed -n '/\[14:25/,/\[20:43/p' *-https_access.log.1
but it's not working.
Since you mentioned you want logs for 20th April, I'd suggest something like:
$ sed -n '/20\/Apr\/2018:14:25/,/20\/Apr\/2018:20:43/p' *-https_access.log.1
This is much less likely to produce false matches in case "20:43" occurs elsewhere in the log.
sed is not appropriate here because it's hard to compare elements (like the day and the hour).
With awk (comments inline):
awk -F '[ []' '
{
# separate date and time, then rebuild the fields
sub(/:/, " ", $5);$0=$0""
}
# print if it is the right day and the time lies between the two bounds (string comparison works in this case)
$5 ~ /20.Apr.2018/ && $6 >= "14:25" && $6 < "20:44"
' YourFile
More generally, we could pass the date and hours to awk as variables (not the purpose here); a sketch of that follows.
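A possible parameterised form (an untested sketch; day, start and stop are illustrative variable names, not part of the original answer):
awk -F '[ []' -v day='20/Apr/2018' -v start='14:25' -v stop='20:44' '
{
# separate date and time, then rebuild the fields
sub(/:/, " ", $5); $0 = $0 ""
}
# print if it is the requested day and the time lies between the two bounds
$5 ~ day && $6 >= start && $6 < stop
' YourFile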
To print lines between match1 and match2 with sed or awk you can do:
sed -n '/match1/,/match2/p' inputfile
awk '/match1/,/match2/' inputfile
In your example, match1 is 20/Apr/2018:14:25 and match2 is 20/Apr/2018:20:43. So any of these commands should work for you:
sed -n '/20\/Apr\/2018:14:25/,/20\/Apr\/2018:20:43/p' inputfile
awk '/20\/Apr\/2018:14:25/,/20\/Apr\/2018:20:43/' inputfile
or use | as the sed delimiter to avoid escaping the slashes:
sed -n '\|20/Apr/2018:14:25|,\|20/Apr/2018:20:43|p' inputfile
The best solution is to use awk for this. What you need to do is convert your timestamps to Unix time and then do the comparisons. In awk you can do this using mktime():
mktime(datespec [, utc-flag ]): Turn datespec into a timestamp in the same form as is returned by systime(). It is similar to the function of the same name in ISO C. The argument, datespec, is a string of the form YYYY MM DD HH MM SS [DST]. The string consists of six or seven numbers representing, respectively, the full year including century, the month from 1 to 12, the day of the month from 1 to 31, the hour of the day from 0 to 23, the minute from 0 to 59, the second from 0 to 60, and an optional daylight-savings flag.
In order to convert a time format of the form 20/Apr/2018:14:25:37 +0000 into the form 2018 04 20 14 25 37 that mktime expects, you can use a small helper function together with a month-name lookup table:
awk -v tstart="20/Apr/2018:14:25:00" -v tend="20/Apr/2018:20:43:00" \
'function tounix(str) {
split(str,a,"/|:| ")
return mktime(a[3]" "month[a[2]]" "a[1]" "a[4]" "a[5]" "a[6])
}
BEGIN{
month["Jan"]="01";month["Feb"]="02";month["Mar"]="03"
month["Apr"]="04";month["May"]="05";month["Jun"]="06"
month["Jul"]="07";month["Aug"]="08";month["Sep"]="09"
month["Oct"]="10";month["Nov"]="11";month["Dec"]="12"
FS="\\[|\\]"
t1=tounix(tstart)
t2=tounix(tend)
}
{ t=tounix($2) }
(t1<=t && t<=t2)' <file>
This method is robust, as it does true time comparisons that are independent of leap years, day/month/year cross-overs, and so on. In contrast to the other solutions provided, it also does not require that the timestamps tstart and tend actually appear in the file.
I am working on a shell script together with an Excel sheet. So far I have done it as shown in the screenshot, using the command below:
bash execution.sh BehatIPOP.xls| awk '/Script|scenario/' | awk 'BEGIN{print "Title\tResult"}1' | awk '0 == NR%2{printf "%s",$0;next;}1' >> BehatIPOP.xls
My requirement is that, along with the heading Result, I also want to append (concatenate) the current date. I am getting the date with the command below:
$(date +"%d-%m-%y %H:%M:%S")
The date then displays like this: 25-08-2016 17:00:00
But I can't figure out how to use the date command within the command above to achieve a heading like this:
| Title | Result # 25-08-2016 17:00:00|
Thanks for any suggestions..
You can pick up the date inside awk and store it in a variable d like this, if that is what you mean:
awk 'BEGIN{cmd="date +\"%d-%m-%y %H:%M:%S\""; cmd |getline d; close(cmd);print "Result # " d}'
Result # 25-08-16 13:44:05
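If GNU awk is available, its built-in strftime() gives the same result without spawning date at all (a minimal sketch; assumes gawk, whose strftime() defaults to the current time when called without a timestamp argument):
gawk 'BEGIN{ print "Result # " strftime("%d-%m-%y %H:%M:%S") }'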
Don't use awk at all for the header, just use date directly:
{ printf "Title\tResult # "; date +"%d-%m-%y %H:%M:%S"; bash execution.sh BehatIPOP.xls |
awk '/Script|scenario/' |
awk '1 == NR%2{printf "%s",$0;next;}1'; } >> BehatIPOP.xls
Note that there's no need for 2 awks, but I'm keeping that here to minimize the diff. Since I've pulled the header out of the awk, the comparison changes from 0==NR%2 to 1==NR%2.
shell>/bin/date -d "091029 20:18:02" +%s
1256827682
Similarly, I created a shell script:
#My.sh
myDate=`date +'%y%m%d_%H%M%S'`
myDate1=`echo $myDate | sed 's/_/ /g'`
myDate2=`echo $myDate1 | sed 's/\([0-9][0-9][0-9][0-9][0-9][0-9]\) \([0-9][0-9]\)\([0-9][0-9]\)\([0-9][0-9]\)/\/bin\/date -d "\1 \2:\3:\4" +%s/'`
print $myDate2
`$myDate2`
But it doesn't execute the above command. Why?
The backticks in the last line execute the command and then execute the output of that command, which is probably not what you are trying to do -- when I run this I get the output -bash: 1256829614: command not found. Remove the backticks on the last line so that it just says
$myDate2
In addition to the solutions above, I would like to offer an alternative way to get to myDate2:
$ myDate=$(date +'/bin/date -d "%y%m%d %H:%M:%S" +%%s')
$ echo $myDate
/bin/date -d "091029 09:06:29" +%s
The notation:
`$myDate2`
expands $myDate2 and executes the command (and I'll come back to why there are problems with that), and then captures the output - and tries to run the output.
What you are looking for is eval:
eval $myDate2
Handling quotes is tricky - and eval is often a part of the answer. When you build up a string with internal quotes, you need to use eval to get the shell to put the quotes back together.
One very useful tool that I have is a program called al - for argument list.
#include <stdio.h>
int main(int argc, char **argv)
{
while (*++argv != 0)
puts(*argv);
return(0);
}
It prints each separate argument on a separate line. It was almost the first thing I did when looking at what you are up to.
myDate=`date +'%y%m%d_%H%M%S'`
myDate1=`echo $myDate | sed 's/_/ /g'`
myDate2=`echo $myDate1 | sed 's/\([0-9][0-9][0-9][0-9][0-9][0-9]\) \([0-9][0-9]\)\([0-9][0-9]\)\([0-9][0-9]\)/\/bin\/date -d "\1 \2:\3:\4" +%s/'`
print $myDate2
#`$myDate2`
al $myDate2
eval al $myDate2
eval $myDate2
The trace output from this was:
+ date +%y%m%d_%H%M%S
+ myDate=091029_082546
+ sed 's/_/ /g'
+ echo 091029_082546
+ myDate1='091029 082546'
+ sed 's/\([0-9][0-9][0-9][0-9][0-9][0-9]\) \([0-9][0-9]\)\([0-9][0-9]\)\([0-9][0-9]\)/\/bin\/date -d "\1 \2:\3:\4" +%s/'
+ echo 091029 082546
+ myDate2='/bin/date -d "091029 08:25:46" +%s'
+ print /bin/date -d '"091029' '08:25:46"' +%s
/bin/date -d "091029 08:25:46" +%s
+ al /bin/date -d '"091029' '08:25:46"' +%s
/bin/date
-d
"091029
08:25:46"
+%s
+ eval al /bin/date -d '"091029' '08:25:46"' +%s
+ al /bin/date -d '091029 08:25:46' +%s
/bin/date
-d
091029 08:25:46
+%s
+ eval /bin/date -d '"091029' '08:25:46"' +%s
+ /bin/date -d '091029 08:25:46' +%s
usage: date [-jnu] [-d dst] [-r seconds] [-t west] [-v[+|-]val[ymwdHMS]] ...
[-f fmt date | [[[mm]dd]HH]MM[[cc]yy][.ss]] [+format]
Note how when I ran 'al $myDate2' the date string was split into two arguments, but when I ran 'eval al $myDate2', the string was one argument - as required. I was testing on MacOS X, where the date command does not accept the date string format you supplied - that is a whole separate problem. But getting the string healed requires 'eval'.
I didn't even address the issue of what the shell script was trying to do.
I gather from Hai Vu's answer that we're really after the current time in seconds since the epoch; I can sort of see how that might be.
On MacOS X, that is obtained trivially:
date +'%s'
(where the single quotes really aren't needed). The MacOS X manual page also includes the example:
date -j -f "%a %b %d %T %Z %Y" "`date`" "+%s"
This seems a bit convoluted - but would allow you to find the seconds since the epoch for any date previously given by the date command - or a date that will be given at some time in the future (by replacing the back-quoted date with the previous string).
An æon or so ago, I wrote programs 'systime' to print the current time as the number of seconds past the epoch, and also a program 'timestamp' to convert such values back into formatted dates - because none of the standard versions of the 'date' command supported such mechanisms back then (before the C standard was standard, and therefore before strftime() was widely available). I also have a program 'strptime' for converting a formatted date into a timestamp. Ah well - nice to know that the standard programs can now do it.
However, I note that the MacOS 'date' command is a superset of the POSIX standard version; I suspect that the Linux (GNU) 'date' command is a different superset of the POSIX standard, and so on for each platform.