Grok pattern, date and time formats - logstash

I have log output with the following messages:
[event] 02/05 09:20:01.8 PM message description
[event] 10/26 09:42:27.0 AM message description
How can I use grok to get the date and time in the above format?
The date is 02/05, i.e. MM/dd. The year is not defined, but that's not important since I know it's 2020, so there's no need to define it.
The time is as in the examples above and can be PM or AM.
How can I grab the date and time in Logstash using grok?
I have tried
%{TIME:timestamp} %{GREEDYDATA:Description}
But this captures the timestamp only as 09:20:01.8 and does not include the PM. It would also be good if it were converted to 24-hour time.

You can define a custom pattern to match the entire date/time:
grok {
  pattern_definitions => { "MYTIME" => "%{MONTHNUM}/%{MONTHDAY} %{TIME} [AP]M" }
  match => { "message" => "%{MYTIME:timestamp}" }
}
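To also get the 24-hour conversion, the captured field can then be handed to the date filter; this is a sketch, assuming the Joda-style format below matches your single-digit fractional seconds and that defaulting the missing year to the current year is acceptable:
date {
  # "hh" is the 1-12 clock hour and "a" the AM/PM marker; parsing sets @timestamp, which is stored in 24-hour UTC form
  match => [ "timestamp", "MM/dd hh:mm:ss.S a" ]
}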

Related

grok pattern for Automation Anywhere timestamp

(1/15/2018 3:00:32 AM)
Hi, I have the above format, for which I was trying to write a grok pattern to separate the date, time, and AM/PM. Please help. I was using the pattern below but still don't see the proper output when creating the index.
grok {
  match => {
    "message" => "%{MONTHDAY}/%{MONTHNUM}/%{YEAR}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?"
  }
}
The first number is the month and the second is the day, since the second number is above 12. So you'll have to switch %{MONTHDAY} and %{MONTHNUM}, like this:
"%{MONTHNUM}/%{MONTHDAY}/%{YEAR}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?"

Grok parsing negative numbers into Kibana custom fields

I've been banging my head against the wall over this. I started out with Logstash and grok two days ago and made a bit of progress, but I've been stuck on this particular problem all evening.
I have the following lines of input from a log file being ingested into Logstash:
'John Pence ':'decrease':-0.01:-1.03093: 0.96: 0.97
'Dave Pound':'increase':0.04:1.04000: 0.97: 0.93
With the following grok filter matches:
match => { "message" => "%{QS:name}:%{QS:activity}:%{BASE16FLOAT:Change}:%{BASE16FLOAT:Percentage}: %{BASE16FLOAT:CurrentPrice}: %{BASE16FLOAT:PreviousPrice}" }
match => { "message" => "%{QS:Name}:%{QS:Activity}:-%{BASE16FLOAT:Change}:-%{BASE16FLOAT:Percentage}: %{BASE16FLOAT:CurrentPrice}: %{BASE16FLOAT:PreviousPrice}" }
This produces the following output in Kibana:
As you can see, I can't get the negative numbers to display correctly. How would one correctly capture the minus sign in a grok filter?
Would greatly appreciate some help!
You can simply use the NUMBER grok pattern instead of BASE16FLOAT; NUMBER is built on BASE10NUM, which already allows an optional leading sign. The following grok pattern works perfectly on your input:
grok {
  match => { "message" => "%{QS:name}:%{QS:activity}:%{NUMBER:Change}:%{NUMBER:Percentage}: %{NUMBER:CurrentPrice}: %{NUMBER:PreviousPrice}" }
}
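If you also want the values indexed as numbers rather than as strings, grok's type-conversion suffix can cast them at capture time; a variant of the pattern above:
grok {
  # the :float suffix converts each capture, so Kibana sees numeric fields instead of strings
  match => { "message" => "%{QS:name}:%{QS:activity}:%{NUMBER:Change:float}:%{NUMBER:Percentage:float}: %{NUMBER:CurrentPrice:float}: %{NUMBER:PreviousPrice:float}" }
}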

Converting date to UNIX time in Logstash

Is it possible to convert a date from the "2016-08-22T09:09:55.487Z" format to UNIX time in Logstash? I have seen the opposite operation, but nothing about this direction.
First, you'll have to convert "2016-08-22T09:09:55.487Z" to a date object with the date filter (supposing that the field date contains a string representing a valid ISO8601 timestamp):
date {
  match => ["date", "ISO8601"]
  target => "date_object"
}
At this point you'll have a field date_object containing a Logstash timestamp. This timestamp can be converted to its epoch equivalent with the to_i method. To do this we'll use the ruby filter, which allows executing ruby code as a filter.
ruby {
  code => "event.set('date_epoch', event.get('date_object').to_i)"
}
Then you'll have a field date_epoch, which will be a number representing the UNIX time.
I came across a similar issue today. Unfortunately, the piece of config above has a limitation: it loses the milliseconds from the timestamp during the integer conversion:
ruby {
  code => "event.set('date_epoch', event.get('date_object').to_i)"
}
I've tried several options, including converting the date object to a float, multiplying it by 1000, and then going back to a string; the bottom line is that the precision was never exactly the same.
Finally I came up with the somewhat hacky sample below. It worked with Logstash version 2.4.1.
First I create a field tmpTimestamp in order to convert the parsed timestamp into a plain String:
mutate {
  add_field => ["tmpTimestamp", "%{@timestamp}"]
}
Then a piece of ruby code casts the string into a standard ruby DateTime, converts it to the epoch format (including milliseconds), and goes back to a String:
ruby {
  code => "require 'date'; event['epoch'] = DateTime.parse(event['tmpTimestamp']).strftime('%Q').to_s"
}
Remove the unused tmp field:
mutate {
  remove_field => ["tmpTimestamp"]
}
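On Logstash 5.x and later, where the old event['field'] accessors were removed, the same millisecond-preserving trick would use the get/set event API; a sketch under that assumption:
ruby {
  # DateTime#strftime('%Q') yields milliseconds since the epoch as a string
  code => "require 'date'; event.set('epoch', DateTime.parse(event.get('tmpTimestamp')).strftime('%Q'))"
}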

Logstash: How to save an entry from earlier in a log for use across multiple lines later in the log?

So the format of my logs looks something like this:
02:00:30> First line of log for date of 2014-08-13
...
04:03:30> Every other line of log
My question is: how can I save the date from the first line to create the timestamps for the other lines in the file?
Is there a way to set some kind of "global" field that I can reuse for other lines?
I'm looking at historical logs so the current time isn't much use.
I wrote a memorize filter that you could use to do that; it was posted here.
You'd use it like this:
filter {
  if [message] =~ /date of/ {
    grok {
      match => [ "message", "date of (?<date>\d\d\d\d-\d\d-\d\d)" ]
    }
  } else {
    # parse your log with grok or some other method that doesn't capture the date
  }
  memorize {
    field => "date"
  }
}
So on the first line, because you extract a date, it'll memorize it... and since the date isn't on the remaining lines, the filter will add the memorized date to those events.
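From there, one way to turn the memorized date into the timestamp is to grok the per-line time, join it with the memorized field, and hand the result to the date filter; a sketch, assuming the exact log layout above and the hypothetical field names time, msg, and full_ts:
grok {
  match => [ "message", "^%{TIME:time}> %{GREEDYDATA:msg}" ]
}
mutate {
  # combine the memorized date with this line's time
  add_field => { "full_ts" => "%{date} %{time}" }
}
date {
  match => [ "full_ts", "yyyy-MM-dd HH:mm:ss" ]
}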

Converting date MMM dd HH:mm:ss for logstash

I have a logfile with a custom format, the date field looks like this:
Dec 4 23:59:21
Nov 21 23:59:21
In my logstash config I have this for the filter:
date {
  type => "custom"
  # tell it the format
  custom_timestamp => ["MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
  # locale didn't help
  locale => "en"
}
mutate {
  type => "custom"
  # replace the timestamp
  replace => ["@timestamp", "%{custom_timestamp}"]
}
which supposedly replaces the Logstash timestamp with the custom one from the logs (I am backfilling from old logs at the moment to test).
If I turn on the debug flag and output to stdout, it shows me that @timestamp has been replaced with custom_timestamp, but I get an error message telling me that it cannot be imported:
:exception=>java.lang.IllegalArgumentException: Invalid format: "Dec 4 23:59:21"
What do I have to do to convert the date format?
Turns out that the sample I was working from was wrong. You do not need the mutate replacement; the config is now this:
date {
  type => "custom"
  # tell it the format
  custom_timestamp => ["MMM d HH:mm:ss", "MMM dd HH:mm:ss"]
  # date format is english, computer might not be
  locale => "en"
}
mutate {
  type => "custom"
  # other mutations go here
}
Two misconceptions in this post:
The Java exception is generated because there is no YEAR in your format, so the date cannot be parsed safely.
You do need to run a mutate if you want other applications to see your old imported logs as a coherent timeline; otherwise, when you import all your old logs, you'll only see a few minutes of events, concentrated during the import.
Other than that, good question/answer, it helped me get back on track on my particular problem ;)
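For reference, custom_timestamp is an option from a very old Logstash release; on current versions the equivalent conversion uses the date filter's match option, which parses the field and sets @timestamp itself, with no mutate needed. A sketch, assuming the time was first grokked into a field named timestamp:
date {
  # with no year in the pattern, the filter assumes one based on the current time
  match => [ "timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
  locale => "en"
}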
