Count the length of the message field in Logstash

I want to get the length of the message field and put it in a new field on the Logstash event. For example:
From this event
{
...
"message" => "[2021-12-22T04:41:20.151992+00:00] testing.INFO: Message error"
...
}
Into this event
{
...
"message" => "[2021-12-22T04:41:20.151992+00:00] testing.INFO: Message error"
"mes_leng" => 76
...
}
I've tried using a filter with the Ruby plugin, with code to extract the length, but nothing happens in the output logs.
Is it possible to manipulate the event like this in Logstash? Much appreciated.

Hope this helps (tested on 7.16.x):
ruby {
  code => "event.set('message_length', event.get('message').length)"
}
In your case, with "mes_leng":
ruby {
  code => "event.set('mes_leng', event.get('message').length)"
}
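Note that event.get('message') returns nil if an event has no message field, and calling .length on nil raises a Ruby exception (the event then gets tagged _rubyexception). A slightly defensive variant, as a sketch:
ruby {
  code => "
    msg = event.get('message')
    # guard against events without a message field; nil has no length
    event.set('mes_leng', msg.nil? ? 0 : msg.length)
  "
}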

Related

_grokparsefailure while applying grok filters

My Log
2021-08-19 15:43:55,122 INFO c.t.i.c.ClassA - log message Service Name=Service-OCR Content ID=ABC.xml Category=SUCCESS Timestamp=2021-08-19T15:43:55.122292244 Message=The response has been received. Unit Name=N/A
2021-08-19 15:43:55,122 ERROR c.t.i.c.ClassB - log message Service Name=Service-OCR Engine Content ID=ABC.xml Category=ERROR Timestamp=2021-08-19T15:43:55.122292244 Message=The response has been received. Unit Name=TM
My logstash.conf is
input {
  tcp {
    port => 12201
    codec => json_lines
  }
}
filter {
  grok {
    patterns_dir => ["./patterns"]
    match => {
      'message' => '%{TIMESTAMP_ISO8601} %{LOGLEVEL:level} %{STRING} - \"log message \"Service Name=\"%{STRING} \"Content ID=\"%{STRING} \"Category=\"%{STRING} \"Timestamp=\"%{TIMESTAMP_ISO8601} \"Message=\"%{STRING} \"Unit Name=\"%{STRING}'
    }
  }
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "logstash"
  }
}
I know that STRING is not one of the built-in grok patterns; that's why I have defined a custom pattern:
STRING ^[a-zA-Z0-9 !##$%^&*()_+\-=\[\]{};':"\\|,.<>\/?]{1,}$
I am assuming that wherever I have used STRING it can include special characters, spaces, and numbers, just like a String in Java.
But I am still unable to parse my logs with this. Any help?
You have anchored STRING to the start and end of the line using ^ and $. It is never going to match in the way you are using it. Remove the ^ and $.
Instead of the custom pattern STRING, you can simply use %{GREEDYDATA}. This will solve your problem.
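Putting both answers together, here is a sketch of a match that parses the sample lines without the custom pattern (the field names are illustrative; %{DATA} is the non-greedy counterpart of %{GREEDYDATA} and is the safer choice for the intermediate values):
grok {
  match => {
    "message" => "%{TIMESTAMP_ISO8601:logtime} %{LOGLEVEL:level} %{DATA:class} - log message Service Name=%{DATA:service} Content ID=%{DATA:content_id} Category=%{DATA:category} Timestamp=%{TIMESTAMP_ISO8601:event_time} Message=%{DATA:msg} Unit Name=%{GREEDYDATA:unit}"
  }
}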

Logstash - Multiple grok pattern not working together

I am very new to Logstash. I have two kinds of log lines:
Pattern 1 : --2019-05-09 08:53:45.057 -INFO 11736 --- [ntainer#1-0-C-1] c.s.s.service.MessageLogServiceImpl : [adc7fd862db5307a688817198046b284dbb12b9347bed9067320caa49d8efa381557392024151] Event => Message Status Change [Start Time : 09052019 08:53:44] : CUSTOM_PROCESSING_COMPLETED
Pattern 2 : --2019-05-09 06:49:05.590 -TRACE 6293 --- [ntainer#0-0-C-1] c.s.s.service.MessageLogServiceImpl : [41a6811cbc1c66eda0e942712a12a003d6bf4654b3edb6d24bf159b592afc64f1557384545548] Event => Message Failure Identified : INVALID_STRUCTURE
There are many other kinds of lines, but I want to consider only these two types. Hence I used the filter below:
grok {
  # Event: message status change
  match => {
    "message" => "--(?<logtime>[^\]]*) -%{LOGLEVEL:level} (?<pid>\d+) --- \[(?<thread>[^\]]+)] (?<classname>[\w.]+)\s+: \[(?<token>[^\]]+)] Event \=> Message Status Change \[Start Time : (?<start>[^\]]*)\] : (?<status>[\w]+)"
  }
  add_field => {
    "event" => "message_status_change"
  }
}
grok {
  # Event: message failure
  match => {
    "message" => "--(?<logtime>[^\]]*) -%{LOGLEVEL:level} (?<pid>\d+) --- \[(?<thread>[^\]]+)] (?<classname>[\w.]+)\s+: \[(?<token>[^\]]+)] Event \=> Message Failure Identified : (?<code>[\w]+)"
  }
  add_field => {
    "event" => "message_failure"
  }
}
I have also noticed that each of these grok patterns works individually (if I comment one out, the other one works perfectly). Logstash also starts fine when both patterns are active. But it raises a _grokparsefailure when both of them are enabled and a new line is added to the log file.
Also: although I have configured the input to read the file from the beginning, it is not read even after a server restart unless I add a new line to the log. Why this behaviour?
Thanks in advance.
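One common way to avoid the failure tags, sketched here assuming grok's default break_on_match behavior, is to put both patterns into a single grok filter: the patterns are tried in order and the first match wins, so an event matching either pattern is not tagged with _grokparsefailure. The event label can then be derived from which capture is present:
filter {
  grok {
    match => {
      "message" => [
        "--(?<logtime>[^\]]*) -%{LOGLEVEL:level} (?<pid>\d+) --- \[(?<thread>[^\]]+)] (?<classname>[\w.]+)\s+: \[(?<token>[^\]]+)] Event \=> Message Status Change \[Start Time : (?<start>[^\]]*)\] : (?<status>[\w]+)",
        "--(?<logtime>[^\]]*) -%{LOGLEVEL:level} (?<pid>\d+) --- \[(?<thread>[^\]]+)] (?<classname>[\w.]+)\s+: \[(?<token>[^\]]+)] Event \=> Message Failure Identified : (?<code>[\w]+)"
      ]
    }
  }
  # label the event based on which capture the matching pattern produced
  if [status] {
    mutate { add_field => { "event" => "message_status_change" } }
  } else if [code] {
    mutate { add_field => { "event" => "message_failure" } }
  }
}
As for reading the file from the beginning: the file input remembers its position per file in a sincedb file, so start_position => "beginning" only applies to files it has never seen before. For testing, setting sincedb_path => "/dev/null" forces a full re-read on every restart.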

Logstash Parse Date Issue

I am using Logstash to read some logs. I have a log file in which the timestamp consists only of a time field, e.g. 08:28:20,500, with no date. I would like to map it to today's date. How should I do that with the date filter?
A line of my log file looks like this:
08:28:20,500 INFO [org.jboss.as.connector.subsystems.datasources] (ServerService Thread Pool -- 27) JBAS010403: Deploying JDBC-compliant driver class org.h2.Driver (version 1.3)>>"C:\CIGNA\jboss\jboss.log"
Is there anyone who can help with this issue? Great thanks in advance.
EDIT
After using a ruby filter, I have managed to solve the issue. However, there is occasionally a Ruby exception. As seen below, the first message ran into a Ruby exception while the second one runs fine. I wonder how this happens and whether anyone can give me some advice. Thanks.
{
"message" => "10:30:39 FATAL [org.jboss.as.server] (default task-1) JBAS015957: Server boot has failed in an unrecoverable manner; exiting. See previous messages for details.\r",
"@version" => "1",
"@timestamp" => "2016-07-26T02:43:17.379Z",
"path" => "C:/CIGNA/jboss/jboss.log",
"host" => "SIMSPad",
"type" => "txt",
"Time" => "10:30:39",
"Level" => "FATAL",
"JavaClass" => "org.jboss.as.server",
"Message" => "(default task-1) JBAS015957: Server boot has failed in an unrecoverable manner; exiting. See previous messages for details.\r",
"tags" => [
[0] "_rubyexception"
]
}
{
"message" => "10:30:39 DEBUG [org.jboss.as.quickstarts.logging.LoggingExample] (default task-1) Settings reconfigured: JBOSS EAP Resettlement\r",
"@version" => "1",
"@timestamp" => "2016-07-26T02:30:39.000Z",
"path" => "C:/CIGNA/jboss/jboss.log",
"host" => "SIMSPad",
"type" => "txt",
"Time" => "10:30:39",
"Level" => "DEBUG",
"JavaClass" => "org.jboss.as.quickstarts.logging.LoggingExample",
"Message" => "(default task-1) Settings reconfigured: JBOSS EAP Resettlement\r"
}
And the updated filter part of my Logstash .conf file is as shown:
filter {
  grok {
    match => { "message" => '\A%{TIME:Time}%{SPACE}%{WORD:Level}%{SPACE}\[%{PROG:JavaClass}]%{SPACE}%{JAVALOGMESSAGE:Message}'}
  }
  ruby {
    code => "
      p = Time.parse(event['message']);
      event['@timestamp'] = LogStash::Timestamp.new(p);
    "
  }
}
You can do that via a ruby filter. Ruby can parse this out of the box. Sorry, I have not tried it with the date filter (it might work as well). Here is my example.
My configuration:
input {
  stdin {
  }
}
filter {
  ruby {
    code => "
      p = Time.parse(event['message']);
      event['myTime'] = p;
    "
  }
}
output {
  stdout { codec => rubydebug }
}
Input and output:
artur#pandaadb:~/dev/logstash$ ./logstash-2.3.2/bin/logstash -f conf2/
Settings: Default pipeline workers: 8
Pipeline main started
08:28:20
{
"message" => "08:28:20",
"#version" => "1",
"#timestamp" => "2016-07-25T09:43:28.814Z",
"host" => "pandaadb",
"myTime" => 2016-07-25 08:28:20 +0100
}
I am simply passing in your string; you can use the variable that you parsed, e.g. "Time", in the ruby code.
Ruby is quite smart when parsing dates and recognises that this is a time rather than an entire date, so it uses today's date and modifies the time only.
Hope that helps!
EDIT:
I tried the date filter just now, and it works differently: it sets the date to the 1st of January of the current year. So it appears that the ruby filter is your solution, as the date filter does not offer any way I know of to modify the date after it has been matched.
EDIT 2:
In the comments you asked how to write it into the @timestamp field. The @timestamp field is a predefined field that expects a Logstash Timestamp object (not a string or datetime object). So you can write directly into that field, but you must create such an object. (Alternatively this would also work using the date filter, but why double the filters?)
Here is the necessary code:
ruby {
  code => "
    p = Time.parse(event['message']);
    event['@timestamp'] = LogStash::Timestamp.new(p);
  "
}
EDIT 3:
With regard to the updated question: your issue is that you are referencing the wrong field of the event.
From your log update, you can see that your grok is parsing things correctly, e.g.:
"message" => "10:30:39 FATAL [org.jboss.as.server] (default task-1) JBAS015957: Server boot has failed in an unre ...",
"Time" => "10:30:39"
In your filter, however, you reference the "message" field of the event, not the "Time" field.
So ruby will attempt to parse the entire message string into a date. Why this works in the second log is a mystery to me :D
You need to change your filter to:
filter {
  grok {
    match => { "message" => '\A%{TIME:Time}%{SPACE}%{WORD:Level}%{SPACE}\[%{PROG:JavaClass}]%{SPACE}%{JAVALOGMESSAGE:Message}'}
  }
  ruby {
    code => "
      p = Time.parse(event['Time']);
      event['@timestamp'] = LogStash::Timestamp.new(p);
    "
  }
}
This tells the ruby filter to take the time from the event field "Time".
Regards,
Artur
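Note that the event['field'] syntax used throughout this answer is from the Logstash 2.x era; since Logstash 5 the Ruby filter has to go through the event API instead of direct hash access. A sketch of the same filter in the newer style:
ruby {
  code => "
    # event.get / event.set replaced event[...] access in Logstash 5+
    t = Time.parse(event.get('Time'))
    event.set('@timestamp', LogStash::Timestamp.new(t))
  "
}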

logstash-filter-rest resolves field references incorrectly: it always sends the first field value it referenced

I recently started using logstash-filter-rest, and I configured it like below:
rest {
  url => "http://example.com/api"
  sprintf => true
  method => "post"
  params => {
    "post_key" => "%{a_field_in_log}"
  }
  response_key => "my_key"
}
After this, Logstash makes a POST request to my API, but something is wrong: the value of a_field_in_log is identical in every request (I checked the API access log; every value is the first field value that was sent to the API). It seems like there is a cache for referenced fields.
Has someone encountered the same problem? Thank you for your help!
As it happens, I'm the author of logstash-filter-rest and I'm glad to hear that someone is actually using it.
I was able to reproduce your issue. It was a bug and (good news) I fixed it. Thank you for reporting!
You can now update to the new version 0.1.5:
../logstash/bin/plugin update logstash-filter-rest
Test config:
input { stdin {} }
filter {
  grok { match => [ "message", "Hello %{WORD:who}" ] }
  rest {
    url => "http://requestb.in/1f7s1sg1?test=%{who}"
    method => "post"
    sprintf => true
    params => {
      "hello" => "%{who}"
    }
  }
}
output { stdout { codec => "rubydebug" } }
Test data:
Hello John
Hello Doe
Hello foo
Hello bar
Result:
http://requestb.in/1f7s1sg1?inspect (looks good)
Many thanks for contributing! I hope everything works as expected now.

How to modify @timestamp with an entry in logs using logstash

I have some logs whose entries contain only a time:
1. 17:20:45.331|ERR|....
2. 17:20:54.715|SYS|.....Logging started for [....] (Date=[07/28/2014], ...
3. 17:20:54.716|SYS....
and so on
I have the date in only one line of the logs. Based on that, I want to create a timestamp combining the logging date from the logs with the time in each entry.
I am able to get the time in each entry, and I can get log_message => " Logging started for [....] (Date=[07/28/2014], ..." as one entry.
Is it possible to get the date from this entry and modify every other entry's timestamp?
How can I combine the time and the date and modify the timestamp?
Any help will be appreciated, as I am new to Logstash.
My filter in the Logstash conf:
filter {
  grok {
    match => [ "message", "%{TIME:time}\|%{WORD:Message_type}\|%{GREEDYDATA:Component}\|%{NUMBER:line_number}\| %{GREEDYDATA:log_message}" ]
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ] # need to modify this to use date + %{time}
  }
}
The time field has milliseconds as well.
Your options are:
1. Change how things are logged to get the date included
2. Write something to fix the logs before they are picked up by logstash (i.e. something that looks for the date entry and modifies the log)
3. Use the memorize plugin that I wrote (I submitted a pull request to try to get it into a future version)
The plugin is detailed in this answer. The caveat with this solution is that if the plugin misses the line that has the date, you'll have issues with the remainder of the file. This could happen if you restart logstash, so you'll need to add some logic to handle this -- in the example below, I assume that if we haven't seen the date yet, it's today.
An implementation using the memorize plugin would look like this:
filter {
  if ([message] =~ /Date=/) {
    grok { match => [ "message", "Date=%{DATE:date}" ] }
  }
  # either add the field date to the saved data or pull the date from the saved data
  memorize { fields => ["date"] }
  # if we still don't have a date, let's just assume it's today
  if (![date]) {
    ruby {
      code => 'event["date"] = Time.now.strftime("%m/%d/%Y")'
    }
  }
  if ([message] !~ /Date=/) {
    # grok to parse the message
    grok { match => [ "message", "%{TIME:time}\|%{WORD:Message_type}\|%{GREEDYDATA:Component}\|%{NUMBER:line_number}\| %{GREEDYDATA:log_message}" ] }
    # now add in the date
    mutate {
      add_field => {
        datetime => "%{date} %{time}"
      }
    }
  }
}
(This example has not been tested, so there may be syntax/logic errors, but it should get you down the right path).
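To actually move the combined value into @timestamp, a date filter could follow the mutate. A sketch, assuming the MM/dd/yyyy layout from Date=[07/28/2014] and a time with millisecond precision (the format string would need adjusting to the real logs):
date {
  match => [ "datetime", "MM/dd/yyyy HH:mm:ss.SSS" ]
}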
