How do I replace a string in a field in Logstash

I have an IP address field from the Windows event log that contains characters like "::ffff:" in front of the IP address. I cannot change the source here, so I have to fix this in Logstash.
I must suck at googling, but I really can't find a simple way to just strip these characters from the IP address fields in Logstash.
I have tried for example
if ("" in [event_data][IpAddress]) {
mutate {
add_field => { "client-host" => "%{[event_data][IpAddress]}"}
gsub => ["client-host", ":", ""]
}
dns {
action => "replace"
reverse => [ "client-host" ]
}
}
but no luck, the colon is still there. How can I replace "::ffff:" in the string "::ffff:10.0.36.39" in Logstash?

The add_field isn't executed until after the gsub, so you need to break it up into two mutate blocks.
mutate {
add_field => { "client-host" => "%{[event_data][IpAddress]}"}
}
mutate {
gsub => ["client-host", "::ffff:", ""]
}
The specific order that mutate works in:
rename(event) if @rename
update(event) if @update
replace(event) if @replace
convert(event) if @convert
gsub(event) if @gsub
uppercase(event) if @uppercase
lowercase(event) if @lowercase
strip(event) if @strip
remove(event) if @remove
split(event) if @split
join(event) if @join
merge(event) if @merge
filter_matched(event)
where filter_matched runs all of the standard actions like add_field.
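Putting it together, a minimal sketch of the whole conditional with the split mutate blocks, keeping the [event_data][IpAddress] source field and the dns lookup from the original attempt, and assuming the prefix to strip is always "::ffff:":
if ("::ffff:" in [event_data][IpAddress]) {
  # copy the raw address first; inside a single mutate, add_field would only run last
  mutate {
    add_field => { "client-host" => "%{[event_data][IpAddress]}" }
  }
  # then strip the IPv4-mapped prefix in a second mutate
  mutate {
    gsub => ["client-host", "::ffff:", ""]
  }
  # reverse-resolve the cleaned address, as in the original attempt
  dns {
    action => "replace"
    reverse => [ "client-host" ]
  }
}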

Related

Creating dynamic Key-Value pairs in logstash

I have the following data in logstash output:
"Details" => "SAID,:EGT1_M2P7_01,::LIP,:10-168-98-203::RIP,:10-81-122-84:",
I want to make dynamic key-value pairs according to the delimiters:
",:" means that "SAID" is the key and "EGT1_M2P7_01" is the value.
"::" means that it is a new line, and again ",:" means that "LIP" is the key and "10-168-98-203" is the value.
I need to know how to do this. Looking forward to answers.
For the input you have given,
"SAID,:EGT1_M2P7_01,::LIP,:10-168-98-203::RIP,:10-81-122-84:"
this filter configuration and stdout output
filter {
kv {
source => "Details"
field_split => "::"
value_split => ":"
}
mutate {
remove_field => ["host", "#timestamp","#version", "message", "sequence" ]
}
}
output {
stdout {
codec => rubydebug
}
}
gives you
{
"LIP," => "10-168-98-203",
"SAID," => "EGT1_M2P7_01,",
"RIP," => "10-81-122-84"
}
Remove any additional fields that are specific to your host system by adding them to the remove_field list above.
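The keys above still carry a trailing comma (e.g. "LIP," instead of "LIP") because the comma sits right next to the ":" delimiter. A minimal sketch that also trims it, assuming a kv filter version that supports the trim_key and trim_value options:
filter {
  kv {
    source      => "Details"
    field_split => "::"
    value_split => ":"
    # strip the stray commas left over from the ",:" delimiter
    trim_key    => ","
    trim_value  => ","
  }
}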

How to capture repeated pattern in logstash(5.4.0) grok?

I would appreciate it if someone could help me out with Logstash grok.
Given a log line like the one below,
IN 192.168.11.2 IN 192.168.11.3
My goal is to put the IP addresses into an array using grok. The list of IPs is dynamic and may grow beyond two entries.
e.g.
tmp = [
"192.168.11.2", "192.168.11.3"
]
However, if I use a filter like the one below, everything ends up in a single field.
filter {
grok {
match => { "message" => "(?<tmp>(IN %{IPV4}(\s)?)*)" }
}
}
Result,
"path" => "/tmp/sample.csv",
"#timestamp" => 2017-08-24T05:00:08.093Z,
"tmp" => "IN 192.168.11.2 IN 192.168.11.3",
"#version" => "1",
"host" => "host.ywlocal.net",
"message" => "IN 192.168.11.2 IN 192.168.11.3"
Would this be possible?
You can use the ruby filter for more advanced parsing:
filter {
ruby {
code => "event.set('ips') = event.get('message').scan(/\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\/)"
}
}
The regexp is not a strict IP-address match, but it should work for your needs.
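With the message above, the event should then carry the addresses as an array, looking roughly like this in rubydebug output:
{
    "message" => "IN 192.168.11.2 IN 192.168.11.3",
        "ips" => [
        [0] "192.168.11.2",
        [1] "192.168.11.3"
    ]
}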

configuring the logstash-output-csv

I am pretty new to Logstash and I have been trying to convert an existing log into CSV format using the logstash-output-csv plugin.
My input log string, which is a custom log written by our application, looks as follows:
'128.111.111.11/cpu0/log:5988:W/"00601654e51a15472-76":687358:<9>2015/08/18 21:06:56.05: comp/45 55% of memory in use: 2787115008 bytes (change of 0)'
I wrote a quick regex and added it to the patterns_dir using the grok plugin.
My patterns are as follows:
IP_ADDRESS [0-9,.]+
CPU [0-9]
NSFW \S+
NUMBER [0-9]
DATE [0-9,/]+\s+[0-9]+[:]+[0-9]+[:]+[0-9,.]+
TIME \S+
COMPONENT_ID \S+
LOG_MESSAGE .+
Without adding any csv filters, I was able to get this output:
{
"message" => "128.111.111.11/cpu0/log:5988:W/"00601654e51a15472-76":687358:<9>2015/08/18 21:06:56.05: comp/45 55% of memory in use: 2787115008 bytes (change of 0)",
"#version" => "1",
"#timestamp" => "2015-08-18T21:06:56.05Z",
"host" => "hostname",
"path" => "/usr/phd/raveesh/sample.log_20150819000609",
"tags" => [
[0] "_grokparsefailure"
]
}
This is my configuration in order to get the csv as an output
input {
file {
path => "/usr/phd/raveesh/temporary.log_20150819000609"
start_position => beginning
}
}
filter {
grok {
patterns_dir => "./patterns"
match =>["message", "%{IP_ADDRESS:ipaddress}/%{CPU:cpu}/%{NSFW:nsfw}<%{NUMBER:number}>%{DATE}:%{SPACE:space}%{COMPONENT_ID:componentId}%{SPACE:space}%{LOG_MESSAGE:logmessage}" ]
break_on_match => false
}
csv {
add_field =>{"ipaddress" => "%{ipaddress}" }
}
}
output {
# Print each event to stdout.
csv {
fields => ["ipaddress"]
path => "./logs/firmwareEvents.log"
}
stdout {
# Enabling 'rubydebug' codec on the stdout output will make logstash
# pretty-print the entire event as something similar to a JSON representation.
codec => rubydebug
}
}
The above configuration does not seem to give the expected output. For now I am only trying to print the ipaddress field to a CSV file, but eventually I need to print all of the captured patterns to the CSV file, so I need output as follows:
128.111.111.111,cpu0,nsfw, ....
Could you please let me know the changes I need to make?
Thanks in advance.
EDIT:
I fixed the regex as suggested using the tool http://grokconstructor.appspot.com/do/match#result
Now my regex filter looks as follows :
%{IP:client}\/%{WORD:cpu}\/%{NOTSPACE:nsfw}<%{NUMBER:number}>%{YEAR:year}\/%{MONTHNUM:month}\/%{MONTHDAY:day}%{SPACE:space}%{TIME:time}:%{SPACE:space2}%{NOTSPACE:comp}%{SPACE:space3}%{GREEDYDATA:messagetext}
How do I capture the individual splits and save them as a CSV?
Thanks
EDIT:
I finally resolved this using the file output plugin.
output {
file{
path => "./logs/sample.log"
message_pattern =>"%{client},%{number}"
}
}
The csv filter in the filter section is for parsing the input and exploding the message into key/value pairs.
In your case you are already parsing the input with grok, so I bet you don't need the csv filter.
But in the output we can see there is a grok failure:
{
"message" => "128.111.111.11/cpu0/log:5988:W/"00601654e51a15472-76":687358:<9>2015/08/18 21:06:56.05: comp/45 55% of memory in use: 2787115008 bytes (change of 0)",
"#version" => "1",
"#timestamp" => "2015-08-18T21:06:56.05Z",
"host" => "hostname",
"path" => "/usr/phd/raveesh/sample.log_20150819000609",
"tags" => [
[0] "****_grokparsefailure****"
]
}
That means your grok expression cannot parse the input.
You should fix the expression according to your input and then the csv will output properly.
Check out http://grokconstructor.appspot.com/do/match for some help.
BTW, are you sure the patterns NSFW, CPU, COMPONENT_ID, ... are defined somewhere?
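Once the grok expression matches, a minimal sketch of a csv output that writes all of the captured fields (using the field names from the edited grok expression above) could look like this:
output {
  csv {
    # list every captured field in the column order you want
    fields => ["client", "cpu", "nsfw", "number", "year", "month", "day", "time", "comp", "messagetext"]
    path   => "./logs/firmwareEvents.log"
  }
}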
HIH

need custom fields of log through grok filter in logstash

I have logstash, kibana and elasticsearch installed on my system, with this filter configuration:
filter{
if [type] == "syslog" {
grok {
match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
add_field => [ "received_at", "%{#timestamp}" ]
add_field => [ "received_from", "%{host}" ]
}
mutate {
add_field => {
"timestamp" => "%{TIME} %{MONTH} %{monthday}"
}
}
syslog_pri { }
date {
match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
}
}
and I am receiving output in Kibana, but I need some fields which are as follows:
@timestamp
@version
_id
_index
_type
_file
Log Level
Host Name
Host IP
Process Name
Response Time
I tried adding the timestamp field, but it prints the same literal string instead of a dynamic result.
You're confusing patterns with fields.
A pattern is a short-hand notation that represents a regular expression, such as %{WORD} as a shortcut for "\b\w+\b".
A field is where data - including information matched by patterns - is stored. It's possible to put a pattern into a field like this: %{WORD:my_field}
In your grok{}, you match with: %{SYSLOGTIMESTAMP:syslog_timestamp}, which puts everything that was matched into a single field called syslog_timestamp. This is the month, monthday, and time seen at the front of syslog messages.
Even though SYSLOGTIMESTAMP is itself defined as "%{MONTH} +%{MONTHDAY} %{TIME}", they don't have that ":name" syntax, so no fields are created for MONTH, MONTHDAY, and TIME.
Assuming that you really do want to make a new field in the format you describe, you'd need to either:
make a new pattern to replace all of SYSLOGTIMESTAMP that would make fields out of the pieces of information.
use the existing pattern to create the syslog_timestamp field as you're doing, and then grok{} that with a simple pattern to split it apart.
I'd recommend #2, so you'd end up with something like this:
grok {
match => { "syslog_timestamp" => "%{MONTH:month} +%{MONTHDAY:monthday} %{TIME:time}" }
}
That should do it.
Please note that your field will be a string, so it won't be of any use in range queries, etc. You should use the date{} filter to replace @timestamp with your syslog_timestamp information.
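If you still want the rearranged string field from your mutate attempt, a sketch building it from the newly created fields (note the lowercase names, since %{TIME} in add_field refers to a field, not a pattern) might look like:
mutate {
  # reference the fields captured by the second grok, not the grok pattern names
  add_field => { "timestamp" => "%{time} %{month} %{monthday}" }
}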
Good luck.

Parse error for float values

I'm trying to get a float value from a log line, but the Logstash mutate filter rounds the value and converts it to an integer.
The log line is
f413e89e-8c2f-e411-97a5-005056820dbe|0,0033
and the configuration file is
input {
file {
path => "log.txt"
}
}
filter {
grok {
match => ["message", "%{UUID:request_object_id}[/|]%{LOCALNUM:total_time}"]
}
mutate {
gsub => ["total_time", "[,]", "."]
convert => [ "total_time", "float" ]
}
}
output {
elasticsearch { host => localhost }
}
LOCALNUM is a custom pattern and it is
(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:[,][0-9]+)?)|(?:[,][0-9]+)))
(uses "," instead of "." in floating numbers).
With this configuration, total_time is 0 instead of 0.0033.
Looking at the Logstash source code, it does this:
convert(event) if @convert
gsub(event) if @gsub
So it does the convert before the gsub. Try splitting your mutate into two different mutates and it will fix your problem.
mutate {
gsub => ["total_time", "[,]", "."]
}
mutate {
convert => [ "total_time", "float" ]
}
Oh, I found my mistake. I used two separate mutate blocks, one for gsub and the other for convert, and it solved the problem.
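With the two mutate blocks split like that, the event should come out with total_time as a real float, roughly:
{
    "request_object_id" => "f413e89e-8c2f-e411-97a5-005056820dbe",
           "total_time" => 0.0033
}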
