Logstash: Failed parsing date

Summary:
logstash -> elasticsearch -> "Failed parsing date" shown in debug output
Events in the log file contain the field @timestamp (format: 2014-06-18T11:52:45.370636+02:00)
Events are actually processed into Elasticsearch, but 'failed parsing' errors are shown.
Versions:
Logstash 1.4.1
Elasticsearch 1.20
Is there something I am doing wrong?
I have log files that contain events like this:
{"#timestamp":"2014-06-18T11:52:45.370636+02:00","Level":"Info","Machine":"X100","Session":{"MainId":0,"SubId":"5otec"},"Request":{"Url":"http://www.localhost:5000/Default.aspx","Method":"GET","Referrer":"http://www.localhost:5000/Default.aspx"},"EndRequest":{"Duration":{"Main":0,"Page":6720}}}
I use this logstash config:
input {
  file {
    path => "D:/testlogs/*.log"
    codec => "json"
    start_position => [ "beginning" ]
  }
}
filter {
  date {
    match => [ "@timestamp", "ISO8601" ]
  }
}
output {
  stdout {
    codec => json
  }
  elasticsearch {
    protocol => http
    host => "10.125.26.128"
  }
}
When I run logstash with this config on the events in the log files I get the following error:
Failed parsing date from field {:field=>"@timestamp", :value=>"2014-06-18T12:18:34.717+02:00", :exception=>#<TypeError: cannot convert instance of class org.jruby.RubyTime to class java.lang.String>
Now the thing is that the events actually are imported into Elasticsearch, but I still see these errors.
Can this be a problem, or can these failed-parsing errors be ignored?

Your logs are already in JSON format, so you don't need to parse the date.
You can use the json filter to parse all the fields and values.
For example, with this configuration:
input {
  file {
    path => "D:/testlogs/*.log"
    codec => "json"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
I can parse your log successfully, and the output is:
{
    "@timestamp" => "2014-06-18T17:52:45.370+08:00",
    "Level" => "Info",
    "Machine" => "X100",
    "Session" => {
        "MainId" => 0,
        "SubId" => "5otec"
    },
    "Request" => {
        "Url" => "http://www.localhost:5000/Default.aspx",
        "Method" => "GET",
        "Referrer" => "http://www.localhost:5000/Default.aspx"
    },
    "EndRequest" => {
        "Duration" => {
            "Main" => 0,
            "Page" => 6720
        }
    },
    "@version" => "1",
    "host" => "ABC",
    "path" => "D:/testlogs/*.log"
}
I also found that codec => json does not work in the file input, but it does work in the stdin input. Maybe this is a bug in Logstash.
Hope this can help you.
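If you also want the event's @timestamp to be set from the @timestamp string in your logs, a minimal sketch (my addition, assuming the field is still a plain ISO8601 string after the json filter, as in the rubydebug output above) would be to add a date filter after the json filter:

filter {
  json {
    source => "message"
  }
  # assumption: @timestamp is still an ISO8601 string at this point, as shown above
  date {
    match => [ "@timestamp", "ISO8601" ]
  }
}

That way Elasticsearch sorts on the event's own time rather than the time Logstash read the line.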

Related

Logstash influxdb output with geoip filter

I’m trying to use the Logstash geoip filter to send IP address location data to InfluxDB via the InfluxDB output plugin.
My logstash conf file is:
input {
  file {
    path => "/root/geoip_test.txt"
    start_position => "beginning"
  }
}
filter {
  geoip {
    source => "message"
    fields => ["latitude", "longitude"]
  }
}
output {
  stdout { codec => rubydebug }
  influxdb {
    host => "localhost"
    port => 8086
    db => "metrics"
    measurement => "geoip_test"
    codec => "json"
    use_event_fields_for_data_points => true
  }
}
geoip_test.txt file contains only one IP address:
14.143.35.10
The output, including the error I receive, is:
[2020-09-07T12:26:26,696][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
{
    "geoip" => {
        "latitude" => 12.9771,
        "longitude" => 77.5871
    },
    "message" => "14.143.35.10",
    "path" => "/root/geoip_test.txt",
    "@timestamp" => 2020-09-07T10:26:33.963Z,
    "host" => "test",
    "@version" => "1"
}
[2020-09-07T12:26:34,942][WARN ][logstash.outputs.influxdb][main][941178b6897abb80f9a5f7654e5e62ba752d5e20b68781bc62b466e489c2ce56] Non recoverable exception while writing to InfluxDB {:exception=>#<InfluxDB::Error: {"error":"unable to parse 'geoip_test,host=test geoip={\"latitude\"=\u003e12.9771, \"longitude\"=\u003e77.5871},path=\"/root/geoip_test.txt\" 1599474393963': invalid boolean"}
>}
I think the geoip filter generates some boolean field which InfluxDB is not able to work with.
Does anyone have any idea what to do with that? Is it possible to set up the geoip filter in some way so that it generates nothing but the lon and lat fields?
Any help is really appreciated!
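One possible approach (a sketch of my own, not tested against the InfluxDB output): the error looks like it comes from the nested [geoip] object being serialized as a whole, so flattening latitude and longitude to top-level fields and dropping the hash would leave only numeric fields for use_event_fields_for_data_points:

filter {
  geoip {
    source => "message"
    fields => ["latitude", "longitude"]
  }
  mutate {
    # move the nested values to top-level fields and drop the now-empty geoip hash
    rename => {
      "[geoip][latitude]"  => "latitude"
      "[geoip][longitude]" => "longitude"
    }
    remove_field => ["geoip"]
  }
}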

String conversion of special character

A JSON parsing exception is thrown from Logstash whenever ® is encountered.
I need to convert ® into its equivalent HTML-encoded value and then push it into ES through Logstash.
I found a few articles describing how to convert HTML codes into their equivalent symbols, but I am looking for the reverse case.
If I pass ® then it should return &reg;, but if &reg; is passed then it should not be reformatted and should still return &reg;.
Update:
Below is the script I am using to push data into ES:
input {
  file {
    path => ["C:/input.json"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  mutate {
    replace => [ "message", "%{message}" ]
    gsub => [ 'message', '\n', '' ]
  }
  json { source => message }
  mutate {
    remove_field => ["message"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "index"
    document_type => "type"
    document_id => "%{id}"
  }
  stdout { codec => rubydebug }
}
How can I solve this issue?
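One possible approach (a sketch, assuming it is the raw ® character that breaks the json filter) is to gsub it to its HTML entity before the json filter runs; since gsub only matches the literal ® character, an &reg; that is already present passes through unchanged:

filter {
  mutate {
    # replace the registered-trademark symbol with its HTML entity before JSON parsing
    gsub => [ "message", "®", "&reg;" ]
  }
  json { source => "message" }
}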

Logstash filter correct in debugger but doesn't work when searching in kibana

My Logstash filter is correct in the debugger but doesn't show the fields when I search in Kibana for the exact message I tested with. Here is my filter:
filter {
  if [type] == "syslog" {
    grok {
      match => { 'message' => '%{SYSLOG5424LINE}' }
    }
    syslog_pri {
      syslog_pri_field_name => 'syslog5424_pri'
    }
    date {
      match => [ 'syslog5424_ts', 'ISO8601' ]
    }
  }
}
and here is an example of my log message:
<134>1 2017-01-23T10:54:44.587136-08:00 mcmp mapp - - close ('xxx', 32415)
It seems like the filter isn't applying. I restarted my Logstash service and tested in the grok debugger. Any idea what's wrong?
It looks like it works correctly to me.
I created test.conf with:
input {
  stdin {}
}
filter {
  grok {
    match => { 'message' => '%{SYSLOG5424LINE}' }
  }
  syslog_pri {
    syslog_pri_field_name => 'syslog5424_pri'
  }
  date {
    match => [ 'syslog5424_ts', 'ISO8601' ]
  }
}
output {
  stdout { codec => "rubydebug" }
}
and then tested like this:
echo "<134>1 2017-01-23T10:54:44.587136-08:00 mcmp mapp - - close ('xxx', 32415)" | bin/logstash -f test.conf
And the event it gives as output:
{
    "syslog_severity_code" => 6,
    "syslog_facility" => "local0",
    "syslog_facility_code" => 16,
    "syslog5424_ver" => "1",
    "message" => "<134>1 2017-01-23T10:54:44.587136-08:00 mcmp mapp - - close ('xxx', 32415)",
    "syslog5424_app" => "mapp",
    "syslog5424_msg" => "close ('xxx', 32415)",
    "syslog_severity" => "informational",
    "tags" => [],
    "@timestamp" => 2017-01-23T18:54:44.587Z,
    "syslog5424_ts" => "2017-01-23T10:54:44.587136-08:00",
    "syslog5424_pri" => "134",
    "@version" => "1",
    "host" => "xxxx",
    "syslog5424_host" => "mcmp"
}
which has all of the fields that the SYSLOG5424LINE pattern contains.
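One thing worth double-checking (my own guess, not part of the original answer): the question's filter is wrapped in if [type] == "syslog", while the test above drops that conditional, so in the real pipeline the filter only applies if the input actually sets that type. A hypothetical input sketch:

input {
  # the port number here is just an example; the important part is type => "syslog",
  # which makes the if [type] == "syslog" conditional in the filter match
  tcp {
    port => 5514
    type => "syslog"
  }
}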

Issue in renaming Json parsed field in Logstash

I am parsing a JSON log file in Logstash. There is a field named @person.name. I tried to rename this field before sending it to Elasticsearch. I also tried to remove the field, but I couldn't remove or delete it, and because of that my data is not getting indexed in Elasticsearch.
Error recorded in Elasticsearch:
MapperParsingException[Field name [@person.name] cannot contain '.']
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:276)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:221)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parse(ObjectMapper.java:196)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:308)
    at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:221)
    at org.elasticsearch.index.mapper.object.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:138)
    at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:119)
    at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:100)
    at org.elasticsearch.index.mapper.MapperService.parse(MapperService.java:435)
    at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.applyRequest(MetaDataMappingService.java:257)
    at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.execute(MetaDataMappingService.java:230)
    at org.elasticsearch.cluster.service.InternalClusterService.runTasksForExecutor(InternalClusterService.java:458)
    at org.elasticsearch.cluster.service.InternalClusterService$UpdateTask.run(InternalClusterService.java:762)
My Logstash config
input {
  beats {
    port => 11153
  }
}
filter {
  if [type] == "person_get" {
    ## Parsing JSON input to JSON Filter..
    json {
      source => "message"
    }
    mutate {
      rename => { "@person.name" => "@person-name" }
      remove_field => [ "@person.name" ]
    }
    fingerprint {
      source => ["ResponseTimestamp"]
      target => "fingerprint"
      key => "78787878"
      method => "SHA1"
      concatenate_sources => true
    }
  }
}
output {
  if [type] == "person_get" {
    elasticsearch {
      index => "logstash-person_v1"
      hosts => ["xxx.xxx.xx:9200"]
      document_id => "%{fingerprint}" # !!! prevent duplication
    }
    stdout {
      codec => rubydebug
    }
  }
}
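One possible approach (a sketch of my own, not from the thread) is the de_dot filter, which exists specifically for field names containing dots; it may need to be installed first with bin/logstash-plugin install logstash-filter-de_dot, and goes after the json filter:

filter {
  json {
    source => "message"
  }
  # rewrites dotted field names; with the default "_" separator this yields @person_name
  de_dot {
    fields => ["@person.name"]
  }
}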

logstash output to kafka - topic data in message

I want to create a conf file for Logstash that loads data from a file and sends it to Kafka.
The file is in JSON format and has the topicId in it.
This is what I have so far:
input {
  file {
    path => "~/file1.json"
    start_position => "beginning"
    codec => "json"
  }
}
filter {
  json {
    source => message
  }
}
output {
  kafka {
    bootstrap_servers => "localhost"
    codec => plain {
      format => "%{message}"
    }
    topic_id => "???"
  }
}
Can this be done?
Regards,
Ido
Yes, it can be done.
For example, if the message JSON contains a topicId key like:
"topicId": "topic1"
Then in the Logstash kafka output plugin:
output {
  kafka {
    bootstrap_servers => "localhost"
    codec => plain {
      format => "%{message}"
    }
    topic_id => "%{topicId}"
  }
}
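One caveat worth noting (my own observation, not from the thread): since the file input already uses codec => "json", the parsed event may no longer carry the original raw line in a message field, in which case format => "%{message}" would emit the literal text %{message}. A sketch that sidesteps this by re-serializing the whole event with the json codec:

output {
  kafka {
    bootstrap_servers => "localhost"
    # re-serialize the full event as JSON instead of relying on a "message" field
    codec => json
    topic_id => "%{topicId}"
  }
}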
