Logstash: keep syslog host

I have a syslog server and the ELK stack on the same server, with a directory for each syslog source.
I'm trying to parse the syslog files with Logstash, and I'd like to keep the IP address or the hostname of the syslog source in the "host" field. At the moment I get 0.0.0.0 as the source after Logstash parsing.
My logstash.conf:
input {
  file {
    path => ["path/to/file.log"]
    start_position => "beginning"
    type => "linux-syslog"
    ignore_older => 0
  }
}
filter {
  if [type] == "linux-syslog" {
    grok {
      match => { "message" => "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
  }
}
output {
  elasticsearch {
    hosts => ["#IP_Elastic:Port_Elastic"]
  }
  stdout { codec => rubydebug }
}

You can overwrite your host with your IP variable once you have parsed it. Consider this example:
Pipeline main started
{"ip":"1.2.3.4"}
{
       "message" => "{\"ip\":\"1.2.3.4\"}",
      "@version" => "1",
    "@timestamp" => "2016-08-10T13:36:18.875Z",
          "host" => "pandaadb",
            "ip" => "1.2.3.4",
         "@host" => "1.2.3.4"
}
I am parsing the JSON to get the IP, then writing the IP field into the @host field.
The filter:
filter {
  # this parses the ip json
  json {
    source => "message"
  }
  mutate {
    add_field => { "@host" => "%{ip}" }
  }
}
Replace %{ip} with whatever field contains your IP address.
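Applied to the syslog config from the question, a minimal sketch (assuming the grok pattern above has populated syslog_hostname) would overwrite the existing host field directly with mutate's replace:
filter {
  if [type] == "linux-syslog" {
    grok {
      match => { "message" => "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
    mutate {
      # overwrite the host set by the file input with the parsed syslog source
      replace => { "host" => "%{syslog_hostname}" }
    }
  }
}
If the grok fails, %{syslog_hostname} stays unresolved, so it may be worth guarding the mutate with a check for the _grokparsefailure tag.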
Cheers,
Artur

Related

Logstash influxdb output with geoip filter

I'm trying to use the Logstash geoip filter to send location data for IP addresses to InfluxDB via the InfluxDB output plugin.
My logstash conf file is:
input {
  file {
    path => "/root/geoip_test.txt"
    start_position => "beginning"
  }
}
filter {
  geoip {
    source => "message"
    fields => ["latitude", "longitude"]
  }
}
output {
  stdout { codec => rubydebug }
  influxdb {
    host => "localhost"
    port => 8086
    db => "metrics"
    measurement => "geoip_test"
    codec => "json"
    use_event_fields_for_data_points => true
  }
}
geoip_test.txt file contains only one IP address:
14.143.35.10
The output with the error I receive is:
[2020-09-07T12:26:26,696][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
{
    "geoip" => {
        "latitude" => 12.9771,
        "longitude" => 77.5871
    },
    "message" => "14.143.35.10",
    "path" => "/root/geoip_test.txt",
    "@timestamp" => 2020-09-07T10:26:33.963Z,
    "host" => "test",
    "@version" => "1"
}
[2020-09-07T12:26:34,942][WARN ][logstash.outputs.influxdb][main][941178b6897abb80f9a5f7654e5e62ba752d5e20b68781bc62b466e489c2ce56] Non recoverable exception while writing to InfluxDB {:exception=>#<InfluxDB::Error: {"error":"unable to parse 'geoip_test,host=test geoip={\"latitude\"=\u003e12.9771, \"longitude\"=\u003e77.5871},path=\"/root/geoip_test.txt\" 1599474393963': invalid boolean"}
>}
I think the geoip filter generates some boolean field which InfluxDB is not able to work with.
Does anyone have any idea what to do with that? Is it possible to set up the geoip filter in some way so it wouldn't generate anything but the lon and lat fields?
Any help is really appreciated!
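One approach that might work (a sketch, not verified against this InfluxDB setup): copy the nested latitude/longitude values into top-level float fields and drop the geoip object and the other string fields before the output, so that use_event_fields_for_data_points only sees numeric values (plus host, which the error shows being used as a tag):
filter {
  geoip {
    source => "message"
    fields => ["latitude", "longitude"]
  }
  mutate {
    # copy the nested values into top-level fields (sprintf produces strings)
    add_field => {
      "latitude"  => "%{[geoip][latitude]}"
      "longitude" => "%{[geoip][longitude]}"
    }
  }
  mutate {
    # make them numeric again and drop the fields InfluxDB can't serialize
    convert => { "latitude" => "float" "longitude" => "float" }
    remove_field => ["geoip", "message", "path"]
  }
}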

Change the seconds to milliseconds in logstash

I'm trying to compare two log files, haproxy and nginx, using the ELK stack, in particular the response time. In Logstash I have created two separate conf files for haproxy and nginx. In haproxy I'm getting the response time in milliseconds, e.g. 2334, and in nginx I'm getting it in seconds, e.g. 1.23.
I want to convert the nginx response time to milliseconds. I tried to convert it using the ruby filter, but I'm not getting proper results, and I also think it's conflicting with my current Elasticsearch index created for haproxy.
Below are my two config files:
Haproxy logstash Conf:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{MONTH:month} %{MONTHDAY:date} %{TIME:time} %{WORD:[source]} %{WORD:[app]}\[%{DATA:[class]}\]: %{IPORHOST:[UE_IP]}:%{NUMBER:[UE_Port]} %{IPORHOST:[NATTED_IP]}:%{NUMBER:[NATTED_Source_Port]} %{IPORHOST:[NATTED_IP]}:%{NUMBER:[NATTED_Destination_Port]} %{IPORHOST:[WAN_IP]}:%{NUMBER:[WAN_Port]} \[%{HAPROXYDATE:[accept_date]}\] %{NOTSPACE:[frontend_name]}~ %{NOTSPACE:[backend_name]} %{NOTSPACE:[ty_name]}/%{NUMBER:[response_time]:int} %{NUMBER:[http_status_code]} %{NUMBER:[response_bytes]:int} - - ---- %{NOTSPACE:[df]} %{NOTSPACE:[df]} %{DATA:[domain_name]} %{DATA:[cache_status]} %{DATA:[domain_name]} %{URIPATHPARAM:[content]} HTTP/%{NUMBER:[http_version]}" }
    add_tag => [ "response_time", "response_time" ]
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  # stdout {
  #   codec => rubydebug
  # }
}
Nginx logstash conf file
input {
  beats {
    port => 5045
  }
}
filter {
  grok {
    match => { "message" => "%{IPORHOST:clientip} - - \[%{HTTPDATE:timestamp}\] \"%{WORD:verb} %{URIPATHPARAM:content} HTTP/%{NUMBER:httpversion}\" %{NUMBER:response} %{NUMBER:response_bytes:int} \"-\" \"%{GREEDYDATA:junk}\" %{NUMBER:response_time}" }
  }
  ruby {
    code => "event.set('response_time', event.get('response_time').to_i * 1000)"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}
A grok pattern matching the nginx log line, with the conversion done via to_f so the fractional seconds are not truncated:
grok {
  match => { "message" => "%{IPV4:clientip} - - \[%{HTTPDATE:requesttimestamp}\] \"%{WORD:httpmethod} /\" %{NUMBER:responsecode:int} %{NUMBER:responsesize:int} \"-\" \"-\" \"-\" \"%{NUMBER:responsetimems:float}\"" }
}
ruby {
  code => "event.set('responsetimems', event.get('responsetimems').to_f * 1000)"
}
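Since both pipelines write to the same Elasticsearch instance, it may also help to keep the converted value in a separate integer millisecond field so it does not clash with the haproxy response_time mapping. A minimal sketch, working on the response_time captured by the nginx grok above (the field name response_time_ms is just an example):
ruby {
  # only convert when grok actually captured the field;
  # "1.23" seconds becomes 1230 ms, rounded to an integer
  code => "
    rt = event.get('response_time')
    event.set('response_time_ms', (rt.to_f * 1000).round) unless rt.nil?
  "
}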

Logstash grok pattern error with custom log

I'm new to the ELK stack. I need to parse my custom logs using grok and then analyze them. Here is a sample log line:
[11/Oct/2018 09:51:47] INFO [serviceManager.management.commands.serviceManagerMonitor:serviceManagerMonitor.py:114] [2018-10-11 07:51:47.527997+00:00] SmMonitoring Module : Launching action info over the service sysstat is delayed
with this grok pattern:
\[(?<timestamp>%{MONTHDAY}\/%{MONTH}\/%{YEAR} %{TIME})\] %{LOGLEVEL:loglevel} \[%{GREEDYDATA:messagel}\] \[%{GREEDYDATA:message2}\] %{GREEDYDATA:message3}
I tried it in a grok debugger and it matched.
Here is the Logstash input configuration:
input {
  beats {
    port => 5044
  }
}
And here is the output configuration:
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
And here is the whole filter:
filter {
  grok {
    match => { "message" => "\[(?<timestamp>%{MONTHDAY}\/%{MONTH}\/%{YEAR} %{TIME})\] %{LOGLEVEL:loglevel} \[%{GREEDYDATA:messagel}\] \[%{GREEDYDATA:message2}\] %{GREEDYDATA:message3}" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
But I don't get any results in Elasticsearch.
Thank you for helping me.
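One thing worth checking (just a guess from the sample line): the date pattern "dd/MMM/yyyy:HH:mm:ss Z" does not match the captured timestamp "11/Oct/2018 09:51:47", which has a space before the time and no timezone. A sketch of a matching pattern, plus a temporary stdout output to confirm that events are reaching Logstash and passing grok at all:
filter {
  date {
    # "11/Oct/2018 09:51:47" - space separator, no timezone offset
    match => [ "timestamp", "dd/MMM/yyyy HH:mm:ss" ]
  }
}
output {
  # temporary, for debugging: print every event that makes it through the filters
  stdout { codec => rubydebug }
}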

Unable to parse logs from log file in logstash

I am trying to get logs from a log file into Logstash. This is my config file for Logstash; please help me out with this.
input {
  tcp {
    port => 5022
    type => "syslog"
  }
  udp {
    port => 5022
    type => "syslog"
  }
  file {
    path => ["/var/log/haproxy.log"]
    type => "syslog"
    start_position => "beginning"
  }
}
Try something like this:
input {
  file {
    path => "file-path"
  }
}
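If the goal is just to read /var/log/haproxy.log from the question, a minimal pipeline sketch might look like the following (sincedb_path => "/dev/null" is there only so the file is re-read on every test run):
input {
  file {
    path => "/var/log/haproxy.log"
    type => "syslog"
    start_position => "beginning"
    # for testing only: forget read positions so the file is re-read each run
    sincedb_path => "/dev/null"
  }
}
output {
  # verify events are actually being read before adding filters
  stdout { codec => rubydebug }
}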

Logstash input filename as output elasticsearch index

Is there a way of having the filename of the file being read by logstash as the index name for the output into ElasticSearch?
I am using the following config for logstash.
input {
  file {
    path => "/logstashInput/*"
  }
}
output {
  elasticsearch {
    index => "FromfileX"
  }
}
I would like to be able to put a file e.g. log-from-20.10.2016.log and have it indexed into the index log-from-20.10.2016. Does the logstash input plugin "file" produce any variables for use in the filter or output?
Yes, you can use the path field for that and grok it to extract the filename into an index field:
input {
  file {
    path => "/logstashInput/*"
  }
}
filter {
  grok {
    match => ["path", "(?<index>log-from-\d{2}\.\d{2}\.\d{4})\.log$"]
  }
}
output {
  elasticsearch {
    index => "%{index}"
  }
}
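One caveat: Elasticsearch index names must be lowercase, so if the file names can contain uppercase characters it may be worth normalizing the extracted field before using it, for example:
filter {
  mutate {
    # index names containing uppercase letters are rejected by Elasticsearch
    lowercase => ["index"]
  }
}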
Another approach extracts the file name with a ruby filter and uses it directly in the index name:
input {
  file {
    path => "/home/ubuntu/data/gunicorn.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => {
      "message" => "%{USERNAME:u1} %{USERNAME:u2} \[%{HTTPDATE:http_date}\] \"%{DATA:http_verb} %{URIPATHPARAM:api} %{DATA:http_version}\" %{NUMBER:status_code} %{NUMBER:byte} \"%{DATA:external_api}\" \"%{GREEDYDATA:android_client}\""
    }
    remove_field => ["message"]
  }
  date {
    match => ["http_date", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
  ruby {
    code => "event.set('index_name', event.get('path').split('/')[-1].gsub('.log',''))"
  }
}
output {
  elasticsearch {
    hosts => ["0.0.0.0:9200"]
    index => "%{index_name}-%{+yyyy-MM-dd}"
    user => "*********************"
    password => "*****************"
  }
  stdout { codec => rubydebug }
}
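If the ruby step ever fails to set index_name (for example when the path field is missing), the sprintf reference stays literal and events land in an index named %{index_name}-... A possible safety net, sketched here with a hypothetical fallback name "unparsed":
filter {
  if ![index_name] {
    mutate {
      # fallback so documents don't go to a literal "%{index_name}-..." index
      add_field => { "index_name" => "unparsed" }
    }
  }
}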