Connecting ELK (Logstash) to Zabbix via Plugin - logstash

I am trying to configure Logstash to send any error logs to a remote Zabbix server.
I have got my inputs and filters working fine, and the if statement correctly outputs to the log file; however, I am not sure how to get the output to send to Zabbix correctly.
I am not sure if the problem is in my configuration here, on the ELK server, on the Zabbix server, or in the communication between them.
I have zero experience with Zabbix; however, I've set up a trapper which I believe should collect the log message coming from the ELK server.
input {
  file {
    path => ["/opt/logstash/alectronic_test_input.log"]
  }
}
filter {
  # Filter out empty lines in the audit logs
  if ([message] =~ /^[\s]*$/) {
    drop {}
  }
  # some other filter stuff which breaks up fields and produces a field called "LogLvl"
  if [LogLvl] == "ERROR" {
    mutate {
      add_field => {
        "sendToZabbix" => "true"
      }
    }
  } else {
    mutate {
      add_field => {
        "sendToZabbix" => "false"
      }
    }
  }
  if [sendToZabbix] == "true" {
    mutate {
      add_field => {
        "whereAmIGoing" => "ToZabbix"
        "zhost" => "jenkins"
        "zkey" => "trap"
      }
    }
  }
}
output {
  if [sendToZabbix] == "true" {
    file {
      path => "/opt/logstash/alectronic_test_output.log"
    }
    zabbix {
      zabbix_host => "zhost"
      zabbix_server_host => "zabbix.alectronic.co"
      # zabbix_server_port => 10051
      # zabbix_key => "zkey"
    }
  }
}

The problem I seem to have had was that I was using the wrong zabbix_server_host address (apparently the system I was using had both a public-facing IP and a private-facing IP).
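For reference, here is a minimal sketch of how the output block looks once a reachable server address is used. The address and port below are placeholders, not values from the original setup; zabbix_host and zabbix_key name the event fields (set in the filter above) whose values are the Zabbix host name and the trapper item key:
output {
  if [sendToZabbix] == "true" {
    zabbix {
      # name of the field whose value ("jenkins") is the Zabbix host
      zabbix_host => "zhost"
      # name of the field whose value ("trap") is the trapper item key
      zabbix_key => "zkey"
      # must be an address the Zabbix trapper actually listens on (placeholder)
      zabbix_server_host => "10.0.0.5"
      zabbix_server_port => 10051
    }
  }
}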

Related

Make logstash filter by name

I have a log file called "/var/log/commands.log" that I'm trying to separate into fields with Logstash & grok, and I've got it working. Now I'm trying to make Logstash apply this only to the file "/var/log/commands.log" and not to any other input, by doing "if name = commands.log", but something with the "if" statement seems wrong, as it skips over it.
input {
  file {
    path => "/var/log/commands.log"
  }
  beats {
    port => 5044
  }
}
filter {
  if [log][file][path] == "/var/log/commands.log" {
    grok {
      match => { "message" => "*very long statement*" }
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
If I remove the if statement it works and the fields are visible in kibana. I'm testing things locally. Does anyone know what's going on?
EDIT: SOLVED: With the Logstash file input, the condition has to reference only [path] instead of all the rest.
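For clarity, a sketch of the working conditional with that fix applied (the file input populates the top-level path field, whereas [log][file][path] is what Beats-shipped events carry):
filter {
  if [path] == "/var/log/commands.log" {
    grok {
      match => { "message" => "*very long statement*" }
    }
  }
}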

Laravel Parsing log with elk (elasticsearch, logstash, kibana)

I have configured ELK successfully for a Laravel app, but we are facing an issue with the Laravel log. I have configured the Logstash template with the code below, but I am receiving broken lines in Kibana. I have tried two different configurations, as per the details below.
20-laravel.conf
input {
  stdin {
    codec => multiline {
      pattern => "^\["
      what => "previous"
      negate => true
    }
  }
}
filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: %{DATA:message}" }
  }
}
output {
  elasticsearch {
    document_type => "logs"
    hosts => ["127.0.0.1"]
    index => "laravel_logs"
  }
}
filter {
  # Laravel log files
  if [type] == "laravel" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] %{DATA:env}\.%{DATA:severity}: %{DATA:message} \[" }
    }
  }
}
A sample Laravel log is:
[2017-09-13 16:19:28] production.ERROR: Symfony\Component\Debug\Exception\FatalThrowableError: Parse error: syntax error, unexpected identifier (T_STRING), expecting ',' or ')' in /var/www/app/Http/Controllers/BrandsController.php:57
Stack trace:
#0 /var/www/vendor/composer/ClassLoader.php(322): Composer\Autoload\includeFile('/var/www/vendor...')
#1 [internal function]: Composer\Autoload\ClassLoader->loadClass('App\\Http\\Contro...')
#2 [internal function]: spl_autoload_call('App\\Http\\Contro...')
So my main issue is that we are receiving this log in Kibana as single lines: the example above gets divided into separate one-line messages, and we can't figure out which line message belongs to which error.
The Kibana output for a single Laravel log is displayed in the image below (kibana log-output).
An easy alternative is to use Laralog.
With Laralog it is possible to send Laravel logs directly to Elasticsearch without installing the full Logstash stack, so it is suitable for small and containerized environments.
Example of usage:
laralog https://elasticsearch:9200 --input=laravel.log
Laralog will parse and send the logs automatically.
You should create a new service provider to set up Monolog properly; try the following setup:
use Illuminate\Support\Facades\Log;
use Illuminate\Support\ServiceProvider;
use Monolog\Formatter\LogstashFormatter;
use Monolog\Handler\StreamHandler;
use Monolog\Logger;

class LogstashProvider extends ServiceProvider
{
    public function boot(): void
    {
        $stream = storage_path('logs/laravel.log');
        $name = env('APP_NAME');

        // Write each log record as a Logstash-compatible JSON line
        $formatter = new LogstashFormatter($name, null, null, 'ctxt_', LogstashFormatter::V1);
        $streamHandler = new StreamHandler($stream, Logger::DEBUG, false);
        $streamHandler->setFormatter($formatter);

        Log::getMonolog()->pushHandler($streamHandler);
    }
}
You should also configure your Logstash to parse JSON instead.
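A minimal sketch of what that Logstash side could look like, assuming the formatter above writes one JSON object per line to the log file (the path and index name are placeholders, not taken from the original setup):
input {
  file {
    # LogstashFormatter output: one JSON document per line
    path => "/var/www/storage/logs/laravel.log"
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1"]
    index => "laravel_logs"
  }
}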

logstash index not assigned with json pattern

I am new to the ELK stack and trying to configure fields in the Kibana dashboard. My logstash.conf:
input {
  tcp {
    port => 5000
  }
}
filter {
  json {
    source => "message"
    add_field => {
      "newfiled" => "static"
    }
  }
}
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "test"
  }
}
But the index test is not present when I query the Elasticsearch server with curl. I am using python-logstash. I have installed the json plugin. Can someone help me send the JSON to Elasticsearch so that I can view it on the Kibana dashboard?
Found the issue. There are two options: sending plain json and sending json-encoded data. If you are sending a dictionary in text format (or using python-logstash), make sure you JSON-encode it before sending.
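On the Logstash side, one way to sidestep the question (a sketch, assuming the client sends newline-delimited JSON over TCP) is to parse it at the input with the json_lines codec instead of relying on the json filter:
input {
  tcp {
    port => 5000
    # each newline-delimited JSON message becomes an event with parsed fields
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "test"
  }
}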

Multiple identical messages with logstash/kibana

I'm running an ELK stack on my local filesystem. I have the following configuration file set up:
input {
  file {
    path => "/var/log/rfc5424"
    type => "RFC"
  }
}
filter {
  grok {
    match => { "message" => "%{SYSLOG5424LINE}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
I have a kibana instance running as well. I write a line to /var/log/rfc5424:
$ echo '<11>1' "$(date +'%Y-%m-%dT%H:%M:%SZ')" 'test-machine test-tag f81d4fae-7dec-11d0-a765-00a0c91e6bf6 log [nsId orgID="12 \"hey\" 345" projectID="2345[hehe]6"] this is a test message' >> /var/log/rfc5424
And it shows up in Kibana. Great! However, weirdly, it shows up six times.
As far as I can tell, everything about these messages is identical, and I only have one instance of Logstash/Kibana running, so I have no idea what could be causing this duplication.
Check whether there is a .swp or .tmp file for your configuration under the conf directory. Logstash loads every file in that directory, so a leftover editor swap or temp file acts as a second copy of your config, and its duplicate output writes each event again.
Add a document ID to the documents:
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    document_id => "%{uuid_field}"
  }
}
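If the events don't already carry a unique field to use there, one common approach (a sketch, not from the original answer) is to derive a deterministic ID from the message with the fingerprint filter, so identical copies overwrite the same document instead of piling up:
filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA256"
    key => "dedupe"   # some plugin versions require a key for the SHA* methods
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    document_id => "%{[@metadata][fingerprint]}"
  }
}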

Logstash Grok Filter key/value pairs

Working on getting our ESET log files (JSON format) into Elasticsearch. I'm shipping logs to our syslog server (syslog-ng), then to Logstash, and then Elasticsearch. Everything is going as it should. My problem is in trying to process the logs in Logstash... I cannot seem to separate the key/value pairs into separate fields.
Here's a sample log entry:
Jul 8 11:54:29 192.168.1.144 1 2016-07-08T15:55:09.629Z era.somecompany.local ERAServer 1755 Syslog {"event_type":"Threat_Event","ipv4":"192.168.1.118","source_uuid":"7ecab29a-7db3-4c79-96f5-3946de54cbbf","occured":"08-Jul-2016 15:54:54","severity":"Warning","threat_type":"trojan","threat_name":"HTML/Agent.V","scanner_id":"HTTP filter","scan_id":"virlog.dat","engine_version":"13773 (20160708)","object_type":"file","object_uri":"http://malware.wicar.org/data/java_jre17_exec.html","action_taken":"connection terminated","threat_handled":true,"need_restart":false,"username":"BATHSAVER\\sickes","processname":"C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe"}
Here is my logstash conf:
input {
  udp {
    type => "esetlog"
    port => 5515
  }
  tcp {
    type => "esetlog"
    port => 5515
  }
}
filter {
  if [type] == "esetlog" {
    grok {
      match => { "message" => "%{DATA:timestamp}\ %{IPV4:clientip}\ <%{POSINT:num1}>%{POSINT:num2}\ %{DATA:syslogtimestamp}\ %{HOSTNAME}\ %{IPORHOST}\ %{POSINT:syslog_pid}\ %{DATA:type}\ %{GREEDYDATA:msg}" }
    }
    kv {
      source => "msg"
      value_split => ":"
      target => "kv"
    }
  }
}
output {
  elasticsearch {
    hosts => ['192.168.1.116:9200']
    index => "eset-%{+YYY.MM.dd}"
  }
}
When the data is displayed in Kibana, everything other than the date and time is lumped together in the "message" field only, with no separate key/value pairs.
I've been reading and searching for a week now. I've done similar things with other log files with no problems at all, so I'm not sure what I'm missing. Any help/suggestions are greatly appreciated.
Can you try the below Logstash configuration?
filter {
  grok {
    match => {
      "message" => ["%{CISCOTIMESTAMP:timestamp} %{IPV4:clientip} %{POSINT:num1} %{TIMESTAMP_ISO8601:syslogtimestamp} %{USERNAME:hostname} %{USERNAME:iporhost} %{NUMBER:syslog_pid} Syslog %{GREEDYDATA:msg}"]
    }
  }
  json {
    source => "msg"
  }
}
It's working and was tested at http://grokconstructor.appspot.com/do/match#result
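As a follow-up, if you would rather keep the parsed ESET keys grouped instead of merged into the event root, the json filter also accepts a target (the field name below is just an example, not from the original answer):
json {
  source => "msg"
  # nest all parsed keys under a single "eset" field instead of the event root
  target => "eset"
}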
Regards.
