Logstash - Split escape character "\" is not working

I have Logstash checking logs from Windows files. There are many apps running on Windows, so I thought I would use the folder path to determine which app each log came from, but it is not working and I get this exception:
Failed to execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Expected one of
\', ', any character at line 21, column 1 (byte 237)
My config:
input {
  beats {
    port => 5044
  }
}
filter {
  mutate {
    split => { "source" => '\\' }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => mt4log
  }
}
Can someone help me figure out what the problem is here? Thanks.
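That ConfigurationError is usually the config parser hitting an unterminated string: a single backslash just before a closing quote escapes the quote itself, so parsing runs on until it fails further down (here at line 21). One workaround, a sketch assuming the goal is to split a Windows path stored in the source field, is to avoid a backslash delimiter altogether by first turning backslashes into forward slashes. Note that gsub takes a regex, and the character class "[\\\\]" is written so that it matches one literal backslash under Logstash's default escape handling:
filter {
  mutate {
    # turn C:\app\log.txt into C:/app/log.txt;
    # "[\\\\]" is a regex character class matching a single literal backslash
    gsub => [ "source", "[\\\\]", "/" ]
  }
  mutate {
    # now split on a delimiter that needs no escaping
    split => { "source" => "/" }
  }
}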

Related

Filebeat can't send data to Logstash

I have a problem sending data from a *.log file to Logstash. This is the Filebeat configuration:
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /home/centos/logs/*.log
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
setup.kibana:
output.logstash:
  hosts: "10.206.81.234:5044"
This is the Logstash configuration:
path.data: /var/lib/logstash
path.config: /etc/logstash/conf.d/*.conf
path.logs: /var/log/logstash
xpack.monitoring.elasticsearch.url: ["10.206.81.236:9200", "10.206.81.242:9200", "10.206.81.243:9200"]
xpack.monitoring.elasticsearch.username: logstash_system
xpack.monitoring.elasticsearch.password: logstash
queue.type: persisted
queue.checkpoint.writes: 10
And this is my pipeline in /etc/logstash/conf.d/test.conf:
input {
  beats {
    port => "5044"
  }
  file {
    path => "/home/centos/logs/mylogs.log"
    tags => "mylog"
  }
  file {
    path => "/home/centos/logs/syslog.log"
    tags => "syslog"
  }
}
filter {
}
output {
  if [tag] == "mylog" {
    elasticsearch {
      hosts => [ "10.206.81.246:9200", "10.206.81.236:9200", "10.206.81.243:9200" ]
      user => "Test"
      password => "123456"
      index => "mylog-%{+YYYY.MM.dd}"
    }
  }
  if [tag] == "syslog" {
    elasticsearch {
      hosts => [ "10.206.81.246:9200", "10.206.81.236:9200", "10.206.81.243:9200" ]
      user => "Test"
      password => "123456"
      index => "syslog-%{+YYYY.MM.dd}"
    }
  }
}
I tried to have two separate outputs, for mylog and syslog. At first it worked like this: everything was passed to the mylog-%{+YYYY.MM.dd} index, even events from syslog. So I tried changing the second if statement to else if. That did not work, so I changed it back. Now my Filebeat is not able to send data to Logstash and I am receiving these errors:
2018/01/20 15:02:10.959887 async.go:235: ERR Failed to publish events caused by: EOF
2018/01/20 15:02:10.964361 async.go:235: ERR Failed to publish events caused by: client is not connected
2018/01/20 15:02:11.964028 output.go:92: ERR Failed to publish events: client is not connected
My second test was to change my pipeline like this:
input {
  beats {
    port => "5044"
  }
  file {
    path => "/home/centos/logs/mylogs.log"
  }
}
filter {
  grok {
    match => { "message" => "%{IP:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:bytes} %{NUMBER:duration}" }
  }
}
output {
  elasticsearch {
    hosts => [ "10.206.81.246:9200", "10.206.81.236:9200", "10.206.81.243:9200" ]
    user => "Test"
    password => "123456"
    index => "mylog-%{+YYYY.MM.dd}"
  }
}
If I add some lines to the mylogs.log file, Filebeat prints the same ERR lines, but the data is passed to Logstash and I can see it in Kibana. Could anybody explain to me why it does not work? What do those errors mean?
I am using Filebeat and Logstash version 6.1.
Sorry if I make any mistakes in English.
In the output section you are using "tag" (note: it is singular), which doesn't exist. But changing this to "tags" alone will not work either, because the tags field is an array and you would be comparing it to a string, so you should compare against the first item instead of the whole array. Try this:
if [tags][0] == "mylog" { ......
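Since tags is an array, the membership test with the in operator is another common way to write the same condition; a one-line sketch:
if "mylog" in [tags] { ......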

Data missed in Logstash?

A lot of data is missed in Logstash version 5.0. Is it a serious bug? I have changed the config file so many times and it is useless; data loss happens again and again. How can I use Logstash to collect log events properly?
Any reply will be appreciated.
Logstash is all about reading logs from a specific location; based on the information you are interested in, you can create an index in Elasticsearch, and other outputs are also possible.
Example of a Logstash conf:
input {
  file {
    # PLEASE SET APPROPRIATE PATH WHERE LOG FILE AVAILABLE
    #type => "java"
    type => "json-log"
    path => "d:/vox/logs/logs/vox.json"
    start_position => "beginning"
    codec => json
  }
}
filter {
  if [type] == "json-log" {
    grok {
      match => { "message" => "UserName:%{JAVALOGMESSAGE:UserName} -DL_JobID:%{JAVALOGMESSAGE:DL_JobID} -DL_EntityID:%{JAVALOGMESSAGE:DL_EntityID} -BatchesPerJob:%{JAVALOGMESSAGE:BatchesPerJob} -RecordsInInputFile:%{JAVALOGMESSAGE:RecordsInInputFile} -TimeTakenToProcess:%{JAVALOGMESSAGE:TimeTakenToProcess} -DocsUpdatedInSOLR:%{JAVALOGMESSAGE:DocsUpdatedInSOLR} -Failed:%{JAVALOGMESSAGE:Failed} -RecordsSavedInDSE:%{JAVALOGMESSAGE:RecordsSavedInDSE} -FileLoadStartTime:%{JAVALOGMESSAGE:FileLoadStartTime} -FileLoadEndTime:%{JAVALOGMESSAGE:FileLoadEndTime}" }
      add_field => ["STATS_TYPE", "FILE_LOADED"]
    }
  }
}
filter {
  mutate {
    # here converting data type
    convert => { "FileLoadStartTime" => "integer" }
    convert => { "RecordsInInputFile" => "integer" }
  }
}
output {
  elasticsearch {
    # PLEASE CONFIGURE ES IP AND PORT WHERE LOG DOCs HAS TO PUSH
    document_type => "json-log"
    hosts => ["localhost:9200"]
    # action => "index"
    # host => "localhost"
    index => "locallogstashdx_new"
    # workers => 1
  }
  stdout { codec => rubydebug }
  #stdout { debug => true }
}
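One caveat if you try this example on a newer stack: the document_type option was deprecated starting with Logstash 6.0, because Elasticsearch is phasing out mapping types, so on recent versions you may need to drop it.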
To learn more, you can go through the many available resources, like
https://www.elastic.co/guide/en/logstash/current/first-event.html

Logstash can't read file

I'm using Logstash for the first time, on Windows 10.
I am trying to read a multiline JSON file from an input/ folder like this:
input {
  file {
    codec => multiline {
      pattern => '^/{'
      negate => true
      what => previous
    }
    path => "/input/*.json"
    exclude => "*.gz"
  }
}
filter {
  mutate {
    replace => [ "message", "%{message}" ]
    gsub => [ 'message', '\n', '' ]
  }
  if [message] =~ /^{.*}$/ {
    json { source => message }
  }
}
output {
  file {
    path => "/output/output.json"
  }
}
The problem is that when I launch the Logstash instance I get this error:
WARN logstash.inputs.file - failed to open
/input/sample.json: \input\sample.json
I already replaced LS_GROUP with adm in startup.options and tried to replace "/" with "\", which didn't work.
I also ran "chmod 777" on my JSON file, but it changed nothing.
Any ideas?
Going with / should work fine. What if you write the path something like this, escaping with \\:
path => "\\input\\*.json"
Make sure you give the complete path.
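On Windows the file input also expects an absolute path, and forward slashes sidestep the escaping problem entirely. A minimal sketch, assuming the files actually live under C:\input (the drive letter is an assumption here):
input {
  file {
    # an absolute path with forward slashes works on Windows
    # and needs no backslash escaping
    path => "C:/input/*.json"
    exclude => "*.gz"
  }
}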

Logstash not printing anything

I am using Logstash for the first time and am trying to set up a simple pipeline that just prints the nginx logs. Below is my config file:
input {
  file {
    path => "/var/log/nginx/*access*"
  }
}
output {
  stdout { codec => rubydebug }
}
I have saved the file as /opt/logstash/nginx_simple.conf and am trying to execute the following command:
sudo /opt/logstash/bin/logstash -f /opt/logstash/nginx_simple.conf
However, the only output I can see is:
Logstash startup completed
Logstash shutdown completed
The file is definitely not empty. As per my understanding, I should be seeing the output on my console. What am I doing wrong?
Make sure that the character encoding of your logfile is UTF-8. If it is not, try to change it and restart Logstash.
Please try this as your Logstash configuration, in order to set up a simple pipeline that just prints the nginx logs.
input {
  file {
    path => "/var/log/nginx/*.log"
    type => "nginx"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  if [type] == "nginx" {
    grok {
      patterns_dir => "/home/krishna/Downloads/logstash-2.1.0/pattern"
      match => {
        "message" => "%{NGINX_LOGPATTERN:data}"
      }
    }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => [ "127.0.0.1:9200" ]
  }
  stdout { codec => rubydebug }
}
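The two extra options in the file input are what most likely matter for the original problem: by default the file input tails files from the end and records how far it has read in a sincedb file, so lines that were already in the file are skipped. start_position => "beginning" reads the file from the start, and sincedb_path => "/dev/null" keeps Logstash from remembering its position between runs.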

Collect Data from log files with logstash

I'm trying to use Logstash to collect data from a log file for a version of the NETASQ firewall, which contains a lot of lines, but I cannot collect my data correctly. I don't know if there is a standard to follow, but I started like this:
input {
  stdin { }
  file {
    type => "FireWall"
    path => "/var/log/file.log"
    start_position => 'beginning'
  }
}
filter {
  grok {
    match => [ "message", "%{SYSLOGTIMESTAMP:date} %{WORD:id}" ]
  }
}
output {
  stdout { }
  elasticsearch {
    cluster => "logstash"
  }
}
The first line of my file.log looks like this:
Feb 27 04:02:23 id=firewall time="2015-02-27 04:02:23" fw="GVGM-NEWYORK"
tz=+0200 startime="2015-02-27 04:02:22" pri=5 confid=01 slotlevel=2 ruleid=57
srcif="Vlan2" srcifname="SSSSS" ipproto=udp dstif="Ethernet0"
dstifname="out" proto=teredo src=192.168.21.12 srcport=52469
srcportname=ephemeral_fw_udp dst=94.245.121.253 dstport=3544
dstportname=teredo dstname=teredo.ipv6.microsoft.com.nsatc.net
action=block logtype="filter"#015
And finally, how can I collect data from the other lines? Please give me a pointer just to get started. Thanks, all.
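Since almost everything after the syslog timestamp is key=value pairs (including quoted values like time="2015-02-27 04:02:23"), one way to start is to grok out the leading timestamp and hand the remainder to the kv filter, which handles quoted values by default. A sketch under that assumption; kv_pairs is just a scratch field name chosen here:
filter {
  grok {
    # capture the syslog timestamp; keep everything after it for the kv filter
    match => [ "message", "%{SYSLOGTIMESTAMP:date} %{GREEDYDATA:kv_pairs}" ]
  }
  kv {
    # turn each key=value pair in the remainder into its own event field
    source => "kv_pairs"
  }
}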
