Logstash - Error "Expected one of [ \t\r\n], "#"

I use Docker to build my Logstash 7.10.1 image, and I overrode logstash.conf with:
input {
  uri => "mongodb://admin:pass#localhost:27017/programs?ssl=true"
  placeholder_db_dir => "/opt/logstash-mongodb/"
  placeholder_db_name => "logstash_sqlite.db"
  collection => "programs"
  batch_size => 5000
}
output {
  elasticsearch {
    hosts => ["http://elasticsearch-crawlers:9200"]
    index => "programs"
    document_id => "%{id}"
  }
  stdout {
    codec => rubydebug
  }
}
I get this error:
[logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "input", "filter", "output" at line 1, column 1 (byte 1)", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:184:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:365:in `block in converge_state'"]}
I am on Windows and used Notepad++ to create this logstash.conf file. I converted the file with a BOM and it uses Unix (LF) line endings.
I don't see what the problem is here. I checked this thread: "Logstash exception Expected one of #, input, filter, output at line 1, column 1", but my file seems correct.

Your config is missing the input plugin name; it seems you are using the mongodb input plugin.
Your config should look something like this:
input {
  mongodb {
    uri => "mongodb://admin:pass#localhost:27017/programs?ssl=true"
    placeholder_db_dir => "/opt/logstash-mongodb/"
    placeholder_db_name => "logstash_sqlite.db"
    collection => "programs"
    batch_size => 5000
  }
}
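Independent of the fix, you can ask Logstash to validate the pipeline syntax without starting it, which catches this class of error early. A minimal sketch, assuming the config is mounted at the official image's default pipeline path:

# Check config syntax only and exit; no connections to MongoDB or Elasticsearch are made
bin/logstash --config.test_and_exit -f /usr/share/logstash/pipeline/logstash.conf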

Related

Logstash: Custom delimiter for multi-line XML logs

I have XML logs where each log entry is closed with "=======", e.g.
<log>
  <level>DEBUG</level>
  <message>This is debug level</message>
</log>
=======
<log>
  <level>ERROR</level>
  <message>This is error level</message>
</log>
=======
Every log can span across multiple lines.
How to parse those logs using logstash?
This can be done with the multiline codec. The delimiter "=======" can be used in the pattern like this:
input {
  file {
    type => "xml"
    path => "/path/to/logs/*.log"
    codec => multiline {
      pattern => "^======="
      negate => "true"
      what => "previous"
    }
  }
}
filter {
  mutate {
    gsub => [ "message", "=======", "" ]
  }
  xml {
    force_array => false
    source => "message"
    target => "log"
  }
  mutate {
    remove_field => [ "message" ]
  }
}
output {
  elasticsearch {
    codec => json
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
}
Here the combination of pattern and negate => "true" means: if a line does not start with "=======", it belongs to the previous event (hence what => "previous"). When a line matching the delimiter is hit, a new event starts. In the filter, the delimiter is simply removed with gsub and the XML is parsed with the xml plugin.
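One caveat with what => "previous": the multiline codec buffers the current event until the next delimiter line arrives, so the last event in a file can sit unflushed at end of input. A hedged sketch of the same codec with an auto-flush timeout added (the 5-second value is an assumption, tune it to your latency needs):

codec => multiline {
  pattern => "^======="
  negate => "true"
  what => "previous"
  # Assumed timeout: flush a buffered event after 5s without new lines
  auto_flush_interval => 5
}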

Logstash 6.2.4 crashes when adding an ID to plugin (Expected one of #)

I am trying to add an ID field to my Logstash 6.2.4 config. What I want to do is debug "http://localhost:9600/_node/stats/pipelines", so I need some stable names (without an id field in the configs, the ids are random UUIDs). I found the documentation about plugin ids. It works for me like this:
input {
  http {
    port => "${HTTP_PORT_FOR_EVENTS:8089}"
    additional_codecs => { "application/json" => "json" }
    id => "http_events"
    tags => [ "test" ]
  }
}
filter {
  if "test" in [tags] {
    mutate {
      remove_field => [ "headers", "host" ]
    }
  }
}
But it crashes with this config:
input {
  http {
    port => "${HTTP_PORT_FOR_EVENTS:808}"
    additional_codecs => { "application/json" => "json" }
    id => "http_events"
    tags => [ "test" ]
  }
}
filter {
  id => "test2"
  if "test" in [tags] {
    mutate {
      remove_field => [ "headers", "host" ]
    }
  }
}
With this error (I guess there are two errors with a shutdown in between, because of a Docker container restart or something):
Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 32, column 8 (byte 676) after filter {\r\n id ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18:in `interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
SIGTERM received. Shutting down.
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
Ignoring the 'pipelines.yml' file because modules or command line options are specified
Starting Logstash {"logstash.version"=>"6.2.4"}
Successfully started Logstash API endpoint {:port=>9600}
SIGTERM received. Shutting down.
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
Ignoring the 'pipelines.yml' file because modules or command line options are specified
Starting Logstash {"logstash.version"=>"6.2.4"}
Successfully started Logstash API endpoint {:port=>9600}
Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, { at line 24, column 8 (byte 534) after filter {\n id ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in `compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:in `compile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
These configs also break my Logstash:
input {
  rabbitmq {
    id => "test1"
    type => "event"
    exchange => "event"
    exclusive => true
  }
}
input {
  id => "test1"
  rabbitmq {
    type => "event"
    exchange => "event"
    exclusive => true
  }
}
I found a solution.
The ID must be placed inside a filter plugin block (such as grok, mutate, or json), not directly in the filter block:
filter {
  if "test" in [tags] {
    mutate {
      id => "test2"
      remove_field => [ "headers", "host" ]
    }
  }
}
Also, not every plugin version supports the id field. Update plugins if necessary:
logstash-plugin update logstash-filter-mutate
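Once the ids are in place, they should appear as the plugin keys in the node stats API mentioned above; a quick check, assuming the default API port 9600:

# Named plugins are listed under their configured id in the pipeline stats
curl -s http://localhost:9600/_node/stats/pipelines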

Logstash - Split escape character " \ " is not working

I have Logstash checking logs from Windows files; since there are many apps running on Windows, I think I can use the folder path to determine which app a log comes from, but it is not working and I get this exception:
Failed to execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Expected one of
\', ', any character at line 21, column 1 (byte 237)
My config:
input {
  beats {
    port => 5044
  }
}
filter {
  mutate {
    split => { "source" => '\\' }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => mt4log
  }
}
Can someone help me find out what the problem is here? Thanks.

Logstash not importing Apache log files

I am new to Logstash and trying to import an Apache log file into Elasticsearch; I see the below error:
[ERROR] 2017-10-28 00:38:51.085 [LogStash::Runner] agent - Cannot create pipeline {:reason=>"Expected one of #, {, } at line 4, column 19 (byte 81) after input {\nfile {\npath =>\"/home/monus/logstash-tutorial-dataset“\nstart_position =>\""}
Here is my logstash.conf file:
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]

That's Logstash's way of telling you it found a syntax error on line 4, character 19 of the config file. As line 4 in your snippet is a closing bracket, and you have no input {} block at all in the snippet, I'd look in your input section for the syntax error.
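Worth noting: in the error text the path ends with a curly quote (“) rather than a straight one, which is exactly the kind of character this parser error points at. A minimal sketch of the missing input section, reusing the path from the error message (the start_position value is an assumption):

input {
  file {
    # Straight double quotes; the error output shows a curly “ at the end of this path
    path => "/home/monus/logstash-tutorial-dataset"
    start_position => "beginning"
  }
}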

Export ntopng log to logstash

I know ntopng can write directly to Elasticsearch, but my boss wants to use Logstash as a layer to transfer the logs to Elasticsearch.
I've tried many times but failed.
The ntopng logs look like:
{"index": {"_type": "ntopng", "_index": "ntopng-2016.08.23"}}
{ "#timestamp": "2016-08-23T01:49:41.0Z", "type": "ntopng", "IN_SRC_MAC": "04:2A:E2:0D:62:FB", "OUT_DST_MAC": "00:16:3E:8D:B7:E4", "IPV4_SRC_ADDR": "14.152.84.14", "IPV4_DST_ADDR": "xxx.xxx.xxx", "L4_SRC_PORT": 34599, "L4_DST_PORT": 53, "PROTOCOL": 17, "L7_PROTO": 5, "L7_PROTO_NAME": "DNS", "IN_PKTS": 15, "IN_BYTES": 1185, "OUT_PKTS": 15, "OUT_BYTES": 22710, "FIRST_SWITCHED": 1471916981, "LAST_SWITCHED": 1471916981, "SRC_IP_COUNTRY": "CN", "SRC_IP_LOCATION": [ 113.250000, 23.116699 ], "DST_IP_COUNTRY": "VN", "DST_IP_LOCATION": [ 105.849998, 21.033300 ], "NTOPNG_INSTANCE_NAME": "ubuntu", "INTERFACE": "ens192", "DNS_QUERY": "cpsc.gov", "PASS_VERDICT": true }
Logstash config:
input {
  tcp {
    port => 5000
    codec => json
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}
Thanks
Since the ntopng logs are already in the bulk format expected by Elasticsearch, you don't need the elasticsearch output; you can use the http output directly, like this. There's no need to have Logstash parse the JSON, simply forward the raw bulk commands to ES.
There's one catch, though: we need to add a newline character after the second line, otherwise ES will reject the bulk call. We can achieve this with a mutate/update filter that appends a verbatim newline character to the message. Try it out, this will work.
input {
  tcp {
    port => 5000
    codec => multiline {
      pattern => "_index"
      what => "next"
    }
  }
}
filter {
  mutate {
    update => { "message" => "%{message}
" }
  }
}
output {
  http {
    http_method => "post"
    url => "http://localhost:9200/_bulk"
    format => "message"
    message => "%{message}"
  }
}
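To confirm the bulk requests are landing, you can count the documents in the daily index (index name taken from the sample log above):

# Count documents indexed via the forwarded bulk calls
curl -s 'http://localhost:9200/ntopng-2016.08.23/_count'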
