Logstash: basic HTTP request to a web API

I am a newbie with Logstash and I just want to make a basic HTTP GET request to a simple API and display the result in the console.
My conf file, named "api.conf", contains:
input {
http {
url => 'https://jsonplaceholder.typicode.com/albums'
}
}
output {
stdout { codec => rubydebug }
}
I launch it from the Logstash folder I have just downloaded (and have not changed) using a Windows cmd command:
C:\Users\username\Desktop\logstash-6.2.2>.\bin\logstash.bat -f .\api.conf
It returns an error in the console:
Failed to execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Something is
wrong with your configuration.",
:backtrace=>["C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/config/mixin.rb:89:in
config_init'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/inputs/base.rb:62:in
initialize'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/plugins/plugin_factory.rb:89:in
plugin'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:112:in
plugin'", "(eval):8:in <eval>'", "org/jruby/RubyKernel.java:994:in
eval'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:84:in
initialize'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/pipeline.rb:169:in
initialize'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/pipeline_action/create.rb:40:in
execute'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:315:in
block in converge_state'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:141:in
with_pipelines'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:312:in
block in converge_state'", "org/jruby/RubyArray.java:1734:in
each'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:299:in
converge_state'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:166:in
block in converge_state_and_update'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:141:in
with_pipelines'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:164:in
converge_state_and_update'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/agent.rb:90:in
execute'",
"C:/Users/username/Desktop/logstash-6.2.2/logstash-core/lib/logstash/runner.rb:348:in
block in execute'",
"C:/Users/username/Desktop/logstash-6.2.2/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in
block in initialize'"]}
Do you know what I am doing wrong and how to make it work?

For polling a web REST API, the correct input plugin is http_poller. The http input plugin used above opens an HTTP listener and waits for incoming requests (it has no url option), which is why the configuration fails. Try this instead:
input {
  http_poller {
    urls => {
      test1 => "https://jsonplaceholder.typicode.com/albums"
    }
    request_timeout => 60
    # Supports "cron", "every", "at" and "in" schedules by rufus scheduler
    schedule => { cron => "* * * * * UTC" }
    codec => "json"
    # A hash of request metadata info (timing, response headers, etc.) will be sent here
    metadata_target => "http_poller_metadata"
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
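The comment above notes that the rufus scheduler also accepts "every", "at" and "in" forms. As a minimal sketch, the same input with an interval-based schedule (polling once per minute) would look like this:
input {
  http_poller {
    urls => {
      test1 => "https://jsonplaceholder.typicode.com/albums"
    }
    request_timeout => 60
    # interval form of the rufus scheduler: poll once every minute
    schedule => { every => "1m" }
    codec => "json"
  }
}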

Related

Error in logstash while passing if statement

I am new to Logstash. When I try to put an if statement in the Logstash config file, it gives me an error.
The if statement used is:
if {await} > 10
{ mutate {add_field => {"RULE_DATA" => "Value is above threshold"}
add_field => {"ACTUAL_DATA" => "%{await}"}
}
}
The error faced is given below:
[ERROR] 2018-07-20 16:52:21.327 [Ruby-0-Thread-1: /usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:22] agent - Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 18, column 10 (byte 729) after filter{\n grok {\n patterns_dir => [\"./patterns\"]\n match => { \"message\" => [\"%{TIME:time}%{SPACE}%{USERNAME:device}%{SPACE}%{USERNAME:tps}%{SPACE}%{SYSLOGPROG:rd_sec/s}%{SPACE}%{SYSLOGPROG:wr_sec/s}%{SPACE}%{SYSLOGPROG:avgrq-sz}%{SPACE}%{SYSLOGPROG:avgqu-sz}%{SPACE}%{NUMBER:await}%{SPACE}%{SYSLOGPROG:svctm}%{SPACE}%{SYSLOGPROG:%util}\"]\n }\n overwrite => [\"message\"]\n } \n if \"_grokparsefailure\" in [tags] {\n drop { }\n }\nif {await", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:42:in compile_imperative'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:50:incompile_graph'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:12:in block in compile_sources'", "org/jruby/RubyArray.java:2486:inmap'", "/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:11:in compile_sources'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:ininitialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:inexecute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315:in block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:inwith_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in block in converge_state'", "org/jruby/RubyArray.java:1734:ineach'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:inblock in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:inconverge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:inblock in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
Please suggest what has caused this error.
You have a syntax error. If you have a field named await (for example, produced by the grok parse), it must be referenced with square brackets, not curly braces.
Use the below:
if [await] > 10 {
  mutate {
    add_field => {"RULE_DATA" => "Value is above threshold"}
    add_field => {"ACTUAL_DATA" => "%{await}"}
  }
}
Logstash conditionals reference fields with [] not {}; have a look at the following example from the conditionals documentation:
filter {
  if [action] == "login" {
    mutate { remove_field => "secret" }
  }
}

ELK-logstash.conf is always wrong

I want to use Filebeat with Logstash, but the logstash.conf is wrong.
logstash.conf:
```
input {
  beats {
    port => "5044"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}
```
It responds with this:
Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 3, column 1 (byte 76) after ", :backtrace=>["/opt/logstash/logstash-core/lib/logstash/compiler.rb:42:in compile_imperative'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:50:incompile_graph'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:12:in block in compile_sources'", "org/jruby/RubyArray.java:2486:inmap'", "/opt/logstash/logstash-core/lib/logstash/compiler.rb:11:in compile_sources'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:51:ininitialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline.rb:171:in initialize'", "/opt/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:inexecute'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:335:in block in converge_state'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:141:inwith_pipelines'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:332:in block in converge_state'", "org/jruby/RubyArray.java:1734:ineach'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:319:in converge_state'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:166:inblock in converge_state_and_update'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:141:in with_pipelines'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:164:inconverge_state_and_update'", "/opt/logstash/logstash-core/lib/logstash/agent.rb:90:in execute'", "/opt/logstash/logstash-core/lib/logstash/runner.rb:343:inblock in execute'", "/opt/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
The beats plugin configuration is wrong: the port should be a number, not a string.
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-beats.html
Also, you have no filter block; maybe that will be necessary too:
input {
  beats {
    port => 5044
  }
}
filter {}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
  }
}

Logstash got exception in running

I got this exception in the Logstash log when I run it.
[2018-01-14T15:42:00,912]
[ERROR][logstash.outputs.elasticsearch]
Unknown setting 'host' for elasticsearch
[2018-01-14T15:42:00,921][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/
pipeline_id:main, :exception=>"LogStash::ConfigurationError",
:message=>"Something is wrong with your configuration.",
:backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/config
/mixin.rb:89:in config_init
"/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:63:in
initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:3:in
initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:25:in
initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:86:in
plugin'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:114:in
plugin'", "(eval):87:in <eval>'","org/jruby/RubyKernel.java:994:in
eval'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:86:in
initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:171:in
initialize'",
"/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40:in
execute'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:335:inblock
in converge_state'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in
with_pipelines'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:332:inblock
in converge_state'", "org/jruby/RubyArray.java:1734:in each'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:319:in
converge_state'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in block
in converge_state_and_update'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in
with_pipelines'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in
converge_state_and_update'",
"/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in
execute'",
"/usr/share/logstash/logstash-core/lib/logstash/runner.rb:343:in
block in execute'",
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in
block in initialize'"]}
This is my configuration:
input{
lumberjack {
port => 5044
type => "logs"
ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
}
}
filter{
if[type] == "syslog" {
grok {
match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:sysylog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
add_field => ["received_at", "%{#timestamp}" ]
add_field => ["received_from", "%{host}" ]
}
syslog_pri {}
date {
match => ["syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
}
}
}
output{
elasticsearch { host =>localhost }
stdout { codec => rubydebug }
}
How can I solve it? Thank you.
I am using the latest version of ELK.
If you check your elasticsearch output plugin, it has a host parameter.
It needs the hosts parameter instead, which takes an array of strings.
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-hosts
My Logstash-to-Elasticsearch output plugin looks like this:
elasticsearch {
  hosts => ["localhost:9200"]
  index => "logstash-%{+YYYY.MM.dd}"
}
You might need the index parameter set too.

Get JSON from file

Logstash 5.2.1
I can't read JSON documents from a local file using Logstash. No documents appear in stdout.
I run Logstash like this:
./logstash-5.2.1/bin/logstash -f logstash-5.2.1/config/shakespeare.conf --config.reload.automatic
Logstash config:
input {
file {
path => "/home/trex/Development/Shipping_Data_To_ES/shakespeare.json"
codec => json {}
start_position => "beginning"
}
}
output {
stdout {
codec => rubydebug
}
}
Also, I tried with charset:
...
codec => json {
charset => "UTF-8"
}
...
Also, I tried with/without json codec in the input and with filter:
...
filter {
json {
source => "message"
}
}
...
Logstash console after start:
[2017-02-28T11:37:29,947][WARN ][logstash.agent ] fetched new config for pipeline. upgrading.. {:pipeline=>"main", :config=>"input {\n file {\n path => \"/home/trex/Development/Shipping_Data_To_ES/shakespeare.json\"\n codec => json {\n charset => \"UTF-8\"\n }\n start_position => \"beginning\"\n }\n}\n#filter {\n# json {\n# source => \"message\"\n# }\n#}\noutput {\n stdout {\n codec => rubydebug\n }\n}\n\n"}
[2017-02-28T11:37:29,951][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2017-02-28T11:37:30,434][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-02-28T11:37:30,446][INFO ][logstash.pipeline ] Pipeline main started
^C[2017-02-28T11:40:55,039][WARN ][logstash.runner ] SIGINT received. Shutting down the agent.
[2017-02-28T11:40:55,049][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
^C[2017-02-28T11:40:55,475][FATAL][logstash.runner ] SIGINT received. Terminating immediately..
The signal INT is in use by the JVM and will not work correctly on this platform
[trex#Latitude-E5510 Shipping_Data_To_ES]$ ./logstash-5.2.1/bin/logstash -f logstash-5.2.1/config/shakespeare.conf --config.test_and_exit
^C[trex#Latitude-E5510 Shipping_Data_To_ES]$ ./logstash-5.2.1/bin/logstash -f logstash-5.2.1/config/shakespeare.conf --confireload.automatic
^C[trex#Latitude-E5510 Shipping_Data_To_ES]$ ./logstash-5.2.1/bin/logstash -f logstash-5.2.1/config/shakespeare.conf --config.reload.aumatic
Sending Logstash's logs to /home/trex/Development/Shipping_Data_To_ES/logstash-5.2.1/logs which is now configured via log4j2.properties
[2017-02-28T11:45:48,752][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-02-28T11:45:48,785][INFO ][logstash.pipeline ] Pipeline main started
[2017-02-28T11:45:48,875][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
Why doesn't Logstash put my JSON documents in stdout?
Did you try including the file type within your file input?
input {
  file {
    path => "/home/trex/Development/Shipping_Data_To_ES/shakespeare.json"
    type => "json"        # <-- add this
    # codec => json {}    # <-- for the moment I'll comment this out
    start_position => "beginning"
  }
}
And then have your filter as such:
filter {
  json {
    source => "message"
  }
}
Or, if you're going with the codec plugin, make sure to have the synopsis as such within your input:
codec => "json"
Or you might want to try out the json_lines codec as well. Hope this thread comes in handy.
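For context, here is a minimal sketch of that codec variant inside the input, using the same path as in the question; the json_lines codec mentioned above could be swapped in if the file holds one JSON document per line:
input {
  file {
    path => "/home/trex/Development/Shipping_Data_To_ES/shakespeare.json"
    # or codec => "json_lines" if each line is a separate JSON document
    codec => "json"
    start_position => "beginning"
  }
}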
It appears that sincedb_path is important to read JSON files. I was able to import the JSON only after adding this option. It is needed to maintain the current position in the file to be able to resume from that position in case the import is interrupted. I don't need any position tracking, so I just set this to /dev/null and it works.
The basic working Logstash configuration:
input {
  file {
    path => ["/home/trex/Development/Shipping_Data_To_ES/shakespeare.json"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  stdout {
    codec => json_lines
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "shakespeare"
  }
}

logstash ArgumentError

I have trouble getting the Logstash (2.4.0) tutorial to work on Windows 7.
This is working: bin\logstash.bat -f pipe.conf
# pipe.conf
input {
stdin { }
}
output {
stdout { }
}
When I then enter text in the MS-DOS window, I get the expected log messages.
C:\Users\foo\Workspace\Reporting\Stack5.0 pipe.conf
Settings: Default pipeline workers: 4
Pipeline main started
configuration in a file
2016-10-10T14:32:13.506Z foopc configuration in a file
yehaaaa
2016-10-10T14:32:18.320Z foopc yehaaaa
Tweaking the configuration file to get closer to the tutorial does not work. Then I get the following error message:
{
:timestamp=>"2016-10-10T16:45:25.605000+0200",
:message=>"Pipeline aborted due to error",
:exception=>"ArgumentError",
:backtrace=>["C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-input-file-2.2.5/lib/logstash/inputs/file.rb:187:in `register'",
"org/jruby/RubyArray.java:1613:in `each'",
"C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-input-file-2.2.5/lib/logstash/inputs/file.rb:185:in `register'",
"C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:330:in `start_inputs'",
"org/jruby/RubyArray.java:1613:in `each'",
"C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:329:in `start_inputs'",
"C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:180:in `start_workers'",
"C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'",
"C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"],
:level=>:error} {:timestamp=>"2016-10-10T16:45:28.608000+0200",
:message=>"stopping pipeline",
:id=>"main"
}
I call the script like before with: bin\logstash.bat -f pipe.conf
# pipe.conf
input {
# stdin { }
# https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html#configuring-file-input
# logstash 2.4.0
file {
path => "logstash-tutorial-dataset"
start_position => beginning
ignore_older => 0
}
}
# The filter part of this file is commented out to indicate that it is
# optional.
filter {
grok {
match => { "message" => "%{COMBINEDAPACHELOG}"}
}
}
output {
stdout { }
}
The log file logstash-tutorial-dataset is available and accessible. I downloaded the file from the tutorial.
What did I miss and how do I get logstash to work with this configuration?
According to the doc:
Paths must be absolute and cannot be relative.
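So the relative path in the file input is what Logstash rejects. A sketch of a corrected input, assuming the dataset was downloaded next to pipe.conf in the Stack5.0 folder (the directory below is only an illustration, so adjust it to wherever your file actually lives; forward slashes are the safe choice in Logstash paths even on Windows):
input {
  file {
    # hypothetical absolute location of the tutorial dataset; adjust to your machine
    path => "C:/Users/foo/Workspace/Reporting/Stack5.0/logstash-tutorial-dataset"
    start_position => "beginning"
  }
}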
