Where can I find documentation on coding the 'target' value for an http plugin to Logstash? - logstash

I'm getting the following error in my logstash log:
ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
My logstash config file currently looks like this:
input {
  http {
    port => 3333
    codec => json
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
Can someone point me to a description of how to configure the 'target' option, please? See the error log above for details.

target is an option on the json codec, not the http input; it is documented on the json codec plugin's page. Try
http { port => 3333 codec => json { target => "[document]" } }
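For context, here is the full pipeline file with that change applied (a minimal sketch; "[document]" is just an example target, any field name you choose works):
input {
  http {
    port => 3333
    codec => json {
      target => "[document]"
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
With this in place, the parsed JSON fields are nested under the document field instead of sitting at the event's top level, so they can no longer clash with ECS fields.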

Related

How to send all ECS Services Logs to LogStash

I am trying to forward all AWS ECS logs to Logstash. Checking the documentation, I only found ways to define specific CloudWatch log groups:
input {
  cloudwatch_logs {
    log_group => ["/aws/ecs/a","/aws/ecs/b","/aws/ecs/c","/aws/ecs/d","/aws/ecs/e","/aws/ecs/f"]
    start_position => "end"
    access_key_id => "<access_key>"
    secret_access_key => "<secret_access_key>"
    region => "eu-west-2"
    tags => ["cloudwatch_syslog"]
  }
}
Is there any way to forward all the logs inside "/aws/ecs/*"? Or is there another good solution?
Using AWS Firelens for log forwarding works fine.
https://docs.newrelic.com/docs/logs/forward-logs/aws-firelens-plugin-log-forwarding/
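If you would rather keep the cloudwatch_logs input, the community plugin also has a log_group_prefix option that treats each log_group entry as a prefix rather than an exact name (a sketch, assuming a plugin version that supports this option):
input {
  cloudwatch_logs {
    # with log_group_prefix, "/aws/ecs" matches every group beneath it
    log_group => ["/aws/ecs"]
    log_group_prefix => true
    start_position => "end"
    region => "eu-west-2"
    tags => ["cloudwatch_syslog"]
  }
}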

Server name in logstash pipeline

I'm running a Logstash pipeline as below:
input {
  redis {
    key => "ose_system.log"
    # data_type takes a string, not an array
    data_type => "list"
    db => 10
    host => "redis"
    port => "6379"
    tags => ["ose-dev"]
    codec => "plain"
  }
}
The "problem" im now having is that i dont have any relation to my server where the log is coming from. Im also a bit worried what will happen when i have 25 servers pushing to the same key. Won't there be any locking or something on that key?
Is there a way to include the server name here without adding the server name to the log itself? I cant add it to the log since the fact it uses the monolog format which is a uniformal way of logging.
Thanks a lot for the feedback
Just add a filter section to your config and add a server field, like:
filter {
  mutate {
    add_field => { "servers" => "Server_1" }
  }
}
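Rather than hardcoding a different name on each of the 25 servers, you could lean on Logstash's environment-variable substitution (a sketch, assuming HOSTNAME is exported in the environment Logstash runs under):
filter {
  mutate {
    # ${HOSTNAME} is resolved from the environment when the pipeline loads
    add_field => { "servers" => "${HOSTNAME}" }
  }
}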

How to use logstash plugin - logstash-input-http

I am exploring Logstash to receive inputs over HTTP. I have installed the http plugin using:
plugin install logstash-input-http
The installation was successful. Then I tried to run Logstash using the following command:
logstash -e 'input {http {port => 8900}} output {stdout{codec => rubydebug}}'
But Logstash terminates without giving any error as such.
I don't know how to verify whether the plugin is installed correctly, or how to use the http plugin to test a sample request.
Thanks in Advance!
I was able to solve the problem by using a .conf file instead of command-line arguments.
I created an http-pipeline.conf file similar to the one below:
input {
  http {
    host => "0.0.0.0"
    port => "8080"
  }
}
output {
  stdout {}
}
And then executed Logstash like:
logstash -f http-pipeline.conf
Using Postman, I sent a POST request (http://localhost:8080) to Logstash with a sample string and, voilà, it appeared on the Logstash console.
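The same check works from the command line with curl (a minimal sketch; the payload is arbitrary):
# send a sample string to the http input; it should show up on stdout
curl -XPOST http://localhost:8080 -d 'hello from curl'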
If you are executing from the same domain, the following will be sufficient:
input {
  http {
    port => 5043
  }
}
output {
  file {
    path => "/log_streaming/my_app/app.log"
  }
}
If you want to execute a request from a different domain, then you need to set a few response headers:
input {
  http {
    port => 5043
    response_headers => {
      "Access-Control-Allow-Origin" => "*"
      "Content-Type" => "text/plain"
      "Access-Control-Allow-Headers" => "Origin, X-Requested-With, Content-Type, Accept"
    }
  }
}
output {
  file {
    path => "/log_streaming/my_app/app.log"
  }
}
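You can confirm the headers are being returned without a browser by inspecting the response with curl (a sketch; the Origin value is arbitrary):
# -i prints the response headers; Access-Control-Allow-Origin should appear there
curl -i -XPOST http://localhost:5043 -H 'Origin: http://example.com' -d 'test'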

logstash http_poller first URL request's response should be input to second URL's request param

I have two URLs (due to security concerns I will explain using dummy ones):
a> https://xyz.company.com/ui/api/token
b> https://xyz.company.com/request/transaction?date=2016-01-21&token=<tokeninfo>
When you hit the URL in point 'a', it generates a token, say a string of 16 characters.
That token should then be used in the token param of the second request, in point 'b'.
Updated
The second URL's response is the important one for me: it is a JSON response, and I need to filter the JSON data, extract the required fields, and output them to standard output and Elasticsearch.
Is there any way of doing this in Logstash using the "http_poller" plugin, or any other plugin?
Note: these request URLs should be executed one after another, i.e. the point 'a' URL should be executed first, and the point 'b' URL next, after receiving a new token.
Please suggest.
Yes, it's possible with a mix of an http_poller input and an http output.
Here is the config I came up with:
input {
  # 1. trigger new token requests every hour
  http_poller {
    urls => {
      token => "https://xyz.company.com/ui/api/token"
    }
    interval => 3600
    add_field => {"token" => "%{message}"}
  }
}
filter {
}
output {
  # 2. call the API
  http {
    http_method => "get"
    url => "https://xyz.company.com/request/transaction?date=2016-01-21&token=%{token}"
  }
}
UPDATE
If you want to be able to get the content of the API call and store it in ES, you need a hybrid solution. You need to set up a cron that will call a script that runs the two HTTP calls and stores the results in a file and then you can let logstash tail that file and forward the results to ES.
Shell script to put on cron:
#!/bin/sh
# 1. Get the token
TOKEN=$(curl -s -XGET https://xyz.company.com/ui/api/token)
# 2. Call the API with the token and append JSON to file
curl -s -XGET "https://xyz.company.com/request/transaction?date=2016-01-21&token=$TOKEN" >> api_calls.log
The above script can be set on cron using crontab (or similar), there are plenty of examples out there on how to achieve this.
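For instance, a crontab entry along these lines would run the script every five minutes (a sketch; the path and schedule are assumptions):
# m h dom mon dow  command
*/5 * * * * /opt/scripts/call_api.sh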
Then the Logstash config can be very simple: it just needs to tail the api_calls.log file and send each document to ES.
input {
  file {
    # note: the file input expects an absolute path in practice
    path => "api_calls.log"
    start_position => "beginning"
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "my_index"
    document_type => "my_type"
  }
  stdout {
    codec => "rubydebug"
  }
}
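Once the pipeline is running, a quick search verifies that documents are arriving (reusing the index name from the config above):
# ask ES for the documents indexed into my_index
curl 'localhost:9200/my_index/_search?pretty'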

How to add custom headers to logstash http poller input?

I am trying to use Logstash to poll the Uber API (just as an example), get data from the API, and (at the moment) just output it to the console. I cannot seem to get custom headers to work:
input {
  http_poller {
    urls => {
      test2 => {
        # Supports all options supported by ruby's Manticore HTTP client
        method => get
        url => "https://api.uber.com/v1/products?latitude=37.7759792&longitude=-122.41823"
        headers => {
          Authorization => "P8YegbF53nWPZST5xX0ZlktVnufXYYQa01Dy0ocm"
        }
      }
    }
    request_timeout => 60
    interval => 60
    codec => "json"
    # A hash of request metadata info (timing, response headers, etc.) will be sent here
    metadata_target => "http_poller_metadata"
  }
}
output {
  stdout {}
}
If I understand the concept right, that should be the place to put the header, but since it's not basic auth I cannot use the auth flag from the sample.
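The headers hash is indeed the right place. Two things worth checking (a sketch, not a verified fix): quote the header key as a string, and include the scheme the Uber API expects in the value, since its server tokens are sent as "Authorization: Token <token>":
input {
  http_poller {
    urls => {
      test2 => {
        method => get
        url => "https://api.uber.com/v1/products?latitude=37.7759792&longitude=-122.41823"
        headers => {
          # quoted key, and the token prefixed with its auth scheme
          "Authorization" => "Token P8YegbF53nWPZST5xX0ZlktVnufXYYQa01Dy0ocm"
        }
      }
    }
    request_timeout => 60
    interval => 60
    codec => "json"
  }
}
output {
  stdout { codec => rubydebug }
}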