I want to create a conf file for Logstash that loads data from a file and sends it to Kafka.
The file is in JSON format and contains the topicId in it.
This is what I have so far:
input {
file {
path => "~/file1.json"
start_position => "beginning"
codec => "json"
}
}
filter {
json {
source => message
}
}
output {
kafka {
bootstrap_servers => "localhost"
codec => plain {
format => "%{message}"
}
topic_id => "???"
}
}
Can this be done?
Regards,
Ido
Yes, it can be done.
For example, if the message JSON contains a topicId key like:
"topicId": "topic1"
then in the Logstash Kafka output plugin you can reference it with a sprintf field reference:
output {
kafka {
bootstrap_servers => "localhost"
codec => plain {
format => "%{message}"
}
topic_id => "%{topicId}"
}
}
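As a side note (my sketch, not part of the original answer): with codec => json on the file input, each line is already parsed into event fields, so the extra json filter and the %{message} format are not strictly needed, and there may be no message field at all. A minimal end-to-end version of the same idea, assuming one JSON object per line and Kafka on its default port, could look like this (the path is a placeholder, since the file input expects an absolute path rather than ~):
input {
file {
path => "/home/user/file1.json"   # placeholder absolute path
start_position => "beginning"
codec => "json"                   # one JSON object per line becomes one event
}
}
output {
kafka {
bootstrap_servers => "localhost:9092"
codec => json                     # serialize the whole event as JSON
topic_id => "%{topicId}"          # topic name taken from the event's topicId field
}
}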
Related
I want to send to multiple destinations with Logstash. Here is my configuration:
output {
elasticsearch {
hosts => "10.10.10.7:9200"
index => "ubuntu18"
}
kafka {
bootstrap_server => "10.10.10.6:9092"
codec => json
topic_id => "beats"
}
}
But it is not working. Any ideas?
It seems that your Kafka output configuration is wrong: the option is bootstrap_servers, not bootstrap_server.
output {
elasticsearch {
hosts => "10.10.10.7:9200"
index => "ubuntu18"
}
kafka {
bootstrap_servers => "10.10.10.6:9092"
codec => json
topic_id => "beats"
}
}
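One more note (my addition, a sketch only): with this configuration every event goes to both outputs, which is usually what you want. If you ever need to send only part of the traffic to Kafka, the output plugins can be wrapped in conditionals; the condition below is just a placeholder:
output {
elasticsearch {
hosts => "10.10.10.7:9200"
index => "ubuntu18"
}
# only forward events of this (placeholder) type to Kafka
if [type] == "beats" {
kafka {
bootstrap_servers => "10.10.10.6:9092"
codec => json
topic_id => "beats"
}
}
}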
I am parsing a JSON log file in Logstash. There is a field named #person.name. I tried to rename this field before sending it to Elasticsearch. I also tried to remove the field, but I couldn't remove or delete it, and because of that my data is not getting indexed in Elasticsearch.
Error recorded in Elasticsearch:
MapperParsingException[Field name [#person.name] cannot contain '.']
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:276)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:221)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parse(ObjectMapper.java:196)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseProperties(ObjectMapper.java:308)
at org.elasticsearch.index.mapper.object.ObjectMapper$TypeParser.parseObjectOrDocumentTypeProperties(ObjectMapper.java:221)
at org.elasticsearch.index.mapper.object.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:138)
at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:119)
at org.elasticsearch.index.mapper.DocumentMapperParser.parse(DocumentMapperParser.java:100)
at org.elasticsearch.index.mapper.MapperService.parse(MapperService.java:435)
at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.applyRequest(MetaDataMappingService.java:257)
at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.execute(MetaDataMappingService.java:230) at org.elasticsearch.cluster.service.InternalClusterService.runTasksForExecutor(InternalClusterService.java:458)
at org.elasticsearch.cluster.service.InternalClusterService$UpdateTask.run(InternalClusterService.java:762)
My Logstash config:
input {
beats {
port => 11153
}
}
filter {
if [type] == "person_get" {
## Parse the JSON input with the json filter
json {
source => "message"
}
mutate{
rename => { "#person.name" => "#person-name" }
remove_field => [ "#person.name"]
}
fingerprint {
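# derive a stable document id from ResponseTimestamp (used in the output below to avoid duplicate documents)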
source => ["ResponseTimestamp"]
target => "fingerprint"
key => "78787878"
method => "SHA1"
concatenate_sources => true
}
}
}
output{
if [type] == "person_get" {
elasticsearch {
index => "logstash-person_v1"
hosts => ["xxx.xxx.xx:9200"]
document_id => "%{fingerprint}" # !!! prevent duplication
}
stdout {
codec => rubydebug
}
} }
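This question has no answer in the thread. As a hedged sketch only: a common workaround for dots in field names is the de_dot filter (the logstash-filter-de_dot plugin), which replaces the dot with a separator instead of renaming and removing by hand; the option names below are from memory, so check the plugin documentation:
filter {
json {
source => "message"
}
# replace the dot, e.g. #person.name becomes #person_name with the default separator
de_dot {
fields => ["#person.name"]
}
}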
Is there a way of having the filename of the file being read by Logstash as the index name for the output into Elasticsearch?
I am using the following config for Logstash:
input{
file{
path => "/logstashInput/*"
}
}
output{
elasticsearch{
index => "FromfileX"
}
}
I would like to be able to put a file, e.g. log-from-20.10.2016.log, and have it indexed into the index log-from-20.10.2016. Does the Logstash file input plugin produce any fields for use in the filter or output?
Yes, you can use the path field for that and grok it to extract the filename into an index field:
input {
file {
path => "/logstashInput/*"
}
}
filter {
grok {
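# capture e.g. log-from-20.10.2016 from the file path into a field named index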
match => ["path", "(?<index>log-from-\d{2}\.\d{2}\.\d{4})\.log$" ]
}
}
output{
elasticsearch {
index => "%{index}"
}
}
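One caveat worth adding (my note, not part of the original answer): if the grok pattern does not match, the %{index} reference is not substituted and Elasticsearch receives an index literally named "%{index}". A small guard in the output avoids that; the fallback index name is just a placeholder:
output {
if "_grokparsefailure" not in [tags] {
elasticsearch {
index => "%{index}"
}
} else {
# fall back to a fixed index for files that do not match the pattern
elasticsearch {
index => "unmatched-files"
}
}
}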
Another option is to build the index name with a ruby filter, for example:
input {
file {
path => "/home/ubuntu/data/gunicorn.log"
start_position => "beginning"
}
}
filter {
grok {
match => {
"message" => "%{USERNAME:u1} %{USERNAME:u2} \[%{HTTPDATE:http_date}\] \"%{DATA:http_verb} %{URIPATHPARAM:api} %{DATA:http_version}\" %{NUMBER:status_code} %{NUMBER:byte} \"%{DATA:external_api}\" \"%{GREEDYDATA:android_client}\""
remove_field => ["message"]
}
}
date {
match => ["http_date", "dd/MMM/yyyy:HH:mm:ss +ssss"]
}
ruby {
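# build the index name from the file name, e.g. /home/ubuntu/data/gunicorn.log -> gunicorn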
code => "event.set('index_name',event.get('path').split('/')[-1].gsub('.log',''))"
}
}
output {
elasticsearch {
hosts => ["0.0.0.0:9200"]
index => "%{index_name}-%{+yyyy-MM-dd}"
user => "*********************"
password => "*****************"
}
stdout { codec => rubydebug }
}
I am trying to load a CSV file in Logstash, but it is not reading the file and not creating the index in Elasticsearch.
I need to read the CSV file into Elasticsearch.
I tried a few changes in the config file.
My config file:
input {
file {
type => "csv"
path => "/root/installables/*.csv"
start_position => beginning
}
}
filter {
grok {
match => { "message" => "%{COMBINEDAPACHELOG}" }
}
}
output {
elasticsearch {
hosts => localhost
index => "client"
}
}
Could anybody tell me how to load a CSV file in Logstash?
I think you should use a csv filter. I made it work like this:
input {
file {
path => "/filepath..."
start_position => beginning
# read from the beginning of the file and ignore sincedb so it is re-read every run
sincedb_path => "/dev/null"
}
}
filter {
csv {
columns => ["COL1", "COL2"]
}
}
output {
stdout { codec => rubydebug }
elasticsearch {
host => "localhost"
index => "csv_index"
}
}
Also, adding stdout as an output helps you debug and see whether the file is being loaded.
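A possible refinement (a sketch only, option names from memory): if the CSV has a header row or numeric columns, the csv filter can skip the header and convert column types so they are indexed as numbers rather than strings:
filter {
csv {
columns => ["COL1", "COL2"]
skip_header => true                   # drop the header line itself
convert => { "COL2" => "integer" }    # index COL2 as a number instead of a string
}
}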
Summary:
logstash -> elasticsearch -> failed parsing date shown in debug output
Events in the log file contain the field @timestamp (format: 2014-06-18T11:52:45.370636+02:00)
Events are actually processed into Elasticsearch, but 'failed parsing' errors are shown.
Versions:
Logstash 1.4.1
Elasticsearch 1.20
Is there something I am doing wrong?
I have log files that contain events like this:
{"#timestamp":"2014-06-18T11:52:45.370636+02:00","Level":"Info","Machine":"X100","Session":{"MainId":0,"SubId":"5otec"},"Request":{"Url":"http://www.localhost:5000/Default.aspx","Method":"GET","Referrer":"http://www.localhost:5000/Default.aspx"},"EndRequest":{"Duration":{"Main":0,"Page":6720}}}
I use this logstash config:
input {
file {
path => "D:/testlogs/*.log"
codec => "json"
start_position => [ "beginning" ]
}
}
filter {
date {
match => [ "#timestamp", "ISO8601" ]
}
}
output {
stdout {
codec => json
}
elasticsearch {
protocol => http
host => "10.125.26.128"
}
}
When I run Logstash with this config on the events in the log files I get the following error:
Failed parsing date from field {:field=>"@timestamp", :value=>"2014-06-18T12:18:34.717+02:00", :exception=>#<TypeError: cannot convert instance of class org.jruby.RubyTime to class java.lang.String>
The thing is that the events are actually imported into Elasticsearch, but I see these errors.
Can this be a problem, or can these failed-parsing errors be ignored?
Your logs are already in JSON format, and the json codec on the input already parses them and sets @timestamp, so you do not need to parse the date yourself; the date filter receives an already-converted timestamp object rather than a string, which is what triggers the error.
You can use the json filter to parse all the fields and values.
For example, with this configuration:
input {
file {
path => "D:/testlogs/*.log"
codec => "json"
}
}
filter {
json {
source => "message"
}
}
output {
stdout {
codec => rubydebug
}
}
I can parse your log successfully, and the output is:
{
"#timestamp" => "2014-06-18T17:52:45.370+08:00",
"Level" => "Info",
"Machine" => "X100",
"Session" => {
"MainId" => 0,
"SubId" => "5otec"
},
"Request" => {
"Url" => "http://www.localhost:5000/Default.aspx",
"Method" => "GET",
"Referrer" => "http://www.localhost:5000/Default.aspx"
},
"EndRequest" => {
"Duration" => {
"Main" => 0,
"Page" => 6720
}
},
"#version" => "1",
"host" => "ABC",
"path" => "D:/testlogs/*.log"
}
I also tried codec => json on the file input; it did not work for me, but it does work with the stdin input. Maybe this is a bug in Logstash.
Hope this can help you.
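To complete the picture (my sketch, reusing only options that already appear in this thread): the asker's Elasticsearch output can simply be added back next to the rubydebug output, while the date filter stays removed because the json codec already sets @timestamp:
input {
file {
path => "D:/testlogs/*.log"
codec => "json"       # the codec parses each line and sets @timestamp from the JSON
}
}
output {
stdout {
codec => rubydebug
}
elasticsearch {
protocol => http      # options copied from the question's config
host => "10.125.26.128"
}
}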