Logstash auto-update data - logstash

I am using the latest version of Logstash (7.6.2). I uploaded some sample data into Elasticsearch through Logstash (with auto-reload enabled) and could see the index in the Kibana interface.
However, when I make changes to the config file below, the updated data does not show up in Kibana. I tried removing the mutate filter plugin; the Logstash pipeline reloaded, but the data in Kibana was not updated. Interestingly, it did not throw any errors.
Sample.conf
input {
  file {
    path => "/usr/local/Cellar/sample.log"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp_string}%{SPACE}%{GREEDYDATA:line}"]
  }
  date {
    match => ["timestamp_string", "ISO8601"]
  }
  mutate {
    remove_field => ["message", "timestamp_string"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sample"
  }
  stdout {
    codec => rubydebug
  }
}
Any help here is appreciated. TIA
P.S. - I am new to ElasticSearch!

If you want Logstash to parse a complete file again, you need to:
delete the sincedb files,
OR delete only the corresponding line in the sincedb file.
Then restart Logstash, and it will reparse the file.
For more info: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html#sincedb_path
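As a quick test, the same file input can also point its sincedb at /dev/null, so no offsets are persisted and the whole file is re-read on every pipeline start (a sketch, for testing only):

input {
  file {
    path => "/usr/local/Cellar/sample.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"    # testing only: offsets are never saved, so the file is re-read each time
  }
}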

Related

Make logstash filter by name

I have a log file called "/var/log/commands.log" that I'm trying to separate into fields with Logstash & grok, and I've got that working. Now I'm trying to make Logstash apply this only to the file "/var/log/commands.log" and not to any other input, by doing something like "if name == commands.log", but something with the "if" statement seems wrong because it just skips over it.
input {
  file {
    path => "/var/log/commands.log"
  }
  beats {
    port => 5044
  }
}
filter {
  if [log][file][path] == "/var/log/commands.log" {
    grok {
      match => { "message" => "*very long statement*" }
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
If I remove the if statement it works and the fields are visible in Kibana. I'm testing things locally. Does anyone know what's going on?
EDIT: SOLVED: In this Logstash setup the condition has to be just [path] instead of [log][file][path].
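For anyone hitting the same thing, a minimal sketch of the working conditional in this setup (the path ends up in the plain [path] field here rather than under [log][file][path]):

filter {
  if [path] == "/var/log/commands.log" {
    grok {
      # same pattern as before, applied only to events read from commands.log
      match => { "message" => "*very long statement*" }
    }
  }
}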

Logstash not producing output or inserting into Elasticsearch

When I execute the configuration file with the command bin\logstash -f configfile.conf, there is no output on the console, just the Logstash startup logs.
Here is the configuration file:
input {
  file {
    path => "F:\ELK\50_Startups.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["R&D","Administration","Marketing","State","Profit"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => ["Startups"]
  }
  stdout{}
}
Does the input file (50_Startups.csv) have fresh data written to it? If not, it might be that Logstash has already stored the read offset of the last line and will not re-read the file on future runs, unless you delete the sincedb offset files or just add the following config:
sincedb_path => "/dev/null"
That forces Logstash to re-parse the file.
See more info on file offsets here: https://www.elastic.co/guide/en/logstash/current/plugins-inputs-file.html#_tracking_of_current_position_in_watched_files
From it:
By default, the sincedb file is placed in the data directory of Logstash with a filename based on the filename patterns being watched
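A minimal sketch of the file input with that setting added (same CSV path as in the question; on Windows, "NUL" is the usual stand-in for /dev/null, and forward slashes are generally safer in the path option):

input {
  file {
    path => "F:/ELK/50_Startups.csv"
    start_position => "beginning"
    sincedb_path => "NUL"    # Windows null device: offsets are never persisted, so the file is re-read on each run
  }
}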

Sending data to ES index using Logstash?

ES 2.4.1
Logstash 2.4.0
I am sending data to Elasticsearch from my local machine to create an index called "pica". I used the conf file below.
input {
  file {
    path => "C:\Output\Receive.txt"
    start_position => "beginning"
    codec => json_lines
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200/"
    index => "pica"
  }
  stdout {
    codec => rubydebug
  }
}
I couldn't see any output in either the Logstash prompt or in the Elasticsearch cluster.
When I looked at the .sincedb file it had the following content:
612384816-350504-4325376 0 0 3804
May I know what the problem is here?
Thanks
I guess you're missing the square brackets [] around the hosts value, since it is an array type as per the docs. It should look like:
elasticsearch {
hosts => ["localhost:9200"]
index => "pica"
}
OR :
hosts => ["127.0.0.1"] OR hosts => ["localhost"]
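Putting it together, a corrected output section might look like this (a sketch, keeping the same index name and the rubydebug stdout from the question):

output {
  elasticsearch {
    hosts => ["localhost:9200"]    # array form, no scheme or trailing slash needed
    index => "pica"
  }
  stdout {
    codec => rubydebug
  }
}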

Logstash pipeline freezing

I am new to Elasticsearch, Logstash and Kibana. I have written logstash.conf; here is a glimpse of it:
input {
  file {
    path => "C:\Users\mohammadraghib.ahsan\Downloads\Gl\adapterCommon.log"
    start_position => "beginning"
    sincedb_path => "C:\Users\mohammadraghib.ahsan\Downloads\Gl\sincedb.db"
  }
}
filter {
  grok {
    match => { "message" => "%{DATA:deviceid} %{GREEDYDATA:data}" }
  }
}
output {
  stdout { codec => rubydebug }
}
When I execute it with .\logstash -f logstash.conf (I am using PowerShell on Windows), it gets stuck at this point.
I appreciate the valuable comments provided by pandaadb and baudsp. Adding one blank line at the end of the file resolved the issue. The problem with Logstash is that it sometimes fails to re-read a file if it finds a file with the same signature (last modified), so adding one more line at the end helped change the file signature.
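If you want to append that blank line from PowerShell without opening the file in an editor, something like this should work (Add-Content is a standard cmdlet; the path is the one from the config above):

# append an empty line so the file signature/offset changes and the input picks the file up again
Add-Content -Path "C:\Users\mohammadraghib.ahsan\Downloads\Gl\adapterCommon.log" -Value ""

Add-Content appends the value followed by a newline, so the file ends with a fresh blank line.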

Duplicate entries in Elasticsearch while the Logstash instance is running

I have been trying to send logs from Logstash to Elasticsearch. Suppose I am running a Logstash instance and, while it is running, I make a change to the file that the instance is monitoring; then all the logs that have already been saved in Elasticsearch are saved again, so duplicates are created.
Also, when the Logstash instance is stopped and restarted, the logs get duplicated in Elasticsearch again.
How do I counter this problem?
How do I send only the newly added entries in the file from Logstash to Elasticsearch?
My logstash instance command is the following:
bin/logstash -f logstash-complex.conf
and the configuration file is this:
input {
  file {
    path => "/home/amith/Desktop/logstash-1.4.2/accesslog1"
  }
}
filter {
  if [path] =~ "access" {
    mutate {
      replace => { "type" => "apache_access" }
    }
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    host => localhost
    index => feb9
  }
  stdout { codec => rubydebug }
}
I found the solution.
I was opening the file, adding a record and saving it, because of which Logstash treated the same file as a different file each time I saved it, since it registered a different inode number for the same file.
The solution is to append a line to the file without opening it in an editor, by running the following command:
echo "the string you want to add to the file" >> filename
[ELK stack]
I wanted some custom configs in
/etc/logstash/conf.d/vagrant.conf
so the first step was to make a backup: /etc/logstash/conf.d/vagrant.conf.bk
This caused Logstash to add 2 entries in Elasticsearch for each entry in <file>.log;
similarly, when I had 3 files in /etc/logstash/conf.d/*.conf.*, ES had 8 entries for each line in *.log.
As you mentioned in your question:
when the logstash instance is closed and is restarted again, the logs get duplicated in elasticsearch.
So it is probably because you have deleted the sincedb file. Please have a look here.
Try specifying sincedb_path and start_position. For example:
input {
  file {
    path => "/home/amith/Desktop/logstash-1.4.2/accesslog1"
    start_position => "end"
    sincedb_path => "/home/amith/Desktop/sincedb"
  }
}
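If duplicates still appear, it can help to inspect what the sincedb actually recorded before restarting (the path below is the one from the snippet above):

cat /home/amith/Desktop/sincedb
# each line is roughly: inode, major device number, minor device number, byte offset
# deleting the file (or just the line for your log) makes Logstash read the file from scratch again
rm /home/amith/Desktop/sincedb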
