"INFO No non-zero metrics in the last 30s" message in Filebeat - Logstash

I'm new to ELK and I'm running into issues with Logstash. I ran Logstash as described in the link below:
https://www.elastic.co/guide/en/logstash/current/advanced-pipeline.html
But when I run Filebeat and Logstash, Logstash starts successfully on port 9600, while Filebeat only prints this:
INFO No non-zero metrics in the last 30s
Logstash is not receiving any input from Filebeat. Please help.
My filebeat.yml is:
filebeat.prospectors:
- input_type: log
  paths:
    - /path/to/file/logstash-tutorial.log
output.logstash:
  hosts: ["localhost:5043"]
and I ran this command:
sudo ./filebeat -e -c filebeat.yml -d "publish"
The Logstash config file (first-pipeline.conf) is:
input {
  beats {
    port => "5043"
  }
}
output {
  stdout { codec => rubydebug }
}
Then I ran these commands:
1) bin/logstash -f first-pipeline.conf --config.test_and_exit - this gave warnings
2) bin/logstash -f first-pipeline.conf --config.reload.automatic - this started Logstash on port 9600
I couldn't proceed past this point, since Filebeat keeps logging:
INFO No non-zero metrics in the last 30s
The ELK version used is 5.1.2.

The registry file stores the state and location information that Filebeat uses to track where it last stopped reading, so you can try updating or deleting the registry file:
cd /var/lib/filebeat
sudo mv registry registry.bak
sudo service filebeat restart
I faced this issue as well and solved it with the commands above.

Filebeat can appear to do nothing when it is simply waiting for new lines to be appended to your file over time (like a live log file). Whether it starts from the beginning or the end of a file is controlled by the tail_files option: false (the default) reads the file from the beginning, true only picks up lines appended after startup. Also note the documentation's instructions about re-processing a file, as that can come into play during testing.
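As a reference, here is a minimal prospector sketch showing where the option goes (the path is the one from the question; the rest matches the Filebeat 5.x prospector syntax used above):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /path/to/file/logstash-tutorial.log
  # false (the default) reads the file from the beginning;
  # true only picks up lines appended after Filebeat starts.
  tail_files: false
```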

Related

Install Logstash on Windows

I'm trying to install Logstash as a Windows service. Everything works when I run it manually from CMD like so:
C:\Elastic\Logstash\bin\logstash -f c:\Elastic\Logstash\config\logstash-sample.conf
I can see that file changes are picked up and posted to the console (per the .conf file's console output).
However, when I install Logstash as a Windows service:
sc create Logstash binpath="\"C:\Elastic\Logstash\bin\logstash\" -f \"c:\Elastic\Logstash\config\logstash-sample.conf\""
It creates the Windows service, but the service fails on startup:
Logstash log:
[2019-04-15T14:40:29,605][ERROR][org.logstash.Logstash ]
java.lang.IllegalStateException: Logstash stopped processing because
of an error: (SystemExit) exit
When I try to install Logstash with NSSM like below, it runs but does not work:
nssm.exe install logstash "C:\Elastic\Logstash\bin\logstash.bat" "agent -f C:\Elastic\Logstash\config\logstash-sample.conf"
Found the solution:
The problem I was having was due to the "agent" keyword. In CMD I ran:
nssm edit logstash
Then, in the window that opened, I modified the Arguments field to remove the "agent" keyword.
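The same change can also be made without the GUI via NSSM's set subcommand; this is a sketch assuming the service name logstash from the commands above:

```
nssm.exe set logstash AppParameters "-f C:\Elastic\Logstash\config\logstash-sample.conf"
nssm.exe start logstash
```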

Stop filebeat after ingesting all the logs

I have observed that Filebeat keeps running forever after ingesting all the logs.
Is there any way to make Filebeat stop automatically once all the logs have been ingested?
Also, is the configuration below correct?
filebeat.prospectors:
- shutdown_timeout: 0s
  enabled: true
  paths:
    - D:\new.log
output.logstash:
  hosts: "localhost:5044"
I could not find anything in the Logstash documentation to help with this.
I would suggest using client_inactivity_timeout => "30" in the beats input section of your logstash.conf file.
Hope this helps.
https://www.elastic.co/guide/en/logstash/current/plugins-inputs-beats.html#plugins-inputs-beats-client_inactivity_timeout
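For reference, a minimal sketch of where that setting goes (port 5044 taken from the question's output.logstash; note this closes idle beats connections on the Logstash side rather than stopping the Filebeat process itself):

```
input {
  beats {
    port => 5044
    # Close a beats connection after 30 seconds of inactivity.
    client_inactivity_timeout => "30"
  }
}
```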

Issue while sending file content from filebeat to logstash

I am new to ELK and I am trying to do some hands-on work with the ELK stack. I am performing the following on WINDOWS:
1. Installed Elasticsearch, confirmed with http://localhost:9200/
2. Installed Logstash, confirmed using http://localhost:9600/
logstash -f logstash.config
The logstash.config file looks like this:
input {
  beats {
    port => "5043"
  }
}
# The filter part of this file is commented out to indicate that it is
# optional.
# filter {
#
# }
output {
  elasticsearch { hosts => ["localhost:9200"] }
}
3. Installed Kibana, confirmed using http://localhost:5601
Now I want to use Filebeat to pass a log file to Logstash, which parses and forwards it to Elasticsearch for indexing; finally, Kibana displays it.
In order to do that, I made the following changes in filebeat.yml.
Change 1:
In the Filebeat prospectors section, I added:
paths:
  # - /var/log/*.log
  - D:\KibanaInput\vinod.log
Contents of vinod.log: Hello World from FileBeat.
Change 2:
In the Outputs section:
#output.logstash:
  # The Logstash hosts
  hosts: ["localhost:9600"]
When I run the command below:
filebeat -c filebeat.yml -e
I get the error below:
ERR Connecting error publishing events (retrying): Failed to parse JSON response: json: cannot unmarshal string into Go value of type struct { Number string }
Please let me know what I am doing wrong.
You are on a good path.
Please confirm the following:
1. In your filebeat.yml, make sure the output.logstash: line is not commented out (this corresponds to your change number 2).
2. Make sure your messages are being grokked correctly. Add the following output to your Logstash pipeline config file:
output {
  stdout { codec => json }
}
3. Start your Logstash in debug mode.
4. If you are reading the same file with the same content, make sure you remove Filebeat's registry file ($filebeatHome/data/registry).
5. Read the log files.
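The error in the question also suggests Filebeat is pointed at port 9600, which is Logstash's HTTP monitoring API rather than the beats input. A sketch of the corrected output section, assuming the beats port 5043 declared in the question's logstash.config:

```yaml
output.logstash:
  # Point at the beats input port (5043 in logstash.config),
  # not 9600, which is Logstash's monitoring API.
  hosts: ["localhost:5043"]
```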

Setting document_type of log in filebeat stops filebeat restarting

I am trying to import a custom log I have on my server through Filebeat and send it over to Logstash for use in my ELK stack.
I have set this up to work correctly and it currently runs fine.
However, I want to add a Logstash filter for this specific log, so I decided to add a document_type field for this log to allow me to filter on it in Logstash.
I have done this like so:
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/apache2/access.log
  document_type: apache-access
- input_type: log
  paths:
    - /var/www/webapp/storage/logs/laravel.log
- input_type: log
  paths:
    - /opt/myservice/server/server.log
    document_type: myservice
I have added document_type: myservice to the log for myservice, and believe I have done so according to the documentation here. Furthermore, it is done the same way as for the Apache access log.
However, when I restart Filebeat, it won't start back up again. I have tried looking at the Filebeat log, but there doesn't seem to be anything in there about why it won't start.
If I comment out document_type: myservice (like this: #document_type: myservice) and then restart Filebeat, it boots up correctly, so it must be something to do with that line.
Questions:
Am I doing something wrong here?
Is there an alternative method I could use to apply my Logstash filter to this log only, other than using if [type] == "myservice"?
Using document_type is a good approach to applying conditionals in Logstash. An alternative method is to apply tags or fields in Filebeat.
The problem with your configuration is the indentation of the document_type: myservice that you added. Notice how its indentation differs from that of document_type: apache-access. The document_type field should be at the same level as paths and input_type, as they are all prospector options.
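A sketch of the corrected prospector (paths taken from the question), with document_type aligned with paths and input_type:

```yaml
- input_type: log
  paths:
    - /opt/myservice/server/server.log
  # document_type must sit at the prospector level, not nested under paths
  document_type: myservice
```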
You can test your config file with filebeat.sh -c /etc/filebeat/filebeat.yml -e -configtest.
You can also run your config through a tool like http://www.yamllint.com just to check that it's valid YAML.

How to configure Logstash with Elasticsearch in Windows 8?

I'm currently trying to install and run Logstash on Windows 7 using the guidelines of the Logstash website. I am struggling to configure and use Logstash with Elasticsearch. I created logstash-simple.conf with the content below:
input { stdin { } }
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
When I execute the command below:
D:\logstash-2.4.0\bin>logstash agent -f logstash-simple.conf
I get the following error; I have tried many things but I always get the same error:
No config files found: D:/logstash-2.4.0/bin/logstash-simple.conf
Can you make sure this path is a logstash config file? {:level=>:error}
The signal HUP is in use by the JVM and will not work correctly on this platform
D:\logstash-2.4.0\bin>
Read the "No config files found" part of the error: Logstash can't find the logstash-simple.conf file.
So try:
D:\logstash-2.4.0\bin>logstash agent -f [direct path to the folder containing logstash-simple.conf]logstash-simple.conf
Also verify that the extension really is .conf and not something else like .txt (logstash-simple.conf.txt); Windows hides known file extensions by default.
