Logstash - Could not find any executable java binary - Linux

I have ELK installed on a VM on my laptop. Elasticsearch is up and running.
./bin/logstash -f logstash-filter.conf gives me the error below:
Could not find any executable java binary. Please install java in your PATH or set JAVA_HOME.
I tried setting JAVA_HOME and $PATH, but the issue persists. Am I missing something?
$ which java
/usr/bin/java
$ java -version
java version "1.7.0_79"
OpenJDK Runtime Environment (IcedTea 2.5.5) (7u79-2.5.5-0ubuntu0.14.04.2)
OpenJDK 64-Bit Server VM (build 24.79-b02, mixed mode)
$ echo $JAVA_HOME
/usr/local/java/jdk1.8.0_45
$ echo $PATH
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/home/divija/bin:/usr/local/java/jdk1.8.0_45/bin
logstash-filter.conf:
input { stdin { } }
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
output {
  elasticsearch {
    host => "localhost"
    index => "myindex"
  }
  stdout { codec => rubydebug }
}

I had to
export JAVACMD=`which java`
to make this work.
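For what it's worth, the startup shell script in these older Logstash releases appears to resolve the Java binary from $JAVACMD first, falling back to $JAVA_HOME/bin/java and then to java on the $PATH, which is why setting JAVACMD explicitly gets past the check even when JAVA_HOME points at a stale path. A minimal sketch, reusing the paths from the question:
export JAVACMD=/usr/bin/java   # the binary reported by `which java` above
./bin/logstash -f logstash-filter.conf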

I know it's an old thread, but I was having the same problem and was doing something very silly.
I had updated the $JAVA_HOME variable in /etc/environment but was not reloading the file, so the change never took effect; running source /etc/environment solved my problem.
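A minimal sketch of that fix (the JDK path is the one from the question above; adjust for your layout):
# /etc/environment
JAVA_HOME="/usr/local/java/jdk1.8.0_45"

# reload the file in the current shell, then verify:
source /etc/environment
echo $JAVA_HOME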

Related

Cannot run logstash 8.1.0 with the command

I have a problem running a defined configuration file in Logstash through its command.
Here is my conf file, shown below.
input {
  file {
    type => "userslog"
    path => "C:\Users\aaa\Downloads\logstash-8.1.0\users-ms.log"
  }
  file {
    type => "albumslog"
    path => "C:\Users\aaa\Downloads\logstash-8.1.0\albums-ms.log"
  }
}
output {
  if [type] == "userslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "userslog-%{+YYYY.MM.dd}"
    }
  } else if [type] == "albumslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "albumslog-%{+YYYY.MM.dd}"
    }
  }
  stdout { codec => rubydebug }
}
Here is the result:
C:\Users\aaa\Desktop\logstash-8.1.0\bin>logstash logstash-simple.conf
"Using bundled JDK: ."
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
[FATAL] 2022-03-17 12:03:32.267 [main] Logstash - Logstash stopped processing because of an error: (NameError) missing class name (`org.apache.http.ımpl.client.StandardHttpRequestRetryHandler')
org.jruby.exceptions.NameError: (NameError) missing class name (`org.apache.http.ımpl.client.StandardHttpRequestRetryHandler')
When I changed 'Java::OrgApacheHttpImplClient::StandardHttpRequestRetryHandler' to 'Java::OrgApacheHttp.impl.client::StandardHttpRequestRetryHandler', it didn't work either.
How can I fix it?

Logstash 7.10.2 failed to start on Windows

I tried starting Logstash with the command below:
logstash-7.10.2\logstash -f logstash.conf
logstash.conf
input {
  file {
    path => "D://server.log"
    start_position => "beginning"
    type => "logs"
  }
}
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:logtime} \[%{NOTSPACE:thread}\] \[%{LOGLEVEL:loglevel}\] %{GREEDYDATA:line}" }
  }
}
output {
  if "ERROR" in [loglevel] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash"
    }
  }
}
The command prompt displayed the text below and Logstash did not start.
Using JAVA_HOME defined java: C:\Program Files\Java\jdk1.8.0_221;
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
warning: ignoring JAVA_OPTS=-Xms64m -Xmx128m -XX:NewSize=64m -XX:MaxNewSize=64m -XX:PermSize=64m -XX:MaxPermSize=64m; pass JVM parameters via LS_JAVA_OPTS
No error logs were created.
Have you tried starting Logstash in debug mode?
--log.level DEBUG
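For example, the same command as above with the flag appended (--log.level also accepts values such as info and trace):
logstash-7.10.2\logstash -f logstash.conf --log.level debug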
The pipeline looks okay. Can you try adding the output below to see whether the pattern and the log data match? Just to rule out any grok parse failures.
output {
  stdout { codec => rubydebug }
  if "ERROR" in [loglevel] {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "logstash"
    }
  }
}
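If the pattern does not match a line, the rubydebug output will show the event tagged with _grokparsefailure, roughly like this (illustrative output, not taken from the question):
{
       "message" => "some line that did not match the pattern",
    "@timestamp" => 2021-02-06T10:15:00.000Z,
          "tags" => [ "_grokparsefailure" ]
}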

logstash Error: com.mariadb.jdbc.Driver not loaded

I'm trying to sync some tables from MariaDB to Elasticsearch using Logstash.
I'm on a Debian Buster (10) server.
$ java -version
openjdk version "11.0.4" 2019-07-16
OpenJDK Runtime Environment (build 11.0.4+11-post-Debian-1deb10u1)
OpenJDK 64-Bit Server VM (build 11.0.4+11-post-Debian-1deb10u1, mixed mode, sharing)
$ mariadb --version
mariadb Ver 15.1 Distrib 10.3.15-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2
$ /usr/share/logstash/bin/logstash --version
logstash 7.2.0
I tried different connectors:
$ ls -l /usr/share/java/
mariadb-java-client.jar
$ ls -l /etc/logstash/connectors/
mariadb-java-client-2.1.2.jar
mariadb-java-client-2.2.6.jar
mariadb-java-client-2.3.0.jar
mariadb-java-client-2.4.2.jar
mysql-connector-java-8.0.17.jar
Using "org.mariadb.jdbc.Driver" for mariadb connectors and "com.mysql.cj.jdbc.Driver" for mysql connector
$ cat /etc/logstash/conf.d/db-fr-bank.conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:mariadb://localhost:3306/db_fr"
    jdbc_user => "logstash"
    jdbc_password => "<password>"
    jdbc_driver_library => "/usr/share/java/mariadb-java-client.jar"
    jdbc_driver_class => "org.mariadb.jdbc.Driver"
    statement => "SELECT * FROM bank"
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "fr-bank"
  }
}
But, instead of syncing, I keep getting:
$ /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/db-fr-bank.conf
...
[ERROR] 2019-07-29 02:08:17.563 [[main]<jdbc] jdbc - Failed to load /usr/share/java/mariadb-java-client.jar {:exception=>#<TypeError: failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader>}
[ERROR] 2019-07-29 02:08:17.598 [[main]<jdbc] javapipeline - A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"logstash", jdbc_password=><password>, statement=>"SELECT * FROM bank", jdbc_driver_library=>"/usr/share/java/mariadb-java-client.jar", jdbc_connection_string=>"jdbc:mariadb://localhost:3306/db_fr", id=>"38a6d112755a5e87278761cf5f41b7e509212d1d02837a03672df2face00943a", jdbc_driver_class=>"org.mariadb.jdbc.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_f3094292-7482-4b73-95c4-7f78da4da911", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"/root/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: org.mariadb.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack: /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'
...
Same issue here.
I used the workaround from here and it works, i.e.:
Copy the driver file to the {logstash install dir}/logstash-core/lib/jars/ directory. These jars get added to the correct JDK classpath as Logstash is started via Java.
And:
Change the jdbc_driver_library value in the Logstash conf to "", i.e. jdbc_driver_library => "", otherwise the code still tries to load the jar separately.
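A sketch of that workaround with the paths from this question (assuming the Logstash install dir is /usr/share/logstash):
# copy the driver into Logstash's own jar directory
sudo cp /usr/share/java/mariadb-java-client.jar /usr/share/logstash/logstash-core/lib/jars/
and in the jdbc input, leave the library setting empty so the plugin picks the jar up from the classpath:
jdbc {
  jdbc_driver_library => ""
  jdbc_driver_class => "org.mariadb.jdbc.Driver"
  # ... connection string, user, password, statement as before
}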

logstash custom patterns don't get resolved

I'm trying to set up an environment for grok debugging and made this with a Docker container.
Everything works fine until Logstash tries to resolve a custom pattern.
Here is my environment.
I start the container with:
docker run -it --name logstash_debug \
  -v /home/cloud/docker-elk/logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml \
  -v /home/cloud/docker-elk/logstash/pipeline/:/usr/share/logstash/pipeline/ \
  -v /home/cloud/docker-elk/logstash/patterns/:/usr/share/logstash/patterns \
  docker.elastic.co/logstash/logstash:7.2.0
As I said, Logstash starts up and loads the pipeline (debug.conf):
input { stdin {} }
filter {
  grok {
    patterns_dir => ["/usr/share/logstash/patterns"]
    match => ["message", "%{YEAR1} \[%{LOGLEVEL:loglvl}\] %{GREEDYDATA:message}"]
  }
  date {
    match => ["customer_time", "${YEAR1}"]
    target => "@timestamp"
  }
}
output { stdout { codec => rubydebug } }
and gives me this error:
Cannot evaluate ${YEAR1}. Replacement variable YEAR1 is not defined in a Logstash secret store or as an Environment entry and there is no default value given.
The patterns_dir contains a file "dateformats", which contains (stripped down to a minimum):
YEAR1 %{YEAR}
The Logstash debug output gives me this:
[DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_dir = ["/usr/share/logstash/patterns"]
[DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@match = {"message"=>"%{YEAR1} \[%{LOGLEVEL:loglvl}\] %{GREEDYDATA:message}"}
.....
[DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
Normally Logstash should be able to pick up this file (I even started the container with --user 0 to be sure I have no permission problem), but it somehow can't.
Can anyone give me a hint as to what's going on?
Thanks and cheers,
Wurzelseppi

How to implement unit or integration tests for a Logstash configuration?

With Logstash 1.2.1 one can now use conditionals to do various things. Even an earlier version's conf file can get complicated if one is managing many log files and implementing metric extraction.
After looking at this comprehensive example, I really wondered to myself: how can I detect any breakage in this configuration?
Any ideas?
For a syntax check, there is --configtest:
java -jar logstash.jar agent --configtest --config <yourconfigfile>
To test the logic of the configuration you can write rspec tests. This is an example rspec file to test a haproxy log filter:
require "test_utils"
describe "haproxy logs" do
extend LogStash::RSpec
config <<-CONFIG
filter {
grok {
type => "haproxy"
add_tag => [ "HTTP_REQUEST" ]
pattern => "%{HAPROXYHTTP}"
}
date {
type => 'haproxy'
match => [ 'accept_date', 'dd/MMM/yyyy:HH:mm:ss.SSS' ]
}
}
CONFIG
sample({'#message' => '<150>Oct 8 08:46:47 syslog.host.net haproxy[13262]: 10.0.1.2:44799 [08/Oct/2013:08:46:44.256] frontend-name backend-name/server.host.net 0/0/0/1/2 404 1147 - - ---- 0/0/0/0/0 0/0 {client.host.net||||Apache-HttpClient/4.1.2 (java 1. 5)} {text/html;charset=utf-8|||} "GET /app/status HTTP/1.1"',
'#source_host' => '127.0.0.1',
'#type' => 'haproxy',
'#source' => 'tcp://127.0.0.1:60207/',
}) do
insist { subject["#fields"]["backend_name"] } == [ "backend-name" ]
insist { subject["#fields"]["http_request"] } == [ "/app/status" ]
insist { subject["tags"].include?("HTTP_REQUEST") }
insist { subject["#timestamp"] } == "2013-10-08T06:46:44.256Z"
reject { subject["#timestamp"] } == "2013-10-08T06:46:47Z"
end
end
This will, based on a given filter configuration, run input samples and test if the expected output is produced.
To run the test, save it as haproxy_spec.rb and run logstash rspec:
java -jar logstash.jar rspec haproxy_spec.rb
There are lots of spec examples in the Logstash source repository.
Since Logstash has been upgraded, the command is now something like this (pointing at the config folder):
/opt/logstash/bin/logstash agent --configtest -f /etc/logstash/logstash-indexer/conf.d
If you see a warning but the error messages are mixed together and you can't tell which file has the issue, you have to check the files one by one.
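A quick way to do that one-by-one check is a small shell loop over the same folder (a sketch, using the paths from the command above):
for f in /etc/logstash/logstash-indexer/conf.d/*.conf; do
  echo "== $f =="
  /opt/logstash/bin/logstash agent --configtest -f "$f"
done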
