Logstash stopped working when a new config file was added to sync Postgres data

input {
jdbc {
jdbc_connection_string => “jdbc:postgresql://localhost:5432/FraudInsuranceBizapp”
jdbc_user => “postgres”
jdbc_password => “postgres”
jdbc_driver_class => “org.postgresql.Driver”
statement => “SELECT * from \”jhi_user\””
}
}
output {
elasticsearch {
hosts => [“http://localhost:9200"]
index => “jhi_user”
document_id => “users_%{id}”
doc_as_upsert => true
#user => “es_user”
#password => “es_password”
}
}
When I run Logstash from the command line with .\bin\logstash.bat -f
D:\MESCN\softwares\logstash-7.13.2\config\jhi-user.conf, I get
the error below:
Using JAVA_HOME defined java: "D:\MESCN\softwares\logstash-7.13.2\jdk"
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future
release.
Sending Logstash logs to D:/MESCN/softwares/logstash-7.13.2/logs which is now configured via log4j2.properties
[2021-06-24T10:11:35,655][INFO ][logstash.runner ] Log4j configuration path used is:
D:\MESCN\softwares\logstash-7.13.2\config\log4j2.properties
[2021-06-24T10:11:35,689][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.13.2",
"jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK
64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [mswin32-x86_64]"}
[2021-06-24T10:11:35,989][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file
because modules or command line options are specified
[2021-06-24T10:11:38,460][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-06-24T10:11:38,776][ERROR][logstash.agent ] Failed to execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Expected one of
[ \\t\\r\\n], \"#\", [A-Za-z0-9_-], '\"', \"'\", [A-Za-z_], \"-\",
[0-9], \"[\", \"{\" at line 3, column 32 (byte 51) after input {\r\n
jdbc {\r\n jdbc_connection_string => ",
:backtrace=>["D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/compiler.rb:32:in
`compile_imperative'",
"org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'",
"org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'",
"D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/java_pipeline.rb:47:in
`initialize'",
"D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/pipeline_action/create.rb:52:in
`execute'",
"D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/agent.rb:389:in
`block in converge_state'"]}
[2021-06-24T10:11:39,047][INFO ][logstash.runner ] Logstash shut down.
[2021-06-24T10:11:39,071][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747)
~[jruby-complete-9.2.16.0.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710)
~[jruby-complete-9.2.16.0.jar:?]
at D_3a_.MESCN.softwares.logstash_minus_7_dot_13_dot_2.lib.bootstrap.environment.<main>(D:\MESCN\softwares\logstash-7.13.2\lib\bootstrap\environment.rb:89)
~[?:?]

jdbc_connection_string => “jdbc:postgresql://localhost:5432/FraudInsuranceBizapp”
Logstash will not accept curly (typographic) quotes; you need to use regular straight double quotes. The same applies to all of the options in the input and output sections.
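For reference, here is the same config with straight ASCII quotes throughout; it should parse cleanly (all values are unchanged from the question):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/FraudInsuranceBizapp"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from \"jhi_user\""
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "jhi_user"
    document_id => "users_%{id}"
    doc_as_upsert => true
  }
}
```

Note that the jdbc input usually also needs a jdbc_driver_library option pointing at the PostgreSQL JDBC jar, unless the driver is already on Logstash's classpath.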

Related

Getting an error when trying to import a CSV file using Logstash 8.1

I'm getting the error below when executing this command in my cmd prompt:
C:\Program Files\Elk Stack\logstash-8.1.2\bin>logstash -f ./logstach.conf
The contents of logstach.conf:
file => "C:/Users/babus/Downloads/archive/tips.csv"
start_position => "beginning"
sincedb_path => "NULL"
}
filter {
date{
match => [ "timestamp", "MMM dd HH:mm:ss"]
}
csv {
separator =>","
columns => ["total_bill", "tip","sex","smoker","day","time","size"]
}
}
output {
elasticsearch {
hosts=> "localhost:9200"
index => "tips"
}
stdout {}
}
Below is the error I'm getting:
logstash -f ./logstach.conf
"Using bundled JDK: C:\Program Files\Elk Stack\logstash-8.1.2\jdk\bin\java.exe"
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to C:/Program Files/Elk Stack/logstash-8.1.2/logs which is now configured via log4j2.properties
[2022-04-18T13:37:58,080][INFO ][logstash.runner ] Log4j configuration path used is: C:\Program Files\Elk Stack\logstash-8.1.2\config\log4j2.properties
[2022-04-18T13:37:58,095][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.1.2", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.14.1+1 on 11.0.14.1+1 +indy +jit [mswin32-x86_64]"}
[2022-04-18T13:37:58,100][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-04-18T13:37:58,248][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-04-18T13:38:02,284][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-04-18T13:38:02,578][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"{\" at line 2, column 6 (byte 15) after input {\r\nfile ", :backtrace=>["C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:189:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/pipeline_action/create.rb:50:in `execute'", "C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/agent.rb:376:in `block in converge_state'"]}
[2022-04-18T13:38:02,763][INFO ][logstash.runner ] Logstash shut down.
[2022-04-18T13:38:02,775][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby.jar:?]
at C_3a_.Program_20_Files.Elk_20_Stack.logstash_minus_8_dot_1_dot_2.lib.bootstrap.environment.<main>(C:\Program Files\Elk Stack\logstash-8.1.2\lib\bootstrap\environment.rb:94) ~[?:?]
Your config is missing the input section, and those options need to go inside a file input block, with the path given via the path option rather than file:
input {
file {
path => "C:/Users/babus/Downloads/archive/tips.csv"
start_position => "beginning"
sincedb_path => "NULL"
}
}
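One further point worth checking (my assumption is that the intent was to disable sincedb tracking): sincedb_path => "NULL" just creates a file literally named NULL. On Windows the null device is NUL, so the input would be:

```
input {
  file {
    path => "C:/Users/babus/Downloads/archive/tips.csv"
    start_position => "beginning"
    sincedb_path => "NUL"  # use "/dev/null" on Linux
  }
}
```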

How can Logstash be executed? (error occurred)

I am trying to run Logstash on AWS Linux (Ubuntu 20.04), but an error occurs on execution.
Elasticsearch is installed and runs successfully.
Kibana is installed and runs successfully.
Logstash is installed, but fails with an error.
My .conf file:
input {
jdbc {
clean_run => true
jdbc_driver_library => "/usr/share/java/mysql-connector-java-8.0.23.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
jdbc_connection_string => "jdbc:mysql://AWSLINK:3306/schema_name?useSSL=false&user=root&password=1234"
jdbc_user => "root"
jdbc_password => "1234"
schedule => "* * * * *"
statement => "select * from schema_name"
}
}
output {
elasticsearch {
hosts => 52.188.20.167:9200"
index => "AWS_DB_0514"
}
stdout {
codec => rubydebug
}
}
I run Logstash on Linux with the following command:
./logstash -f test.conf --path.settings /etc/logstash/
The error below occurs:
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2021-05-14T08:37:16,025][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-05-14T08:37:16,039][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.12.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [linux-x86_64]"}
[2021-05-14T08:37:16,466][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-05-14T08:37:17,524][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-05-14T08:37:18,048][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [0-9], [ \\t\\r\\n], \"#\", \"}\" at line 16, column 24 (byte 608) after output {\n elasticsearch {\n hosts => 52.188", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:184:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:389:in `block in converge_state'"]}
[2021-05-14T08:37:18,165][INFO ][logstash.runner ] Logstash shut down.
[2021-05-14T08:37:18,177][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.13.0.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.13.0.jar:?]
at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]
There is an error on line 16.
Incorrect:
hosts => 52.188.20.167:9200"
Correct:
hosts => "52.188.20.167:9200"
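For completeness, the corrected output section in full. Note also that Elasticsearch index names must be lowercase, so "AWS_DB_0514" would be rejected at indexing time; the lowercase rename below is my assumption about the intent:

```
output {
  elasticsearch {
    hosts => ["52.188.20.167:9200"]
    index => "aws_db_0514"
  }
  stdout {
    codec => rubydebug
  }
}
```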

logstash Error: com.mariadb.jdbc.Driver not loaded

I'm trying to sync some tables from MariaDB to Elasticsearch using Logstash.
I'm on a Debian Buster (10) Server
$ java -version
openjdk version "11.0.4" 2019-07-16
OpenJDK Runtime Environment (build 11.0.4+11-post-Debian-1deb10u1)
OpenJDK 64-Bit Server VM (build 11.0.4+11-post-Debian-1deb10u1, mixed mode, sharing)
$ mariadb --version
mariadb Ver 15.1 Distrib 10.3.15-MariaDB, for debian-linux-gnu (x86_64) using readline 5.2
$ /usr/share/logstash/bin/logstash --version
logstash 7.2.0
I tried different connectors:
$ ls -l /usr/share/java/
mariadb-java-client.jar
$ ls -l /etc/logstash/connectors/
mariadb-java-client-2.1.2.jar
mariadb-java-client-2.2.6.jar
mariadb-java-client-2.3.0.jar
mariadb-java-client-2.4.2.jar
mysql-connector-java-8.0.17.jar
Using "org.mariadb.jdbc.Driver" for mariadb connectors and "com.mysql.cj.jdbc.Driver" for mysql connector
$ cat /etc/logstash/conf.d/db-fr-bank.conf
input {
jdbc {
jdbc_connection_string => "jdbc:mariadb://localhost:3306/db_fr"
jdbc_user => "logstash"
jdbc_password => "<password>"
jdbc_driver_library => "/usr/share/java/mariadb-java-client.jar"
jdbc_driver_class => "org.mariadb.jdbc.Driver"
statement => "SELECT * FROM bank"
}
}
output {
elasticsearch {
hosts => "localhost:9200"
index => "fr-bank"
}
}
But, instead of syncing, I keep getting:
$ /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/db-fr-bank.conf
...
[ERROR] 2019-07-29 02:08:17.563 [[main]<jdbc] jdbc - Failed to load /usr/share/java/mariadb-java-client.jar {:exception=>#<TypeError: failed to coerce jdk.internal.loader.ClassLoaders$AppClassLoader to java.net.URLClassLoader>}
[ERROR] 2019-07-29 02:08:17.598 [[main]<jdbc] javapipeline - A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"logstash", jdbc_password=><password>, statement=>"SELECT * FROM bank", jdbc_driver_library=>"/usr/share/java/mariadb-java-client.jar", jdbc_connection_string=>"jdbc:mariadb://localhost:3306/db_fr", id=>"38a6d112755a5e87278761cf5f41b7e509212d1d02837a03672df2face00943a", jdbc_driver_class=>"org.mariadb.jdbc.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_f3094292-7482-4b73-95c4-7f78da4da911", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 UTC}, last_run_metadata_path=>"/root/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: org.mariadb.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack: /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'
/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'
...
Same issue here.
I use the workaround from here with this and it works.
i.e.:
Copy the driver file to {logstash install dir}/logstash-core/lib/jars/ directory. These jars get added to the correct JDK classpath as logstash is started via java.
And change the jdbc_driver_library value in the Logstash conf to an empty string, i.e. jdbc_driver_library => "", otherwise the code still tries to load the jar separately.
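Putting the two steps together, the workaround looks roughly like this (the jar version and install paths below are examples; adjust to your setup):

```
# 1. Copy the driver jar into Logstash's own jars directory, e.g.:
#    cp /etc/logstash/connectors/mariadb-java-client-2.4.2.jar /usr/share/logstash/logstash-core/lib/jars/
# 2. Leave jdbc_driver_library empty so the plugin does not try to load the jar itself:
input {
  jdbc {
    jdbc_connection_string => "jdbc:mariadb://localhost:3306/db_fr"
    jdbc_user => "logstash"
    jdbc_password => "<password>"
    jdbc_driver_library => ""
    jdbc_driver_class => "org.mariadb.jdbc.Driver"
    statement => "SELECT * FROM bank"
  }
}
```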

Logstash HTTPD_COMBINEDLOG not defined error

Getting an error when starting Logstash with the Apache combined log grok filter.
Config File:
input {
file {
path => "/u/agrawalo/logstash-5.4.0/event-data/apache_access.log"
start_position => "beginning"
}
http {
}
}
filter {
grok {
match => { "message" => "%{HTTPD_COMBINEDLOG}" }
}
}
output {
stdout {
codec => rubydebug
}
}
Command used to start logstash:
bin/logstash -f config/pipelines/apacheauto.conf --config.reload.automatic
Error:
Sending Logstash's logs to /u/agrawalo/logstash-5.4.0/logs which is now configured via log4j2.properties
04:18:45.723 [[main]-pipeline-manager] ERROR logstash.pipeline - Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x7bfa005e #id=\"498367beab653b0a3133b16fc4dcef59f08886de-3\", #klass=LogStash::Filters::Grok, #metric_events=#<LogStash::Instrument::NamespacedMetric:0x684a02d #metric=#<LogStash::Instrument::Metric:0x68e13c68 #collector=#<LogStash::Instrument::Collector:0x7fe7de03 #agent=nil, #metric_store=#<LogStash::Instrument::MetricStore:0x5434c951 #store=#<Concurrent::Map:0x77929e32 #default_proc=nil>, #structured_lookup_mutex=#<Mutex:0x16f1fed4>, #fast_lookup=#<Concurrent::Map:0x57273dcf #default_proc=nil>>>>, #namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :\"498367beab653b0a3133b16fc4dcef59f08886de-3\", :events]>, #logger=#<LogStash::Logging::Logger:0x462b61a2 #logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x4941bd9c>>, #filter=<LogStash::Filters::Grok match=>{\"message\"=>\"%{HTTPD_COMBINEDLOG}\"}, id=>\"498367beab653b0a3133b16fc4dcef59f08886de-3\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{HTTPD_COMBINEDLOG} not defined"}
04:18:45.731 [[main]-pipeline-manager] ERROR logstash.agent - Pipeline aborted due to error {:exception=>#<Grok::PatternError: pattern %{HTTPD_COMBINEDLOG} not defined>, :backtrace=>["/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `compile'", "org/jruby/RubyKernel.java:1479:in `loop'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:274:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:269:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:264:in `register'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:268:in `register_plugin'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:289:in `start_workers'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:214:in `run'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}
04:18:46.405 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
Output of 'ls' command on logstash installation directory
agrawalo@abc:~/logstash-5.4.0> ls
CHANGELOG.md CONTRIBUTORS Gemfile Gemfile.jruby-1.9.lock LICENSE NOTICE.TXT bin config data event-data lib logstash-core logstash-core-plugin-api output.txt vendor
After further debugging I found that the httpd patterns file is missing:
agrawalo@abc:~/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns> ls
aws bacula bro exim firewalls grok-patterns haproxy java junos linux-syslog mcollective mcollective-patterns mongodb nagios postgresql rails redis ruby
Questions:
Why is this pattern missing?
How can I include or install this pattern in an existing installation of Logstash?
I was able to resolve this by updating the version of Logstash.
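If upgrading is not an option: the patterns-core 4.0.2 bundle shipped with Logstash 5.4.0 predates the HTTPD_* pattern names, and the equivalent pattern there is (to my knowledge) COMBINEDAPACHELOG, so the filter can likely be rewritten as:

```
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
```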

LogStash::ConfigurationError but Configuration OK

I verified Logstash config:
root@learn-elk:/etc/logstash/conf.d# /opt/logstash/bin/logstash -t /etc/logstash/conf.d/
Configuration OK
but I am still getting an error and the pipeline aborts with:
==> /var/log/logstash/logstash.log <==
{:timestamp=>"2016-10-22T17:48:28.391000+0000", :message=>"Pipeline aborted due to error", :exception=>"LogStash::ConfigurationError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
{:timestamp=>"2016-10-22T17:48:31.424000+0000", :message=>"stopping pipeline", :id=>"main"}
After running Logstash with '-v --debug --verbose' I got much more information:
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 1
Registering file input {:path=>["/opt/logstash/GOOG.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_0a3b7d0b4841f166ec450717c6ce4124", :path=>["/opt/logstash/GOOG.csv"], :level=>:info}
Pipeline aborted due to error {:exception=>"LogStash::ConfigurationError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
Closing inputs {:level=>:info}
Closed inputs {:level=>:info}
After fixing the logstash { hosts => ["localhost"] } vs { host => localhost } issue, I consolidated the config into the single file below and used stdout instead of elasticsearch:
input {
  file {
    path => "/opt/logstash/GOOG.csv"
    start_position => "beginning"
    type => google
  }
}
filter {
  if [type] == "google" {
    csv {
      columns => ["date_of_record","open","high","low","close","volume","adj_close"]
      separator => ","
    }
    date {
      match => ["date_of_record","yyyy-MM-dd"]
    }
    mutate {
      convert => ["open","float"]
      convert => ["high","float"]
      convert => ["low","float"]
      convert => ["close","float"]
      convert => ["volume","integer"]
      convert => ["adj_close","float"]
    }
  }
}
output {
  stdout {
  }
}
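A plausible explanation for "Configuration OK" followed by a runtime ConfigurationError (this is my reading, not confirmed by the logs): in the configtest command above, the config path was passed as a bare argument instead of via -f, so Logstash may have validated an empty configuration. Re-testing with -f should surface the real error:

```
/opt/logstash/bin/logstash -t -f /etc/logstash/conf.d/
```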
