Getting an error when trying to import a CSV file using Logstash 8.1

I'm getting the error below when executing the following command in my command prompt:
C:\Program Files\Elk Stack\logstash-8.1.2\bin>logstash -f ./logstach.conf
logstach.conf:
input {
file => "C:/Users/babus/Downloads/archive/tips.csv"
start_position => "beginning"
sincedb_path => "NULL"
}
filter {
date{
match => [ "timestamp", "MMM dd HH:mm:ss"]
}
csv {
separator =>","
columns => ["total_bill", "tip","sex","smoker","day","time","size"]
}
}
output {
elasticsearch {
hosts=> "localhost:9200"
index => "tips"
}
stdout {}
}
Below is the error I'm getting:
logstash -f ./logstach.conf
"Using bundled JDK: C:\Program Files\Elk Stack\logstash-8.1.2\jdk\bin\java.exe"
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to C:/Program Files/Elk Stack/logstash-8.1.2/logs which is now configured via log4j2.properties
[2022-04-18T13:37:58,080][INFO ][logstash.runner ] Log4j configuration path used is: C:\Program Files\Elk Stack\logstash-8.1.2\config\log4j2.properties
[2022-04-18T13:37:58,095][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.1.2", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.14.1+1 on 11.0.14.1+1 +indy +jit [mswin32-x86_64]"}
[2022-04-18T13:37:58,100][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2022-04-18T13:37:58,248][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-04-18T13:38:02,284][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-04-18T13:38:02,578][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", \"{\" at line 2, column 6 (byte 15) after input {\r\nfile ", :backtrace=>["C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:189:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'", "C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/pipeline_action/create.rb:50:in `execute'", "C:/Program Files/Elk Stack/logstash-8.1.2/logstash-core/lib/logstash/agent.rb:376:in `block in converge_state'"]}
[2022-04-18T13:38:02,763][INFO ][logstash.runner ] Logstash shut down.
[2022-04-18T13:38:02,775][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby.jar:?]
at C_3a_.Program_20_Files.Elk_20_Stack.logstash_minus_8_dot_1_dot_2.lib.bootstrap.environment.<main>(C:\Program Files\Elk Stack\logstash-8.1.2\lib\bootstrap\environment.rb:94) ~[?:?]

Your config doesn't have a proper input section: the file input is a block, and the file path goes in its path option, not in a file => setting. It should look like this:
input {
file {
path => "C:/Users/babus/Downloads/archive/tips.csv"
start_position => "beginning"
sincedb_path => "NULL"
}
}
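For reference, a minimal end-to-end pipeline combining that input with the csv filter and output from the question might look like the sketch below. The date filter is left out here because the CSV columns listed don't include a timestamp field, and on Windows the documented way to disable the sincedb is sincedb_path => "NUL" rather than "NULL".
input {
  file {
    path => "C:/Users/babus/Downloads/archive/tips.csv"
    start_position => "beginning"
    # "NUL" is the Windows null device; use "/dev/null" on Linux/macOS
    sincedb_path => "NUL"
  }
}
filter {
  csv {
    separator => ","
    columns => ["total_bill", "tip", "sex", "smoker", "day", "time", "size"]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "tips"
  }
  stdout {}
}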

Related

Logstash stopped working when a new config file was added to sync Postgres data

input {
jdbc {
jdbc_connection_string => “jdbc:postgresql://localhost:5432/FraudInsuranceBizapp”
jdbc_user => “postgres”
jdbc_password => “postgres”
jdbc_driver_class => “org.postgresql.Driver”
statement => “SELECT * from \”jhi_user\””
}
}
output {
elasticsearch {
hosts => [“http://localhost:9200"]
index => “jhi_user”
document_id => “users_%{id}”
doc_as_upsert => true
#user => “es_user”
#password => “es_password”
}
}
And when I run Logstash from the command line with .\bin\logstash.bat -f D:\MESCN\softwares\logstash-7.13.2\config\jhi-user.conf, I get the error below:
Using JAVA_HOME defined java: "D:\MESCN\softwares\logstash-7.13.2\jdk"
WARNING, using JAVA_HOME while Logstash distribution comes with a bundled JDK
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future
release.
Sending Logstash logs to D:/MESCN/softwares/logstash-7.13.2/logs which is now configured via log4j2.properties
[2021-06-24T10:11:35,655][INFO ][logstash.runner ] Log4j configuration path used is:
D:\MESCN\softwares\logstash-7.13.2\config\log4j2.properties
[2021-06-24T10:11:35,689][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.13.2",
"jruby.version"=>"jruby 9.2.16.0 (2.5.7) 2021-03-03 f82228dc32 OpenJDK
64-Bit Server VM 11.0.11+9 on 11.0.11+9 +indy +jit [mswin32-x86_64]"}
[2021-06-24T10:11:35,989][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file
because modules or command line options are specified
[2021-06-24T10:11:38,460][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-06-24T10:11:38,776][ERROR][logstash.agent ] Failed to execute action
{:action=>LogStash::PipelineAction::Create/pipeline_id:main,
:exception=>"LogStash::ConfigurationError", :message=>"Expected one of
[ \\t\\r\\n], \"#\", [A-Za-z0-9_-], '\"', \"'\", [A-Za-z_], \"-\",
[0-9], \"[\", \"{\" at line 3, column 32 (byte 51) after input {\r\n
jdbc {\r\n jdbc_connection_string => ",
:backtrace=>["D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/compiler.rb:32:in
`compile_imperative'",
"org/logstash/execution/AbstractPipelineExt.java:187:in `initialize'",
"org/logstash/execution/JavaBasePipelineExt.java:72:in `initialize'",
"D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/java_pipeline.rb:47:in
`initialize'",
"D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/pipeline_action/create.rb:52:in
`execute'",
"D:/MESCN/softwares/logstash-7.13.2/logstash-core/lib/logstash/agent.rb:389:in
`block in converge_state'"]}
[2021-06-24T10:11:39,047][INFO ][logstash.runner ] Logstash shut down.
[2021-06-24T10:11:39,071][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747)
~[jruby-complete-9.2.16.0.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710)
~[jruby-complete-9.2.16.0.jar:?]
at D_3a_.MESCN.softwares.logstash_minus_7_dot_13_dot_2.lib.bootstrap.environment.<main>(D:\MESCN\softwares\logstash-7.13.2\lib\bootstrap\environment.rb:89)
~[?:?]
jdbc_connection_string => “jdbc:postgresql://localhost:5432/FraudInsuranceBizapp”
Logstash will not accept curly quotes; you need to use regular straight double quotes. The same applies to all of the options in the input and output sections.
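For example, the same input and output rewritten with straight quotes would look roughly like this (a sketch mirroring the question's own settings, so the connection details, credentials, and SQL statement are the asker's):
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/FraudInsuranceBizapp"
    jdbc_user => "postgres"
    jdbc_password => "postgres"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * from \"jhi_user\""
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "jhi_user"
    document_id => "users_%{id}"
    doc_as_upsert => true
  }
}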

How can logstash be executed? (error occurred)

I'm trying to run Logstash on AWS Linux (Ubuntu 20.04). Elasticsearch and Kibana are installed and run successfully, but Logstash fails with an error when executed.
My .conf file:
input {
jdbc {
clean_run => true
jdbc_driver_library => "/usr/share/java/mysql-connector-java-8.0.23.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
jdbc_connection_string => "jdbc:mysql://AWSLINK:3306/schema_name?useSSL=false&user=root&password=1234"
jdbc_user => "root"
jdbc_password => "1234"
schedule => "* * * * *"
statement => "select * from schema_name"
}
}
output {
elasticsearch {
hosts => 52.188.20.167:9200"
index => "AWS_DB_0514"
}
stdout {
codec => rubydebug
}
}
I execute Logstash on Linux with this command:
./logstash -f test.conf --path.settings /etc/logstash/
but the error below occurs:
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2021-05-14T08:37:16,025][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-05-14T08:37:16,039][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.12.0", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [linux-x86_64]"}
[2021-05-14T08:37:16,466][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-05-14T08:37:17,524][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2021-05-14T08:37:18,048][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [0-9], [ \\t\\r\\n], \"#\", \"}\" at line 16, column 24 (byte 608) after output {\n elasticsearch {\n hosts => 52.188", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:184:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:389:in `block in converge_state'"]}
[2021-05-14T08:37:18,165][INFO ][logstash.runner ] Logstash shut down.
[2021-05-14T08:37:18,177][FATAL][org.logstash.Logstash ] Logstash stopped processing because of an error: (SystemExit) exit
org.jruby.exceptions.SystemExit: (SystemExit) exit
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.13.0.jar:?]
at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.13.0.jar:?]
at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]
There is an error on line 16.
Incorrect:
hosts => 52.188.20.167:9200"
Correct:
hosts => "52.188.20.167:9200"

Logstash: Nothing displayed on console (Mac)

I am trying to set up a very simple logstash config
input {
file {
path => "/path/to/my/log/file"
start_position => "beginning"
ignore_older => 0
}
}
filter {
}
output {
stdout {
codec => rubydebug
}
}
and here is how I start Logstash:
[logstash-7.1.1]$ bin/logstash -r -f log.conf
but here is all I see on the console:
Sending Logstash logs to path/to/logstash-7.1.1/logs which is now configured via log4j2.properties
[2019-05-28T13:22:57,294][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-28T13:22:57,313][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-05-28T13:23:02,904][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, :thread=>"#<Thread:0x7ad3cf30 run>"}
[2019-05-28T13:23:03,254][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"path/to/logstash-7.1.1/data/plugins/inputs/file/.sincedb_8164b23a475b43f1b0c9aba125f7f5cf", :path=>["/path/to/my/log/file"]}
[2019-05-28T13:23:03,284][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-05-28T13:23:03,355][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-05-28T13:23:03,360][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-28T13:23:03,703][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
I can see that
No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"path/to/logstash-7.1.1/data/plugins/inputs/file/.sincedb_8164b23a475b43f1b0c9aba125f7f5cf", :path=>["/path/to/my/log/file"]}
so the path seems correct. Also, my log file is not empty.
What am I doing wrong? Why can't I see the content of my log file on the console?
input {
file {
path => "/salaries.csv"
start_position => "beginning"
type => "data"
}
}
filter {
csv{
separator => ","
}
}
output {
stdout {
codec => rubydebug
}
}
This link may be helpful to you.
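One more thing worth checking in cases like this (not part of the answer above, just a common cause): the file input records how far it has read in a sincedb file, so re-running the pipeline on a file that was already read once produces no output even with start_position => "beginning". For testing, you can point the sincedb at the null device so nothing is remembered between runs:
input {
  file {
    path => "/path/to/my/log/file"
    start_position => "beginning"
    # don't persist the read position between runs (macOS/Linux null device)
    sincedb_path => "/dev/null"
  }
}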

Logstash HTTPD_COMBINEDLOG not defined error

Getting an error when starting Logstash with the Apache combined log filter.
Config File:
input {
file {
path => "/u/agrawalo/logstash-5.4.0/event-data/apache_access.log"
start_position => "beginning"
}
http {
}
}
filter {
grok {
match => { "message" => "%{HTTPD_COMBINEDLOG}" }
}
}
output {
stdout {
codec => rubydebug
}
}
Command used to start logstash:
bin/logstash -f config/pipelines/apacheauto.conf --config.reload.automatic
Error:
Sending Logstash's logs to /u/agrawalo/logstash-5.4.0/logs which is now configured via log4j2.properties
04:18:45.723 [[main]-pipeline-manager] ERROR logstash.pipeline - Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x7bfa005e #id=\"498367beab653b0a3133b16fc4dcef59f08886de-3\", #klass=LogStash::Filters::Grok, #metric_events=#<LogStash::Instrument::NamespacedMetric:0x684a02d #metric=#<LogStash::Instrument::Metric:0x68e13c68 #collector=#<LogStash::Instrument::Collector:0x7fe7de03 #agent=nil, #metric_store=#<LogStash::Instrument::MetricStore:0x5434c951 #store=#<Concurrent::Map:0x77929e32 #default_proc=nil>, #structured_lookup_mutex=#<Mutex:0x16f1fed4>, #fast_lookup=#<Concurrent::Map:0x57273dcf #default_proc=nil>>>>, #namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :\"498367beab653b0a3133b16fc4dcef59f08886de-3\", :events]>, #logger=#<LogStash::Logging::Logger:0x462b61a2 #logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x4941bd9c>>, #filter=<LogStash::Filters::Grok match=>{\"message\"=>\"%{HTTPD_COMBINEDLOG}\"}, id=>\"498367beab653b0a3133b16fc4dcef59f08886de-3\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{HTTPD_COMBINEDLOG} not defined"}
04:18:45.731 [[main]-pipeline-manager] ERROR logstash.agent - Pipeline aborted due to error {:exception=>#<Grok::PatternError: pattern %{HTTPD_COMBINEDLOG} not defined>, :backtrace=>["/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `compile'", "org/jruby/RubyKernel.java:1479:in `loop'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:274:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:269:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:264:in `register'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:268:in `register_plugin'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:289:in `start_workers'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:214:in `run'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}
04:18:46.405 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
Output of 'ls' command on logstash installation directory
agrawalo@abc:~/logstash-5.4.0> ls
CHANGELOG.md CONTRIBUTORS Gemfile Gemfile.jruby-1.9.lock LICENSE NOTICE.TXT bin config data event-data lib logstash-core logstash-core-plugin-api output.txt vendor
After further debugging I found that the httpd pattern file is missing:
agrawalo@abc:~/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns> ls
aws bacula bro exim firewalls grok-patterns haproxy java junos linux-syslog mcollective mcollective-patterns mongodb nagios postgresql rails redis ruby
Questions:
How come this pattern is missing?
How can I include or install this pattern in the existing installation of logstash?
I was able to resolve this by updating the version of Logstash.
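If upgrading isn't possible, older pattern bundles such as the logstash-patterns-core 4.0.2 shown above ship the same Apache combined-log pattern under the name COMBINEDAPACHELOG, so a filter along these lines should work without upgrading (a sketch, not the original answer):
filter {
  grok {
    # COMBINEDAPACHELOG is the older name for the Apache combined log pattern
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}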

LogStash::ConfigurationError but Configuration OK

I verified Logstash config:
root@learn-elk:/etc/logstash/conf.d# /opt/logstash/bin/logstash -t /etc/logstash/conf.d/
Configuration OK
but still getting error and pipeline aborted after
==> /var/log/logstash/logstash.log <==
{:timestamp=>"2016-10-22T17:48:28.391000+0000", :message=>"Pipeline aborted due to error", :exception=>"LogStash::ConfigurationError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
{:timestamp=>"2016-10-22T17:48:31.424000+0000", :message=>"stopping pipeline", :id=>"main"}
after running logstash with '-v --debug --verbose' I've got much more information:
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 1
Registering file input {:path=>["/opt/logstash/GOOG.csv"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_0a3b7d0b4841f166ec450717c6ce4124", :path=>["/opt/logstash/GOOG.csv"], :level=>:info}
Pipeline aborted due to error {:exception=>"LogStash::ConfigurationError", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:88:in `config_init'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/config/mixin.rb:72:in `config_init'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:79:in `initialize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:74:in `register'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:181:in `start_workers'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:136:in `run'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/agent.rb:491:in `start_pipeline'"], :level=>:error}
stopping pipeline {:id=>"main"}
Closing inputs {:level=>:info}
Closed inputs {:level=>:info}
After fixing the { hosts => ["localhost"] } vs { host => localhost } issue, I consolidated the config into the single file below and used stdout instead of elasticsearch.
input{
file{
path =>"/opt/logstash/GOOG.csv"
start_position =>"beginning"
type => google
}
}
filter{
if [type] == "google" {
csv{
columns =>
["date_of_record","open","high","low","close","volume","adj_close"]
separator => ","
}
date {
match => ["date_of_record","yyyy-MM-dd"]
}
mutate {
convert => ["open","float"]
convert => ["high","float"]
convert => ["low","float"]
convert => ["close","float"]
convert => ["volume","integer"]
convert => ["adj_close","float"]
}
}
output {
stdout {
}
}
