After adding a config file with a MySQL JDBC driver, Logstash no longer sends data to Elasticsearch - logstash

Config files with PostgreSQL queries already existed in the Logstash configuration directory. After I added a config file with a MySQL query, the data stops arriving in Elasticsearch after some time; tcpdump on the outgoing port also shows no traffic, even though requests to other servers are still being made. There are no errors in the logs. With debug enabled, the log only shows that the config is re-read, and nothing else.
On another server, where this config is the only one, Logstash works fine.
Where could the error be? Where should I look? Please advise.
input {
jdbc {
jdbc_driver_library => "/etc/logstash/mysql-connector-java-5.1.46-bin.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
jdbc_connection_string => "jdbc:mysql://xx.xx.xx.xx:3306/database"
jdbc_user => "user"
jdbc_password => "*************"
schedule => "0-59 * * * *"
statement => "SELECT * FROM `database`.table WHERE calldate > :sql_last_value"
tracking_column => "calldate"
tracking_column_type => "timestamp"
use_column_value => true
add_field => { "typetable_id" => "table" }
}
}
output {
if [typetable_id] == "table" {
elasticsearch {
hosts => "xx.xx.xx.xx:9200"
index => "data_index"
user => "elastic"
password => "***********"
}
}
}
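One thing worth checking (my assumption, not something the question confirms): when several config files in one Logstash instance use the jdbc input with :sql_last_value, they all share the default last-run state file ($HOME/.logstash_jdbc_last_run) unless each one sets last_run_metadata_path, so the inputs can interfere with each other. A minimal sketch giving the MySQL input its own state file (the path is an example; any writable location works):
input {
  jdbc {
    # ... same MySQL driver/connection/schedule settings as above ...
    statement => "SELECT * FROM `database`.table WHERE calldate > :sql_last_value"
    use_column_value => true
    tracking_column => "calldate"
    tracking_column_type => "timestamp"
    # keep this input's :sql_last_value state separate from the PostgreSQL inputs
    last_run_metadata_path => "/var/lib/logstash/.jdbc_last_run_mysql"
  }
}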

Related

ORA-01882: timezone region not found - jdbc_input_logstash plugin error

Oracle DB: 11.2.0.4
OJDBC version: ojdbc6.jar
JDK: openjdk 1.8
LogStash version: 6.3.2-1
I am receiving the following error in the Logstash error log: [ERROR][logstash.inputs.jdbc ] Unable to connect to database. Tried 1 times {:error_message=>"Java::JavaSql::SQLException: ORA-00604: error occurred at recursive SQL level 1\nORA-01882: timezone region not found\n"}
Logstash code:
input{
jdbc{
# jdbc_default_timezone => "Asia/Kolkata"
jdbc_driver_library => "/var/lib/logstash/OJDBC-Full/ojdbc6.jar"
jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
jdbc_connection_string => "jdbc:oracle:thin:@xxxx:port:sid"
jdbc_user => "xxxx"
jdbc_password => "xxxx"
jdbc_validate_connection => true
statement => "select count(*) from apps.icx_sessions icx join apps.fnd_user usr on usr.user_id=icx.user_id left join apps.fnd_responsibility resp on resp.responsibility_id=icx.responsibility_id where last_connect>sysdate-nvl(FND_PROFILE.VALUE('ICX_SESSION_TIMEOUT'),30)/60/24 and disabled_flag != 'Y' and pseudo_flag = 'N' and USER_NAME <> 'GUEST'"
type => "xxx_RPT_DB_Session_query"
schedule => "*/2 * * * *"
}
}
filter{
}
output{
file{
path => "/var/log/logstash/sample-JDBC-%{+YYYY-MM-dd}.txt"
}
elasticsearch{
hosts => ["xxxxxxxxx:7778"]
index => "q_session"
}
http{
format => "json"
http_method => "post"
url => "https://api.telegram.org/bot629711229:AAFDebywi4NDiSdqqHhmxTFlUH7cMUJwwvE/sendMessage"
mapping => {
"chat_id" => "xxxxx"
"parse_mode" => "html"
"text" => "❗ Current Session Count 😱"
}
}
}
Had the same problem; I solved it by adding a line to the Logstash jvm.options:
-Duser.timezone="+01:00"
Of course, you have to replace the +01:00 with your own timezone.
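As a variation on the same idea (a sketch, not something this answer tested): ORA-01882 is usually about the timezone the JVM hands to the Oracle driver, so jvm.options can either pin a named region or use the commonly suggested driver property that stops a region being sent at all:
# logstash jvm.options (path depends on your install, e.g. /etc/logstash/jvm.options)
# pin the JVM to a named zone instead of an offset ...
-Duser.timezone=Asia/Kolkata
# ... or tell the Oracle JDBC driver not to send a timezone region
-Doracle.jdbc.timezoneAsRegion=false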

Unknown setting 'host' for elasticsearch in logstash conf

My logstash.conf file and mysql-connector-java-5.1.38.jar are both in the F:\Software\logstash-5.5.1\logstash-5.5.1\bin location.
I am getting the below error while running the conf file in cmd:
F:\Software\logstash-5.5.1\logstash-5.5.1\bin>logstash -f logstash.conf
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console.
Sending Logstash's logs to F:/Software/logstash-5.5.1/logstash-5.5.1/logs which is now configured via log4j2.properties
[2017-08-03T16:01:17,142][ERROR][logstash.outputs.elasticsearch] Unknown setting 'host' for elasticsearch
[2017-08-03T16:01:17,149][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Something is wrong with your configuration."}
Below is my conf file:
input {
jdbc {
# MySql jdbc connection string to our database, testdb
jdbc_connection_string => "jdbc:mysql://localhost:3306/testdb"
# The user we wish to execute our statement as
jdbc_user => "root"
jdbc_password => "root"
# The path to our downloaded jdbc driver
jdbc_driver_library => "F:/Software/logstash-5.5.1/logstash-5.5.1/bin/mysql-connector-java-5.1.38/mysql-connector-java-5.1.38.jar"
# The name of the driver class for MySQL
jdbc_driver_class => "com.mysql.jdbc.Driver"
# our query
statement => "SELECT * from testtable"
}
}
output {
stdout { codec => json_lines }
elasticsearch{
hosts => ["localhost:9200"]
protocol => "http"
index => "test-migrate"
document_type => "data"
}
}
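For context (my reading of the error, not stated in the question): the elasticsearch output dropped the old host and protocol settings when Logstash 2.x made the plugin HTTP-only, so on 5.5.1 the output should only use hosts. A minimal output section along those lines:
output {
  stdout { codec => json_lines }
  elasticsearch {
    # 'hosts' (an array) replaces the removed 'host' and 'protocol' settings
    hosts => ["localhost:9200"]
    index => "test-migrate"
    document_type => "data"
  }
}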
This is a standard, basic conf file.
Good luck!
input {
jdbc {
jdbc_driver_library => "/mysql-connector-java-5.1.44-bin.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
jdbc_connection_string => "jdbc:mysql://localhost:3306/yourdatabase"
jdbc_user => "root"
jdbc_password => ""
statement => "SELECT * from yourtable"
}
}
output {
#stdout {codec => rubydebug}
elasticsearch {
hosts => "localhost:9200"
index => "inandi"
document_type => "name"
document_id => "%{name}"
}
}

Logstash Kafka output

I'm trying to send some "messages" to Kafka using Logstash.
My problem is that the literal string "%{message}" is sent rather than the actual message.
Here is my config:
input {
jdbc {
jdbc_connection_string => "jdbc:mysql://127.0.0.1:3306/xxx"
jdbc_user => "xxx"
jdbc_password => "xxx"
jdbc_driver_library => "mysql-connector-java-5.1.41.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
schedule => "* * * * *"
statement => "SELECT * from a WHERE updated_at > :sql_last_value order by updated_at"
use_column_value => true
tracking_column => "updated_at"
}
}
output {
kafka {
codec => plain {
format => "%{message}"
}
topic_id => "mytopic"
}
file {
codec => json_lines
path => "/tmp/output_a.log"
}
}
As I mentioned above, when I dig into the Kafka messages I see "%{message}" and not the result of the select. If I open /tmp/output_a.log, the result is there.
Any suggestions?
$bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic mytopic --from-beginning
%{message}
2017-06-21T15:14:00.336Z %{host} %{message} (<- I tried here to remove the codec)
%{message}
Solved with:
kafka {
codec => json_lines
topic_id => "mytopic"
}
If you want to avoid the JSON overhead, try using just the line codec.
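A sketch of that (the column names are assumptions; the jdbc input emits one field per selected column rather than a message field, which is why %{message} never resolves):
output {
  kafka {
    # build the payload from real columns of the SELECT; 'id' and 'updated_at' are examples
    codec => line { format => "%{id} %{updated_at}" }
    topic_id => "mytopic"
  }
}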

logstash nil import errors

I'm getting some errors attempting to do a data import in Logstash. I'm seeing it for every "geo" field that I have. Here are some of my config files.
input {
jdbc {
jdbc_driver_library => "c:\binaries\driver\ojdbc6.jar"
jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
jdbc_connection_string => "jdbc:oracle:thin:@random:1521/random"
jdbc_user => "user"
jdbc_password => "password"
statement => "select a.*, myfunc() as geo from foo a"
type => "sometype"
}
}
filter{
if [type] == "sometype" {
mutate {
rename => { "sometype_id" => "id" }
remove_field => ["gdo_geometry"]
add_field => [ "display", "%{id}" ]
}
# parses string to json
json{
source => "geo"
target => "geometry"
}
}
}
output {
if [type] == "sometype" {
elasticsearch {
hosts => ["myesbox:80"]
document_id => "%{id}"
index => "sjw"
}
}
}
Here is a second.
input {
jdbc {
jdbc_driver_library => "c:\binaries\driver\ojdbc6.jar"
jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
jdbc_connection_string => "jdbc:oracle:thin:@random:1521/random"
jdbc_user => "user"
jdbc_password => "password"
statement => "select a.*, myfunc() as geo from foo2 a"
type => "sometype2"
}
}
filter{
if [type] == "sometype2" {
mutate {
rename => { "sometype2_id" => "id" }
remove_field => ["gdo_geometry"]
add_field => [ "display", "%{id}" ]
}
# parses string to json
json{
source => "geo"
target => "geometry"
}
}
}
output {
if [type] == "sometype2" {
elasticsearch {
hosts => ["myesbox:80"]
document_id => "%{id}"
index => "sjw"
}
}
}
And here is the error message (repeated once for each record in my database tables).
{:timestamp=>"2016-01-05T13:33:18.258000-0800", :message=>"Trouble parsing json", :source=>"geo", :raw=>nil, :exception=>java.lang.ClassCastException: org.jruby.RubyNil cannot be cast to org.jruby.RubyIO, :level=>:warn}
Now, interestingly, the field DOES seem to import successfully; I can see the data populated as expected. But I don't know why this warning is being generated. I'm running Logstash as
logstash -f /my/logstash/dir
Also interesting to note: if I modify the first config file and change the json filter's source field name to "geom" instead of "geo", the warning no longer occurs. It only seems to occur when I have multiple config files with the same field/json filter combination. If I then add a third config file with a "geo" field parsed by the json filter, the issue occurs again, though I still see no warnings for the first config file, only the second and third.
The issue here actually turned out to be a bug with the 2.0 version of logstash. I'm not sure what exactly the problem was, but upgrading to 2.1 resolved the issue for me.
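For anyone who cannot upgrade, a hedged workaround (assuming the warnings really do come from rows where the geo column is null) is to run the json filter only when the field is actually present:
filter {
  if [type] == "sometype" and [geo] {
    # parse only when the geo column came back non-null
    json {
      source => "geo"
      target => "geometry"
    }
  }
}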

Logstash SQL Server Data Import

input {
jdbc {
jdbc_driver_library => "sqljdbc4.jar"
jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
jdbc_connection_string => "jdbc:sqlserver://192.168.2.126\\SQLEXPRESS2014:1433;databaseName=test"
jdbc_password => "sa#sa2015"
schedule => "0 0-59 0-23 * * *"
statement => "SELECT ID , Name, City, State,ShopName FROM dbo.Shops"
jdbc_paging_enabled => "true"
jdbc_page_size => "50000"
}
}
filter {
}
output {
stdout { codec => rubydebug }
elasticsearch {
protocol => "http"
index => "shops"
document_id => "%{id}"
}
}
I want to import data into Elasticsearch using Logstash with a JDBC SQL Server input, but I am getting an error that the class path is not correct.
Does anybody know where the sqljdbc file should be placed, relative to the config file, so that Logstash can connect?
I think the path to the "sqljdbc4.jar" file is not correct. Here is the config I am using to query data from a SQL db into Elasticsearch (logstash.conf):
input {
jdbc {
jdbc_driver_library => "D:\temp\sqljdbc\sqljdbc_4.2\enu\sqljdbc42.jar"
jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
jdbc_connection_string => "jdbc:sqlserver://DBSVR_NAME;user=****;password=****;"
jdbc_user => "****"
jdbc_password => "****"
statement => "SELECT *
FROM [DB].[SCHEMA].[TABLE]"
}
}
filter {
}
output {
elasticsearch {
hosts => "localhost"
index => "INDEX_NAME"
document_type => "DOCUMENT_TYPE"
document_id => "%{id}"
protocol => "http"
}
stdout { codec => rubydebug }
}
I downloaded the Microsoft JDBC Driver for SQL Server from here:
"https://msdn.microsoft.com/en-us/sqlserver/aa937724.aspx"
Extracted the files to the path specified in "jdbc_driver_library"
Then I ran the plugin command: "plugin install logstash-input-jdbc" to install the logstash input jdbc plugin.
And finally, ran Logstash: "logstash -f logstash.conf".
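One note from me (an assumption about the reader's version, not part of the original answer): on Logstash 5.x and later the plugin script was renamed, so the install step would be:
bin/logstash-plugin install logstash-input-jdbc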
As an aside: I am also using Elasticsearch.Net in a .Net service app to refresh the data
"http://nest.azurewebsites.net/"
And this vid: "Adding Elasticsearch To An Existing .NET / SQL Server Application" "https://www.youtube.com/watch?v=sv-MflnT9qI" discusses using a Service Broker queue to get the data out of SQL. We are currently exploring this as an option.
Edit - Updated host to hosts as in documentation here https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-hosts
input {
jdbc {
jdbc_driver_library => "C:\Program Files\Microsoft JDBC Driver 6.0 for SQL Server\sqljdbc_6.0\enu\jre8\sqljdbc42.jar"
jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
jdbc_connection_string => "jdbc:sqlserver://[SERVER NAME];databaseName=[DATABASE NAME];"
jdbc_user => "[USERNAME]"
jdbc_password => "[PASSWORD]"
statement => "SELECT eventId, sessionId FROM Events;"
}
}
output {
elasticsearch {
hosts => "http://localhost:9200"
index => "events3"
}
stdout { codec => rubydebug }
}
You need to download the sqljdbc drivers from https://www.microsoft.com/en-au/download/details.aspx?id=11774, and wherever you unzip those drivers, give that path in jdbc_driver_library. Try to unzip the drivers into the same path as shown in the code.
Do it like this:
input {
jdbc {
jdbc_driver_library => "sqljdbc4.jar"
jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
jdbc_connection_string => "jdbc:sqlserver://192.168.2.126:1433;databaseName=test"
jdbc_user => "your_user_here"
jdbc_password => "sa#sa2015"
schedule => "0 0-59 0-23 * * *"
statement => "SELECT ID , Name, City, State,ShopName FROM dbo.Shops"
jdbc_paging_enabled => "true"
jdbc_page_size => "50000"
}
}
filter {
}
output {
stdout { codec => rubydebug }
elasticsearch {
protocol => "http"
index => "shops"
document_id => "%{id}"
hosts => "your_host_here"
}
}
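If you would rather keep the named SQLEXPRESS2014 instance than hard-code port 1433, the Microsoft JDBC driver also accepts an instanceName property in the URL (a sketch; it relies on the SQL Server Browser service being reachable):
input {
  jdbc {
    # connect by instance name instead of a fixed port; other settings stay as above
    jdbc_connection_string => "jdbc:sqlserver://192.168.2.126;instanceName=SQLEXPRESS2014;databaseName=test"
  }
}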
