Kibana Index Pattern showing wrong results - logstash

I am using the ELK stack, with the JDBC input in Logstash.
I have created 2 indexes:
users
employees
Both indexes have one column in common, objid.
Logstash config file:
input {
  jdbc {
    jdbc_driver_library => "/opt/application/cmt/ELK/logstash-5.3.0/ojdbc14.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:#xx.xxx.xx.xx:xxxx:abc"
    jdbc_user => "xxxx"
    jdbc_password => "xxxxx"
    schedule => "*/2 * * * *"
    statement => "select * from table_employee"
  }
}
output {
  elasticsearch {
    index => "employees"
    document_type => "employee"
    document_id => "%{objid}"
    hosts => "xx.xxx.xxx.xx:9200"
  }
}
input {
  jdbc {
    jdbc_driver_library => "/opt/application/cmt/ELK/logstash-5.3.0/ojdbc14.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:#xx.xxx.xx.xx:xxxx:abc"
    jdbc_user => "xx"
    jdbc_password => "xxxxxxx"
    schedule => "*/2 * * * *"
    statement => "select A.OBJID,A.LOGIN_NAME,A.STATUS,A.USER_ACCESS2PRIVCLASS,A.USER_DEFAULT2WIPBIN,A.SUPVR_DEFAULT2MONITOR,A.USER2RC_CONFIG,A.OFFLINE2PRIVCLASS,A.WIRELESS_EMAIL from table_user a where A.STATUS=1"
  }
}
output {
  elasticsearch {
    index => "users"
    document_type => "user"
    document_id => "%{objid}%{login_name}"
    hosts => "xx.xxx.xxx.xx:9200"
  }
}
The 1st JDBC input ('employees') returns 26935 records.
The 2nd JDBC input ('users') returns 10619 records.
Common records: 9635 (objid matches).
1st problem: when I create an index pattern in Kibana as 'users', it shows a count of 37554. Why? It should show only 10619.
2nd problem: when I create an index pattern as 'employees', it shows a count of 27919. Why? It should show only 26935.
Also, I have created a different document_id for the index 'users': %{objid}%{login_name}.

If your users and employees inputs and outputs are in the same file / executed at the same time, as your example shows, you need to use conditionals to route your data to the correct Elasticsearch index. Logstash concatenates your file(s) into one pipeline, so all of your inputs run through all of the filters/outputs, which is likely why you're getting unexpected results (note that 26935 + 10619 = 37554, the count you see for users). See this discussion.
You will need to do something like this:
input {
  jdbc {
    statement => "SELECT * FROM users"
    type => "users"
  }
}
input {
  jdbc {
    statement => "SELECT * FROM employees"
    type => "employees"
  }
}
output {
  if [type] == "users" {
    elasticsearch {
      index => "users"
      document_type => "user"
      document_id => "%{objid}%{login_name}"
      hosts => "xx.xxx.xxx.xx:9200"
    }
  }
  if [type] == "employees" {
    elasticsearch {
      index => "employees"
      document_type => "employee"
      document_id => "%{objid}"
      hosts => "xx.xxx.xxx.xx:9200"
    }
  }
}

Related

How to transfer data from Elasticsearch to SQLite?

For a Node.js project with Elasticsearch I have to change the DBMS to SQLite. I chose Sequelize ORM for the Node.js models.
I was able to create a new SQLite database and model; however, I don't know how to convert the data stored in Elasticsearch to SQLite.
I believe the best option would be to use Logstash, with JDBC as the output. For example:
input {
  elasticsearch {
    hosts => "https://localhost:9200/"
    user => "yourUser"
    password => "yourPassword"
    index => "yourIndex"
    query => '{"query":{"match_all": {}}}'
    scroll => "5m"
    size => "5000"
  }
}
output {
  jdbc {
    jdbc_connection_string => ""
    jdbc_user => ""
    jdbc_password => ""
    jdbc_validate_connection => true
    jdbc_driver_library => ""
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    statement => "INSERT INTO table_name VALUES (value1, value2, value3, ...)" # just a query example
  }
}

logstash add array in document

I would like to import data from my PostgreSQL database into my Elasticsearch database.
I have an appointments index, and in this index I would like to add a persons field (the list of people in an appointment).
Here is my Logstash configuration file and a sample document.
Thank you.
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/app"
    jdbc_user => "postgres"
    jdbc_password => "admin"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_driver_library => "postgresql-42.2.21.jar"
    statement => "select id::text,trim(firstname),trim(lastname) from persons"
  }
}
filter {
  ruby {
    code => "
      event['persons'].each{|subdoc| subdoc['persons'] = subdoc['persons']['firstname']}
    "
  }
}
output {
  #stdout { codec => json_lines }
  elasticsearch {
    hosts => "127.0.0.1"
    index => "appointments"
    doc_as_upsert => true
    document_id => "%{id}"
  }
}
{
  "_index" : "appointments",
  "_type" : "_doc",
  "_id" : "41",
  "_score" : 1.0,
  "_source" : {
    ... other fields
    [add array fields]
    ex:
    persons: [{
      "firstname": "firstname1"
    }, {
      "firstname": "firstname2"
    }]
  }
}
UPDATE 2:
I made a mistake: I was modifying the wrong document. I changed the document_id and added appointment_id to my query.
It still does not work; it replaces my document with what is in the query.
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/app"
    jdbc_user => "postgres"
    jdbc_password => "admin"
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_driver_library => "postgresql-42.2.21.jar"
    statement => "select id::text, appointment_id::text,trim(firstname),trim(lastname) from appointments_persons order by created_at"
  }
}
filter {
  aggregate {
    task_id => "%{appointment_id}"
    code => "
      map['persons'] ||= []
    "
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "appointment_id"
    timeout => 10
  }
}
output {
  #stdout { codec => json_lines }
  elasticsearch {
    hosts => "127.0.0.1"
    index => "appointments"
    action => update
    doc_as_upsert => true
    document_id => "%{appointment_id}"
  }
}
Unless you are running a very old version of Logstash (prior to 5.0) you cannot reference or modify the event by treating it as a hash.
The jdbc input creates one event for each row in the result set. If you want to combine all of the events that have the same [id] you could use an aggregate filter. Note the warning about setting pipeline.workers to 1 so that all events go through the same instance of the filter. In your case I do not think you need to preserve event order, so pipeline.ordered can be ignored.
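For reference, a sketch of how that setting is usually applied (standard Logstash options, nothing specific to this question): either in logstash.yml
pipeline.workers: 1
or on the command line
bin/logstash -w 1 -f your_pipeline.conf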
You will need to do something similar to example 3 in the documentation.
aggregate {
  task_id => "%{id}"
  code => '
    map["persons"] ||= []
    map["persons"] << { "firstname" => event.get("firstname"), "lastname" => event.get("lastname") }
  '
  push_map_as_event_on_timeout => true
  timeout_task_id_field => "id"
  timeout => 10
}
If you are using document_id => "%{appointment_id}", the event read from the database will be written to Elasticsearch with that document id. Then, when the aggregate timeout fires, a second document will overwrite it. You might want to add event.cancel to the aggregate code option so that the event from the db does not cloud things.
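For example, a minimal sketch of that variant (the same aggregate as above, keyed on appointment_id to match the update output; the per-row event is cancelled so only the timed-out map gets indexed):
aggregate {
  task_id => "%{appointment_id}"
  code => '
    map["persons"] ||= []
    map["persons"] << { "firstname" => event.get("firstname"), "lastname" => event.get("lastname") }
    event.cancel
  '
  push_map_as_event_on_timeout => true
  timeout_task_id_field => "appointment_id"
  timeout => 10
}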

Stripe how to retrieve a Checkout Session's line items quantity?

Create the payment Session:
$session = \Stripe\Checkout\Session::create([
    'payment_method_types' => ['card'], //, 'fpx','alipay'
    'line_items' => [[
        'price_data' => [
            'product_data' => [
                'name' => "Topup USDT Wallet",
                'images' => ["https://abc-uaha.co/uploads/site_logo/site_logo_20210321130054.png"],
                'metadata' => [
                    'pro_id' => "USD".$_GET['price']/100
                ]
            ],
            'unit_amount' => $_GET['price'],
            'currency' => 'usd',
        ],
        'quantity' => 1,
        'description' => "Spartan Capital",
    ]],
    'mode' => 'payment',
    'success_url' => STRIPE_SUCCESS_URL.'?session_id={CHECKOUT_SESSION_ID}',
    'cancel_url' => STRIPE_CANCEL_URL,
]);
Refer to these docs: https://stripe.com/docs/api/checkout/sessions/line_items
I tried to retrieve the quantity from the session:
try {
    $checkout_session = \Stripe\Checkout\Session::retrieve([
        'id' => $session_id,
        'expand' => ['line_items'],
    ]);
} catch (Exception $e) {
    $api_error = $e->getMessage();
}
$line_items = $session->line_items[0].quantity;
echo $line_items; // it shows nothing, how to make it output "1"?
line_items are no longer included by default when retrieving Checkout Sessions. To get them in your retrieve call, you need to expand the line_items property.
You have two errors:
You are missing a layer (the list's data array) and using dot notation instead of PHP's arrow syntax. The second error is using $session instead of $checkout_session. So it should be:
$quantity = $checkout_session->line_items->data[0]->quantity;
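Putting it together, a minimal sketch (assuming $session_id holds the Checkout Session id and the API key has already been set):
try {
    // line_items is not included by default, so expand it on retrieve
    $checkout_session = \Stripe\Checkout\Session::retrieve([
        'id' => $session_id,
        'expand' => ['line_items'],
    ]);

    // line_items is a list object; the rows live under ->data
    $quantity = $checkout_session->line_items->data[0]->quantity;
    echo $quantity; // 1 for the session created above
} catch (Exception $e) {
    $api_error = $e->getMessage();
}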

Find keys that contain a specific key-value pair in a Puppet Hash

I am still a beginner with Puppet, so please bear with me. Let's assume I have this hash created in Puppet through some module:
account = {
  user#desktop1 => {
    owner => john,
    type => ssh-rsa,
    public => SomePublicKey
  },
  user#desktop2 => {
    owner => mary,
    type => ssh-rsa,
    public => SomePublicKey
  },
  user#desktop3 => {
    owner => john,
    type => ssh-rsa,
    public => SomePublicKey
  },
  user#desktop4 => {
    owner => matt,
    type => ssh-rsa,
    public => SomePublicKey
  }
}
How can I find the keys for a specific key-value pair inside the hash? In this case, for example, I want to find all the keys owned by john. The expected result would be something like:
[user#desktop1, user#desktop3]
Thanks in advance.
The question asks about how to do this in Puppet, although, confusingly, the Hash is a Ruby Hash and the question also has a Ruby tag.
Anyway, this is how you do it in Puppet:
$account = {
  'user#desktop1' => {
    'owner' => 'john',
    'type' => 'ssh-rsa',
    'public' => 'SomePublicKey',
  },
  'user#desktop2' => {
    'owner' => 'mary',
    'type' => 'ssh-rsa',
    'public' => 'SomePublicKey',
  },
  'user#desktop3' => {
    'owner' => 'john',
    'type' => 'ssh-rsa',
    'public' => 'SomePublicKey',
  },
  'user#desktop4' => {
    'owner' => 'matt',
    'type' => 'ssh-rsa',
    'public' => 'SomePublicKey',
  }
}
$users = $account.filter |$k, $v| { $v['owner'] == 'john' }.keys
notice($users)
Puppet applying that leads to:
Notice: Scope(Class[main]): [user#desktop1, user#desktop3]
In Ruby, you can use Hash#select (https://ruby-doc.org/core-2.5.1/Hash.html#method-i-select):
account.select { |key, value| value['owner'] == 'john' }.keys
Another option is Enumerable#each_with_object:
account.each_with_object([]) { |(k, v), a| a << k if v['owner'] == 'john' }
#=> ["user#desktop1", "user#desktop3"]
This supposes the keys and values are Strings.

Yii2- How to add a search box in grid view

I am new to Yii 2.
I have a grid view on my index page in which some entries are displayed.
<?= GridView::widget([
    'dataProvider' => $dataProvider,
    'filterModel' => $searchModel,
    'columns' => [
        ['class' => 'yii\grid\SerialColumn'],
        //'meter_id',
        [
            'label' => 'Meter MSN',
            'value' => function ($d) {
                return $d->meter->meter_msn;
            },
            // 'filter' => Html::activeDropDownList($searchModel, 'meter_id', \app\models\Meters::toArrayList(), ['prompt' => "All Meters", 'class' => 'form-control']),
        ],
        'imsi',
        'telecom',
        'status',
        [
            'label' => 'Created By',
            'value' => function ($data) {
                if (is_object($data))
                    return $data->created->name;
                return ' - ';
            },
            //'filter' => Html::activeDropDownList($searchModel, 'created_by', \app\models\User::toArrayList(), ['prompt' => "Created By", 'class' => 'form-control']),
        ],
        'comments',
        'historic',
        ['class' => 'yii\grid\ActionColumn'],
    ],
]); ?>
Now I want to add a search box for Meter MSN. In the code above the filter is commented out, so the column works, but I don't want a drop-down; I want a search box instead.
Below is my search class:
public function search($params)
{
    $query = MetersInventoryStore::find();

    // add conditions that should always apply here

    $dataProvider = new ActiveDataProvider([
        'query' => $query,
    ]);

    $this->load($params);

    if (!$this->validate()) {
        // uncomment the following line if you do not want to return any records when validation fails
        // $query->where('0=1');
        return $dataProvider;
    }

    // grid filtering conditions
    $query->andFilterWhere([
        'id' => $this->id,
        'meter_id' => $this->meter_id,
        'created_by' => $this->created_by,
        'updated_by' => $this->updated_by,
        'created_at' => $this->created_at,
        'updated_at' => $this->updated_at,
        'store_id' => $this->store_id,
        'meter_serial' => $this->meter_serial,
        // 'historic' => $this->historic,
        'status' => 'SIM Installed',
    ])
    // ->orFilterWhere(['status' => 'Communication Failed'])
    ;

    // $query->andFilterWhere(['like', 'meter_serial', $this->meter_serial])
    //     ->andFilterWhere(['like', 'meter_id', $this->meter_id]);

    $query->orderBy(['id' => SORT_DESC]);

    return $dataProvider;
}
How can I place a search box in it? The generated search class sets up the search functionality by default, but my MSN value comes from a function, so I have no idea how to add a search box for it.
Any help would be highly appreciated.
To add a filter field for a calculated column, you should add a public property to your search model and declare it as safe:
public $your_column;

// declare it as safe
public function rules()
{
    return [
        ...
        [['your_column'], 'safe'],
    ];
}

public function search($params)
{
    $query = MetersInventoryStore::find();
    ...
Then refer to your_column in the GridView:
...
'columns' => [
    ['class' => 'yii\grid\SerialColumn'],
    //'meter_id',
    [
        'attribute' => 'your_column',
        'label' => 'Meter MSN',
        'value' => function ($d) {
            return $d->meter->meter_msn;
        },
    ],
Finally, in your searchModel you must extend the filter conditions so the calculated column is handled properly, based on the filter value that was passed; see the sketch below.
You can find a sample in this tutorial: http://www.yiiframework.com/wiki/621/filter-sort-by-calculated-related-fields-in-gridview-yii-2-0/
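For example, a minimal sketch of that last step (assuming the relation is named meter, the related table is meters, and the column is meter_msn; these names are illustrative, adjust them to your schema):
public function search($params)
{
    $query = MetersInventoryStore::find();
    // join the relation so the related column can be filtered on
    $query->joinWith(['meter']);

    $dataProvider = new ActiveDataProvider([
        'query' => $query,
    ]);

    $this->load($params);
    if (!$this->validate()) {
        return $dataProvider;
    }

    // ... keep the existing andFilterWhere() conditions here ...

    // filter on the related table's column with the value typed into the search box
    $query->andFilterWhere(['like', 'meters.meter_msn', $this->your_column]);

    return $dataProvider;
}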
