Elasticsearch Bulk Data Insert Error Handling - node.js

I'm storing my data in Elasticsearch and I insert it with Node.js scripts. Sometimes errors occur and the data isn't inserted into Elasticsearch successfully. I want to handle those errors. How can I do that? Thanks for the help.
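A minimal sketch of per-item error handling with the official @elastic/elasticsearch client (v8 API assumed; the index name and the retry strategy are placeholders):

const { Client } = require('@elastic/elasticsearch')
const client = new Client({ node: 'http://localhost:9200' })

async function bulkInsert(docs) {
  // One action line per document, followed by the document itself.
  const operations = docs.flatMap(doc => [{ index: { _index: 'my-index' } }, doc])
  const response = await client.bulk({ refresh: true, operations })

  // The bulk API returns HTTP 200 even when individual items fail,
  // so check the errors flag and inspect each item.
  if (response.errors) {
    const failed = []
    response.items.forEach((item, i) => {
      if (item.index && item.index.error) {
        failed.push({ doc: docs[i], status: item.index.status, error: item.index.error })
      }
    })
    console.error('Failed to index', failed.length, 'documents:', failed)
    // Retry, log, or dead-letter the failed documents here.
  }
}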

Related

Elasticsearch sometimes throws an exception while saving logs

While saving logs to Elasticsearch, the logs usually save fine, but sometimes ES throws this exception:
[mapper_parsing_exception] object mapping for [meta.user_details.permissions] tried to parse field [null] as object, but found a concrete value
I found a solution online saying I need to delete the index and then reindex, but I cannot do that.
Is there any other solution?
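One workaround that avoids reindexing is to normalize the conflicting field before each document is sent. A minimal sketch, assuming the documents can be modified client-side (the field check mirrors the mapping error above):

function normalizeLog(doc) {
  const details = doc.meta && doc.meta.user_details
  // The index maps permissions as an object, so a concrete (scalar) value
  // here triggers mapper_parsing_exception. Drop the field when it isn't an object.
  if (details && details.permissions !== undefined &&
      (typeof details.permissions !== 'object' || details.permissions === null)) {
    delete details.permissions
  }
  return doc
}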

Knex + SQL Server whereIn query takes 8-12s -- raw version returns NO results, but if I input the .toQuery() result directly I get results

The database is in Azure and not currently used in production. There are 80,000 rows, and a uprn is a VARCHAR(100).
I'm already using Joi to validate each UPRN as well.
I'm using Knex with a SQL Server database and the following whereIn query:
knex(LOCATIONS.table).whereIn(LOCATIONS.uprn, req.body.uprns)
but this takes 8-12s to complete and sometimes times out. If I run the output of .toQuery() in SSMS, the result comes back within 1-2 seconds.
If I build a raw query, the resulting .toQuery() or .toString() output works in SSMS and returns results. But if I try to use the raw query directly, it returns 0 results.
I'm looking to either fix what's making whereIn so slow or get the raw query working.
EDIT 1:
After much debugging and testing, it seems the bug is due to how Knex deals with arrays, so I made a for-of loop that adds one ? placeholder per array element and then passed the array in as the bindings (see the sketch after this edit).
This led me to realize the performance issue is due to the way SQL Server parameterises queries.
I ended up building a raw query string with all of the parameters and validating the input with the following Joi string/regex config:
Joi.string()
.min(1)
.max(35)
.regex(/^[a-z\d\-_\s]+$/i)
allowing only alphanumeric characters, dashes, underscores and spaces, which should prevent SQL injection.
I'm going to look deeper into the security implications of this, and I might create a separate login that can only SELECT data from that table and nothing more to run these queries with.
In the end I needed to just handle it raw and validate separately.
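A minimal sketch of the placeholder approach described above (the table and column names are placeholders; the real ones come from the LOCATIONS constants):

const placeholders = req.body.uprns.map(() => '?').join(', ')
// Bind the whole array to the generated placeholders instead of letting
// whereIn build the parameter list.
const result = await knex.raw(
  `SELECT * FROM locations WHERE uprn IN (${placeholders})`,
  req.body.uprns
)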

How to fix logstash lock error with input-jdbc-plugin?

I'm using Logstash with the input-jdbc plugin and the output-elasticsearch plugin. When I try to sync massive amounts of data from PostgreSQL to Elasticsearch, I get the error message below:
[WARN ][logstash.shutdownwatcher ] {"inflight_count"=>0, "stalling_thread_info"=>{"other"=>[{"thread_id"=>22, "name"=>"[main]<jdbc", "current_call"=>"[...]/vendor/bundle/jruby/1.9/gems/rufus-scheduler-3.0.9/lib/rufus/scheduler.rb:170:in `join'"}, {"thread_id"=>20, "name"=>"[main]>worker0", "current_call"=>"[...]/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:138:in `lock'"}]}}
I'd like to know why it cannot cope with massive data and how to fix it. Thanks a lot.
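One commonly suggested mitigation for large result sets is to enable paging in the jdbc input, so Logstash doesn't try to hold the whole result set at once. A minimal sketch (connection details and paths are placeholders; the page sizes are illustrative):

input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "postgres"
    jdbc_driver_library => "/path/to/postgresql.jar"
    jdbc_driver_class => "org.postgresql.Driver"
    statement => "SELECT * FROM big_table"
    # Fetch the result set in pages instead of all at once, which can
    # exhaust memory and stall the pipeline on large tables.
    jdbc_paging_enabled => true
    jdbc_page_size => 50000
    jdbc_fetch_size => 1000
  }
}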

Heroku - NodeJS - Send json with data from postgres database

I'm starting out with Heroku, Node.js and a Postgres database.
I have several tables named table1, table2, table3, ... table27.
All tables have the same structure of 10 columns.
I'm trying to build a web service that sends all the data from these tables as JSON.
I tried several ways, using async modules or without, but without success. I fetched some unwanted data (basically, the database structure). For example:
JSON-result:
{"command":"SELECT","rowCount":0,"oid":null,"rows":[],"fields":[{"name":"id","tableID":945497,"columnID":1,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"code_event","tableID":945497,"columnID":2,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"version","tableID":945497,"columnID":3,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"used","tableID":945497,"columnID":4,"dataTypeID":23,"dataTypeSize":4,"dataTypeModifier":-1,"format":"text"},{"name":"reference1","tableID":945497,"columnID":5,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference2","tableID":945497,"columnID":6,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference3","tableID":945497,"columnID":7,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference4","tableID":945497,"columnID":8,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"reference5","tableID":945497,"columnID":9,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label1","tableID":945497,"columnID":10,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label2","tableID":945497,"columnID":11,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label3","tableID":945497,"columnID":12,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label4","tableID":945497,"columnID":13,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"label5","tableID":945497,"columnID":14,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":104,"format":"text"},{"name":"export_action","tableID":945497,"columnID":15,"dataTypeID":1043,"dataTypeSize":-1,"dataTypeModifier":54,"format":"text"}],"_parsers":[null,null,null,null,null,null,null,null,null,null,null,null,null,null,null],"rowAsArray":false}
I tried the code from this question:
Node js call function, that access mysql database and returns json result, multiple times
How can I get a proper JSON with the following shape?
{table1: {row1, row2...}, table2: {row1, row2...}, ...}
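What leaked into the output above is the full node-postgres Result object; only its rows property holds the data. A minimal sketch, assuming Express and the pg Pool (the table names and the route are placeholders):

const express = require('express')
const { Pool } = require('pg')

const app = express()
const pool = new Pool({ connectionString: process.env.DATABASE_URL })

app.get('/data', async (req, res) => {
  const tableNames = Array.from({ length: 27 }, (_, i) => `table${i + 1}`)
  // Run all queries in parallel and wait for every result.
  const results = await Promise.all(
    tableNames.map(name => pool.query(`SELECT * FROM ${name}`))
  )
  const payload = {}
  tableNames.forEach((name, i) => {
    payload[name] = results[i].rows  // .rows, not the whole Result object
  })
  res.json(payload)
})

app.listen(process.env.PORT || 3000)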

Resource Conflict after syncing with PouchDB

I am new to CouchDB / PouchDB and so far I have somehow managed to get everything started. I am using the couchdb-python library to send initial values to my CouchDB before I start developing the actual application. I have one database with templates of the data I want to include, and the actual database of all the data I will use in the application.
couch = couchdb.Server()
templates = couch['templates']
couch.delete('data')
data = couch.create('data')
In Python I have a loop in which I send one value after another to CouchDB:
value = templates['Template01']
value.update({ '_id' : 'Some ID' })
value.update({'Other Attribute': 'Some Value'})
...
data.save(value)
It was working fine the whole time; I needed to run this several times as my data had to be adjusted. After I was satisfied with the results, I started to create my application in JavaScript. I synced PouchDB with the data database, and that was also working. However, I found out that I needed to change something in the Python code, so I ran the first Python script again, but now I get this error:
couchdb.http.ResourceConflict: (u'conflict', u'Document update conflict.')
I tried to destroy() the PouchDB database and delete the CouchDB data database as well, but I still get this error at this part of the code:
data.save(value)
What I also don't understand is that a few values are actually written to the database before this error appears, so some values do get save()d into the db.
I read that it has something to do with the _rev values of the documents, but I can't find an answer. I hope someone can help here.
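The usual cause of a document update conflict is writing to an _id that already exists without supplying its latest _rev. A minimal upsert sketch of that idea on the PouchDB side (the same pattern applies to couchdb-python):

async function upsert(db, doc) {
  try {
    // Reuse the latest revision so the write doesn't conflict with a
    // revision created by an earlier run or by sync.
    const existing = await db.get(doc._id)
    doc._rev = existing._rev
  } catch (err) {
    if (err.status !== 404) throw err  // 404 just means the doc is new
  }
  return db.put(doc)
}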
