Alert for scripted field value - Linux

I am using a combination of Logstash 6.3.0, Elasticsearch 6.3.0 and Kibana 6.3.0. I have some fields in Kibana which are scripted.
I need to send an alert based on these values. I can send alerts for Elasticsearch fields using the Watcher plugin for Kibana.
How do I configure Kibana to send an alert based on scripted field values?
I am also using ElastAlert, if that opens up any options; a solution using ElastAlert is fine.

I don't think you can alert on scripted fields using ElastAlert. Scripted field values are computed at query time (in Kibana only) and are not indexed into Elasticsearch, so they cannot be queried, and ElastAlert only queries Elasticsearch directly.
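If the scripted field's logic can be reproduced at ingest time, one workaround (a sketch only, not part of the original answer) is to compute the value in Logstash so that it is indexed and therefore visible to ElastAlert. The field names start_ms, end_ms and duration_ms below are placeholders for whatever your scripted field actually calculates:
filter {
  # Hypothetical example: recreate the Kibana scripted-field calculation at ingest time
  # so the result is stored in Elasticsearch and can be matched by an ElastAlert rule.
  ruby {
    code => "event.set('duration_ms', event.get('end_ms').to_i - event.get('start_ms').to_i)"
  }
}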

Related

Custom metrics using grok with logstash

I'm trying to integrate some code into an existing ELK stack, and we're limited to using Filebeat + Logstash. I'd like a way to configure a grok filter that allows different developers to log messages in a pre-defined format, so that they can capture custom metrics and eventually build Kibana dashboards.
For example, one team might log the following messages:
metric_some.metric=2
metric_some.metric=5
metric_some.metric=3
And another team might log the following messages from another app:
metric_another.unrelated.value=17.2
metric_another.unrelated.value=14.2
Is there a way to configure a single grok filter that will capture everything after metric_ as a new field, along with the value? Everything I've read here seems to indicate that you need to know the field name ahead of time, but my goal is to be able to start logging new metrics without having to add or modify grok filters.
Note: I realize Metricbeat is probably a better solution here, but as we're integrating with an existing ELK cluster which we do not control, that's not an option for me.
As your messages seem to be a series of key-value pairs, you can use the kv filter instead of grok.
When using grok you need to name the destination field; with kv, the name of the destination field is taken from the key itself.
The following configuration should work for your case. It strips the metric_ marker from the line first, so that the kv filter uses everything after it as the field name:
filter {
  mutate {
    gsub => [ "message", "^metric_", "" ]
  }
  kv {
    source => "message"
  }
}
For the event metric_another.unrelated.value=17.2 your output will contain something like { "another.unrelated.value": "17.2" }
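To verify the behaviour quickly (assuming a local Logstash install), you can run an inline pipeline and paste a sample line on stdin; when no input or output is configured, Logstash defaults to stdin and stdout:
bin/logstash -e 'filter { mutate { gsub => [ "message", "^metric_", "" ] } kv { } }'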

Is it possible to save to the database without a PXGraph or a Screen?

An entry screen for these records is not needed, as all the records are generated automatically. Could this be done by using the DAC only?
The Graph/DAC logic is preferred as you get all of the framework freebies such as field defaulting and calculated formula fields.
You can, however, get around this using PXDatabase.Insert, PXDatabase.Update, or PXDatabase.Delete.
I use these for upgrade processes or bulk deletes of processing records. These calls do not require a graph to execute, but they ignore all DAC attributes, so any field defaulting, calculated values, etc. will not be applied.
If you search on PXDatabase in the Acumatica code browser you can find examples. Here is one from EmployeeMaint.Location_RowPersisted:
PXDatabase.Update<Location>(
    new PXDataFieldAssign("VAPAccountLocationID", _KeyToAbort),
    new PXDataFieldRestrict("LocationID", _KeyToAbort),
    PXDataFieldRestrict.OperationSwitchAllowed);
PXDataFieldAssign sets column values; PXDataFieldRestrict is your where condition.
It is best to find multiple examples of PXDatabase in Acumatica and to confirm your query results with a tool such as SQL Profiler, to make sure it is executing the statement you intend to run.
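For illustration only (MyProcessingRecord and its columns are placeholder names, not taken from the example above), a graph-less insert and bulk delete might look like this:
// Hypothetical DAC and columns; PXDatabase calls bypass DAC attributes,
// so no field defaulting or formula calculation runs here.
PXDatabase.Insert<MyProcessingRecord>(
    new PXDataFieldAssign("Description", "Created without a graph"),
    new PXDataFieldAssign("IsProcessed", false));
PXDatabase.Delete<MyProcessingRecord>(
    new PXDataFieldRestrict("IsProcessed", true));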
You can't use a DAC without a graph; all BQL queries require a PXGraph instance. The only way to save data without using BQL is to use ODBC or another ORM to connect directly to the database and make your changes there, but this is not a recommended approach, since it bypasses all of the business logic.
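As a sketch of the graph-based path both answers prefer (MyGraph, MyDAC and the Description field are placeholder names), a record can be saved from code without any screen by instantiating the graph directly:
// Hedged sketch: create a graph instance in code so that field defaulting,
// formulas and other DAC attributes still run, then persist through the cache.
var graph = PXGraph.CreateInstance<MyGraph>();
var cache = graph.Caches[typeof(MyDAC)];
var row = (MyDAC)cache.Insert(new MyDAC());
row.Description = "Created from code";
cache.Update(row);
graph.Actions.PressSave();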

Updating Solr Field during Post

I use the Simple Post tool to post PDF documents to Solr with PowerShell. Is there a way to post the document and update a field attribute simultaneously?
For example, my current command is:
java -Dauto -Dc=NameOfCore -Drecursive -jar /post.jar "/path/to/my/file.pdf"
Is there a parameter or argument I can add to the command to pass a value for a field?
I'm on Windows and using Solr 6.5.1.
Very Similar Question:
Updating Solr field while posting .pdf document in Windows
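The linked question points at the mechanism that should apply here as well: the post tool accepts a params system property whose contents are passed through to the update request, and the extracting handler honours literal.<fieldname> parameters. A hedged sketch (myfield is a placeholder that must exist in your schema, and the value must be URL-encoded):
java -Dauto -Dc=NameOfCore -Drecursive -Dparams="literal.myfield=myvalue" -jar /post.jar "/path/to/my/file.pdf"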

Find logs which do not contain a specified field in Kibana

I use ELK to manage my logs. The logs come from many places, and some records might not contain certain fields. What is the best way to find such records? Can I find logs which do not contain several specific fields?
Kibana uses the query string query syntax of Elasticsearch in its filters. The syntax to find a document that does not have a given field is _missing_:FIELD_NAME. The + operator is the preferred way to require that a particular search term be present in the matched documents. Combining the two lets you search for documents that are missing multiple fields:
+_missing_:foo +_missing_:bar +_missing_:baz
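On Elasticsearch 5.0 and later, where the _missing_ syntax was removed from the query string syntax, an equivalent filter (same placeholder field names) negates _exists_ instead:
NOT _exists_:foo AND NOT _exists_:bar AND NOT _exists_:baz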

How to do "Not Equals" in CouchDB?

Folks, I was wondering what the best way is to model documents and/or map functions that allow "Not Equals" queries.
For example, my documents are:
1. { name : 'George' }
2. { name : 'Carlin' }
I want to run a query that returns every document whose name does not equal 'John'.
Note: I don't have all possible names beforehand, so the parameter in the query can be any arbitrary text, like 'John' in my example.
In short: there is no easy solution.
You have four options:
sending a multi range query
filter the view response with a server-side list function
using a CouchDB plugin
use the mango query language
sending a multi range query
You can request the view with two ranges defined by startkey and endkey. You have to choose the ranges so that the key John is not included.
Unfortunately, multi-range requests are not included in the official source; you would have to find the commit that exists somewhere and compile your CouchDB with it.
filter the view response with a server-side list function
It's not recommended, but you can use a list function and skip the rows with the key John in your response, much as you would filter a JavaScript array.
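A minimal sketch of such a list function (the exclude query parameter is a made-up name, and the view is assumed to emit the name as its key):
function (head, req) {
  // Stream the view rows back as JSON, dropping rows whose key matches ?exclude=...
  start({ headers: { 'Content-Type': 'application/json' } });
  send('[');
  var row, first = true;
  while ((row = getRow())) {
    if (row.key === req.query.exclude) continue;
    send((first ? '' : ',') + JSON.stringify(row));
    first = false;
  }
  send(']');
}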
using a CouchDB plugin
Create an additional index with e.g. couchdb-lucene. The Lucene server has such query capabilities.
use the "mango" query language
It's included in the CouchDB 2.0 developer preview. It is not ready for production yet, but it will definitely be included in the stable release.
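For illustration, a mango selector for the example above (using the name field from the sample documents) is posted to the _find endpoint; note that $ne on its own cannot be used as the basis of an index lookup, so it is applied as a filter over the results:
{
  "selector": {
    "name": { "$ne": "John" }
  }
}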
