OpenSearch Grafana: how to visualize text fields (strings)

This is my first ever post on Stack Overflow.
I'm sending JSON logs from Filebeat to Logstash to OpenSearch to Grafana,
and everything works perfectly when it comes to integer data.
I can even see that OpenSearch receives my string fields and boolean fields and reads them.
But when I want to build a dashboard to visualize some strings and booleans, Grafana only finds my integer fields.
Can someone help me visualize strings in Grafana, and not only numbers?
Here is an image of what I see when I try to select data; only the numeric field names appear.
Thanks Andrew, now I see this, but I want to see only one field,
not all of them.
(Screenshot: logs added to Grafana.)

You can try using the Logs panel.
Here is an example of how I use it; the query is something like this:
{namespace=~"$namespace", stream=~"$stream", container=~"$container"} |= "$query"
But I'm using fluent-bit + Loki.
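If you only want a single field shown per line (as asked in the comment above), LogQL can reshape the log line with line_format. A minimal sketch, assuming the logs are JSON and serviceName is a hypothetical field name:

{namespace=~"$namespace"} | json | line_format "{{.serviceName}}"

Note this applies to the Loki data source used in this answer.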

Related

CustomFields attribute in LogstashEncoder is not working

Currently, this is the setup we have:
(currentSetup image attached)
With the above setup, we expect serviceName to be captured as a searchable field in Elastic/Kibana. Instead, the field appears inside the message, which makes it unsearchable.
The workaround we have is to add the service field as an additional field, like below:
(workAround image attached)
The above works and we can search the field in Kibana. Does anyone know why customFields is not working as expected?
Thanks
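For reference, a minimal logback.xml sketch of how customFields is normally wired into LogstashEncoder; the destination host and the serviceName value are placeholders:

<appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
  <!-- placeholder host/port for the Logstash input -->
  <destination>logstash-host:5044</destination>
  <encoder class="net.logstash.logback.encoder.LogstashEncoder">
    <!-- customFields must be a valid JSON object; each key becomes a top-level field -->
    <customFields>{"serviceName":"my-service"}</customFields>
  </encoder>
</appender>

If the fields still end up inside message, one common cause is the Logstash input parsing events as plain text instead of JSON (for example a missing codec => json on the input), so the pipeline side is worth checking too.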

Missing Index Patterns

I'm missing some index patterns in Kibana and I've been trying to figure out why. I have installed Logstash, Elasticsearch, and Kibana and started the services. How do I get logstash, apache-access, etc. to show in this section? Only filebeat shows.
I've used a curl command against localhost and the port to see the indices, and only kibana and filebeat are shown there; apache-access and logstash are nowhere to be seen.
Can anyone guide me in the right direction to resolve this so I can see 'logstash' and 'apache-access' under the patterns section?
Data is saved inside indices in the Elasticsearch cluster; in Kibana you can define index patterns to show multiple indices at the same time.
In the left menu of your screenshot you'll find a menu item called "Index Management"; all indices that exist in your Elasticsearch cluster are listed there.
An index pattern in Kibana is just a (wildcarded) pattern that lets you see the data.
At the top right of your screenshot you see the button "+ Create Index Pattern"; clicking it lets you define a new pattern, which will live next to the existing one (filebeat-*).
Once you have defined a second one, you'll be able to choose which one is the default when you open Kibana, and a dropdown on the Discover page will let you pick the active index pattern for your discovery at that time.
So in short, press the "Create Index Pattern" button twice, entering logstash* as the pattern once and apache-access* once.
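To confirm which indices actually exist before creating the patterns, the _cat/indices API is handy; a quick sketch, assuming Elasticsearch on the default port:

curl -s 'http://localhost:9200/_cat/indices?v'

If no logstash* or apache-access* indices appear in that list, the new patterns will have nothing to match, and the problem is upstream in the Filebeat/Logstash pipeline rather than in Kibana.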

Extract alert logs from Azure without Azure Security Centre

I want to extract the alert logs in CSV format to show that I have received this type of alerts.
But I am unable to extract them from an Azure log query. Or do I have to install some agent?
You can list all existing alerts; the results can be filtered on multiple parameters (e.g. time range) and sorted on specific fields, the default being lastModifiedDateTime:
GET https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.AlertsManagement/alerts?api-version=2018-05-05
The same request with the optional parameters:
GET https://management.azure.com/subscriptions/{subscriptionId}/providers/Microsoft.AlertsManagement/alerts?targetResource={targetResource}&targetResourceType={targetResourceType}&targetResourceGroup={targetResourceGroup}&monitorService={monitorService}&monitorCondition={monitorCondition}&severity={severity}&alertState={alertState}&alertRule={alertRule}&smartGroupId={smartGroupId}&includeContext={includeContext}&includeEgressConfig={includeEgressConfig}&pageCount={pageCount}&sortBy={sortBy}&sortOrder={sortOrder}&select={select}&timeRange={timeRange}&customTimeRange={customTimeRange}&api-version=2018-05-05
To check the other URI parameters for logging, you may refer to this URL.
Finally, when you have the response(s) in JSON format, you can convert them to CSV automatically using any of the freely available online conversion utilities (like the service linked HERE).
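If you would rather script the download and the JSON-to-CSV step than use an online service, here is a minimal shell sketch; it assumes the Azure CLI (az) and jq are installed, $SUBSCRIPTION_ID is a placeholder, and the selected columns are just examples:

# get a bearer token for the management API via the Azure CLI
TOKEN=$(az account get-access-token --query accessToken -o tsv)

# fetch the alerts and flatten a few example essentials into CSV rows
curl -s -H "Authorization: Bearer $TOKEN" \
  "https://management.azure.com/subscriptions/$SUBSCRIPTION_ID/providers/Microsoft.AlertsManagement/alerts?api-version=2018-05-05" \
  | jq -r '.value[] | [.name, .properties.essentials.severity, .properties.essentials.alertState, .properties.essentials.lastModifiedDateTime] | @csv' \
  > alerts.csv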

How do I group logs in Kibana/Logstash?

We have an ELK setup, and Logstash receives all the logs from the Filebeat installed on the server. When I open Kibana and it asks for an index, I just put * as the index value, then go to the Discover tab to check the logs, and it shows each line of the log in a separate expandable section.
I want to be able to group the logs first by timestamp and then by a common ID that is generated in our logs per request to distinguish it from the rest. An example of the logs we get:
DEBUG [2018-11-23 11:28:22,847][298b364850d8] Some information
INFO [2018-11-23 11:27:33,152][298b364850d8] Some information
INFO [2018-11-24 11:31:20,407][b66a88287eeb] Some information
DEBUG [2018-11-23 11:31:20,407][b66a88287eeb] Some information
I would like to see all logs for request ID 298b364850d8 in the same dropdown, given they are continuous logs. Then it can break into a second dropdown, again grouped by the request ID b66a88287eeb, in timestamp order.
Is this even possible, or am I expecting too much from the tool?
Or, if there is a better strategy for grouping logs, I'm more than happy to hear suggestions.
A friend told me I could configure Logstash to group logs based on some regex and such, but I just don't know where or how to configure the grouping.
I am completely new to the whole ELK stack, so bear with my questions, which might be quite elementary in nature.
Your question is, as you say, a little vague and broad. However, I will try to help :)
Check the index that you define in the Logstash output. This is the index that needs to be defined in Kibana, not *.
Create an index pattern to connect to Elasticsearch. This will parse the fields of the logs and allow you to filter as you want.
I recommend using a GUI tool (like Cerebro) to better understand what is going on in your ES. It will also give you a better idea of the indices you have there.
Good luck!
You can use the @timestamp filter and a search query, as in the sample image below, to filter what you want.
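As for the regex-based parsing mentioned in the question, that is typically done with Logstash's grok filter. A minimal sketch for the log format shown above; the field names level, log_timestamp, and request_id are illustrative:

filter {
  grok {
    # parses lines like: DEBUG [2018-11-23 11:28:22,847][298b364850d8] Some information
    match => {
      "message" => "%{LOGLEVEL:level}\s+\[%{TIMESTAMP_ISO8601:log_timestamp}\]\[%{WORD:request_id}\] %{GREEDYDATA:log_message}"
    }
  }
  # use the timestamp from the log line as the event's @timestamp
  date {
    match => ["log_timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
  }
}

Once request_id is an indexed field, you can filter Discover with request_id:298b364850d8 and sort by @timestamp, which gives the per-request grouping described above, although Kibana will not collapse the rows into nested dropdowns.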

Working with a large number of fields in Kibana

Is there a way to filter through the entries in the "Fields" dropdown in Kibana under the Visualize tab?
My data has over 1000 fields, so it's not convenient to scroll through a really long dropdown menu (like the one below) just to pick a field that's buried in there somewhere.
Is there a way to make it searchable, like it is on the Discover page for indexes and fields, as seen below:
I am open to other suggestions as well, if there is a different way to achieve the same result, i.e., to pick fields to visualize when there are a lot of fields to pick from.
I am using Kibana 5.4.1 on Windows.
Go to https://github.com/elastic/kibana and clone the repository; it's in version 6 now.
