I'm trying to create comments in my Azure Log Analytics queries and I'm stumped. Part of my challenge, I think, is treating this system as if it were SQL, which it clearly is not. Using "--", for instance, results in a syntax error:
traces
| where severityLevel > 1
-- this is an example of a line comment
| where message !contains "DiagnosticsLogger.GetMethod contains message 1"
| where message !contains "DiagnosticsLogger.GetMethod contains message 2"
| summarize by timestamp, message, severityLevel
Couldn't find anything for the search term "comment" on the https://docs.loganalytics.io reference, either.
// works for comments in Azure Log Analytics queries.
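For example, the failing query above becomes valid once the comment marker is switched from -- to //:

```kusto
traces
| where severityLevel > 1
// this is an example of a line comment
| where message !contains "DiagnosticsLogger.GetMethod contains message 1"
| where message !contains "DiagnosticsLogger.GetMethod contains message 2"
| summarize by timestamp, message, severityLevel
```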
Related
I created an Azure Alert using a query (KQL, Kusto Query Language) reading from the log. That is, it's a Log Alert.
After a few minutes, the alert was triggered (as I expected based on my condition).
My condition checks if there are Pods of a specific name in Failed state:
KubePodInventory
| where TimeGenerated between (now(-24h) .. now())
| where ClusterName == 'mycluster'
| where Namespace == 'mynamespace'
| where Name startswith "myjobname"
| where PodStatus == 'Failed'
| where ContainerStatusReason == 'Completed' //or 'Error' or doesn't matter? (there are duplicated entries, one with Completed and one with Error)
| order by TimeGenerated desc
These errors stay in the log, and I only want to catch them (alert on them) once per day; that is, I check whether there is at least one entry in the log (the threshold) and then fire the alert.
Is the log query evaluated every time there is a new entry in the log, or is it evaluated at a set frequency? I could not find a frequency for checking alerts specified anywhere in the Azure Portal, so maybe it evaluates the alert condition(s) every time something new appears in the log?
I have a Function App that pulls the Azure public key using a PowerShell script and outputs it into Log Analytics. I am trying to get notified if the public key updates. At the moment, I am comparing the top two results to see changes.
Does anyone know a query that could compare two message outputs to see if the output value changes? This will be used to create an alert and further automation if the public key does change.
Yes, this is possible, for example by using the next() function:
traces
| where message has "OUTPUT"
| top 2 by timestamp desc
| extend different = next(message) != message
| take 1
| where different == true
Using next() you can look up the value of a column in the next row, which we can compare with the current value of message.
We're using AKS and have our container logs writing to Log Analytics. We have an application that emits several print statements in the container log per request, and we'd like to group all of those events/log lines into aggregate events, one event per incoming request, so it's easier for us to find lines of interest. So, for example, if the request started with the line "GET /my/app" and then later the application printed something about an access check, we want to be able to search through all the log lines for that request with something like | where LogEntry contains "GET /my/app" and LogEntry contains "access_check".
I'm used to queries in Splunk. Over there, this type of inquiry would be a cinch to handle with the transaction command.
But with Log Analytics, it seems like multiple commands are needed to pull this off: extend with row_window_session to give all the related log lines a common timestamp, then summarize with make_list to group the lines of log output together into a JSON array, and finally parse_json and strcat_array to assemble the lines into a newline-separated string.
Something like this:
ContainerLog
| sort by TimeGenerated asc
| extend RequestStarted = row_window_session(TimeGenerated, 30s, 2s, ContainerID != prev(ContainerID))
| summarize logLines = make_list(LogEntry) by RequestStarted
| extend parsedLogLines = strcat_array(parse_json(logLines), "\n")
| where parsedLogLines contains "GET /my/app" and parsedLogLines contains "access_check"
| project Timestamp=RequestStarted, LogEntry=parsedLogLines
Is there a better/faster/more straightforward way to be able to group multiple lines for the same request together into one event and then perform a search across the contents of that event?
After reading your question: no, there is no easier way to do this in Azure Log Analytics.
If the logs are in this format, you need to do extra work along those lines to meet your requirement.
I am trying to trigger an alert when the number of columns in the AzureDiagnostics table in Log Analytics is > 400, since there is a 500-column limit to the table, after which records start being dropped.
The issue is that metric alerts expect an AggregatedValue and a TimeGenerated. Since this query runs against the schema, there is no true TimeGenerated. I've tried a "time" metric and renaming a column to "TimeGenerated", but I get the following error:
Search Query should contain 'AggregatedValue' and 'bin(TimeGenerated,
[roundTo])' for Metric alert type
This is the alert query I have:
AzureDiagnostics
| getschema
| summarize AggregatedValue = count(ColumnName) by bin(1d, 5m)
| project AggregatedValue, TimeGenerated = Column1
And I get these results:
I changed my logic to either return a record or not: the query returns a record only if the 400-column threshold has been met, and I then set my alert's threshold value to > 0.
AzureDiagnostics
| getschema
| summarize count(ColumnName)
| where count_ColumnName > 400
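If you'd rather keep the metric measurement alert type, one possible sketch (my assumption, not part of the original answer) is to stamp the schema rows with the current time, so the query contains the literal AggregatedValue and bin(TimeGenerated, ...) pieces the validator looks for:

```kusto
AzureDiagnostics
| getschema
// getschema output has no TimeGenerated column, so stamp every row with the query time
| extend TimeGenerated = now()
| summarize AggregatedValue = count() by bin(TimeGenerated, 5m)
```

Whether the alert validator accepts this depends on its textual checks, so treat it as a starting point rather than a guaranteed fix.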
Alert:
I'm from the Azure Monitor Log Analytics team. We are actively working in Azure Log Analytics to avoid this altogether: we are working to have dedicated tables for most Azure resources so they don't overpopulate the AzureDiagnostics table. Some Azure resources, like Azure Data Factory, have options to control whether they use dedicated tables or AzureDiagnostics. See #4 here: https://learn.microsoft.com/en-us/azure/data-factory/monitor-using-azure-monitor#monitor-data-factory-metrics-with-azure-monitor
I have created my chatbot's knowledge base with Qna Maker and I am trying to visualize some statistics with Analytics Application Insights.
What I want to do
I would like to create a chart with the most requested Qna Maker questions.
My problem
I can't find the QnA Maker questions in the customDimensions of the traces on Analytics, only their Id:
My question
Is there a way to get the QnA Maker question linked to this Id directly from the Analytics Application Insights tool?
Thank you.
PS: I had to use "Q" instead of "Question" in the title due to Stack Overflow rules.
Not directly.
The only info you have in App Insights is whatever was submitted with the data, so if they aren't sending the question (odd that they send the answer but not the question?), then you're out of luck.
As a workaround, you could create a custom table in your application insights instance:
https://learn.microsoft.com/en-us/azure/application-insights/app-insights-analytics-import
and populate that table with the id and question.
Then you could join those two things in analytics queries, in the Analytics tool or in workbooks.
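A minimal sketch of that join, assuming a hypothetical custom table named QnAQuestions_CL with Id_s and Question_s string columns (these names are illustrative, not an actual schema):

```kusto
traces
// pull the QnA Maker question id out of the trace's customDimensions
| extend qnaId = tostring(customDimensions['Id'])
| join kind=inner (
    // hypothetical custom table mapping question ids to question text
    QnAQuestions_CL
    | project qnaId = Id_s, question = Question_s
) on qnaId
| project timestamp, qnaId, question
```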
If you're looking for a query with question and answer linked through id, here is your answer:
requests
| where url endswith "generateAnswer"
| project timestamp, id, name, resultCode, duration
| parse name with * "/knowledgebases/" KbId "/generateAnswer"
| join kind=inner (
traces | extend id = operation_ParentId
) on id
| extend question = tostring(customDimensions['Question'])
| extend answer = tostring(customDimensions['Answer'])
| project KbId, timestamp, resultCode, duration, question, answer
This doesn't necessarily solve your issue but might be of help for other people looking for a simple report of question/answer to improve their QnA maker.
The sample can be found in the official documentation:
https://learn.microsoft.com/en-us/azure/cognitive-services/qnamaker/how-to/get-analytics-knowledge-base
Here is a query that I came up with that will pull in the id of the knowledge base question, the question the user typed, and the knowledge base answer. It also ties multiple questions together if they are from the same session.
I have not yet found a way to get the knowledge base question associated with the id, though.
requests
| where url endswith 'generateAnswer'
| project id, url, sessionId = extract('^[a-z0-9]{7}', 0, itemId)
| parse kind=regex url with *'(?i)knowledgebases/'knowledgeBaseId'/generateAnswer'
| join kind=inner
(
traces
| extend id = operation_ParentId
| where message == 'QnAMaker GenerateAnswer'
| extend userQuestion = tostring(customDimensions['Question'])
| extend knowledgeBaseQuestionId = tostring(customDimensions['Id'])
| extend knowledgeBaseAnswer = tostring(customDimensions['Answer'])
| extend score = tostring(customDimensions['Score'])
)
on id
| where timestamp >= ago(10d)
| project knowledgeBaseId, timestamp, userQuestion, knowledgeBaseQuestionId, knowledgeBaseAnswer, score, sessionId
| order by timestamp asc