I am using the below Kusto query for exporting logs from Application Insights:
traces
| extend request = parse_json(tostring(parse_json(customDimensions).PayloadMessage))
| extend message = request.message
| extend SecondaryInfo = request.SecondaryInfo
| extend Logtype = request.Logtype
| extend File = request.File
| extend LineNo = request.LineNo
| extend MemberName = request.MemberName
| extend User = split(split(message,'User(').[1],')').[0]
| project timestamp,message,SecondaryInfo,Logtype,File,LineNo,MemberName,User
I want to extract the email address from the message column, so I am using a split operation. However, I could not find the correct logic to extract the email address. Could someone please share a regex?
Below are some values in the messages column
-MMS.Core.Logging.MmsException: 1012 - Error while fetching Room Booking Requests for (User: TestAccount100#fr.domain.com) at
-MMS.Core.Logging.MmsException: 1011 - Error while fetching User's Locations (TestAccount100#fr.domain.com) at
RbacRepository.GetUserCityBuildings-correlationId-11111fd6-e111-4d11-1111-11e101fbe111--DT-04162021T084110893-- START: (Email:TestAccount100#fr.domain.com), Cities mapped 4
RoomBookingDetailsRepository.GetRequestDetailsAsync: Getting the BookingDetails data for User(TestAccount100#fr.domain.com), Role(Admin), IsAdmin(True), PPED() Status(), AssignedTo ()
Below is an example that uses a naive regex.
You can go over the documentation for RE2 to adjust it, if required: https://github.com/google/re2/wiki/Syntax
datatable(s:string)
[
"-MMS.Core.Logging.MmsException: 1012 - Error while fetching Room Booking Requests for (User: TestAccount100#fr.domain.com) at",
"-MMS.Core.Logging.MmsException: 1011 - Error while fetching User's Locations (TestAccount100#fr.domain.com) at",
"RbacRepository.GetUserCityBuildings-correlationId-11111fd6-e111-4d11-1111-11e101fbe111--DT-04162021T084110893-- START: (Email:TestAccount100#fr.domain.com), Cities mapped 4",
"RoomBookingDetailsRepository.GetRequestDetailsAsync: Getting the BookingDetails data for User(TestAccount100#fr.domain.com), Role(Admin), IsAdmin(True), PPED() Status(), AssignedTo ()",
]
| extend email = extract(@"(\w+#\w+\.\w+)", 1, s)
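Plugging the same extract() into the traces query from the question might look something like this (just a sketch, assuming the payload shape shown there):

traces
| extend request = parse_json(tostring(parse_json(customDimensions).PayloadMessage))
| extend message = tostring(request.message)
// naive email pattern from the answer above; adjust per RE2 syntax if needed
| extend User = extract(@"(\w+#\w+\.\w+)", 1, message)
| project timestamp, message, User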
Related
I need to get the category name from the category id using a Kusto query.
First, I got the most-searched URL from the website using the below Kusto query, which I ran against the App Insights logs:
Requests
| where resultCode==200
| where url contains "bCatID"
| summarize count=sum(itemCount) by url
| sort by count
| take 1
From the above query I got a result like:
https://www.test.com/item.aspx?idItem=123456789&bCatID=1282
So, for the corresponding category id 1282, I need to get the category name using Kusto.
You can use the parse operator. For example:
print input = 'https://www.test.com/item.aspx?idItem=123456789&bCatID=1282'
| parse input with 'https://www.test.com/item.aspx?idItem=123456789&bCatID='category_id:long
Alternatively, you can use parse_urlquery:
print url ="https://www.test.com/item.aspx?idItem=123456789&bCatID=1282"
| extend tolong(parse_urlquery(url)["Query Parameters"]["bCatID"])
url                                                            Query Parameters_bCatID
https://www.test.com/item.aspx?idItem=123456789&bCatID=1282    1282
Feedback on the OP's query (not an answer to the question itself):
KQL is case sensitive. The name of the table in Azure Application Insights is requests (and not Requests).
resultCode is of type string (and not integer/long) and should be compared to "200" (and not 200).
bCatID is a token and therefore can be searched using has or even has_cs, which should be preferred over contains due to performance reasons.
URLs can be used with different parameters. It might make more sense to summarize only by the Host & Path parts + the bCatID query parameter.
count is a reserved word. It can be used as alias only if qualified: ["count"] or ['count'] (or better not used at all).
sort followed by take 1 can be replaced with the more elegant top 1 by ...
requests
| where resultCode == "200"
| project url = parse_url(url), itemCount
| summarize sum(itemCount) by tostring(url.Host), tostring(url.Path), tolong(url["Query Parameters"]["bCatID"])
| top 1 by sum_itemCount
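The category name itself is not in the requests telemetry, so the id-to-name mapping has to come from elsewhere. One option is to bring the mapping into the query (for example as a datatable, or via the externaldata operator) and resolve it with lookup. A sketch, where the CategoryNames table and its values are purely hypothetical placeholders:

let CategoryNames = datatable(bCatID: long, CategoryName: string)
[
    // hypothetical mapping - replace with your real category data
    1282, "Example category",
];
requests
| where resultCode == "200"
| project url = parse_url(url), itemCount
| summarize itemCount = sum(itemCount) by Host = tostring(url.Host), Path = tostring(url.Path), bCatID = tolong(url["Query Parameters"]["bCatID"])
| top 1 by itemCount
| lookup kind=leftouter CategoryNames on bCatID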
The log items look like the below; the currencyamount field appears with several different casings:
{ "AdditionalFields":{
"backendRequestBody":{
"currencyamount":1
} } }
{ "AdditionalFields":{
"backendRequestBody":{
"CurrencyAmount":1
} } }
{ "AdditionalFields":{
"backendRequestBody":{
"currencyAmount":1
} } }
However, parse_json in a log query is case sensitive. Is there any way to get the currencyAmount field case-insensitively using an Azure log query?
The query below is only able to get the log entries that have the lower-case currencyamount field.
AzureDiagnostics
| where apiId_s contains "targetId" and AdditionalFields.backendRequestBody has "amount"
| extend amt = (parse_json(tostring(AdditionalFields.backendRequestBody)).currencyamount)
AFAIK, parse_json cannot read a JSON property case-insensitively. Instead, you can extract each casing separately, like this:
AzureDiagnostics
| where apiId_s contains "targetId" and AdditionalFields.backendRequestBody has "amount"
| extend backendReqbody = parse_json(tostring(AdditionalFields.backendRequestBody))
| extend lowercuramount = backendReqbody.currencyamount
| extend curamount = backendReqbody.CurrencyAmount
| extend lowupcuramount = backendReqbody.currencyAmount
You can then use conditions such as iff() (see the Microsoft docs), or combine the columns as sketched below, when filtering the data in the result.
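A sketch that collapses the three casings into a single column with coalesce() (tolong() is used only because the sample values are numeric):

AzureDiagnostics
| where apiId_s contains "targetId" and AdditionalFields.backendRequestBody has "amount"
| extend backendReqbody = parse_json(tostring(AdditionalFields.backendRequestBody))
// take the first non-null value among the three possible casings
| extend amt = coalesce(tolong(backendReqbody.currencyamount), tolong(backendReqbody.CurrencyAmount), tolong(backendReqbody.currencyAmount))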
JSON text isn't parsing correctly in KQL. I tried using parse_json as well, but that didn't work either. I did confirm that the extended AllProperties column holds the correct data.
DeviceInfo
| where RegistryDeviceTag == "Standard"
| extend AllProperties = todynamic(LoggedOnUsers)
| project DeviceName, Users = AllProperties["Username"]
Output gives me the correct DeviceName but doesn't give any data in the Username field.
(Based on the sample input you provided in the comment.)
If the "LoggedOnUsers" array contains exactly one entry, you can do this:
print input = '[{"UserName":"TheUserName","DomainName":"TheDomainName","Sid":"TheSID#"}]'
| project UserName = parse_json(input)[0].UserName
Otherwise, you can use mv-expand or mv-apply:
print input = '[{"UserName":"TheUserName","DomainName":"TheDomainName","Sid":"TheSID#"}]'
| project input = parse_json(input)
| mv-apply input on (
    project UserName = input.UserName
)
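Applied to the DeviceInfo query from the question, a sketch (assuming LoggedOnUsers holds a JSON array of objects like the sample, with a UserName property) could look like:

DeviceInfo
| where RegistryDeviceTag == "Standard"
| extend AllProperties = todynamic(LoggedOnUsers)
// expand the array so each logged-on user becomes its own row
| mv-apply AllProperties on (
    project Users = tostring(AllProperties.UserName)
)
| project DeviceName, Users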
Below is the Cucumber scenario in which steps 1, 2, 4, 5, and 6 work fine (they are picked up from another feature file), but steps 3 and 7 are giving a problem.
@Api
Scenario Outline: Getting Primary category count from Solr
Given User should have the base url for search vehicle
When User passes items per page as "20"
And User passes facet region as "US" and primarycategory type as "<Primary Category>"
And User fetches records with "/api/vehicles" endpoint
Then User should get status code "200" in response
And User should get 20 vehicles records in the response
Then User should get "<Primary Category>" count as "<Category Count>" in the response
Examples:
| Primary Category | Category Count |
| truck | 2125 |
| Tractor | 2366 |
| Trailer | 530 |
| Specialized | 0 |
| Reclassified | 0 |
Below is my code
@And("^User passes facet region as \"([^\"]*)\" and primarycategory type as \"([^\"]*)\" $")
public void user_passes_facet_region_as_and_primarycategory_type_as(String region, String primary_category) {
httpRequest.queryParam("facets", "r="+region+";g="+primary_category).urlEncodingEnabled(true);
}
@Then("User should get \"([^\"]*)\" count as \"([^\"]*)\" in the response$")
public void user_should_get_primary_category_count_as_in_the_response(int int1) {
int cat_count = response.jsonPath().get("Data.Count");
Assert.assertEquals(cat_count,int1);
}
After the run, I am getting the below error on the console:
@Api
Scenario Outline: Getting Primary category count from Solr # featurefile/PrimaryCategoryCount_From_Solr_API.feature:16
Given User should have the base url for search vehicle # FacetsSearchAPISteps.user_should_have_the_base_url_for_search_vehicle()
When User passes items per page as "20" # FacetsSearchAPISteps.user_passes_items_per_page_as(String)
And User passes facet region as "US" and primarycategory type as "truck" # null
And User fectches records with "/api/vehicles" endpoint # FacetsSearchAPISteps.user_fectches_records_with_endpoint(String)
Then User should get status code "200" in response # FacetsSearchAPISteps.i_should_get_status_code_in_response(String)
And User should get 20 vehicles records in the response # FacetsSearchAPISteps.user_should_get_vehicles_records_in_the_response(int)
Then User should get "truck" count as "2125" in the response # PrimaryCategoryCount_From_Solr_API_Steps.user_should_get_primary_category_count_as_in_the_response(int)
I don't know where the issue is; can somebody help me?
I need to query App Insights traces with the following pattern for messages:
"D1 connected"
"D2 connected"
"D3 connected"
"D1 disconnected"
"D3 disconnected"
"D1 connected"
"D2 disconnected"
etc.
I'm basically monitoring some devices and the connection time. How can I write a query that "pairs" events (D1 connected/disconnected, D2 connected/disconnected, etc.) and evaluates how long the "sessions" are?
I'd need to get information like:
total connection time for a day
distribution of the connection for a specific device on a day
etc.
Doing this just based on the text of the trace will be hard. I suggest using custom properties to assist in this.
By far the easiest option is to send some additional properties along with the disconnected event that have all the info required. Like:
// Start of session
var tt = new TraceTelemetry("D1 connected");
tt.Properties.Add("Event", "SessionStart");
telemetryClient.TrackTrace(tt);
var startTime = DateTime.Now;
// Do your thing
....
tt = new TraceTelemetry("D1 disconnected");
tt.Properties.Add("Event", "SessionEnd");
tt.Properties.Add("SessionLength", (startTime - DateTime.Now).TotalMilliseconds.ToString());
telemetryClient.TrackTrace(tt);
Custom properties are stored in the customDimensions field of an event.
Now in AI analytics you can query these values like this:
Count:
traces
| where customDimensions.Event == "SessionEnd"
| summarize count()
Session lengths:
traces
| where customDimensions.Event == "SessionEnd"
| project message, customDimensions.SessionLength
Total duration of all sessions:
traces
| where customDimensions.Event == "SessionEnd"
| extend duration = tolong(customDimensions.SessionLength)
| summarize sum(duration)
I would also suggest adding the device Id as a custom property for all emitted events. It will make querying easier. You can then calculate min, max and average session lengths per device, for example:
traces
| where customDimensions.Event == "SessionEnd"
| extend duration = tolong(customDimensions.SessionLength)
| extend device = tostring(customDimensions.DeviceName)
| summarize sum(duration) by device
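A variant that shows min, max and average session length per device (same assumptions as above) might be:

traces
| where customDimensions.Event == "SessionEnd"
| extend duration = tolong(customDimensions.SessionLength)
| extend device = tostring(customDimensions.DeviceName)
| summarize min(duration), max(duration), avg(duration) by device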
If you also want to include the start events, or if you cannot or will not do the above, you have to join the start events with the end events to build these queries. You will still need some custom properties, since querying on text alone will be hard: you would have to parse the message text to determine which event and which device is involved.
Take a look at "azure AI QUERY combine start and response to calculate average" to see how joins work in AI Analytics.
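For reference, a rough sketch of pairing the raw connected/disconnected messages without extra properties (assuming every message looks exactly like "<device> connected" or "<device> disconnected", which is the fragile part mentioned above) could match each connect with the following disconnect using next():

traces
| where message endswith "connected"   // matches both "connected" and "disconnected"
| extend device = tostring(split(message, " ")[0]), event = tostring(split(message, " ")[1])
| order by device asc, timestamp asc
| extend nextEvent = next(event), nextTime = next(timestamp), nextDevice = next(device)
| where event == "connected" and nextEvent == "disconnected" and nextDevice == device
| extend sessionMinutes = (nextTime - timestamp) / 1m
| summarize totalConnectedMinutes = sum(sessionMinutes) by device, day = bin(timestamp, 1d)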