Is there a way to export the Dialogflow ES Analytics Intent Path graph in high resolution?
Please see screenshot below:
My goal is to import telemetry data from an Application Insights resource into a SQL Azure database.
To do so, I enabled allLogs and AllMetrics in the Diagnostic settings of the Application Insights instance, and set the Destination details to "Archive to a storage account".
This works fine, and I can see data being saved to containers beneath the specified storage account, as expected. For example, page views are successfully written to the insights-logs-apppageviews container.
My understanding is that I can use Stream Analytics job(s) from here to import these JSON files into SQL Azure, by specifying a container as input and a SQL Azure table as output.
The problems I encounter from here are twofold:
I don't know what to use for the "Path pattern" on the input resource: there are tokens available for {date} and {time}, but the actual observed container path is slightly different, and I'm not sure how to account for this discrepancy. For example, the {date} token expects the YYYY/MM/DD format, but that part of the observed path has the format /y={YYYY}/m={MM}/d={DD} (see the example paths below). If I try to use the {date} token, nothing is found, and as far as I can tell there doesn't appear to be any way to customize this.
For proof-of-concept purposes, I resorted to using a hard-coded container path with some data in it. With this, I was able to set the output to a SQL Azure table, check that there were no schema errors between input and output, and after starting the job, I do see the first batch of data loaded into the table. However, if I perform additional actions to generate more telemetry, I see the JSON files updating in the storage containers, yet no additional data is written to the table. No errors appear in the Activity Log of the job to explain why the updated telemetry data is not being picked up.
What settings do I need to use in order for the job to run continuously/dynamically and update the database as expected?
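To make the path-pattern mismatch concrete, here is roughly the shape of what I observe versus what the token can match (subscription/resource segments are elided placeholders, dates illustrative):
Observed blob path written by the diagnostic setting:
insights-logs-apppageviews/resourceId=/SUBSCRIPTIONS/<sub-id>/RESOURCEGROUPS/<rg>/PROVIDERS/MICROSOFT.INSIGHTS/COMPONENTS/<app>/y=2023/m=01/d=15/h=10/m=00/PT1H.json
Path the {date} token (YYYY/MM/DD format) would look for:
insights-logs-apppageviews/.../2023/01/15/...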
I would like to have my local device query and store data from the same Log Analytics platform that it reports to. All the documentation I have seen shows me how to access/query Log Analytics from the Azure UI & Azure PowerShell, but I have not seen anything on how to query the same data from the Virtual Machine's own PowerShell terminal. Any recommendations? Is this possible, or not?
I found the answer to my own question. To the benefit of anyone who is struggling with the same problem, refer to this:
https://igeorgiev.eu/azure/howto-query-log-analytics-workspace-from-azure-powershell-using-service-principal/
The Log Analytics Workspace context can be retrieved with
$Workspace = Get-AzOperationalInsightsWorkspace
After you have retrieved the context and defined a KQL Query, you can query the Log Analytics platform with
$QueryResults = Invoke-AzOperationalInsightsQuery -Workspace $Workspace -Query $kqlQuery
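Putting it together, here is a minimal end-to-end sketch that runs from the VM's own PowerShell terminal (the resource group, workspace name, and query below are illustrative placeholders):
# Sign in (or authenticate with a service principal, as in the linked article)
Connect-AzAccount

# Retrieve the workspace context (names are placeholders)
$Workspace = Get-AzOperationalInsightsWorkspace -ResourceGroupName "my-rg" -Name "my-workspace"

# Define a KQL query, e.g. the ten most recent heartbeats
$kqlQuery = "Heartbeat | top 10 by TimeGenerated"

# Query the workspace and read the returned rows
$QueryResults = Invoke-AzOperationalInsightsQuery -Workspace $Workspace -Query $kqlQuery
$QueryResults.Results | Format-Table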
I have Azure Function Apps running in App Service, and I am able to get the number of Http Server Errors at the instance level in Metrics (please see image). I would like to get the same level of metrics via a Kusto query, but having tried all the Log tables, I can't find it. Is it possible to get those metrics by instance using Kusto?
I checked, and in AzureMetrics there is no instance-level data stored. Here is the query I am using to get all Http Server Errors overall:
AzureMetrics
| where ResourceGroup == "RG"
| where TimeGenerated {TimeRange}
| where ResourceId in ("ResourceId")
| where MetricName == "Http5xx"
Since you are looking at Azure Metrics in metrics explorer, those generally are NOT coming from a KQL-backed data source (not all standard Azure metrics are in any workspace, for cost/compatibility reasons).
In workbooks, instead of using a Query step, you'd use a Metrics step to get this data. You'd pick that time range parameter in the time range dropdown; likewise, you'd select the appropriate resource type and that resource or resource parameter in the resource picker of the metrics item, and you'd add that metric. (There's a preview feature coming to help with this: add ?feature.sendtoworkbooks=true to your Azure portal URL, like https://portal.azure.com/?feature.sendtoworkbooks=true, and the Metrics Explorer view will have additional "Send to workbooks" options in the share and pin menus that convert the metrics view to a workbook.)
If Application Insights is configured on this function app, you could possibly query the App Insights customMetrics table to get custom metrics from the function app, but probably not the standard metrics via KQL.
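If you do want to pull the same standard metric per instance programmatically (outside of metrics explorer/workbooks), here is a rough sketch with the Az.Monitor PowerShell cmdlets; the resource ID is a placeholder, and I'm assuming the per-instance dimension on App Service metrics is named "Instance":
# Resource ID of the function app (placeholder)
$resourceId = "/subscriptions/<sub-id>/resourceGroups/RG/providers/Microsoft.Web/sites/<function-app>"

# Split the metric on the per-instance dimension (assumed name: "Instance")
$filter = New-AzMetricFilter -Dimension "Instance" -Operator "eq" -Value "*"

# Http5xx totals per instance over the last hour, in 5-minute buckets
Get-AzMetric -ResourceId $resourceId -MetricName "Http5xx" -MetricFilter $filter `
    -StartTime (Get-Date).AddHours(-1) -EndTime (Get-Date) `
    -TimeGrain 00:05:00 -AggregationType Total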
The "Integrations" tab is missing in my Dialogflow agent. See image below:
Missing "Integrations" tab
Does anyone know what I have to do in order to get it back? When I create a new agent and assign it to another project, it works fine.
Kind Regards,
Mathias
Upon checking the screenshot you've provided, it seems that you're using Europe as the region of your agent. Note that the Integrations feature is not available in the Europe region. For more information, see here: https://cloud.google.com/dialogflow/es/docs/how/region#limits.
As a workaround, you may migrate your current agent to the US region: go to Settings > Export and Import > Export as ZIP to export your agent, then create a new agent in the US region and go to Settings > Export and Import > Restore from ZIP to restore your agent.
Here are the detailed steps to migrate your current agent to the US region.
Go to agent settings > Export and Import > EXPORT AS ZIP
Create a new agent in the US region. Make sure to change the region before creating an agent.
Once you’re in the US region, you can now create a new agent and restore the previously exported agent from the Europe region.
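If you'd rather script the migration, here is a rough sketch against the Dialogflow ES v2 REST API (project IDs are placeholders; note that agent:export returns a long-running operation whose result carries the agent as a base64-encoded ZIP in agentContent):
# Get an access token (assumes the gcloud CLI is installed and authenticated)
$token = gcloud auth print-access-token
$headers = @{ Authorization = "Bearer $token" }

# Export the Europe-region agent (regional endpoint, location in the path)
$op = Invoke-RestMethod -Method Post -Headers $headers -ContentType "application/json" -Body "{}" `
    -Uri "https://europe-west1-dialogflow.googleapis.com/v2/projects/<eu-project>/locations/europe-west1/agent:export"
# ...poll the operation named $op.name until it completes, then read the base64 ZIP from its response...

# Restore the exported ZIP into the new US-region agent (global endpoint)
$body = @{ agentContent = $exportedZipBase64 } | ConvertTo-Json
Invoke-RestMethod -Method Post -Headers $headers -ContentType "application/json" -Body $body `
    -Uri "https://dialogflow.googleapis.com/v2/projects/<us-project>/agent:restore"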
I am looking for an API or documentation for querying the name availability of Azure Data Factory (ADF), Time Series Insights (TSI), and Stream Analytics Job resources, similar to https://learn.microsoft.com/en-us/rest/api/keyvault/vaults/checknameavailability.
I have tried looking at the azure-arm-datafactory and azure-arm-streamanalytics node libraries, but couldn't find functionality to check the name availability of resources.
I am looking for something similar to the below:
import KeyVaultManagementClient from 'azure-arm-keyvault';

const client = new KeyVaultManagementClient(this._credentials, this._subscriptionId);
return client.vaults.checkNameAvailability({ name: keyVaultName })
    .then((result: any) => {
        console.log(result.nameAvailable);
        return Promise.resolve(result.nameAvailable);
    });
AFAIK, this feature (check name availability) is currently unsupported for ADF, TSI, and Stream Analytics. So I would recommend adding a new feature request on UserVoice with your suggestion(s), and letting us know the link when you do so that others can vote on it as well, which would in turn raise its visibility and priority.