How to select a Log Analytics workspace in an Azure Monitor workbook using a parameter? - azure-monitor-workbooks

I would like to have the user select the Log Analytics workspace as a parameter in an Azure Monitor workbook to perform a query, similar to what is explained in the parameter documentation. However, I can't resolve the Log Analytics workspace using a variable.
What I would like to do is something like:
let event_table = workspace({parameter}).Event;
event_table
| take 5
The following query, using a string literal, succeeds in both an Azure Monitor workbook and a log query:
let event_table = workspace("name_of_work_space").Event;
event_table
| take 5
The following fails with the error "Unknown function 'workspace'." in both an Azure Monitor workbook and a log query:
let logAnalyticsWorkspaceName = "name_of_work_space";
let event_table = workspace(logAnalyticsWorkspaceName).Event;
event_table
| take 5
It seems that only string literals are allowed as arguments to the workspace() function.
Additionally, the iff() and case() functions only return scalars, so I cannot use them to conditionally return a table or workspace based on a workbook parameter.
How do I supply a parameter to an Azure Monitor workbook to query against a particular Log Analytics workspace?

Dynamically setting the workspace can be accomplished using a parameter. Create a parameter for workspaces where the parameter type is Resource picker (from Azure Resource Graph). That parameter will then appear as an option in the Log Analytics dropdown for downstream controls. A demonstration appears in the video "How to build tabs and alerts in Azure workbooks | Azure Portal" at 5:00.
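As a minimal sketch: once a Resource picker parameter (named Workspaces here, following the answer above) is selected in the query step's Log Analytics dropdown, the query runs against the chosen workspace and the query text itself needs no workspace() call at all:

// With the Workspaces parameter selected as the query step's Log Analytics
// resource, this runs against whichever workspace the user picked, so the
// workspace() function is never needed in the query text.
Event
| take 5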

Related

Azure Log Analytics Azure Synapse integration

I am trying to bring Azure Synapse logs into Log Analytics to create dashboards on usage.
I have already set up diagnostic settings to pass the logs on to my Log Analytics workspace.
But while trying to execute queries from the documentation below, I am getting an error.
Query -
// Chart the most active resource classes
AzureDiagnostics
| where Category contains "ExecRequests"
| where Status_s == "Completed"
| summarize totalQueries = dcount(RequestId_s) by ResourceClass_s
| render barchart
Error:
'where' operator: Failed to resolve column or scalar expression named 'Status_s'...
Documentation link for queries : https://learn.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-monitor-workload-portal
Please let me know if there is something I am missing. I am logging directly to the Log Analytics workspace and running these queries inside a workbook.
Also, I didn't find any proper documentation/blogs/links for connecting Synapse to Log Analytics; please let me know if anyone has that.
The documentation linked in your post appears to be out of date even though the last update date is recent.
See this link:
"Azure services that use resource-specific mode store data in a table specific to that service and do not use the AzureDiagnostics table."
The link also lists a number of resource-specific tables for Synapse. "SynapseSqlPoolExecRequests" and "SynapseSqlPoolSqlRequests" are a few examples that might provide the info you're seeking.
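As a hedged sketch, the AzureDiagnostics query above would translate roughly as follows. The column names (Status, RequestId, ResourceClass) are assumptions based on the AzureDiagnostics columns losing their _s suffix in resource-specific mode; verify the table schema in your workspace before relying on them.

// Chart the most active resource classes from the resource-specific table
SynapseSqlPoolExecRequests
| where Status == "Completed"
| summarize totalQueries = dcount(RequestId) by ResourceClass
| render barchart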

ApplicationLog_CL in Log Analytics Workspace

I use a Log Analytics workspace to collect logs from my application, and I query it with the following Kusto query:
ApplicationLog_CL
| order by TimeGenerated desc
What does this ApplicationLog_CL indicate? Is there a way to add another log type?
'ApplicationLog_CL' is a reference to the table in Log Analytics that you are querying with KQL.
The '_CL' suffix indicates that it is a custom table; it will most likely also show the type 'Custom table' if you look in the Log Analytics workspace under Tables.
When creating a custom table in a Log Analytics workspace through e.g. Bicep or the Azure CLI, the creation fails if you do not add the '_CL' suffix to the table name.
It specifies here that the table name needs to be suffixed with '_CL'.
If you create the table through the Azure portal, Azure appends '_CL' automatically.
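If you want to discover which custom tables a workspace already has, one possibility is to group recent records by the $table column of the search operator. This is a sketch only; search scans broadly and can be slow or costly on large workspaces.

// List every custom (_CL) table that received data in the last day
search *
| where TimeGenerated > ago(1d)
| where $table endswith "_CL"
| summarize recordCount = count() by $table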

Kusto Query to get Http5xx by Instance level

I have Azure Function Apps running in App Service, and I am able to get the number of HTTP server errors at the instance level in Metrics (see image). I would like to get the same level of metrics via a Kusto query, but I have tried all the log tables and can't find it. Is it possible to get those metrics by instance using Kusto?
I checked, and there is no instance-level data stored in AzureMetrics. Here is the query I am using to get all HTTP server errors overall:
AzureMetrics
| where ResourceGroup == "RG"
| where TimeGenerated {TimeRange}
| where ResourceId in ("ResourceId")
| where MetricName == "Http5xx"
Since you are looking at Azure metrics in Metrics Explorer, those generally are NOT coming from a KQL-backed data source (not all standard Azure metrics are in any workspace, for cost/compatibility reasons).
In workbooks, instead of using a Query step, you'd use a Metrics step to get this data. You'd pick that time range parameter in the time range dropdown; likewise, you'd select the appropriate resource type and that resource (or resource parameter) in the resource picker of the metrics item, and you'd add that metric. (There's a preview feature coming to help with this: add ?feature.sendtoworkbooks=true to your Azure portal URL, like https://portal.azure.com/?feature.sendtoworkbooks=true, and the Metrics Explorer view will have additional "Send to workbooks" options in the share and pin menus that convert the metrics view to a workbook.)
If Application Insights is configured on this function app, you could possibly query the App Insights customMetrics table to get the function app's custom metrics via KQL, but probably not the standard metrics.
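As a sketch of that Application Insights route, assuming the classic requests schema (where resultCode is a string and cloud_RoleInstance names the serving instance), server errors per instance could be approximated like this:

// Count requests that returned a 5xx status code, split per role instance
requests
| where timestamp > ago(1h)
| where toint(resultCode) >= 500
| summarize serverErrors = count() by cloud_RoleInstance, bin(timestamp, 5m)
| render timechart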

Import Schemas in Azure Data Factory with Parameters

I am trying to develop a simple ADF pipeline that copies data from a delimited file to a MySQL database when such a file is uploaded to a Blob Storage Account. I am using parameters to define the name of the Storage Account, the Container that houses the files, and the file name (inputStorageAccount, inputContainer, inputFile). The name of the Storage Account is a global parameter and the other two are meant to be provided by the trigger. The Linked Service has also been parameterized.
However, I want to define the mappings for this operation. So, I am trying to 'import schemas' by providing the values for these parameters (I have stored a sample file in the Storage Account). But I keep getting this error when trying to do so:
What am I doing wrong? How can I get this to work?
I would also like to know why I am not being asked to provide a value for the inputContainer parameter when I try to use 'import schema' at the dataset level.
You have to add the values via Add dynamic content [Alt+P]. Go to the + symbol, where you will find a window in which you need to fill in the parameter name, type, and value; there you can directly select the parameter from the available options.
Here is another detailed scenario which might help: Using Azure DataFactory Parameterized Linked Service | Docs. You can then reset the schema.

Azure Activity Log for all the users of the organization

We want to see activity logs initiated by all the users from the organization (like users@mycompany.com). We don't want to see the activity initiated by the platform (by Azure Policy, by backup management, etc.).
On the Azure portal, there are only two options: either select 'All' or type a single user's name. I tried '*@mycompany.com' but it didn't work. Is there any way to get this?
thanks
Updated:
In Azure Monitor -> Logs, you can write a query like the one below:
AzureActivity
| where Caller contains "@mycompany.com"
Original answer:
A simple way is to just type @mycompany.com in the search box.
A more advanced method is to navigate to Azure Monitor -> Logs and use a Kusto query; then you can query whatever you like with a condition, such as the where clause EventInitiatedBy contains "@mycompany.com".
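A hedged refinement of that idea, to come closer to excluding platform-initiated activity: endswith matches only callers in the organization's domain, while platform callers are typically service principals or app GUIDs that won't match. Substitute your own domain for @mycompany.com.

// Keep only events whose caller is a user principal in the organization
AzureActivity
| where Caller endswith "@mycompany.com"
| summarize operationCount = count() by Caller, OperationNameValue
| order by operationCount desc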
