Azure Performance Diagnostics extension vs Azure Log Analytics

I am new to Azure and just want to know: what is the difference between the Azure Performance Diagnostics extension and Azure Log Analytics? Or are they the same functionality-wise?

Azure Performance Diagnostic Extension collects performance diagnostic data from VMs. The extension performs analysis, and provides a report of findings and recommendations to identify and resolve performance issues on the virtual machine.
Azure Log Analytics is a log aggregation tool: it collects and stores data from various log sources and lets you query over them using a custom query language. It collects not only performance data but also event logs and other log sources.
If you just want to collect the performance data of VM, you could use Azure Performance Diagnostic Extension. If you need to do anything more complex with this data or query across multiple resources, you could use Log Analytics.
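For illustration, here is a minimal sketch of the "query across resources" side using the azure-monitor-query Python package. It assumes the VM is already sending performance counters to a workspace; the workspace ID is a placeholder.

```python
# Minimal sketch: querying VM performance counters from a Log Analytics
# workspace with the azure-monitor-query package. Assumes the VM already
# sends data to the workspace; the workspace ID is a placeholder.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

WORKSPACE_ID = "<your-log-analytics-workspace-id>"  # placeholder

client = LogsQueryClient(DefaultAzureCredential())

# Average CPU per computer over the last 24 hours, bucketed by hour.
query = """
Perf
| where ObjectName == "Processor" and CounterName == "% Processor Time"
| summarize avg(CounterValue) by Computer, bin(TimeGenerated, 1h)
| order by TimeGenerated asc
"""

response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))

for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))
```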
Reference:
Performance diagnostics for Azure virtual machines
Azure Performance Diagnostics VM Extension for Windows
View or analyze data collected with Log Analytics log search
Azure Monitor and Azure Log Analytics

Related

Is it possible to query Azure data warehouse within log analytics

I have a scenario where I would like to query Azure Data Warehouse tables within the Log Analytics workspace and, using those records, create a result set and prepare a chart.
I do see some objects in the Log Analytics workspace, like a database and tables, but I am not sure what their purpose is, whether these objects are specific to a resource or generic, or how to use them. I couldn't find documentation for these objects; can somebody guide me on this?
Unfortunately, you cannot use Azure Log Analytics to query Azure SQL Data Warehouse.
Use Azure Data Studio to connect and query data in Azure SQL data warehouse.
Recommended tools for querying data in Azure SQL Data Warehouse.
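For illustration, a direct connection from Python with pyodbc looks roughly like this; the server, database, credentials, and table name are placeholders, and it assumes the ODBC Driver 17 for SQL Server is installed locally.

```python
# Minimal sketch: querying Azure SQL Data Warehouse directly with pyodbc
# rather than Log Analytics. Server, database, credentials, and the table
# name are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<your-server>.database.windows.net;"
    "DATABASE=<your-data-warehouse>;"
    "UID=<your-user>;"
    "PWD=<your-password>"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # dbo.FactSales is a hypothetical table; substitute your own and feed
    # the rows into whatever charting tool you prefer.
    cursor.execute("SELECT TOP 10 * FROM dbo.FactSales")
    for row in cursor.fetchall():
        print(row)
```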
Azure Log Analytics is used to write, execute, and manage Azure Monitor log queries in the Azure portal. You can use Log Analytics queries to search for terms, identify trends, analyze patterns, and provide many other insights from your data.
For more information about log queries, see Overview of log queries in Azure Monitor.
For a detailed tutorial on writing log queries, see Get started with log queries in Azure Monitor.

Difference between Auditing vs Diagnostic settings in SQL Azure

In SQL Azure, there are two options for getting database events in the Azure portal: Auditing and Diagnostic settings.
In which scenarios would we use each of them?
Azure SQL Database Auditing is usually used to:
Retain an audit trail of selected events. You can define categories of database actions to be audited.
Report on database activity. You can use pre-configured reports and a dashboard to get started quickly with activity and event reporting.
Analyze reports. You can find suspicious events, unusual activity, and trends.
Diagnostics settings:
You can use the Diagnostics settings menu for each single, pooled, or instance database in Azure portal to configure streaming of diagnostics telemetry. In addition, diagnostic telemetry can also be configured separately for database containers: elastic pools and managed instances. You can set the following destinations to stream the diagnostics telemetry: Azure Storage, Azure Event Hubs, and Azure Monitor logs.
As 4c74356b41 said, they are different things with different uses.
Please reference:
Azure SQL Database metrics and diagnostics logging
Get started with SQL database auditing
Hope this helps.
Just found this "When auditing is configured to a Log Analytics workspace or to an Event Hub destination via the Azure portal or PowerShell cmdlet, a Diagnostic Setting is created with "SQLSecurityAuditEvents" category enabled.".
link
For me, this means that auditing is an easier way to enable one of the features of Diagnostic Settings.
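To make that concrete, here is a minimal sketch (an assumption-based example, not an official recipe) of creating such a Diagnostic Setting yourself with the azure-mgmt-monitor Python package; all resource IDs are placeholders and the setting name is arbitrary.

```python
# Minimal sketch: enabling the SQLSecurityAuditEvents diagnostic category on an
# Azure SQL database and streaming it to a Log Analytics workspace. This mirrors
# what the quote above says the portal does; all IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

SUBSCRIPTION_ID = "<subscription-id>"
DATABASE_RESOURCE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<db>"
)
WORKSPACE_RESOURCE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
)

client = MonitorManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

client.diagnostic_settings.create_or_update(
    resource_uri=DATABASE_RESOURCE_ID,
    name="audit-to-log-analytics",  # arbitrary setting name
    parameters=DiagnosticSettingsResource(
        workspace_id=WORKSPACE_RESOURCE_ID,
        logs=[LogSettings(category="SQLSecurityAuditEvents", enabled=True)],
    ),
)
```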

Querying IIS logs from blob using Azure Log Analytics (formerly OMS)

I am attempting to use Azure Log Analytics to query IIS logs that sit in four separate storage accounts (based on region) and are generated from our Web Apps. Is this possible? I'm only seeing Azure activity logs in my queries. I'm very new to Log Analytics, so any help would be greatly appreciated.
There is no direct way to do this; you can take a look at this issue, which mentions:
Log Analytics is only supported at the IaaS (VM) level, not at PaaS (App Service) level.
If you want to do that, you can manually set up a process that sends the logs from blob storage to Log Analytics, following this tutorial (it's a bit involved); a rough sketch of that manual route is shown below.
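Here is a minimal sketch of that manual route, using the azure-storage-blob package to read the blobs and the Log Analytics HTTP Data Collector API to push the lines. The workspace ID/key, the storage connection string, and the container name are placeholders, not values from your setup.

```python
# Minimal sketch: read IIS log blobs from one storage account and push the
# lines to Log Analytics through the HTTP Data Collector API.
# "IISLogs" will show up in the workspace as a custom IISLogs_CL table.
import base64
import hashlib
import hmac
import json
from datetime import datetime, timezone

import requests
from azure.storage.blob import ContainerClient

WORKSPACE_ID = "<workspace-id>"
SHARED_KEY = "<workspace-primary-key>"
LOG_TYPE = "IISLogs"


def post_to_log_analytics(records):
    body = json.dumps(records).encode("utf-8")
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\nx-ms-date:{rfc1123_date}\n/api/logs"
    )
    signature = base64.b64encode(
        hmac.new(base64.b64decode(SHARED_KEY), string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode()
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"SharedKey {WORKSPACE_ID}:{signature}",
        "Log-Type": LOG_TYPE,
        "x-ms-date": rfc1123_date,
    }
    url = (f"https://{WORKSPACE_ID}.ods.opinsights.azure.com/api/logs"
           "?api-version=2016-04-01")
    requests.post(url, data=body, headers=headers).raise_for_status()


# The container name here is an assumption; use whatever your web apps log to.
container = ContainerClient.from_connection_string(
    "<storage-connection-string>", "wad-iis-logfiles"
)
for blob in container.list_blobs():
    text = container.download_blob(blob.name).readall().decode("utf-8", "ignore")
    lines = [l for l in text.splitlines() if l and not l.startswith("#")]
    if lines:
        post_to_log_analytics([{"RawLog": l, "Blob": blob.name} for l in lines])
```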
As well, you can also upvote on this feedback.

Azure Data Factory Pipeline Logs

Where do the logs of Azure Pipeline v2 get stored? I would like to retrieve data for failed pipelines on a specific date (I don't want to use the Azure portal to view this data). Is there any table/view in a database that holds such logs?
To my knowledge, to obtain diagnostic logs you can use Azure Monitor, Operations Management Suite (OMS), or monitor those pipelines visually.
By Azure Pipeline v2, I assume you mean Azure Data Factory v2. See Alert and monitor data factories using Azure Monitor.
Diagnostic logs:
Save them to a Storage Account for auditing or manual inspection. You can specify the retention time (in days) using the diagnostic settings.
Stream them to Event Hubs for ingestion by a third-party service or custom analytics solution such as PowerBI.
Analyze them with Log Analytics
The logs are stored on Azure Data Factory web server for 45 days. If you want to get the pipeline run and activity run metadata, you can use Azure Data Factory SDK to extract the information you need and save it somewhere you want.
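For example, here is a minimal sketch with the azure-mgmt-datafactory Python package that pulls failed pipeline runs for one day; the subscription, resource group, factory name, and date are placeholders.

```python
# Minimal sketch: query failed Data Factory pipeline runs for a specific date
# with the azure-mgmt-datafactory SDK instead of the portal.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    RunFilterParameters,
    RunQueryFilter,
    RunQueryFilterOperand,
    RunQueryFilterOperator,
)

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<data-factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Example date range: one full day (UTC); replace with the date you need.
filters = RunFilterParameters(
    last_updated_after=datetime(2023, 1, 1, tzinfo=timezone.utc),
    last_updated_before=datetime(2023, 1, 2, tzinfo=timezone.utc),
    filters=[
        RunQueryFilter(
            operand=RunQueryFilterOperand.STATUS,
            operator=RunQueryFilterOperator.EQUALS,
            values=["Failed"],
        )
    ],
)

runs = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)
for run in runs.value:
    print(run.pipeline_name, run.run_start, run.status, run.message)
```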
The recommended approach for long-term analysis, as well as for limiting access to a production data factory, is to configure logs to be sent to Log Analytics. Be sure to enable dedicated logging tables, as this will help on the backend in terms of organizing your logs.
From there you can also set up alerts and access groups running off of log analytics queries for better monitoring.

Is it possible to log Azure Cloud Service Performance Counters in Elasticsearch?

I'd like to use Kibana to create views that display log and metric information output by our Azure Cloud Service Web Roles and Worker Roles. In particular, we'd like to store performance counter information, as described here: https://learn.microsoft.com/en-us/azure/cloud-services/cloud-services-dotnet-diagnostics-performance-counters
Microsoft provides a few tools to view this data if it's stored in Azure diagnostic tables and blobs, but those tools don't have the formatting and visualization flexibility the ELK stack does. Is anyone aware of how we might be able to get Azure Cloud Service performance counter information into Elasticsearch, preferably via Logstash?
I've achieved this by enabling Azure diagnostics (writing to Azure table storage) and using ConveyorBelt to send the logs to ElasticSearch.
As the GitHub page states, ConveyorBelt is:
A horizontally scalable headless cluster to shovel Azure diagnostic data (and other custom data) to ElasticSearch
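If you want to roll your own instead of running ConveyorBelt, a minimal sketch of the same idea in Python could look like this. It is not ConveyorBelt itself; the connection string, Elasticsearch URL, index name, and partition-key filter are placeholders, while WADPerformanceCountersTable is the standard table Azure Diagnostics uses for performance counters.

```python
# Minimal sketch: read performance counters that Azure Diagnostics wrote to
# table storage and index them into Elasticsearch.
from azure.data.tables import TableClient
from elasticsearch import Elasticsearch

table = TableClient.from_connection_string(
    "<diagnostics-storage-connection-string>",
    table_name="WADPerformanceCountersTable",
)
es = Elasticsearch("http://<elasticsearch-host>:9200")

# WAD partition keys are zero-padded tick counts, so a "ge" filter is a crude
# way to limit how far back you read; adjust it to your retention needs.
for entity in table.query_entities("PartitionKey ge '0'"):
    doc = {
        "role": entity.get("Role"),
        "role_instance": entity.get("RoleInstance"),
        "counter": entity.get("CounterName"),
        "value": entity.get("CounterValue"),
        "event_ticks": entity.get("EventTickCount"),
    }
    es.index(index="azure-performance-counters", document=doc)
```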
