Azure Data Factory Pipeline Logs

Where do the data logs of Azure Pipeline v2 get stored? I would like to retrieve the data of failed pipelines for a specific date (I don't want to use the Azure portal to view this data). Is there any table/view in a database that holds such logs?

To my knowledge, to obtain diagnostic logs you can use Azure Monitor, Operations Management Suite (OMS), or monitor those pipelines visually.

By Azure Pipeline v2, I assume you mean Azure Data Factory v2. See: Alert and Monitor data factories using Azure Monitor.
Diagnostic logs:
Save them to a Storage Account for auditing or manual inspection. You can specify the retention time (in days) using the diagnostic settings.
Stream them to Event Hubs for ingestion by a third-party service or a custom analytics solution such as Power BI.
Analyze them with Log Analytics

The pipeline run and activity run data is kept by the Azure Data Factory service for 45 days. If you want to keep that metadata for longer, you can use the Azure Data Factory SDK to extract the information you need and save it somewhere of your choosing.
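For example, here is a minimal sketch using the Python SDK (azure-mgmt-datafactory) to list failed runs for a given day; the subscription, resource group, factory name, and the example date are placeholders, and exact model/parameter names can vary between SDK versions:

```python
# Sketch: query failed pipeline runs for one day via the ADF management SDK.
# Subscription ID, resource group, factory name and date are placeholders.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    RunFilterParameters,
    RunQueryFilter,
    RunQueryFilterOperand,
    RunQueryFilterOperator,
)

client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

# Runs last updated on 2023-05-01 (UTC) whose status is "Failed".
filters = RunFilterParameters(
    last_updated_after=datetime(2023, 5, 1, tzinfo=timezone.utc),
    last_updated_before=datetime(2023, 5, 2, tzinfo=timezone.utc),
    filters=[
        RunQueryFilter(
            operand=RunQueryFilterOperand.STATUS,
            operator=RunQueryFilterOperator.EQUALS,
            values=["Failed"],
        )
    ],
)

runs = client.pipeline_runs.query_by_factory(
    resource_group_name="<resource-group>",
    factory_name="<data-factory-name>",
    filter_parameters=filters,
)

for run in runs.value:
    print(run.pipeline_name, run.run_id, run.status, run.message)
```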

The recommended approach for long-term analysis, as well as for limiting access to a production data factory, is to configure the logs to be sent to Log Analytics. Be sure to enable dedicated (resource-specific) logging tables, as this helps on the backend in terms of organizing your logs.
From there you can also set up alerts and access groups driven by Log Analytics queries for better monitoring.
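If the dedicated tables are enabled, a hedged sketch of pulling failed runs out of the workspace with the azure-monitor-query package might look like this; the ADFPipelineRun table name assumes resource-specific mode, and the workspace ID is a placeholder:

```python
# Sketch: run a KQL query against a Log Analytics workspace for failed ADF runs.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
ADFPipelineRun
| where Status == "Failed"
| project TimeGenerated, PipelineName, RunId, ErrorMessage
| order by TimeGenerated desc
"""

response = client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query=query,
    timespan=timedelta(days=7),  # look back one week
)

for table in response.tables:
    for row in table.rows:
        print(row)
```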

Related

Is there access log or metadata for Azure Data Catalog

Is there a way to access the metadata of Azure Data Catalog? I looked up the documentation and went through the Azure Activity log of Azure Data Catalog. However, it seems there is no access-activity log (i.e. who accessed Azure Data Catalog at what point in time) that I can use. Is there such an activity log anywhere in Azure at the moment?
Unfortunately there is no way to check such activity logs. I would recommend that you have a look at Azure Purview, which has the updated Data Catalog features.
You can refer to this document, which describes how to configure metrics, alerts, and diagnostic settings for Azure Purview using Azure Monitor: Azure Purview metrics in Azure Monitor

Where can I find Azure logs for tracking API responses

I want to view logs in Azure, I mean the logs that I have in the console on localhost; where can I find them for a web site deployed in Azure? I am consuming an external API and I want to see what I send and what I receive in the prod environment.
Thank you.
There are a couple of ways to track logs in Azure:
Azure API Management
Azure Monitor
Azure API Management helps you track all kinds of requests, including:
View activity logs
View resource logs
View metrics of your API
Set up an alert rule when your API gets unauthorized calls
Azure Monitor, on the other hand, makes it possible to programmatically retrieve the available default metric definitions, granularity, and metric values.
The data can be saved in a separate data store such as Azure SQL Database, Azure Cosmos DB, or Azure Data Lake. From there additional analysis can be performed as needed.
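As a rough illustration (not a definitive implementation), metric values can be pulled with the azure-monitor-query package; the resource ID and metric names below are placeholders that depend on your resource type:

```python
# Sketch: list hourly request metrics for an App Service over the last day.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient

client = MetricsQueryClient(DefaultAzureCredential())

resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Web/sites/<app-name>"
)

result = client.query_resource(
    resource_id,
    metric_names=["Requests", "Http5xx"],
    timespan=timedelta(hours=24),
    granularity=timedelta(hours=1),
)

for metric in result.metrics:
    for series in metric.timeseries:
        for point in series.data:
            print(metric.name, point.timestamp, point.total)
```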

Difference between Auditing vs Diagnostic settings in SQL Azure

In SQL Azure, there are two options for getting database events in the Azure portal: Auditing and Diagnostic settings.
In which scenarios should each of them be used?
Azure SQL Database auditing is usually used to:
Retain an audit trail of selected events. You can define categories of database actions to be audited.
Report on database activity. You can use pre-configured reports and a dashboard to get started quickly with activity and event reporting.
Analyze reports. You can find suspicious events, unusual activity, and trends.
Diagnostics settings:
You can use the Diagnostics settings menu for each single, pooled, or instance database in Azure portal to configure streaming of diagnostics telemetry. In addition, diagnostic telemetry can also be configured separately for database containers: elastic pools and managed instances. You can set the following destinations to stream the diagnostics telemetry: Azure Storage, Azure Event Hubs, and Azure Monitor logs.
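As an illustration only, a diagnostic setting like the one described above could be created programmatically with the azure-mgmt-monitor package; the resource IDs, setting name, and log categories below are placeholders:

```python
# Sketch: stream Azure SQL database diagnostics to a Log Analytics workspace.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import DiagnosticSettingsResource, LogSettings

client = MonitorManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

database_resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<database>"
)

workspace_resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
)

client.diagnostic_settings.create_or_update(
    resource_uri=database_resource_id,
    name="send-to-log-analytics",
    parameters=DiagnosticSettingsResource(
        workspace_id=workspace_resource_id,
        logs=[
            LogSettings(category="SQLInsights", enabled=True),
            LogSettings(category="Errors", enabled=True),
        ],
    ),
)
```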
As 4c74356b41 said, they are different things with different uses.
Please reference:
Azure SQL Database metrics and diagnostics logging
Get started with SQL database auditing
Hope this helps.
Just found this "When auditing is configured to a Log Analytics workspace or to an Event Hub destination via the Azure portal or PowerShell cmdlet, a Diagnostic Setting is created with "SQLSecurityAuditEvents" category enabled.".
link
For me, this means that auditing is an easier way to enable one of the features of Diagnostic Settings.

Searching Storage Account with Azure Log Analytics

Using Log Analytics, is it possible to search through data stored in a container inside an Azure storage account? We have an Azure Function that reaches out to an API in O365 for log data and then pushes that data into a storage account. We would like to be able to query this data.
You can push the content inside your container to a Log Analytics workspace using the Log Analytics HTTP Data Collector API.
You need to build your own integration that sends the container content to Log Analytics by leveraging the HTTP Data Collector API.
You may refer to the suggestions mentioned in this article:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
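As a rough sketch of that article's approach, a small Python sender for the HTTP Data Collector API could look like the following; the workspace ID, shared key, and custom log type are placeholders, and the HMAC signing follows the scheme described in the article:

```python
# Sketch: post custom records to Log Analytics via the HTTP Data Collector API.
import base64
import datetime
import hashlib
import hmac
import json

import requests

workspace_id = "<log-analytics-workspace-id>"
shared_key = "<workspace-primary-key>"
log_type = "O365AuditLogs"  # appears as O365AuditLogs_CL in the workspace

def build_signature(date, content_length):
    # SharedKey signature over the canonical request string.
    string_to_hash = (
        f"POST\n{content_length}\napplication/json\n"
        f"x-ms-date:{date}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return f"SharedKey {workspace_id}:{encoded_hash}"

def post_data(records):
    body = json.dumps(records)
    rfc1123_date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Authorization": build_signature(rfc1123_date, len(body)),
        "Log-Type": log_type,
        "x-ms-date": rfc1123_date,
    }
    uri = (
        f"https://{workspace_id}.ods.opinsights.azure.com"
        "/api/logs?api-version=2016-04-01"
    )
    requests.post(uri, data=body, headers=headers).raise_for_status()

post_data([{"user": "alice@contoso.com", "operation": "FileAccessed"}])
```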
Additional information:
- Azure Functions
- Azure Automation
- Logic Apps
With any of these, what you will do is have some schedule that runs at a certain interval. When it runs, you will execute a query against Log Analytics to get data. The results from the query you will then transfer to Azure Storage, for example as a blob. You might have to do some transformation on the data depending on your scenario. The most important thing to make sure of is that you do not miss data or upload the same data twice to the storage. The Log Analytics query language allows you to specify a time frame for the results, as shown in the sketch below. I hope this will help you.
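A minimal sketch of that scheduled-export direction, assuming the azure-monitor-query and azure-storage-blob packages and placeholder workspace, storage, and query details:

```python
# Sketch: run a Log Analytics query and write the rows to a blob.
import json
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient
from azure.storage.blob import BlobServiceClient

logs_client = LogsQueryClient(DefaultAzureCredential())
blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")

response = logs_client.query_workspace(
    workspace_id="<log-analytics-workspace-id>",
    query="AzureDiagnostics | take 100",
    timespan=timedelta(hours=1),  # query window; align with the schedule to avoid gaps/overlap
)

rows = [list(row) for table in response.tables for row in table.rows]

blob_service.get_blob_client(
    container="exports", blob="logs/latest.json"
).upload_blob(json.dumps(rows, default=str), overwrite=True)
```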
Kindly let us know if the above helps or you need further assistance on this issue.

Event alerts in Azure SQL using auditing

We are looking to get alerts on specific tables in an Azure SQL database to initiate actions. These could be:
a) calling an Azure Search indexer API so that changes get indexed in near-real time every time the data source changes
b) pushing updates to users via SignalR when there is an update
I understand Azure SQL Database has functional limitations and triggers cannot invoke an Azure Function directly due to lack of CLR support.
Azure SQL Database trigger to insert audit info into Azure Table
Given that triggers in Azure SQL cannot invoke APIs, I was told that we could get information on database updates via auditing. However, the auditing seems to be at the blob level, not at a table level.
Given the multiple changes underway, it would be quite a task, and a time delay, to pick out the event of interest from the blob.
While there is a line about blob auditing being configurable (it supports higher-granularity, object-level auditing), I couldn't find an approach to limit the audited updates to certain tables alone. Any pointers would be appreciated.
Also, given that blob auditing seems to be built for threat detection/regulatory purposes, are there any issues with using it for event alerts?
Additional info on granular Blob Auditing in Azure SQL DB (including limiting the audit policy to specific tables/views) can be found here:
https://msdn.microsoft.com/library/azure/mt695939.aspx
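As a non-authoritative sketch of scoping the audit policy to specific tables, the azure-mgmt-sql package exposes the blob auditing policy's audit_actions_and_groups, which accepts "action ON object BY principal" entries per the blob auditing documentation; the table names, storage details, and IDs below are placeholders:

```python
# Sketch: enable blob auditing limited to specific table-level actions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import DatabaseBlobAuditingPolicy

client = SqlManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

policy = DatabaseBlobAuditingPolicy(
    state="Enabled",
    storage_endpoint="https://<storage-account>.blob.core.windows.net",
    storage_account_access_key="<storage-key>",
    audit_actions_and_groups=[
        # Only audit writes to the tables we care about, by any principal.
        "UPDATE ON dbo.Orders BY public",
        "INSERT ON dbo.Orders BY public",
        "UPDATE ON dbo.Customers BY public",
    ],
)

client.database_blob_auditing_policies.create_or_update(
    resource_group_name="<resource-group>",
    server_name="<server>",
    database_name="<database>",
    parameters=policy,
)
```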
Please note that we have also created the following OMS integration app for advanced analysis of Audit logs - you can use it to push the Blob audit logs into OMS, and then create customized alerts in OMS on top of the audit logs (this is a temporary solution, until our fully integrated OMS/Log Analytics solution is available):
https://github.com/Microsoft/Azure-SQL-DB-auditing-OMS-integration
Best Regards,
Gilad Mittelman
SQL Security, Microsoft
