Is there a way to determine the data retention period set on an existing Log Analytics workspace?
You can find the retention details in the pricing details:
The first 5 GB of data ingested into the Azure Log Analytics service every month is offered free.
Every GB of data ingested into your Azure Log Analytics workspace is retained at no charge for the first 31 days.
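If you want to check the configured retention programmatically rather than in the portal, here is a minimal PowerShell sketch, assuming the Az.OperationalInsights module (the resource group and workspace names are placeholders):

```powershell
# Sign in, then read the retention period off the workspace object
Connect-AzAccount

$ws = Get-AzOperationalInsightsWorkspace -ResourceGroupName "my-rg" -Name "my-workspace"
$ws.RetentionInDays   # the configured retention period, in days
```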
I am trying to fetch 1 year of retention logs from an Azure Log Analytics workspace and push them to a storage account.
I tried the Data export option, but it pushes only the logs generated from the time I enabled Data export, not the retained logs.
As per my understanding, Data export in a Log Analytics workspace only exports new data ingested after the Data export rules are configured. So you are getting the logs created after enabling Data export, but not the previously retained logs.
Please check whether you have configured the retention time as per your requirement.
As per this Microsoft Doc, data in a Log Analytics workspace is retained for a specified period of time, after which it's either removed or archived with a reduced retention fee. Set the retention time to balance your requirement for having data available with reducing your cost for data retention.
To set the retention time, follow the steps below:
Go to Azure Portal -> Your Log Analytics Workspace -> Usage and Estimated Costs -> Data Retention
To do this from PowerShell/CLI, see this link (a sketch follows below).
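As a rough sketch of the PowerShell route, assuming the Az.OperationalInsights module (resource names and the 365-day value are placeholders):

```powershell
# Update the workspace retention period (here 365 days, to match a 1-year requirement)
Set-AzOperationalInsightsWorkspace -ResourceGroupName "my-rg" `
    -Name "my-workspace" -RetentionInDays 365
```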
Data retention may also differ based on the pricing tier you are using. To know more about that, please go through this link.
You can also try using other export options as mentioned in this Microsoft Doc.
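For example, one way to get already-retained data out of the workspace is to query it yourself and upload the results to a storage account. A minimal sketch, assuming the Az.OperationalInsights and Az.Storage modules (the workspace GUID, table, and storage names are placeholders; the query API caps result size, so a real export would page through smaller time ranges):

```powershell
# Query up to a year of retained data (placeholder table and workspace ID)
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId "<workspace-guid>" `
    -Query "AzureActivity | where TimeGenerated > ago(365d)" `
    -Timespan (New-TimeSpan -Days 365)
$result.Results | ConvertTo-Json -Depth 5 | Out-File .\export.json

# Push the exported file to a blob container
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"
Set-AzStorageBlobContent -File .\export.json -Container "exports" `
    -Blob "activity-365d.json" -Context $ctx
```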
If you are still facing any issues, the references below may be helpful.
References:
Troubleshooting why Log Analytics is no longer collecting data
Other options to export data from Log Analytics workspace
Data Collection Rules in Azure Monitor - Azure Monitor | Microsoft Docs
I am trying to find an Azure blob or container size using Diagnostic settings (classic) version 2 logs. I have turned on Diagnostic settings (classic) version 2 logs for a storage account and am trying to analyze storage account activity. Is it possible to find a blob's or a container's size over a period of time?
Any help is much appreciated.
Diagnostic settings (classic) version 2: two entities are stored in the $MetricsCapacityBlob table each day, one summarizing blob and container size details for the storage account, and the other summarizing the size of the $logs container.
Finding a blob or container size over a period of time:
Note that the PartitionKey is a timestamp in UTC that represents the starting hour for the metrics, in the following format: YYYYMMddThhmm. Because data is only reported once per day, hhmm (hour and minutes) will always be 0000. This value is the PartitionKey for all entries in the table. Reference
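To illustrate, a sketch of querying that table over a date range from PowerShell, assuming the Az.Storage and AzTable modules (all names and dates are placeholders; note that $MetricsCapacityBlob is a system table, so it does not appear in normal table listings and Get-AzStorageTable may need the exact name, or a direct SDK table reference instead):

```powershell
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"
# System table; single quotes keep PowerShell from expanding the $ sign
$capacityTable = (Get-AzStorageTable -Name '$MetricsCapacityBlob' -Context $ctx).CloudTable

# PartitionKey is 'YYYYMMddThhmm' (hhmm always 0000), so a date range becomes a
# string-range filter; RowKey 'data' holds user-data capacity, 'analytics' the $logs size.
$filter = "(PartitionKey ge '20230101T0000') and (PartitionKey le '20230131T0000')" +
          " and (RowKey eq 'data')"
Get-AzTableRow -Table $capacityTable -CustomFilter $filter |
    Select-Object PartitionKey, Capacity, ContainerCount, ObjectCount
```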
To enable diagnostic settings, you can refer to this.
Also see storage analytics metrics/capacity-metrics for more information.
Note: There are costs associated with examining monitoring data in the Azure portal.
To check the size of a container
Azure portal:
In the Azure portal, you can see Usage under the Blob Service blade.
Open the blob container, select Properties under Settings, then click the Calculate Size button, which gives the size for each type of blob as well as the total size of the container.
Please refer to these threads if you want to use PowerShell scripts: thread 1, thread 2. A quick sketch of the same idea follows below.
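As a minimal sketch, assuming the Az.Storage module (account and container names are placeholders; this enumerates every blob, so it can be slow on large containers):

```powershell
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"

# Sum the Length of every blob in the container to get its total size
$totalBytes = (Get-AzStorageBlob -Container "mycontainer" -Context $ctx |
    Measure-Object -Property Length -Sum).Sum
"{0:N2} GiB" -f ($totalBytes / 1GB)
```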
As mentioned here: https://learn.microsoft.com/en-us/azure/security/fundamentals/operational-security#azure-storage-analytics
Does Azure Storage Analytics stop logging to the $logs container after reaching the 20 TB limit? Or does it automatically delete older log files to make space for new ones?
Credit to Sumarigo-MSFT on the Azure forums for the answer:
By default, Storage Analytics will not delete any logging or metrics data. Blobs and table entities will continue to be written until the shared 20 TiB limit is reached. Once the 20 TiB limit is reached, Storage Analytics will stop writing new data and will not resume until free space is available. This 20 TiB limit is independent of the total limit for your storage account. For more information on storage account limits, see Azure Storage Scalability and Performance Targets.
There are two ways to delete Storage Analytics data: by manually making deletion requests or by setting a data retention policy. Manual requests to delete Storage Analytics data are billable, but delete requests resulting from a retention policy are not billable.
Note: To avoid unnecessary charges, set a retention policy for logging and metrics.
You can configure two data retention policies: one for logging and one for metrics. When enabled for both, Storage Analytics will delete logs and table entries older than the specified number of days. The maximum retention period is 365 days (1 year).
Note: If you disable Storage Analytics for a storage service but a data retention policy is enabled, your old data will continue to get deleted. To avoid accidental data loss, ensure that you configure your data retention policy when enabling and disabling Storage Analytics.
Source: Does Storage Analytics stop logging in $logs after reaching the 20TB limit?
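To sketch what setting those two retention policies looks like in PowerShell, assuming the Az.Storage module (the account name, key, and 90-day value are placeholders):

```powershell
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey "<key>"

# Retention policy for logging: delete log blobs older than 90 days
Set-AzStorageServiceLoggingProperty -ServiceType Blob -LoggingOperations All `
    -RetentionDays 90 -Context $ctx

# Retention policy for metrics: delete metrics table entities older than 90 days
Set-AzStorageServiceMetricsProperty -ServiceType Blob -MetricsType Hour `
    -MetricsLevel ServiceAndApi -RetentionDays 90 -Context $ctx
```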
I am trying to check the status of my jobs running in Azure Databricks through an Azure Monitor logs dashboard, but most of the recent data is not visible. Can someone help me out with getting near real-time monitoring of my jobs?
Log Analytics has a delay in receiving metrics; this is not just for Databricks, as almost any resource in Azure has this delay.
At this time it is:
Data from resource logs takes 2-15 minutes, depending on the Azure service.
Azure platform metrics take 3 minutes to be sent to the Log Analytics ingestion point.
Activity log data takes about 10-15 minutes to be sent to the Log Analytics ingestion point.
This is due to just how the data is read and inserted on the backend.
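If you want to see the actual lag for your own data, one option is to compare each record's TimeGenerated with its ingestion_time(). A sketch assuming the Az.OperationalInsights module (the workspace GUID and the AzureDiagnostics table are placeholders; Databricks diagnostic logs commonly land there):

```powershell
# Compare event time with ingestion time to measure the current delay
$query = @"
AzureDiagnostics
| where TimeGenerated > ago(1h)
| extend LatencyMinutes = datetime_diff('minute', ingestion_time(), TimeGenerated)
| summarize avg(LatencyMinutes), max(LatencyMinutes)
"@
(Invoke-AzOperationalInsightsQuery -WorkspaceId "<workspace-guid>" -Query $query).Results
```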
If you are looking for an increased ability to monitor Databricks jobs, look at integrating your Databricks notebooks with Azure Data Factory to create pipelines.
In the Metrics section of Cosmos DB we have Number of requests and Max Consumed RU/s per partition key, by collection. How do I send these to OMS so that I can have a nice consolidated dashboard and set up an alerting system?
FYI, the OMS portal is transitioning to the Azure portal for Log Analytics users; the OMS portal will be officially retired on January 15, 2019.
Back to your question:
Yes, OMS/Log Analytics can be used to collect metrics/logs from Cosmos DB. Just configure the diagnostic logs to be sent to Log Analytics from your Cosmos DB account.
Refer here
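A rough PowerShell sketch of that configuration, assuming the Az.Monitor module with the older Set-AzDiagnosticSetting cmdlet (newer Az.Monitor versions use New-AzDiagnosticSetting instead; all resource names are placeholders):

```powershell
# Resource IDs for the Cosmos DB account and the Log Analytics workspace (placeholder names)
$cosmosId = (Get-AzResource -ResourceGroupName "my-rg" -Name "my-cosmos" `
    -ResourceType "Microsoft.DocumentDB/databaseAccounts").ResourceId
$workspaceId = (Get-AzOperationalInsightsWorkspace -ResourceGroupName "my-rg" `
    -Name "my-workspace").ResourceId

# Route the account's diagnostic logs and metrics to the workspace
Set-AzDiagnosticSetting -ResourceId $cosmosId -WorkspaceId $workspaceId `
    -Enabled $true -Name "cosmos-to-loganalytics"
```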