I have an Azure Data Collection Rule, a Log Analytics workspace, and Azure Arc, but when I want to select a data source destination, I don't see any.
Does anybody know where my error is?
Thank you very much.
I tried to reproduce the same in my environment and I also got a blank data source destination, like below:
This issue may occur if your Log Analytics workspace is not in the same region as your Azure Arc machine.
To confirm this, I created a Log Analytics workspace in a different region; it does not appear in my data source destination with Central US, like below:
When I created a Log Analytics workspace in the same region, I got my data source destination, like below:
Make sure your Log Analytics workspace and Azure Arc machine are in the same region.
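If it helps to double-check, here is a minimal sketch (assuming the azure-mgmt-resource and azure-identity packages; the subscription ID is a placeholder) that lists both resource types with their regions so you can compare:

```python
# Sketch: print the region of every Log Analytics workspace and Azure Arc
# machine in the subscription, to confirm the regions match.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

for rtype in ("Microsoft.OperationalInsights/workspaces",
              "Microsoft.HybridCompute/machines"):
    for res in client.resources.list(filter=f"resourceType eq '{rtype}'"):
        print(rtype, res.name, res.location)
```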
I have an Azure Blob Storage account used to migrate CSV files to an Azure SQL Database. Data is uploaded to this Blob Storage account every month.
The issue I'm facing is that I don't know the source of that blob data. Last month no data was uploaded, so I don't know where the issue occurred. Is there any possibility to find out what the source for that blob is?
I couldn't try anything because I couldn't identify the problem.
I agree with @Eric Qvarnstrom: too much information is missing, like SAS tokens and where the data is coming from.
Is there any possibility to find where the source for that blob is? And why was no data uploaded last month?
AFAIK it is not possible to find where the data is coming from on the storage side. For that you need to check with the source owners or the stakeholders who are placing files in storage.
Also check whether the keys shared with the source system have expired, and if no data is uploaded, check with the source data team whether they are facing any issues uploading new data this month.
I would suggest you set up an alert rule to track Create/Update Storage Account (Storage Accounts) operations, to notify you next time.
As shown in the images below, under the Monitoring blade select Alerts > Create > Alert rule.
After selecting Create an alert rule, select your storage account under Scope, and under Condition select Create/Update Storage Account (Storage Accounts), as shown below.
Refer to this MS document to create a new alert rule.
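If you prefer to create the same alert rule programmatically, here is a rough sketch with the azure-mgmt-monitor SDK (model names follow the 3.x SDK and may differ in newer versions; all resource IDs and names are placeholders):

```python
# Sketch: activity-log alert that fires on Create/Update Storage Account,
# i.e. the Microsoft.Storage/storageAccounts/write operation.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    ActivityLogAlertResource, ActivityLogAlertAllOfCondition,
    ActivityLogAlertLeafCondition, ActivityLogAlertActionList,
    ActivityLogAlertActionGroup,
)

sub = "<subscription-id>"
client = MonitorManagementClient(DefaultAzureCredential(), sub)

alert = ActivityLogAlertResource(
    location="Global",
    scopes=[f"/subscriptions/{sub}/resourceGroups/<rg>/providers"
            "/Microsoft.Storage/storageAccounts/<account>"],
    condition=ActivityLogAlertAllOfCondition(all_of=[
        ActivityLogAlertLeafCondition(field="category", equals="Administrative"),
        # "Create/Update Storage Account" in the portal maps to this operation
        ActivityLogAlertLeafCondition(
            field="operationName",
            equals="Microsoft.Storage/storageAccounts/write"),
    ]),
    actions=ActivityLogAlertActionList(action_groups=[
        ActivityLogAlertActionGroup(
            action_group_id=f"/subscriptions/{sub}/resourceGroups/<rg>"
                            "/providers/microsoft.insights/actionGroups/<group>"),
    ]),
)

client.activity_log_alerts.create_or_update("<rg>", "storage-write-alert", alert)
```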
Someone has deleted a file share folder from a storage account in Azure. It can be recovered since soft delete is enabled, but how can we find out who deleted the file?
It is possible to view operations within an Azure resource using resource logs. This is part of monitoring Azure Blob Storage, a feature of Azure Monitor.
You would first start by creating a diagnostic setting: https://learn.microsoft.com/en-us/azure/storage/blobs/monitor-blob-storage?tabs=azure-portal#creating-a-diagnostic-setting
Then view the logged activity using a Log Analytics query, or go to the destination you are forwarding the logs to (as set up in the diagnostic setting) and look for the respective API operation, for example "DeleteBlob" or "DeleteContainer".
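For example, if the logs are forwarded to a Log Analytics workspace, a query along these lines is a reasonable starting point (a sketch; for a file share the analogous table is StorageFileLogs, with operations such as "DeleteFile" and "DeleteDirectory"):

```kusto
StorageBlobLogs
| where OperationName in ("DeleteBlob", "DeleteContainer")
| project TimeGenerated, OperationName, AuthenticationType,
          RequesterObjectId, CallerIpAddress, Uri
```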
However, if you had not already set up a diagnostic setting forwarding data to a destination, it may not be possible to retrieve this information now. Hope this helps!
I have configured a diagnostic setting in Application Insights to transfer telemetry data to a storage account. I do not want to transfer the user_authenticationId column from the pageViews data. How can I prevent it from being transferred to the storage account using diagnostic settings?
• Using a diagnostic setting, it is not possible to exclude columns from being exported through the pageViews category in Application Insights. Instead, you can exclude the user_authenticationId column with an Application Insights log filter query executed against the pageViews table, then save the query as a function to run at a time of your choosing, or export the output of that query for a particular timestamp to an Excel file or a storage account of your choosing.
Please find below an Application Insights log query for excluding the column as stated above:
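A minimal sketch of such a query follows; note that in the Application Insights logs schema the column is usually spelled user_AuthenticatedId, so adjust the name to whatever your pageViews table actually shows:

```kusto
pageViews
| project-away user_AuthenticatedId
```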
Also, find below the documentation link for more detailed information on exporting query results to a storage account and the requirements for doing so:
https://learn.microsoft.com/en-us/azure/azure-monitor/app/export-telemetry
Thus, in this way, you can achieve the desired result.
In my Azure subscription I have a storage account with a lot of tables that contain important data.
As far as I know, Azure offers point-in-time restore for blob storage and geo-redundancy in the event of a failover, but I couldn't find anything regarding backup of table storage.
The only way to do so is by using AzCopy, which is fine and logical, but I couldn't make it work, as I had some permission issues even though I granted Storage Blob Data Contributor on my container.
So, as an option, I was wondering whether there is a way to implement this using Python code: loop through all the tables in a specific storage account and copy each into another one.
Can anyone enlighten me on this matter please?
Did you set the Azure Storage firewall to allow access from all networks?
Python code is a way, but we can't design the code for you, and providing a full solution doesn't meet Stack Overflow's guidelines; a minimal sketch of the idea is shown below.
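Purely as an illustration of the approach, a sketch using the azure-data-tables package (connection strings are placeholders; error handling and batching are omitted):

```python
# Sketch: copy every table, entity by entity, from one storage account
# to another using the azure-data-tables SDK.
from azure.data.tables import TableServiceClient

src = TableServiceClient.from_connection_string("<source-connection-string>")
dst = TableServiceClient.from_connection_string("<destination-connection-string>")

for table in src.list_tables():
    src_table = src.get_table_client(table.name)
    dst_table = dst.create_table_if_not_exists(table.name)
    for entity in src_table.list_entities():
        # upsert keeps the copy idempotent if the job is re-run
        dst_table.upsert_entity(entity)
```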
If you still can't figure it out with AzCopy, I would suggest you consider using Data Factory to schedule backing up the data from Table storage to another Table storage:
1. Create a pipeline with a Copy activity to copy the data from Table Storage. Ref this tutorial: Copy data to and from Azure Table storage by using Azure Data Factory.
2. Create a schedule trigger for the pipeline to make the jobs automatic.
If the Table storage has many tables, the easiest way is to use the Copy Data tool.
Update:
Copy Data tool source settings:
Sink settings: auto-create the table in the sink Table storage.
HTH.
I have created a pipeline in Azure Data Factory (V1). I have a copy pipeline that has an AzureSqlTable dataset as input and an AzureBlob dataset as output. The AzureSqlTable dataset that I use as input is created as the output of another pipeline, in which I launch a procedure that copies one table entry to a blob CSV file.
I get the following error when launching the pipeline:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
How can I solve this?
According to the error information, the CopyBehavior property is not supported when the source is a tabular data source; it applies only to file-based sources, so it has to be left off the sink. Copying from an Azure SQL table to Azure Blob storage is itself supported by Azure Data Factory.
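For reference, a minimal sketch of what the V1 copy activity JSON might look like with no copyBehavior on the sink (the activity and dataset names are placeholders):

```json
{
  "name": "CopyFromSqlToBlob",
  "type": "Copy",
  "inputs": [ { "name": "AzureSqlTableDataset" } ],
  "outputs": [ { "name": "AzureBlobDataset" } ],
  "typeProperties": {
    "source": { "type": "SqlSource" },
    "sink": { "type": "BlobSink" }
  }
}
```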
I also did a demo test of this in the Azure portal. You can follow these detailed steps to do the same:
1. Click Copy data in the Azure portal.
2. Set the copy properties.
3. Select the source.
4. Select the destination data store.
5. Complete the deployment.
6. Check the result in Azure and in the storage account.
Update:
If we want to use an existing dataset, we can choose [From Existing Connections]; for more information, please refer to the screenshot.
Update2:
Data Factory (V1) copy activity settings only support using an existing Azure Blob storage / Azure Data Lake Store dataset. For more detailed information, please refer to this link.
If using Data Factory (V2) is acceptable, we can use an existing Azure SQL dataset.
So, actually, if we don't use this awful "Copy data (PREVIEW)" wizard, and instead add an activity to an existing pipeline rather than creating a new pipeline, everything works. So the solution is to add a copy activity manually into an existing pipeline.