I'm using the Copy Data task in Data Factory to copy data from CSV files in Azure Files to a SQL Azure DB.
Within the task there is a setting called Fault tolerance, which can be configured to skip and log incompatible rows and writes an error log to Azure Blob Storage.
However, I'd like the errors picked up from the file to be emailed to a user to action, and I'd also like to store the list of errors in a DB rather than in a log file in blob storage.
The features of Fault tolerance are fixed; there is no built-in email alert mechanism. However, you could use a workaround to implement your requirements.
Create a blob-triggered Azure Function to monitor the blob path you configured in fault tolerance. Once the error log streams into your blob, the function can collect the log and use an email SDK (for example, you could configure a SendGrid output binding) to send it to the recipients you want.
As for storing the errors in a DB, you could create another triggered function and configure its output as Table Storage.
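A minimal sketch of how both ideas could be combined in one blob-triggered function (C#, in-process model), assuming the SendGrid and Table Storage bindings are installed; the "adf-error-logs" container, the "CopyErrors" table, the email addresses, and the app-setting names are placeholders:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using SendGrid.Helpers.Mail;

public static class FaultToleranceLogHandler
{
    [FunctionName("FaultToleranceLogHandler")]
    public static void Run(
        // "adf-error-logs" stands in for the blob path configured in fault tolerance.
        [BlobTrigger("adf-error-logs/{name}", Connection = "StorageConnection")] string errorLog,
        string name,
        [SendGrid(ApiKey = "SendGridApiKey")] out SendGridMessage message,
        [Table("CopyErrors", Connection = "StorageConnection")] out ErrorRow row,
        ILogger log)
    {
        log.LogInformation($"New fault tolerance log: {name}");

        // Email the raw error log to the user who should action it.
        message = new SendGridMessage();
        message.AddTo("user@contoso.com");                            // placeholder recipient
        message.SetFrom(new EmailAddress("adf-alerts@contoso.com"));  // placeholder sender
        message.SetSubject($"Copy activity skipped rows: {name}");
        message.AddContent("text/plain", errorLog);

        // Persist the same log in Table Storage (swap this binding for a SQL insert if you prefer).
        row = new ErrorRow
        {
            PartitionKey = "CopyErrors",
            RowKey = name.Replace("/", "_"),   // row keys cannot contain '/'
            Content = errorLog
        };
    }

    public class ErrorRow
    {
        public string PartitionKey { get; set; }
        public string RowKey { get; set; }
        public string Content { get; set; }
    }
}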
Just a reminder: ADF has its own monitoring and alerting mechanism. It covers all pipelines in ADF, not specifically the copy activity. You could get an idea of it from this link.
I am fairly new to Azure.
I have a requirement where the source will send event data in flat files. Each file will contain header and trailer records, with events as data records. Each file will be about 10 MB in size and can contain about 50,000-60,000 events.
I want to process these files using Python/Scala and send the data to Azure Event Hubs. Can someone suggest whether this is the best solution and how I can achieve it?
It's an architectural question, but you can use either Azure Logic Apps or Azure Functions.
First of all, you should trigger whichever you choose by uploading a file to Blob Storage. The file then gets picked up, processed, and sent.
Use Azure Logic Apps if you can parse the files simply, for instance because they are JSON files; then you can simply loop over each event and direct it to the event hub you want.
If the parsing of the files is more complex, use Azure Functions: write the code yourself and output the events to an event hub.
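As a rough sketch of the Functions route (the "incoming-files" container, the "telemetry" event hub, the connection app settings, and the simple header/trailer handling are all assumptions):

using System.Collections.Generic;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class FlatFileToEventHub
{
    [FunctionName("FlatFileToEventHub")]
    public static void Run(
        [BlobTrigger("incoming-files/{name}", Connection = "StorageConnection")] Stream file,
        string name,
        [EventHub("telemetry", Connection = "EventHubConnection")] ICollector<string> events,
        ILogger log)
    {
        // Read the whole flat file line by line.
        var records = new List<string>();
        using (var reader = new StreamReader(file))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                records.Add(line);
            }
        }

        // Skip the header (first) and trailer (last) records; everything in between is an event.
        for (int i = 1; i < records.Count - 1; i++)
        {
            events.Add(records[i]);
        }

        log.LogInformation($"Sent {records.Count - 2} events from {name}.");
    }
}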
I have a Blob Storage, and an Azure SQL DB.
When I upload a text file to my Blob Storage, say users.txt containing a list of users, I need to import those users into the User table in my SQL DB.
Is there a way that whenever a file arrives in Blob Storage, it triggers an event, and that event triggers something to import the data into the SQL DB (I don't know, maybe an Azure Function, a Logic App...)? That way I wouldn't need to write any code. Is that possible? If so, could you please let me know step by step how to do it?
Any help would be highly appreciated.
Thanks!
Take a look at Azure Blob storage trigger for Azure Functions, which describes how you can use a "blob added" event to trigger an Azure Function. You can do something like below.
[FunctionName("SaveTextBlobToDb")]
public static void Run(
    [BlobTrigger("container-with-text-files/{name}", Connection = "StorageConnectionAppSetting")] Stream streamWithTextFile)
{
    // your logic for handling the new blob (streamWithTextFile)
}
In the implementation, you can save the blob content to your SQL database. If you want to make sure that the blob is not lost due to transient errors (like issues with DB connectivity), you can first put the info about the new blob onto an Azure Storage queue, and then have a separate Azure Function take each blob-info message from the queue and transfer the content to the database.
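For completeness, a sketch of what the body of that function could look like when writing straight to SQL (the dbo.[User] table with a single UserName column and the "SqlConnectionString" app setting are assumptions; for larger files you would batch the inserts):

using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;

public static class SaveTextBlobToDb
{
    [FunctionName("SaveTextBlobToDb")]
    public static void Run(
        [BlobTrigger("container-with-text-files/{name}", Connection = "StorageConnectionAppSetting")] Stream streamWithTextFile)
    {
        using (var reader = new StreamReader(streamWithTextFile))
        using (var connection = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnectionString")))
        {
            connection.Open();

            string user;
            while ((user = reader.ReadLine()) != null)
            {
                if (string.IsNullOrWhiteSpace(user)) continue;

                // One row per user name read from users.txt.
                using (var command = new SqlCommand(
                    "INSERT INTO dbo.[User] (UserName) VALUES (@userName)", connection))
                {
                    command.Parameters.AddWithValue("@userName", user.Trim());
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}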
One solution that comes to mind, other than the options you already know, is Azure Data Factory. It is a kind of ETL tool for the cloud. It allows you to set up pipelines for data processing with defined inputs and outputs. In your scenario the input would be a blob and the output would be a SQL database record.
You can trigger the pipeline to be executed whenever a new blob is added. The docs even have an example showing just that; you can find it here.
In your case you can probably use the Copy activity to copy the data from the blob to SQL. A tutorial titled "Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool" is found here.
An Azure Function will do the job as well but will involve coding. A Logic App is also a good option.
You answered your own question: an Azure Function or a Logic App. You can declaratively bind to your blob within an Azure Function, and you can use a blob trigger in a Logic App as well. Someone suggested Data Factory (this would likely be the most expensive option).
I am trying my best to solve the following scenario.
I am using PowerShell scripts to collect some information about my server environments and saving it as .csv files.
The .csv files contain information about hardware, running services, etc.
I am sending these .csv files to Blob Storage and using Azure Data Factory V2 pipelines to write the information into Azure SQL. I have successfully configured a mail notification via Azure Logic Apps that informs me whether the pipeline run was successful or not.
Now I am trying to look up a specific column in the source data. In my scenario it is the column named after a Windows service, for example column: PrintSpooler, row: Running.
So I need to look up that specific column and also send a mail notification saying whether the service is running or stopped.
Is there any way to do that?
Ideally I want to receive a mail only when the service in my source data is stopped.
Thank you for any ideas.
Do you update the .csv file or upload a new .csv file?
If you upload a new .csv, then you can use an Azure Functions blob trigger.
This trigger will pick up the newly uploaded blob so you can process it: read the data in the .csv file and raise an alert to your email.
For reference, this is the official document for the Azure Functions timer trigger, which you could use instead if you update the existing .csv on a schedule rather than uploading a new one:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-scheduled-function
In the blob trigger, you can check whether a particular value is present in the .csv file and then set an output binding accordingly.
Then you will get an alert in your email when the data in the .csv file meets your requirement.
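A rough sketch of that blob-triggered function, assuming the SendGrid output binding is installed; the "server-reports" container, the CSV layout (a header row, then one row per server with the server name in the first column), the PrintSpooler column name, and the email addresses are assumptions:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using SendGrid.Helpers.Mail;

public static class ServiceStatusCheck
{
    [FunctionName("ServiceStatusCheck")]
    public static async Task Run(
        [BlobTrigger("server-reports/{name}", Connection = "StorageConnection")] string csvContent,
        string name,
        [SendGrid(ApiKey = "SendGridApiKey")] IAsyncCollector<SendGridMessage> messages)
    {
        var lines = csvContent.Split('\n');
        var headers = lines[0].TrimEnd('\r').Split(',');
        int column = Array.IndexOf(headers, "PrintSpooler");   // the service column you care about
        if (column < 0) return;

        for (int i = 1; i < lines.Length; i++)
        {
            var cells = lines[i].TrimEnd('\r').Split(',');
            if (cells.Length <= column || cells[column] != "Stopped") continue;

            // Only alert when the service is stopped, per the requirement.
            var message = new SendGridMessage();
            message.AddTo("ops@contoso.com");                             // placeholder recipient
            message.SetFrom(new EmailAddress("monitor@contoso.com"));     // placeholder sender
            message.SetSubject($"PrintSpooler stopped on {cells[0]} ({name})");
            message.AddContent("text/plain", lines[i]);
            await messages.AddAsync(message);
        }
    }
}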
Using Log Analytics, is it possible to search through data stored in a container inside an Azure storage account? We have an Azure Function that reaches out to an API in O365 for log data and then pushes that data into a storage account. We would like to be able to query this data.
You can push the content inside your container to a Log Analytics workspace repository using the Log Analytics HTTP Data Collector API.
You need to build your own integration that sends the container content to Log Analytics by leveraging the HTTP Data Collector API.
You may refer to the suggestion mentioned in this article:
https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api
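A minimal sketch of calling the Data Collector API from C#, following the request format and SharedKey signature scheme described in that article (the log type name and error handling are simplified; the workspace ID and primary key come from the workspace's agent settings):

using System;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

public static class LogAnalyticsCollector
{
    public static async Task PostAsync(string workspaceId, string sharedKey, string logType, string jsonPayload)
    {
        var date = DateTime.UtcNow.ToString("r");
        var contentLength = Encoding.UTF8.GetByteCount(jsonPayload);
        var signature = BuildSignature(workspaceId, sharedKey, contentLength, date);
        var url = $"https://{workspaceId}.ods.opinsights.azure.com/api/logs?api-version=2016-04-01";

        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Authorization", signature);
            client.DefaultRequestHeaders.Add("Log-Type", logType);   // becomes the custom log name, e.g. O365Audit_CL
            client.DefaultRequestHeaders.Add("x-ms-date", date);

            var content = new StringContent(jsonPayload, Encoding.UTF8, "application/json");
            var response = await client.PostAsync(url, content);
            response.EnsureSuccessStatusCode();
        }
    }

    private static string BuildSignature(string workspaceId, string sharedKey, int contentLength, string date)
    {
        // String-to-sign format required by the Data Collector API.
        var stringToSign = $"POST\n{contentLength}\napplication/json\nx-ms-date:{date}\n/api/logs";
        using (var hmac = new HMACSHA256(Convert.FromBase64String(sharedKey)))
        {
            var hash = hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign));
            return $"SharedKey {workspaceId}:{Convert.ToBase64String(hash)}";
        }
    }
}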
Additional information:
- Azure Functions
- Azure Automation
- Logic App
With any of these, what you will do is have a schedule that runs at a certain interval. When it runs, you execute a query against Log Analytics to get the data. You then transfer the query results to Azure Storage, for example as a blob. You might have to do some transformation on the data depending on your scenario. The most important thing is to make sure that you do not miss data or upload the same data twice to storage. The Log Analytics query language allows you to specify a time frame for the results. I hope this will help you.
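As an illustration of that schedule, here is a rough sketch of a timer-triggered function that queries the last hour of data and writes it to a blob. The Azure.Monitor.Query and Azure.Storage.Blobs packages, the example Kusto query, the "log-exports" container, and the app-setting names are all assumptions:

using System;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Monitor.Query;
using Azure.Storage.Blobs;
using Microsoft.Azure.WebJobs;

public static class ScheduledLogExport
{
    [FunctionName("ScheduledLogExport")]
    public static async Task Run([TimerTrigger("0 0 * * * *")] TimerInfo timer)   // every hour, on the hour
    {
        var client = new LogsQueryClient(new DefaultAzureCredential());

        // Query only the last hour so consecutive runs neither miss rows nor upload them twice.
        var result = await client.QueryWorkspaceAsync(
            Environment.GetEnvironmentVariable("LogAnalyticsWorkspaceId"),
            "AzureActivity | project TimeGenerated, OperationNameValue, ActivityStatusValue",
            new QueryTimeRange(TimeSpan.FromHours(1)));

        // Flatten the result table to CSV-like lines; adjust the transformation to your scenario.
        var lines = result.Value.Table.Rows.Select(row => string.Join(",", row));
        var payload = Encoding.UTF8.GetBytes(string.Join("\n", lines));

        var container = new BlobContainerClient(
            Environment.GetEnvironmentVariable("StorageConnection"), "log-exports");
        await container.UploadBlobAsync(
            $"export-{DateTime.UtcNow:yyyyMMddHHmm}.csv", new BinaryData(payload));
    }
}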
Kindly let us know if the above helps or you need further assistance on this issue.
I have an event hub which receives telemetry data from different devices. I created a stream analytics job to process this data and output it to various sinks (Power BI, Cosmos DB and Data Lake). While creating the data lake output I found that I couldn't set the output path based on the message payload. The path I can set inside the sink is of the format: [folder_structure]/{date}{time}. I need a very specific folder structure which would check the message payload and put the file in the specified location. Is there any way to do that?
This capability is currently available in private preview for output to Blob Storage.
https://azure.microsoft.com/en-us/blog/4-new-features-now-available-in-azure-stream-analytics/
If this is something you can use, please provide your details at the following URL and we will add you to the preview program.
https://forms.office.com/Pages/ResponsePage.aspx?id=v4j5cvGGr0GRqy180BHbR8zMnUkKzk5Elg9i6hoUmJVUNDhIMjJESFdVNDhRODNMTVZTNDVIR0w2Qi4u