Azure Data Factory Lookup Source Data and Mail Notification

I am trying my best to solve the following scenario.
I am using PowerShell scripts to collect some information about my server environments and saving it as .csv files.
The .csv files contain information about hardware, running services, etc.
I am sending these .csv files to Blob Storage and using Azure Data Factory V2 pipelines to write this information into Azure SQL. I have successfully configured a mail notification via Azure Logic Apps that informs me whether the pipeline run was successful or unsuccessful.
Now I am trying to look up a specific column in the source data. In my scenario it is a column named after a Windows service - for example - column: PrintSpooler - row: Running.
So I need to look up that specific column and also send a mail notification telling me whether the service is running or stopped.
Is there any way to do that?
Ideally, I would like to receive a mail only when the service in my source data is stopped.
Thank you for any ideas.

Do you update the .csv file or upload a new .csv file?
If you upload a new .csv, then you can use an Azure Function blob trigger.
The trigger fires on the newly uploaded blob, so you can process it there: read the data in the .csv file and raise an alert to your email.
This is the official documentation for creating a scheduled (timer-triggered) Azure Function, an alternative if you would rather check on a schedule:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-scheduled-function
In the blob trigger, you can check whether the .csv file contains the value you are looking for and then set an output binding.
Then you will get the alert in your email when the data in the .csv file meets your requirement.
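As a rough illustration of that approach, here is a minimal sketch (not from the original answer) of a blob-triggered Azure Function in Python (v1 programming model) that reads the uploaded .csv, checks a service column, and sends mail through a SendGrid output binding. The container path, column name, email address, and the binding named "message" (declared as type "sendGrid" in function.json) are all assumptions.

```python
import csv
import io
import json

import azure.functions as func


def main(myblob: func.InputStream, message: func.Out[str]):
    # Read the uploaded .csv blob into a list of dict rows.
    rows = list(csv.DictReader(io.StringIO(myblob.read().decode("utf-8"))))

    # Hypothetical column name: collect every row where "PrintSpooler" is not "Running".
    stopped = [r for r in rows if r.get("PrintSpooler", "").strip() != "Running"]
    if not stopped:
        return  # service is running everywhere; send no mail

    # The SendGrid output binding accepts a JSON mail payload.
    message.set(json.dumps({
        "personalizations": [{"to": [{"email": "ops@example.com"}]}],  # placeholder address
        "subject": f"PrintSpooler not running ({myblob.name})",
        "content": [{
            "type": "text/plain",
            "value": f"{len(stopped)} row(s) in the uploaded file report the service as stopped.",
        }],
    }))
```

This mirrors the answer's idea: a mail is only produced when the value you search for indicates a stopped service.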

Related

How do I compress data in an Azure Logic App to send in an email?

I have a simple Logic App that runs a daily SQL report, converts the data into a CSV, then attaches the CSV to an email.
The problem is that the data is getting too big for the email server to allow. So, if I can compress the CSV, or convert it to an XLS, it will be small enough for the email server to handle.
Can I get this done without writing to blob storage or any other storage system? I did find a 3rd party action by Encodian which I might be able to use, just can't figure out the details.
You can use Encodian's Convert Excel action.
In order to set up the connection, you need to add an API key, choosing the Encodian subscription that fits your needs and budget.
For more information, you can refer to Encodian Flowr for Azure Logic Apps.
Alternatively, you can use Plumsail's Csv to Excel action.
While adding the connector, you can generate an API key from Plumsail Account API Key.
REFERENCES:
Convert Excel and CSV Files
How to convert CSV files to Excel

Process event files into Azure EventHub

I am fairly new to Azure.
I have a requirement where the source will send event data in flat files. Each file will contain header and trailer records, with the events as data records. Each file will be about 10 MB in size and can contain about 50,000-60,000 events.
I want to process these files using Python/Scala and send the data to Azure Event Hubs. Can someone suggest whether this is the best solution and how I can achieve it, please?
It's an architectural question, but you can use either Azure Logic Apps or Azure Functions.
First of all, whichever you choose should be triggered by uploading the file to Blob Storage. The file then gets picked up, processed, and sent.
Use Azure Logic Apps if the files are simple to parse, for instance because they are JSON files: repeat over each event and direct it to the event hub you want.
If parsing the files is more complex, use Azure Functions: write up the code and output the events to an event hub.
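For the Functions/code route, here is a minimal sketch (not part of the original answer) of the parsing-and-sending step, assuming the azure-eventhub (v5) package, a connection string in an environment variable, and a flat file whose first line is the header record and whose last line is the trailer record; file and hub names are placeholders:

```python
import os

from azure.eventhub import EventData, EventHubProducerClient


def send_flat_file(path: str) -> None:
    producer = EventHubProducerClient.from_connection_string(
        os.environ["EVENTHUB_CONNECTION_STRING"], eventhub_name="events"
    )
    with open(path, "r", encoding="utf-8") as f:
        lines = [line.rstrip("\n") for line in f if line.strip()]
    data_records = lines[1:-1]  # drop the header and trailer records

    with producer:
        batch = producer.create_batch()
        for record in data_records:
            try:
                batch.add(EventData(record))
            except ValueError:          # current batch is full: send it and start a new one
                producer.send_batch(batch)
                batch = producer.create_batch()
                batch.add(EventData(record))
        if len(batch) > 0:
            producer.send_batch(batch)  # flush the remaining events


if __name__ == "__main__":
    send_flat_file("events_file.txt")  # placeholder file name
```

Batching this way keeps the 50,000-60,000 events per file within Event Hubs' per-request size limits.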

How to convert an Excel file to CSV from one Azure Blob container to another using an SSIS Script Task or Azure Logic Apps

I receive an Excel file every day with one sheet (the sheet name will be different every time), and it is stored in an Azure Blob container. Is there any possibility to convert the Excel file to CSV, either by using an SSIS Script Task or Azure Logic Apps?
Any help would be appreciated. Thank you.
There are many ways we can do that.
With a Logic App, you could follow the answer here:
Converting should be pretty easy. At a high level, you can do the following:
Use the Excel connector to read the content of the Excel file
Use Data Operations - Create CSV table to create CSV output populated with the dynamic data from step #1
Use the Azure Blob connector to create and save the new .csv file in Blob Storage
Since the Excel file is stored in Blob Storage, I would suggest you use Data Factory; it supports Excel files directly:
Create the source dataset for the Excel file.
Create the sink dataset and set the new .csv file name.
Run the Copy activity.
It works well and is very easy and direct.
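If you would rather do the conversion in code instead of Logic Apps or Data Factory (the question also mentions an SSIS Script Task), here is a minimal sketch of the same idea in Python, assuming the azure-storage-blob and pandas/openpyxl packages; container, blob, and connection-string names are placeholders:

```python
import io
import os

import pandas as pd
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION_STRING"])

# Download the Excel blob; sheet_name=0 reads the single sheet regardless of its name.
source_blob = service.get_blob_client("incoming-excel", "report.xlsx")
df = pd.read_excel(io.BytesIO(source_blob.download_blob().readall()), sheet_name=0)

# Upload the converted CSV into the destination container.
target_blob = service.get_blob_client("converted-csv", "report.csv")
target_blob.upload_blob(df.to_csv(index=False), overwrite=True)
```

Reading with sheet_name=0 sidesteps the "sheet name is different every time" issue mentioned in the question.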

Data Factory Email Errors when found on rows

I'm using the Copy Data task in Data Factory to copy data from CSV files in Azure Files to a SQL Azure DB.
Within the task there is a setting called Fault tolerance, which can be set to skip and log incompatible rows; this writes an error log to Azure Blob Storage.
However, I'd like the errors picked up from the file to be emailed to a user to action, and I'd also like to store the list of errors in a DB rather than in a log file in blob storage.
Fault tolerance has a fixed feature set, and there is no such email alert mechanism in it. However, you could use a workaround to implement your requirements.
Use a blob-triggered Azure Function to monitor the blob path you configured in fault tolerance. Once the error log streams into your blob file, you can collect the log and use an email SDK (for example, you could configure the function's output binding as SendGrid) to send it to the destinations you want.
As for storing the errors in a DB, you could create another triggered function and configure its output as Table Storage.
Just a reminder: ADF has its own monitoring and alerting mechanism. It covers all pipelines in ADF, not specifically the Copy activity. You can get an idea of it from this link.
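As a sketch of that workaround (assumptions: a v1-model Python function whose blob trigger is bound to the fault-tolerance log path in function.json, the azure-data-tables package, and a placeholder table name), the function below stores each logged error row in Table Storage; an email step could be added via a SendGrid output binding as in the earlier example:

```python
import os

import azure.functions as func
from azure.data.tables import TableServiceClient


def main(errorlog: func.InputStream):
    # Each line of the fault-tolerance log describes one skipped row.
    lines = errorlog.read().decode("utf-8").splitlines()

    table_service = TableServiceClient.from_connection_string(
        os.environ["AzureWebJobsStorage"]
    )
    table = table_service.create_table_if_not_exists("CopyActivityErrors")

    # One entity per error line; the blob name groups the errors by pipeline run.
    for i, line in enumerate(lines):
        table.upsert_entity({
            "PartitionKey": errorlog.name.replace("/", "_"),  # keys cannot contain '/'
            "RowKey": str(i),
            "ErrorLine": line,
        })
```

If you need the errors in Azure SQL rather than Table Storage, the same loop could instead issue inserts with a SQL driver, but that is beyond what the answer describes.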

Azure Data Factory and SharePoint

I have some Excel files stored in SharePoint Online. I want to copy the files stored in SharePoint folders to Azure Blob Storage.
To achieve this, I am creating a new pipeline in Azure Data factory using Azure Portal. What are possible ways to copy files from SharePoint to Azure blob store using Azure Data Factory pipelines?
I have looked at all linked services types in Azure data factory pipeline but couldn't find any suitable type to connect to SharePoint.
Rather than directly accessing the file in SharePoint from Data Factory, you might have to use an intermediate technology and have Data Factory call that. You have a few options:
Use a Logic App to move the file
Use an Azure Function
Use a custom activity and write your own C# to copy the file.
To call a Logic App from ADF, you use a web activity.
You can directly call an Azure Function now.
We can create a linked service of type 'File system' by providing the directory URL as the 'Host' value. To authenticate the user, provide the username and password/AKV details.
Note: use a self-hosted IR.
You can use a Logic App to fetch data from SharePoint and load it into Azure Blob Storage, and then use Azure Data Factory to fetch the data from the blob. You can even set an event trigger so that whenever a file arrives in the blob container, the ADF pipeline triggers automatically.
You can use Power Automate (https://make.powerautomate.com/) to do this task automatically:
Create an automated cloud flow that triggers whenever a new file is dropped in SharePoint
Use whichever of the offered triggers fits your requirement and fill in the SharePoint details
Add an action to create a blob and fill in the details as per your use case
By using this, the SharePoint files are copied to the blob without even using ADF.
My previous answer was true at the time, but in the last few years, Microsoft has published guidance on how to copy documents from a SharePoint library. You can copy a file from SharePoint Online by using a Web activity to authenticate and grab an access token from SPO, then passing it to a subsequent Copy activity that copies the data with the HTTP connector as the source.
I ran into some issues with large files and Logic Apps. It turned out there were some extremely large files to be copied from that SharePoint library. SharePoint has a default limit of 100 MB buffer size, and the Get File Content action doesn’t natively support chunking.
I successfully pulled the files with the web activity and copy activity. But I found the SharePoint permissions configuration to be a bit tricky. I blogged my process here.
You can use a binary dataset if you just want to copy the full file rather than read the data.
If my file is located at https://mytenant.sharepoint.com/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV, the URL I need to retrieve the file is https://mytenant.sharepoint.com/sites/site1/_api/web/GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value.
Be careful about when you get your auth token. Your auth token is valid for 1 hour. If you copy a bunch of files sequentially and it takes longer than that, the token can expire partway through and the copy will fail.
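Here is a rough Python equivalent (not from the answer) of the Web activity + HTTP source flow described above, assuming a SharePoint app registration that has been granted access to the site; the tenant ID, client ID/secret, and paths are placeholders:

```python
import requests

TENANT_ID = "00000000-0000-0000-0000-000000000000"   # placeholder
CLIENT_ID = "11111111-1111-1111-1111-111111111111"   # placeholder
CLIENT_SECRET = "<app secret>"                       # placeholder
SITE = "mytenant.sharepoint.com"

# Step 1 (the Web activity): grab an access token for SharePoint Online.
token_resp = requests.post(
    f"https://accounts.accesscontrol.windows.net/{TENANT_ID}/tokens/OAuth/2",
    data={
        "grant_type": "client_credentials",
        "client_id": f"{CLIENT_ID}@{TENANT_ID}",
        "client_secret": CLIENT_SECRET,
        "resource": f"00000003-0000-0ff1-ce00-000000000000/{SITE}@{TENANT_ID}",
    },
)
access_token = token_resp.json()["access_token"]

# Step 2 (the Copy activity's HTTP source): download the file content via /$value.
file_url = (
    f"https://{SITE}/sites/site1/_api/web/"
    "GetFileByServerRelativeUrl('/sites/site1/libraryname/folder1/folder2/folder3/myfile.CSV')/$value"
)
resp = requests.get(file_url, headers={"Authorization": f"Bearer {access_token}"})
with open("myfile.CSV", "wb") as f:
    f.write(resp.content)
```

In ADF itself, the token request is the Web activity and the download is the Copy activity with an HTTP linked service, passing the token in the Authorization header.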
