When to use Azure Functions vs Azure Durable Functions when doing ETL at a regular interval

I need to run a job at a regular 5-minute interval, where we call a web service to get data, then transform the data and update the database.
For such a scenario, is it best to use Azure Functions or Azure Durable Functions?
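Whichever you choose, the recurring job itself would typically be a timer trigger. A minimal sketch (Python v2 programming model), where the service URL, target table, and connection-string setting are hypothetical placeholders:

```python
import logging
import os

import azure.functions as func
import pyodbc
import requests

app = func.FunctionApp()

# NCRONTAB expression: fire at second 0 of every 5th minute.
@app.schedule(schedule="0 */5 * * * *", arg_name="timer")
def etl_job(timer: func.TimerRequest) -> None:
    # Extract: call the web service (hypothetical URL).
    records = requests.get("https://example.com/api/data", timeout=60).json()

    # Transform: shape the payload for the target table (placeholder logic).
    rows = [(r["value"], r["id"]) for r in records]

    # Load: update the database (connection string read from app settings).
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        conn.cursor().executemany(
            "UPDATE dbo.Metrics SET value = ? WHERE id = ?", rows
        )
        conn.commit()
    logging.info("Processed %d records", len(rows))
```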

Related

Can Azure notebooks call Azure Functions?

I Googled a bit but the question remains: can Azure notebooks call Azure Functions?
It seems Azure Functions don't play well with Databricks/notebooks?
I can think of one way to integrate Azure Functions <--> Databricks/notebooks:
My Azure Functions could persist calculation results in SQL Server, for example, and the notebooks simply read from there.
I am, however, unsure if my approach is the most appropriate.
Many thanks
You can use Azure Data Factory to orchestrate your pipeline; ADF can trigger both Databricks Notebooks as well as Function Apps. You can also pass outputs from the Databricks Notebook into the Function App.
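For what it's worth, a notebook cell can also invoke an HTTP-triggered Function directly over HTTPS. A minimal sketch, assuming an HTTP trigger secured with a function-level key (the URL and key below are placeholders):

```python
import requests

FUNCTION_URL = "https://myapp.azurewebsites.net/api/calculate"  # hypothetical
FUNCTION_KEY = "<function-key>"  # keep in a secret scope, not in the notebook

resp = requests.post(
    FUNCTION_URL,
    params={"code": FUNCTION_KEY},  # function-level auth key
    json={"input": [1, 2, 3]},      # whatever payload your function expects
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```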

Alternative to trigger Azure Functions or AppService based on a table row insert into Azure SQL MI

Is it possible to trigger an Azure Function or an App Service web app whenever an insert operation is performed against a table on Azure SQL MI?
If not, is there a way to trigger applications outside Azure SQL other than using Logic Apps? I want to avoid Logic Apps because it requires using one more application, and it still uses polling.
The link below says it is not supported for Azure Functions:
https://feedback.azure.com/forums/355860-azure-functions/suggestions/16711846-sql-azure-trigger-support
The link below suggests using a Logic App:
Trigger Azure Function by inserting (adding) new row into table, SQL Server Database
Today, in Azure SQL, there is no such possibility. The closest option is to create a Timer Trigger Azure Function that checks whether there have been any changes in the table you want to monitor (using Change Tracking, for example); see the sketch below.
If you are using Azure SQL MI instead, you could create a SQLCLR procedure that calls an Azure Function via an HTTP request or, as another option, via Azure Event Hubs or Azure Event Grid.
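A minimal sketch of that timer-plus-Change-Tracking polling approach (Python v2 programming model), assuming pyodbc and that Change Tracking is already enabled on the database and table; the table name, schedule, and version storage are placeholders:

```python
import logging
import os

import azure.functions as func
import pyodbc

app = func.FunctionApp()
last_version = 0  # in practice, persist this (e.g. in a state table or blob);
                  # a module-level global does not survive host restarts

@app.schedule(schedule="0 * * * * *", arg_name="timer")  # poll every minute
def poll_changes(timer: func.TimerRequest) -> None:
    global last_version
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        cur = conn.cursor()
        # Ask Change Tracking for all rows changed since the last poll.
        changes = cur.execute(
            "SELECT CT.Id, CT.SYS_CHANGE_OPERATION "
            "FROM CHANGETABLE(CHANGES dbo.Orders, ?) AS CT",
            last_version,
        ).fetchall()
        for row in changes:
            logging.info("Row %s changed (%s)", row.Id, row.SYS_CHANGE_OPERATION)
        # Remember the version we have synced up to.
        last_version = cur.execute(
            "SELECT CHANGE_TRACKING_CURRENT_VERSION()"
        ).fetchval()
```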
There have been several feature requests for triggering Azure Functions based on changes to an Azure SQL database. For example:
https://github.com/Azure/azure-functions-python-worker/issues/365
It seems that they are not prioritizing it, since it is possible to implement this functionality using Logic Apps.

Read records from large xlsx/csv files and post them to an Azure Service Bus topic

We receive large feed files (Excel/CSV) of ~5 GB every night, and we need to read the file and post its records one after another to an Azure Service Bus topic. Is this possible using Azure Data Factory, and if yes, how? Or is there a better Azure-based solution available? Please suggest.
Data Factory doesn't support Azure Service Bus topics as a sink, so it's not possible using Azure Data Factory alone.
For more details, please see: Supported data stores and formats.
I think Azure Functions may help you achieve it.
You can reference:
Azure Blob storage bindings for Azure Functions.
This article explains how to work with Azure Blob storage bindings in Azure Functions. Azure Functions supports trigger, input, and output bindings for blobs. The article includes a section for each binding: Blob trigger, Blob input binding, Blob output binding.
Azure Service Bus bindings for Azure Functions. This article explains how to work with Azure Service Bus bindings in Azure Functions. Azure Functions supports trigger and output bindings for Service Bus queues and topics.
Here is a blog about Copy data from Azure Service Bus Topic to a Storage Account inside Azure.
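Putting those pieces together, a minimal sketch for the CSV case (Python v2 programming model): a Blob trigger fires when the feed file lands, and the azure-servicebus SDK posts one message per record. The container, topic, and connection settings are hypothetical, and a real ~5 GB file would need streamed, chunked reading rather than loading it all into memory:

```python
import csv
import io
import os

import azure.functions as func
from azure.servicebus import ServiceBusClient, ServiceBusMessage

app = func.FunctionApp()

@app.blob_trigger(arg_name="feed", path="feeds/{name}",
                  connection="AzureWebJobsStorage")
def feed_to_topic(feed: func.InputStream) -> None:
    # Sketch only: reads the whole blob; stream in chunks for 5 GB files.
    reader = csv.reader(io.StringIO(feed.read().decode("utf-8")))
    client = ServiceBusClient.from_connection_string(
        os.environ["SERVICEBUS_CONNECTION"])
    with client, client.get_topic_sender(topic_name="feed-records") as sender:
        for record in reader:
            # One Service Bus message per CSV record.
            sender.send_messages(ServiceBusMessage(",".join(record)))
```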
We don't know how much time it will take. Azure Functions has two kinds of pricing plans: the Consumption plan and the App Service plan.
Azure Functions in a Consumption plan are limited to 10 minutes for a single execution. In the Premium plan, the run duration defaults to 30 minutes to prevent runaway executions; however, you can modify the host.json configuration to make it unbounded for Premium plan apps.
For more details, please reference: Azure Functions Premium plan.
Azure Functions may not be a perfect fit for such a long-running process, but it can still help you execute it.
Hope this helps.

Use Azure Functions as custom activity in ADFv2

Is it possible to somehow package and execute an already written Azure Function as a custom activity in Azure Data Factory?
My workflow is as follows:
I want to use an Azure Function (which does some data processing) in an ADF pipeline as a custom activity. This custom activity is just one of the activities in the pipeline, but it is key that it gets executed.
As far as I know, there is no way to do that so far. In my opinion, you do not need to package the Azure Function; I suggest using a Web Activity to invoke the endpoint of your Azure Function, which would merge into the previous pipeline nicely.
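As a sketch of what the Web Activity would call, here is a minimal HTTP-triggered function (Python v2 programming model; the route name and payload shape are hypothetical). In ADF, you would point the Web Activity's URL at this endpoint:

```python
import azure.functions as func

app = func.FunctionApp()

@app.route(route="process", auth_level=func.AuthLevel.FUNCTION)
def process(req: func.HttpRequest) -> func.HttpResponse:
    payload = req.get_json()  # parameters passed in from the pipeline
    # ... run the existing data-processing logic here ...
    return func.HttpResponse("done", status_code=200)
```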

Azure Data Lake Store and Azure SQL with WebJob/Azure Function

I need to upload Web API response files into Azure Data Lake.
Then I have to dump those files into Azure SQL tables.
Both of the above processes must be scheduled to execute on an hourly basis.
Should I use Azure WebJobs or Azure Functions?
Azure Data Factory is probably the better mechanism to drive this recurring hourly pipeline. More details here:
https://learn.microsoft.com/en-us/azure/data-factory/data-factory-scheduling-and-execution
If you are running Azure Functions on the Consumption plan, the function call must complete within 5 minutes, which might not be enough for big data sets.
For the rest, Functions and WebJobs are similar for your scenario. Functions actually run on top of WebJobs, so if you don't need any advanced features of Functions (e.g. bindings), I would go with a WebJob.
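If you do go the Functions route, a minimal sketch of the hourly leg that moves files from Data Lake into SQL (Python v2 programming model), assuming the azure-storage-file-datalake and pyodbc packages; the account, filesystem, path, and table names are hypothetical:

```python
import csv
import io
import os

import azure.functions as func
import pyodbc
from azure.storage.filedatalake import DataLakeServiceClient

app = func.FunctionApp()

@app.schedule(schedule="0 0 * * * *", arg_name="timer")  # top of every hour
def lake_to_sql(timer: func.TimerRequest) -> None:
    lake = DataLakeServiceClient.from_connection_string(
        os.environ["DATALAKE_CONNECTION"])
    fs = lake.get_file_system_client("responses")  # hypothetical filesystem
    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        # Sketch only: in practice, track which files were already loaded.
        for path in fs.get_paths(path="incoming"):
            data = fs.get_file_client(path.name).download_file().readall()
            rows = list(csv.reader(io.StringIO(data.decode("utf-8"))))
            conn.cursor().executemany(
                "INSERT INTO dbo.ApiResponses (col1, col2) VALUES (?, ?)",
                rows,
            )
```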
