How to make code on an Azure VM trigger from storage blob change (like Functions do) - azure

I've got some image processing code that I need to run in Azure. It's perfect for an Azure Function, but unfortunately requires a component with a complex installation procedure and therefore will need to run in a VM.
However, I'd like to make it behave much like an Azure Function, and trigger whenever new items arrive in blob storage.
My question is: Does Azure provide me with any handy way of doing this, or do I have to write code that polls the blob storage looking for new items?

Have a look at the Azure WebJobs SDK. It shares its programming model with Functions, but you can host it in any .NET application, including one running on your VM. See its Blob Trigger support.
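If you do end up polling instead, as you mention, here is a minimal sketch using the azure-storage-blob Python package; the connection string, container name, and 30-second interval are placeholders, not anything from the answer above.

```python
# Fallback approach: poll the container on a fixed interval and track which
# blob names have already been seen.
import time

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("incoming-images")

seen = set()
while True:
    for blob in container.list_blobs():
        if blob.name not in seen:
            seen.add(blob.name)
            print(f"New blob: {blob.name}")  # kick off the image processing here
    time.sleep(30)
```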

Related

Using a new Azure WebJobs storage account for an Azure Function

We have a set of blob-trigger functions and are planning to use a new Azure WebJobs storage account for them. My question is: since the new storage account has no record of the already-processed files, will those blobs be reprocessed? If yes, can we avoid this reprocessing, and how?
I think you're talking about the blob receipts feature.
When you use a new Azure WebJobs storage account for the function, it will definitely re-process the already-processed files. This is by design.
The only workaround I can think of is to keep your own record of processed files in your function code, and have the code do nothing with a file it detects as already processed.
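A minimal sketch of that workaround, assuming a blob-triggered Function written with the Python v2 programming model and a separate "receipts" container that is kept in a storage account you do not swap out; all names and connection settings here are placeholders:

```python
import os

import azure.functions as func
from azure.storage.blob import BlobClient

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="input/{name}",
                  connection="AzureWebJobsStorage")
def process(blob: func.InputStream) -> None:
    # Keep the receipts in an account/container that does NOT change when you
    # switch the WebJobs storage account (placeholder app setting name).
    receipt = BlobClient.from_connection_string(
        os.environ["ReceiptsStorageConnection"],
        container_name="processed-receipts",
        blob_name=blob.name)

    if receipt.exists():
        return  # already processed in an earlier run, skip it

    # ... do the real work with blob.read() here ...

    receipt.upload_blob(b"", overwrite=True)  # record that this blob is done
```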

Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage, and have it submit those events directly to Azure Event Hub without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can use Azure Logic Apps to do this without any code, or with just a few function expressions; see the official Azure Logic Apps documentation for more details.
Here is my sample, which receives an event from Event Hubs and creates a new blob in Azure Blob Storage to store the event data; the logic flow is shown in the figure below, and you can follow it to make yours work.
1) Create an Azure Logic App instance in the Azure portal; it should be easy.
2) Open the Logic app designer tab to configure the logic flow.
3) Click the Save and Run buttons. Then use Service Bus Explorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and use Azure Storage Explorer to check whether a new blob was created. It worked fine after a few minutes.

Notification about changes in Azure Storage

Currently I'm working on a task to sync files in Azure with file storage in a custom data center. I need a way to get notified when something changes inside Azure File storage.
For example, in AWS I can configure notifications through a Lambda function. Is there a similar way to do this in Azure?
As of today this feature is not available, because an Azure Files binding is not supported. There is an open issue on GitHub about it: https://github.com/Azure/azure-webjobs-sdk-extensions/issues/14. It is available for Blob Storage, though (that's why I asked in my comment).
For a list of available bindings, please see this: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings.

Azure Functions - Table Storage Trigger with Azure Functions

I need a way to trigger an Azure Function when an entity is added to Azure Table storage. Is there a way to do this? When I tried to add a new Azure Function, I did not see an Azure Table storage trigger; only Queue and Blob triggers are available.
If there is no support for an Azure Table storage trigger, should I use an HTTP trigger and have Azure Table storage as an input binding?
Thanks
There is no trigger binding for Table Storage.
Here's a detailed view on what is supported by the different bindings available today:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings#overview
If there is no support for an Azure Table storage trigger, should I use an HTTP trigger and have Azure Table storage as an input binding?
Yes, this approach would work and would allow you to pass the table data as an input while relying on a separate trigger. Depending on the type of clients you are working with and your requirements, a queue trigger is another good option.
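As an illustration of the queue-trigger option, here is a minimal sketch using the Python v2 programming model and the azure-data-tables package; the queue name, table name, and the "PartitionKey|RowKey" message format are assumptions for the example, not part of the answer.

```python
# Sketch: queue-triggered Function that loads the newly added entity from
# Table storage. Assumes the writer also drops a "PartitionKey|RowKey"
# message on a queue named "table-changes".
import os

import azure.functions as func
from azure.data.tables import TableServiceClient

app = func.FunctionApp()

@app.queue_trigger(arg_name="msg", queue_name="table-changes",
                   connection="AzureWebJobsStorage")
def on_table_change(msg: func.QueueMessage) -> None:
    partition_key, row_key = msg.get_body().decode("utf-8").split("|", 1)

    service = TableServiceClient.from_connection_string(
        os.environ["AzureWebJobsStorage"])
    table = service.get_table_client("mytable")

    entity = table.get_entity(partition_key=partition_key, row_key=row_key)
    # Process the new entity here.
    print(dict(entity))
```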
@venki What Fabio Cavalcante said is true: Azure Functions doesn't have a trigger option for Table storage. But if your business needs to store the data in Table storage, and you as a developer decide to use Azure Functions in your architecture, you can configure your Function to take the data coming from Table storage as an input. This works really well.
There is also a way to give your Function an "automagic" trigger: use a Storage Queue (for small workloads) or Service Bus (for a business that needs a more robust mechanism).

Can Azure Data Factory write to FTP

I want to write the output of a pipeline to an FTP folder. ADF seems to support on-premises file systems but not FTP folders.
How can I write the output in text format to an FTP folder?
Unfortunately, FTP servers are not a supported data store for ADF right now, so there is no out-of-the-box way to interact with an FTP server for either reading or writing.
However, you can make it possible with a custom activity, although it will require some custom development. A fellow Cloud Solution Architect at Microsoft put together a blog post about how he did it for one of his customers. Please take a look at the following:
https://blogs.msdn.microsoft.com/cloud_solution_architect/2016/07/02/creating-ftp-data-movement-activity-for-azure-data-factory-pipeline/
I hope that this helps.
On reflection, you might be able to achieve what you want in a mildly convoluted way by writing the output to an Azure Blob storage account and then either
1) manually: downloading the file from the Blob storage account and pushing it to the "FTP" site, or
2) automatically: using the Azure CLI to pull the file locally and then pushing it to the "FTP" site with a batch or shell script as appropriate.
As a lighter-weight alternative to custom activities (which are certainly the better option for heavy work), you may wish to consider using Azure Functions to write to FTP (note that there is a timeout on the Consumption plan, but not on other plans, so it will depend on how big the files are).
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
You could instruct Data Factory to write to an intermediary Blob storage account, and use a Blob storage trigger in Azure Functions to upload the files to FTP as soon as they appear (a sketch follows at the end of this answer).
Alternatively, write to Blob storage and then use a timer in Logic Apps to upload from Blob storage to FTP. Logic Apps hide a tremendous amount of power behind their friendly exterior.
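A minimal sketch of the blob-trigger-to-FTP idea above, using the Python v2 programming model and the standard-library ftplib client; the container path, FTP host, and credential settings are placeholders, not part of the answer.

```python
# Sketch: blob-triggered Function that pushes each new blob to an FTP server.
import io
import os
from ftplib import FTP

import azure.functions as func

app = func.FunctionApp()

@app.blob_trigger(arg_name="blob", path="outbox/{name}",
                  connection="AzureWebJobsStorage")
def upload_to_ftp(blob: func.InputStream) -> None:
    data = blob.read()
    file_name = blob.name.split("/")[-1]  # strip the container prefix

    with FTP(os.environ["FTP_HOST"]) as ftp:
        ftp.login(os.environ["FTP_USER"], os.environ["FTP_PASSWORD"])
        ftp.storbinary(f"STOR {file_name}", io.BytesIO(data))
```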
You can write a Logic App that picks your file up from Azure Storage and sends it to an FTP site, then call the Logic App using a Data Factory Web Activity.
Make sure you do some error handling in your Logic App so that it returns a 400 if the FTP transfer fails.
