Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage and submit them directly to Azure Event Hub, without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.

Sure. You can use Azure Logic Apps to meet this need without any code, or with just a few function expressions. Please refer to the official Azure Logic Apps documentation for more details.
Here is my sample, which receives an event from an Event Hub and creates a new blob in Azure Blob Storage to store the event data. You can follow it to make your own flow work:
Create an Azure Logic App instance in the Azure portal; that part should be straightforward.
Open the Logic app designer tab and configure the logic flow.
Click the Save and Run buttons. Then use Service Bus Explorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check with Azure Storage Explorer whether a new blob was created. It worked fine after a few minutes.
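For comparison, the sketch below shows roughly the plumbing that the Logic App flow saves you from writing for the blob-to-Event-Hub direction the question asks about. It is only a sketch assuming the azure-storage-blob and azure-eventhub Python SDKs; the connection strings, container name, and hub name are placeholders.

```python
# Minimal sketch of the blob-to-Event-Hub forwarding a Logic App can replace.
# Connection strings, container and hub names are placeholders.
from azure.eventhub import EventData, EventHubProducerClient
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<storage-connection-string>", container_name="json-events"
)
producer = EventHubProducerClient.from_connection_string(
    "<eventhub-namespace-connection-string>", eventhub_name="my-hub"
)

with producer:
    for blob in container.list_blobs():
        # Each blob is assumed to hold one JSON event.
        payload = container.download_blob(blob.name).readall()
        batch = producer.create_batch()  # one event per batch here; batch more in practice
        batch.add(EventData(payload))
        producer.send_batch(batch)
```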

Related

How to push a file from Azure Data Factory onto an MS Teams channel

At the moment, I routinely have to upload a file to a Teams channel manually.
I have managed to create a pipeline to upload the file into my Azure Data Lake. I would now like to push the file from my Azure environment to my Teams Channel. I have found that webhooks cannot work with files and that bots can send files in the chat but not "Upload" them into a channel.
Is there a way to upload files from Azure to MS Teams using Data Factory or other alternatives?
Thank you.
The most appropriate way to partially achieve this would be to use a Teams Adaptive Card connector, but I couldn't find any way to set it up easily; it seems quite complex. The second option would be to use a Teams Post Message connector from an Azure Logic App; unfortunately it might not yet support attachments, but you can send links to files stored elsewhere (SharePoint, Blob, etc.). Here's what you can try in the meantime.
In ADF:
Create a Copy Activity that will send the files to a Storage.
Create a Web Activity that will send a notification to a Logic App when the pipeline runs. You only need to use this as a trigger for the Logic App.
In Logic Apps:
Create an HTTP connector that will receive the pipeline run information, and store the files using the Azure Storage Blob Create blob action.
After that, create a Microsoft Teams Post a message (V3) Teams connector and choose your Team and Channel.
Use the Get blob content using path action to get a URL to the file; you might need to construct this URL yourself (see the sketch after these steps).
Create your message using variables from the above connector and pass a link to the file to be downloaded.
Depending on the content of the file, you could also try to retrieve partial content and display it in the message body directly (if that is an acceptable alternative for you).
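If you end up constructing the download URL yourself, a common approach is a time-limited SAS link. Here is a minimal sketch using the azure-storage-blob Python SDK; the account, key, container, and blob names are placeholders, and the Logic App blob connector may offer an equivalent action or expression for this.

```python
# Sketch: build a read-only, time-limited link to the uploaded file so it can
# be passed into the Teams message. All names and the key are placeholders.
from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

ACCOUNT = "mystorageaccount"
CONTAINER = "teams-files"
BLOB = "report.xlsx"
ACCOUNT_KEY = "<storage-account-key>"

sas_token = generate_blob_sas(
    account_name=ACCOUNT,
    container_name=CONTAINER,
    blob_name=BLOB,
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(days=7),  # link validity window
)

download_url = f"https://{ACCOUNT}.blob.core.windows.net/{CONTAINER}/{BLOB}?{sas_token}"
print(download_url)  # include this link in the Teams message body
```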
I've not tried using the Adaptive Card Connector but I know it does give you a far richer dynamic experience. You would need to spend some time to design a custom card solution. Use this playground to see if it's something you can explore in the future.
The flow is as follows, and the two branches more or less run in parallel:
ADF Pipeline runs > ADF Copy Activity saves file in Blob storage
ADF Pipeline runs > ADF Web Activity triggers Logic App HTTP Connector > Logic App retrieves file from Blob storage > Sends a message to a Teams channel with a link to the file.
Here are all the supported Teams Actions.

How to move events from Azure Event Hub to Azure Blob Storage using a Logic App?

I want to move events from Azure Event Hub to Azure Blob Storage using a Logic App. Can anyone suggest how to do this with the Logic App connectors, triggers, and actions, with a designer example?
Note:
The events are JSON and need to be stored in Blob Storage.
You can start with a blank logic app, and use the search assistant to find what you're looking for.
Typing "event hub" brings up the Event Hubs trigger, where you can create the connection by providing the Event Hub name.
Save the event content in a variable.
You can then use the SaveInitialJsonToBlobStorage step to store this JSON in Blob Storage.

How to make code on an Azure VM trigger from storage blob change (like Functions do)

I've got some image processing code that I need to run in Azure. It's perfect for an Azure Function, but unfortunately requires a component with a complex installation procedure and therefore will need to run in a VM.
However, I'd like to make it behave much like an Azure Function, and trigger whenever new items arrive in blob storage.
My question is: Does Azure provide me with any handy way of doing this, or do I have to write code that polls the blob storage looking for new items?
Have a look at the Azure WebJobs SDK. It shares its API model with Functions, but you can host it in any .NET application, and it supports a Blob Trigger.
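If .NET is not an option on that VM, the fallback is the polling the question mentions. Below is a rough sketch, assuming the azure-storage-blob Python SDK and placeholder names, of the loop that a WebJobs/Functions BlobTrigger would otherwise replace.

```python
# Rough polling sketch: the bookkeeping a BlobTrigger spares you from.
# Connection string, container name and process_image() are placeholders.
import time
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<storage-connection-string>", container_name="images"
)

def process_image(name: str, data: bytes) -> None:
    # Placeholder for the image-processing component installed on the VM.
    print(f"processing {name} ({len(data)} bytes)")

seen: set[str] = set()
while True:
    for blob in container.list_blobs():
        if blob.name not in seen:
            data = container.download_blob(blob.name).readall()
            process_image(blob.name, data)
            seen.add(blob.name)
    time.sleep(30)  # poll interval; persist `seen` somewhere durable in practice
```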

Notify email when Azure Storage table gets new entry

Is there an inbuilt way to notify an email when a new entry is added to the Table?
I am not asking for anything programmatic, just something within their own UI.
Not currently, but you could put the entry on an Azure Storage Queue, process it into Table Storage, and send an email with Azure Functions.
Check out this page to see what is possible: https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings
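A minimal sketch of the Functions half of that suggestion, using the Python v2 programming model; the queue name, connection setting, and the notification helper are placeholders, and the actual email send would go through whichever mail service you prefer (SMTP, SendGrid, Microsoft Graph, etc.).

```python
# function_app.py - sketch only; queue name, connection setting and
# send_notification_email() are placeholders.
import logging

import azure.functions as func

app = func.FunctionApp()

def send_notification_email(entity_json: str) -> None:
    # Placeholder: call your mail service (SMTP, SendGrid, Graph, ...) here.
    logging.info("Would send email for new entity: %s", entity_json)

@app.queue_trigger(arg_name="msg", queue_name="new-table-entries",
                   connection="AzureWebJobsStorage")
def notify_on_new_entry(msg: func.QueueMessage) -> None:
    entity_json = msg.get_body().decode("utf-8")
    # Optionally write the entity to Table Storage here as well, then notify.
    send_notification_email(entity_json)
```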
Okay, so the easiest way to see the data is to use their desktop app.
Use https://azure.microsoft.com/en-us/features/storage-explorer/
The latest Azure product updates covering event-driven applications suggest adopting these application patterns to react to events published by Azure Storage. These docs/resources might help you explore the pattern and the current platform/framework support.
Azure Storage
Reacting to Blob storage events (preview)
I did not find anything similar for Azure Tables; what about suggesting this on UserVoice?
CosmosDB
Working with the change feed support in Azure Cosmos DB

Where are Azure Event Hub messages stored?

I generated a SAS signature using this RedDog tool and successfully sent a message to Event Hub using the Event Hubs API reference. I know it was successful because I got a 201 Created response from the endpoint.
This tiny success brought about a question that I have not been able to find an answer to:
I went to the Azure portal and could not see the messages I created anywhere. Further reading revealed that I needed to create a storage account; I stumbled on some C# examples (EventProcessorHost) that require the storage account credentials, etc.
Question is, are there any APIs I can use to persist the data? I do not want to use the C# tool.
Please correct me if my approach is wrong, but my aim is to be able to post telemetries to EventHub, persist the data and perform some analytics operations on it. The telemetry data should be viewable on Azure.
You don't have direct access to the transient storage used for EventHub messages, but you could write a consumer that reads from the EventHub continuously and persist the messages to Azure Table or to Azure Blob.
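For example, a small consumer along those lines can be written with the azure-eventhub and azure-storage-blob Python SDKs, so no C# is involved. This is only a sketch: the connection strings, names, and one-blob-per-event layout are placeholders, and a checkpoint store should be added for anything beyond experimentation.

```python
# Sketch: read events from the Event Hub and persist each one as a JSON blob.
# Connection strings, hub/container names and the blob naming are placeholders.
import uuid

from azure.eventhub import EventHubConsumerClient
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<storage-connection-string>", container_name="telemetry"
)

def on_event(partition_context, event):
    # One blob per event; in practice batch events and use a checkpoint store.
    blob_name = f"{partition_context.partition_id}/{uuid.uuid4()}.json"
    container.upload_blob(name=blob_name, data=event.body_as_str())

client = EventHubConsumerClient.from_connection_string(
    "<eventhub-connection-string>",
    consumer_group="$Default",
    eventhub_name="telemetry-hub",
)

with client:
    # Blocks until interrupted; starting_position="-1" reads from the beginning.
    client.receive(on_event=on_event, starting_position="-1")
```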
The closest thing you will find to a way to automatically persist messages (similar to Amazon Kinesis Firehose vs. Amazon Kinesis, to which Event Hubs is roughly equivalent) would be to use Azure Stream Analytics configured to write its output either to Azure Blob or to Azure Table. This example shows how to set up a Stream Analytics job that passes the data through and stores it in SQL, but in the same UI you can choose an output such as Azure Table. Or you can get an idea of the options from the output API.
Of course, you should be aware of the requirements around serialization that led to this question.
Event Hubs stores data for a maximum of 7 days, and that is with the Standard pricing tier. If you want to persist the data for longer in a storage account, you can use the Event Hubs Capture feature. You don't have to write a single line of code to achieve this; you can configure it through the portal or an ARM template. This is described in this document: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-capture-overview
Event Hubs stores its transient data in Azure Storage, and the documentation doesn't give any more detail about that storage. This is evident from this documentation: https://learn.microsoft.com/en-us/azure/event-hubs/configure-customer-managed-key
The storage account you need for EventProcessorHost is only used for checkpointing or maintaining the offset of the last read event in a partition.
