Data transfer from Azure Event Hub to Azure Redis Cache

We have product data in an Azure Event Hub coming from an external system. Our requirement is to send this data from the Event Hub to Azure Redis Cache.
Is there any out-of-the-box way or standard function in Azure to implement this?
Thanks,
Kuldeep

There is no out-of-the-box support, but it should be very easy to achieve this with an EventHub-triggered Azure Function that writes into Redis.
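As an illustration, here is a minimal sketch of such a function in C# (in-process model). It assumes a hub named products, app settings named EventHubConnection and RedisConnection, and that each event body is a UTF-8 product payload; the hub name, setting names, and key scheme are all placeholders to adapt to your setup.

```csharp
using System;
using System.Threading.Tasks;
using Azure.Messaging.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using StackExchange.Redis;

public static class EventHubToRedis
{
    // One shared Redis connection per function app instance.
    private static readonly Lazy<ConnectionMultiplexer> Redis = new Lazy<ConnectionMultiplexer>(
        () => ConnectionMultiplexer.Connect(Environment.GetEnvironmentVariable("RedisConnection")));

    [FunctionName("EventHubToRedis")]
    public static async Task Run(
        [EventHubTrigger("products", Connection = "EventHubConnection")] EventData[] events,
        ILogger log)
    {
        IDatabase db = Redis.Value.GetDatabase();
        foreach (EventData e in events)
        {
            // Assumes the event body is the product payload; pick a key scheme
            // that matches how you will look the product up in the cache.
            string payload = e.EventBody.ToString();
            string key = $"product:{e.SequenceNumber}";
            await db.StringSetAsync(key, payload);
        }
        log.LogInformation("Cached {Count} events in Redis", events.Length);
    }
}
```

StringSetAsync also accepts an expiry argument if you want cache entries to have a TTL rather than live forever.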

Related

Azure IoT Hub to Azure Function Pricing

I'm making a comparison between different Cloud vendors for IoT solutions.
I'm now on Azure IoT Hub, which will ingest data from IoT devices (we only want to send from devices to the cloud, say through MQTT, and not receive anything back). My aim is to pass this data to an Azure Function, execute some computation, and save it in some DB (e.g. Cosmos DB). Here are my doubts:
Which MQTT messages do I have to consider for billing? (Only those with telemetry, or which others?)
I saw that there is the possibility of going from IoT Hub to an Azure Function with a built-in endpoint. Is it free? I'm afraid of some hidden costs, like those of Event Hubs or the built-in endpoint.
Is sending data from IoT Hub to an Azure Function considered normal "cloud-to-device" messaging (and consequently billed), or is it free? For example, if IoT Hub ingests 10 messages (<4 KB) and forwards them to Functions, do I pay for 10 messages or 20?
Thanks in advance for your help.
Which MQTT messages - You have to consider the below list of messages for Azure IoT Hub billing:
Device to Cloud
Cloud to Device
Method execution
Jobs
A detailed breakdown of billing by each operation:
https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-pricing#charges-per-operation
Nothing is free in the cloud :)
Data read from IoT Hub into any other service incurs the regular data transfer cost. Refer to the link below:
https://azure.microsoft.com/en-in/pricing/details/bandwidth/
All the pricing information detailed at https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-pricing is specific to Azure IoT Hub for D2C & C2D. Any data egress from IoT Hub to other Azure services is charged at the regular data transfer rates:
https://azure.microsoft.com/en-in/pricing/details/bandwidth/

Can Azure Event Hub be configured in a Pub/Sub fashion to another Event Hub without coding?

We are looking to consume data from another Event Hub that is not in our tenant and want to ensure guaranteed delivery. Can an Event Hub be configured to connect to another Event Hub out of the box, without building an Azure Function, a Databricks process, or some other solution?
Today we are planning to set up the consumer hub using either the HTTP or AMQP protocols and have the producer push via some code.
From what I have read, Stream Analytics could do this, but it doesn't sound reliable, and I would prefer to leverage an existing feature before building out a solution.
I would suggest you have a look at Azure Event Grid:
It can ingest data from an Event Hub: Azure Event Hubs as an Event Grid source
It can send data to an Event Hub: Event Hub as an event handler for Azure Event Grid events
It doesn't require coding but will still require configuration.

How to connect Service bus queue to Azure Data Factory as source?

I have a Service Bus configured where I get messages in my subscription. I would like to move the data from the messages in the Service Bus queue to Table Storage using Azure Data Factory, and I would like to know if this is possible. I couldn't find any online resource that talks about Service Bus as a source, so I would like to know if anyone has experience here.
Thanks
This is the answer I got from Microsoft, so posting it here for the community:
Data Factory does not have a connector for Service Bus. However, there
are several options available to you.
You can create a consumer for Data Factory to call upon.
You can raise a feature request in the feedback forum.
You could re-route your messages to be written to blob, and then
leverage the Blob Event Trigger.
Use ADF Web Activity to retrieve a message.
By "create a consumer for Data Factory to call upon," I mean either
create a Function App which batch-reads the messages, and returns
them, utilizing ADF Azure Function, or , create some code to do the
same with the ADF Batch Service Custom Activity. There are more
variations as well.
Which one to use depends upon your volume and cadence (frequency).
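A minimal C# sketch of the Function App option: an HTTP-triggered function that ADF's Azure Function (or Web) activity calls, which pulls a batch from the queue and returns it. The ServiceBusConnection setting, the product-queue name, and the batch size of 100 are placeholders.

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class ReadServiceBusBatch
{
    [FunctionName("ReadServiceBusBatch")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
    {
        // Placeholder connection setting and queue name.
        await using var client = new ServiceBusClient(
            Environment.GetEnvironmentVariable("ServiceBusConnection"));
        await using ServiceBusReceiver receiver = client.CreateReceiver("product-queue");

        // Pull up to 100 messages, waiting at most 5 seconds, then complete them.
        var messages = await receiver.ReceiveMessagesAsync(100, TimeSpan.FromSeconds(5));
        var bodies = messages.Select(m => m.Body.ToString()).ToList();
        foreach (var m in messages)
        {
            await receiver.CompleteMessageAsync(m);
        }

        // Wrap the batch in an object: the ADF Azure Function activity expects a JSON object back.
        return new OkObjectResult(new { count = bodies.Count, messages = bodies });
    }
}
```

From there the ADF pipeline can hand the returned messages to whatever activity writes them into Table Storage.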

Is there a way to "stream" data from Google Cloud Platform to Microsoft Azure?

I am using GCP to manage IoT devices with IoT Core. The incoming telemetry triggers a Cloud Function which stores the data in Firestore.
I have been asked to send the telemetry to an Azure SQL database. I am not familiar with Azure, but with the products that both GCP and Azure provide, there must be a way to get this right.
The device sends an update to GCP once per minute.
My initial thought was to use a cloud function to "pass" the data on to Azure when it is received in GCP.
Thanks in advance
Azure has many IoT services, but for a message per minute they are overkill for your scenario. While there are many ways to achieve your goal, one would be to create a Google Cloud Function that sends the data each minute to an Azure Function. That function would then save the data to your Azure SQL Database.
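On the Azure side, a minimal sketch of that receiving function in C#, assuming the GCP Cloud Function POSTs a small JSON telemetry document; the Telemetry shape, the SqlConnection app setting, and the Telemetry table and its columns are all placeholders to match your schema.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Data.SqlClient;
using Newtonsoft.Json;

public static class IngestTelemetry
{
    // Hypothetical telemetry shape; match it to whatever the GCP function forwards.
    public class Telemetry
    {
        public string DeviceId { get; set; }
        public double Reading { get; set; }
        public DateTime ReadingTime { get; set; }
    }

    [FunctionName("IngestTelemetry")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        Telemetry t = JsonConvert.DeserializeObject<Telemetry>(body);

        // "SqlConnection" app setting and the Telemetry table are placeholders.
        await using var conn = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnection"));
        await conn.OpenAsync();
        using var cmd = new SqlCommand(
            "INSERT INTO Telemetry (DeviceId, Reading, ReadingTime) VALUES (@d, @r, @t)", conn);
        cmd.Parameters.AddWithValue("@d", t.DeviceId);
        cmd.Parameters.AddWithValue("@r", t.Reading);
        cmd.Parameters.AddWithValue("@t", t.ReadingTime);
        await cmd.ExecuteNonQueryAsync();

        return new OkResult();
    }
}
```

The GCP Cloud Function then only needs to POST the telemetry JSON to this function's URL (protected by its function key).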

Service Bus or Queue Processing

Probably a stupid question.
Generally, what is the best approach to have a program listening to an MQTT feed (done), placing messages onto a queue or service bus, and then have those automatically processed via Azure?
How would I process the messages on a queue? Is there a way for some Azure function/feature to then automatically put that into a storage account and a database after some manipulation? Generally, what's the best approach? Ideally using C#.
Feed listens for data feeds (done)
Puts message onto queue or service bus (easily done)
Something on Azure will take that item and put it in a Storage Account and a Cosmos database. (stuck on best approach)
Thanks.
You just need to add a message to a Service Bus Queue or Storage Account Queue. Both provide bindings for Azure Functions, which would be the consumer. Also, using Azure Functions you can use output bindings and persist to a Storage Account (blob) or Cosmos DB (see the sketch after the links).
Here are useful links:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb?tabs=csharp
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus
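A minimal C# sketch along those lines: a Service Bus-triggered function with Cosmos DB and Blob output bindings (v3 Cosmos DB extension attribute style). The queue name, connection settings, database/collection names, blob path, and document shape are all placeholders.

```csharp
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class QueueToCosmosAndBlob
{
    [FunctionName("QueueToCosmosAndBlob")]
    public static void Run(
        // Placeholder queue name and connection setting.
        [ServiceBusTrigger("telemetry-queue", Connection = "ServiceBusConnection")] string message,
        // Cosmos DB output binding (v3 extension): placeholder database/collection names.
        [CosmosDB("iot-db", "telemetry", ConnectionStringSetting = "CosmosConnection")] out dynamic document,
        // Blob output binding: writes each message to a new blob in a placeholder container.
        [Blob("telemetry-archive/{rand-guid}.json", FileAccess.Write, Connection = "StorageConnection")] out string blob,
        ILogger log)
    {
        log.LogInformation("Processing message: {Message}", message);

        // Do any manipulation of the payload here, then persist to both sinks.
        document = new { id = Guid.NewGuid().ToString(), payload = message };
        blob = message;
    }
}
```

Your MQTT listener keeps doing what it does today and just sends to the queue; the function handles the rest without any polling code on your side.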
