Azure Service Fabric routing

I would like to get some recommendations for designing the routing of IoT messages in Azure.
I have the following scenario:
Sensors send messages to Azure IoT Hub in Google Protobuf format. Depending on the type of a message, I want to route it to different applications inside a Service Fabric cluster.
My current approach is to use one Service Fabric application that receives all messages from the IoT Hub, parses the protobuf, and sends each message, depending on its type (an attribute inside the protobuf), to a type-specific Azure Event Hub. The applications then fetch the messages from their "own" event hub and process them.
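Simplified, the routing part looks roughly like this (the envelope type and the event hub names below are only placeholders for illustration, not our real protobuf contract):
using System.Collections.Generic;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.EventHubs.Producer;

// Read from the IoT Hub built-in, Event Hub-compatible endpoint.
await using var consumer = new EventHubConsumerClient(
    EventHubConsumerClient.DefaultConsumerGroupName,
    "<iot-hub-event-hub-compatible-connection-string>");

// One producer per message type, i.e. per type-specific event hub.
var producersByType = new Dictionary<string, EventHubProducerClient>
{
    ["telemetry"] = new EventHubProducerClient("<event-hubs-connection-string>", "telemetry-hub"),
    ["alarm"]     = new EventHubProducerClient("<event-hubs-connection-string>", "alarm-hub"),
};

await foreach (PartitionEvent pe in consumer.ReadEventsAsync())
{
    // MessageEnvelope stands in for the class generated from our .proto file;
    // its Type attribute decides the target event hub.
    var envelope = MessageEnvelope.Parser.ParseFrom(pe.Data.EventBody.ToArray());
    if (producersByType.TryGetValue(envelope.Type, out var producer))
        await producer.SendAsync(new[] { new EventData(pe.Data.EventBody) });
}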
I'm not sure this is the best approach. I don't like having one event hub for each type of message. Service Bus topics are probably not an option, because I have a lot of messages (~30k per second).
Do I really need an event hub to decouple this process, or does it make sense to send the messages from the "routing application" directly to the different "type applications"?
What do you think?
Regards,
Markus

If you really need high throughput, you should take a look at IoT Hub and Event Hubs. Azure Event Hubs is a highly scalable data-streaming platform and event-ingestion service capable of receiving and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters.
On the other hand, if you only need 30k messages per second, you can go with Service Bus Premium Messaging.
Comparison of Azure IoT Hub and Azure Event Hubs
Premium Messaging: How fast is it?
What is Event Hubs?
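For what it's worth, sending to a Service Bus topic looks the same regardless of tier, since Premium is a setting on the namespace; a minimal sketch with placeholder names:
using Azure.Messaging.ServiceBus;

// Connection string and topic name are placeholders.
await using var client = new ServiceBusClient("<service-bus-connection-string>");
ServiceBusSender sender = client.CreateSender("sensor-messages");
await sender.SendMessageAsync(new ServiceBusMessage("{\"type\":\"telemetry\"}"));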

Related

Fundamental difference between Azure Event Hub and Azure Service Bus?

Apart from the fact that Azure Service Bus uses topics and Azure Event Hubs is based on events, is there any fundamental difference between Azure Event Hubs and Azure Service Bus?
To me, there is no real difference between events and messages, as both are just different kinds of JSON.
Even though you may be dealing with JSON in both services, there is a fundamental difference between the two.
Azure Event Hubs focuses on event streaming, whereas Azure Service Bus is focused on high-value enterprise messaging, i.e. on messages rather than events.
With an Azure Service Bus topic, every subscription gets its own copy of each message, and within a subscription a message is typically processed by a single receiver and then removed. With an event hub, events are retained in the stream, and multiple consumer groups can read the same events independently, each tracking its own position.
You can read more on the docs page here.
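As a rough illustration of that difference (connection strings and entity names are placeholders):
using System;
using Azure.Messaging.EventHubs.Consumer;
using Azure.Messaging.ServiceBus;

// Service Bus topic: each subscription has its own copy of a message, and a
// receiver completes (removes) it for that subscription.
await using var sbClient = new ServiceBusClient("<service-bus-connection-string>");
ServiceBusReceiver receiver = sbClient.CreateReceiver("orders-topic", "billing-subscription");
ServiceBusReceivedMessage message = await receiver.ReceiveMessageAsync();
if (message != null)
    await receiver.CompleteMessageAsync(message);   // gone for this subscription only

// Event Hubs: events stay in the stream; any consumer group can read or replay
// them from its own position.
await using var consumer = new EventHubConsumerClient(
    EventHubConsumerClient.DefaultConsumerGroupName,
    "<event-hubs-connection-string>", "telemetry-hub");
await foreach (PartitionEvent pe in consumer.ReadEventsAsync())
    Console.WriteLine(pe.Data.EventBody.ToString());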

Can Azure Event Hub be configured in a Pub/Sub fashion to another Event Hub without coding?

We are looking to consume data from another event hub that is not in our tenant and want to ensure guaranteed delivery. Can an event hub be configured to connect to another event hub out of the box, without building an Azure Function, a Databricks process, or some other custom solution?
Today we are planning to set up the consumer hub using either the HTTP or AMQP protocol and have the producer push via some code.
From what I have read, Stream Analytics could do this, but it doesn't sound reliable, and I would prefer to leverage a built-in feature before building out a solution.
I would suggest you have a look at Azure Event Grid:
It can ingest data from an Eventhub: Azure Event Hubs as an Event Grid source
It can send data to an EventHub: Event hub as an event handler for Azure Event Grid events
It doesn't require coding but will still require configuration.

Azure messages pattern

Right now, IoT Hub has a limit of 8,000 messages per day (on the free tier). I would like to ask about patterns that are used in Azure for this.
I am curious whether I can send data to Azure through some service other than IoT Hub messages, in order to prevent the hub from being overloaded by a large amount of data, or to keep some data for this service confidential.
For example, I would like to send the non-confidential data from a given service as IoT Hub messages, and send the other data using a WebSocket or some REST protocol. I think there are patterns that serve such scenarios.
Does anyone have experience with that kind of situation?
Not everything needs to go through IoT Hub. IoT Hub is great for two-way communication to/from IoT devices. You could also look at Event Hubs for ingestion from devices that don't need two-way comms. We have a write-up on the differences here: Connecting IoT Devices to Azure: IoT Hub and Event Hubs.
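A rough sketch of that split (connection strings and payloads are placeholders): per-device, two-way traffic goes through the IoT Hub device SDK, while high-volume one-way telemetry goes straight to an event hub.
using System.Text;
using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;
using Microsoft.Azure.Devices.Client;

// Two-way, per-device traffic (commands, twin updates, alerts) stays on IoT Hub.
var deviceClient = DeviceClient.CreateFromConnectionString("<device-connection-string>", TransportType.Mqtt);
await deviceClient.SendEventAsync(new Message(Encoding.UTF8.GetBytes("{\"state\":\"ok\"}")));

// High-volume, one-way telemetry can be ingested by an event hub instead.
await using var producer = new EventHubProducerClient("<event-hubs-connection-string>", "bulk-telemetry");
using EventDataBatch batch = await producer.CreateBatchAsync();
batch.TryAdd(new EventData(Encoding.UTF8.GetBytes("{\"reading\":42}")));
await producer.SendAsync(batch);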

Is it possible to reuse Connections on Azure Functions when sending Device-to-Cloud messages to IoTHub?

I have an Azure IoT Hub with thousands of devices registered. These devices communicate through a Telco provider, who sends messages through an Azure Storage queue. This storage queue triggers an Azure Function, which needs to parse the messages and send an event to the IoT Hub.
Currently, we use the Azure IoT Hub SDK to create a DeviceClient for each payload and send the event. Because the DeviceClient represents a device in the IoT Hub and carries the context of the source of the events, we have to recreate a device client for each event. This quickly exceeds the limit on the number of connections allowed on Azure Functions.
We have tried using the IoT Hub output binding for Azure Functions, but could not get it to work, and I do not think it would work anyway, because we need to make sure that the events reach the IoT Hub with the right context (messages are sent as the right device).
What's the right way to solve this? Can connections to the IoT Hub be reused? Should we abandon Azure Functions in favour of something else?
I assume that the Telco is some kind of custom (vendor-locked) device management solution that can also communicate with the devices, receive the device telemetry, and eventually forward it to a specified endpoint, correct?
If I may ask, and if my assumption is correct: why do you need to deliver the events to IoT Hub if you are not managing the Telco devices through IoT Hub (the arrows on your diagram are only in one direction)?
Using the IoT Hub just as a message broker for what is essentially cloud-to-cloud communication is not beneficial if that is the only purpose. Conceptually, what you described is cloud-to-cloud communication, and IoT Hub is intended to be used with devices.
Here is what I would do. Set up API Management (or an HTTP-triggered Azure Function) as a front door for the Telco and pass the messages on to an Event Hub.
Here you can choose to pass, for example, the request body, which is where I assume your telemetry data is.
Keep the IoT Hub, and set up message routing to the previously created Event Hub.
Now, if you have devices that are not vendor-locked and can talk directly to IoT Hub, their messages will be routed to the Event Hub, and the Telco device messages will land in exactly the same Event Hub.
You can then have, for example, Azure Stream Analytics analyze the data stream from that single Event Hub, covering both the Telco devices and any non-Telco devices.
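To illustrate the front-door idea, a minimal HTTP-triggered function that forwards the request body to an Event Hub could look like this (the function, hub, and connection setting names are made up for the example):
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class TelcoIngest
{
    // The Telco POSTs telemetry here; the body is forwarded to the same Event Hub
    // that the IoT Hub route targets.
    [FunctionName("TelcoIngest")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        [EventHub("telemetry-hub", Connection = "EventHubConnection")] IAsyncCollector<string> events)
    {
        string body = await new StreamReader(req.Body).ReadToEndAsync();
        await events.AddAsync(body);
        return new OkResult();
    }
}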
After trying a few things, I ended up moving away from using the SDK for pushing messages to IoT Hub. This is because the SDK uses AMQP, and creating a DeviceClient for each payload is not viable.
We switched to pushing the messages to IoT Hub over HTTPS instead, and by using HttpClientFactory we are able to do connection pooling.
I thought I would put this here in case someone has the same issue.
Here is an example of the HTTP request to send a message to IoT Hub:
POST https://<iothubname>.azure-devices.net/devices/<deviceId>/messages/events?api-version=2018-06-30
Authorization: SharedAccessSignature sr=<iothubname>.azure-devices.net&sig=<signature>&skn=iothubowner&se=1570574220
Body: <normal interval or alarm payloads> // example {"deviceid": "abc", "hello": "world"}
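For completeness, generating that SharedAccessSignature and sending the request through a single pooled HttpClient looks roughly like this (hub name, device id, and the iothubowner key are placeholders):
using System;
using System.Net;
using System.Net.Http;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

static class IotHubHttpSender
{
    // One HttpClient (or one obtained from IHttpClientFactory) is reused so the
    // underlying connections are pooled.
    static readonly HttpClient Http = new HttpClient();

    static string BuildSasToken(string resourceUri, string base64Key, string policyName, TimeSpan ttl)
    {
        long expiry = DateTimeOffset.UtcNow.Add(ttl).ToUnixTimeSeconds();
        string stringToSign = WebUtility.UrlEncode(resourceUri) + "\n" + expiry;
        using var hmac = new HMACSHA256(Convert.FromBase64String(base64Key));
        string signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
        return $"SharedAccessSignature sr={WebUtility.UrlEncode(resourceUri)}" +
               $"&sig={WebUtility.UrlEncode(signature)}&se={expiry}&skn={policyName}";
    }

    public static async Task SendAsync(string hubName, string deviceId, string json, string iotHubOwnerKey)
    {
        string token = BuildSasToken($"{hubName}.azure-devices.net", iotHubOwnerKey, "iothubowner", TimeSpan.FromHours(1));
        var request = new HttpRequestMessage(HttpMethod.Post,
            $"https://{hubName}.azure-devices.net/devices/{deviceId}/messages/events?api-version=2018-06-30");
        request.Headers.TryAddWithoutValidation("Authorization", token);
        request.Content = new StringContent(json, Encoding.UTF8, "application/json");
        (await Http.SendAsync(request)).EnsureSuccessStatusCode();
    }
}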
Lastly, thanks @kgalic for the answer, but your suggestion would not work. This is not pure B2B integration. Our implementation has to allow for both devices connecting directly to the IoT Hub and devices connecting through the Telco. This is why every device needs to have its own identity and digital twin.

Route and transform data from Azure IoT Hub

In our use case, we receive messages on Azure IoT Hub and would like to route the data to different Event Hubs or Service Bus topics.
IoT Hub routes and endpoints are not an option, because the data is binary (protobuf) and only 10 custom endpoints are possible (we need more).
Our requirements are:
Splitting the message
Transform the data (maybe json)
Routing to different endpoints based on the payload (different parts of a message could be routed to different endpoints)
(optional) enrich the data with additional payload
I see different options:
Azure Stream Analytics
Azure Functions
Spark or Flink
Do it yourself (write an application and run it in Service Fabric or Kubernetes)
Which technology would you recommend?
Regards,
Markus
There is also another option for your scenario: using Azure Event Grid. In this case, the telemetry data from the Azure IoT Hub is pushed to Event Grid via a custom topic endpoint. Note that there is a size limit for an event message (64 KB); see more details here.
Event Grid allows an unlimited number of Event Hubs to subscribe; more details about Event Grid can be found here and here.
Based on the above, this gives you another option for routing small telemetry messages to more than 10 Event Hubs, or basically to any kind of subscriber.
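As a sketch of the publishing side (topic endpoint, key, and event type below are placeholders), pushing a small, already-transformed message to an Event Grid custom topic could look like this; the Event Grid subscriptions then fan it out to the subscribed Event Hubs:
using System;
using Azure;
using Azure.Messaging.EventGrid;

// Custom topic created for this routing scenario (placeholders).
var client = new EventGridPublisherClient(
    new Uri("https://<topic-name>.<region>-1.eventgrid.azure.net/api/events"),
    new AzureKeyCredential("<topic-access-key>"));

// The protobuf payload would be decoded/transformed first; Event Grid events
// carry JSON, so binary content would need e.g. base64 encoding.
await client.SendEventAsync(new EventGridEvent(
    "devices/device-42",                              // subject
    "telemetry.alarm",                                // event type
    "1.0",                                            // data version
    new { deviceId = "device-42", level = "high" })); // data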
