How to save money on Azure Event Hub Basic Throughput - azure

I don't quite understand how Microsoft is calculating my Event Hub costs.
The current setup is:
I have a Raspberry Pi Zero which sends messages once a minute via HTTPS to an Azure IoT Hub, and the IoT Hub routes them to the Event Hub. Each message appears to be around 2 KB, as seen in the Event Hub throughput graph. The Event Hub is then read by Elastic Logstash, which uploads the messages into Elasticsearch. So we have 2 KB/min of incoming and outgoing traffic.
A raw message looks like:
{
  "humidity": 98.86653465006785,
  "#timestamp": "2021-02-12T01:07:05.883Z",
  "pressure": 1035.0542695256731,
  "#version": "1",
  "temperature": -10.694375312741613
}
This is only 149 bytes in total. I got that number by saving the message to a .txt file and checking its size in the file properties.
My service has now been running for three days and has already cost $0.68, which seems too much in my opinion.
If I interpret the Microsoft Azure Event Hub pricing page correctly, it charges $0.015/hour for a throughput unit of 1 MB/s incoming and 2 MB/s outgoing.
Did I make a mistake or is there a way of lowering the costs?
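Assuming the $0.015/hour figure from the question is the per-throughput-unit rate, the charge accrues for every hour the unit is provisioned, regardless of how little traffic actually flows through it; a quick sanity check of three days at that rate:

```python
# Rough sketch of the billing math, using the rate quoted in the
# question; check the current Azure pricing page for your region.
TU_PRICE_PER_HOUR = 0.015   # $ per throughput unit per hour (from the question)
HOURS = 3 * 24              # the service has been running for three days

# A throughput unit is billed for every hour it is provisioned,
# not per byte actually sent, so 2 KB/min of traffic doesn't lower it.
base_cost = TU_PRICE_PER_HOUR * HOURS
print(f"Base throughput-unit cost for 3 days: ${base_cost:.2f}")
```

Three days at the hourly rate is already in the ballpark of the observed bill, which suggests the cost is driven by the provisioned capacity, not the 2 KB/min of traffic.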

For anyone coming across this question later: there is indeed a way to reduce the cost of operation. In my case I used the IoT Hub to redirect the messages to the Event Hub, which is nonsense for my scenario. The Event Hub is totally unnecessary here. You can use the IoT Hub's built-in Event Hub-compatible endpoint to read your messages directly.

Related

Azure - ingesting data into IoT Hub and sending notifications via slack/email for certain messages

I've got data coming into IoT Hub and want to filter on it.
Relevant data I want to forward to slack as notification.
I've got the IoT Hub and a slack subscription in place and am having trouble connecting the two.
In order to do a rather complex time-based query, I figured I'd use Stream Analytics and configure the IoT Hub as input. From research I found that Logic Apps can send messages to Slack over a webhook. Using a Service Bus Queue as output for Stream Analytics, I can get the data into Logic Apps.
So it's:
IoT Hub (ingest all data) => Stream Analytics (filter) => Service Bus Queue (queue up the data) => Logic Apps (send to Slack)
Looks a bit bulky but that seems to be one way of doing it (is there a better one?!?)
Doing this I ran into issues. I selected my IoT Hub as input for Stream Analytics and the simple query SELECT * INTO [Queue] FROM [Hub] fails, saying there was no data.
It does make sense if the IoT Hub just pushes new data to its endpoints and then discards it. So I created a test set in the Stream Analytics Job and the query runs fine.
However, I do get data into the Hub which is not (all) picked up and forwarded by the job to the Service Bus queue. I do see some activity on the queue, but not nearly enough to account for the data I receive.
This seems to be a very common scenario: ingesting data in IoT Hub and sending notifications to email or Slack if they are of a certain type. Can you explain the steps to take, or point me to a resource that does this? Maybe I'm on the wrong path, as I cannot find anything that describes it.
Thanks
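For the last hop of the pipeline above, a Logic App normally makes the Slack call, but the webhook itself is just an HTTP POST with a JSON body. A minimal sketch, assuming a hypothetical webhook URL (Slack incoming webhooks expect a `{"text": ...}` payload):

```python
import json
import urllib.request

# Hypothetical webhook URL; Slack generates the real one when you
# add an "Incoming Webhook" to your workspace.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"

def build_alert(device_id: str, temperature: float) -> bytes:
    """Build the JSON body a Slack incoming webhook expects."""
    text = f"Device {device_id}: temperature {temperature} C"
    return json.dumps({"text": text}).encode("utf-8")

def send_alert(device_id: str, temperature: float) -> None:
    """POST the alert to the webhook (this is what the Logic App does)."""
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=build_alert(device_id, temperature),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

print(build_alert("sensor-1", 21.5))
```

Knowing the payload is this simple can help when debugging the Logic Apps step: you can test the webhook with a plain HTTP call first.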

Azure IoT Hub - Recovering from a device flooding the hub with reported device twin messages

I have an Azure IoT Hub application, and a device just started sending messages indicating changes in reported device twin properties every other second. In a matter of a few hours, the total message count for that day went over 50k. When this number goes over 40k, the IoT Hub becomes VERY slow to respond for ALL customers, not only until the device is shut off, but until all those messages have throttled through the system, which seems to take several hours or until the next morning.
So if this type of flooding happens, the entire system for all customers grinds to a halt due to slowness.
This is a device bug and needs to be fixed, but I was wondering: IF this happens, is there a way to get the whole IoT Hub back to normal so it isn't slow? Something like kicking the offending device or rebooting the hub. Or better yet, is there a way to prevent devices from flooding the hub at more than x messages per minute?
You could build some logic to kick the offending device when it starts spamming your hub. One approach might be to route all twinChangeEvents to a separate endpoint and write a Stream Analytics Job to group the messages per deviceId and keep a count of the events in a sliding window of X minutes. After the count reaches a threshold you set, you could call an Azure Function to disable the device and send a notification.
There is one caveat, the docs state:
If the rate of change is too high, or for other reasons such as internal failures, the IoT Hub might send only one notification that contains all changes.
I don't know if your device reaches that rate, but I think this would be a suitable approach to kick the offending device.
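The grouping-and-counting part of that job can be sketched in plain Python; the real job would express this in the Stream Analytics query language, and the window length and threshold here are purely illustrative:

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 300   # sliding window of X = 5 minutes (illustrative)
THRESHOLD = 100        # events per window before the device is flagged

class TwinChangeMonitor:
    """Toy version of what the Stream Analytics job would compute:
    count twinChangeEvents per deviceId over a sliding window."""

    def __init__(self):
        self.events = defaultdict(deque)  # deviceId -> timestamps in window

    def record(self, device_id: str, timestamp: float) -> bool:
        """Record one event; return True if the device crossed the threshold."""
        window = self.events[device_id]
        window.append(timestamp)
        # Drop timestamps that have slid out of the window.
        while window and window[0] <= timestamp - WINDOW_SECONDS:
            window.popleft()
        return len(window) >= THRESHOLD

monitor = TwinChangeMonitor()
# A device reporting every other second trips the threshold quickly:
flagged = [monitor.record("device-42", t) for t in range(0, 600, 2)]
print(flagged.index(True))
```

Once `record` returns True for a device, that is the point where the real pipeline would invoke the Azure Function to disable it.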
At some point the IoT Hub will start rejecting your messages once the throttling limit for your tier and purchased units is reached. To handle the excess messages, whether the throttling limit has been reached or the IoT Hub is simply slow to process them, you should auto-scale your IoT Hub.
As per the docs linked above:
The sample solution outlined in this article provides the ability to monitor an IoT Hub for the case where the current message count has exceeded a set threshold (for example, 90% of the allowed messages) and, in that case, to automatically scale the IoT Hub up to the next unit of capacity.
At the end of the day you also need to auto-scale your IoT Hub back down, so that the cost stays low when little traffic is received. Check the "Scaling down" section in the article linked above.
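The scale-up check that sample solution performs can be sketched as follows; the per-unit quota constant is an example (check the quotas for your tier), and the real solution reads the message count from IoT Hub metrics:

```python
# Illustrative sketch of the scale-up decision described above.
MESSAGES_PER_UNIT_PER_DAY = 400_000  # example quota; verify for your tier
SCALE_UP_AT = 0.90                   # scale up at 90% of allowed messages

def next_capacity(units: int, messages_today: int) -> int:
    """Return the unit count to run with, adding a unit once today's
    message count crosses the threshold for the current capacity."""
    allowed = units * MESSAGES_PER_UNIT_PER_DAY
    if messages_today >= SCALE_UP_AT * allowed:
        return units + 1
    return units

print(next_capacity(1, 365_000))
```

The scale-down direction would be the mirror image: drop a unit once usage falls comfortably below what the smaller capacity allows.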

Azure messages pattern

Right now, my IoT Hub shows a limit of 8,000 messages per day. I would like to ask about patterns that are used in Azure for this.
I am curious whether I can send data to Azure through some service other than IoT Hub messages, in order to keep the hub from being overloaded by a large amount of data, or to keep some of the data confidential.
For example, I would like to send non-confidential data as IoT Hub messages, and send other data over a WebSocket or some REST protocol. I think there are patterns that serve such scenarios.
Does anyone have experience with this kind of situation?
Not everything needs to go through IoT Hub. IoT Hub is great for two-way communication to/from IoT devices. You could also look at Event Hubs for ingestion from devices that don't need two-way comms. We have a write-up on the differences here: Connecting IoT Devices to Azure: IoT Hub and Event Hubs.

Azure Service Fabric routing

I would like to get some recommendations for designing the routing of IoT messages in Azure.
I have following scenario:
Sensors send messages to Azure IoT Hub in Google Protobuf format. Depending on the type of a message, I want to route it to different applications inside a Service Fabric cluster.
My current approach is to use a Service Fabric application to receive all messages from the IoT Hub, parse the protobuf message, and send it, depending on its type (an attribute inside the protobuf), to a type-specific Azure event hub. Each application then fetches the messages from its "own" event hub and processes them.
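The dispatch step of that routing application can be sketched like this, with dicts standing in for parsed protobuf messages and invented handler names; the registered handlers stand in for the per-type event hubs (or direct calls to the type applications):

```python
# Hypothetical sketch of the "routing application" dispatch logic.
from typing import Callable, Dict

handlers: Dict[str, Callable[[dict], str]] = {}

def register(msg_type: str):
    """Register a handler for one message type (stand-in for forwarding
    to that type's own event hub or application)."""
    def wrap(fn: Callable[[dict], str]) -> Callable[[dict], str]:
        handlers[msg_type] = fn
        return fn
    return wrap

@register("telemetry")
def handle_telemetry(msg: dict) -> str:
    return f"telemetry:{msg['value']}"

@register("alarm")
def handle_alarm(msg: dict) -> str:
    return f"alarm:{msg['value']}"

def route(msg: dict) -> str:
    # msg["type"] stands in for the type attribute inside the protobuf
    return handlers[msg["type"]](msg)

print(route({"type": "alarm", "value": 42}))
```

Whether the handlers forward to event hubs or call the type applications directly is exactly the decoupling trade-off the question asks about; the dispatch table itself stays the same either way.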
I'm not sure if this is the best approach. I don't like having one event hub for each message type. Service Bus Topics are probably not an option, because I have a lot of messages (~30k per second).
Do I really need an event hub to decouple this process, or does it make sense to send the messages from the "routing application" directly to the different "type applications"?
What do you think?
Regards,
Markus
If you really need high performance you should take a look at IoT Hub and Event Hubs. Azure Event Hubs is a highly scalable data streaming platform and event ingestion service capable of receiving and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters.
On the other hand, if you only need 30k messages per second, you can go with Premium Messaging.
Comparison of Azure IoT Hub and Azure Event Hubs
Premium Messaging: How fast is it?
What is Event Hubs?

Does Microsoft Azure IoT Hub stores data?

I have just started learning Azure IoT and it's quite interesting. I am confused about whether IoT Hub stores data somewhere.
For example, suppose I am sending room temperature to IoT Hub and want to store it in a database for further use. How is that possible?
I am clear on how device-to-cloud and cloud-to-device works with IoT hub.
IoT Hub exposes device-to-cloud messages through an Event Hubs endpoint. Event Hubs has a retention time expressed in days. It's a stream of data that the reading client can re-read multiple times, because the cursor is on the client side (not on the server side as with queues and topics). With IoT Hub the related retention time is 1 day by default, but you can change it.
If you want to store received messages from device you need to have a client reading on the Event Hubs exposed endpoint (for example with an Event Processor Host) that has the business logic to process the messages and store them into a database for example.
Of course you could use another decoupling layer, where one client reads from Event Hubs and stores the messages into queues, and another client reads from the queues at its own pace and stores them into the database. This way you keep a fast path for reading from Event Hubs.
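That decoupling can be sketched in-process, with a Python queue standing in for the message queue and a list standing in for the database; in Azure these would be, for example, a Service Bus queue and a real data store:

```python
# Minimal in-process sketch of the decoupling described above: a fast
# reader drains the "event hub" into a queue, while a slower worker
# persists messages at its own pace.
import queue
import threading

buffer = queue.Queue()
database = []  # stand-in for the database

def fast_reader(events):
    """Fast path: drain the Event Hubs endpoint into the queue."""
    for event in events:
        buffer.put(event)

def slow_writer(expected: int):
    """Slow path: persist messages at its own pace (get() blocks
    until data is available)."""
    for _ in range(expected):
        database.append(buffer.get())

events = [{"temperature": t} for t in (18.5, 19.0, 21.2)]
writer = threading.Thread(target=slow_writer, args=(len(events),))
writer.start()
fast_reader(events)
writer.join()
print(len(database))
```

The point of the pattern is visible in the two functions: the reader never waits on the database, so a slow store cannot push the Event Hubs cursor behind the retention window.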
This is pretty much the use case for all IoT scenarios.
Step 1: High scale data ingestion via Event Hub.
Step 2: Create and use a stream processing engine (Stream Analytics or HDInsight/Storm). You can run conditions (SQL-like queries) to filter and store the appropriate data in either a cold or hot store for further analytics.
Step 3: Storage for cold-path analytics can be Azure Blob storage; Stream Analytics can be configured to write data directly into it. The cold store can hold all other data that doesn't require frequent querying, and is cheap.
Step 4: Processing for hot-path analytics. This is data that is queried more regularly, or data on which real-time analytics needs to be carried out, like in your case checking for temperature values going beyond a threshold, which needs an urgent trigger!
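The hot-path condition from step 4 boils down to a threshold filter. A minimal sketch with an illustrative threshold; the real job would express this as a SQL-like Stream Analytics query over the IoT Hub input:

```python
# Sketch of the hot-path check: flag temperature readings beyond a
# threshold. The threshold and field names are illustrative.
THRESHOLD_C = 30.0

def needs_trigger(reading: dict) -> bool:
    """True when a reading should raise an urgent trigger."""
    return reading["temperature"] > THRESHOLD_C

readings = [
    {"device": "room-1", "temperature": 21.4},
    {"device": "room-2", "temperature": 31.8},
]
alerts = [r for r in readings if needs_trigger(r)]
print(alerts)
```

Anything that passes the filter would be routed to the hot store or straight to an alerting output, while the rest flows to the cheap cold store.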
Let me know if you face any challenges while configuring the Stream analytics job! :)
If you take a look at the IoT Suite remote monitoring preconfigured solution (https://azure.microsoft.com/documentation/articles/iot-suite-remote-monitoring-sample-walkthrough/) you'll see that it persists telemetry in blob storage and maintains device status information in DocumentDb. This preconfigured solution gives you a working illustration of the points made in the previous answers.
