How to find what time a packet reached Azure IoT Hub

I have a device which sends data via IoT Hub to Event Hub and then on to other data layers. The data is in JSON format. I have noticed some delay between the time a packet is actually generated at the device (we include a timestamp in the JSON packet) and the time the data reaches my database. I need to identify the ingestion time of a packet when it reaches IoT Hub, so I can analyze where the delay is happening. How can we do that?

Every message gets a couple of system properties added by IoT Hub. What you are looking for is iothub-enqueuedtime.
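For illustration, a minimal sketch that reads it back from the IoT Hub's built-in Event Hub-compatible endpoint using the azure-eventhub Python SDK (v5); the connection string is a placeholder:

    from azure.eventhub import EventHubConsumerClient

    CONN_STR = "<Event Hub-compatible connection string>"  # placeholder

    def on_event(partition_context, event):
        # IoT Hub stamps each message with system properties; in this SDK
        # the IoT Hub-added keys arrive as bytes.
        enqueued = event.system_properties.get(b"iothub-enqueuedtime")
        print(f"payload: {event.body_as_str()} enqueued at: {enqueued}")

    client = EventHubConsumerClient.from_connection_string(
        CONN_STR, consumer_group="$Default"
    )
    with client:
        client.receive(on_event=on_event, starting_position="-1")  # from start

Comparing this value with the device-generated timestamp in the payload tells you how much of the delay occurs before IoT Hub and how much after.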

Related

Azure - ingesting data into IoT Hub and sending notifications via Slack/email for certain messages

I've got data coming into IoT Hub and want to filter on it.
Relevant data I want to forward to Slack as a notification.
I've got the IoT Hub and a Slack subscription in place and am having trouble connecting the two.
In order to do a rather complex time-based query, I figured I'd use Stream Analytics and configure the IoT Hub as input. From research I found that Logic Apps can send messages to Slack over a webhook. Using a Service Bus queue as output for Stream Analytics, I can get the data into Logic Apps.
So it's:
IoT Hub (ingest all data) => Stream Analytics (filter) => Service Bus Queue (queue up the data) => Logic Apps (send to Slack)
Looks a bit bulky but that seems to be one way of doing it (is there a better one?!?)
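For what it's worth, the Slack end seems to be the easy part: an incoming webhook just accepts a JSON POST. A rough Python sketch, with a placeholder webhook URL:

    import requests

    # Placeholder URL; Slack generates the real path when you create the
    # incoming webhook.
    WEBHOOK_URL = "https://hooks.slack.com/services/<your-webhook-path>"

    def notify_slack(text: str) -> None:
        # Incoming webhooks expect a JSON body with a "text" field.
        resp = requests.post(WEBHOOK_URL, json={"text": text})
        resp.raise_for_status()

    notify_slack("Sensor reading crossed threshold")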
Doing this I ran into issues. I selected my IoT Hub as input for Stream Analytics, and the simple query SELECT * INTO [Queue] FROM [Hub] fails, saying there was no data.
That would make sense if the IoT Hub just pushes new data to its endpoints and then discards it. So I created a test set in the Stream Analytics job, and the query runs fine.
However, the data arriving in the hub is not (all) picked up or forwarded by the job to the Service Bus queue. I do see some activity on the queue, but not nearly enough to account for the data I receive.
This seems to be a very common scenario: ingesting data into IoT Hub and sending notifications to email or Slack if the messages are of a certain type. Can you explain the steps to take, or point me to a resource that does? Maybe I'm on the wrong path, as I cannot find anything that describes this.
Thanks

How to bring data from multiple Azure IoT devices with different timestamp properties into the same Azure Time Series Insights environment?

I have multiple Azure IoT devices sending telemetry messages to a single IoT Hub.
Each device has its own timestamp property name. How can I make the data from all these devices go into the same TSI environment?
I tried creating multiple event sources for the same IoT Hub but with different timestamp properties. That makes only one event source's timestamp the $ts at a time, but then how do I query the respective time series data for a specific device?
Time Series Insights requires that the incoming messages have the same schema.
What you can do is make sure that you transform the incoming messages to a canonical data format before ingesting them into Time Series Insights.
My idea would be to have an Azure Function that listens on the standard IoT Hub endpoint and processes all messages coming into IoT Hub. The function makes sure the messages are transformed to a common model and puts them on an Event Hub.
That Event Hub is then the event source for Time Series Insights.
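A minimal sketch of what that function could look like, assuming a Python Azure Function with an Event Hub trigger bound to the IoT Hub's built-in endpoint and an Event Hub output binding declared as $return in function.json; the timestamp property names are placeholders:

    import json
    import azure.functions as func

    # Hypothetical list of the per-device timestamp property names to normalize.
    TIMESTAMP_PROPS = ["timestamp", "eventTime", "ts"]

    def main(event: func.EventHubEvent) -> str:
        msg = json.loads(event.get_body().decode("utf-8"))
        # Rename whichever timestamp property this device uses to the single
        # canonical property the TSI event source is configured with.
        for prop in TIMESTAMP_PROPS:
            if prop in msg:
                msg["timestampUtc"] = msg.pop(prop)
                break
        # The returned value goes to the Event Hub output binding; that
        # Event Hub is then the event source for Time Series Insights.
        return json.dumps(msg)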

Azure IoT device-to-cloud metrics graph drops to zero at a particular timestamp

I have an Azure IoT device connected to an Azure IoT Hub. The device sends 6-7 messages per minute. Looking at the D2C message metrics, I found an outlier: at a specific time, the count of D2C messages was zero (see picture). As the messages are routed to a storage account, I can check the storage to see whether messages are missing at that specific time, but the data saved in storage shows that every message was received correctly. Does anyone know how that can happen, or whether the metrics are generally unreliable? If so, what is the best practice for monitoring IoT Hub message transfer?
(Image: IoT Hub D2C message metrics)
(Image: EnqueuedTimeUtc in the storage)
To precisely monitor the flow of each message through IoT Hub, you will need to trace Azure IoT device-to-cloud messages with distributed tracing (currently in preview).
The trace context includes correlation IDs that let you correlate events from one component with events from another. With it you can automatically log the trace context to Azure Monitor Logs, and measure and understand message flow and latency from devices to IoT Hub and the routing endpoints.
Ref: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-distributed-tracing

How to save money on Azure Event Hub Basic Throughput

I don't quite understand how Microsoft is calculating my Event Hub costs.
The current setup is:
I have a Raspberry Pi Zero which sends messages once a minute via HTTPS to an Azure IoT Hub, and the IoT Hub routes them to the Event Hub. One message seems to be around 2 KB, as seen in the Event Hub throughput graph. The Event Hub is then read by Elastic Logstash, which uploads the messages into Elasticsearch. So we have 2 KB/min of incoming and outgoing traffic.
A raw message looks like:
{ "humidity":98.86653465006785,
"#timestamp":"2021-02-12T01:07:05.883Z",
"pressure":1035.0542695256731,
"#version":"1",
"temperature":-10.694375312741613
}
This is only 149 bytes in total. I got that number by putting the message into a .txt file and looking at the file properties.
My service has now been running for three days and has already cost $0.68, which seems too much in my opinion.
If I interpret the MS Azure Event Hubs pricing page correctly, it charges me $0.015/h for a throughput unit of 1 MB/s incoming and 2 MB/s outgoing.
Did I make a mistake or is there a way of lowering the costs?
For anyone coming across this question: there is indeed a way to reduce the cost of operation. In my case I used the IoT Hub to redirect the messages to the Event Hub, which is pointless here; the separate Event Hub is totally unnecessary for this. You can read from the IoT Hub's built-in Event Hub-compatible endpoint directly to get your messages.
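To put rough numbers on it, assuming the quoted rate is right: a throughput unit is billed for every hour it is provisioned, not per byte sent, so one unit costs about $0.015/h × 24 h ≈ $0.36 per day, or roughly $1.08 over three days, no matter how little traffic flows through it. Dropping the separate Event Hub removes that fixed charge entirely.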

Does Microsoft Azure IoT Hub store data?

I have just started learning Azure IoT and it's quite interesting. I am confused about whether IoT Hub stores data somewhere.
For example: suppose I am sending room temperature to IoT Hub and want to store it in a database for further use. How is that possible?
I am clear on how device-to-cloud and cloud-to-device messaging works with IoT Hub.
IoT Hub exposes device-to-cloud messages through an Event Hubs-compatible endpoint. Event Hubs has a retention time expressed in days. It's a stream of data that the reading client can re-read multiple times, because the cursor is on the client side (not on the server side as with queues and topics). With IoT Hub the retention time is 1 day by default, but you can change it.
If you want to store the messages received from a device, you need a client reading from the exposed Event Hubs endpoint (for example, an Event Processor Host) that has the business logic to process the messages and store them in a database.
Of course, you could add another decoupling layer: one client reads from Event Hubs and stores the messages in queues, and another client reads from the queues at its own pace and stores them in the database. This way you have a fast path reading from Event Hubs.
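A minimal sketch of that fast path, assuming the azure-eventhub and azure-storage-queue Python SDKs; the connection strings and queue name are placeholders:

    from azure.eventhub import EventHubConsumerClient
    from azure.storage.queue import QueueClient

    EH_CONN = "<Event Hub-compatible connection string>"  # placeholder
    Q_CONN = "<storage account connection string>"        # placeholder

    queue = QueueClient.from_connection_string(Q_CONN, "telemetry-queue")

    def on_event(partition_context, event):
        # Fast path: no processing here, just hand the raw message off so a
        # slower worker can persist it to the database at its own pace.
        queue.send_message(event.body_as_str())

    client = EventHubConsumerClient.from_connection_string(
        EH_CONN, consumer_group="$Default"
    )
    with client:
        client.receive(on_event=on_event)  # defaults to reading new events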
This is pretty much the pattern for all IoT scenarios.
Step 1: High-scale data ingestion via Event Hubs.
Step 2: Create and use a stream-processing engine (Stream Analytics or HDInsight/Storm). You can run conditions (SQL-like queries) to filter and store the appropriate data in either a cold or a hot store for further analytics.
Step 3: Storage for cold-path analytics can be Azure Blob storage. Stream Analytics can be configured to write data into it directly. The cold store can hold all the data that doesn't require querying, and it is cheap.
Step 4: Processing for hot-path analytics. This is data that is queried more regularly, or data on which real-time analytics needs to be carried out, like in your case, checking for temperature values going beyond a threshold, which needs an urgent trigger!
Let me know if you face any challenges while configuring the Stream Analytics job! :)
If you take a look at the IoT Suite remote monitoring preconfigured solution (https://azure.microsoft.com/documentation/articles/iot-suite-remote-monitoring-sample-walkthrough/), you'll see that it persists telemetry in blob storage and maintains device status information in DocumentDB. This preconfigured solution gives you a working illustration of the points made in the previous answers.
