Azure IoT Hub - How to get usage data per device

I am using Azure Monitor to view diagnostics/logs for my IoT Hub. Among the metrics available for IoT Hub there is deviceDataUsage. As I understand it, this is the total data usage for all of the devices connected to this IoT Hub.
Is there a built-in monitoring/logging solution for Azure IoT Hub that would allow me to view per-device data usage? Or will I need to use a different tool, such as Stream Analytics, to build my own solution?

Unfortunately there isn't a way to get the data usage of an individual IoT device through the monitoring tab of IoT Hub or through a Kusto query.
There is a workaround of sorts, though it requires some development on your end: if you are routing the messages to an event hub, you can read directly from there and aggregate on the device-id system property. Information on this can be found here: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-event-processor-host#receive-messages. Alternatively, you can include the device-id in the telemetry message itself and query the messages internally on your end to separate out those with specific device-ids. These are merely suggestions that may or may not fit your business needs, of course.
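For illustration, here is a minimal sketch of that aggregation using the azure-eventhub Python package (an assumption on my part; the connection string and hub name are placeholders). It tallies approximate payload bytes per device from the iothub-connection-device-id system property that IoT Hub stamps on each routed message:

from collections import defaultdict

from azure.eventhub import EventHubConsumerClient

CONN_STR = "<event-hub-connection-string>"  # placeholder
EVENTHUB_NAME = "<event-hub-name>"          # placeholder

usage_bytes = defaultdict(int)

def on_event(partition_context, event):
    # IoT Hub stamps the sending device's ID into a system property.
    device_id = event.system_properties.get(b"iothub-connection-device-id")
    if device_id:
        # Approximate usage by payload size, not exact wire bytes.
        usage_bytes[device_id.decode()] += len(event.body_as_str().encode())

client = EventHubConsumerClient.from_connection_string(
    CONN_STR, consumer_group="$Default", eventhub_name=EVENTHUB_NAME)
with client:
    client.receive(on_event=on_event, starting_position="-1")  # read from the start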
Finally, you can reach out directly to the assisted-support team and request the device usage on a case-by-case basis.

Related

Timeline of IoT Edge reported client status

How can I draw (I mean, get the data to draw) a timeline of IoT Hub device client connection state?
I would like to draw an availability status timeline for all my devices. For that I am doing the following:
Every minute: request all '$edgeHub' module identity twins
Save the '$edgeHub' reported clients to a database
Build a timeline from this database
As my number of devices grows this will generate a lot of requests, so I was wondering whether there is a more optimized way to do it using Azure IoT resources.
From the '$edgeHub' module twin I get this sample:
"reported": {
"clients": {
"iot/device": {
"status": "Connected",
"lastConnectedTimeUtc": "2020-11-30T12:00:41.5918442Z",
"lastDisconnectedTimeUtc": "2020-11-30T12:00:41.5737114Z"
}
}
For API calls I am using https://github.com/amenzhinsky/iothub
I'd appreciate any response that helps me investigate Azure device status monitoring further.
1. Query
Instead of requesting all the module twins one by one, I would opt for using an IoT Hub query.
SELECT * FROM devices.modules WHERE is_defined(properties.reported.clients)
I don't know if your SDK supports it, but most (if not all) of the official SDKs have support for running queries. This will return every module twin that has the clients reported property defined. You could run that on a schedule and then save the output to a database as you had originally planned.
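As an illustration, here is a minimal sketch using the azure-iot-hub Python service SDK (an assumption; the Go library you linked may expose a similar query API, and the connection string is a placeholder):

from azure.iot.hub import IoTHubRegistryManager
from azure.iot.hub.models import QuerySpecification

SERVICE_CONN_STR = "<iothub-service-connection-string>"  # placeholder

registry_manager = IoTHubRegistryManager.from_connection_string(SERVICE_CONN_STR)
spec = QuerySpecification(
    query="SELECT * FROM devices.modules WHERE is_defined(properties.reported.clients)")
result = registry_manager.query_iot_hub(spec)  # returns one page of matching twins
for twin in result.items:
    print(twin.device_id, twin.properties.reported.get("clients"))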
2. Route all module twin events to an endpoint
This one is a bit trickier, but you can route device/module twin change events based on a query and send them all to a separate endpoint. The route would be something like:
IS_OBJECT($twin.properties.reported.clients)
You can read more on message routing here. The benefit of this approach is that you don't make any requests to IoT Hub and you receive changes in real time. You can even consume these events using Azure Stream Analytics, which natively supports output to Power BI, Table storage and Cosmos DB. Result: you write no code and use only Azure services. You might want to consult the Azure pricing calculator if you want to leverage Azure Stream Analytics, though.
Note: I did not thoroughly test solution #2, but theoretically this should work.
To add to Matthijs van der Veer's answer, you could also subscribe to device twin changes and update the counters on the twin change event.
Another approach: try sending the device lifecycle events (Device Connected, Device Disconnected) from Event Grid to Event Hubs, and from Event Hubs send them to any endpoint for processing, e.g. a module that listens for the events from Event Hubs.
The flow will be like this:
IoT Hub blade -> Events -> Add Subscription -> add an Event Hubs namespace endpoint
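On the processing side, a rough sketch of parsing those events (field names follow the published Event Grid schema for Microsoft.Devices events; how the raw bytes arrive depends on your transport):

import json

def handle_lifecycle_event(raw_event: bytes) -> None:
    event = json.loads(raw_event)
    device_id = event.get("data", {}).get("deviceId")
    if event.get("eventType") == "Microsoft.Devices.DeviceConnected":
        print(f"{device_id} connected at {event['eventTime']}")
    elif event.get("eventType") == "Microsoft.Devices.DeviceDisconnected":
        print(f"{device_id} disconnected at {event['eventTime']}")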

What is the practical usage of CorrelationId and MessageId when sending telemetry?

In the Azure IoT Hub client telemetry sample there are two calls you can make that are commented out:
// Set Message property
/*(void)IoTHubMessage_SetMessageId(message_handle, "MSG_ID");
(void)IoTHubMessage_SetCorrelationId(message_handle, "CORE_ID");
...
*/
https://github.com/Azure/azure-iot-sdk-c/blob/master/iothub_client/samples/iothub_ll_telemetry_sample/iothub_ll_telemetry_sample.c
I understand I can pass strings into these calls in a certain format.
But what is their use case in Azure?
What should go into these fields to best help the user process the telemetry in Azure IoT Hub?
This page tells me the format of the message ID, but offers little guidance:
https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-construct
IoT Hub is one of the first Azure services to support distributed tracing. As more Azure services support distributed tracing, you'll be able to trace IoT messages throughout the Azure services involved in your solution.
Enabling distributed tracing for IoT Hub gives you the ability to:
Precisely monitor the flow of each message through IoT Hub using trace context. This trace context includes correlation IDs that allow you to correlate events from one component with events from another component. It can be applied to a subset of, or all, IoT device messages using the device twin.
Automatically log the trace context to Azure Monitor diagnostic logs.
Measure and understand message flow and latency from devices to IoT Hub and routing endpoints.
Start considering how you want to implement distributed tracing for the non-Azure services in your IoT solution.
You can find more here.
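To make the two fields concrete, here is a minimal sketch with the azure-iot-device Python SDK (the counterparts of the C sample's IoTHubMessage_SetMessageId/IoTHubMessage_SetCorrelationId; the connection string is a placeholder). A common pattern is a unique MessageId per message, useful for de-duplication, and a CorrelationId shared by all messages in one logical exchange, e.g. a command and its response:

import uuid

from azure.iot.device import IoTHubDeviceClient, Message

DEVICE_CONN_STR = "<device-connection-string>"  # placeholder

client = IoTHubDeviceClient.create_from_connection_string(DEVICE_CONN_STR)
msg = Message('{"temperature": 21.5}')
msg.message_id = str(uuid.uuid4())      # unique per message, enables de-duplication
msg.correlation_id = str(uuid.uuid4())  # shared across related messages
client.send_message(msg)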

How to route Event Hub messages to different Azure Functions based on their message type

I have an Azure Event Hub over which I would like to send various types of messages. Each message should be handled by a separate Azure Function, based on its message type. What is the best way to accomplish this?
Actually, I could create some JSON container with a type and payload property and let one parent Azure Function dispatch all the message payloads - based on their type - to other functions, but that feels a bit hacky.
This question basically asks the same thing; however, it is answered with IoT Hub and message routing. In the Event Hub configuration I cannot find any setting to configure message routing, though.
Or should I switch to an Azure message queue to get this functionality?
I would use Azure Stream Analytics (ASA) to route the messages to the different Azure Functions. ASA lets you specify Event Hubs as a source and several sinks (one of which can be multiple Azure Functions). You can read more about setting up an Azure Stream Analytics job through the Azure portal here. You'll need to set up the Event Hub as your source (docs) and set up your sinks (docs). You then write some SQL-like code to route the messages to the various sinks. Note, however, that ASA is costly relative to other services, since you pay for a fixed amount of compute.
I put some pseudo code below. You'll have to adapt it based on how you configure your ASA job using the information from the attached MS documentation.
-- Route each message type to its own output (one Azure Function per output).
-- Replace [CONDITION] with a filter on your message-type field.
SELECT *
INTO [YourOutputAlias]
FROM [YourInputAlias]
WHERE [CONDITION]

SELECT *
INTO [YourAlternateOutputAlias]
FROM [YourInputAlias]
WHERE [CONDITION]
Based on your additional info about the business requirements, and assuming that the event size is < 64 KB (1 MB in preview), the following outlines an example of this solution (originally shown as a screen snippet):
The concept of the solution is to push a batch of events to the Event Domain endpoint of Azure Event Grid (AEG). An Event Hub-triggered function is responsible for mapping each event message type in the batch to its domain topic before publishing to the AEG.
Note that when using Azure IoT Hub for ingestion of the events, the AEG can be integrated directly with the IoT Hub, and each event message can be distributed in a loosely decoupled pub/sub manner. Besides that, for these business requirements the B1 scale tier for IoT Hub ($10/month) can be used, compared to Basic Event Hubs ($11.16).
The IoT Hub has a built-in message routing mechanism (with some limitations), but a recently added feature of the IoT/AEG integration, publishing a device telemetry message, provides good support in a serverless architecture.
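A minimal sketch of that mapping function, assuming the azure-eventgrid Python SDK, a pre-created Event Grid domain with one topic per message type, and hypothetical 'type'/'payload' fields in each message:

from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridEvent, EventGridPublisherClient

# Placeholders: the Event Grid domain endpoint and its access key.
client = EventGridPublisherClient("<domain-endpoint>", AzureKeyCredential("<domain-key>"))

def dispatch(message: dict) -> None:
    # Publish to the domain topic that matches the message type.
    client.send(EventGridEvent(
        topic=message["type"],            # domain topic name = message type (hypothetical)
        event_type="Telemetry.Received",  # hypothetical event type name
        subject=f"devices/{message.get('deviceId', 'unknown')}",
        data=message.get("payload"),
        data_version="1.0",
    ))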
I ended up using Azure Durable Functions with the Fan-Out/Fan-In pattern.
In this approach, all events are handled by a single orchestrator function (F1), which is a Durable Azure Function. It deserializes the incoming JSON to the correct DTO and, based on the content of the DTO, invokes the corresponding activity function (F2) that processes it.
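A minimal orchestrator sketch using Python Durable Functions (an assumed port of that approach; the activity names and the 'type' field are hypothetical):

import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    dto = context.get_input()  # the already-deserialized event
    # Dispatch to the activity function registered for this message type.
    activity_name = {"typeA": "ProcessTypeA", "typeB": "ProcessTypeB"}[dto["type"]]
    result = yield context.call_activity(activity_name, dto["payload"])
    return result

main = df.Orchestrator.create(orchestrator_function)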

How to send data from a UDF to Cosmos DB in Azure Digital Twins?

Setup so far:
I have created spaces. At the top level I have the IoT Hub resource. In two of the spaces I have attached devices along with their sensors. I have created a matcher for the temperature sensor along with a UDF similar to the one in the documentation. I have also assigned permissions to the UDF. To send data to IoT Hub, I have also fetched the device connection string for the .NET sample.
List of issues I am facing:
When I try to run the .NET sample, I can see that it is able to reach the UDF (checked via debugging), but inside the UDF it is not able to access the telemetry variable as shown in the documentation. The error it shows is:
Unexpected exception occurred while processing user-defined function. Please contact support and provide the correlation ID for the request.
I have created an endpoint to send raw telemetry to Event Hubs. But I want to send the processed data from the UDF to Cosmos DB. Is that possible? If yes, then how?
Thanks for the question and for reaching out. For #2, you could do this with a Notify method in your UDF. You can set up egress to other endpoints such as Event Hubs, Event Grid or Service Bus via the endpoint dispatcher. You would set up the endpoint via the /endpoints API, and then in your UDF you can specify what you want to send out and on which changes. For details on the events and endpoints, see here: https://learn.microsoft.com/en-us/azure/digital-twins/how-to-egress-endpoints
Here is also a link to learn more about connecting Digital Twins to Logic Apps: https://learn.microsoft.com/en-us/azure/digital-twins/tutorial-facilities-events. It follows a similar pattern to sending data over to Cosmos DB.
As for the first issue, I am not sure if you are still seeing it. Which region are you in? Do you have a correlation ID you can pass along? Also, if you turn on logs and look in Azure Monitor, are there details there?

Does Microsoft Azure IoT Hub store data?

I have just started learning Azure IoT and it's quite interesting. I am confused about whether IoT Hub stores data somewhere.
For example, suppose I am sending room temperature to IoT Hub and want to store it in a database for further use. How is that possible?
I am clear on how device-to-cloud and cloud-to-device messaging works with IoT Hub.
IoT Hub exposes device-to-cloud messages through an Event Hubs endpoint. Event Hubs has a retention time expressed in days. It's a stream of data that a reading client can re-read multiple times, because the cursor is on the client side (not on the server side as with queues and topics). With IoT Hub the retention time is 1 day by default, but you can change it.
If you want to store the messages received from devices, you need a client reading from the exposed Event Hubs endpoint (for example with an Event Processor Host) that has the business logic to process the messages and store them into a database, for example.
Of course you could add another decoupling layer, so that one client reads from Event Hubs and stores the messages into queues, and then another client reads from the queues at its own pace and stores them into the database. That way you keep a fast path reading from Event Hubs.
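For illustration, a minimal sketch of such a reading client using the azure-eventhub Python package (an assumption; it is the modern successor to the Event Processor Host), persisting messages into SQLite as a stand-in for a real database:

import sqlite3

from azure.eventhub import EventHubConsumerClient

# Placeholder: the IoT Hub's built-in Event Hub-compatible connection
# string; it must include the EntityPath.
CONN_STR = "<eventhub-compatible-connection-string>"

# check_same_thread=False because receive callbacks run on a worker thread.
db = sqlite3.connect("telemetry.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS telemetry (device_id TEXT, body TEXT)")

def on_event(partition_context, event):
    device_id = event.system_properties.get(b"iothub-connection-device-id", b"").decode()
    db.execute("INSERT INTO telemetry VALUES (?, ?)", (device_id, event.body_as_str()))
    db.commit()

client = EventHubConsumerClient.from_connection_string(CONN_STR, consumer_group="$Default")
with client:
    client.receive(on_event=on_event)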
This is pretty much the use case for all IoT scenarios.
Step 1: High-scale data ingestion via Event Hubs.
Step 2: Create and use a stream-processing engine (Stream Analytics or HDInsight/Storm). You can run conditions (SQL-like queries) to filter and store the appropriate data in either a cold or a hot store for further analytics.
Step 3: Storage for cold-path analytics can be Azure Blob storage. Stream Analytics can be configured to write the data into it directly. The cold path can hold all the data that doesn't require querying, and it is cheap.
Step 4: Processing for hot-path analytics. This is data that is queried more regularly, or data on which real-time analytics needs to be carried out, like in your case checking for temperature values going beyond a threshold, which needs an urgent trigger!
Let me know if you face any challenges while configuring the Stream Analytics job! :)
If you take a look at the IoT Suite remote monitoring preconfigured solution (https://azure.microsoft.com/documentation/articles/iot-suite-remote-monitoring-sample-walkthrough/) you'll see that it persists telemetry in blob storage and maintains device status information in DocumentDb. This preconfigured solution gives you a working illustration of the points made in the previous answers.
