Timeline of IoT Edge reported client status - Azure

How can I draw (I mean, get the data to draw) a timeline of IoT Hub device client connection state?
I would like to draw an availability status timeline for all my devices. For that I am doing the following:
Every minute: request all '$edgeHub' module identity twins
Save the '$edgeHub' reported clients to a database
Build the timeline from this database
As my number of devices grows, this will generate a lot of requests, so I was wondering whether there is a more optimized way to do it using Azure IoT resources.
From the '$edgeHub' module twin I get this sample:
"reported": {
"clients": {
"iot/device": {
"status": "Connected",
"lastConnectedTimeUtc": "2020-11-30T12:00:41.5918442Z",
"lastDisconnectedTimeUtc": "2020-11-30T12:00:41.5737114Z"
}
}
For API calls I am using https://github.com/amenzhinsky/iothub
I'd appreciate any response that helps me investigate Azure device status monitoring further.

1. Query
Instead of requesting all the module twins one by one, I would opt for using an IoT Hub query.
SELECT * FROM devices.modules WHERE is_defined(properties.reported.clients)
I don't know if your SDK supports it, but most (if not all) of the official SDKs support running queries. This will return every module twin that has the clients reported property defined. You could run the query on a schedule and then save the output to a database, as you had originally planned.
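For illustration, here is a minimal sketch with the official C# service SDK (Microsoft.Azure.Devices); the environment-variable name and the console output are assumptions, and you would replace the latter with your database write:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;        // official IoT Hub service SDK
using Microsoft.Azure.Devices.Shared; // Twin

class ClientStatusPoller
{
    static async Task Main()
    {
        // Assumed: a service connection string with registry read permission.
        var registryManager = RegistryManager.CreateFromConnectionString(
            Environment.GetEnvironmentVariable("IOTHUB_CONNECTION_STRING"));

        var query = registryManager.CreateQuery(
            "SELECT * FROM devices.modules WHERE is_defined(properties.reported.clients)");

        while (query.HasMoreResults)
        {
            foreach (Twin twin in await query.GetNextAsTwinAsync())
            {
                // twin.Properties.Reported["clients"] holds the per-client status and
                // lastConnected/lastDisconnected timestamps shown in the question;
                // persist them to your database here.
                Console.WriteLine($"{twin.DeviceId}: {twin.Properties.Reported["clients"]}");
            }
        }
    }
}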
2. Route all module twin events to an endpoint
This one is a bit trickier, but you can route device/module twin changes based on a query and send all matching events to a separate endpoint. The route condition would be something like:
IS_OBJECT($twin.properties.reported.clients)
You can read more on message routing here. The benefit of this approach is that you make no requests to IoT Hub and receive changes in real time. You can even consume these events using Azure Stream Analytics, which natively supports output to Power BI, Table storage, and Cosmos DB. Result: you wrote no code and used only Azure services. You might want to consult the Azure pricing calculator if you want to leverage Azure Stream Analytics, though.
Note: I did not thoroughly test solution #2, but theoretically this should work.

To add to @matthijs-van-der-veer's answer, you could also subscribe to device twin changes and update the counters on each twin change event.
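A minimal sketch of that idea, assuming you route twinChangeEvents to an Event Hub named "twin-changes" and consume it from an Azure Function (all names here are assumptions):

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class TwinChangeHandler
{
    [FunctionName("TwinChangeHandler")]
    public static void Run(
        // Assumed: an IoT Hub route with source twinChangeEvents points at this Event Hub.
        [EventHubTrigger("twin-changes", Connection = "TwinChangesEventHub")] string twinChange,
        ILogger log)
    {
        // The event body contains only the twin sections that changed,
        // e.g. properties.reported.clients for an $edgeHub module.
        log.LogInformation("Twin change: {twinChange}", twinChange);
        // Update your availability counters / database here.
    }
}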

Another approach: try sending the device lifecycle events (Device Connected, Device Disconnected) from Event Grid to Event Hubs, and from Event Hubs send them to any endpoint for processing, e.g. a module that listens to the events from Event Hubs (see the sketch below).
So the flow will be like this:
IoT Hub blade -> Events -> Add Subscription -> Add Event Hubs namespace endpoint
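On the consuming side, a rough sketch of a function reading those lifecycle events from the Event Hub (hub and connection names are assumptions, and I'm assuming Event Grid's default schema, which delivers an array of events per message):

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json.Linq;

public static class LifecycleEventHandler
{
    [FunctionName("LifecycleEventHandler")]
    public static void Run(
        // Assumed: the Event Grid subscription's endpoint is this Event Hub.
        [EventHubTrigger("device-lifecycle", Connection = "LifecycleEventHub")] string batch,
        ILogger log)
    {
        // Event Grid delivers a JSON array of events per message.
        foreach (JToken evt in JArray.Parse(batch))
        {
            string type = (string)evt["eventType"];      // Microsoft.Devices.DeviceConnected / ...DeviceDisconnected
            string deviceId = (string)evt["data"]?["deviceId"];
            log.LogInformation("{deviceId}: {type}", deviceId, type);
        }
    }
}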

Related

Azure function missing IoT hub trigger messages

I have created an Azure function to route messages from an IoT hub to an Azure SQL DB using the IoTHubTrigger, following mostly the instructions in this link: Azure Functions - how to set up IoTHubTrigger for my IoTHub messages?.
Each IoT device captures data every 8 minutes. When capturing is done, the device streams the data in 4 different messages. Then the Azure function takes over and writes these 4 different messages to the database.
When only one device was streaming, I had no issues with the data, which were written to the DB, and I could also see/monitor events/messages using az iot hub monitor-events.
When a second device started streaming to the same IoT hub, I started missing messages, meaning that from each device only one message is being stored in the DB. Also, when using iot hub monitor-events, only one message appears from each device. I was also expecting that if I disabled the 2nd device, the 1st one would go back to normal. Unfortunately the issue remains the same.
So my question is: how is it possible that a 2nd device screws up the way the 1st one interacts with the hub?
If that's not the case, how are we supposed to figure out what causes the problem at this stage?
Thanks :)
Difficult to say without more details. Are you routing messages in IoT Hub somewhere else? I would go back to a clean IoT Hub with one device and create a consumer group on the IoT Hub for the function. Before running the function I would monitor that consumer group (I like to use the Azure IoT Explorer application) to see if data is coming through as expected, then add another device and keep monitoring the same consumer group. If data is coming through then start the function (consuming data from the consumer group).
If telemetry is not getting read from the IoT Hub consumer group, you will need to look at your device code for issues.
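For completeness, a minimal sketch of a function pinned to its own consumer group (the group name "sqlwriter" is just an example; the alias line matches the standard IoTHubTrigger template):

using System.Text;
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;
using IoTHubTrigger = Microsoft.Azure.WebJobs.EventHubTriggerAttribute;

public static class IoTHubToSql
{
    [FunctionName("IoTHubToSql")]
    public static void Run(
        // Assumed: a dedicated consumer group "sqlwriter" exists on the
        // IoT Hub built-in endpoint and nothing else reads from it.
        [IoTHubTrigger("messages/events", Connection = "IoTHubEndpoint", ConsumerGroup = "sqlwriter")] EventData message,
        ILogger log)
    {
        string body = Encoding.UTF8.GetString(message.Body.Array, message.Body.Offset, message.Body.Count);
        log.LogInformation("Received: {body}", body);
        // Write the message to Azure SQL here.
    }
}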

Is it possible to reuse Connections on Azure Functions when sending Device-to-Cloud messages to IoTHub?

I have an Azure IoT Hub with thousands of devices registered. These devices communicate through a Telco provider that sends messages through an Azure Storage Queue. This Storage Queue triggers an Azure Function, which needs to parse the messages and send an event to the IoT Hub.
Currently, we use the Azure IoT Hub SDK to create a DeviceClient for each payload and send the event. Because the DeviceClient represents a device in the IoT Hub and carries the context of the source of the events, we have to recreate a device client for each event. This quickly exceeds the threshold of the number of connections allowed on Azure Functions.
We have tried using the IoT Hub output binding for Azure Functions, but could not get it to work, and I do not think it would work anyway, because we need to make sure that the events get to the IoT Hub with the right context (messages must be sent by the right device).
What's the right way to solve this? Can the connections to the IoT Hub be reused? Should we abandon Azure Functions in favour of something else?
I assume that Telco is some kind of custom device management solution (a vendor-locked solution) that can also communicate with the devices, receive their telemetry, and eventually forward it to a specified endpoint, correct?
If I may ask, and if my assumption is correct: why do you need to deliver the events to IoT Hub if you are not managing the Telco devices through IoT Hub (the arrows on your diagram point in only one direction)?
Using IoT Hub just as a message broker for essentially cloud-to-cloud communication is not beneficial if that is its only purpose. Conceptually, what you described is cloud-to-cloud communication, and IoT Hub is intended to be used for devices.
Here is what I would do. Set up API Management (or an HTTP-triggered Azure Function) as a front door for Telco and pass the messages to an Event Hub.
You can choose to pass, for example, the request body, which is where I assume your telemetry data is.
Keep the IoT Hub, and set up routing to the previously created Event Hub.
Now, in case you have devices that are not vendor-locked and can talk directly to IoT Hub, their messages will be routed to the Event Hub, and the Telco device messages will end up in exactly the same Event Hub.
You can then have, for example, Azure Stream Analytics analyze the data stream from that single Event Hub, for both the Telco devices and any non-Telco devices.
After trying a few things, I ended up moving away from using the SDK for pushing messages to IoT Hub. This is because the SDK uses AMQP, and creating a DeviceClient for each payload is not viable.
We switched to using HTTPS to push the messages to IoT Hub instead, and by using HttpClientFactory we are able to do connection pooling.
I thought I would put this here in case someone has the same issue.
Here is an example of the HTTP request to send a message to IoT Hub:
POST https://<iothubname>.azure-devices.net/devices/<deviceId>/messages/events?api-version=2018-06-30
Authorization: SharedAccessSignature sr=<iothubname>.azure-devices.net&sig=<signature>&skn=iothubowner&se=1570574220
Body: <normal interval or alarm payloads> // example {"deviceid": "abc", "hello": "world"}
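And a trimmed sketch of how the pooled client might be used from the queue-triggered function (the client name and SAS generation are assumptions; registration via services.AddHttpClient is omitted):

using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public class IoTHubSender
{
    private readonly IHttpClientFactory _factory;

    public IoTHubSender(IHttpClientFactory factory) => _factory = factory;

    public async Task SendEventAsync(string deviceId, string sasToken, string payload)
    {
        // Requests share the pooled handlers managed by IHttpClientFactory,
        // so no per-device connection is held open.
        HttpClient client = _factory.CreateClient("iothub");

        var request = new HttpRequestMessage(
            HttpMethod.Post,
            $"https://<iothubname>.azure-devices.net/devices/{deviceId}/messages/events?api-version=2018-06-30");
        request.Headers.TryAddWithoutValidation("Authorization", sasToken); // per-device SAS, generated elsewhere
        request.Content = new StringContent(payload, Encoding.UTF8, "application/json");

        (await client.SendAsync(request)).EnsureSuccessStatusCode();
    }
}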
Lastly, thanks @kgalic for the answer, but your suggestion would not work. This is not pure B2B integration: our implementation has to allow for both devices connecting directly to the IoT Hub and devices connecting through the Telco. This is why every device needs its own identity and digital twin.

How to consume events delivered by Azure Event Grid to GCP

Basically, what I understood from a few Azure topics is as below:
Azure Event Hubs - where data is received initially and converted into events
Service Bus - acting as a queue
Azure Event Grid - where the events converted in the hub are transferred
so the connection is like below:
Hub -> Service Bus -> Event Grid -> Pub Sub -> Storage
I understood this concept. My problem is that I want data to be pushed from Event Grid to GCP (subscriptions/topics). My questions are:
How can I establish this using PUSH method?
What do I need to develop exactly?
How can I push things from grid to pubsub/subscriptions?
I found this link where data is getting published into Event Grid, but I want to push data from Event Grid to GCP. Can anybody explain to me where I am going wrong, or what exactly I should start with? I am new to this and it's very confusing, so I just need a little bit of guidance here.
I have the following doubts:
Is there any direct subscriber option available with the Event Grid listener? I mean, can I directly link my Google storage account to this listener, so that whenever an event is triggered it is pushed directly to my GCP account? (I don't have an Azure account right now since an access request is in progress, so I can't check it; that's why I am asking here.)
Suppose I have 20 columns in my data but I want only 16 of them pushed to GCP; is any customization possible while sending data from Event Grid/Event Hubs to Pub/Sub?
If I write custom connector code as per the links provided in the answers below, how do I run it? I mean, where can I deploy those scripts in the cloud so that they are triggered automatically whenever an event occurs?
Can I implement webhooks in this scenario (as an alternative to connectors)? If yes, how, and on which side do I need to create them?
Also, I read some articles and learned that a few people experienced data loss in this entire process. What's the likelihood of that here, and how can it be avoided?
Can anybody explain to me where I am going wrong, or what exactly I should start with?
It's right here:
so the connection is like below:
Hub -> Service Bus -> Event Grid -> Pub Sub -> Storage
Although this might be the case, it sounds very much as if you're looking at one (very) specific scenario where data flows in this exact way.
Azure Event Hub, Azure Service Bus and Azure Event Grid can work together, but can also be used completely separate from each other.
Event Grid
The purpose of Event Grid is to enable Reactive programming. Use this when you want to react to (status) changes.
Event Hubs
Event Hubs facilitate a big data pipeline. Use this when you need telemetry and distributed data streaming.
Service Bus
The purpose of Service Bus is to enable High-value enterprise messaging. Use this when you want to do something like Order processing and financial transactions.
In some cases, you use the services side by side to fulfill distinct roles. For example, an ecommerce site can use Service Bus to process the order, Event Hubs to capture site telemetry, and Event Grid to respond to events like an item was shipped.
In other cases, you link them together to form an event and data pipeline. You use Event Grid to respond to events in the other services. For an example of using Event Grid with Event Hubs to migrate data to a data warehouse, see Stream big data into a data warehouse.
Taken from the very interesting and important documentation article Choose between Azure messaging services - Event Grid, Event Hubs, and Service Bus
EDIT
My problem is I want data to be pushed from Event Grid to GCP (subscriptions/topics). So how can I establish this using the PUSH method?
Possibly the simplest solution is to have an Event Grid Event trigger a webhook (which might run an Azure Function or a Google Cloud Function) which in turn puts the event/message on the GCP Topic.
Publishing messages is quite well documented. There are examples of how to do so with a REST call, the command line, C#, Go, Java, Node.js, PHP, Python, and Ruby.
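For instance, if the webhook handler were written in C#, the publish step with the Google.Cloud.PubSub.V1 package could look roughly like this (project and topic names are placeholders, and credentials are assumed to be configured via GOOGLE_APPLICATION_CREDENTIALS):

using System.Threading.Tasks;
using Google.Cloud.PubSub.V1;

public static class GcpForwarder
{
    public static async Task PublishEventAsync(string eventJson)
    {
        // Assumed: topic "azure-events" exists in project "my-gcp-project".
        TopicName topic = TopicName.FromProjectTopic("my-gcp-project", "azure-events");
        PublisherClient publisher = await PublisherClient.CreateAsync(topic);

        // Forward the raw Event Grid event body to Pub/Sub.
        string messageId = await publisher.PublishAsync(eventJson);
    }
}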
EDIT 2
What you need to do is create an Event Grid Subscription to listen to and handle Event Grid Events.
As an example, you can create a subscription that listens for events from a specific Storage Account and calls a WebHook whenever such an event occurs.
Pay attention to the "Endpoint Details": that's where you can specify, for instance, the webhook to call every time an event is triggered.
The easiest way to transfer the Event Hub-generated events would probably be to create an Event Hubs event receiver in Node.js (which you mentioned in your comments) as described here, which receives the events and publishes them directly to Cloud Pub/Sub, as described in the Cloud Pub/Sub publisher documentation for Node.js.

How to send data from UDF to Cosmos DB in Azure Digital Twin?

Setup till now:
I have created spaces. At the top level I have the IoT Hub resource. In two of the spaces I have attached devices along with sensors. I have created a matcher for the temperature sensor along with a UDF that is similar to the one in the documentation. I have also assigned permissions to the UDF. To send data to the IoT Hub, I have also fetched the device connection string for the dotnet sample.
List of issues I am facing:
When I try to run the dotnet sample, I can see that it is able to reach the UDF (checked via debugging), but inside the UDF it is not able to access the telemetry variable as given in the documentation. The error it shows is:
Unexpected exception occurred while processing user-defined function. Please contact support and provide the correlation ID for the request.
I have created an endpoint to send raw telemetry to Event Hubs. But I want to send the processed data from the UDF to Cosmos DB. Is it possible? If yes, then how?
Thanks for the question and for reaching out. For #2, you could do this with a notify call in your UDF. You can set up egress to other endpoints such as Event Hubs, Event Grid, or Service Bus via the endpoint dispatcher. You would set up the endpoint via the /endpoints API, and then in your UDF you can specify what you want to send out and on which changes. For details on the events and endpoints, see here: https://learn.microsoft.com/en-us/azure/digital-twins/how-to-egress-endpoints
Here is also a link to learn more about connecting Digital Twins to Logic Apps: https://learn.microsoft.com/en-us/azure/digital-twins/tutorial-facilities-events - it follows a similar pattern to sending data over to Cosmos DB.
As for the first issue, I am not sure if you are still seeing it. Which region? Do you have a correlation ID that you can pass along? Also, if you turn on logs and look in Azure Monitor, are there details there?

Can Azure IoT Hub be used to read (GET) data from devices?

In my case I have 1000+ devices that store activity data internally. I need to send an HTTP GET request to these devices to get that data in CSV or JSON format and save it in storage hosted on Azure.
Can IoT Hub request data using a GET request, and can such reads be scheduled daily/weekly?
What other Azure services would you suggest to facilitate these scheduled reads?
You have not mentioned which Azure IoT Hub scale tier is used. Basically, there are two pricing groups, Basic and Standard, with significantly different costs and capabilities. The Basic tier offers only services for one-way communication from the devices to Azure IoT Hub.
Based on that, the following scenarios can be used for your business case:
1. Basic Tier (non event-driven solution)
The device periodically pushes telemetry and non-telemetry messages to the Azure IoT Hub as needed, where the non-telemetry messages are routed to an Azure Function via a Service Bus queue/topic. The responsibility of this non-telemetry pipe is to persist the real device state in a database. Note that 6M messages will cost only $50/month. The back-end application can query this database for device state at any time.
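A rough sketch of the function at the end of that non-telemetry pipe (queue name, connection setting, and sink are assumptions):

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class DeviceStatePersister
{
    [FunctionName("DeviceStatePersister")]
    public static void Run(
        // Assumed: an IoT Hub route sends the non-telemetry messages to this queue.
        [ServiceBusTrigger("device-state", Connection = "ServiceBusConnection")] string message,
        ILogger log)
    {
        log.LogInformation("Device state message: {message}", message);
        // Upsert the reported device state into your database here.
    }
}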
2. Standard Tier (event-driven solution)
In this scenario you can use the device twin of the Azure IoT Hub to store the real device state in the cloud back end (as described by @HelenLo). The device can be triggered to update its state (reported properties) by a C2D message, a desired property change, a direct method invocation, or a device-side (edge) trigger.
Azure IoT Hub has the capability to run scheduled jobs across multiple devices.
In this solution, the back-end application can at any time run an ExportDevicesAsync job to blob storage; see more details here. Note that 6M messages will cost $250/month.
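The export call itself is small; a sketch with the C# service SDK, where the connection string and container SAS URI settings are assumptions:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class DeviceExporter
{
    static async Task Main()
    {
        var registryManager = RegistryManager.CreateFromConnectionString(
            Environment.GetEnvironmentVariable("IOTHUB_CONNECTION_STRING"));

        // Assumed: a SAS URI for a blob container with write permission.
        string containerSasUri = Environment.GetEnvironmentVariable("EXPORT_CONTAINER_SAS");

        // Starts an IoT Hub job that writes all device identities to the
        // container; excludeKeys: true leaves authentication keys out.
        JobProperties job = await registryManager.ExportDevicesAsync(containerSasUri, excludeKeys: true);
        Console.WriteLine($"Export job {job.JobId} started.");
    }
}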
As you can see above, each scenario requires building a different device logic model based on the communication capabilities between the devices and Azure IoT Hub and back. Note that there are some limitations on these communications; see more details here.
You can consider using the device twin of IoT Hub:
https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-device-twins
Use device twins to:
Store device-specific metadata in the cloud. For example, the deployment location of a vending machine.
Report current state information such as available capabilities and conditions from your device app. For example, a device is connected to your IoT hub over cellular or WiFi.
Synchronize the state of long-running workflows between device app and back-end app. For example, when the solution back end specifies the new firmware version to install, and the device app reports the various stages of the update process.
Query your device metadata, configuration, or state.
IoT Hub provides you with the ability to connect your devices over various protocols. Preferred protocols are messaging protocols such as MQTT or AMQP, but HTTPS is also supported. Using IoT Hub, you do not request data from the device, though. The device sends the data to the IoT Hub. You have two options to implement that with IoT Hub:
The device connects to the IoT Hub whenever it has some data to be sent, and pushes the data up to IoT Hub
The device does not send any data on its own, but stays always, or at least regularly, connected to IoT Hub. You can then send a cloud-to-device message over IoT Hub to the device, requesting the data to be sent. The device then sends the data the same way it would in the first option (see the sketch after this list).
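A sketch of the back-end side of that second option with the C# service SDK (the device ID and the command payload the device understands are assumptions):

using System;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;

class DataRequester
{
    static async Task Main()
    {
        var serviceClient = ServiceClient.CreateFromConnectionString(
            Environment.GetEnvironmentVariable("IOTHUB_CONNECTION_STRING"));

        // Assumed: the device interprets this payload as "send your stored data now".
        var request = new Message(Encoding.UTF8.GetBytes("{\"command\":\"sendData\"}"));
        await serviceClient.SendAsync("myDeviceId", request);
    }
}

Run something like this on a schedule, for example from a timer-triggered Azure Function, to get the daily/weekly reads you asked about.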
When the data has been sent to IoT Hub, you need to push it somewhere where it is persistently stored; IoT Hub only keeps messages for 1 day by default. Options for this are:
Create a blob storage account and push to it directly from IoT Hub using a custom endpoint. This would probably be the easiest and cheapest option. Depending on how you need to access your data, a blob might not be the best fit, though.
Create a function app, create a function with an EventHubTrigger, connect it to IoT Hub, and let the function process incoming data by outputting it into any kind of data sink, such as SQL, Cosmos DB, or Table storage (see the sketch below).
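A sketch of that second option with the Cosmos DB output binding as the sink (database, collection, and setting names are assumptions; the alias line matches the standard IoTHubTrigger template):

using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using IoTHubTrigger = Microsoft.Azure.WebJobs.EventHubTriggerAttribute;

public static class TelemetryToCosmos
{
    [FunctionName("TelemetryToCosmos")]
    public static async Task Run(
        [IoTHubTrigger("messages/events", Connection = "IoTHubEndpoint")] string message,
        [CosmosDB("devicedata", "activity", ConnectionStringSetting = "CosmosConnection")] IAsyncCollector<object> documents)
    {
        // Persist each incoming device message as a document; the binding
        // generates a document id automatically when none is supplied.
        await documents.AddAsync(new { payload = message });
    }
}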
