How to send data from a UDF to Cosmos DB in Azure Digital Twins?

Setup so far:
I have created spaces. At the top level I have the IoT Hub resource. In two of the spaces, I have attached devices along with their sensors. I have created a matcher for the temperature sensor along with a UDF similar to the one in the documentation, and I have assigned permissions to the UDF. To send data to the IoT Hub, I have also fetched the device connection string for the dotnet sample.
List of issues I am facing:
When I try to run the dotnet sample, I can see that it is able to reach the UDF (checked via debugging), but inside the UDF it is not able to access the telemetry variable as shown in the documentation. The error it shows is:
Unexpected exception occurred while processing user-defined function. Please contact support and provide the correlation ID for the request.
I have created an endpoint to send Raw Telemetry to an Event Hub. But I want to send the processed data from the UDF to Cosmos DB. Is that possible? If yes, then how?

Thanks for the question and for reaching out. For #2, you can do this by calling the notification method in your UDF. You can set up egress to other endpoints such as Event Hubs, Event Grid, or Service Bus via the endpoint dispatcher. You would create the endpoint via the /endpoints API, and then in your UDF specify what you want to send out and on which changes. For details on the events and endpoints, see: https://learn.microsoft.com/en-us/azure/digital-twins/how-to-egress-endpoints
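To make the Cosmos DB leg concrete, here is a minimal, hedged sketch of creating such an endpoint with a raw HTTP call to the Digital Twins (preview) management API. The instance URL, token source, connection strings, and the exact payload fields are placeholders to be checked against the egress doc above; UdfCustom is the event type that carries what your UDF sends out.

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    class CreateUdfEndpoint
    {
        static async Task Main()
        {
            // Hypothetical instance URL and token; authenticate the same way as the sample app.
            var managementApi = "https://yourInstance.yourRegion.azuresmartspaces.net/management/api/v1.0";
            var accessToken = Environment.GetEnvironmentVariable("DIGITAL_TWINS_TOKEN");

            // Endpoint subscribed to UdfCustom events, i.e. the notifications raised by your UDF.
            var body = @"{
                ""type"": ""EventHub"",
                ""eventTypes"": [ ""UdfCustom"" ],
                ""connectionString"": ""Endpoint=sb://yourNamespace.servicebus.windows.net/;SharedAccessKeyName=SendRule;SharedAccessKey=yourKey"",
                ""secondaryConnectionString"": ""yourSecondaryConnectionString"",
                ""path"": ""your-event-hub-name""
            }";

            using var client = new HttpClient();
            client.DefaultRequestHeaders.Add("Authorization", $"Bearer {accessToken}");
            var response = await client.PostAsync(
                $"{managementApi}/endpoints",
                new StringContent(body, Encoding.UTF8, "application/json"));
            Console.WriteLine(response.StatusCode);
        }
    }

Cosmos DB itself is not a directly supported endpoint type, so the usual pattern is exactly the one in the tutorial below: an Azure Function (or Logic App) listening on that Event Hub writes each processed UDF message into Cosmos DB.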
Here is also a link to learn more about connecting Digital Twins to Logic Apps: https://learn.microsoft.com/en-us/azure/digital-twins/tutorial-facilities-events which follows a similar pattern to sending data over to Cosmos DB.
As for the first issue, I am not sure if you are still seeing this. Which region? Do you have a correlation ID that you can pass along? Also, if you turn on logs and look in Azure Monitor, are there details there?

Related

Timeline of IoT Edge reported clients status

How can I draw (I mean, get the data to draw) a timeline of IoT Hub device client connection states?
I would like to draw an availability status timeline for all my devices. For that I am doing the following:
Every minute: request every '$edgeHub' module identity twin
Save the '$edgeHub' reported clients to a database
Build the timeline from this database
As my number of devices grows this will generate a lot of requests, and I was wondering whether there is a more optimized way to do it using Azure IoT resources.
From the '$edgeHub' module twin I get this sample:
"reported": {
"clients": {
"iot/device": {
"status": "Connected",
"lastConnectedTimeUtc": "2020-11-30T12:00:41.5918442Z",
"lastDisconnectedTimeUtc": "2020-11-30T12:00:41.5737114Z"
}
}
For API calls I am using https://github.com/amenzhinsky/iothub
I'd appreciate any response that helps me investigate Azure device status monitoring further.
1. Query
Instead of requesting all the module twins one by one, I would opt for using an IoT Hub query.
SELECT * FROM devices.modules WHERE is_defined(properties.reported.clients)
I don't know if your SDK supports that, but most (if not all) of the official SDKs have support for running queries. This will return every module twin that has the clients reported property defined. You could run it on a schedule and then save the output to a database as you had originally planned, as in the sketch below.
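For illustration, a minimal sketch with the official .NET service SDK (Microsoft.Azure.Devices); the connection string setting name is a placeholder:

    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Devices;

    class TwinQuery
    {
        static async Task Main()
        {
            // Hypothetical setting name; use your IoT Hub's service connection string.
            var registryManager = RegistryManager.CreateFromConnectionString(
                Environment.GetEnvironmentVariable("IOTHUB_SERVICE_CONNECTION"));

            var query = registryManager.CreateQuery(
                "SELECT * FROM devices.modules WHERE is_defined(properties.reported.clients)");

            while (query.HasMoreResults)
            {
                foreach (var twin in await query.GetNextAsTwinAsync())
                {
                    // Each twin here is an $edgeHub module twin with a reported 'clients' section;
                    // persist twin.DeviceId and the clients' connection states to your database.
                    Console.WriteLine($"{twin.DeviceId}: {twin.Properties.Reported["clients"]}");
                }
            }
        }
    }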
2. Route all module twin events to an endpoint
This one is a bit more tricky, but you can route device/module twin change events to a separate endpoint based on a query. The route would be something like:
IS_OBJECT($twin.properties.reported.clients)
You can read more on message routing here. The benefit of this approach is that you don't make any requests to IoT Hub and receive changes in real time. You can even consume these events using Azure Stream Analytics, which natively supports output to Power BI, Table storage, and Cosmos DB. Result: you wrote no code and used only Azure services. You might want to consult the Azure pricing calculator if you want to leverage Azure Stream Analytics, though.
Note: I did not thoroughly test solution #2, but theoretically this should work.
To add to @matthijs-van-der-veer's answer, you could also subscribe to device twin changes and update the counters on each twin change event.
Another approach: try sending the device lifecycle events (Device Connected, Device Disconnected) from Event Grid to an Event Hub, and from the Event Hub send them to any endpoint for processing, e.g. a module that listens for the events from the Event Hub.
So the flow will be like this:
IoT Hub blade -> Events -> Add Subscription -> add an Event Hub namespace endpoint
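As a rough sketch of such a processor (the event hub name, connection setting, and persistence step are placeholders; it also assumes one Event Grid event per Event Hub message):

    using System.Text.Json;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class LifecycleProcessor
    {
        // Reads Event Grid lifecycle events that were forwarded to an Event Hub.
        [FunctionName("LifecycleProcessor")]
        public static void Run(
            [EventHubTrigger("lifecycle-events", Connection = "LifecycleEventHub")] string[] events,
            ILogger log)
        {
            foreach (var e in events)
            {
                using var doc = JsonDocument.Parse(e);
                var root = doc.RootElement;

                // Event Grid schema: eventType is e.g. Microsoft.Devices.DeviceConnected.
                var eventType = root.GetProperty("eventType").GetString();
                var deviceId = root.GetProperty("data").GetProperty("deviceId").GetString();
                var when = root.GetProperty("eventTime").GetString();

                // Persist the transition for the availability timeline (storage layer up to you).
                log.LogInformation("{Device} {Event} at {Time}", deviceId, eventType, when);
            }
        }
    }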

How to route Event Hub messages to different Azure functions based on their message type

I have an Azure Event Hub over which I would like to send various types of messages. Each message should be handled by a separate Azure Function, based on their message type. What is the best way to accomplish this?
Actually, I could create some JSON container with a type and a payload property and let one parent Azure Function dispatch all the message payloads - based on their type - to other functions, but that feels a bit hacky.
This question basically asks the same thing - however, the answer explains how it can be done using the IoT Hub and message routing. In the Event Hub configuration, though, I cannot find any setting for message routing.
Or should I switch to an Azure message queue to get this functionality?
I would use Azure Stream Analytics to route the messages to the different Azure Functions. An ASA job allows you to specify Event Hubs as a source and several sinks (which can include multiple Azure Functions). You can read more about setting up Azure Stream Analytics jobs through the Azure portal here. You'll need to set up the Event Hub as your source (docs) and set up your sinks (docs). You then write SQL-like code to route the messages to the various sinks. Note, however, that ASA is costly relative to other services, since you're paying for a fixed amount of compute.
I put some pseudo code below. You'll have to adapt it based on how you configure your ASA job, using the information from the attached MS documentation.
SELECT
    *
INTO
    [YourOutputAlias]
FROM
    [YourInputAlias]
WHERE
    [CONDITION]

SELECT
    *
INTO
    [YourAlternateOutputAlias]
FROM
    [YourInputAlias]
WHERE
    [CONDITION]
Based on your additional info about the business requirements, and assuming an event size < 64 KB (1 MB in preview), the following screen snippet shows an example of your solution:
The concept of the above solution is based on pushing a batch of events to the Event Domain endpoint of Azure Event Grid (AEG). The Event Hub-triggered function is responsible for mapping each event message type in the batch to a domain topic before publishing it to the AEG.
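For illustration, a hedged sketch of that dispatcher using the current Azure.Messaging.EventGrid client (the domain endpoint, key, and the type-to-topic rule are placeholders):

    using System;
    using System.Threading.Tasks;
    using Azure;
    using Azure.Messaging.EventGrid;
    using Microsoft.Azure.WebJobs;

    public static class DomainDispatcher
    {
        [FunctionName("DomainDispatcher")]
        public static async Task Run(
            [EventHubTrigger("ingest", Connection = "IngestEventHub")] string[] messages)
        {
            // Hypothetical Event Grid domain endpoint and key.
            var client = new EventGridPublisherClient(
                new Uri(Environment.GetEnvironmentVariable("EVENTGRID_DOMAIN_ENDPOINT")),
                new AzureKeyCredential(Environment.GetEnvironmentVariable("EVENTGRID_DOMAIN_KEY")));

            foreach (var message in messages)
            {
                // Placeholder: derive the message type from the payload, e.g. a 'type' JSON property.
                string messageType = ExtractType(message);

                var ev = new EventGridEvent(
                    subject: "telemetry",
                    eventType: messageType,
                    dataVersion: "1.0",
                    data: BinaryData.FromString(message))
                {
                    // For an Event Grid domain, Topic selects the domain topic, so each
                    // message type lands on its own topic (and its own subscriber function).
                    Topic = messageType
                };

                await client.SendEventAsync(ev);
            }
        }

        static string ExtractType(string message) =>
            System.Text.Json.JsonDocument.Parse(message).RootElement.GetProperty("type").GetString();
    }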
Note that when using the Azure IoT Hub for event ingestion, the AEG can be integrated directly with the IoT Hub, and each event message can be distributed in a loosely decoupled pub/sub manner. Besides that, for these business requirements the B1 scale tier for IoT Hub ($10/month) can be used, compared to Basic Event Hubs ($11.16).
The IoT Hub has a built-in message routing mechanism (with some limitations), but the recent IoT Hub/AEG integration features, such as publishing device telemetry messages, give good support to a serverless architecture.
I ended up using Azure Durable Functions with the Fan-Out/Fan-In pattern.
In this approach, all events are handled by a single orchestrator function, which is in fact a durable Azure Function (F1). It deserializes the incoming JSON to the correct DTO. Based on the content of the DTO, a corresponding activity function (F2) is invoked to process it.
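A minimal sketch of that shape, assuming Durable Functions 2.x (the DTO, event hub name, and activity naming convention are hypothetical):

    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.DurableTask;

    public class MessageDto
    {
        public string Type { get; set; }
        public string Payload { get; set; }
    }

    public static class Dispatcher
    {
        // F1: one orchestration per incoming event, routes by message type.
        [FunctionName("Orchestrator")]
        public static async Task RunOrchestrator(
            [OrchestrationTrigger] IDurableOrchestrationContext context)
        {
            var message = context.GetInput<MessageDto>();
            // Route each DTO to the activity (F2) that knows how to process it.
            await context.CallActivityAsync($"Handle_{message.Type}", message.Payload);
        }

        [FunctionName("Handle_Temperature")]
        public static void HandleTemperature([ActivityTrigger] string payload) { /* process */ }

        // Client function: deserializes each event and starts an orchestration for it.
        [FunctionName("EventHubClient")]
        public static async Task Start(
            [EventHubTrigger("telemetry", Connection = "EventHubConnection")] string[] events,
            [DurableClient] IDurableOrchestrationClient starter)
        {
            foreach (var e in events)
            {
                var dto = System.Text.Json.JsonSerializer.Deserialize<MessageDto>(e);
                await starter.StartNewAsync("Orchestrator", dto);
            }
        }
    }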

IoT Hub to Email Notification

I am developing an Azure IoT Hub use case.
Multiple load cells continuously send data (every half second) to the IoT Hub (DeviceID, weight).
A SQL table holds the user data.
I want to build a system that sends an email notification to the device owner at a certain weight.
What is the right approach to achieve that?
I have seen that Logic Apps is an option, but how do I implement it with multiple user accounts and devices?
I would use IoT Hub routing to push the messages that meet the weight criteria to a Service Bus queue. From there you can use an Azure Function with a Service Bus trigger. I assume the user account information (e-mail address?) is available via a query on the SQL table. Azure Functions have a SendGrid binding that you'd then use to send out the e-mail.
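A hedged sketch of such a function; the queue name, connection settings, table and column names, and message shape are all placeholders:

    using System;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Data.SqlClient;
    using SendGrid.Helpers.Mail;

    public static class WeightAlert
    {
        [FunctionName("WeightAlert")]
        public static void Run(
            // Message routed here by the IoT Hub route that filters on weight.
            [ServiceBusTrigger("weight-alerts", Connection = "ServiceBusConnection")] string message,
            [SendGrid(ApiKey = "SendGridApiKey")] out SendGridMessage email)
        {
            // Placeholder parsing: assume the routed body carries deviceId and weight.
            var alert = System.Text.Json.JsonSerializer.Deserialize<Alert>(message);

            // Look up the owner's e-mail address in the SQL table.
            string ownerEmail;
            using (var conn = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnection")))
            {
                conn.Open();
                using var cmd = new SqlCommand(
                    "SELECT Email FROM DeviceOwners WHERE DeviceId = @id", conn);
                cmd.Parameters.AddWithValue("@id", alert.DeviceId);
                ownerEmail = (string)cmd.ExecuteScalar();
            }

            email = new SendGridMessage();
            email.SetFrom("alerts@example.com");
            email.AddTo(ownerEmail);
            email.SetSubject($"Weight alert for {alert.DeviceId}");
            email.AddContent("text/plain", $"Reported weight: {alert.Weight}");
        }

        public class Alert
        {
            public string DeviceId { get; set; }
            public double Weight { get; set; }
        }
    }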
Note that routing from IoT Hub directly to a Function is on the backlog.
Basically, there are two solutions for your scenario when each device has its own weight criteria:
The device twin's desired properties contain a weight value, which the real device uses to publish a non-telemetry alert message to the Azure IoT Hub. This alert message can be routed in the Azure IoT Hub Routes to a custom endpoint the same way as described in Jim's answer (Service Bus -> Azure Function -> SendGrid).
The second solution is more complex, generic, and very flexible, and it doesn't require any special coding on the device side or in the device twin. It's based on the standard telemetry stream pipeline with an Azure Stream Analytics (ASA) job analyzing events and generating a notification message for output to the Azure Function with SendGrid. The ASA job uses reference data (user data, weight, etc.) from a blob file generated and refreshed from the SQL database.
The following screen snippet shows this solution:
I would like to present another approach which I think is also correct (I tested this flow):
Data is sent to the Azure IoT Hub from the device
Azure Stream Analytics filters this data based on weight and deviceID
Once the data is analyzed, the Azure Function is called, which triggers the Azure Logic App with the data collected from Stream Analytics (see the sketch after this list)
The Azure Logic App receives the data (HTTP trigger) from the Azure Function App
Then the Logic App uses the "Get row" action to get the user data from the SQL database
The last step is up to you - you can use either the "SendGrid - send e-mail" action or integrate the Logic App with Outlook or even Office 365, Gmail, and other services.
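Step 3 could look roughly like this; the function is registered as an ASA output, and the Logic App callback URL setting is a placeholder:

    using System;
    using System.IO;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class ForwardToLogicApp
    {
        static readonly HttpClient http = new HttpClient();

        // ASA's Azure Function output posts a JSON array of filtered events here.
        [FunctionName("ForwardToLogicApp")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
        {
            string body = await new StreamReader(req.Body).ReadToEndAsync();

            // Hypothetical setting holding the Logic App's HTTP trigger callback URL.
            var logicAppUrl = Environment.GetEnvironmentVariable("LOGICAPP_CALLBACK_URL");
            var response = await http.PostAsync(
                logicAppUrl, new StringContent(body, Encoding.UTF8, "application/json"));

            // A non-2xx status tells ASA the output batch failed, so it can retry.
            return new StatusCodeResult((int)response.StatusCode);
        }
    }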
Here are the links to the documentation:
Connect to SQL Server or Azure SQL Database from Azure Logic Apps
Send emails and manage mailing lists in SendGrid by using Azure Logic Apps

Azure Function: IoTHub as Input and Output

I've developed an Azure Function to handle decompression of messages as they enter the IoT Hub.
The Function is connected to the IoT Hub's built-in messaging endpoint, so the hub can act like an Event Hub.
What I would like to do is have the Function output the decompressed content back into the IoT Hub, so the Stream Analytics and other jobs I have running will not have to be connected to a different endpoint to continue receiving telemetry.
There is a fair amount of documentation around Azure Functions and hooking them up to IoT Hubs, but some of it is from last year and I know things have changed quite a bit.
This is my current connection string for reading from and writing to the same IoT Hub:
Endpoint=sb://iothub-ns-34997-5db385cb1f.servicebus.windows.net/;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx=;EntityPath=IoTHub
Right now I've set up the output to go to the IoT Hub endpoint, and I'm getting this error:
Exception while executing function: Functions.DecompressionJS. Microsoft.Azure.WebJobs.Host: Error while handling parameter _binder after function returned:. Microsoft.ServiceBus: Unauthorized access. 'Send' claim(s) are required to perform this operation. Resource: 'sb://iothub-ns-34997-5db385cb1f.servicebus.windows.net/iothub'. TrackingId:e85de1ed565243bcb30bc622a2cab252_G4, SystemTracker:gateway6, Timestamp:6/22/2017 9:20:16 PM.
So I figured there was something wrong with the connection string, and I modified it to include the /iothub that the exception was telling me to use, since the rest of the endpoint matched the current connection string.
Once I updated the connection string and re-ran the function, I got a different exception:
Exception while executing function: Functions.DecompressionJS. Microsoft.Azure.WebJobs.Host: Error while handling parameter _binder after function returned:. Microsoft.ServiceBus: Invalid EventHub address. It must be either of the following. Sender: <EventHubName>. Partition Sender: <EventHubName>/Partitions/<PartitionNumber>. Partition Receiver: <EventHubName>/ConsumerGroups/<ConsumerGroupName>/Partitions/<PartitionNumber>. TrackingId:ecb290822f494a86a61c21712656ea4c_G0, SystemTracker:gateway6, Timestamp:6/22/2017 8:44:14 PM.
So at this point I'm thinking that the IoT Hub endpoint is only for reading messages, and there is no way to get the decompressed content back into the IoT Hub.
I'm hoping someone can prove me wrong and help me configure my connection strings so I can have a closed loop and receive and send messages to and from the IoT Hub without an intermediary.
The Azure IoT Hub is a bidirectional gateway between the devices and the Azure cloud back-end solutions. Communication with the Azure IoT Hub is done via its device-facing and service-facing endpoints. See more details here.
Your scenario requires decompressing a device event before it is passed to the telemetry stream pipeline. Basically, this telemetry pre-processing in the typical Azure stream pipeline can be done in an Azure Function (or worker role) and/or an Azure Stream Analytics (ASA) job, as shown in the following picture:
As you can see, the Azure Function and/or ASA job transform the real-time telemetry data in the stream pipeline, and their output is stored in the next entity, such as an Event Hub. That's the common and recommended pattern for a real-time stream pipeline and push model.
Your scenario also requires keeping the same telemetry path (source) as for the uncompressed device events, so a "non-standard" solution is needed. The following screen snippet shows an example of this solution:
The concept of the above solution is based on a device emulator on the back-end side. The Azure IoT Hub Routes forward all events for pre-processing to a custom endpoint, such as an Event Hub.
Behind that, the Azure Function is responsible for decompressing an ingested event and creating a new one for that device via an emulated device. This emulated device can then send a D2C message to the Azure IoT Hub like any other real device.
Note that the emulated device uses the HTTPS protocol (connectionless) and Azure IoT Hub authorization.
The events from the emulated devices are then routed in the Azure IoT Hub to the default Event Hub, i.e. the default telemetry path.
Note that the above solution allows selecting which events are pre-processed, based on the Routes/Rules, and its usage depends on the business model.
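A rough sketch of the decompress-and-resend function described above, assuming GZip compression; the event hub name and connection settings are placeholders:

    using System;
    using System.IO;
    using System.IO.Compression;
    using System.Threading.Tasks;
    using Microsoft.Azure.Devices.Client;
    using Microsoft.Azure.WebJobs;

    public static class DecompressAndResend
    {
        [FunctionName("DecompressAndResend")]
        public static async Task Run(
            // Custom endpoint that the IoT Hub route forwards compressed events to.
            [EventHubTrigger("compressed-events", Connection = "RoutedEventHub")] byte[] compressed)
        {
            // Assumption: the device compresses its payload with GZip.
            using var input = new MemoryStream(compressed);
            using var gzip = new GZipStream(input, CompressionMode.Decompress);
            using var output = new MemoryStream();
            await gzip.CopyToAsync(output);

            // Emulated device identity (hypothetical connection string setting),
            // sending over HTTPS as described above.
            var device = DeviceClient.CreateFromConnectionString(
                Environment.GetEnvironmentVariable("EMULATED_DEVICE_CONNECTION"),
                TransportType.Http1);

            // The re-sent D2C message follows the default telemetry path like any real device.
            await device.SendEventAsync(new Message(output.ToArray()));
        }
    }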

How to use a WebJob to process IoT Hub messages and save them to a SQL database?

I'm trying to create a complete solution to present data from IoT devices on a webpage.
The data and devices will never number in the millions, so using Stream Analytics, Machine Learning, Big Data, etc. is costly and unnecessary.
I've looked at docs, blogs, and forums for weeks now, and I'm stuck on how to process the messages that the IoT Hub receives. I want to save them to a SQL database and then build a website that presents them to the users.
What I have so far:
1. Device part
A Raspberry Pi 3 with Windows IoT Core installed
Messages are sent and received successfully on both Hub and Device ends
(verified with Device Explorer and the IoT Hub dashboard)
2. Processing part
The most similar approach is detailed here, but I don't want to use NoSQL. I've tried to use an Azure Function with the External Table (experimental) binding, but there is zero documentation for it and all my attempts failed with a function error.
Now I'm trying to connect a WebJob to process the IoT Hub messages, but I can't find any relevant samples or docs. Essentially I'd want to convert a console app into a WebJob which is triggered when a message arrives at the IoT Hub.
3. Webpage part
Once I get the messages into the SQL database, I will create my custom portal for managing and registering devices, issuing one-off commands to devices, and for request-response data.
The telemetry will be queried from the database and presented statically or in near real time (with SignalR), by device type, location, user privileges, etc. This part is pretty clear to me.
Can anyone please help me out with the processing part?
I found a solution using Azure WebJobs, and this article explains how to tie an Event Hub (IoT Hub) to the WebJob.
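For completeness, a minimal sketch of such a WebJob on the current WebJobs SDK; the Event Hub-compatible endpoint name, connection settings, and table schema are placeholders:

    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Data.SqlClient;
    using Microsoft.Extensions.Hosting;

    public class Functions
    {
        // Fires for each message arriving on the IoT Hub's built-in Event Hub-compatible endpoint.
        public static async Task ProcessTelemetry(
            [EventHubTrigger("your-iothub-ehub-name", Connection = "IoTHubEndpoint")] string message)
        {
            using var conn = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnection"));
            await conn.OpenAsync();
            using var cmd = new SqlCommand(
                "INSERT INTO Telemetry (Body, ReceivedUtc) VALUES (@body, SYSUTCDATETIME())", conn);
            cmd.Parameters.AddWithValue("@body", message);
            await cmd.ExecuteNonQueryAsync();
        }
    }

    class Program
    {
        static async Task Main()
        {
            // Continuous WebJob host; checkpointing uses the AzureWebJobsStorage account.
            var host = new HostBuilder()
                .ConfigureWebJobs(b =>
                {
                    b.AddAzureStorageCoreServices();
                    b.AddEventHubs();
                })
                .Build();
            await host.RunAsync();
        }
    }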
