Azure Function: IoTHub as Input and Output - node.js

I've developed an Azure Function to handle decompression of messages as they enter the IoT Hub.
The Function is connected to the IoT Hub's built-in messaging endpoint, so it can function like an Event Hub.
What I would like to do is have the Function output the decompressed content back into the IoT Hub, so the Stream Analytics and other jobs that I have running will not have to be connected to a different endpoint to continue receiving telemetry.
There seems to be a fair amount of documentation on hooking Azure Functions up to IoT Hubs, but some of it is from last year and I know things have changed quite a bit.
This is my current connection string to read and write to the same IoTHub:
Endpoint=sb://iothub-ns-34997-5db385cb1f.servicebus.windows.net/;SharedAccessKeyName=iothubowner;SharedAccessKey=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx=;EntityPath=IoTHub
Right now I've set up the output to go to the IoT Hub endpoint, and I'm getting an error:
Exception while executing function: Functions.DecompressionJS. Microsoft.Azure.WebJobs.Host: Error while handling parameter _binder after function returned:. Microsoft.ServiceBus: Unauthorized access. 'Send' claim(s) are required to perform this operation. Resource: 'sb://iothub-ns-34997-5db385cb1f.servicebus.windows.net/iothub'. TrackingId:e85de1ed565243bcb30bc622a2cab252_G4, SystemTracker:gateway6, Timestamp:6/22/2017 9:20:16 PM.
So I figured there was something wrong with the connection string, and I modified it to include the /iothub that the exception pointed to, since the rest of the endpoint matched the current connection string.
Once I updated the connection string and reran the function I got a different exception:
Exception while executing function: Functions.DecompressionJS. Microsoft.Azure.WebJobs.Host: Error while handling parameter _binder after function returned:. Microsoft.ServiceBus: Invalid EventHub address. It must be either of the following. Sender: <EventHubName>. Partition Sender: <EventHubName>/Partitions/<PartitionNumber>. Partition Receiver: <EventHubName>/ConsumerGroups/<ConsumerGroupName>/Partitions/<PartitionNumber>. TrackingId:ecb290822f494a86a61c21712656ea4c_G0, SystemTracker:gateway6, Timestamp:6/22/2017 8:44:14 PM.
So at this point I'm thinking that the IoTHub endpoint is only for reading messages and there is no way to get the decompressed content back into the IoTHub.
I'm hoping someone can prove me wrong and help me to configure my connection strings so I can have a closed loop and retrieve and send messages to and from the IoTHub without an intermediary.

Azure IoT Hub is a bidirectional gateway between devices and Azure cloud back-end solutions. Communication with the IoT Hub happens via its device-facing and service-facing endpoints. See more details here.
Your scenario requires decompressing a device event before it is passed to the telemetry stream pipeline. In a typical Azure stream pipeline, this kind of telemetry pre-processing is done in an Azure Function (or worker role) and/or an Azure Stream Analytics (ASA) job.
In that pattern, the Azure Function and/or ASA job transform the real-time telemetry in the stream pipeline, and their output is stored in the next entity, such as an Event Hub. That is the common and recommended pattern for a real-time stream pipeline and push model.
Your scenario also requires keeping the same telemetry path (source) as for uncompressed device events, so a "non-standard" solution is needed.
The concept of this solution is based on a device emulator on the back-end side. The Azure IoT Hub routes forward all events that need preprocessing to a custom endpoint, such as an Event Hub.
Behind that endpoint, an Azure Function is responsible for decompressing each ingested event and creating a new one on behalf of the originating device, acting as an emulated device. This emulated device can then send a D2C message to the Azure IoT Hub like any real device.
Note that the emulated device uses the HTTPS protocol (connectionless) and Azure IoT Hub authorization.
Events from the emulated devices are routed to the default Event Hub-compatible endpoint, i.e. the default telemetry path.
Note that this solution lets you select which events are preprocessed based on routes/rules; whether it fits depends on your business model.
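For illustration, here is a minimal sketch of such a Function in Node.js. It assumes gzip-compressed payloads, the azure-iot-device and azure-iot-device-http packages, and an app setting holding the emulated device's connection string; the names and bindings are illustrative, not the answer's exact implementation.

// Sketch: decompress a routed event and re-send it to IoT Hub
// as an emulated device over HTTPS.
const zlib = require('zlib');
const { Client, Message } = require('azure-iot-device');
const { Http } = require('azure-iot-device-http');

module.exports = async function (context, eventHubMessage) {
    // The routed event arrives gzip-compressed; inflate it back to text
    // (assumes the trigger binding delivers the raw bytes).
    const decompressed = zlib.gunzipSync(Buffer.from(eventHubMessage)).toString('utf8');

    // Connection string of the emulated device identity (assumed app setting).
    const client = Client.fromConnectionString(process.env.EMULATED_DEVICE_CONNECTION, Http);
    await client.open();
    // D2C message: lands on the default telemetry path like any device event.
    await client.sendEvent(new Message(decompressed));
    await client.close();
};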

Related

Azure - ingesting data into IoT Hub and sending notifications via slack/email for certain messages

I've got data coming into an IoT Hub and want to filter on it.
Relevant data I want to forward to Slack as a notification.
I've got the IoT Hub and a Slack subscription in place and am having trouble connecting the two.
In order to do a rather complex time-based query, I figured I would use Stream Analytics and configure the IoT Hub as its input. From research I found that Logic Apps can send messages to Slack over a webhook. Using a Service Bus queue as the output for Stream Analytics, I can get the data into Logic Apps.
So it's:
IoT Hub (ingest all data) => Stream Analytics (filter) => Service Bus Queue (queue up the data) => Logic Apps (send to Slack)
Looks a bit bulky but that seems to be one way of doing it (is there a better one?!?)
Doing this I ran into issues. I selected my IoT Hub as input for Stream Analytics and the simple query SELECT * INTO [Queue] FROM [Hub] fails, saying there was no data.
That would make sense if the IoT Hub just pushes new data to its endpoints and then discards it. So I created a test data set in the Stream Analytics job, and against that the query runs fine.
However, the data I do get into the hub is not (all) picked up and forwarded by the job to the Service Bus queue. I see some activity on the queue, but not nearly enough to account for the data I receive.
This seems to be a very common scenario: ingesting data in IoT Hub and sending notifications to email or Slack if they are of a certain type. Can you explain the steps to take, or point me to a resource that covers it? Maybe I'm on the wrong path, as I cannot find anything that describes this.
Thanks

Azure function missing IoT hub trigger messages

I have created an Azure Function to route messages from an IoT Hub to an Azure SQL DB using the IoTHubTrigger, mostly following the instructions in this link: Azure Functions - how to set up IoTHubTrigger for my IoTHub messages?.
Each IoT device captures data every 8 minutes. When capturing is done, the device streams the data in 4 different messages. The Azure Function then takes over and writes these 4 messages to the database.
When only one device was streaming, I had no issues: the data was written to the DB, and I could also see/monitor events/messages using az iot hub monitor-events.
When a second device started streaming to the same IoT Hub, I started missing messages, meaning that from each device only one message is being stored in the DB. Also, when using az iot hub monitor-events, only one message appears from each device. I was expecting that if I disabled the 2nd device, the 1st one would go back to normal. Unfortunately the issue remains.
So my question is: how is it possible that a 2nd device screws up the way the 1st one interacts with the hub?
And if that's not what's happening, how are we supposed to figure out what causes the problem at this stage?
Thanks :)
Difficult to say without more details. Are you routing messages in IoT Hub somewhere else? I would go back to a clean IoT Hub with one device and create a consumer group on the IoT Hub for the function. Before running the function, I would monitor that consumer group (I like to use the Azure IoT Explorer application) to see if data is coming through as expected, then add another device and keep monitoring the same consumer group. If data is coming through, then start the function (consuming data from the consumer group).
If telemetry is not getting read from the IoT Hub consumer group, then you will need to look at your device code for issues.
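For reference, a dedicated consumer group can be created and watched with the Azure CLI along these lines (the group name is illustrative):

az iot hub consumer-group create --hub-name <your-hub> --name functionscg
az iot hub monitor-events --hub-name <your-hub> --consumer-group functionscg

The function's IoTHubTrigger would then reference that consumer group instead of the default one, so the function and ad-hoc monitoring tools do not compete for the same partitions.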

Is it possible to reuse Connections on Azure Functions when sending Device-to-Cloud messages to IoTHub?

I have an Azure IoT Hub with thousands of devices registered. These devices communicate through a Telco provider, which sends messages through an Azure Storage queue. This Storage queue triggers an Azure Function, which needs to parse the messages and send an event to the IoT Hub as below.
Currently, we use the Azure IoT Hub SDK to create a DeviceClient for each payload and send the event. Because the DeviceClient represents a device in the IoT Hub and carries the context of the source of the events, we have to recreate a device client for each event. This quickly exceeds the threshold for the number of connections allowed on Azure Functions.
We have tried using the IoT Hub output bindings for Azure Functions, but could not get them to work, and I do not think they would work anyway, because we need to make sure that the events get to the IoT Hub with the right context (messages sent by the right device).
What's the right way to solve this? Can the connections to the IoT Hub be reused? Should we abandon Azure Functions in favour of something else?
I assume that the Telco is some kind of custom device-management solution (vendor-locked) that can also communicate with the devices, receive their telemetry, and eventually forward it to a specified endpoint, correct?
If I may ask, and if my assumption is correct: why do you need to deliver the events to IoT Hub if you are not managing the Telco devices through IoT Hub (the arrows on your diagram go only in one direction)?
Using IoT Hub just as a message broker for what is essentially cloud-to-cloud communication is not beneficial if that is the only purpose. Conceptually, what you described is cloud-to-cloud communication, and IoT Hub is intended for devices.
Here is what I would do: set up API Management (or an HTTP-triggered Azure Function) as a front door for the Telco and pass the messages on to an Event Hub.
You can choose what to pass along, for example the request body, which I assume is where your telemetry data is.
Keep the IoT Hub and set up routing to the previously created Event Hub.
Now, if you have devices that are not vendor-locked and can talk directly to IoT Hub, their messages will be re-routed to the Event Hub, and the Telco device messages will be routed to exactly the same Event Hub.
You can then have, for example, Azure Stream Analytics analyze the data stream from that single Event Hub, for both Telco and non-Telco devices.
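As a rough sketch of that front door in Node.js (assuming the @azure/event-hubs v5 SDK; the Event Hub name and app setting are illustrative):

const { EventHubProducerClient } = require('@azure/event-hubs');

// Created once per Function host instance so the underlying connection is reused.
const producer = new EventHubProducerClient(process.env.EVENTHUB_CONNECTION, 'telco-events');

// HTTP-triggered Azure Function: accept a Telco payload and forward it to Event Hub.
module.exports = async function (context, req) {
    const batch = await producer.createBatch();
    batch.tryAdd({ body: req.body }); // telemetry assumed to be in the request body
    await producer.sendBatch(batch);
    context.res = { status: 202 };
};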
After trying a few things, I ended up moving away from using the SDK for pushing messages to IoT Hub. This is because the SDK uses AMQP, and creating a DeviceClient for each payload is not viable.
We switched to pushing the messages to IoT Hub over plain HTTPS instead, and by using HttpClientFactory we are able to do connection pooling.
I thought I would put this here in case someone has the same issue.
Here is an example of the HTTP request to send a message to IoT Hub:
POST https://<iothubname>.azure-devices.net/devices/<deviceId>/messages/events?api-version=2018-06-30
Authorization: SharedAccessSignature sr=<iothubname>.azure-devices.net&sig=<signature>&skn=iothubowner&se=1570574220
Body: <normal interval or alarm payloads> // example: {"deviceid": "abc", "hello": "world"}
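In Node.js, the rough equivalent of the HttpClientFactory pooling is a shared keep-alive agent; a minimal sketch (hub name, device ID, and SAS token handling are illustrative):

const https = require('https');

// One keep-alive agent shared across invocations pools the TCP/TLS connections.
const agent = new https.Agent({ keepAlive: true, maxSockets: 50 });

function sendD2C(hubName, deviceId, sasToken, payload) {
    return new Promise((resolve, reject) => {
        const req = https.request({
            host: `${hubName}.azure-devices.net`,
            path: `/devices/${deviceId}/messages/events?api-version=2018-06-30`,
            method: 'POST',
            agent, // reuse pooled connections
            headers: { 'Authorization': sasToken, 'Content-Type': 'application/json' }
        }, res => res.statusCode === 204 ? resolve() : reject(new Error(`HTTP ${res.statusCode}`)));
        req.on('error', reject);
        req.end(JSON.stringify(payload)); // IoT Hub answers 204 No Content on success
    });
}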
Lastly, thanks @kgalic for the answer, but your suggestion would not work. This is not pure B2B integration: our implementation has to allow both devices connecting directly to the IoT Hub and devices connecting through the Telco. This is why every device needs to have its own identity and digital twin.

How to route Event Hub messages to different Azure functions based on their message type

I have an Azure Event Hub over which I would like to send various types of messages. Each message should be handled by a separate Azure Function, based on their message type. What is the best way to accomplish this?
Actually, I could create some JSON container with a type and a payload property and let one parent Azure Function dispatch all the message payloads, based on their type, to other functions, but that feels a bit hacky.
This question basically asks the same thing; however, the answer explains how it can be done using IoT Hub and message routing. In the Event Hub configuration I cannot find any setting to configure message routing, though.
Or should I switch to an Azure Message Queue to get this functionality?
I would use Azure Stream Analytics to route the messages to the different Azure Functions. An ASA job lets you specify an Event Hub as a source and several sinks (which can include multiple Azure Functions). You can read more about setting up Azure Stream Analytics services through the Azure portal here. You'll need to set up the Event Hub as your source (docs) and set up your sinks (docs). You then write some SQL-like code to route the messages to the various sinks. However, ASA is costly relative to other services, since you're paying for a fixed amount of compute.
I put some pseudocode below. You'll have to adapt it based on how you configure your ASA, using the information from the attached MS documentation.
-- Route one message type to the first Function sink
SELECT *
INTO [YourOutputAlias]
FROM [YourInputAlias]
WHERE [CONDITION]

-- Route another message type to a second Function sink
SELECT *
INTO [YourAlternateOutputAlias]
FROM [YourInputAlias]
WHERE [CONDITION]
Based on your additional info about the business requirements, and assuming the event size is < 64 KB (1 MB in preview), an Azure Event Grid (AEG) event domain can be used for this solution.
The concept is based on pushing a batch of events to the event domain endpoint of the AEG. The EventHub-triggered function is responsible for mapping each event message type in the batch to its domain topic before publishing to the AEG; see the sketch below.
Note that when Azure IoT Hub is used for event ingestion, the AEG can be integrated directly with the IoT Hub, and each event message can be distributed in a loosely coupled pub/sub manner. Besides that, these business requirements can be covered by the B1 scale tier of IoT Hub ($10/month), compared to Basic Event Hubs ($11.16).
IoT Hub has a built-in message routing mechanism (with some limitations), but the relatively new IoT Hub/AEG integration, such as publishing a device telemetry message to Event Grid, gives good support in a serverless architecture.
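A sketch of that mapping step in Node.js (assuming the @azure/eventgrid v4 SDK, an Event Grid domain endpoint and key in app settings, and an illustrative type-to-topic convention):

const { EventGridPublisherClient, AzureKeyCredential } = require('@azure/eventgrid');

// Publisher for the Event Grid domain endpoint, created once and reused.
const client = new EventGridPublisherClient(
    process.env.EVENTGRID_DOMAIN_ENDPOINT,
    'EventGrid', // Event Grid event schema
    new AzureKeyCredential(process.env.EVENTGRID_DOMAIN_KEY)
);

// EventHub-triggered function: publish each event to the domain topic for its type.
module.exports = async function (context, eventHubMessages) {
    await client.send(eventHubMessages.map(msg => ({
        topic: msg.messageType, // domain topic per message type (assumed convention)
        eventType: msg.messageType,
        subject: `devices/${msg.deviceId}`,
        dataVersion: '1.0',
        data: msg
    })));
};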
I ended up using Azure Durable Functions with the fan-out/fan-in pattern.
In this approach, all events are handled by a single orchestrator function, which is in fact a Durable Azure Function (F1). It deserializes the incoming JSON to the correct DTO and, based on the content of the DTO, invokes a corresponding activity function (F2) that processes it.
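The dispatch could look roughly like this in Node.js Durable Functions (the activity names and DTO shape are illustrative):

const df = require('durable-functions');

module.exports = df.orchestrator(function* (context) {
    const events = context.df.getInput(); // batch of already-deserialized DTOs
    const activityByType = {
        temperature: 'HandleTemperature',
        alarm: 'HandleAlarm'
    };
    // Fan out: one activity (F2) per event, then fan in by awaiting all of them.
    const tasks = events
        .filter(e => activityByType[e.type])
        .map(e => context.df.callActivity(activityByType[e.type], e.payload));
    yield context.df.Task.all(tasks);
});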

How to send data from UDF to Cosmos DB in Azure Digital Twin?

Setup till now:
I have created spaces. At the top level I have the IoT Hub resource. In two of the spaces I have attached devices along with their sensors. I have created a matcher for the temperature sensor, along with a UDF similar to the one in the documentation. I have also assigned permissions to the UDF. To send data to the IoT Hub, I have also fetched the device connection string for the dotnet sample.
List of issues I am facing:
When I try to run the dotnet sample, I can see that it is able to reach the UDF (checked via debugging), but within the UDF it is not able to access the telemetry variable as shown in the documentation. The error it shows is:
Unexpected exception occurred while processing user-defined function. Please contact support and provide the correlation ID for the request.
I have created an endpoint to send raw telemetry to Event Hub. But I want to send the processed data from the UDF to Cosmos DB. Is that possible? If yes, how?
Thanks for the question and for reaching out... For #2, you could do this by using a notify call in your UDF. You can set up egress to other endpoints such as Event Hub, Event Grid, or Service Bus via the endpoint dispatcher. You would set up the endpoint via the /endpoints API, and then in your UDF specify what you want to send out and on which changes. For details on the events and endpoints, see here: https://learn.microsoft.com/en-us/azure/digital-twins/how-to-egress-endpoints
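As an illustration, a UDF in the style of the documented ADT v1 samples could dispatch its processed result with Notify; the helper names follow those samples, and the egress endpoint is assumed to be registered already:

function process(telemetry, executionContext) {
    // Look up the sensor and parse the raw reading, as in the UDF samples.
    var sensor = getSensorMetadata(telemetry.SensorId);
    var reading = JSON.parse(telemetry.Message);

    // Build the processed payload you want to end up in Cosmos DB.
    var processed = { sensorId: telemetry.SensorId, value: reading.SensorValue };

    // Notify() dispatches the payload to the registered egress endpoint
    // (e.g. Event Hub); a downstream consumer can then write it to Cosmos DB.
    sensor.Space().Notify(JSON.stringify(processed));
}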
Here is also a link to learn more about connecting Digital Twins to Logic Apps: https://learn.microsoft.com/en-us/azure/digital-twins/tutorial-facilities-events - it follows a pattern similar to sending data over to Cosmos DB.
As for the first issue, I am not sure if you are still seeing it. Which region? Do you have a correlation ID that you can pass along? Also, if you turn on logs and look in Azure Monitor, are there details there?
