How to use a WebJob to process IoT Hub messages and save them to a SQL database?

I'm trying to create a complete solution to present data from IoT devices on a webpage.
The data and devices will never be in the millions, so using Stream Analytics, Machine Learning, Big Data, etc. is costly and unnecessary.
I've looked at docs, blogs, and forums for weeks now, and I'm stuck on how to process the messages that the IoT hub receives. I want to save them to a SQL database and then build a website that presents them to the users.
What I have so far:
1. Device part
Raspberry Pi 3 has Windows IoT Core installed
Messages are sent and received on both Hub and Device ends successfully
(verified with Device Explorer and IoT hub dashboard)
2. Processing part
The most similar approach is detailed here, but I don't want to use NoSQL. I've tried to use an Azure Function with the External Table (experimental) output, but there is zero documentation for it and all my attempts failed with a function error.
Now I'm trying to connect a WebJob to process IoT Hub messages, but I can't find any relevant samples or docs. Essentially, I'd like to convert a console app to a WebJob that is triggered when a message arrives at the IoT hub.
3. Webpage part
Once I get the messages into the SQL database, I will create my custom portal for managing and registering devices, issuing one-off commands to devices, and for request-response data.
The telemetry will be queried from the database and presented statically or in near real time (with SignalR), by device type, location, user privileges, etc. This part is pretty clear to me.
Please can anyone help me out with the processing part?

I found a solution using Azure WebJobs, and this article explains how to tie an Event Hub (IoT Hub) to the WebJob.
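The WebJobs SDK wiring itself is C# (an `EventHubTrigger` attribute on a method parameter), but the processing loop is the same in any language: read from the IoT Hub's Event Hub-compatible endpoint, parse the message, insert a row. Here is a minimal Python sketch of that loop, assuming the `azure-eventhub` and `pyodbc` packages; the table name, column names, and telemetry fields (`deviceId`, `temperature`, `timestamp`) are all illustrative placeholders, not anything the article prescribes:

```python
import json

# Illustrative target table:
#   CREATE TABLE Telemetry (DeviceId nvarchar(64), Temperature float, Ts nvarchar(32))
INSERT_SQL = "INSERT INTO Telemetry (DeviceId, Temperature, Ts) VALUES (?, ?, ?)"

def telemetry_to_row(body: bytes) -> tuple:
    """Turn one IoT Hub message body (JSON) into SQL insert parameters.

    The field names (deviceId, temperature, timestamp) are assumptions;
    adjust them to whatever your device actually sends.
    """
    msg = json.loads(body)
    return (msg["deviceId"], float(msg["temperature"]), msg["timestamp"])

if __name__ == "__main__":
    # The consume loop needs the non-stdlib packages azure-eventhub and
    # pyodbc; every connection string below is a placeholder.
    import pyodbc
    from azure.eventhub import EventHubConsumerClient

    db = pyodbc.connect("DRIVER={ODBC Driver 18 for SQL Server};"
                        "SERVER=myserver.database.windows.net;DATABASE=mydb;"
                        "UID=myuser;PWD=mypassword")

    def on_event(partition_context, event):
        # One row per message; for restart-safe progress tracking in
        # production, configure a checkpoint store on the client.
        cur = db.cursor()
        cur.execute(INSERT_SQL, telemetry_to_row(event.body_as_str().encode()))
        db.commit()

    client = EventHubConsumerClient.from_connection_string(
        "Endpoint=sb://my-hub.servicebus.windows.net/;...",  # Event Hub-compatible endpoint
        consumer_group="$Default",
        eventhub_name="my-hub",  # Event Hub-compatible name from the portal
    )
    with client:
        client.receive(on_event=on_event, starting_position="-1")
```

The same split (a pure parse function plus a thin consume loop) carries over directly to the C# WebJob version, where the loop is replaced by the `EventHubTrigger` binding.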

Related

Azure how to get events shown in CLI from IoT to a database

I am having some issues actually retrieving and using the data I send to the IoT Hub in Azure. When I run 'az iot hub monitor-events --hub-name ' in the CLI I can see my events, and I can also send messages to my devices in the IoT hub.
I have then tried to create a stream to forward the messages to my SQL database, but without any luck. Do you have any suggestions on how to retrieve this data?
There are multiple ways to go about this. The two most common scenarios are probably using an Azure Function or using a Stream Analytics job. I don't know what you've tried up until this point, but a Stream Analytics job is probably the easiest way to go.
Stream Analytics
This answer on SO could be what you're looking for; it also links to this tutorial that you could follow from "Create a new Azure SQL Database" onwards. It covers creating an IoT Hub input and an Azure SQL output on your Stream Analytics job, and using a simple query to link the two together. There is more info in the Microsoft docs here.
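For reference, the query in such a Stream Analytics job can be as simple as a pass-through from the IoT Hub input to the SQL output. The input/output alias names and the column list below are illustrative; the selected columns must match the columns of your target SQL table (`EventEnqueuedUtcTime` is a system property Stream Analytics adds to IoT Hub events):

```sql
SELECT
    deviceId,
    temperature,
    EventEnqueuedUtcTime AS ts
INTO
    [sql-output]
FROM
    [iothub-input]
```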
Azure Function
While looking this one up I found this answer, which is mine, awkward. But it describes how you can go about creating an Azure Function that accepts IoT Hub messages and shoots them to your database. This option is a lot more cost-efficient (or even free, if you use the consumption plan for a Function) for a few devices.
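For reference, the IoT Hub trigger on such a Function is just an Event Hub trigger pointed at the hub's Event Hub-compatible endpoint. A minimal `function.json` binding might look like the following; the hub name, connection setting name, and consumer group are placeholders for your own values:

```json
{
  "bindings": [
    {
      "type": "eventHubTrigger",
      "direction": "in",
      "name": "message",
      "eventHubName": "my-iot-hub",
      "connection": "IoTHubEventHubConnectionString",
      "consumerGroup": "$Default"
    }
  ]
}
```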

Azure function missing IoT hub trigger messages

I have created an Azure function to route messages from an IoT hub to an Azure SQL DB using the IoTHubTrigger, mostly following the instructions in this link: Azure Functions - how to set up IoTHubTrigger for my IoTHub messages?.
Each IoT device captures data every 8 minutes. When capturing is done, the device streams the data in 4 different messages. Then the Azure function takes over to write these 4 different messages to the database.
When only one device was streaming, I had no issues with the data, which were written to the db, and I could also see/monitor events/messages using az iot hub monitor-events.
When a second device started streaming to the same IoT hub, I started missing messages, meaning that from each device only one message is being stored in the db. Also, when using iot hub monitor-events, only one message appears from each device. I was also expecting that if I disabled the 2nd device, the 1st one would go back to normal. Unfortunately, the issue remains the same.
So my question is: how is it possible that a 2nd device screws up the way the 1st one interacts with the hub?
And if that's not the case, then how are we supposed to figure out what is causing the problem at this stage?
Thanks :)
Difficult to say without more details. Are you routing messages in IoT Hub somewhere else? I would go back to a clean IoT Hub with one device and create a consumer group on the IoT Hub for the function. Before running the function I would monitor that consumer group (I like to use the Azure IoT Explorer application) to see if data is coming through as expected, then add another device and keep monitoring the same consumer group. If data is coming through then start the function (consuming data from the consumer group).
If telemetry is not being read from the IoT Hub consumer group, then you will need to look at your device code for any issues.
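Assuming you have the Azure CLI, creating a dedicated consumer group for the function and then monitoring that same group looks roughly like this (hub and group names are placeholders):

```shell
# Create a dedicated consumer group on the IoT Hub's built-in endpoint
az iot hub consumer-group create --hub-name my-hub --name funcgroup

# Watch that consumer group while the devices stream
az iot hub monitor-events --hub-name my-hub --consumer-group funcgroup
```

A dedicated consumer group matters here because two readers sharing one group (your function and monitor-events both on $Default, for example) will steal partitions from each other, which looks exactly like missing messages.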

How should I link a backend solution to an IoT Hub

So, I am working on an IoT solution on Azure. We have been using a partner solution where we had the partner's devices linked to their cloud solution, which exposes the data to us via REST services. Now we want to have our own IoT cloud solution on Azure.
At first, I am planning to build a bridge between our IoT solution and the partner's cloud solution via its REST services, which will link to our IoT Hub in order to ingest the data into our cloud.
Also, the data will not be only telemetry data; we'll have to send commands to those devices as well.
My question: I would like to know what would be the appropriate technology/solution to use as a gateway (Data Grid, Azure Function, Azure WebJob)?
The numbers in the picture represent the steps I am considering to tackle this problem.
1- First, we are implementing an application gateway that will get the data from the partner's system and send commands to their system. It will allow us to first build the other components of our system and make sure it can handle what is in place right now.
2- Second, the partner's devices will connect directly to a device gateway that is connected to our IoT Hub. In this case, we will no longer use the gateway made in 1.
3- Finally, we will have our own devices connected to our IoT Hub; the partner's devices will still be connected to our IoT Hub via the gateway built in 2.
Let me try to answer your questions in the order you asked them.
For the application gateway, where you are pulling data through REST, you can use Azure Functions and then use Cosmos DB or any storage to save the data. I see that after getting device data from the partner network you are routing it to IoT Hub (I would not say that's incorrect); however, once we pull data through REST, we can put it directly into the DB. So my answer is to use Azure Functions to pull data from the partner solution and put it into the DB.
If the partner device is capable of running the Azure IoT SDKs, or can be provisioned to send data to IoT Hub directly, this will ease a lot of things and you will be able to send D2C and C2D messages easily. Further, here you can route data to the DB using routing configured on the IoT Hub.
For your own devices you can use IoT Hub directly or use Azure IoT Edge (the device gateway, as you pointed out); both are fine, depending on the use case and on whether you want to do some edge computation or analytics on the device side.
One important suggestion: use Azure Functions wherever you find you have to integrate device data through REST. It is the most cost-effective option in such scenarios.
Let me know if this clears your doubts.
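As a sketch of the first point (the REST bridge), the pull-and-store loop is small enough to show. The partner endpoint, response shape, and storage call are all assumptions about your setup, so fetching and storing are passed in as plain functions; in an Azure Function this would typically run on a timer trigger, with `store` writing to Cosmos DB or SQL:

```python
import json
import urllib.request
from typing import Callable

def pull_and_store(fetch: Callable[[], bytes],
                   store: Callable[[dict], None]) -> int:
    """Pull one batch of device readings and store each record.

    `fetch` returns the raw bytes of the partner's REST response
    (assumed here to be a JSON array of readings); `store` persists one
    record. Returns the number of records stored.
    """
    records = json.loads(fetch())
    for record in records:
        store(record)
    return len(records)

def http_fetch(url: str) -> Callable[[], bytes]:
    """Build a fetch function for a partner REST endpoint (URL is a placeholder)."""
    def fetch() -> bytes:
        with urllib.request.urlopen(url) as resp:
            return resp.read()
    return fetch
```

Keeping the transport and the storage behind plain callables is what makes the bridge easy to swap out later, which matters here because step 2 of the plan retires the REST gateway entirely.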
After some time working on the subject, I implemented an Azure Function app, for the following reasons:
1. Continuous deployment and integration: even though Azure Functions is a serverless architecture, it still supports continuous deployment and continuous integration.
2. Event-driven code: the platform can run code triggered by events occurring in any third-party service or on-premises system.
3. Compute on demand: this delivery model ensures that computing resources are available to users as they demand them.
I have also used Azure Table Storage as the database storage technology.

How to send data from UDF to Cosmos DB in Azure Digital Twin?

Setup till now:
I have created spaces. At the top level I have the IoT Hub resource. In two of the spaces, I have attached devices along with their sensors. I have created a matcher for the temperature sensor along with a UDF that is similar to the documentation. I have also assigned permissions to the UDF. To send data to the IoT hub, I have also fetched the device connection string for the dotnet sample.
List of issues I am facing:
When I try to run the dotnet sample, I can see that it is able to reach the UDF (checked via debugging), but in the UDF it is not able to access the telemetry variable as given in the documentation. The error it shows is:
Unexpected exception occurred while processing user-defined function. Please contact support and provide the correlation ID for the request.
I have created an endpoint to send raw telemetry to Event Hubs. But I want to send the processed data from the UDF to Cosmos DB. Is that possible? If yes, then how?
Thanks for the question and for reaching out... For #2, you could do this with a notify method in your UDF. You can set up egress to other endpoints such as Event Hubs, Event Grid, or Service Bus via the endpoint dispatcher. You would set up the endpoint via the /endpoint API, and then in your UDF you can specify what you want to send out and on which changes. For details on the events and endpoints, see here: https://learn.microsoft.com/en-us/azure/digital-twins/how-to-egress-endpoints
Here is also a link to learn more about connecting Digital Twins to Logic Apps: https://learn.microsoft.com/en-us/azure/digital-twins/tutorial-facilities-events which follows a similar pattern to sending data over to Cosmos DB.
As for the first issue, I am not sure if you are still seeing it. Which region? Do you have a correlation ID that you can pass along? Also, if you turn on logs and look in Azure Monitor, are there details there?

How to forward a message from Azure estate to NodeJS app?

I'm creating an IoT solution. On my web app I want to have a table on the dashboard that updates alert events in real time.
I currently have an API, written in NodeJS, that receives the JSON and invokes socket.io to update the table. This solution seems a bit clunky.
I'm wondering if there is a more seamless way to do this, similar to how the different components within Azure link together.
I've looked into Azure queues, with NodeJS subscribing and consuming from the queue, but as far as I could find there's no way to persist a queue connection from NodeJS, so I would have had to poll continually.
I've looked into Power BI as well, but I need to be able to completely change and modify every aspect of the design.
The following is what I have so far:
Devices send data to IoT Hub. This is then processed by an Azure Stream Analytics job and, if a certain criterion is met, it sends the message to DocumentDB for storage and also to a Service Bus queue. I have a Logic App that is triggered by a message arriving on the Service Bus queue and then POSTs the data to my API.
