I have created an Azure Function to route messages from an IoT Hub to an Azure SQL DB using the IoTHubTrigger, mostly following the instructions in this link: Azure Functions - how to set up IoTHubTrigger for my IoTHub messages?
Each IoT device captures data every 8 minutes. When capturing is done, the device streams the data in 4 different messages. The Azure Function then takes over and writes these 4 messages to the database.
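For context, the function follows essentially the pattern below (a minimal sketch, not my exact code; the table name, connection setting names, and consumer group are stand-ins):

    using System;
    using System.Data.SqlClient;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.EventHubs;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class TelemetryToSql
    {
        [FunctionName("TelemetryToSql")]
        public static async Task Run(
            [IoTHubTrigger("messages/events", Connection = "IoTHubEndpoint")] EventData message,
            ILogger log)
        {
            // Each of the 4 messages per capture arrives here individually.
            string payload = Encoding.UTF8.GetString(
                message.Body.Array, message.Body.Offset, message.Body.Count);
            log.LogInformation($"Received: {payload}");

            // Write the raw payload to SQL; the real function parses it first.
            using (var conn = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnection")))
            {
                await conn.OpenAsync();
                using (var cmd = new SqlCommand("INSERT INTO Telemetry (Payload) VALUES (@p)", conn))
                {
                    cmd.Parameters.AddWithValue("@p", payload);
                    await cmd.ExecuteNonQueryAsync();
                }
            }
        }
    }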
When only one device was streaming, I had no issues with the data: everything was written to the DB, and I could also see/monitor the events/messages using az iot hub monitor-events.
When a second device started streaming to the same IoT Hub, I started missing messages: from each device, only one message is stored in the DB. Likewise, when using iot hub monitor-events, only one message appears from each device. I was also expecting that if I disabled the 2nd device, the 1st one would go back to normal. Unfortunately, the issue remains the same.
So my question is: how is it possible for a 2nd device to break the way the 1st one interacts with the hub?
And if that's not what is happening, how are we supposed to figure out what causes the problem at this stage?
Thanks :)
Difficult to say without more details. Are you routing messages in IoT Hub somewhere else? I would go back to a clean IoT Hub with one device and create a dedicated consumer group on the IoT Hub for the function. Before running the function, I would monitor that consumer group (I like to use the Azure IoT Explorer application) to see if data is coming through as expected, then add another device and keep monitoring the same consumer group. If data is coming through, then start the function (consuming data from the consumer group).
If telemetry is not being read from the IoT Hub consumer group, then you will need to look at your device code for issues.
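For reference, binding the function to a dedicated consumer group is just a property on the trigger attribute. A minimal sketch (the consumer group name "functioncg" is made up, and you would create it on the hub first):

    // Hypothetical consumer group "functioncg", created on the IoT Hub beforehand.
    [FunctionName("TelemetryToSql")]
    public static void Run(
        [IoTHubTrigger("messages/events",
            Connection = "IoTHubEndpoint",
            ConsumerGroup = "functioncg")] EventData message,
        ILogger log)
    {
        // The function now reads from its own consumer group, so monitoring
        // tools can watch $Default (or another group) without competing with it.
        log.LogInformation("Message received on dedicated consumer group.");
    }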
I've got data coming into an IoT Hub and want to filter on it.
Relevant data should be forwarded to Slack as a notification.
I've got the IoT Hub and a Slack subscription in place, but am having trouble connecting the two.
In order to do a rather complex time-based query, I figured I would use Stream Analytics and configure the IoT Hub as input. From research I found that Logic Apps can send messages to Slack over a webhook. Using a Service Bus Queue as output for Stream Analytics, I can get the data into Logic Apps.
So it's:
IoT Hub (ingest all data) => Stream Analytics (filter) => Service Bus Queue (queue up the data) => Logic Apps (send to Slack)
Looks a bit bulky, but that seems to be one way of doing it (is there a better one?).
Doing this I ran into issues. I selected my IoT Hub as input for Stream Analytics, and the simple query SELECT * INTO [Queue] FROM [Hub] fails, saying there was no data.
That does make sense if the IoT Hub just pushes new data to its endpoints and then discards it. So I created a test set in the Stream Analytics job, and the query runs fine.
However, the data that arrives in the Hub is not (all) picked up and forwarded by the job to the Service Bus Queue. I do see some activity on the queue, but not nearly enough to account for the data I receive.
This seems to be a very common scenario: ingesting data into IoT Hub and sending notifications to email or Slack if they are of a certain type. Can you explain the steps to take, or point me to a resource that covers this? Maybe I'm on the wrong path, as I cannot find anything that describes it.
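For reference, the kind of filtering query I have in mind would be something like this (messageType and the value 'alert' are just assumptions about my payload):

    SELECT
        deviceId,
        messageType,
        System.Timestamp() AS eventTime
    INTO
        [Queue]
    FROM
        [Hub]
    WHERE
        messageType = 'alert'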
Thanks
I have an Azure IoT Hub with thousands of devices registered. These devices communicate through a Telco provider, who sends messages through an Azure Storage Queue. This Storage Queue triggers an Azure Function, which needs to parse the messages and send an event to the IoT Hub, as below.
Currently, we use the Azure IoT Hub SDK to create a DeviceClient for each payload, and we send the event. Because the DeviceClient represents a device in the IoT Hub and carries the context of the source of the events, we have to recreate a device client for each event. This quickly exceeds the connection limit allowed on Azure Functions.
We have tried using the IoT Hub output bindings for Azure Functions, but could not get them to work, and I do not think they would work anyway, because we need to make sure that the events get to the IoT Hub with the right context (messages are sent by the right device).
What's the right way to solve this? Can the connections to the IoT Hub be reused? Should we abandon Azure Functions in favour of something else?
I assume that Telco is some kind of custom device management solution (a vendor-locked solution) that can also communicate with the devices, receive the device telemetry, and eventually forward it to a specified endpoint, correct?
If I may ask, and if my assumption is correct: why do you need to deliver the events to IoT Hub if you are not managing the Telco devices through IoT Hub (the arrows on your diagram point in only one direction)?
Using the IoT Hub just as a message broker for what is essentially cloud-to-cloud communication is not beneficial if that is its only purpose. Conceptually, what you described is cloud-to-cloud communication, and IoT Hub is intended to be used for devices.
Here is what I would do. Set up API Management (or an HTTP-triggered Azure Function) as a front door for Telco and pass the messages on to an Event Hub.
You can choose here to pass, for example, the request body, which is where I assume your telemetry data is.
Keep the IoT Hub, and set up routing to the previously created Event Hub.
Now, in case you have devices that are not vendor-locked and can talk directly to IoT Hub, their messages will be re-routed to that Event Hub, and the Telco device messages will be routed to exactly the same Event Hub.
You can then have, for example, Azure Stream Analytics analyze the data stream from just that Event Hub, covering both Telco devices and potentially non-Telco devices.
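A minimal sketch of what the front-door function could look like (all names are placeholders, and I am assuming Telco can POST the telemetry as the request body):

    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class TelcoFrontDoor
    {
        [FunctionName("TelcoFrontDoor")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
            [EventHub("telco-events", Connection = "EventHubConnection")] IAsyncCollector<string> events)
        {
            // Forward the request body (assumed to carry the telemetry) to the Event Hub.
            string body = await new StreamReader(req.Body).ReadToEndAsync();
            await events.AddAsync(body);
            return new OkResult();
        }
    }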
After trying a few things, I ended up moving away from using the SDK for pushing messages to IoT Hub, because the SDK uses AMQP, and creating a DeviceClient for each payload is not viable.
We switched to using HTTPS to push the messages to IoT Hub instead, and by using HttpClientFactory we are able to do connection pooling.
I thought I would put this here in case someone has the same issue.
Here is an example of the HTTP request to send a message to IoT Hub:
    POST https://<iothubname>.azure-devices.net/devices/<deviceId>/messages/events?api-version=2018-06-30
    Authorization: SharedAccessSignature sr=<iothubname>.azure-devices.net&sig=<url-encoded signature>&skn=iothubowner&se=1570574220
    Body: <normal interval or alarm payloads> // example: {"deviceid": "abc", "hello": "world"}
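And here is a rough sketch of the sending code with HttpClientFactory (SAS token generation is omitted, and the hub hostname and client name are placeholders):

    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;

    public class IoTHubSender
    {
        private readonly IHttpClientFactory _factory;

        public IoTHubSender(IHttpClientFactory factory) => _factory = factory;

        public async Task SendEventAsync(string deviceId, string sasToken, string jsonPayload)
        {
            // HttpClientFactory pools the underlying connections, so we avoid
            // opening a new connection per message (our problem with AMQP DeviceClients).
            HttpClient client = _factory.CreateClient("iothub");
            // <iothubname> is a placeholder for your hub's hostname.
            var request = new HttpRequestMessage(
                HttpMethod.Post,
                $"https://<iothubname>.azure-devices.net/devices/{deviceId}/messages/events?api-version=2018-06-30");
            request.Headers.TryAddWithoutValidation("Authorization", sasToken);
            request.Content = new StringContent(jsonPayload, Encoding.UTF8, "application/json");
            (await client.SendAsync(request)).EnsureSuccessStatusCode();
        }
    }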
Lastly, thanks @kgalic for the answer, but your suggestion would not work. This is not pure B2B integration: our implementation has to allow for both devices connecting directly to the IoT Hub and devices connecting through the Telco. This is why every device needs to have its own identity and digital twin.
In my case I have 1000+ devices that store activity data internally. I need to send an HTTP GET request to each device to fetch that data in CSV or JSON format and save it in a storage hosted on Azure.
Can IoT Hub request data using a GET request, and can it be scheduled to read daily/weekly?
What other Azure services would you suggest to facilitate these scheduled reads?
You have not mentioned which Azure IoT Hub scale tier you are using. Basically, there are two price groups, Basic and Standard, with significantly different cost and capabilities. The Basic tier offers only services for one-way communication from the devices to Azure IoT Hub.
Based on that, the following scenarios can be used for your business case:
1. Basic Tier (non-event-driven solution)
The device periodically pushes telemetry and non-telemetry messages, based on its needs, to the Azure IoT Hub, where the non-telemetry messages are routed to an Azure Function via a Service Bus Queue/Topic. The responsibility of this non-telemetry pipe is to persist the real device state in the database. Note that 6M messages will cost only $50/month. The back-end application can query this database for device state at any time.
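A minimal sketch of that queue-triggered function (queue and connection setting names are placeholders):

    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class DeviceStateUpdater
    {
        [FunctionName("DeviceStateUpdater")]
        public static void Run(
            [ServiceBusTrigger("device-state", Connection = "ServiceBusConnection")] string message,
            ILogger log)
        {
            // Persist the reported device state to the database here.
            log.LogInformation($"Non-telemetry message: {message}");
        }
    }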
2. Standard Tier (event-driven solution)
In this scenario you can use the Device Twin of the Azure IoT Hub to store the real device state in the cloud back end (as described by @HelenLo). The device can be triggered to update its state (reported properties) by a C2D message, a desired-property change, a direct method invocation, or a device-side (edge) trigger.
The Azure IoT Hub has the capability to run your scheduled jobs across multiple devices.
In this solution, the back-end application can at any time invoke a job such as ExportDevicesAsync to export the registry to blob storage; see more details here. Note that 6M messages will cost $250/month.
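A minimal sketch of kicking off such an export job from the back end (the connection string and container SAS URI are placeholders):

    using System.Threading.Tasks;
    using Microsoft.Azure.Devices;

    public static class RegistryExport
    {
        public static async Task ExportAsync(string hubConnectionString, string containerSasUri)
        {
            var registryManager = RegistryManager.CreateFromConnectionString(hubConnectionString);
            // Writes all device identities to a blob in the given container.
            JobProperties job = await registryManager.ExportDevicesAsync(containerSasUri, excludeKeys: true);
            // Poll registryManager.GetJobAsync(job.JobId) until the job completes.
        }
    }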
As you can see, each of the above scenarios requires building a different device logic model, based on the communication capabilities between the devices and Azure IoT Hub and back. Note that there are some limitations on these communications; see more details here.
You can consider using the Device Twin feature of IoT Hub:
https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-device-twins
Use device twins to:
Store device-specific metadata in the cloud. For example, the deployment location of a vending machine.
Report current state information such as available capabilities and conditions from your device app. For example, a device is connected to your IoT hub over cellular or WiFi.
Synchronize the state of long-running workflows between device app and back-end app. For example, when the solution back end specifies the new firmware version to install, and the device app reports the various stages of the update process.
Query your device metadata, configuration, or state.
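For example, a back-end twin query could look like this minimal sketch (the reported property 'connectivity' is hypothetical):

    using System;
    using System.Threading.Tasks;
    using Microsoft.Azure.Devices;
    using Microsoft.Azure.Devices.Shared;

    public static class TwinQuery
    {
        public static async Task ListCellularDevicesAsync(string hubConnectionString)
        {
            var registryManager = RegistryManager.CreateFromConnectionString(hubConnectionString);
            // SQL-like twin query; 'connectivity' is a hypothetical reported property.
            var query = registryManager.CreateQuery(
                "SELECT * FROM devices WHERE properties.reported.connectivity = 'cellular'", 100);
            while (query.HasMoreResults)
            {
                foreach (Twin twin in await query.GetNextAsTwinAsync())
                {
                    Console.WriteLine(twin.DeviceId);
                }
            }
        }
    }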
IoT Hub provides you with the ability to connect your devices over various protocols. The preferred protocols are messaging protocols such as MQTT or AMQP, but HTTPS is also supported. With IoT Hub, though, you do not request data from the device; the device sends the data to the IoT Hub. You have two options to implement that with IoT Hub:
The device connects to the IoT Hub whenever it has data to be sent, and pushes the data up to IoT Hub.
The device does not send any data on its own, but stays always (or at least regularly) connected to IoT Hub. You can then send a cloud-to-device message over IoT Hub to the device, requesting the data to be sent. The device then sends the data the same way it would in the first option.
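A minimal sketch of the second option from the back end (the command payload is just an assumed convention that your device would have to understand):

    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Azure.Devices;

    public static class DataRequest
    {
        public static async Task RequestUploadAsync(string hubConnectionString, string deviceId)
        {
            // Cloud-to-device message asking the device to send its stored data.
            var serviceClient = ServiceClient.CreateFromConnectionString(hubConnectionString);
            var message = new Message(Encoding.UTF8.GetBytes("{\"command\":\"sendData\"}"));
            await serviceClient.SendAsync(deviceId, message);
        }
    }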
When the data has been sent to IoT Hub, you need to push it somewhere it is persistently stored; IoT Hub only keeps messages for 1 day by default. Options for this are:
Create a blob storage account and push to it directly from IoT Hub using a custom endpoint. This would probably be the easiest and cheapest option. Depending on how you need to access your data, though, a blob might not be the best fit.
Create a Function App, create a function with an EventHubTrigger, connect it to IoT Hub, and let the function process incoming data by outputting it into any kind of data sink, such as SQL, CosmosDB, Table Storage...
I have a little IoT project with one device. An Arduino sends some values to Azure, where a Function App processes them and sends instructions for the Arduino to the endpoint in IoT Hub (/devices/MKR1000/messages/devicebound?api-version=2016-02-03).
I need to get data from this endpoint in real time, so I want the Arduino to read only the last (the newest) message every time, but it starts from the oldest.
It's possible to make the Arduino read all the messages from the endpoint and then show the last one, but I'm looking for a more efficient way.
Thank you.
You receive old messages because they are still queued in Azure IoT Hub, due to the device not completing these messages. At the moment, IoT Hub supports the option to complete/reject/abandon C2D messages over HTTPS and AMQP only.
Another option is setting ExpiryTimeUtc (in the function application?) to release older messages faster (minimum: 1 minute; default: 1 hour).
For more information, you can reference "Send cloud-to-device messages from IoT Hub".
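For illustration, here is how the receive/complete loop looks with the .NET device SDK (a minimal sketch; your Arduino library should expose equivalent operations):

    using System.Threading.Tasks;
    using Microsoft.Azure.Devices.Client;

    public static class C2DReader
    {
        public static async Task DrainAsync(string deviceConnectionString)
        {
            // AMQP (or HTTPS) is required for complete/reject/abandon.
            var deviceClient = DeviceClient.CreateFromConnectionString(
                deviceConnectionString, TransportType.Amqp);
            Message message;
            // ReceiveAsync returns null when no message arrives within the default timeout.
            while ((message = await deviceClient.ReceiveAsync()) != null)
            {
                // Completing removes the message from the device queue,
                // so it will not be redelivered as an "old" message.
                await deviceClient.CompleteAsync(message);
            }
        }
    }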
I'm trying to create a complete solution to present data from IoT devices on a webpage.
The data and devices will never number in the millions, so using Stream Analytics, Machine Learning, Big Data, etc. is costly and unnecessary.
I've looked at docs, blogs, and forums for weeks now, and I'm stuck on the part of how to process the messages that the IoT Hub receives. I want to save them to a SQL database and then build a website that will present them to the users.
What I have so far:
1. Device part
Raspberry Pi 3 has Windows IoT Core installed
Messages are sent and received on both Hub and device ends successfully
(verified with Device Explorer and IoT hub dashboard)
2. Processing part
The most similar approach is detailed here, but I don't want to use NoSQL. I've tried to use the Azure Function with the External Table (experimental) output, but there is zero documentation for it, and all my attempts failed with a function error.
Now I'm trying to connect a WebJob to process the IoT Hub messages, but I can't find any relevant samples or docs. Essentially, I'd want to convert a console app to a WebJob which will be triggered when a message arrives at the IoT Hub.
3. Webpage part
Once I get the messages into the SQL database, I will create my custom portal for managing and registering devices, issuing one-off commands to devices, and for request-response data.
The telemetry will be queried from the database and presented statically or in near real time (with SignalR), by device type, location, user privileges, etc. This part is pretty clear to me.
Please, can anyone help me out with the processing part?
I found a solution using Azure WebJobs; this article explains how to tie an Event Hub (IoT Hub) to the WebJob.
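The core of it is a triggered function like this minimal sketch (names are placeholders, and the WebJob host must be configured for Event Hubs):

    using System.Text;
    using Microsoft.Azure.EventHubs;
    using Microsoft.Azure.WebJobs;

    public static class Functions
    {
        // Fires for each message arriving on the IoT Hub's Event Hub-compatible endpoint.
        public static void ProcessTelemetry(
            [EventHubTrigger("messages/events", Connection = "IoTHubConnection")] EventData message)
        {
            string payload = Encoding.UTF8.GetString(
                message.Body.Array, message.Body.Offset, message.Body.Count);
            // Insert 'payload' into the SQL database here, as you would in a console app.
        }
    }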