Use a custom Angular client application with Azure IoT Central

I am exploring the Azure IoT Central REST APIs to create a custom Angular client. Is this possible, or does it have limitations? IoT Central is attractive because of its pricing. Specifically, I found that retrieving multiple telemetry values in one call isn't possible, per the following documentation page, which means you have to send an individual GET request for each telemetry value.
Azure IoT Central (get telemetry value)
Is there a way to register a callback and receive regular updates of the values, as with Event Hubs? Basically, I want to develop a custom client-facing app at IoT Central pricing. Is that possible?

It is possible. To receive regular updates on telemetry you can use continuous data export, which can export to Service Bus, Event Hubs, and Blob Storage. See the documentation here on how to set that up. You can then receive those events in any JavaScript application.
Please be aware that continuous data export gives you updates from all devices, so if you only need some of them you will have to filter the stream yourself. One example I have built in the past is a .NET Core application that listens to the exported messages and pushes them to the connected clients over SignalR.
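For illustration, here is a minimal sketch of the receiving side in TypeScript with the @azure/event-hubs package, assuming export to an Event Hub; the connection string, hub name, and the filtering logic are placeholders:

import { EventHubConsumerClient } from "@azure/event-hubs";

// Placeholders: substitute the connection string and hub that IoT Central exports to.
const client = new EventHubConsumerClient(
  "$Default",
  "<event-hubs-connection-string>",
  "<event-hub-name>"
);

client.subscribe({
  processEvents: async (events) => {
    for (const event of events) {
      // Filter here for the devices your UI cares about, then push the
      // value to the browser (e.g. over SignalR or socket.io).
      console.log(event.body);
    }
  },
  processError: async (err) => {
    console.error(err);
  },
});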

Related

How should I link a backend solution to an IoT Hub

So, I am working on an IoT solution on Azure. We have been using a partner solution where the partner's devices are linked to their cloud, which exposes the data to us via REST services. Now we want to have our own IoT cloud solution on Azure.
At first, I am planning to build a bridge between our IoT solution and the partner's cloud solution, via its REST services, that will link to our IoT Hub in order to ingest the data into our cloud.
Also, the data will not only be telemetry; we will have to send commands to those devices as well.
My question: what would be the appropriate technology/solution for the gateway (Data Grid, Azure Function, Azure WebJob)?
The numbered steps below describe how I am considering tackling this problem.
1- First, we implement an application gateway that gets the data from the partner's system and sends commands to it. This lets us build the other components of our system first and make sure they can handle what is in place right now.
2- Second, the partner's devices will connect directly to a device gateway that is connected to our IoT Hub. At that point we will no longer use the gateway built in step 1.
3- Finally, we will have our own devices connected to our IoT Hub; the partner's devices will still be connected to our IoT Hub via the gateway built in step 2.
Let me try to answer your questions in the order you asked them.
For the application gateway, where you are pulling data through REST, you can use Azure Functions and then use Cosmos DB or any other storage to save the data. I see that after getting the device data from the partner network you are routing it to IoT Hub (I would not say that is incorrect); however, once you pull data through REST you can put it directly into the database. So my answer is: use Azure Functions to pull data from the partner solution and write it into the database.
If the partner devices are capable of running the Azure IoT SDKs or can be provisioned to send data to IoT Hub directly, this will ease a lot of things, and you would be able to send D2C and C2D messages easily. Further, you can then route data to the database using the routing configuration in IoT Hub.
For your own devices, you can use IoT Hub directly or Azure IoT Edge (the device gateway you mentioned); both are fine, depending on the use case and on whether you want to do some computation or analytics on the edge. One important suggestion: use Azure Functions wherever you find you have to integrate device data through REST; it is the most cost-effective option in such scenarios. A minimal sketch follows below.
Let me know if this clears your doubts.
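As an illustration of that last suggestion, here is a minimal sketch of a timer-triggered Azure Function (Node.js v4 programming model) that polls a partner REST endpoint and writes into Cosmos DB; the endpoint URL, database/container names, and the reading shape are all assumptions:

import { app, InvocationContext, Timer } from "@azure/functions";
import { CosmosClient } from "@azure/cosmos";

// Placeholders: the partner endpoint, "iot" database, and "telemetry"
// container are hypothetical -- substitute your own.
const PARTNER_API = "https://partner.example.com/api/telemetry";
const cosmos = new CosmosClient(process.env.COSMOS_CONNECTION_STRING ?? "");
const container = cosmos.database("iot").container("telemetry");

app.timer("pollPartnerApi", {
  schedule: "0 */5 * * * *", // run every five minutes
  handler: async (timer: Timer, context: InvocationContext) => {
    // Pull the latest readings from the partner's REST endpoint...
    const response = await fetch(PARTNER_API);
    const readings: Array<{ deviceId: string; ts: string }> = await response.json();

    // ...and write each one straight into the database.
    for (const reading of readings) {
      await container.items.create({ id: `${reading.deviceId}-${reading.ts}`, ...reading });
    }
    context.log(`Stored ${readings.length} readings`);
  },
});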
After some time working on the subject, I implemented an Azure Function app for the following reasons:
Continuous deployment and integration: even though Azure Functions is a serverless architecture, it still supports continuous deployment and continuous integration.
Event-driven code: the platform can run code triggered by events occurring in any third-party service or on-premises system.
Compute on demand: this delivery model ensures that computing resources are available to users as they need them.
I also used Azure Table Storage as the storage technology.
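For completeness, writing a row into Table Storage from such a function is short with the @azure/data-tables package; the table name and entity shape here are assumptions:

import { TableClient } from "@azure/data-tables";

// Hypothetical table name and entity shape -- adjust to your own schema.
const table = TableClient.fromConnectionString(
  process.env.STORAGE_CONNECTION_STRING ?? "",
  "telemetry"
);

export async function saveReading(deviceId: string, weight: number): Promise<void> {
  await table.createEntity({
    partitionKey: deviceId, // group rows by device
    rowKey: Date.now().toString(), // unique within the partition
    weight,
  });
}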

How to route Event Hub messages to different Azure functions based on their message type

I have an Azure Event Hub over which I would like to send various types of messages. Each message should be handled by a separate Azure Function, based on its message type. What is the best way to accomplish this?
Actually, I could create a JSON envelope with a type and a payload property and let one parent Azure Function dispatch all the message payloads, based on their type, to other functions, but that feels a bit hacky.
This question basically asks the same thing; however, it is answered there using IoT Hub and message routing. I cannot find any setting to configure message routing in the Event Hub configuration, though.
Or should I switch to an Azure Message Queue to get this functionality?
I would use Azure Stream Analytics to route the messages to the different Azure Functions. An ASA job lets you specify an Event Hub as a source and several sinks, one or more of which can be Azure Functions. You can read more about setting up an Azure Stream Analytics job through the Azure Portal here. You'll need to set up the Event Hub as your source (docs) and your sinks (docs). You then write SQL-like code to route the messages to the various sinks. Note, however, that ASA jobs are costly relative to other services, since you pay for a fixed amount of compute.
I put some pseudo code below. You'll have to adapt it to how you configure your ASA job, using the information from the attached MS documentation.
-- Route messages matching the first condition to one Function sink
SELECT
    *
INTO
    [YourOutputAlias]
FROM
    [YourInputAlias]
WHERE
    [CONDITION]

-- Route messages matching the second condition to the other Function sink
SELECT
    *
INTO
    [YourAlternateOutputAlias]
FROM
    [YourInputAlias]
WHERE
    [CONDITION]
Based on your additional info about the business requirements, and assuming an event size < 64 KB (1 MB in preview), the following is an example of a solution:
The concept is to push a batch of events to the event domain endpoint of Azure Event Grid (AEG). An Event Hub-triggered function is responsible for mapping each event message type in the batch to a domain topic before publishing it to the AEG.
Note that if you use Azure IoT Hub for ingestion of the events, the AEG can be integrated directly with the IoT Hub, and each event message can be distributed in a loosely coupled pub/sub manner. Besides that, these business requirements can be met with the B1 scale tier for IoT Hub ($10/month), compared to Basic Event Hubs ($11.16).
IoT Hub has a built-in message routing mechanism (with some limitations), but the recently added IoT Hub/AEG integration, such as publishing a device telemetry message, gives good support for a serverless architecture.
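A sketch of that mapping function, assuming the @azure/functions v4 model and the @azure/eventgrid package; the domain endpoint, key, hub name, and the "type" property on each message are all assumptions:

import { app, InvocationContext } from "@azure/functions";
import { EventGridPublisherClient, AzureKeyCredential } from "@azure/eventgrid";

// Placeholder Event Grid domain endpoint and key.
const client = new EventGridPublisherClient(
  "https://<your-domain>.<region>-1.eventgrid.azure.net/api/events",
  "EventGrid",
  new AzureKeyCredential(process.env.EVENTGRID_DOMAIN_KEY ?? "")
);

app.eventHub("fanOutToDomain", {
  connection: "EventHubConnection",
  eventHubName: "<event-hub-name>",
  cardinality: "many",
  handler: async (messages: any[], context: InvocationContext) => {
    // Map each message's type to a domain topic, so that subscribers
    // on each topic receive only their own message type.
    await client.send(
      messages.map((m) => ({
        topic: m.type,
        eventType: m.type,
        subject: `devices/${m.deviceId ?? "unknown"}`,
        dataVersion: "1.0",
        data: m,
      }))
    );
  },
});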
I ended up using Azure Durable Functions with the Fan Out/Fan In pattern.
In this approach, all events are handled by a single orchestrator function, which is in fact a durable Azure Function (F1). It deserializes the incoming JSON to the correct DTO and, based on the content of the DTO, invokes a corresponding activity function (F2) that processes it.
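A minimal sketch of that orchestration with the durable-functions package; the type names and the two activity functions are assumptions:

import * as df from "durable-functions";

// Hypothetical activities -- one per DTO type (F2 in the description).
df.app.activity("processTypeA", { handler: async (dto: unknown) => { /* handle A */ } });
df.app.activity("processTypeB", { handler: async (dto: unknown) => { /* handle B */ } });

// The orchestrator (F1) inspects the deserialized event and dispatches by type.
df.app.orchestration("dispatchByType", function* (context) {
  const event = context.df.getInput() as { type: string; payload: unknown };
  switch (event.type) {
    case "typeA":
      yield context.df.callActivity("processTypeA", event.payload);
      break;
    case "typeB":
      yield context.df.callActivity("processTypeB", event.payload);
      break;
    default:
      break; // unknown types are ignored here; a real app might dead-letter them
  }
});

An Event Hub-triggered client function then starts this orchestration for each incoming event, via df.getClient(context).startNew("dispatchByType", { input: message }).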

IoT Hub to Email Notification

I am developing an Azure IoT Hub use case.
Multiple load cells continuously send data (DeviceID, weight) to the IoT Hub, every half second.
There is a SQL table with user data.
I want to build a system that sends an email notification to the device owner when a certain weight is reported.
What is the right approach to achieve that?
I have seen that Logic Apps are an option, but how do I implement that with multiple user accounts and devices?
I would use IoT Hub routing to push the messages that meet the weight criteria to a Service Bus queue. From there, you can use an Azure Function with a Service Bus trigger. I assume the user account information (e-mail address?) is available via a query against the SQL table. Azure Functions have a SendGrid binding that you would then use to send out the e-mail.
Note that routing IoT Hub directly to a function is on the backlog.
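A minimal sketch of that function, assuming the @azure/functions v4 model and the @sendgrid/mail package; the queue name, sender address, and the SQL lookup are placeholders:

import { app, InvocationContext } from "@azure/functions";
import sgMail from "@sendgrid/mail";

sgMail.setApiKey(process.env.SENDGRID_API_KEY ?? "");

// Hypothetical SQL lookup -- resolve the owner's address from your user table.
async function lookupOwnerEmail(deviceId: string): Promise<string> {
  return "owner@example.com"; // stub
}

// "alerts" stands in for the custom endpoint queue the IoT Hub route writes to.
app.serviceBusQueue("weightAlert", {
  connection: "ServiceBusConnection",
  queueName: "alerts",
  handler: async (message: any, context: InvocationContext) => {
    const to = await lookupOwnerEmail(message.deviceId);
    await sgMail.send({
      to,
      from: "alerts@example.com",
      subject: `Weight alert for ${message.deviceId}`,
      text: `Reported weight: ${message.weight}`,
    });
  },
});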
Basically, there are two solutions for your scenario, where each device has its own weight criteria:
The device twin desired properties contain a weight value that the real device uses to publish a non-telemetry alert message to the Azure IoT Hub. This alert message can be routed by the Azure IoT Hub routes to a custom endpoint in the same way as described in Jim's answer (Service Bus -> Azure Function -> SendGrid).
The second solution is more complex, generic, and very flexible, and it doesn't require any special coding on the device side or in the device twin. It is based on the standard telemetry stream pipeline, with an Azure Stream Analytics (ASA) job analyzing the events and generating a notification message for output to the Azure Function with SendGrid. The ASA job uses reference data (user data, weight, etc.) from a blob file generated and refreshed by the SQL database.
I would like to present one other approach which I think is also correct (I tested this flow):
Data is sent to the Azure IoT Hub from the device.
An Azure Stream Analytics job filters this data based on weight and DeviceID.
Once the data is analyzed, an Azure Function is called, which triggers an Azure Logic App with the data collected from Stream Analytics.
The Azure Logic App receives the data (HTTP trigger) from the Azure Function app.
The Logic App then uses the "Get row" action to fetch the user data from the SQL database.
The last step is up to you: you can use the "SendGrid - send e-mail" action, or integrate the Logic App with Outlook, Office 365, Gmail, or other services.
Here are the links to the documentation:
Connect to SQL Server or Azure SQL Database from Azure Logic Apps
Send emails and manage mailing lists in SendGrid by using Azure Logic Apps

How to forward a message from Azure estate to NodeJS app?

I'm creating an IoT solution. On my web app I want a table on the dashboard that updates alert events in real time.
I currently have an API, written in NodeJS, that receives the JSON and invokes socket.io to update the table. This solution seems a bit clunky.
I'm wondering whether there is a more seamless way to do this, similar to how the different components within Azure link together.
I've looked into Azure queues, having NodeJS subscribe to and consume from the queue, but as far as I could find there is no way to persist a queue connection from NodeJS, so I would have had to poll continually.
I've looked into Power BI as well, but I need to be able to completely change and modify every aspect of the design.
The following is what I have so far:
Devices send data to IoT Hub. This is then processed by an Azure Stream Analytics job and, if a certain criterion is hit, the message is sent to a DocumentDB collection for storage and also to a Service Bus queue. A Logic App is triggered by a message arriving on the Service Bus queue and POSTs the data to my API.
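For reference, the receiving end of that flow can be as small as the sketch below (Express + socket.io); the /alerts route and the payload shape are assumptions:

import express from "express";
import { createServer } from "http";
import { Server } from "socket.io";

const api = express();
api.use(express.json());
const httpServer = createServer(api);
const io = new Server(httpServer);

// The Logic App POSTs each alert here; the body is whatever the
// Stream Analytics query emitted.
api.post("/alerts", (req, res) => {
  io.emit("alert", req.body); // push to every connected dashboard
  res.sendStatus(204);
});

httpServer.listen(3000);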

Sending Azure Application Insights data to Event Hub

I have a static .NET web application with the Application Insights SDK. How do I send Application Insights data to an Azure Event Hub? I have successfully used the Azure Continuous Export feature, but I would rather send the telemetry data to the Event Hub.
To explicitly send data to an Event Hub you will need to use an Event Hubs SDK, which is currently available for .NET/C#, Java, REST, and Node.js. For your case, a web application, sending via the REST API might be the easiest way. Take a look at the API reference for more information: https://msdn.microsoft.com/en-us/library/azure/dn790674.aspx
One catch is that receiving events is not currently supported over REST; you would still need a .NET or Java application on the receiving side.
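If a Node.js component does the sending instead of raw REST (raw REST requires hand-building a SAS token), a minimal sketch with the @azure/event-hubs package looks like this; the connection string and hub name are placeholders:

import { EventHubProducerClient } from "@azure/event-hubs";

const producer = new EventHubProducerClient(
  "<event-hubs-connection-string>",
  "<event-hub-name>"
);

async function forwardTelemetry(item: unknown): Promise<void> {
  const batch = await producer.createBatch();
  // tryAdd returns false when the batch is full; a real sender would
  // flush and start a new batch rather than fail.
  if (!batch.tryAdd({ body: item })) {
    throw new Error("event too large for an empty batch");
  }
  await producer.sendBatch(batch);
}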
If you are looking for a common logging framework that can be configured to send data to whatever destination you want, you should consider log4net.
Here's a good implementation of a log4net appender for Event Hubs.
@greypanda,
As you know, Continuous Export currently only exports Application Insights data to blob storage, from which you can pick up the data for use in any workstream you want. Exporting directly to an Event Hub is something that could be a future feature, so please log it at our UserVoice site: https://visualstudio.uservoice.com/forums/357324-application-insights.
We will also have a set of REST APIs for Application Insights soon (see https://visualstudio.uservoice.com/forums/357324-application-insights/suggestions/4999529), which might help you.
I would like to learn more about your scenario so I can better help you in this instance and improve our export and API features. Feel free to reply here or, if you prefer, shoot me a mail offline.
Thank you,
Dale Koetke (dalek@microsoft.com)
We don't really support that. It's a lot easier to let the SDK send the data to the App Insights portal; then you can use Continuous Export to move it out into storage, and from there you can use Stream Analytics to move it wherever you want.
What do you plan to ultimately do with the data? (I mean, why an Event Hub...?)
