Azure IoT Suite: how to create a dashboard to display sensor data

I am new to Azure. I have sensors and would like to send data from them to the Azure backend, preferably to a database. After collecting that sensor data, I would like to display it on a dashboard. I wonder if there is a sample tutorial or source code to implement such a solution. Hope you can help me.
Thanks in advance & Best Regards.

The Azure IoT Suite is an accelerator that configures a solution using standard Azure services, and each solution comes with a dashboard. The source code is available on GitHub: Remote Monitoring and Predictive Maintenance.

There are multiple ways of achieving this, as Azure is not a single final product but consists of different "modules", if you will.
If the idea is to create a dashboard that shows the sensor data, you don't necessarily need to store it in a database. You can create live streams and display them. If storing the data is also a concern, you can do that in parallel as well.
The piece that routes the data here is Stream Analytics; it works with the concepts of an input sink, a query, and an output sink. In your case you might want to create an Event Hub or IoT Hub, use it as the input sink of the Stream Analytics job, and use Power BI as the output sink. This will get the data and display it in Power BI. If you also want to store the data, you can add another output sink for different options such as Blob Storage, Document DB (Cosmos DB), Azure SQL Database, etc.
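For a rough idea of the input side of that pipeline, here is a minimal Python sketch that pushes sensor readings into an Event Hub with the azure-eventhub package. The connection string, hub name, and payload shape are placeholders you would replace with your own; a Stream Analytics job with this hub as its input sink and Power BI as its output sink then takes care of the live dashboard.

```python
# pip install azure-eventhub
import json
import time
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder values -- substitute your own namespace connection string and hub name.
CONNECTION_STR = "Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
EVENTHUB_NAME = "sensor-telemetry"

producer = EventHubProducerClient.from_connection_string(
    CONNECTION_STR, eventhub_name=EVENTHUB_NAME
)

with producer:
    # Send one batch of readings; in practice this would run in your device/gateway loop.
    batch = producer.create_batch()
    for reading in ({"sensorId": "s1", "temperature": 21.7, "ts": time.time()},
                    {"sensorId": "s2", "temperature": 22.3, "ts": time.time()}):
        batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)
```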
There is also the Azure IoT Suite remote monitoring solution, which automates most of those tasks with some extra modules; you can create a solution from it and use it as a boilerplate.
Below are step-by-step tutorials for:
Event Hubs
Stream Analytics
Power BI
Hope this helps!
Mert

Related

From Azure Stream Analytics to Power BI Desktop

I have done some research on my own but can't seem to find a solution. We have an IoT solution in Azure. We are gathering real-time streaming data from sensors into Azure IoT Hub. This data is sent to Azure Stream Analytics, and through Azure Functions we display it in Power BI. This works for us so far, but we want the real-time streaming data not only in the browser but also in Power BI Desktop, since the Desktop version has far more functionality. Does anybody have any input on this?
We have tried what I described above.
You need to output the data from Stream Analytics to some kind of data store, such as Azure SQL Database or Cosmos DB. Then you can connect your Power BI Desktop to that data store.
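For illustration, assuming the Stream Analytics job writes into an Azure SQL Database table called dbo.SensorReadings (a hypothetical name and column layout), a short Python script can sanity-check the landed data; Power BI Desktop then connects to the same server and database through the standard Azure SQL connector.

```python
# pip install pyodbc
import pyodbc

# Placeholder connection details -- use the same server/database Power BI Desktop will point at.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-db>;Uid=<user>;Pwd=<password>;Encrypt=yes;"
)

cursor = conn.cursor()
# dbo.SensorReadings is a hypothetical table written by the Stream Analytics SQL output.
cursor.execute("SELECT TOP 10 * FROM dbo.SensorReadings")
for row in cursor.fetchall():
    print(row)
conn.close()
```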

Grafana as Azure Stream Analytics output

I am pushing events to my Event Hub, and this data is then analyzed in Azure Stream Analytics. I'd like to visualize the output from Stream Analytics in Grafana.
What is the easiest approach to achieve this?
An Azure Stream Analytics job can natively ingest data into Azure Data Explorer: https://techcommunity.microsoft.com/t5/azure-data-explorer/azure-data-explorer-is-now-supported-as-output-for-azure-stream/ba-p/2923654
You can then use the Azure Data Explorer plugin in Grafana.
Another option is to use Power BI instead of Grafana: https://learn.microsoft.com/en-us/azure/stream-analytics/stream-analytics-power-bi-dashboard
If I remember correctly, Grafana doesn't store data itself; you need to define a data source on top of one of its compatible storage systems.
Azure Stream Analytics doesn't come with a storage layer either; it's compute only.
So if you want to use ASA and Grafana, you need to output data from ASA to a data source that Grafana supports on the ingress side.
Looking at both lists, that leaves only MSSQL via Azure SQL (hopefully it's compatible) as a native option. It's not a bad option for narrow dashboards, or if you intend to store your data in an RDBMS anyway. You can store your entire payload in an NVARCHAR(MAX) column if you don't plan to consume the data in SQL.
But being clever, we can actually use the Functions output to write to any other store, or call any API. I'm not sure whether Grafana has a direct ingestion API, but Azure Monitor does, and it's a supported data source in Grafana.
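As a sketch of that Functions route: Stream Analytics posts each output batch to the function as a JSON array over HTTP, and the function can forward it wherever you like. The ingestion endpoint below is purely hypothetical, and real targets such as the Azure Monitor HTTP Data Collector API additionally require their own signed authorization headers, which are omitted here.

```python
# HTTP-triggered Azure Function (Python), assuming an httpTrigger binding named "req" in function.json
# and "requests" listed in requirements.txt.
import logging
import requests
import azure.functions as func

# Hypothetical downstream ingestion endpoint -- replace with whatever store/API you target.
INGEST_URL = "https://example.invalid/ingest"

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Stream Analytics sends each output batch as a JSON array of events.
    events = req.get_json()
    logging.info("Received %d events from Stream Analytics", len(events))

    # Forward the batch; a real target would need authentication and error handling.
    resp = requests.post(INGEST_URL, json=events, timeout=10)
    resp.raise_for_status()

    # Returning 2xx tells Stream Analytics the batch was accepted.
    return func.HttpResponse(status_code=200)
```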
The other option would be to go through ADX as explained in the other answer.
Not straightforward but doable ;)

Azure Data Factory Data Flow task cannot take on-prem as source

Hey,
I am having trouble creating a Data Flow task that uses an on-premises source. Is this not possible in the preview version?
I have created a self-hosted IR to connect ADF to my laptop, and that is what I want to use. In the picture below I am trying to create a dataset on the self-hosted IR. It works great in a Copy task, but for Data Flows it is greyed out.
For this question, I asked Azure support for help and they replied with the following answer:
Answer:
On-premises SQL Server is not supported as a dataset in Data Flows at the current stage.
Update:
Data Flows currently only support the Azure IR, so they do not support on-premises datasets.
Refer to Integration runtime types.
Hope this helps.
If your goal is to use visual data transformations in ADF via Mapping Data Flows with on-prem data, then build a pipeline with a Copy activity first. Use the self-hosted integration runtime with the Copy activity to stage your data in Blob storage, then add a subsequent Execute Data Flow activity to transform that data (see the sketch below the video link).
I made a video on how to do this:
https://www.youtube.com/watch?v=IN-4v0e7UIs
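If you prefer to define that staged pattern programmatically rather than in the ADF UI, a rough Python sketch with the azure-mgmt-datafactory SDK could look like the following. The dataset, linked service, and data flow names are hypothetical and assumed to already exist in the factory, and exact model signatures can vary between SDK versions.

```python
# pip install azure-identity azure-mgmt-datafactory
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, SqlServerSource, BlobSink,
    ExecuteDataFlowActivity, DataFlowReference, ActivityDependency,
)

# Placeholder identifiers -- replace with your own subscription, resource group and factory.
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Step 1: Copy activity runs against the on-prem dataset (whose linked service uses the
# self-hosted IR) and stages the data into a blob dataset.
copy = CopyActivity(
    name="StageOnPremToBlob",
    inputs=[DatasetReference(reference_name="OnPremSqlDataset")],      # hypothetical dataset
    outputs=[DatasetReference(reference_name="StagingBlobDataset")],   # hypothetical dataset
    source=SqlServerSource(),
    sink=BlobSink(),
)

# Step 2: Execute Data Flow transforms the staged blob data on the Azure IR.
transform = ExecuteDataFlowActivity(
    name="TransformStagedData",
    data_flow=DataFlowReference(reference_name="MyMappingDataFlow"),   # hypothetical data flow
    depends_on=[ActivityDependency(activity="StageOnPremToBlob",
                                   dependency_conditions=["Succeeded"])],
)

adf.pipelines.create_or_update(
    "<resource-group>", "<factory-name>", "StageThenTransform",
    PipelineResource(activities=[copy, transform]),
)
```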
I reproduced your issue on my side; however, nothing about this limitation is claimed in the official documentation about Data Flows.
You could submit feedback about it here:
I also found a feedback item for Data Flows in ADF for your reference; if you want to push its progress, you can vote it up. I would also suggest referring to the comments in that link:
For access to the 80+ ADF connectors, use Copy Activity to stage data for transformation.
Data Flows will access data in your lake (Blob, ADB, ADW, ADLS) for transformation.
Think of Copy Activity as your data heavy-lifting activity and Data Flow as your data transformation engine.

Does Azure Data Factory support real-time copy activity?

I want to copy data from an on-premises Oracle DB to SQL Server in real time.
Data Factory has many strengths, but frequency is not one of them. Have you considered a different approach?
For real-time integrations I would recommend a Function App, a Logic App and/or Service Bus. I would call this app whenever there is a relevant change in the Oracle DB. Alternatively, you could put an API on top of the on-premises Oracle DB and call it from a scheduled app.
If you expect heavy traffic, you might want to consider using Service Bus. The illustration below shows how Azure Service Bus sends data from a publisher (on-premises) to a subscriber (Azure SQL DB) using a topic message.
Illustration of a large-scale Service Bus data flow
Ref. Azure Service Bus
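For a rough idea of that publisher/subscriber pattern in Python with the azure-servicebus package (the connection string, topic, subscription, and message shape below are placeholders):

```python
# pip install azure-servicebus
import json
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Placeholder connection string, topic and subscription names.
CONN_STR = "Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
TOPIC = "oracle-changes"
SUBSCRIPTION = "sqldb-writer"

client = ServiceBusClient.from_connection_string(CONN_STR)

# Publisher side: whatever detects a relevant change in the on-prem Oracle DB sends a topic message.
with client.get_topic_sender(topic_name=TOPIC) as sender:
    change = {"table": "ORDERS", "op": "UPDATE", "key": 42}
    sender.send_messages(ServiceBusMessage(json.dumps(change)))

# Subscriber side: a worker (e.g. a Function or a small service) applies the change to Azure SQL DB.
with client.get_subscription_receiver(topic_name=TOPIC, subscription_name=SUBSCRIPTION) as receiver:
    for msg in receiver.receive_messages(max_message_count=10, max_wait_time=5):
        print("apply to Azure SQL DB:", str(msg))
        receiver.complete_message(msg)
```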
Welcome to Stack Overflow!
You can copy data from an Oracle database to any supported sink data store. For a list of data stores that are supported as sources or sinks by the copy activity, see the Supported data stores table.
To copy data from and to an Oracle database that isn't publicly accessible, you need to set up a Self-hosted Integration Runtime. The integration runtime provides a built-in Oracle driver. Therefore, you don't need to manually install a driver when you copy data from and to Oracle.
For more details and a step-by-step procedure, refer to Copy data from and to Oracle by using Azure Data Factory.
You can also use a custom activity as per your needs. Refer to custom activities.
Hope this helps.
You could copy data incrementally, but there is a frequency limitation. Please see this post: https://social.msdn.microsoft.com/Forums/en-US/54380f98-716b-4a95-88af-cad2ab7e47b5/what-type-of-data-ingestion-does-azure-data-factory-use?forum=AzureDataFactory

Where are Azure Event Hub messages stored?

I generated a SAS signature using the RedDog tool and successfully sent a message to Event Hubs using the Event Hubs API reference. I know it was successful because I got a 201 Created response from the endpoint.
This tiny success brought about a question that I have not been able to find an answer to:
I went to the Azure portal and could not see the messages I created anywhere. Further reading revealed that I needed to create a storage account; I stumbled on some C# examples (EventProcessorHost) which require storage account credentials, etc.
Question is, are there any APIs I can use to persist the data? I do not want to use the C# tool.
Please correct me if my approach is wrong, but my aim is to post telemetry to Event Hubs, persist the data, and perform some analytics operations on it. The telemetry data should be viewable on Azure.
You don't have direct access to the transient storage used for EventHub messages, but you could write a consumer that reads from the EventHub continuously and persist the messages to Azure Table or to Azure Blob.
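A minimal Python sketch of such a consumer, reading continuously from the hub and persisting each message as a blob (connection strings, hub name, and container name are placeholders):

```python
# pip install azure-eventhub azure-storage-blob
from azure.eventhub import EventHubConsumerClient
from azure.storage.blob import BlobServiceClient

# Placeholder connection strings and names.
EVENTHUB_CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."
STORAGE_CONN_STR = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=...;EndpointSuffix=core.windows.net"

container = BlobServiceClient.from_connection_string(STORAGE_CONN_STR) \
    .get_container_client("eventhub-archive")

def on_event(partition_context, event):
    # Persist each message as its own blob, keyed by partition and sequence number.
    name = f"{partition_context.partition_id}/{event.sequence_number}.json"
    container.upload_blob(name=name, data=event.body_as_str(), overwrite=True)

consumer = EventHubConsumerClient.from_connection_string(
    EVENTHUB_CONN_STR, consumer_group="$Default", eventhub_name="<your-hub>")

with consumer:
    # "-1" starts from the beginning of the retained stream and keeps reading new events.
    consumer.receive(on_event=on_event, starting_position="-1")
```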
The closest thing you will find to a way to automatically persist messages (as with Amazon Kinesis Firehose vs. Amazon Kinesis, which Event Hubs is basically equivalent to) would be to use Azure Stream Analytics configured to write its output either to Azure Blob or to Azure Table. This example shows how to set up a Stream Analytics job that passes the data through and stores it in SQL, but in the UI you can choose an option such as Azure Table instead. Or you can get an idea of the options from the output API.
Of course, you should be aware of the requirements around serialization that led to this question.
Event Hubs stores data for a maximum of 7 days, and that is on the Standard pricing tier. If you want to persist the data for longer in a storage account, you can use the Event Hubs Capture feature. You don't have to write a single line of code to achieve this; you can configure it through the portal or an ARM template. This is described in this document: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-capture-overview
Event Hubs stores its transient data in Azure Storage; the documentation doesn't give any more detail about that storage. This is evident from this documentation: https://learn.microsoft.com/en-us/azure/event-hubs/configure-customer-managed-key
The storage account you need for EventProcessorHost is only used for checkpointing, i.e. maintaining the offset of the last event read in each partition.
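The Python analogue of EventProcessorHost's checkpointing, for illustration, uses the azure-eventhub-checkpointstore-blob package; the storage container holds only the reader's partition offsets, not the events themselves (connection strings and names below are placeholders):

```python
# pip install azure-eventhub azure-eventhub-checkpointstore-blob
from azure.eventhub import EventHubConsumerClient
from azure.eventhub.extensions.checkpointstoreblob import BlobCheckpointStore

# Placeholder connection strings; this container stores only checkpoints (offsets), not messages.
checkpoint_store = BlobCheckpointStore.from_connection_string(
    "<storage-connection-string>", "eventhub-checkpoints")

def on_event(partition_context, event):
    print(event.body_as_str())
    # Record how far this partition has been read, so a restart resumes from here.
    partition_context.update_checkpoint(event)

consumer = EventHubConsumerClient.from_connection_string(
    "<eventhub-connection-string>", consumer_group="$Default",
    eventhub_name="<your-hub>", checkpoint_store=checkpoint_store)

with consumer:
    consumer.receive(on_event=on_event)
```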
