Is there a way to get sensor data from IoT Central into a MySQL database? (Azure)

I created simulated sensors that send heart rate, blood pressure, and temperature data. However, I am not sure how to send this data to a MySQL database. I believe one way to do it is to use Azure Functions, but I am not sure how to go about it. Can anyone help?

Welcome to the community! You can achieve your use case with the following steps.
Push data from Azure IoT Central to an Azure cloud endpoint such as Event Hubs or Service Bus. Here are the supported cloud destinations.
Note: If this is a mockup project and you are using a MySQL database only for testing, I would recommend going with Azure Data Explorer instead; exporting to Azure Data Explorer eliminates the need for the next two steps. If you still would like to use a MySQL database, go with either of the following options: a) Export to Service Bus, or b) Export to Event Hubs.
Create a trigger-based Azure Function for your cloud endpoint:
a) Azure Event Hubs trigger for Azure Functions
b) Azure Service Bus trigger for Azure Functions
Within this function, you can write custom code in the language of your choice to push data to the MySQL database. Here is a reference tutorial that pushes data from Azure Functions to Azure SQL DB: Azure SQL output binding for Azure Functions.
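As a rough illustration only (not an official sample), such a function could look like the following Python sketch using the v2 programming model and an Event Hubs trigger. The event hub name iotc-export, the app settings EventHubConnection and MYSQL_*, and the telemetry table with heartrate, bloodpressure and temperature columns are all placeholder names you would replace with your own:

```python
import json
import os

import azure.functions as func
import mysql.connector  # pip install mysql-connector-python

app = func.FunctionApp()

# Placeholder names: event hub "iotc-export", app settings EventHubConnection and MYSQL_*.
@app.event_hub_message_trigger(arg_name="event",
                               event_hub_name="iotc-export",
                               connection="EventHubConnection")
def export_to_mysql(event: func.EventHubEvent):
    # IoT Central exports each message as a JSON document; the exact layout
    # depends on the export you configured.
    body = json.loads(event.get_body().decode("utf-8"))
    telemetry = body.get("telemetry", body)

    conn = mysql.connector.connect(
        host=os.environ["MYSQL_HOST"],
        user=os.environ["MYSQL_USER"],
        password=os.environ["MYSQL_PASSWORD"],
        database=os.environ["MYSQL_DB"],
    )
    try:
        cursor = conn.cursor()
        cursor.execute(
            "INSERT INTO telemetry (heartrate, bloodpressure, temperature) "
            "VALUES (%s, %s, %s)",
            (telemetry.get("heartrate"),
             telemetry.get("bloodpressure"),
             telemetry.get("temperature")),
        )
        conn.commit()
    finally:
        conn.close()
```

Opening a new connection per event keeps the sketch simple; for real traffic you would pool connections or batch inserts.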

Related

Is there a way to ingest data from an Azure IoT Hub to a MySQL database?

I am trying to get data from my Azure IoT Hub to a database, which will then be displayed on a website.
I have seen many ways to ingest data, such as an Azure Data Explorer cluster and Stream Analytics jobs. However, the issue I have with Data Explorer is that I cannot get connection details for the database, so I cannot connect to it using code. As for Stream Analytics jobs, MySQL databases are not supported, and I only have experience with MySQL.
Does anyone know a way I can ingest data from my Azure IoT Hub to a MySQL database? Thanks in advance.
You will have to write some code to do this.
You can for instance create an Azure Function that uses an IoTHubTrigger or EventHubTrigger so that the Function receives the messages that are available on the IoT Hub. In the Function, you write code to make sure that the data is inserted in the appropriate tables in your database.
Find some documentation regarding this here.
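If you would rather not use Azure Functions at all, a small standalone consumer against the IoT Hub's built-in Event Hub-compatible endpoint can do the same job. This is only a sketch: the connection string, the database credentials, and the readings table are placeholders for your own setup:

```python
import json

import mysql.connector  # pip install mysql-connector-python
from azure.eventhub import EventHubConsumerClient  # pip install azure-eventhub

# Event Hub-compatible connection string from the IoT Hub "Built-in endpoints" blade (placeholder).
EVENTHUB_CONN_STR = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...;EntityPath=..."

db = mysql.connector.connect(host="localhost", user="iot", password="...", database="iotdata")

def on_event(partition_context, event):
    # Each device-to-cloud message arrives as an EventData; the body is the telemetry JSON.
    props = event.system_properties or {}
    device_id = props.get(b"iothub-connection-device-id", b"")
    if isinstance(device_id, bytes):
        device_id = device_id.decode()
    cursor = db.cursor()
    cursor.execute(
        "INSERT INTO readings (device_id, payload) VALUES (%s, %s)",
        (device_id, event.body_as_str()),
    )
    db.commit()

client = EventHubConsumerClient.from_connection_string(
    EVENTHUB_CONN_STR, consumer_group="$Default")
with client:
    # Without a checkpoint store this re-reads from the start on every restart.
    client.receive(on_event=on_event, starting_position="-1")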

Alternative way to trigger Azure Functions or an App Service based on a table row insert into Azure SQL MI

Is it possible to trigger Azure Functions or an App Service web app whenever an insert operation is performed against a table on Azure SQL MI?
If not, is there a way to trigger applications outside Azure SQL other than using a Logic App? I want to avoid Logic Apps because it requires using one more application, and it still relies on polling.
The link below says it is not supported for Azure Functions:
https://feedback.azure.com/forums/355860-azure-functions/suggestions/16711846-sql-azure-trigger-support
The link below suggests using a Logic App:
Trigger Azure Function by inserting (adding) new row into table, SQL Server Database
Today, in Azure SQL, there is no such possibility. The closest option is to create a Timer Trigger Azure Function that checks whether there have been any changes in the table you want to monitor (using Change Tracking, for example).
If you are using Azure SQL MI instead, you could create a SQLCLR procedure that calls an Azure Function via an HTTP request or, as another option, via Azure Event Hubs or Azure Event Grid.
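As a rough sketch of the polling approach (not an official pattern), a timer-triggered Python function could query Change Tracking like this. It assumes Change Tracking is already enabled on the database and on dbo.Orders, and that SQL_CONNECTION_STRING is an app setting; all of these names are examples:

```python
import os

import azure.functions as func
import pyodbc  # pip install pyodbc

app = func.FunctionApp()
last_sync_version = 0  # in a real function, persist this (control table, blob, etc.)

@app.timer_trigger(arg_name="timer", schedule="0 */5 * * * *")  # every five minutes
def poll_changes(timer: func.TimerRequest):
    global last_sync_version
    conn = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])
    cursor = conn.cursor()

    # Rows inserted/updated/deleted in dbo.Orders since the last sync version.
    cursor.execute(
        "SELECT CT.ID, CT.SYS_CHANGE_OPERATION "
        "FROM CHANGETABLE(CHANGES dbo.Orders, ?) AS CT",
        last_sync_version,
    )
    for row in cursor.fetchall():
        handle_change(row.ID, row.SYS_CHANGE_OPERATION)

    cursor.execute("SELECT CHANGE_TRACKING_CURRENT_VERSION()")
    last_sync_version = cursor.fetchone()[0]
    conn.close()

def handle_change(row_id, operation):
    # Placeholder: call an HTTP endpoint, enqueue a Service Bus message, etc.
    pass
```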
There have been several feature requests for triggering Azure Functions based on changes to an Azure SQL database. For example:
https://github.com/Azure/azure-functions-python-worker/issues/365
It seems that they are not prioritizing it, since it is possible to implement this functionality using Logic Apps.

Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage and have it submit those events directly to Azure Event Hub without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can use Azure Logic Apps to meet your needs without any code, or with just some function expressions; please refer to the official documentation for Azure Logic Apps for more details.
The logic flow is shown in the figure below.
You can refer to my sample below to make it work.
Here is my sample, which receives an event from my Event Hub and transfers it to Azure Blob Storage, creating a new blob to store the event data.
Create an Azure Logic App instance in the Azure portal; it should be easy for you.
Go to the Logic app designer tab to configure the logic flow.
Click the Save and Run buttons. Then use ServiceBusExplorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check with Azure Storage Explorer whether a new blob was created. It worked fine after a few minutes.

Does Azure Data Factory support real-time copy activity?

I want to copy data from an on-premises Oracle DB to SQL Server in real time.
Data Factory has many strengths, but frequency is not one of them. Have you considered a different approach?
For real-time integrations I would recommend a Function App, Logic App, and/or Service Bus. I would call this app whenever there is a relevant change in the Oracle DB. Alternatively, you could have an API on top of the on-premises Oracle DB that you call from a scheduled app.
If you expect heavy traffic, you might want to consider using a Service Bus. The illustration below shows how Azure Service Bus sends data from a publisher (on-premises) to a subscriber (Azure SQL DB) using a topic message.
Illustration of a large-scale Service Bus dataflow
Ref. Azure Service Bus
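For the publish side, a minimal sketch with the azure-servicebus Python SDK could look like the following; the topic name oracle-changes, the connection string, and the payload are made-up examples:

```python
import json

from azure.servicebus import ServiceBusClient, ServiceBusMessage  # pip install azure-servicebus

# Placeholder connection string for the Service Bus namespace.
CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=..."

def publish_change(row: dict) -> None:
    """Send one changed Oracle row to a Service Bus topic; subscribers write it to Azure SQL."""
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_topic_sender(topic_name="oracle-changes") as sender:
            sender.send_messages(ServiceBusMessage(json.dumps(row)))

publish_change({"order_id": 42, "status": "shipped"})
```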
Welcome to Stack Overflow!
You can copy data from an Oracle database to any supported sink data store. For a list of data stores that are supported as sources or sinks by the copy activity, see the Supported data stores table.
To copy data from and to an Oracle database that isn't publicly accessible, you need to set up a Self-hosted Integration Runtime. The integration runtime provides a built-in Oracle driver. Therefore, you don't need to manually install a driver when you copy data from and to Oracle.
For more details and a step-by-step procedure, refer to Copy data from and to Oracle by using Azure Data Factory.
You can also use a custom activity as per your needs. Refer to custom activities.
Hope this helps.
You could copy data incrementally, but there is a frequency limitation. Please refer to this post: https://social.msdn.microsoft.com/Forums/en-US/54380f98-716b-4a95-88af-cad2ab7e47b5/what-type-of-data-ingestion-does-azure-data-factory-use?forum=AzureDataFactory

Where are Azure Event Hub messages stored?

I generated a SAS signature using this RedDog tool and successfully sent a message to Event Hub using the Event Hubs API reference. I know it was successful because I got a 201 Created response from the endpoint.
This tiny success brought about a question that I have not been able to find an answer to:
I went to the Azure portal and could not see the messages I created anywhere. Further reading revealed that I needed to create a storage account; I stumbled on some C# examples (EventProcessorHost) that require the storage account credentials, etc.
The question is: are there any APIs I can use to persist the data? I do not want to use the C# tool.
Please correct me if my approach is wrong, but my aim is to be able to post telemetry to Event Hub, persist the data, and perform some analytics operations on it. The telemetry data should be viewable on Azure.
You don't have direct access to the transient storage used for EventHub messages, but you could write a consumer that reads from the EventHub continuously and persist the messages to Azure Table or to Azure Blob.
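As a non-C# illustration of such a consumer, here is a minimal Python sketch using the azure-eventhub and azure-storage-blob SDKs; the connection strings and the telemetry container are placeholders:

```python
from azure.eventhub import EventHubConsumerClient  # pip install azure-eventhub
from azure.storage.blob import ContainerClient     # pip install azure-storage-blob

EVENTHUB_CONN_STR = "Endpoint=sb://...;SharedAccessKeyName=...;SharedAccessKey=...;EntityPath=..."  # placeholder
STORAGE_CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"  # placeholder

# The "telemetry" container must already exist (or be created with create_container()).
container = ContainerClient.from_connection_string(STORAGE_CONN_STR, container_name="telemetry")

def on_event(partition_context, event):
    # Persist each message as its own blob, keyed by partition and offset.
    blob_name = f"{partition_context.partition_id}/{event.offset}.json"
    container.upload_blob(blob_name, event.body_as_str(), overwrite=True)

client = EventHubConsumerClient.from_connection_string(EVENTHUB_CONN_STR, consumer_group="$Default")
with client:
    client.receive(on_event=on_event, starting_position="-1")  # "-1" = read from the beginning
```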
The closest thing you will find to a way to automatically persist messages (as with Amazon Kinesis Firehose versus Amazon Kinesis, which Event Hubs is basically equivalent to) would be to use Azure Stream Analytics configured to write the output either to Azure Blob or to Azure Table. This example shows how to set up a Stream Analytics job that passes the data through and stores it in SQL, but you can see the UI where you can choose an output such as Azure Table. Or you can get an idea of the options from the output API.
Of course, you should be aware of the requirements around serialization that led to this question.
Event Hubs stores data for a maximum of 7 days, and that only in the Standard pricing tier. If you want to persist the data for longer in a storage account, you can use the Event Hubs Capture feature. You don't have to write a single line of code to achieve this; you can configure it through the Portal or an ARM template. This is described in this document: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-capture-overview
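If you later want to script that configuration instead of clicking through the Portal, a rough sketch with the azure-mgmt-eventhub management SDK might look like this; the resource names and IDs are placeholders, and the exact model names can differ between SDK versions:

```python
from azure.identity import DefaultAzureCredential           # pip install azure-identity
from azure.mgmt.eventhub import EventHubManagementClient    # pip install azure-mgmt-eventhub
from azure.mgmt.eventhub.models import (CaptureDescription, Destination,
                                        EncodingCaptureDescription, Eventhub)

client = EventHubManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Enable Capture on an existing event hub: roughly every 5 minutes or 300 MB,
# a block of events is written to the given blob container in Avro format.
client.event_hubs.create_or_update(
    "<resource-group>", "<namespace>", "<event-hub>",
    Eventhub(
        message_retention_in_days=7,
        partition_count=2,
        capture_description=CaptureDescription(
            enabled=True,
            encoding=EncodingCaptureDescription.AVRO,
            interval_in_seconds=300,
            size_limit_in_bytes=314572800,
            destination=Destination(
                name="EventHubArchive.AzureBlockBlob",
                storage_account_resource_id="<storage-account-resource-id>",
                blob_container="capture",
                archive_name_format="{Namespace}/{EventHub}/{PartitionId}/{Year}/{Month}/{Day}/{Hour}/{Minute}/{Second}",
            ),
        ),
    ),
)
```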
Event Hubs stores its transient data in Azure Storage, but the documentation doesn't give any more detail about that data storage. This is evident from this documentation: https://learn.microsoft.com/en-us/azure/event-hubs/configure-customer-managed-key
The storage account you need for EventProcessorHost is only used for checkpointing or maintaining the offset of the last read event in a partition.
