How does Azure Stream Analytics find the schema of data coming from Event Hub?

I am following this tutorial to understand how Stream Analytics works on Azure: https://learn.microsoft.com/en-us/azure/stream-analytics/stream-analytics-real-time-fraud-detection?toc=%2Fazure%2Fsynapse-analytics%2Fsql-data-warehouse%2Ftoc.json&bc=%2Fazure%2Fsynapse-analytics%2Fsql-data-warehouse%2Fbreadcrumb%2Ftoc.json
A command-line event generator application sends data to an event hub, to which a Stream Analytics job is connected. I don't understand two things:
I have not specified a data schema anywhere, yet I can query the data. How?
The tutorial recommends that I create a consumer group in the event hub. A consumer group MyConsumerGroup is created in the tutorial but never used. What is the purpose of the consumer group?

Related

Azure - ingesting data into IoT Hub and sending notifications via Slack/email for certain messages

I've got data coming into IoT Hub and want to filter it.
I want to forward the relevant data to Slack as notifications.
I've got the IoT Hub and a Slack subscription in place, and I'm having trouble connecting the two.
In order to run a rather complex time-based query, I figure I'll use Stream Analytics and configure the IoT Hub as its input. From research I found that Logic Apps can send messages to Slack over a webhook. Using a Service Bus queue as the output of Stream Analytics, I can get the data into Logic Apps.
So it's:
IoT Hub (ingest all data) => Stream Analytics (filter) => Service Bus Queue (queue up the data) => Logic Apps (send to Slack)
Looks a bit bulky, but that seems to be one way of doing it (is there a better one?).
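For illustration, the kind of time-based filter I have in mind for the Stream Analytics step would be roughly the following (the [Hub] and [Queue] aliases, the Temperature field, and the threshold are placeholders for whatever is configured on the job):
-- Sketch only: emit one row per device whenever its 5-minute average temperature exceeds a threshold.
SELECT
    DeviceId,
    System.Timestamp AS WindowEnd,
    AVG(Temperature) AS AvgTemperature
INTO
    [Queue]
FROM
    [Hub]
GROUP BY
    DeviceId,
    TumblingWindow(minute, 5)
HAVING
    AVG(Temperature) > 30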
Doing this, I ran into issues. I selected my IoT Hub as the input for Stream Analytics, and the simple query SELECT * INTO [Queue] FROM [Hub] fails, saying there is no data.
That does make sense if the IoT Hub just pushes new data to its endpoints and then discards it. So I created a test data set in the Stream Analytics job, and against that the query runs fine.
However, data does arrive in the hub that is not (all) picked up or forwarded by the job to the Service Bus queue. I do see some activity on the queue, but not nearly enough to account for the data I receive.
This seems to be a very common scenario: ingesting data into IoT Hub and sending notifications to email or Slack when messages are of a certain type. Can you explain the steps to take, or point me to a resource that covers this? Maybe I'm on the wrong path, as I cannot find anything that describes it.
Thanks

In Azure Stream Analytics, trying to pull data from Event Hub

In Azure Stream Analytics, while trying to pull data from an event hub, I am receiving the following error:
Diagnostics: Source '<unknown_location>' had 1 occurrences of kind
'InputDeserializerError.InvalidData' between processing times
'2020-11-19T04:08:35.3436931Z' and '2020-11-19T04:08:35.3686240Z'.
Unable to create records from the given Avro record schema
I want to know what the reason could be.
Is there a way to find out what kind of data is streaming in the event hub?
Check the event serialization format of the input (your event hub). What you set there and what you are actually sending may not match: you could be sending JSON but have specified Avro as the event serialization format, or vice versa.
You can download Service Bus Explorer, connect it to your event hub, and inspect what you are receiving that way. I advise adding an additional consumer group to your event hub, just to avoid competing consumers, and connecting Service Bus Explorer to that particular consumer group.

How many Stream Analytics jobs can you have for one IoT Hub?

I have created two Stream Analytics jobs for one IoT hub that has multiple devices.
However, data is received only by the first Stream Analytics job I created. Even if I stop that job, no data is sent to the second one.
Is that a bug, or am I missing something? Or can one IoT hub simply have only one Stream Analytics job?
It looks like both of your Stream Analytics jobs are using the same consumer group (such as $Default) on the same IoT Hub.
So, create two consumer groups on the Azure IoT Hub, one dedicated to each ASA job; in other words, each ASA job will have its own consumer group.
When you add the IoT Hub input to an ASA job, the consumer group can be selected specifically for that job.

How can I implement the logic to send the values from an event hub (values filtered by a Stream Analytics job) to IoT Hub using a UWP app?

I am currently working on the Internet of Things. In my current project I have implemented the logic to send temperature values to IoT Hub (using a Raspberry Pi 2 and a BMP280 sensor). On the Azure side, I created a Stream Analytics job that receives the messages from IoT Hub, filters those values based on my query (for example, when the temperature value exceeds 30 degrees), and posts the filtered values to an event hub, which is one of the outputs of the Stream Analytics job.
This is the query I wrote in the Stream Analytics job:
SELECT
    System.Timestamp AS Time,
    DeviceId,
    RoomTemp,
    RoomPressure,
    RoomAlt
INTO
    eventhub
FROM
    bmpsensordata
WHERE
    RoomTemp > 35
I have already created an event hub in Azure, and I monitor the values filtered by the Stream Analytics job in the dashboard of the event hub I created earlier.
But I want to send the values from the event hub (the values filtered by the Stream Analytics job) to IoT Hub, so that I can receive them as alert messages/notifications in a UWP app (C#).
Please tell me how I can implement the above scenario.
Regards,
Pradeep
I think your solution could look like this:
Once data is retrieved from the IoT Hub and analyzed by Stream Analytics, the job can call an Azure Function, which triggers Azure Notification Hubs to send a push notification to your UWP application.
Please use my tutorial to see how to use Stream Analytics together with Azure Functions and, at the end, how to send SMS alerts; in your case, you should replace that code with the code for Notification Hubs:
https://github.com/Daniel-Krzyczkowski/Daniel-Krzyczkowski.github.io/blob/master/cloudyofthings/article1/index.md
Here is the documentation on how to use the Notification Hubs SDK and how to integrate it with UWP applications:
https://learn.microsoft.com/en-us/azure/notification-hubs/notification-hubs-aspnet-backend-windows-dotnet-wns-notification
I think you will need another Stream Analytics job with Event Hub as input and IoT Hub as output. Then you can receive the cloud-to-device messages from IoT Hub in your UWP application, as described in this article.
You can have multiple outputs from a single Stream Analytics job. Refer to https://blogs.msdn.microsoft.com/streamanalytics/2015/09/16/query-pattern-of-the-week-send-data-to-multiple-outputs/ for more information on this.
Stream Analytics does not have a direct output to IoT Hub, though.
You'd need to put the information into an event hub and have a worker role process it and send it from there to IoT Hub.
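For illustration, a single job query with two outputs could look roughly like the following, reusing the fields from the query above (the archiveblob alias is a placeholder for whatever second output is configured on the job):
-- Sketch only: the first statement keeps the existing filtered output, the second copies every event to a second output.
SELECT System.Timestamp AS Time, DeviceId, RoomTemp, RoomPressure, RoomAlt
INTO eventhub
FROM bmpsensordata
WHERE RoomTemp > 35

SELECT *
INTO archiveblob
FROM bmpsensordata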
In my experience, you can integrate Notification Hubs with IoT Hub, Stream Analytics, and Event Hubs to implement your needs. Please see the details below.
Create a Stream Analytics job with IoT Hub as input and Event Hub as output for filtering the sensor data.
Create a Notification Hub for pushing data to the UWP app.
Create a server-side service or a scheduled job, such as a continuous WebJob, for receiving the data from the event hub and sending it to the Notification Hub.
For reference, here are some documents that show you how to do it.
Get started with Azure Stream Analytics to process data from IoT devices, https://azure.microsoft.com/en-us/documentation/articles/stream-analytics-get-started-with-azure-stream-analytics-to-process-data-from-iot-devices/
Getting started with Notification Hubs for Windows Store Apps, https://azure.microsoft.com/en-us/documentation/articles/notification-hubs-windows-store-dotnet-get-started/
Event Hubs programming guide, https://azure.microsoft.com/en-us/documentation/articles/event-hubs-programming-guide/
Notification Hub Server SDK reference for .NET, https://msdn.microsoft.com/library/mt414893.aspx
Create a .NET WebJob in Azure App Service (run continuously), https://azure.microsoft.com/en-us/documentation/articles/websites-dotnet-webjobs-sdk-get-started/
Hope it helps.
If you have any concerns, please feel free to let me know.

Does Microsoft Azure IoT Hub store data?

I have just started learning Azure IoT and it's quite interesting. I am confused about whether IoT Hub stores data somewhere.
For example, suppose I am sending room temperature to IoT Hub and want to store it in a database for further use. How is that possible?
I am clear on how device-to-cloud and cloud-to-device messaging work with IoT Hub.
IoT Hub exposes device-to-cloud messages through an Event Hubs-compatible endpoint. Event Hubs has a retention time expressed in days. It's a stream of data that the reading client can re-read multiple times, because the cursor is on the client side (not on the server side, as with queues and topics). With IoT Hub the related retention time is 1 day by default, but you can change it.
If you want to store the messages received from devices, you need a client reading from the exposed Event Hubs endpoint (for example, an Event Processor Host) that contains the business logic to process the messages and store them, for example, in a database.
Of course, you could add another decoupling layer, so that one client reads from Event Hubs and stores the messages in queues, and another client reads from the queues at its own pace and stores them in the database. In this way you have a fast path reading from Event Hubs.
This is pretty much the use case for all IoT scenarios.
Step 1: High-scale data ingestion via Event Hubs.
Step 2: Create and use a stream processing engine (Stream Analytics or HDInsight/Storm). You can run conditions (SQL-like queries) to filter the data and store the appropriate pieces in either a cold or a hot store for further analytics (see the sketch after these steps).
Step 3: Storage for cold-path analytics can be Azure Blob storage. Stream Analytics can be configured to write the data into it directly. The cold store can hold all the other data that doesn't require frequent querying, and it is cheap.
Step 4: Processing for hot-path analytics. This is data that is queried more regularly, or data on which real-time analytics needs to be carried out; in your case, checking for temperature values going beyond a threshold, which needs an urgent trigger.
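As a rough sketch of that cold/hot split in a Stream Analytics query (the iothubinput, coldblob, and hotpath aliases, the Temperature field, and the threshold are placeholders for whatever is configured on the job):
-- Cold path: archive every event to Blob storage.
SELECT *
INTO coldblob
FROM iothubinput

-- Hot path: forward only the readings beyond the threshold for immediate action.
SELECT System.Timestamp AS Time, DeviceId, Temperature
INTO hotpath
FROM iothubinput
WHERE Temperature > 30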
Let me know if you face any challenges while configuring the Stream Analytics job! :)
If you take a look at the IoT Suite remote monitoring preconfigured solution (https://azure.microsoft.com/documentation/articles/iot-suite-remote-monitoring-sample-walkthrough/) you'll see that it persists telemetry in blob storage and maintains device status information in DocumentDb. This preconfigured solution gives you a working illustration of the points made in the previous answers.
