Azure IoT and Stream Analytics job

I am trying to collect data from an IoT device; for now I am using the remote_monitoring sample code to simulate the device. It sends data and I can see the data in the dashboard. The next step is that I want to save the data to a SQL database, and I was thinking of using Stream Analytics to do the job. The problem I am having now is that when I select IoT Hub as an input I get the error:
Please check if the input source is configured correctly and data is in correct format.
I am trying to find documentation on whether there is something special I need to add to my JSON object before I send it.

IoT Hub is a supported input for Azure Stream Analytics, and there is nothing wrong with using ASA as a "pump" to copy data from IoT Hub or Event Hubs to a store such as SQL Database. Many ASA use cases combine such "archiving" with other functions. The only thing to be careful with is the limited ingress rate of many ASA outputs: SQL Database may not be able to keep up and will throttle ASA, in which case ASA may fall behind beyond the hub's retention window, causing data loss.
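Such a pass-through job boils down to a one-statement ASA query. A minimal sketch, assuming input and output aliases named iothub-input and sqldb-output have been configured on the job (those names are placeholders, not anything from the question):

```sql
-- Minimal Stream Analytics "pump": copy every event from the
-- IoT Hub input alias straight into the SQL Database output alias.
SELECT
    *
INTO
    [sqldb-output]
FROM
    [iothub-input]
```

The SQL table's columns need to match the fields of the incoming JSON for this to work, which is also why a malformed payload shows up as an input-format error.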

I will try to use Event Hub and update my post when I have an answer.

What about this doc https://github.com/Azure/azure-content/blob/master/articles/stream-analytics/stream-analytics-define-inputs.md ? Does it help you?

Related

Azure how to get events shown in CLI from IoT to a database

I am having some issues actually retrieving and using the data I send to the IoT Hub in Azure. When I run 'az iot hub monitor-events --hub-name ' in the CLI I can see my events, and I can also send messages to my devices in the IoT Hub.
I have then tried to create a stream, to forward the messages to my SQL database, but without any luck. Do you have any suggestions on how to retrieve this data?
There are multiple ways to go about this. The two most common scenarios are probably using an Azure Function or a Stream Analytics job. I don't know what you've tried up until this point, but a Stream Analytics job is probably the easiest way to go.
Stream Analytics
This answer on SO could be what you're looking for; it also links to this tutorial, which you could follow from "Create a new Azure SQL Database" onwards. It covers creating an IoT Hub input and an Azure SQL output on your Stream Analytics job, and using a simple query to link the two together. There is more info in the Microsoft docs here.
Azure Function
While looking this one up I found this answer, which is mine, awkward. It describes how you can go about creating an Azure Function that accepts IoT Hub messages and writes them to your database. This option is a lot more cost-efficient (or even free, if you use the consumption plan for the Function) for a few devices.

Using partitionId or partitionKey with IoT Hub Azure

We are developing an application where IoT devices will publish events to Azure IoT Hub using the MQTT protocol (using one topic to push messages). We want to consume these messages using the Stream Analytics service, and to scale Stream Analytics it is recommended to use the PARTITION BY clause.
Since we are not using the Azure Event Hubs SDK, can we somehow attach a partitionId to the events?
Thanks in advance
As Rita mentioned in the comments, Event Hub will automatically associate each device to a particular partition.
Then you can use PARTITION BY PartitionId for the steps closer to the input to efficiently parallelize processing of the input and reduce/aggregate the data.
Then, you can have another non-partitioned step to output to SQL sending some aggregate data.
Doing that, you will be able to assign more than 6 SUs, even with an output to SQL.
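The two-step pattern described above can be sketched as an ASA query; the alias, column, and window choices below are illustrative assumptions, not anything from the original question:

```sql
-- Step 1: partitioned step close to the input; aggregates within each partition.
WITH PartitionedAggregates AS (
    SELECT
        DeviceId,
        AVG(Temperature) AS AvgTemperature
    FROM
        [iothub-input] PARTITION BY PartitionId
    GROUP BY
        DeviceId, PartitionId, TumblingWindow(minute, 1)
)
-- Step 2: non-partitioned step sending the reduced data to the SQL output.
SELECT
    DeviceId, AvgTemperature
INTO
    [sqldb-output]
FROM
    PartitionedAggregates
```

Because the heavy aggregation work happens in the partitioned step, the job can scale out, while the final non-partitioned step only handles the much smaller aggregate stream going to SQL.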
We will update our documentation to give more info about scaling ASA jobs and describe the different possible scenarios.
Thanks,
JS - Azure Stream Analytics

Stream Analytics conditional check

I'm preparing a proof of concept for Azure IoT. I send data from a device to Azure IoT Hub, and from IoT Hub the data is sent to a database via Stream Analytics.
The question is: I want to check whether I already have this record in the database and, if not, add it. I also want to record a "start time" when a certain event occurs and a "finish time" when the event finishes, so I need to update the row. Is this possible with Stream Analytics, and is Stream Analytics the correct place to do this kind of check?
If you want to be sure to only add one record in the database, you can follow the instructions on our team blog: https://blogs.msdn.microsoft.com/streamanalytics/2017/01/13/how-to-achieve-exactly-once-delivery-for-sql-output/
Thanks,
JS - Azure Stream Analytics
Actually I'm not sure, but I hope this helps.
You cannot update your database from Stream Analytics.
Azure Stream Analytics works by streaming your data.
I think you have two options for this case:
One way, you can keep your data in a temporary table in Stream Analytics, check your conditions, and insert into the database, but this does not work with older data.
The second way, you can use a trigger on your database.
CREATE TRIGGER TriggerDataFromStream
ON DataFromStream
AFTER INSERT
AS
BEGIN
    -- Inspect the rows in the "inserted" pseudo-table and update matching records here.
    SET NOCOUNT ON;
END
Like this.

How can I have an AWS IoT rules-engine type of scenario in Azure IoT Hub?

I actually want a conditional statement varying an output. For example, if temperature < 0 do something; likewise, if temperature > 50 do something else. I want to know how it is possible to have this conditional case in Azure IoT. What should I follow? What term should I be looking for? Please help.
You can query the IoT Hub message flow in real time using Stream Analytics. You'll write a SQL-like query with your condition (temperature < 0) that will then send a message to an output such as an Event Hub message queue, or write to a database.
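A minimal sketch of such a query; the alias names (iothub-input, alerts-output), field names, and thresholds are assumptions for illustration:

```sql
-- Forward only out-of-range temperature readings, tagged with an alert type.
SELECT
    DeviceId,
    Temperature,
    CASE
        WHEN Temperature < 0  THEN 'too-cold'
        WHEN Temperature > 50 THEN 'too-hot'
    END AS AlertType
INTO
    [alerts-output]
FROM
    [iothub-input]
WHERE
    Temperature < 0 OR Temperature > 50
```

Whatever consumes the alerts-output (an Event Hub, a Function, a database) then plays the role that a rule action plays in the AWS IoT rules engine.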

Does Microsoft Azure IoT Hub store data?

I have just started learning Azure IoT and it's quite interesting. I am confused about whether IoT Hub stores data somewhere.
For example, suppose I am passing the room temperature to IoT Hub and want to store it in a database for further use. How is that possible?
I am clear on how device-to-cloud and cloud-to-device works with IoT hub.
IoT Hub exposes device-to-cloud messages through an Event Hubs-compatible endpoint. Event Hubs has a retention time expressed in days. It's a stream of data that the reading client can re-read multiple times, because the cursor is on the client side (not on the server side as with queues and topics). With IoT Hub the retention time is 1 day by default, but you can change it.
If you want to store the messages received from a device, you need a client reading from the exposed Event Hubs endpoint (for example an Event Processor Host) that has the business logic to process the messages and store them into a database.
Of course, you could add another decoupling layer, so that one client reads from Event Hubs and stores the messages into queues, and another client reads from the queues at its own pace and stores into the database. This way you have a fast path reading from Event Hubs.
This is pretty much the use case for all IoT scenarios.
Step 1: High-scale data ingestion via Event Hubs.
Step 2: Create and use a stream processing engine (Stream Analytics or HDInsight/Storm). You can run conditions (SQL-like queries) to filter and store the appropriate data in either a cold or a hot store for further analytics.
Step 3: Storage for cold-path analytics can be Azure Blob storage. Stream Analytics can be configured to write the data into it directly. The cold store can contain all the other data that doesn't require querying, and it is cheap.
Step 4: Processing for hot-path analytics. This is data that is queried more regularly, or data on which real-time analytics needs to be carried out, like in your case checking for temperature values going beyond a threshold, which needs an urgent trigger.
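Steps 2–4 can be combined in a single ASA job with two outputs; a sketch, where the alias and field names are assumptions:

```sql
-- Cold path: archive every raw event to the blob storage output.
SELECT
    *
INTO
    [blob-archive-output]
FROM
    [iothub-input]

-- Hot path: forward only threshold-crossing readings for urgent handling.
SELECT
    DeviceId,
    Temperature
INTO
    [alerts-output]
FROM
    [iothub-input]
WHERE
    Temperature > 50
```

A single ASA job can contain several SELECT ... INTO statements like this, each routing to a different configured output.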
Let me know if you face any challenges while configuring the Stream analytics job! :)
If you take a look at the IoT Suite remote monitoring preconfigured solution (https://azure.microsoft.com/documentation/articles/iot-suite-remote-monitoring-sample-walkthrough/) you'll see that it persists telemetry in blob storage and maintains device status information in DocumentDb. This preconfigured solution gives you a working illustration of the points made in the previous answers.