Stream Analytics conditional check - Azure

I'm preparing a proof of concept for Azure IoT. I send data from a device to Azure IoT Hub, and from IoT Hub the data goes to a database via Stream Analytics.
My question is: I want to check whether a record already exists in the database and, if not, add it. I also want to record a "start time" when a certain event occurs and a "finish time" when the event finishes, which means updating the row. Is this possible with Stream Analytics, and is Stream Analytics the right place for this kind of check?

If you want to be sure that only one record is added to the database, you can follow the instructions on our team blog: https://blogs.msdn.microsoft.com/streamanalytics/2017/01/13/how-to-achieve-exactly-once-delivery-for-sql-output/
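The gist of the approach is to make the inserts idempotent on the SQL side, so that replayed rows are rejected rather than duplicated. A minimal sketch of one way to do that, assuming a hypothetical table DataFromStream with a natural event key of (DeviceId, EventTime):
CREATE UNIQUE INDEX IX_DataFromStream_EventKey
ON DataFromStream (DeviceId, EventTime)
WITH (IGNORE_DUP_KEY = ON); -- replayed duplicates fail the uniqueness check and are discarded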
Thanks,
JS - Azure Stream Analytics

Actually I'm not sure, but I hope this helps.
You cannot update your database from Stream Analytics.
Azure Stream Analytics is built for streaming your data.
I think you have two options for this case.
One way, you can keep your data in an intermediate query step in Stream Analytics, check your conditions there, and insert into the database, but this does not work with older data (see the query sketch after the trigger below).
The second way, you can use a trigger on your database:
CREATE TRIGGER TriggerDataFromStream
ON DataFromStream
AFTER INSERT
AS
-- hypothetical body: stamp FinishTime on the still-open row for the same device
UPDATE e SET e.FinishTime = i.EventTime
FROM EventLog e JOIN inserted i
    ON e.DeviceId = i.DeviceId AND e.FinishTime IS NULL;
Something like this; the EventLog table and column names in the body are just placeholders for your own schema.
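For the first option, here is a minimal sketch of what pairing the events inside the job could look like, so that a complete row is written downstream and no UPDATE is needed afterwards. DeviceId, EventType, and EventTime are hypothetical field names, and [iothub-input] / [sqldb-output] are assumed input and output aliases:
WITH Paired AS (
    SELECT
        DeviceId,
        EventType,
        EventTime AS FinishTime,
        -- most recent 'start' event from the same device within the last hour
        LAST(EventTime) OVER (PARTITION BY DeviceId LIMIT DURATION(hour, 1)
                              WHEN EventType = 'start') AS StartTime
    FROM [iothub-input] TIMESTAMP BY EventTime
)
SELECT DeviceId, StartTime, FinishTime
INTO [sqldb-output]
FROM Paired
WHERE EventType = 'finish'
This writes one finished row per event pair instead of updating an existing row, which fits the append-only nature of Stream Analytics outputs.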

Related

Azure how to get events shown in CLI from IoT to a database

I am having some issues actually retrieving and using the data I send to the IoT Hub in Azure. When I run 'az iot hub monitor-events --hub-name ' in the CLI I can see my events, and I can also send messages to my devices in the IoT Hub.
I have then tried to create a stream to forward the messages to my SQL database, but without any luck. Do you have any suggestions on how to retrieve this data?
There are multiple ways to go about this. The two most common scenarios are probably using an Azure Function or using a Stream Analytics job. I don't know what you've tried up until this point, but a Stream Analytics job is probably the easiest way to go.
Stream Analytics
This answer on SO could be what you're looking for; it also links to this tutorial, which you could follow from "Create a new Azure SQL Database" onwards. It covers creating an IoT Hub input and an Azure SQL output on your Stream Analytics job, and using a simple query to link the two together. There is more info in the Microsoft docs here.
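That query can be a simple pass-through. A minimal sketch, assuming the input and output aliases are named iothub-input and sqldb-output, the payload fields are hypothetical, and the SQL table has columns with matching names:
SELECT
    deviceId,
    temperature,
    EventEnqueuedUtcTime -- enqueue timestamp added by IoT Hub
INTO [sqldb-output]
FROM [iothub-input]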
Azure Function
While looking this one up I found this answer, which is mine, awkward. But it describes how you can go about creating an Azure Function that accepts IoT Hub messages and writes them to your database. This option is a lot more cost-efficient (or even free, if you use the Consumption plan for the Function) for a few devices.

Is there a way to stream Azure Information Protection Activity Logs to an Event Hub?

I have configured Azure Information Protection analytics through the Azure portal for my subscription and I am able to see log data under the Activity logs (preview) tab.
I want to forward that log data to a configured Event Hub but I have not found a way to do it. This data appears to be written to a table called InformationProtection_CL. How do I get that query output to stream to an Event Hub? Is what I'm trying to do possible?
You can use the Log Analytics REST API (Query - Get) to retrieve the log data from that table.
GET https://api.loganalytics.io/v1/workspaces/{workspaceId}/query?query={query}
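For example, to pull everything written to that table, the {query} parameter can simply be the (URL-encoded) table name:
GET https://api.loganalytics.io/v1/workspaces/{workspaceId}/query?query=InformationProtection_CL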
Then follow this doc to send the events to the event hub programmatically; the specific setup and language are up to you.
https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-dotnet-standard-getstarted-send#send-events

Using partitionId or partitionKey with Iot Hub Azure

We are developing an application where IoT devices will publish events to Azure IoT Hub using the MQTT protocol (using one topic to push messages). We want to consume these messages using the Stream Analytics service, and to scale Stream Analytics it is recommended to use the PARTITION BY clause.
Since we are not using the Azure Event Hub SDK, can we somehow attach a partitionId to the events?
Thanks in advance
As Rita mentioned in the comments, Event Hub will automatically associate each device with a particular partition.
You can then use PARTITION BY PartitionId for the steps closest to the input to efficiently parallelize processing of the input and reduce/aggregate the data.
Then you can have another, non-partitioned step to output some aggregate data to SQL.
Doing that, you will be able to assign more than 6 SUs, even with an output to SQL.
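A minimal sketch of that two-step shape (DeviceId and Temperature are hypothetical fields, and [iothub-input] / [sqldb-output] are assumed aliases):
WITH PartitionedStep AS (
    -- partitioned step: runs in parallel, one instance per input partition
    SELECT PartitionId, DeviceId, AVG(Temperature) AS AvgTemp
    FROM [iothub-input] PARTITION BY PartitionId
    GROUP BY PartitionId, DeviceId, TumblingWindow(minute, 1)
)
-- non-partitioned step: merges the partitions and sends aggregates to SQL
SELECT DeviceId, AVG(AvgTemp) AS AvgTemp
INTO [sqldb-output]
FROM PartitionedStep
GROUP BY DeviceId, TumblingWindow(minute, 5)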
We will update our documentation to give more info about scaling ASA jobs and describe the different possible scenarios.
Thanks,
JS - Azure Stream Analytics

Pulling data from Stream Analytics to Azure Machine Learning

I'm working on an IoT telemetry project that receives humidity and pollution data from different sites in the field. I will then apply Machine Learning to the collected data. I'm using Event Hubs and Stream Analytics. Is there a way of pulling the data into Azure Machine Learning without the hassle of writing an application to get it from Stream Analytics and push it to the AML web service?
Stream Analytics has a feature called "Functions". You can call any web service you've published using AML from within Stream Analytics and apply it within your Stream Analytics query. Check this link for a tutorial.
An example workflow in your case would be the following:
Telemetry arrives and reaches Stream Analytics.
Stream Analytics (SA) calls the Machine Learning function to apply it to the data (a sketch of such a query follows this list).
SA redirects the result to the output accordingly; there you can use Power BI to create a predictions dashboard.
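A minimal sketch of step 2, assuming a hypothetical AML function alias airquality created on the job, hypothetical payload fields, and [eventhub-input] / [powerbi-output] as the input and output aliases (the result field name depends on what your web service returns; "Scored Labels" is a common default):
WITH Scored AS (
    SELECT
        deviceId,
        humidity,
        pollution,
        -- airquality = the AML function alias registered on the job
        airquality(humidity, pollution) AS result
    FROM [eventhub-input]
)
SELECT deviceId, humidity, pollution, result.[Scored Labels] AS prediction
INTO [powerbi-output]
FROM Scored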
Another way would be using R; here's a good tutorial showing that: https://blogs.technet.microsoft.com/machinelearning/2015/12/10/azure-ml-now-available-as-a-function-in-azure-stream-analytics/ .
It is more work, of course, but it can give you more control, since you control the code.
Yes,
this is actually quite easy, as it is well supported by ASA.
You can call a custom AzureML function from your ASA query once you create this function from the portal.
See the following tutorial on how to achieve something like this.

Azure IoT and Stream Analytics job

I am trying to collect data from an IoT device; for now I am using this code to simulate the device: remote_monitoring. It sends data and I can see the data in the dashboard. The next thing is that I want to save the data to a SQL database. I was thinking of using Stream Analytics to do the job. The problem I am having now is that when I select IoT Hub as an input I get the error:
Please check if the input source is configured correctly and data is in correct format.
I am trying to find documentation on whether there is something special I need to add to my JSON object before I send it.
IoT Hub is a supported input for Azure Stream Analytics, and there is nothing wrong with using ASA as a "pump" to copy data from IoT Hub or Event Hubs to a store like SQL DB. Many use cases of ASA combine such "archiving" with other functions. The only thing to be careful with is the limited ingestion rate of many ASA outputs: SQL DB may not be able to keep up and may throttle ASA, in which case ASA can fall behind beyond the used hub's retention window, causing data loss.
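As for the input error: ASA accepts JSON, CSV, or Avro events encoded as UTF-8, so make sure the simulated device sends a well-formed payload that matches the serialization format configured on the input. A minimal example of a JSON event (field names are just an illustration):
{"deviceId": "device-001", "temperature": 22.5, "humidity": 48.1}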
Try to use Event Hub; I will update my post when I have an answer.
What about this doc: https://github.com/Azure/azure-content/blob/master/articles/stream-analytics/stream-analytics-define-inputs.md ? Does it help you?
