Multi-tenant Azure Stream Analytics

I have an IoT streaming use case. In our current architecture, data from IoT Hub is consumed by our Stream Analytics jobs for real-time reporting on Power BI dashboards. I now want to expand this to additional tenants. From what I have gathered, this seems to be possible with dedicated Azure Stream Analytics clusters, but I don't understand how ingestion into the clusters would occur. Would I need a load balancer between my IoT Hub and the Stream Analytics jobs? Or is there a better way I could achieve this?

Related

What value should we put for the 'Devices' attribute when calculating Azure Stream Analytics pricing?

I need to calculate pricing for Azure Stream Analytics and I'm confused by the field 'Devices' under 'Stream Analytics on IoT Edge'. Is it important for estimating pricing? If yes, how should I fill it in?
Are you planning to run Azure Stream Analytics jobs on edge devices?
Note that this is not running Azure Stream Analytics jobs on edge device data, but actually deploying jobs to a device so it can run the query locally.
For most use cases the answer to the first question is no, and 0 should be put in that box.

Azure: how to get events shown in the CLI from IoT Hub to a database

I am having some issues actually retrieving and using the data I send to the IoT Hub in Azure. When I run 'az iot hub monitor-events --hub-name <hub-name>' in the CLI I can see my events, and I can also send messages to my devices in the IoT Hub.
I have then tried to create a stream to forward the messages to my SQL database, but without any luck. Do you have any suggestions on how to retrieve this data?
There are multiple ways to go about this. The two most common scenarios are probably using an Azure Function or using a Stream Analytics job. I don't know what you've tried up until this point, but a Stream Analytics job is probably the easiest way to go.
Stream Analytics
This answer on SO could be what you're looking for; it also links to this tutorial that you could follow from "Create a new Azure SQL Database" onwards. It covers creating an IoT Hub input and an Azure SQL output on your Stream Analytics job, and using a simple query to link the two together. There is more info in the Microsoft docs here.
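For reference, a minimal version of that query might look like the sketch below. The input and output alias names (IoTHubInput, SqlDbOutput) and the payload fields (deviceId, temperature) are assumptions; you define the aliases yourself when you configure the job's input and output.

```sql
-- Minimal pass-through query: read telemetry from the IoT Hub input
-- and write selected fields to the Azure SQL Database output.
-- 'IoTHubInput' and 'SqlDbOutput' are hypothetical alias names defined
-- on the job; 'deviceId' and 'temperature' stand in for your payload fields.
SELECT
    deviceId,
    temperature,
    EventEnqueuedUtcTime AS eventTime
INTO
    SqlDbOutput
FROM
    IoTHubInput
```

Note that the columns in the SELECT list have to match the columns of the target SQL table, otherwise the output will fail to write.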
Azure Function
While looking this one up I found this answer, which is mine, awkward. But it describes how you can go about creating an Azure Function that accepts IoT Hub messages and shoots them to your database. For a few devices, this option is a lot more cost-efficient (or even free, if you use the consumption plan for the Function).

Getting Azure VM event logs into Event Hubs

We are currently investigating methods for getting our security log data out of our Azure VMs and into our SIEM for analysis.
So far I have been able to get the logs from the VMs into a Log Analytics workspace, but I'm not sure how to get them from the Log Analytics workspace to the event hub so I can then pull down the events.
Has anyone faced a similar challenge before, and how did you overcome it?
I'm currently pulling the data into a Log Analytics workspace.
Welcome to Stack Overflow!
Azure diagnostic logs can be streamed in near real time to any application using the built-in "Export to Event Hubs" option in the Azure portal, or by setting the Event Hub Authorization Rule ID in a diagnostic setting via the Azure PowerShell cmdlets or the Azure CLI.
What you can do with diagnostics logs and Event Hubs:
Here are just a few ways you might use the streaming capability for Diagnostic Logs:
Stream logs to 3rd party logging and telemetry systems – You can stream all of your diagnostic logs to a single event hub to pipe log data to a third-party SIEM or log analytics tool.
View service health by streaming "hot path" data to Power BI – Using Event Hubs, Stream Analytics, and Power BI, you can easily transform your diagnostics data into near real-time insights on your Azure services (see the example query after this list).
Build a custom telemetry and logging platform – If you already have a custom-built telemetry platform or are just thinking about building one, the highly scalable publish-subscribe nature of Event Hubs allows you to flexibly ingest diagnostic logs.
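As a sketch of that second bullet, a Stream Analytics query feeding Power BI from a diagnostics event hub could be as simple as the following; the input/output aliases and the Category field are assumptions based on the typical shape of Azure diagnostic logs.

```sql
-- Count diagnostic log events per category over 5-minute tumbling windows
-- and push the rolling counts to a Power BI output for a live dashboard.
-- 'DiagnosticsEventHubInput' and 'PowerBiOutput' are hypothetical aliases.
SELECT
    Category,
    COUNT(*) AS EventCount,
    System.Timestamp() AS WindowEnd
INTO
    PowerBiOutput
FROM
    DiagnosticsEventHubInput
GROUP BY
    Category,
    TumblingWindow(minute, 5)
```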
Once the data arrives in the event hub, you can access and read it as follows:
Configure a supported SIEM tool. To read data from the event hub, most tools require the event hub connection string and certain permissions on your Azure subscription. This includes third-party tools with built-in Azure Monitor integration.
For more details, refer to "Stream Azure Diagnostic Logs to an event hub" and "How to integrate Azure Monitor with SIEM tools".
Hope this helps.
You can't pull the VM data from Log Analytics to an event hub; instead, you can use the Windows/Linux diagnostic extensions to route data to an event hub.
See "Stream Azure monitoring data to an event hub for consumption by an external tool".

Using partitionId or partitionKey with Azure IoT Hub

We are developing an application where IoT devices will publish events to Azure IoT Hub using the MQTT protocol (using one topic to push messages). We want to consume these messages using the Stream Analytics service, and to scale Stream Analytics it is recommended to use the PARTITION BY clause.
Since we are not using the Azure Event Hubs SDK, can we somehow attach a partitionId to events?
Thanks in advance.
As Rita mentioned in the comments, Event Hub will automatically associate each device with a particular partition.
You can then use PARTITION BY PartitionId for the steps closest to the input to efficiently parallelize processing of the input and reduce/aggregate the data.
After that, you can have another, non-partitioned step that outputs some aggregate data to SQL.
Doing that, you will be able to assign more than 6 SUs, even with an output to SQL.
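To illustrate the partitioned-then-global pattern described above, a sketch might look like the following; all input/output names and fields are hypothetical, and the explicit FROM ... PARTITION BY syntax is the pattern this answer describes for jobs below compatibility level 1.2.

```sql
-- Step 1: partitioned aggregation close to the input; each partition is
-- processed in parallel, so this step can scale beyond 6 SUs.
WITH PerPartitionCounts AS (
    SELECT
        PartitionId,
        COUNT(*) AS eventCount
    FROM
        IoTHubInput
    PARTITION BY PartitionId
    GROUP BY
        PartitionId,
        TumblingWindow(minute, 1)
)
-- Step 2: non-partitioned step that combines the per-partition partial
-- counts into one global result and writes it to the SQL output.
SELECT
    System.Timestamp() AS windowEnd,
    SUM(eventCount) AS totalEvents
INTO
    SqlOutput
FROM
    PerPartitionCounts
GROUP BY
    TumblingWindow(minute, 1)
```

Only the final combining step runs non-partitioned; because the heavy lifting happens in the partitioned step, the job as a whole can be assigned more than 6 SUs.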
We will update our documentation to give more info about scaling ASA jobs and describe the different possible scenarios.
Thanks,
JS - Azure Stream Analytics

Azure Stream Analytics: how to achieve high availability?

I'm evaluating Azure Stream Analytics (ASA) for a mission-critical event data processing application. The SLA for ASA is 99.9%, which is not sufficient for my use case. Has anyone developed a good strategy for ensuring high availability with ASA?
