Does stopping an Azure Stream Analytics job stop the billing?

I need to know: does stopping the Azure Stream Analytics service stop the billing?

As per the answer from MSFT: For Azure Stream Analytics, there is no charge when the job is stopped.
But for Azure Stream Analytics on IoT Edge: Billing starts when an ASA job is deployed to devices, no matter what the job status is (running/failed/stopped).

Welcome to Stack Overflow!
Note: There are no charges for stopped jobs. Billing is based on streaming units in the cloud and on deployed jobs/devices on the edge.
Detailed explanation:
As a cloud service, Stream Analytics is optimized for cost. There are no upfront costs involved - you only pay for the streaming units you consume, and the amount of data processed. There is no commitment or cluster provisioning required, and you can scale the job up or down based on your business needs.
When creating a Stream Analytics job, if you choose streaming units = 1, it will be billed at $0.11/hour.
Pricing:
Azure Stream Analytics on Cloud: a Stream Analytics job with N streaming units is billed at $0.11 × N/hour.
Azure Stream Analytics on Edge: Azure Stream Analytics on IoT Edge is priced by the number of jobs deployed on a device. For instance, if you have two devices, and the first device has one job while the second has two jobs, your monthly charge will be (1 job)(1 device)($1/job/device) + (2 jobs)(1 device)($1/job/device) = $1 + $2 = $3 per month.
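The two formulas above can be sketched as a small calculation, using the rates quoted in this answer ($0.11 per streaming unit per hour on the cloud, $1 per job per device per month on IoT Edge; check the Azure pricing page for current rates in your region):

```python
CLOUD_RATE_PER_SU_HOUR = 0.11   # USD per streaming unit per hour (rate quoted above)
EDGE_RATE_PER_JOB_DEVICE = 1.0  # USD per job per device per month (rate quoted above)

def cloud_monthly_cost(streaming_units: int, hours: float = 730) -> float:
    """Cost of a cloud ASA job running for `hours` in a month."""
    return streaming_units * hours * CLOUD_RATE_PER_SU_HOUR

def edge_monthly_cost(jobs_per_device: list[int]) -> float:
    """Cost of edge jobs: one list entry per device, value = jobs deployed on it."""
    return sum(jobs * EDGE_RATE_PER_JOB_DEVICE for jobs in jobs_per_device)

print(cloud_monthly_cost(1))      # 1 SU started for a full 730-hour month: ~$80.30
print(edge_monthly_cost([1, 2]))  # the two-device example above: $3
```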
Hope this helps. If you have any further query do let us know.

Related

what value should we put for Devices attribute while calculating azure stream analytics pricing

I need to calculate pricing for Azure Stream Analytics and I'm confused by the field 'Devices' under 'Stream Analytics on IoT Edge'. Is it important for estimating pricing? If yes, how should I fill it in?
Are you planning to run Azure Stream Analytics jobs on edge devices?
Note that this is not about running Azure Stream Analytics jobs on edge device data, but about actually deploying jobs on a device so it can run the query locally.
For most use cases the answer to the first question is no, and 0 should be put in that box.

Azure Stream Analytics. Pricing

I am looking at pricing information for Azure Stream Analytics using the Azure Pricing Calculator: https://azure.microsoft.com/en-us/pricing/calculator/#stream-analytics378d63b3-303d-4548-afd0-bac5d4249611
It says that it costs 80 USD for 1 month (730 hours).
Is this the idle price?
If I don't send any data to Stream Analytics, I will be charged 80 USD a month for having the service deployed. Correct?
Then on top of that I pay for my actual usage in the form of streaming units, where a streaming unit, according to the FAQ, is a blend of compute, memory, and throughput.
As a cloud service, Stream Analytics is optimized for cost. There are no upfront costs involved - you only pay for the streaming units you consume, and the amount of data processed. There is no commitment or cluster provisioning required, and you can scale the job up or down based on your business needs.
When creating a Stream Analytics job, if you choose streaming units = 1, it will be billed at $0.11/hour.
A Stream Analytics job with N streaming units is billed at $0.11 × N/hour.
Hope this helps. If you have any further query do let us know.
No. You will pay only for what you use. For example, if you configure 1 streaming unit and use it for 10 hours per month, you pay 10 × $0.11. If you configure 2 streaming units and use them for 20 hours per month, you pay 20 × $0.11 × 2.
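A minimal sketch of that pay-per-use arithmetic, assuming the $0.11/SU/hour rate quoted in this thread:

```python
RATE_PER_SU_HOUR = 0.11  # USD per streaming unit per hour (rate quoted above)

def usage_cost(streaming_units: int, hours_used: float) -> float:
    # Billed only for the hours the job is actually started/running.
    return hours_used * RATE_PER_SU_HOUR * streaming_units

print(usage_cost(1, 10))  # 1 SU for 10 hours: ~$1.10
print(usage_cost(2, 20))  # 2 SUs for 20 hours: ~$4.40
```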
1. If I don't send any data to Stream Analytics, I will be charged 80 USD a month for having the service deployed.
Quick answer is yes. Based on this document:
As a cloud service, Stream Analytics is optimized for cost. There are no upfront costs involved - you only pay for the streaming units you consume, and the amount of data processed. There is no commitment or cluster provisioning required, and you can scale the job up or down based on your business needs.
SUs are the basis of an ASA job. Choosing the number of required SUs for a particular job depends on the partition configuration for the inputs and the query that's defined within the job. For more details, please refer to this link.
2. Then on top of that I pay for my actual usage in the form of streaming units, where a streaming unit, according to the FAQ, is a blend of compute, memory, and throughput.
If your data is not updated in real time, or is very small, maybe ASA is not your best choice. You could follow the suggestions in this thread: Azure Stream Analytics job expensive for small data?

Azure Stream Analytics job expensive for small data?

In order to write sensor data from an IoT device to a SQL database in the cloud, I use an Azure Stream Analytics job. The SA job has an IoT Hub input and a SQL database output. The query is trivial; it just sends all data through.
According to the MS price calculator, the cheapest way of accomplishing this (in western Europe) is around 75 euros per month (see screenshot).
Actually, only 1 message per minute is sent through the hub, and the price is fixed per month (regardless of the number of messages). I am surprised by the price for such a trivial task on small data. Would there be a cheaper alternative for such low capacity needs? Perhaps an Azure Function?
If you are not processing the data in real time, then SA is not needed; you could just use an Event Hub to ingest your sensor data and forward it on. There are several options to move data from the Event Hub to SQL. As you mentioned in your question, you could use an Azure Function, or if you want a no-code solution, you could use a Logic App.
https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-azure-event-hubs
https://learn.microsoft.com/en-us/azure/connectors/connectors-create-api-sqlazure
In addition to Ken's answer, the "cold path" can be your solution, where the telemetry data are stored in blob storage by Azure IoT Hub every 720 seconds (the maximum batch frequency).
Using Azure Event Grid on the blob storage, an EventGridTrigger subscriber is triggered, and it can start a streaming process for this batch (or for a group of batches within one hour). After this batch process is done, the ASA job can be stopped.
Note that the ASA job is billed based on active processing time (the time between Start and Stop), so your cost of using an ASA job can drop significantly.
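As a rough illustration of that saving (hypothetical numbers, assuming 1 SU at the $0.11/hour rate quoted earlier in this thread):

```python
RATE_PER_SU_HOUR = 0.11  # USD per streaming unit per hour (rate quoted earlier)
STREAMING_UNITS = 1

# Job left started 24x7 for a 730-hour month:
continuous = 730 * STREAMING_UNITS * RATE_PER_SU_HOUR

# Cold path: job started for ~1 hour of batch processing per day, 30 days:
batched = 30 * 1 * STREAMING_UNITS * RATE_PER_SU_HOUR

print(f"continuous: ${continuous:.2f}/month")  # continuous: $80.30/month
print(f"batched:    ${batched:.2f}/month")     # batched:    $3.30/month
```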

IoT hub event ingestion and data storage

I'm using an Azure stream Analytics job to process the data from an IoT hub. So my data comes from my simulated device to an IoT hub and I have configured the hub as an input to the stream analytics job and a blob storage as an output job.
My question is if I stop the stream analytics job and restart it, do I lose the data between stop and start? Or is the data stored on the IoT hub and when I restart the job and select the start time as from when it stopped, I'll get all the data.
As krishmam said, you can choose the outputStartMode when starting a job. It specifies whether the job should start producing output at a given timestamp or at the point when the job starts.
You won't lose any data. Stream Analytics gives you an option to start a job from the previous stop time when you start the job.

Azure Event Hub to Stream Analytics with Partitions

Azure documentation states that:
Partitions are a data organization mechanism and are more related to the degree of downstream parallelism required in consuming applications than to Event Hubs throughput.
Assuming that the only consumer of the EventHubClient is Azure Stream Analytics, is it relevant to configure a series of Partitions as input to the Stream Analytics job?
For example, if the Stream Analytics job is configured to scale to 6 Streaming Units, will configuring the EventHubClient, that loads the events, to leverage 6 Partitions, effect 6 parallel streams of input?
Or, are Partitions even relevant when the only consuming client is a Stream Analytics job?
The 6 streaming units have nothing to do with the EventHubClient; they relate to the number of partitions you configured in the ASA job.
