I have a bunch of sensors (currently 350) that are sending a total of around 500,000 messages a day to an Azure IoT Hub. The sensors are grouped into differently sized studies, and I need to report on those studies each month.
I've tried to use Stream Analytics but couldn't find a way to dynamically route the messages to their respective locations. I don't want to have to add an individual output for each study.
Can anyone suggest a way to get these messages into Azure Data Lake so that each study's messages are put in their own folder, e.g. \{StudyID}\{Year}\{Month}\[Message Data]?
Sorry for the inconvenience. At this time, you cannot add custom variables to the output folder structure when exporting data from ASA to Azure Data Lake.
We're keeping this request in our backlog for future updates of the product.
You can also add suggestions on our User Voice portal here: https://feedback.azure.com/forums/270577-azure-stream-analytics
Thanks,
JS
I have a device which sends data in the form of hexadecimal values, encoded by an algorithm, to IoT Hub. I want to parse that data into a JSON string to store it in Cosmos DB. Is there any way to achieve this?
I would like to add to Roman's comment on your question, he's right to offer this link to Stream Analytics. It will get the job done. Depending on how many devices you have and how often you are receiving telemetry, you might want to consider using Azure Functions instead. See this sample on how to integrate Azure Functions between IoT Hub and CosmosDB.
The reason I offer this extra solution is that a Stream Analytics Job will cost you a fixed price per hour per streaming unit, while a Function is paid by consumption. Because the conversion from hexadecimal is a fairly small Function, you might even run it for free, whereas a Stream Analytics Job in West Europe will cost at least 74 euros.
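To make the Functions option concrete, here is a minimal sketch assuming the v2 Python programming model. The event hub name, the connection setting names, and the 4-byte payload layout are all hypothetical (you would substitute your own algorithm's decoding), and the Cosmos DB binding parameter names vary by extension version:

    import struct
    import azure.functions as func

    app = func.FunctionApp()

    @app.event_hub_message_trigger(arg_name="event",
                                   event_hub_name="my-iothub-events",      # hypothetical
                                   connection="IoTHubEventHubConnection")  # app setting name
    @app.cosmos_db_output(arg_name="doc",
                          database_name="telemetry",
                          container_name="messages",
                          connection="CosmosDbConnection")
    def hex_to_json(event: func.EventHubEvent, doc: func.Out[func.Document]) -> None:
        raw_hex = event.get_body().decode("utf-8")       # e.g. "00D70164"
        payload = bytes.fromhex(raw_hex)
        # Assumed layout: signed temperature then unsigned humidity,
        # big-endian, in tenths of a unit.
        temperature, humidity = struct.unpack(">hH", payload[:4])
        doc.set(func.Document.from_dict({
            "deviceId": (event.iothub_metadata or {}).get("connection-device-id", "unknown"),
            "temperature": temperature / 10.0,           # 0x00D7 = 215 -> 21.5
            "humidity": humidity / 10.0,                 # 0x0164 = 356 -> 35.6
        }))

On the consumption plan you pay per execution and per GB-second, which is what makes the free grant plausible for a conversion this small.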
I am using Azure Monitor to view diagnostics/logs for my IoTHub. In the metrics available for IoTHub there is deviceDataUsage. As I understand it, this is the total data usage for all of the devices connected to this IoTHub.
Is there a built-in monitoring/logging solution to Azure IoTHub that would allow me to view per device data usage? Or will I need to use a different tool, such as stream analytics, to build my own solution?
Unfortunately there isn't a way to get the data usage for an individual IoT device through the monitoring tab of IoT Hub, nor through a Kusto query.
There is a workaround of sorts, though it would require some level of development on your end: if you are routing the messages to an event hub, you can read directly from there and aggregate on the device-id system property. Information on this can be found here: https://learn.microsoft.com/en-us/azure/event-hubs/event-hubs-event-processor-host#receive-messages. Alternatively, you can include the device-id in the telemetry message being sent and query the messages internally on your end to separate messages with specific device-ids. These are merely suggestions that may or may not fit your business needs, of course.
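If it helps, a minimal sketch of the first workaround with the azure-eventhub Python SDK could look like this; the connection string, event hub name, and consumer group are placeholders:

    from collections import defaultdict
    from azure.eventhub import EventHubConsumerClient

    usage_bytes = defaultdict(int)  # device-id -> cumulative payload bytes

    def on_event(partition_context, event):
        # IoT Hub stamps each routed message with the sending device's id
        # as a system property (keys are bytes in the Python SDK).
        device_id = event.system_properties.get(
            b"iothub-connection-device-id", b"unknown").decode()
        usage_bytes[device_id] += len(event.body_as_str(encoding="utf-8").encode("utf-8"))
        print(dict(usage_bytes))

    client = EventHubConsumerClient.from_connection_string(
        "<event-hub-connection-string>",     # placeholder
        consumer_group="$Default",
        eventhub_name="<event-hub-name>")    # placeholder
    with client:
        client.receive(on_event=on_event, starting_position="-1")  # read from start

Note this counts payload bytes only; IoT Hub's own metering also includes protocol overhead, so the numbers will approximate rather than match deviceDataUsage.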
Finally, you can reach out directly to the assisted-support team and request the device usage on a case-by-case basis.
I am trying to collect data from an IoT device; for now I am using this code to simulate the device: remote_monitoring. It sends data and I can see the data in the dashboard. The next thing is that I want to save the data to a SQL database. I was thinking of using Stream Analytics to do the job. The problem I am having now is that when I select IoT Hub as an input I get the error
Please check if the input source is configured correctly and data is in correct format.
I am trying to find documentation on whether there is something special I need to add to my JSON object before I send it.
IoT Hub is a supported input for Azure Stream Analytics, and there is nothing wrong with using ASA as a "pump" to copy data from IoT Hub or Event Hubs to a store like SQL DB. Many use cases of ASA combine such "archiving" with other functions. The only thing to be careful with is the limited ingress rate of many ASA outputs: SQL DB may not be able to keep up and will throttle ASA, in which case ASA may fall behind beyond the hub's retention window, causing data loss.
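On the message-format question: ASA expects UTF-8 encoded JSON (or CSV/Avro), so the safest approach on the device side is to serialize with json.dumps and stamp the content type and encoding on the message. A minimal sketch with the azure-iot-device Python SDK (the connection string is a placeholder, and the payload fields are just an example):

    import json
    from azure.iot.device import IoTHubDeviceClient, Message

    client = IoTHubDeviceClient.create_from_connection_string(
        "<device-connection-string>")                     # placeholder

    payload = {"deviceId": "sim-1", "temperature": 21.5}  # sample telemetry
    msg = Message(json.dumps(payload))
    msg.content_encoding = "utf-8"                        # ASA reads UTF-8 JSON
    msg.content_type = "application/json"
    client.send_message(msg)
    client.shutdown()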
Trying to use Event Hub instead; I will update my post when I have an answer.
What about this doc https://github.com/Azure/azure-content/blob/master/articles/stream-analytics/stream-analytics-define-inputs.md ? Does it help you?
I have just started learning Azure IoT and it's quite interesting. I am confused about whether IoT Hub stores data somewhere.
For example, suppose I am passing room temperature to IoT Hub and want to store it in a database for further use. How is that possible?
I am clear on how device-to-cloud and cloud-to-device messaging works with IoT Hub.
IoT Hub exposes device-to-cloud messages through an Event Hubs endpoint. Event Hubs has a retention time expressed in days. It's a stream of data that the reading client can re-read multiple times, because the cursor is on the client side (not on the server side as with queues and topics). With IoT Hub the retention time is 1 day by default, but you can change it.
If you want to store the messages received from devices, you need a client reading from the exposed Event Hubs endpoint (for example an Event Processor Host) that has the business logic to process the messages and store them in a database.
Of course you could add another decoupling layer, so that the client reads from Event Hubs and stores the messages into queues. Then another client, at its own pace, reads from the queues and stores into the database. In this way you keep a fast path reading from Event Hubs.
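A minimal sketch of that decoupling, assuming the azure-eventhub and azure-storage-queue Python SDKs (connection strings and the queue name are placeholders); a separate worker, not shown, would drain the queue into the database at its own pace:

    from azure.eventhub import EventHubConsumerClient
    from azure.storage.queue import QueueClient

    eh_client = EventHubConsumerClient.from_connection_string(
        "<iothub-eventhub-compatible-connection-string>",  # placeholder
        consumer_group="$Default",
        eventhub_name="<eventhub-compatible-name>")        # placeholder
    queue = QueueClient.from_connection_string(
        "<storage-connection-string>", "telemetry-in")     # placeholder queue

    def on_event(partition_context, event):
        # Fast path: hand the raw message straight to a queue and move on;
        # the database writer dequeues from "telemetry-in" independently.
        queue.send_message(event.body_as_str(encoding="utf-8"))

    with eh_client:
        eh_client.receive(on_event=on_event, starting_position="-1")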
This is pretty much the use case for all IoT scenarios.
Step 1: High-scale data ingestion via Event Hub.
Step 2: Create and use a stream processing engine (Stream Analytics or HDInsight/Storm). You can run conditions (SQL-like queries) to filter the data and store the appropriate parts in either a cold or a hot store for further analytics.
Step 3: Storage for cold-path analytics can be Azure Blob storage; Stream Analytics can be configured to write the data into it directly. The cold store can contain all the data that doesn't require querying, and it is cheap.
Step 4: Processing for hot-path analytics. This is data that is queried more regularly, or data where real-time analytics needs to be carried out, like, in your case, checking for temperature values going beyond a threshold, which needs an urgent trigger! (A toy sketch of this hot/cold routing decision follows.)
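A toy Python sketch of that routing decision; the threshold and field name are assumptions, and in practice this condition would live in the Stream Analytics query rather than in your own code:

    THRESHOLD_C = 30.0  # hypothetical alert threshold

    def route(reading: dict) -> str:
        """Decide the path for one telemetry reading (field name assumed)."""
        if reading.get("temperature", 0.0) > THRESHOLD_C:
            return "hot"   # real-time alerting / urgent trigger
        return "cold"      # cheap blob storage for later batch analytics

    assert route({"temperature": 31.2}) == "hot"
    assert route({"temperature": 18.9}) == "cold"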
Let me know if you face any challenges while configuring the Stream Analytics job! :)
If you take a look at the IoT Suite remote monitoring preconfigured solution (https://azure.microsoft.com/documentation/articles/iot-suite-remote-monitoring-sample-walkthrough/) you'll see that it persists telemetry in blob storage and maintains device status information in DocumentDb. This preconfigured solution gives you a working illustration of the points made in the previous answers.
I am building a service on Azure and want to know whether there is any way to measure how many resources (data downloaded or uploaded, time required to do the processing) a customer has used in a given session, and what level of service they have used, in order to bill them accordingly. We expose the whole framework as a service; it consists of various small services, like reading data from an external FTP server, downloading it to a blob, reading the downloaded file and storing it in tables, performing some operations on the data in the tables, emailing some results from the service to the user, etc.
So, depending on which services the customer has used, we would like to bill them accordingly.
Thanks!
The only Azure-specific feature that I can think of that will help you with what you want to track is Azure Storage Logging, which will track each and every request to Azure Storage; I'm not sure how much that is going to help you, though.
I think you will have to decide exactly what you want to bill your customers for and then start tracking that yourself. This might be items similar to what MS charges you for (tracking the size of incoming requests, counting the number of transactions, and the size of data stored in Azure Storage), or maybe just some arbitrary values based on some of this information.
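As one hedged sketch of such do-it-yourself metering, using Azure Table Storage via the azure-data-tables package; the table name, connection string, and the operation/unit vocabulary are all hypothetical:

    import uuid
    from datetime import datetime, timezone
    from azure.core.exceptions import ResourceExistsError
    from azure.data.tables import TableClient

    table = TableClient.from_connection_string(
        "<storage-connection-string>", table_name="usagerecords")  # placeholders
    try:
        table.create_table()
    except ResourceExistsError:
        pass  # table already provisioned

    def record_usage(customer_id: str, operation: str, quantity: float, unit: str) -> None:
        # One entity per billable operation; partitioning by customer makes
        # "all usage for customer X" a cheap single-partition query.
        table.create_entity({
            "PartitionKey": customer_id,
            "RowKey": str(uuid.uuid4()),
            "Operation": operation,           # e.g. "ftp-download", "email-sent"
            "Quantity": quantity,
            "Unit": unit,                     # e.g. "bytes", "seconds", "count"
            "RecordedAt": datetime.now(timezone.utc).isoformat(),
        })

    record_usage("customer-42", "blob-upload", 1_048_576, "bytes")

Each of your small services would call record_usage at its billing points, and a monthly job would aggregate by PartitionKey into an invoice.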