Azure IoT Hub - How to use Table Storage in IoT Hub?

First of all, sorry for my English.
I'm a high school student from South Korea working on a project with Azure IoT Hub.
I am working on a project where a Raspberry Pi device sends values to an Azure IoT Hub. I would like to save this data in Azure Table Storage, as it will be used by some other services (an Azure Web App, for example).
So I tried to save the Raspberry Pi values in Azure Table Storage.
But when I add endpoints to the IoT Hub, the only option I can use is a Blob Storage container.
Of course, I still don't fully understand IoT Hub,
so please bear with me.
In a nutshell:
I want to send Raspberry Pi values to Azure Table Storage, not Blob Storage, but the only option available to me when setting endpoints for Azure IoT Hub is Blob Storage.
How do I send values to Table Storage via Azure IoT Hub?
Or, by any chance, is my understanding of Azure completely wrong?

You can use either Azure Functions or Azure Stream Analytics to push IoT Hub data to Azure Table Storage. I found Stream Analytics worked best for me, as I was better able to format the data.

A simple way to store D2C (device-to-cloud) messages in Azure Table Storage is to use an Azure Function with an Event Hub trigger.
More details about the function and its integration with Azure IoT Hub can be found here.
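As a rough illustration of that approach, here is a minimal sketch of such a function in TypeScript (Azure Functions Node.js model v3). The function shape, the output binding name `telemetryTable`, and the message fields (`deviceId`) are assumptions for illustration; the Event Hub trigger and Table Storage output binding would be configured in the accompanying function.json.

```typescript
import { AzureFunction, Context } from "@azure/functions";

// Event Hub-triggered function bound to the IoT Hub's built-in,
// Event Hubs-compatible endpoint (binding configuration assumed in function.json).
const eventHubTrigger: AzureFunction = async function (
  context: Context,
  messages: any[]
): Promise<void> {
  // Each D2C message becomes one Table Storage entity via the assumed
  // "telemetryTable" output binding. PartitionKey/RowKey are required by
  // Table Storage; taking the deviceId from the payload is an assumption.
  context.bindings.telemetryTable = messages.map((message, index) => ({
    PartitionKey: message.deviceId ?? "unknown-device",
    RowKey: `${Date.now()}-${index}`,
    body: JSON.stringify(message),
  }));
};

export default eventHubTrigger;
```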

IoT Hub natively supports routing messages to Azure Storage as blobs. Refer to Save IoT hub messages that contain sensor data to your Azure blob storage.
There does, however, seem to be a typo in the doc where it lists 'table storage' instead of 'blob storage'. We'll get that corrected.

Related

Send event from Azure Blob Storage to Azure IoT Hub

I have a NodeJS app that can successfully send data to the Azure IoT Hub, which in turn sends the data to the Azure Blob Storage.
That all works fine.
I can also manually send a message with the Message to Device tool in the Azure portal, and I can receive this message in my NodeJS app.
What I want to do now is send a confirmation message back to the NodeJS app when the blob has been created or deleted.
Can someone please guide me on how to do that? There is little information out there on sending messages back to the device client.
I see that I can create a subscription to the Azure Blob Storage but I don't know how to hook it up to the Azure IoT Hub.
Cheers
This is possible by using an Azure Function with a Blob Storage trigger. You could write an Azure Function that subscribes to blob storage changes and uses the IoT Hub Service SDK to send a message back to the device. I'm assuming you're using IoT Hub's message routing feature to store the telemetry in blob storage. This comes with a challenge: there is no way to tell from the name of the blob which device it originated from, so you would need to combine it with a blob input binding to read the file content.
Do you absolutely need this confirmation on the device side? On another note, if you're not interested in persisting telemetry but instead want to upload a file from your NodeJS app, you could consider the IoT Hub File Upload feature instead.
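To make the function-plus-service-SDK idea above more concrete, here is a minimal sketch in TypeScript using the azure-iothub service SDK. The device ID, the connection string app setting, and the confirmation payload are assumptions; in practice you would first resolve the device ID from the blob content, as discussed above.

```typescript
import { AzureFunction, Context } from "@azure/functions";
import { Client as ServiceClient } from "azure-iothub";
import { Message } from "azure-iot-common";

// Blob-triggered function (the blob trigger binding is assumed in function.json).
const blobTrigger: AzureFunction = async function (
  context: Context,
  blob: Buffer // content of the blob that was just created
): Promise<void> {
  // Assumption: the device ID has been derived somehow (e.g. parsed from the
  // blob content), since the blob name alone does not identify the device.
  const deviceId = "my-nodejs-device";

  const serviceClient = ServiceClient.fromConnectionString(
    process.env.IOTHUB_SERVICE_CONNECTION_STRING! // assumed app setting
  );

  // Cloud-to-device confirmation message; the device receives it on its C2D endpoint.
  const confirmation = new Message(
    JSON.stringify({ event: "blobCreated", blobName: context.bindingData.name, size: blob.length })
  );

  await serviceClient.open();
  await serviceClient.send(deviceId, confirmation);
  await serviceClient.close();
};

export default blobTrigger;
```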
You can also use device twins to represent state shared between the device and service sides.
When you update that state (the device twin's desired properties) on the service side, a notification message is delivered to the device side.
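A small sketch of that twin-based approach in TypeScript with the azure-iothub Registry (the device ID and the `blobStatus` desired property are assumptions; the device would subscribe to desired-property change notifications):

```typescript
import { Registry } from "azure-iothub";

// Service side: patch a desired property on the device twin. The device,
// having subscribed to desired-property updates, is notified of the change.
async function notifyDeviceViaTwin(): Promise<void> {
  const registry = Registry.fromConnectionString(
    process.env.IOTHUB_SERVICE_CONNECTION_STRING! // assumed app setting
  );

  const patch = {
    properties: {
      desired: {
        // Assumed property shape for illustration.
        blobStatus: { lastBlob: "device-telemetry.json", state: "created" },
      },
    },
  };

  // "*" as the etag applies the patch regardless of the twin's current version.
  await registry.updateTwin("my-nodejs-device", patch, "*");
}
```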

Is it possible to integrate ArcGIS with Azure Data Factory or Azure Event Hubs?

I have a requirement to integrate an ArcGIS application with Azure, including loading location-based data collected from the ArcGIS application into a data warehouse. Are there any possibilities to use Azure Data Factory pipelines or Azure Event Hubs to extract the data from ArcGIS?
Unfortunately, Azure Data Factory does not currently have a connector to integrate with ArcGIS. Please visit the ADF connector overview to learn more about the supported data stores.
As for Azure Event Hubs integration with ArcGIS, per this official document, Azure Event Hubs can be used as a central message hub for bi-directional communication between ArcGIS Velocity and your IoT devices.
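If you do take the Event Hubs route, a minimal producer sketch in TypeScript with @azure/event-hubs might look like the following. The event hub name, the connection string setting, and the shape of the location records pulled from ArcGIS are all assumptions for illustration.

```typescript
import { EventHubProducerClient } from "@azure/event-hubs";

// Push a batch of location records (however you extract them from ArcGIS)
// into an event hub, from where downstream services can pick them up.
async function pushLocationsToEventHub(
  locations: Array<{ assetId: string; lat: number; lon: number }>
): Promise<void> {
  const producer = new EventHubProducerClient(
    process.env.EVENTHUB_CONNECTION_STRING!, // assumed
    "arcgis-locations"                       // assumed event hub name
  );

  const batch = await producer.createBatch();
  for (const location of locations) {
    // tryAdd returns false when the batch is full; a fuller implementation
    // would send the batch and start a new one at that point.
    batch.tryAdd({ body: location });
  }

  await producer.sendBatch(batch);
  await producer.close();
}
```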

How to display log messages from azure iot device client code

I see log messages in azure iot device client source code like this:
log.debug("Connection already opened by TransportClient."); or
log.info("Device client opened successfully");
My question is: where are these log messages going? How can I get those messages for debugging purposes?
Thanks
In general, Blob Storage is added as a 'logging endpoint', which encompasses a storage account, a container in the account, and blobs in the container. Block blobs are used for storing text and binary data.
All logs are stored in block blobs in a container named $logs, which is automatically created when Storage Analytics is enabled for a storage account. The $logs container is located in the blob namespace of the storage account, for example: http://<accountname>.blob.core.windows.net/$logs
To view and analyze your log data, you should download the blobs that contain the log data you are interested in to a local machine. There are many tools, such as AzCopy, the Azure Storage Data Movement library, and the Azure Import/Export service, to import or export data to and from your storage account. To view the logs, you can also use any tool that can access Azure Blob Storage, such as Visual Studio or Cerebrata Azure Management Studio.
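You can also enumerate and download the $logs blobs programmatically; here is a small sketch with @azure/storage-blob (the storage connection string setting is an assumption):

```typescript
import { BlobServiceClient } from "@azure/storage-blob";

// List and download the Storage Analytics log blobs from the $logs container.
async function downloadAnalyticsLogs(): Promise<void> {
  const service = BlobServiceClient.fromConnectionString(
    process.env.STORAGE_CONNECTION_STRING! // assumed
  );
  const logs = service.getContainerClient("$logs");

  for await (const blob of logs.listBlobsFlat()) {
    // Blob names encode the service and date, e.g. blob/2024/01/01/0000/000000.log
    const content = await logs.getBlobClient(blob.name).downloadToBuffer();
    console.log(blob.name, content.length, "bytes");
  }
}
```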
In the case of the azure-iot-sdk, each IoT hub exposes a set of endpoints (service endpoints) for the solution's back end to communicate with the devices. An IoT hub has a default built-in endpoint (messages/events). By default, messages are routed to this built-in service-facing endpoint, which is compatible with Event Hubs. You can refer to the link below for the various methods of reading from the built-in endpoint: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-builtin
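For example, reading D2C messages from the built-in, Event Hubs-compatible endpoint with @azure/event-hubs could look roughly like this (the connection string setting and the $Default consumer group are assumptions):

```typescript
import { EventHubConsumerClient } from "@azure/event-hubs";

// The Event Hub-compatible connection string is shown on the IoT Hub's
// "Built-in endpoints" blade (assumed to be supplied via an environment variable).
const consumer = new EventHubConsumerClient(
  "$Default",
  process.env.IOTHUB_EVENTHUB_CONNECTION_STRING! // assumed
);

consumer.subscribe({
  processEvents: async (events) => {
    for (const event of events) {
      console.log("device message:", event.body, event.systemProperties);
    }
  },
  processError: async (err) => {
    console.error("error reading built-in endpoint:", err.message);
  },
});
```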
You can also create custom endpoints to route messages to, by linking other services in your subscription to the IoT Hub. When custom endpoints are created, a message is routed to every endpoint whose routing query it matches. There are two storage services IoT Hub can route messages to: Azure Blob Storage and Azure Data Lake Storage (ADLS) Gen2 accounts. You can refer to the link below for the various methods of reading from custom endpoints: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-messages-read-custom
As for logs from the IoT SDK itself, they are written to stdout or stderr depending on the type of log and the deployment environment, and they can be redirected as required. The Node.js SDK uses the debug library for detailed logs. The link below has additional details: https://github.com/Azure/azure-iot-sdk-node/wiki/Troubleshooting-Guide-Devices
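For instance, with azure-iot-sdk-node the detailed SDK logs can be switched on via the DEBUG environment variable used by the debug package; a minimal sketch follows (the connection string variable and the exact DEBUG namespace pattern are assumptions; the SDK's troubleshooting guide lists the actual namespaces):

```typescript
// Run with something like:
//   DEBUG=azure* node dist/app.js
// so that the SDK's internal log.debug/log.info output (emitted through the
// `debug` package) is printed to stderr alongside your own logging.
import { Client } from "azure-iot-device";
import { Mqtt } from "azure-iot-device-mqtt";

const connectionString = process.env.DEVICE_CONNECTION_STRING!; // assumed env var
const client = Client.fromConnectionString(connectionString, Mqtt);

client.open((err) => {
  if (err) {
    console.error("Could not open the device client:", err.message);
  } else {
    console.log("Device client opened successfully");
  }
});
```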

Logic Apps (Event Hub to ADLS)

I'm currently working on a gig to build a Logic App for one of our main clients.
They (A) currently have Azure Backup data streaming into an Event Hub. In the directory there is a Logic App that's collecting the data from A's Event Hub.
They have asked us to move the data from the Event Hub in the Logic App to an ADLS store. Does anyone know what types of connectors to use within the Logic App to move data from Event Hubs to Data Lake?
Thank you!
Anyone know what types of connectors to use within the logic app to move data from event hub to data lake ?
We could use the Event Hubs trigger and the Azure Data Lake action to do that in the Logic App. For more information, please refer to the screenshot. But we could also use Azure Data Lake Storage Gen1 to capture data from Event Hubs directly.

Connect Stream Analytics to Azure Postgres - Possible?

I'm getting error messages. It seems this is not possible?
The message is:
The JSON provided in the request body is invalid. Property 'server' value 'postgres-mydatabase123.postgres.database.azure.com' is not acceptable.
According to the documentation this is not possible, as the currently supported output sinks are:
Azure Data Lake Store
SQL Database
Blob storage
Event Hub
Power BI
Table Storage
Service Bus Queues
Service Bus Topics
Azure Cosmos DB
Azure Functions (In Preview)
Out of all of these, Azure Functions or Event Hubs might be interesting to you, as they allow custom code to process the data. In your case, that would be sending it to the PostgreSQL database.
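As a rough illustration of the Azure Functions route in TypeScript: Stream Analytics can call an HTTP-triggered function as an output, posting a JSON array of events, which the function then writes to PostgreSQL with the pg driver. The table name, column names, event fields, and the connection string setting are assumptions.

```typescript
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import { Pool } from "pg";

// Connection pool for the Azure Database for PostgreSQL server (assumed setting).
const pool = new Pool({ connectionString: process.env.POSTGRES_CONNECTION_STRING });

// Stream Analytics posts a JSON array of output events to this function.
const httpTrigger: AzureFunction = async function (
  context: Context,
  req: HttpRequest
): Promise<void> {
  const events: Array<{ deviceId: string; temperature: number }> = req.body ?? [];

  for (const event of events) {
    await pool.query(
      "INSERT INTO telemetry (device_id, temperature) VALUES ($1, $2)", // assumed table
      [event.deviceId, event.temperature]
    );
  }

  context.res = { status: 200 };
};

export default httpTrigger;
```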
