Is it possible to integrate ArcGIS with Azure Data Factory or Azure Event Hubs? - azure

I have a requirement to integrate an ArcGIS application with Azure, including bringing location-based data collected from the ArcGIS application into a data warehouse. Is it possible to use Azure Data Factory pipelines or Azure Event Hubs to extract the data from ArcGIS?

Unfortunately, Azure Data Factory does not yet have a connector to integrate with ArcGIS. Please visit the ADF Connectors Overview to learn more about the supported data stores.
As for Azure Event Hubs integration with ArcGIS: per this official document, Azure Event Hubs can be used as a central message hub for bi-directional communication between ArcGIS Velocity and your IoT devices.
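For illustration, here is a minimal Python sketch of the producer side of that pattern: pushing location records (e.g., exported from an ArcGIS feature service) into Event Hubs with the azure-eventhub SDK, where a feed such as ArcGIS Velocity or another consumer can pick them up. The connection string, hub name, and record shape are placeholders I made up, not part of any official ArcGIS integration.

```python
# A minimal sketch (not an official ArcGIS integration): publishing
# location records into Azure Event Hubs with the azure-eventhub SDK.
# The connection string, hub name, and feature shape are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

EVENTHUB_CONN_STR = "<event-hubs-namespace-connection-string>"
EVENTHUB_NAME = "gis-telemetry"  # hypothetical hub name

def publish_features(features):
    """Send a batch of GeoJSON-like feature dicts to Event Hubs."""
    producer = EventHubProducerClient.from_connection_string(
        EVENTHUB_CONN_STR, eventhub_name=EVENTHUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        for feature in features:
            batch.add(EventData(json.dumps(feature)))
        producer.send_batch(batch)

publish_features([
    {"id": 1, "lat": 37.5665, "lon": 126.9780, "status": "active"},
])
```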

Related

Is there any possibility to transfer data from Azure Data Lake Gen2 to Azure Event Hubs by using Azure Data Factory?

Is there any possibility to transfer data from Azure Data Lake Gen2 to Azure Event Hubs by using Azure Data Factory? Are there alternative ways to preserve the same folder structure in Event Hubs once the data is transferred from the Data Lake?
Azure Data Factory supports Azure Data Lake Gen2, but it does not currently support Azure Event Hubs.
Please see Azure Data Factory connector overview.
Hope this helps.
There is no direct connection to Event Hubs, but you can use this service to see which I/O endpoints are available, and use the I/O tree to see how you can connect multiple services together.
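Since ADF has no Event Hubs sink, another workaround is a small custom job that reads files from Data Lake Gen2 and publishes them to Event Hubs yourself. Below is a minimal Python sketch, assuming the azure-storage-file-datalake and azure-eventhub SDKs and placeholder account, filesystem, and hub names. Event Hubs has no folders, so the original path is carried as an event property as one way to "preserve" the folder structure:

```python
# A minimal sketch of a custom ADLS Gen2 -> Event Hubs transfer.
# Connection strings, filesystem, directory, and hub names are placeholders.
from azure.eventhub import EventHubProducerClient, EventData
from azure.storage.filedatalake import DataLakeServiceClient

LAKE_CONN_STR = "<adls-gen2-connection-string>"
EH_CONN_STR = "<event-hubs-connection-string>"

lake = DataLakeServiceClient.from_connection_string(LAKE_CONN_STR)
fs = lake.get_file_system_client("raw")            # hypothetical filesystem
producer = EventHubProducerClient.from_connection_string(
    EH_CONN_STR, eventhub_name="lake-export"       # hypothetical hub
)

with producer:
    # get_paths lists recursively by default.
    for path in fs.get_paths(path="landing"):      # hypothetical directory
        if path.is_directory:
            continue
        content = fs.get_file_client(path.name).download_file().readall()
        event = EventData(content)
        # Event Hubs has no folders; keep the source path as metadata so
        # downstream consumers can reconstruct the folder structure.
        event.properties = {"source_path": path.name}
        batch = producer.create_batch()
        batch.add(event)
        producer.send_batch(batch)
```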

How to perform event-based data ingestion using Azure Data Lake Storage Gen2 and Azure Data Factory V2?

Recently we came across a scenario where our source and sink locations are both of ADLS Gen2 type. We have an interesting use case wherein we have to push data from source to sink with the help of ADF V2. That said, it's not just a normal copy activity we are expecting; we need to perform this activity on an event basis.
While going through the ADLS Gen2 documentation, we found that ADLS Gen2 does not yet support Azure Event Grid, which is why, although we are able to configure ADF's event-based triggers, they do not fire.
Can anyone suggest how to tackle this situation? Since Azure Event Grid is not supported at this point in time, we don't believe we can achieve this with Azure Event Hubs and their integration with ADF.
Thanks.
From my repro, event-based triggers are currently supported only on v2 storage accounts.
Data Factory is now integrated with Azure Event Grid, which lets you trigger pipelines on an event.
Note: This integration supports only version 2 Storage accounts (General purpose).
Azure Event Grid doesn't receive events from Azure Data Lake Gen2 accounts because those accounts don't yet generate them.
For more details, refer to “Known issues with Azure Data Lake Storage Gen2”.
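Until ADLS Gen2 emits Event Grid events, one workaround is to start the pipeline run yourself from whatever process lands the file (or from a scheduled job). Here is a minimal sketch using azure-identity and azure-mgmt-datafactory; the subscription, resource group, factory, pipeline, and parameter names are all placeholders:

```python
# A minimal sketch: starting an ADF V2 pipeline run programmatically as a
# stand-in for the missing ADLS Gen2 event trigger. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
run = adf.pipelines.create_run(
    resource_group_name="rg-ingest",        # hypothetical
    factory_name="adf-ingest",              # hypothetical
    pipeline_name="CopyGen2ToGen2",         # hypothetical
    parameters={"sourcePath": "landing/file.json"},  # hypothetical parameter
)
print("Started pipeline run:", run.run_id)
```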

Logic Apps (Event Hub to ADLS)

I'm currently working on a gig to build a Logic App for one of our main clients.
They (A) currently have Azure Backup data streaming into an Event Hub. In the directory there is a Logic App that is collecting the data from A's Event Hub.
They have asked us to move the data from the Event Hub in the Logic App to an ADLS store. Does anyone know what types of connectors to use within the Logic App to move data from Event Hub to Data Lake?
Thank you!
Does anyone know what types of connectors to use within the Logic App to move data from Event Hub to Data Lake?
You could use the Event Hubs trigger and the Azure Data Lake action to do that in the Logic App. But you could also use Azure Data Lake Storage Gen1 to capture data from Event Hubs directly (Event Hubs Capture).
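Outside of Logic Apps, the same flow can be sketched in code: an Event Hubs consumer that writes each message into Data Lake Store Gen1. A minimal Python sketch using the azure-eventhub and azure-datalake-store SDKs; credentials, the store name, the hub name, and the path layout are placeholders, and checkpointing is omitted for brevity:

```python
# A minimal sketch: consume Event Hub messages and land them in ADLS Gen1.
# All credentials and names below are placeholders.
from azure.datalake.store import core, lib
from azure.eventhub import EventHubConsumerClient

# Authenticate to ADLS Gen1 with a service principal.
token = lib.auth(tenant_id="<tenant-id>",
                 client_id="<app-id>",
                 client_secret="<secret>")
adl = core.AzureDLFileSystem(token, store_name="<adls-gen1-store>")

consumer = EventHubConsumerClient.from_connection_string(
    "<event-hubs-connection-string>",
    consumer_group="$Default",
    eventhub_name="backup-events",   # hypothetical hub
)

def on_event(partition_context, event):
    # One file per event, one folder per partition (the layout is a choice).
    path = (f"/backup/partition={partition_context.partition_id}/"
            f"{event.sequence_number}.json")
    with adl.open(path, "wb") as f:
        f.write(event.body_as_str().encode("utf-8"))

with consumer:
    # Checkpointing omitted for brevity; use a checkpoint store in production.
    consumer.receive(on_event=on_event, starting_position="-1")
```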

How to trigger a pipeline in Azure Data Factory V2 or an Azure Databricks Notebook on a new file in Azure Data Lake Store Gen1

I am using an Azure Data Lake Store Gen1 for storing JSON files. Based on these files, I have notebooks in Azure Databricks for processing them. Now I want to trigger such an Azure Databricks notebook when a new file is created in Azure Data Lake Store Gen1. I couldn't find any trigger which could do this. Do you know of any way?
Currently, this is not yet implemented/supported by Microsoft, but I believe it is on their roadmap.
You can do this in two ways:
Azure Functions (through Event Grid)
Logic Apps
Option #1
Microsoft is currently building support for #1.
You can track the issue here.
As per this:
"This feature is not a high priority for us right now, but I will note that the announcement for Azure Event Grid listed Data Lake as one of the integrations they are building. Once you can subscribe to Data Lake updates through Event Grid, running an Azure Function would be trivial (see here for some info)."
You can add your vote to support an Event Grid provider for Data Lake.
Option #2
This is also not yet implemented, but you can upvote the feature request here to support it.
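For reference, once an event does reach you (via Event Grid and an Azure Function, or any other handler), kicking off the Databricks notebook is straightforward through the Databricks Jobs REST API (POST /api/2.1/jobs/run-now). A minimal sketch, assuming the notebook is wrapped in a Databricks job; the workspace URL, token, job ID, and parameter name are placeholders:

```python
# A minimal sketch: trigger a Databricks job (wrapping the notebook) for a
# newly created file. Host, token, job ID, and parameter are placeholders.
import requests

DATABRICKS_HOST = "https://<workspace>.azuredatabricks.net"
DATABRICKS_TOKEN = "<personal-access-token>"
JOB_ID = 123  # hypothetical job that runs the notebook

def run_notebook_for_file(file_path: str) -> int:
    """Start a run of the job, passing the new file's path to the notebook."""
    resp = requests.post(
        f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
        headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
        json={"job_id": JOB_ID,
              "notebook_params": {"inputFile": file_path}},
    )
    resp.raise_for_status()
    return resp.json()["run_id"]

# An Event Grid-triggered Azure Function would call this from its handler.
print(run_notebook_for_file("/clickstream/2023/part-0001.json"))
```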

Azure IoT Hub - How to use Table Storage in IoT Hub?

First of all, sorry for my English skills.
I'm a high school student from South Korea who's doing a project with Azure IoT Hub.
I am working on a project where a Raspberry Pi device sends values to an Azure IoT Hub. I would like to save this data in Azure Table Storage, as it will be used by some other services (an Azure Web App, for example).
So I tried to save the Raspberry Pi values in Azure Table Storage.
But when I add endpoints to the IoT Hub, I can only use a Blob Storage container.
Of course, I still don't understand IoT Hub very well, so please bear with me.
In a nutshell:
I want to send Raspberry Pi values to Azure Table Storage, not Blob Storage; however, the only option available to me when setting endpoints for Azure IoT Hub is Blob Storage.
How do I send values to Table Storage via Azure IoT Hub?
Or, by any chance, is my understanding of Azure completely wrong?
You can use either Azure Functions or Azure Stream Analytics to push IoT Hub data to Azure Table Storage. I found Stream Analytics worked best for me, as I was better able to format the data.
A simple way to store D2C (device-to-cloud) messages in Azure Table Storage is to use an Azure Function with an Event Hub trigger.
More details about the function and its integration with Azure IoT Hub can be found here.
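As a standalone illustration of the same idea (without the Functions wiring), here is a minimal Python sketch that reads D2C messages from the IoT Hub's built-in Event Hub-compatible endpoint and upserts each one into Table Storage. The connection strings, the table name, and the deviceId/temperature fields in the message body are assumptions for illustration:

```python
# A minimal sketch: IoT Hub D2C messages -> Azure Table Storage, using the
# IoT Hub's Event Hub-compatible endpoint. All names are placeholders.
import json
from azure.data.tables import TableServiceClient
from azure.eventhub import EventHubConsumerClient

IOTHUB_EH_CONN_STR = "<event-hub-compatible-endpoint-connection-string>"
TABLE_CONN_STR = "<storage-account-connection-string>"

# Create (or reuse) the destination table.
tables = TableServiceClient.from_connection_string(TABLE_CONN_STR)
table = tables.create_table_if_not_exists(table_name="Telemetry")

def on_event(partition_context, event):
    msg = json.loads(event.body_as_str())
    table.upsert_entity({
        "PartitionKey": msg.get("deviceId", "unknown"),  # assumed field
        "RowKey": str(event.sequence_number),            # unique per partition
        "temperature": msg.get("temperature"),           # assumed field
    })

consumer = EventHubConsumerClient.from_connection_string(
    IOTHUB_EH_CONN_STR, consumer_group="$Default"
)
with consumer:
    consumer.receive(on_event=on_event, starting_position="-1")
```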
IoT Hub natively supports routing messages to Azure Storage as blobs. Refer to Save IoT hub messages that contain sensor data to your Azure blob storage.
There does, however, seem to be a typo in the doc where it lists 'table storage' instead of 'blob storage'. We'll get those typos corrected.
