Connect Azure Event Hubs with Data Lake Store

What is the best way to send data from Event Hubs to Data Lake Store?

I am assuming you want to ingest data from Event Hubs into Data Lake Store on a regular basis. Like Nava said, you can use Azure Stream Analytics to get data from Event Hubs into Azure Storage blobs. Thereafter you can use Azure Data Factory (ADF) to copy data on a scheduled basis from the blobs to Azure Data Lake Store. More details on using ADF are available here: https://azure.microsoft.com/en-us/documentation/articles/data-factory-azure-datalake-connector/. Hope this helps.
==
March 17, 2016 update.
Support for Azure Data Lake Store as an output for Azure Stream Analytics is now available. https://blogs.msdn.microsoft.com/streamanalytics/2016/03/14/integration-with-azure-data-lake-store/ . This will be the best option for your scenario.
Sachin Sheth
Program Manager, Azure Data Lake

In addition to Nava's reply: you can query data in a Windows Azure Blob Storage container with ADLA/U-SQL as well. Or you can use the Blob Store to ADL Storage copy service (see https://azure.microsoft.com/en-us/documentation/articles/data-lake-store-copy-data-azure-storage-blob/).

One way would be to write a process that reads messages from the Event Hub using the Event Hubs API and writes them into Data Lake Store using the Data Lake SDK.
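For illustration, here is a minimal sketch of that approach in Python, assuming the azure-eventhub (v5) and azure-datalake-store (Gen1) packages; every name (connection string, store name, paths) is a placeholder, and a real process would add checkpointing, batching and error handling:

    from azure.eventhub import EventHubConsumerClient
    from azure.datalake.store import core, lib

    # Authenticate against the Data Lake Store (Gen1) account with a service principal.
    token = lib.auth(tenant_id="<tenant-id>",
                     client_id="<app-id>",
                     client_secret="<app-secret>")
    adls = core.AzureDLFileSystem(token, store_name="<adls-account-name>")

    def on_event(partition_context, event):
        # One small file per event keeps the sketch simple; a real process would batch.
        path = (f"/eventhub/{partition_context.partition_id}/"
                f"{event.sequence_number}.json")
        with adls.open(path, "wb") as f:
            f.write(event.body_as_str().encode("utf-8"))

    consumer = EventHubConsumerClient.from_connection_string(
        conn_str="<event-hub-namespace-connection-string>",
        consumer_group="$Default",
        eventhub_name="<event-hub-name>")
    with consumer:
        # Blocks and invokes on_event for each message, starting from the beginning.
        consumer.receive(on_event=on_event, starting_position="-1")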
Another alternative would be to use Stream Analytics to get data from Event Hubs into a blob, and Azure Automation to run a PowerShell script that reads the data from the blob and writes it into Data Lake Store.

Not taking credit for this, but sharing with the community:
It is also possible to archive the events (look into properties\archive); this leaves an Avro blob.
Then, using the AvroExtractor, you can convert the records into JSON as described in Anthony's blog:
http://anthonychu.ca/post/event-hubs-archive-azure-data-lake-analytics-usql/

One of the ways would be to connect your Event Hub to Data Lake using the Event Hubs Capture functionality (Data Lake and Blob Storage are currently supported). Event Hubs will write to Data Lake every N minutes or once a data size threshold is reached; this is done to optimize storage "write" operations, as they are expensive at high scale.
The data is stored in Avro format, so if you want to query it using U-SQL you'd have to use an Extractor class. Uri gave a good reference to it: https://anthonychu.ca/post/event-hubs-archive-azure-data-lake-analytics-usql/.
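If you'd rather inspect one of those Capture/Archive Avro blobs from Python instead of U-SQL, a rough sketch might look like the following (this is not the extractor from the linked post; it assumes the azure-storage-blob and fastavro packages, and all account/container/blob names are placeholders):

    import io
    import json
    from azure.storage.blob import BlobClient
    from fastavro import reader

    blob = BlobClient.from_connection_string(
        conn_str="<storage-account-connection-string>",
        container_name="<capture-container>",
        blob_name="<path-to-capture-blob>.avro")
    data = io.BytesIO(blob.download_blob().readall())

    # Capture records carry the payload in a "Body" field plus metadata columns.
    for record in reader(data):
        body = record["Body"].decode("utf-8")  # assumes UTF-8 (e.g. JSON) payloads
        print(json.dumps({"enqueued": str(record["EnqueuedTimeUtc"]), "body": body}))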

Related

Using Azure Data Factory to migrate Salesforce data to Dynamics 365

I'm looking for some advice around using Azure Data Factory to migrate data from Salesforce to Dynamics365.
My research has turned up plenty of articles about moving Salesforce data to sinks such as Azure Data Lake or Blob storage, and also articles that describe moving data from Azure Data Lake or Blob storage into D365.
I haven't found any examples where the source is salesforce and the sink is D365.
Is it possible to do it this way or do I need to copy the SF data to an intermediate sink such as Azure Data Lake or blob storage and then use that as the source of a copy/dataflow to then send to D365?
I will need to perform transformations on the SF data before storing it in D365.
Thanks
I would recommend adding ADLS Gen2 as a staging layer between Salesforce and D365.
I am afraid that a direct copy with D365 as the sink cannot be done.

Is there any possibility to transfer data from Azure data lake gen2 to Azure event hub by using Azure data factory?

Is there any possibility to transfer data from Azure Data Lake Gen2 to Azure Event Hubs by using Azure Data Factory? Is there any alternative way to preserve the same folder structure in the Event Hub once the data is transferred from Data Lake?
Azure Data Factory supports Azure Data Lake Gen2 but doesn't currently support Azure Event Hubs.
Please see the Azure Data Factory connector overview.
Hope this helps.
There is no direct connector to Event Hubs, but you can use this service to see what direct I/O endpoints are available and use the IO tree to see how you can connect multiple services.
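Since ADF has no Event Hubs connector, one workaround is a small custom process. The sketch below (hypothetical resource names, assuming the azure-storage-filedatalake and azure-eventhub packages) copies files from an ADLS Gen2 container into an Event Hub and keeps the original folder path as a custom event property, because Event Hubs itself has no notion of folders:

    from azure.storage.filedatalake import DataLakeServiceClient
    from azure.eventhub import EventHubProducerClient, EventData

    lake = DataLakeServiceClient.from_connection_string("<adls-gen2-connection-string>")
    fs = lake.get_file_system_client(file_system="<container-name>")
    producer = EventHubProducerClient.from_connection_string(
        conn_str="<event-hub-namespace-connection-string>",
        eventhub_name="<event-hub-name>")

    with producer:
        batch = producer.create_batch()
        added_any = False
        for path in fs.get_paths(path="<input-folder>", recursive=True):
            if path.is_directory:
                continue
            content = fs.get_file_client(path.name).download_file().readall()
            event = EventData(content)  # note: individual events are limited to ~1 MB
            # Event Hubs has no folders, so carry the source path as metadata.
            event.properties = {"source_path": path.name}
            try:
                batch.add(event)
            except ValueError:  # batch is full: send it and start a new one
                producer.send_batch(batch)
                batch = producer.create_batch()
                batch.add(event)
            added_any = True
        if added_any:
            producer.send_batch(batch)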

How to perform Event based data ingestion using Azure Data Lake Storage Gen2 and Azure Data factory V2?

Recently we came across a scenario where our source and sink locations are of ADLS Gen2 type. Now we have one interesting use case wherein we have to push data from source to sink with the help of ADF V2. Having said that, it's not just a normal copy activity we are expecting; we need to perform this activity on an event basis.
While going through the ADLS Gen2 documents, we found that ADLS Gen2 is yet to support Azure Event Grid, and that's the reason that, although we are able to configure ADF's event-based triggers, they do not work.
Can anyone suggest how to tackle this situation? Since Azure Event Grid is not supported at this point in time, we don't believe we can achieve this with Azure Event Hubs and their integration with ADF.
Thanks.
From my repro, currently event-based triggers are supported only on v2 storage accounts.
Data Factory is now integrated with Azure Event Grid, which lets you trigger pipelines on an event.
Note: This integration supports only version 2 Storage accounts (General purpose).
Azure Event Grid doesn't receive events from Azure Data Lake Gen2 accounts because those accounts don't yet generate them.
For more details, refer to “Known issues with Azure Data Lake Storage Gen2”.

Query blobs in Blob storage

I have serialized text data that is stored in a blob inside Azure blob storage. The text is basically key/value data. I am wondering if there is a way to easily query the blob without exploding the data into another table/database or pulling the blob into memory?
Azure Blob storage has no API to query data within the blob - it's just dumb storage. See here for the Blob Storage API. You're essentially stuck reading, deserializing and grabbing your value(s).
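To make that concrete, here is a minimal sketch of the read-and-deserialize approach, assuming the azure-storage-blob Python package and a blob containing newline-delimited key=value text (all names are placeholders; adjust the parsing to your actual serialization format):

    from azure.storage.blob import BlobClient

    blob = BlobClient.from_connection_string(
        conn_str="<storage-account-connection-string>",
        container_name="<container>",
        blob_name="<your-key-value-blob>.txt")

    # No server-side query exists: download the blob, deserialize it, pick the value.
    text = blob.download_blob().readall().decode("utf-8")
    pairs = dict(line.split("=", 1) for line in text.splitlines() if "=" in line)
    print(pairs.get("<some-key>"))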
Perhaps Azure table storage would be a better fit for this application? That at least keeps things in the realm of an Azure storage account rather than needing to pull in a SQL Server instance.
One option you could consider is to use Data Lake Analytics, as it supports Azure Blobs as a data source.
Depending on your preferred way of accessing the data, you can use PowerShell, the .NET SDK, etc. to query the data...

Using Azure Data Lake for Analytics

Currently as part of our requirements we are working with the below Azure components
Azure Event Hub
Azure Stream Analytics
Azure Table Storage
Azure Sql DB
Basically, with the first 3 components we will be building an analytics and reports platform.
Currently, as we have just started, we analyze the data from Azure Table Storage and display it in the analytics dashboard.
Recently we came across a new Azure product, Azure Data Lake. Doing some research on the Microsoft website, we could see that we can easily migrate data from Azure Table Storage (with the help of Azure Data Factory) to Azure Data Lake Store: Creating big data pipelines using Azure Data Lake and Azure Data Factory
As we go through the above link, it's mentioned that we need to create an Azure Data Lake Analytics pipeline to process the data.
What I am unclear about is where the analytics output data will be saved. Do we need to save the analytics output to some DB, or can we get real-time analytics through an HTTP request?
We have a huge number of rows of records in Azure Table Storage that will be moved to Azure Data Lake. For this scenario, is that a good option, or can we build an analytics solution on Azure Table Storage itself?
Please share your thoughts
You can store your analytics output data in Azure Data Lake Store (a data repository that enables you to store all kinds of data in their raw format without defining schemas) after processing it with Azure Data Lake Analytics (an analytics service that enables you to run jobs on data sets without having to think about clusters).
As you said, "We have a huge number of rows of records in Azure Table Storage that will be moved to Azure Data Lake." I think performing analytics on data placed in Azure Data Lake Store is much more efficient, because it offers unlimited storage with immediate read/write access and scales to the throughput you need for your workloads. It also offers low-latency small writes for big data sets, so I believe it is a better choice than Azure Table Storage.
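To illustrate where the output ends up, here is a small hedged sketch: assume a U-SQL job submitted through Azure Data Lake Analytics has written its result to a CSV in the store; the dashboard or API layer then simply reads that file back from Azure Data Lake Store using the azure-datalake-store Python package (all names are placeholders):

    from azure.datalake.store import core, lib

    # Authenticate against the Data Lake Store account with a service principal.
    token = lib.auth(tenant_id="<tenant-id>",
                     client_id="<app-id>",
                     client_secret="<app-secret>")
    adls = core.AzureDLFileSystem(token, store_name="<adls-account-name>")

    # Read the output that the hypothetical U-SQL job wrote to /output/report.csv.
    with adls.open("/output/report.csv", "rb") as f:
        rows = f.read().decode("utf-8").splitlines()
    print(rows[:5])  # e.g. feed these rows to the dashboard or return them over HTTP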
