Read from ADLS Gen2 with SSIS

Does anyone know which connection manager and Data Flow component to use for ADLS (Azure Data Lake Storage) Gen2?
I've managed to use the blob connector in the connection manager and successfully connect to ADLS Gen2, but when I try to use the blob source component I get a 400 Bad Request. It works fine if it's just a blob storage account without hierarchical namespace (HNS).
The ADLS components state that they are just for ADLS Gen1.
So how do you read from and write to ADLS Gen2?

The current version of the SSIS Azure Feature Pack supports ADLS Gen2. It can be used as a data source or destination in a Data Flow:
The screenshot shows it as a destination, but ADLS Gen2 also works well as a source, via the corresponding "Flexible File Source" and "Flexible File Destination" components.

First of all, based on the great link provided by @rickvdbosch, it looks like there are many temporary limitations in Azure Data Lake Storage Gen2 concerning the Blob Storage API. This means it is not a component limitation, and you may have to wait until Gen2 is integrated with SSIS.
Microsoft SQL Server Feature Pack for Azure
If you meant these components when you mentioned that:
The ADLS components state that they are just for ADLS Gen1.
then ignore this part.
I am not sure whether it supports Gen2, but I think you can use the Azure Data Lake Store components, which are part of the Microsoft SQL Server Feature Pack for Azure. For more information, refer to:
Azure Data Lake Store in SSIS
Azure Data Lake Store Source
Azure Data Lake Store Destination
Download Link
Azure Feature Pack for Integration Services (SSIS)
Other methods
If the suggestions above don't work, you can use Azure Data Factory instead, or copy the data from the command line with AzCopy v10.

I got the following info:
"At the moment Gen 2 don’t support BLOB API (but it will in a short time) and hence, SSIS is not able to connect."
So for SSIS it's currently either ADLS Gen 1, or blob store

I used a Script Task to write files or System.Object variables (converted to CSV in memory) to Azure Storage Gen2 (hierarchical namespace enabled) using the REST API. I did this as a demo until the SSIS components are released.
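For anyone wondering what that Script Task roughly looks like, below is a minimal C# sketch of the create/append/flush sequence against the Gen2 dfs endpoint. It assumes you have already obtained an AAD bearer token for the storage account; the class and method names and the account/filesystem/path parameters are placeholders, not the poster's actual code.

    // Minimal sketch: upload an in-memory CSV to ADLS Gen2 (hierarchical namespace enabled)
    // via the Data Lake Storage Gen2 REST API (create -> append -> flush).
    // Assumes an AAD bearer token is already available; account, filesystem and
    // path values are placeholders.
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    public static class AdlsGen2RestUploader
    {
        public static async Task UploadCsvAsync(string accountName, string fileSystem,
                                                string filePath, string csvContent,
                                                string bearerToken)
        {
            string baseUri = "https://" + accountName + ".dfs.core.windows.net/" + fileSystem + "/" + filePath;
            byte[] payload = Encoding.UTF8.GetBytes(csvContent);

            using (var client = new HttpClient())
            {
                client.DefaultRequestHeaders.Authorization =
                    new AuthenticationHeaderValue("Bearer", bearerToken);
                client.DefaultRequestHeaders.Add("x-ms-version", "2018-11-09");

                // 1. Create the (empty) file.
                var create = await client.PutAsync(baseUri + "?resource=file",
                    new ByteArrayContent(new byte[0]));
                create.EnsureSuccessStatusCode();

                // 2. Append the CSV bytes at position 0. HttpClient on .NET Framework
                // has no PatchAsync, so the PATCH request is built manually.
                var append = new HttpRequestMessage(new HttpMethod("PATCH"),
                    baseUri + "?action=append&position=0")
                {
                    Content = new ByteArrayContent(payload)
                };
                (await client.SendAsync(append)).EnsureSuccessStatusCode();

                // 3. Flush to commit the appended data; position = total bytes written.
                var flush = new HttpRequestMessage(new HttpMethod("PATCH"),
                    baseUri + "?action=flush&position=" + payload.Length)
                {
                    Content = new ByteArrayContent(new byte[0])  // Content-Length: 0 is required
                };
                (await client.SendAsync(flush)).EnsureSuccessStatusCode();
            }
        }
    }

Larger files can issue several append calls at increasing positions before a single flush; the position on the flush must equal the total number of bytes written.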

You can't write to ADLS Gen2 using the old components from the Azure Feature Pack, but you can connect to a Gen2 account without hierarchical namespace using the Azure Blob Destination component.

Related

Copy data from Azure Data Lake to Snowflake without a stage using Azure Data Factory

All the Azure Data Factory examples of copying data from Azure Data Lake Gen2 to Snowflake use a storage account as a stage. If the stage is not configured (as shown in the picture), I get this error in Data Factory even when my source is a CSV file in Azure Data Lake: "Direct copying data to Snowflake is only supported when source dataset is DelimitedText, Parquet, JSON with Azure Blob Storage or Amazon S3 linked service, for other dataset or linked service, please enable staging".
At the same time, the Snowflake documentation says that the external stage is optional. How can I copy data from Azure Data Lake to Snowflake using Data Factory's Copy Data activity without having an external storage account as a stage?
If staging storage is needed to make it work, we shouldn't say that data copy from Data Lake to Snowflake is supported. It works only when Data Lake data is first copied into a storage blob and then to Snowflake.
Though Snowflake supports Blob storage, Data Lake Storage Gen2, and general-purpose v1 & v2 storage accounts, loading data into Snowflake is supported through Blob storage only.
The source linked service is Azure Blob storage with shared access signature (SAS) authentication. If you want to copy data directly from Azure Data Lake Storage Gen2 in a supported format, you can create an Azure Blob linked service with SAS authentication against your ADLS Gen2 account, to avoid using a staged copy to Snowflake.
Select Azure Blob storage as the linked service type and provide the SAS URI details of the Azure Data Lake Gen2 source file.
Blob storage linked service with a Data Lake Gen2 file:
You'll have to configure Blob storage and use it as staging. As an alternative, you can use an external stage: create a file format and a notification integration, point the stage at ADLS, and load the data into Snowflake using the COPY command. Let me know if you need more help on this.

Uploading data (CSV file) using Azure Functions (Node.js) to Azure Data Lake Gen2

I am currently trying to send a CSV file to Azure Data Lake Gen2 using an Azure Function written in Node.js, but I am unable to do so. Any suggestions would be really helpful.
Thanks.
I have tried using the Blob storage credentials of the ADLS Gen2 account with the Blob storage APIs, but I am getting an error.
For now this cannot be implemented with the SDK. Please check this known issue:
Blob storage APIs are disabled to prevent feature operability issues that could arise because Blob Storage APIs aren't yet interoperable with Azure Data Lake Gen2 APIs.
And in the table of features, you can find this information about the APIs for Data Lake Storage Gen2 storage accounts:
Multi-protocol access on Data Lake Storage is currently in public preview. This preview enables you to use Blob APIs in the .NET, Java, and Python SDKs with accounts that have a hierarchical namespace. The SDKs don't yet contain APIs that enable you to interact with directories or set access control lists (ACLs). To perform those functions, you can use the Data Lake Storage Gen2 REST APIs.
So if you want to implement it, you have to use the REST API: Azure Data Lake Store REST API.

Is there a way to load data into Azure Data Lake Storage Gen2 using a Logic App?

I have to load data into Azure Data Lake Storage Gen2 using a Logic App. I tried using the Azure File Storage connector, but I couldn't get any file system folder in it. Can someone help me with this issue?
Note: without using the Copy activity.
Currently, there is no connector for Data Lake Gen2 in Logic Apps: https://feedback.azure.com/forums/287593-logic-apps/suggestions/37118125-connector-for-azure-data-lake-gen-2.
Here is a workaround which I have tested and which works:
1. Create an Azure Data Factory service.
2. Create a pipeline to copy files from Data Lake Gen1 to Data Lake Gen2: https://learn.microsoft.com/en-us/azure/data-factory/load-azure-data-lake-storage-gen2#load-data-into-azure-data-lake-storage-gen2.
3. Use the Data Factory connector in the Logic App to create a pipeline run.
Once it runs successfully, the related files will be copied to the target folder under Data Lake Gen2.
Isn't ADLS Gen2 just a blob container? Select the Azure Blob Storage connector, then the Create blob action.
I selected "Azure Blob Storage" as the action in the Logic App and then selected my ADLS Gen2 storage account name. It is working fine. Do you see any issue?

Logic Apps - Get Blob Content Using Path

I have an event-driven Logic App (blob event) which reads a block blob using the path and uploads the content to Azure Data Lake. I noticed the Logic App is failing with 413 (RequestEntityTooLarge) when reading a large file (~6 GB). I understand that Logic Apps has a limitation of 1024 MB - https://learn.microsoft.com/en-us/connectors/azureblob/ - but is there any workaround to handle this type of situation? The alternative solution I am working on is moving this step to an Azure Function and getting the content from the blob there. Thanks for your suggestions!
If you want to use an Azure Function, I would suggest you have a look at this article:
Copy data from Azure Storage Blobs to Data Lake Store
There is a standalone version of the AdlCopy tool that you can deploy to your Azure function.
So your Logic App will call this function, which will run a command to copy the file from blob storage to your Data Lake Store. I would suggest using a PowerShell function.
Another option would be to use Azure Data Factory to copy the file to Data Lake:
Copy data to or from Azure Data Lake Store by using Azure Data Factory
You can create a job that copies the file from blob storage:
Copy data to or from Azure Blob storage by using Azure Data Factory
There is a connector to trigger a Data Factory run from a Logic App, so you may not need an Azure Function, but it seems there are still some limitations:
Trigger Azure Data Factory Pipeline from Logic App w/ Parameter
You should consider using the Azure Files connector: https://learn.microsoft.com/en-us/connectors/azurefile/
It is currently in preview; the advantage it has over Blob is that it doesn't have a size limit. The link above includes more information about it.
For the benefit of others who might be looking for a solution of this sort:
I ended up creating an Azure Function in C#, as my design dynamically parses the blob name and creates the ADL folder structure based on it. I used chunked memory streaming for reading the blob and writing it to ADL, with multi-threading to address the Azure Functions timeout of 10 minutes.
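For others going down the same route, here is a much-simplified C# sketch of that pattern, using the ADLS Gen1 .NET SDK (Microsoft.Azure.DataLake.Store) and the classic WindowsAzure.Storage blob client. It reads the blob in fixed-size ranges and appends them to a Data Lake Store file; the class name, parameters, and chunk size are illustrative placeholders, and the multi-threading and blob-name parsing from the original design are omitted.

    // Simplified sketch of the approach described above: stream a block blob into
    // Azure Data Lake Store (Gen1) in fixed-size chunks from a C# Azure Function.
    // The original design also parallelised the reads and parsed the blob name to
    // build the ADL folder structure; this version copies sequentially to stay short.
    // All account names, credentials and paths below are placeholders.
    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.Azure.DataLake.Store;               // ADLS Gen1 SDK
    using Microsoft.Rest.Azure.Authentication;          // AAD service principal login
    using Microsoft.WindowsAzure.Storage.Auth;
    using Microsoft.WindowsAzure.Storage.Blob;

    public static class BlobToAdlCopier
    {
        private const int ChunkSize = 4 * 1024 * 1024;  // read the blob 4 MB at a time

        public static async Task CopyBlobToAdlAsync(
            Uri blobUri, string sasToken,
            string adlAccountFqdn,                      // e.g. "myadls.azuredatalakestore.net"
            string adlTargetPath,                       // e.g. "/raw/2019/myfile.csv"
            string tenantId, string clientId, string clientSecret)
        {
            // Source: block blob accessed with a SAS token.
            var blob = new CloudBlockBlob(blobUri, new StorageCredentials(sasToken));
            await blob.FetchAttributesAsync();
            long blobLength = blob.Properties.Length;

            // Destination: ADLS Gen1, authenticated with an AAD service principal.
            var creds = await ApplicationTokenProvider.LoginSilentAsync(tenantId, clientId, clientSecret);
            AdlsClient adlsClient = AdlsClient.CreateClient(adlAccountFqdn, creds);

            using (Stream adlStream = adlsClient.CreateFile(adlTargetPath, IfExists.Overwrite))
            {
                long offset = 0;
                while (offset < blobLength)
                {
                    long length = Math.Min(ChunkSize, blobLength - offset);
                    using (var buffer = new MemoryStream())
                    {
                        // Download one range of the blob into memory, then append it to the ADL file.
                        await blob.DownloadRangeToStreamAsync(buffer, offset, length);
                        buffer.Position = 0;
                        await buffer.CopyToAsync(adlStream);
                    }
                    offset += length;
                }
            }
        }
    }

Reading in bounded ranges keeps memory usage flat regardless of blob size, which matters on the Functions Consumption plan.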

How to connect Data Lake Store in Azure Analysis Services

How can we connect a Data Lake Store in Azure Analysis Services? Can we use Hive ODBC or any other options?
I assume you want to use Azure Data Lake as a data source for Azure Analysis Services (e.g. you have fact and dimension files in the Data Lake).
There is no connector in Azure Analysis Services to pull data directly from Azure Data Lake at present, although hopefully this is something Microsoft will address soon.
As a workaround you could try the following:
Azure Analysis Services will allow you to use Azure Blob Storage as a data source. So once you have transformed your data in Azure Data Lake, you need to copy the fact and dimension files into Azure Blob Storage (e.g. using Azure Data Factory), and then you should be able to use Azure Analysis Services to build your model.
Note that the Blob Storage data source option is only available if you build a 1400-compatibility-level model in Azure Analysis Services, which in turn requires the latest version of SQL Server Data Tools for Visual Studio (you may need to upgrade to SSDT 17.1).
I hope this helps
