Azure Data Factory copy activity from SAP system on unpartitioned data

How can I read terabytes of data from an on-premises SAP system into Azure Blob storage as fast as possible using Azure Data Factory?

Microsoft has provided detailed documentation on ADF connectivity with SAP; refer to that guidance for the full setup. You first create a linked service to SAP, then create a dataset, and use that dataset as the source in a Copy activity. For an unpartitioned, terabyte-scale table, the SAP table connector's partition options are what enable parallel reads and make the copy fast.
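As a minimal sketch of what that looks like programmatically, here is a hypothetical pipeline built with the azure-mgmt-datafactory Python SDK. The resource names, the partition column DOCNUM, and the bounds are all assumptions, and the SAP linked service and both datasets are assumed to already exist:

```python
# Minimal sketch with the azure-mgmt-datafactory SDK: a copy activity that
# reads an SAP table in parallel partitions and writes to Blob storage.
# Resource names, partition column, and bounds are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, PipelineResource,
    SapTableSource, SapTablePartitionSettings, BlobSink,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Partitioning is what makes a terabyte-scale SAP extract feasible: the
# service issues parallel reads, one per partition of the key column.
source = SapTableSource(
    partition_option="PartitionOnInt",
    partition_settings=SapTablePartitionSettings(
        partition_column_name="DOCNUM",   # hypothetical integer key column
        partition_lower_bound="1",
        partition_upper_bound="999999999",
        max_partitions_number=64,
    ),
)

copy = CopyActivity(
    name="CopySapToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="SapTableDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="BlobDataset")],
    source=source,
    sink=BlobSink(),
)

adf.pipelines.create_or_update(
    "my-resource-group", "my-data-factory", "SapExtractPipeline",
    PipelineResource(activities=[copy]),
)
```

The partition settings are the key knob here: they let the copy activity issue many parallel reads against SAP instead of one serial scan.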

Related

How to ingest data into Azure Data Explorer from an Azure Table Storage source without Data Factory

I'm new to Azure Data Explorer. I need to migrate data from an Azure Table Storage table into a table in an Azure Data Explorer cluster's database without using Azure Data Factory.
If this can be done programmatically, e.g. using .NET, kindly suggest how.
Thanks in advance.
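One programmatic approach (shown here in Python; the .NET Kusto and Tables SDKs follow the same pattern) is to read the entities from Table Storage, serialize them, and queue them for ingestion with the Kusto ingest client. The cluster URI, database, and table names below are hypothetical:

```python
# Minimal sketch: pull rows from Azure Table Storage and queue them for
# ingestion into Azure Data Explorer without Data Factory.
import csv
import io

from azure.data.tables import TableServiceClient
from azure.kusto.data import KustoConnectionStringBuilder
from azure.kusto.data.data_format import DataFormat
from azure.kusto.ingest import IngestionProperties, QueuedIngestClient

# Read all entities from the source table
tables = TableServiceClient.from_connection_string("<storage-connection-string>")
entities = tables.get_table_client("SourceTable").list_entities()

# Serialize the entities to CSV in memory; this sketch assumes all
# entities share the same set of properties.
buf = io.StringIO()
writer = None
for e in entities:
    if writer is None:
        writer = csv.DictWriter(buf, fieldnames=list(e.keys()))
    writer.writerow(e)

# Queue the CSV for ingestion into the ADX target table
kcsb = KustoConnectionStringBuilder.with_aad_device_authentication(
    "https://ingest-mycluster.westeurope.kusto.windows.net")  # hypothetical
ingest = QueuedIngestClient(kcsb)
ingest.ingest_from_stream(
    io.BytesIO(buf.getvalue().encode()),
    ingestion_properties=IngestionProperties(
        database="MyDatabase", table="TargetTable",
        data_format=DataFormat.CSV),
)
```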

How to store data from Azure Analysis Services into Azure Data Lake using Azure Data Factory?

I have two tables in Azure Analysis Services and I need to copy that data into Azure Data Lake using Azure Data Factory.
Could you please help me or share a reference URL?
There is no direct connector for Azure Analysis Services in ADF, but there are some ways you can copy the data.
One such way is to create a linked server between a SQL Server instance (on-premises, IaaS, or SQL Managed Instance) and Analysis Services, and then pull the data from the SQL instance through ADF.
Can we copy data from Azure Analysis Services using Azure Data Factory?
https://datasharkx.wordpress.com/2021/03/16/copy-data-from-ssas-aas-through-azure-data-factory/
The links above explain the setup in detail.
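As a rough illustration of that linked-server route, assuming a linked server named AAS_LINKED has already been configured on the SQL instance and that the model contains a hypothetical table FactSales, the data can be pulled with OPENQUERY (shown here via pyodbc):

```python
# Minimal sketch: read an AAS model table through a SQL Server linked
# server. The linked server AAS_LINKED and table FactSales are
# hypothetical and must already be configured on the SQL instance.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-sql-server.database.windows.net;"  # hypothetical SQL instance
    "DATABASE=master;UID=<user>;PWD=<password>"
)

# OPENQUERY passes the DAX query through to Analysis Services
rows = conn.cursor().execute(
    "SELECT * FROM OPENQUERY(AAS_LINKED, 'EVALUATE FactSales')"
).fetchall()
print(len(rows))
```

Once the data is exposed through the SQL instance this way, ADF can read it with the ordinary SQL Server connector.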

Azure Lake to Lake transfer of files

My company has two Azure environments. The first one was a temporary environment and is being re-purposed / decommissioned / I'm not sure. All I know is I need to get files from a Data Lake in one environment to a Data Lake in another. I've looked at AdlCopy and AzCopy and neither seems like it will do what I need. Has anyone encountered this before, and if so, what did you use to solve it?
Consider Azure Data Factory; it can help you transfer files or data from one Azure Data Lake to another.
You can reference Copy data to or from Azure Data Lake Storage Gen2 using Azure Data Factory.
This article outlines how to use Copy Activity in Azure Data Factory to copy data to and from Data Lake Storage Gen2. It builds on the Copy Activity overview article that presents a general overview of Copy Activity.
For example, you can learn from this tutorial: Quickstart: Use the Copy Data tool to copy data.
In this quickstart, you use the Azure portal to create a data factory. Then, you use the Copy Data tool to create a pipeline that copies data from a folder in Azure Blob storage to another folder.
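If a one-off programmatic copy is acceptable instead of an ADF pipeline, a minimal sketch with the azure-storage-file-datalake SDK could look like the following. The account and container names are hypothetical, and for very large lakes an ADF copy activity will scale better:

```python
# Minimal sketch: copy every file in one ADLS Gen2 container to another
# account. Account and container names are hypothetical.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

cred = DefaultAzureCredential()
src = DataLakeServiceClient("https://sourcelake.dfs.core.windows.net", credential=cred)
dst = DataLakeServiceClient("https://targetlake.dfs.core.windows.net", credential=cred)

src_fs = src.get_file_system_client("data")
dst_fs = dst.get_file_system_client("data")

# Walk the source filesystem and re-upload each file byte-for-byte
for path in src_fs.get_paths(recursive=True):
    if path.is_directory:
        continue
    content = src_fs.get_file_client(path.name).download_file().readall()
    dst_fs.get_file_client(path.name).upload_data(content, overwrite=True)
```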
Hope this helps.

Error trying to copy data from Azure SQL database to Azure Blob Storage

I have created a pipeline in Azure Data Factory (v1). I have a copy pipeline that has an AzureSqlTable dataset as input and an AzureBlob dataset as output. The AzureSqlTable dataset that I use as input is created as the output of another pipeline. In this pipeline I launch a procedure that copies one table entry to a blob CSV file.
I get the following error when launching pipeline:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
How can I solve this?
According to the error message, this is not a supported action for Azure Data Factory; however, using an Azure SQL table as input and Azure Blob data as output should be supported.
I also ran a demo test from the Azure portal. You can follow these steps to do the same:
1. Click Copy data in the Azure portal.
2. Set the copy properties.
3. Select the source.
4. Select the destination data store.
5. Complete the deployment.
6. Check the result in Azure Storage.
Update:
If we want to use an existing dataset, we can choose From Existing Connections.
Update2:
For Data Factory (v1), the copy activity settings only support using an existing Azure Blob storage or Azure Data Lake Store dataset; see the linked documentation for more detail.
If using Data Factory (v2) is acceptable, we can use an existing Azure SQL dataset.
It turns out that if we avoid the "Copy data (PREVIEW)" wizard and instead add a copy activity to an existing pipeline rather than creating a new one, everything works. So the solution is to add a copy activity manually to an existing pipeline.

How to connect Azure Data Lake Store in Azure Analysis Services

Can we use Hive ODBC or are there other options?
I assume you want to use Azure Data Lake as a data source for Azure Analysis Services (e.g. you have fact and dimension files in the Data Lake).
There is no connector in Azure Analysis Services to pull data directly from Azure Data Lake at present, although hopefully this is something Microsoft will address soon.
As a workaround you could try the following:
Azure Analysis Services will allow you to use Azure Blob Storage as a data source. So once you have transformed your data in Azure Data Lake, you then need to copy the fact and dimension files into Azure Blob Storage (e.g. using Azure Data Factory, or programmatically as sketched below), and then you should be able to use Azure Analysis Services to build your model.
Note that the Blob Storage data source option is only available if you build a model at the 1400 compatibility level in Azure Analysis Services. This option is only available if you have the latest version of SQL Server Data Tools for Visual Studio (you may need to upgrade to version 17.1 of SSDT).
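As a rough sketch of that copy step done programmatically rather than through ADF, assuming a Data Lake Store named mydatalake and a hypothetical curated file FactSales.csv:

```python
# Minimal sketch: copy one processed file from Azure Data Lake Store (Gen1)
# into Blob storage so an AAS 1400 model can read it. The store, account,
# container, and file names are all hypothetical.
from azure.datalake.store import core, lib
from azure.storage.blob import BlobServiceClient

# Service-principal auth against the Data Lake Store
token = lib.auth(tenant_id="<tenant-id>", client_id="<app-id>",
                 client_secret="<secret>")
adl = core.AzureDLFileSystem(token, store_name="mydatalake")

blob_service = BlobServiceClient(
    "https://myaccount.blob.core.windows.net", credential="<account-key>")
blob = blob_service.get_blob_client("model-data", "FactSales.csv")

# Stream the transformed fact file from the lake into Blob storage
with adl.open("/curated/FactSales.csv", "rb") as f:
    blob.upload_blob(f, overwrite=True)
```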
I hope this helps.
