Error trying to copy data from Azure SQL database to Azure Blob Storage

I have created a pipeline in Azure Data Factory (V1). It is a copy pipeline that has an AzureSqlTable dataset as input and an AzureBlob dataset as output. The AzureSqlTable dataset that I use as input is created as the output of another pipeline. In that pipeline I launch a procedure that copies one table entry to a blob CSV file.
I get the following error when launching the pipeline:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
How can I solve this?

According to the error information, the action is not supported as configured, but using an Azure SQL table as input and Azure Blob data as output should be supported by Azure Data Factory.
I also ran a demo test of this in the Azure portal. You can follow the detailed steps below to do the same (a minimal SDK sketch of the equivalent copy follows the steps).
1. Click [Copy data] in the Azure portal.
2. Set the copy properties.
3. Select the source.
4. Select the destination data store.
5. Complete the deployment.
6. Check the result in Azure and in the storage account.
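For reference, here is a minimal sketch of the same copy done with the Data Factory V2 Python SDK (azure-mgmt-datafactory plus azure-identity). Everything below (subscription, resource group, factory and dataset names) is a placeholder, and the model class names (AzureSqlSource, BlobSink, etc.) should be verified against your SDK version; note that no copy behavior is set on the sink, which is exactly what the error message objects to for tabular sources.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, PipelineResource, AzureSqlSource, BlobSink
)

subscription_id = "<subscription-id>"    # placeholder
resource_group = "<resource-group>"      # placeholder
factory_name = "<data-factory-name>"     # placeholder

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Copy from an existing Azure SQL table dataset to an existing Azure Blob dataset.
# No copyBehavior is set: that property is what raises
# UserErrorTabularCopyBehaviorNotSupported when the source is tabular.
copy_activity = CopyActivity(
    name="CopySqlToBlob",
    inputs=[DatasetReference(reference_name="AzureSqlTableDataset")],   # placeholder; some SDK versions also want type="DatasetReference"
    outputs=[DatasetReference(reference_name="AzureBlobDataset")],      # placeholder
    source=AzureSqlSource(),
    sink=BlobSink(),
)

pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(resource_group, factory_name, "CopySqlToBlobPipeline", pipeline)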
Update:
If we want to use an existing dataset, we can choose [From Existing Connections].
Update 2:
The Data Factory (V1) copy activity wizard only supports using an existing Azure Blob storage or Azure Data Lake Store dataset; for more detail, please refer to this link.
If using Data Factory (V2) is acceptable, we can use an existing Azure SQL dataset.

So, actually, if we don't use this awful "Copy data (PREVIEW)" action and instead add the activity to an existing pipeline rather than a new one, everything works. The solution is to add a copy activity manually into an existing pipeline.
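If Data Factory V2 is an option (as the update above notes), the same manual step can also be done programmatically. This is a hedged continuation of the earlier sketch, reusing adf_client, the placeholder names, and copy_activity from it, and assuming a pipeline called "ExistingPipeline" already exists.

# Append the copy activity to an existing pipeline instead of creating a new one.
existing = adf_client.pipelines.get(resource_group, factory_name, "ExistingPipeline")
existing.activities = (existing.activities or []) + [copy_activity]
adf_client.pipelines.create_or_update(resource_group, factory_name, "ExistingPipeline", existing)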

Related

Azure Logs Kusto Query Output to database table

I need to save the output from a Kusto query on monitoring logs into a database table, but I am unable to find a way to do it. I am presuming there will be a way to get the output from a Kusto query, save it to storage, and then pull that data into a table using a pipeline.
Any suggestions welcome.
I have reproduced this in my environment and got the expected results, as shown below.
First, I executed the Kusto query below and exported the results to a CSV file on my local machine:
AzureActivity
| project OperationName,Level,ActivityStatus
Then I uploaded the CSV file from my local machine into my Blob storage account.
Next I created an ADF service, created a new pipeline in it, and added a Copy activity to that pipeline.
Then I created a linked service for Blob storage as the source and a linked service for the SQL database as the sink.
In the source dataset I selected the blob file, and in the sink dataset I selected the SQL Server table.
In the Copy activity sink settings I set the table option to Auto create table.
Finally, I checked the output in the SQL query editor.
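For reference, the "Auto create table" setting corresponds to a table option on the Azure SQL sink. This is a minimal sketch of that copy activity with the Data Factory V2 Python SDK, under the assumption that the blob and SQL datasets already exist; the dataset names are placeholders and the model/property names should be checked against your SDK version.

from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, DelimitedTextSource, AzureSqlSink
)

copy_csv_to_sql = CopyActivity(
    name="CopyLogCsvToSql",
    inputs=[DatasetReference(reference_name="BlobCsvDataset")],     # the uploaded CSV file (placeholder)
    outputs=[DatasetReference(reference_name="SqlTableDataset")],   # the target SQL table (placeholder)
    source=DelimitedTextSource(),
    sink=AzureSqlSink(table_option="autoCreate"),                   # create the table on first copy
)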
So what we do now is: we have created a Logic App that runs the query in real time and returns the data via HTTP, and then we save that to the table. No manual intervention.
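The same real-time approach can be sketched in Python instead of a Logic App, using azure-monitor-query to run the KQL and pyodbc to insert the rows. This is only a sketch under assumptions: azure-identity, azure-monitor-query and pyodbc are installed, a target table dbo.AzureActivityLog with matching columns already exists, and the workspace ID and connection string are placeholders.

from datetime import timedelta

import pyodbc
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

workspace_id = "<log-analytics-workspace-id>"   # placeholder
query = "AzureActivity | project OperationName, Level, ActivityStatus"

# Run the Kusto query against the Log Analytics workspace.
logs_client = LogsQueryClient(DefaultAzureCredential())
response = logs_client.query_workspace(workspace_id, query, timespan=timedelta(days=1))

# Insert each returned row into the (pre-created) SQL table.
conn = pyodbc.connect("<odbc-connection-string>")   # placeholder connection string
cursor = conn.cursor()
for table in response.tables:
    for row in table.rows:
        cursor.execute(
            "INSERT INTO dbo.AzureActivityLog (OperationName, Level, ActivityStatus) VALUES (?, ?, ?)",
            row[0], row[1], row[2],
        )
conn.commit()
conn.close()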

What to do when my Data source is not supported by Azure Synapse's Data Flow?

I am trying to transform data from Salesforce before loading it into a dedicated SQL pool.
When I try to create a dataset from Synapse's Data Flow, I am not able to choose Salesforce as a data store:
Can anyone suggest how to transform data from Salesforce or any other Datasource that is not supported by Dataflow?
As per the official documentation, Data Flows currently do not support Salesforce data as a source or sink.
If you want, you can raise a feature request in the Synapse portal.
As an alternative, you can use the Copy activity in Azure Data Factory to copy data from Salesforce to the dedicated SQL pool, and then transform it using Data Flows in Synapse from dedicated SQL DB to dedicated SQL DB.
Follow the steps below to achieve your requirement:
First, create a Data Factory workspace.
Select the Author hub and create a pipeline. Now drag the copy activity into the workspace and select the source. You can see that Salesforce is supported when you select a new source dataset. Select it and create a linked service for it.
Now select the sink dataset and click on Azure Synapse Analytics.
Create a linked service for the dedicated SQL database and select it.
Then you can select the table in the dedicated SQL pool and copy your data by running the pipeline.
After this copy, go to the Synapse workspace and click on the source of the Data Flow.
Select Azure Synapse Analytics as the source and click on continue.
Now click on New to create a linked service for the SQL DB. Give the subscription and server name and authenticate with your database.
After creating the linked service, select it and give the table in the DB which holds the result of the copy.
Now go to the sink, select Azure Synapse Analytics, create another linked service for it in the same way, and select the resultant table in the DB which you want after the transformation.
By following the above process, we can achieve the transformation from Salesforce data to a dedicated SQL DB; a minimal sketch of the copy step follows.
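Here is a hedged sketch of that copy step only, using the Data Factory V2 Python SDK. It assumes the Salesforce source dataset and the Synapse sink dataset (with their linked services) already exist as created in the steps above; every name below is a placeholder, and the SalesforceSource / SqlDWSink model names should be verified against your azure-mgmt-datafactory version.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, PipelineResource, SalesforceSource, SqlDWSink
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Copy from the existing Salesforce object dataset into the existing Synapse (SQL DW) table dataset.
copy_sf_to_synapse = CopyActivity(
    name="CopySalesforceToSynapse",
    inputs=[DatasetReference(reference_name="SalesforceObjectDataset")],   # placeholder dataset
    outputs=[DatasetReference(reference_name="SynapseStagingTable")],      # placeholder dataset
    source=SalesforceSource(),
    sink=SqlDWSink(),
)

pipeline = PipelineResource(activities=[copy_sf_to_synapse])
adf_client.pipelines.create_or_update("<resource-group>", "<data-factory-name>",
                                      "SalesforceToSynapsePipeline", pipeline)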
Can anyone suggest how to transform data from Salesforce or any other Datasource that is not supported by Dataflow?
You can try this approach for data stores that are not supported by Data Flows; please refer to this list of data stores supported by the Copy activity before applying the process to other data stores.

Azure Data Factory Copy activity from SAP system on unpartitioned data

How can I read terabytes of data from an on-premises SAP system into Azure Blob storage very quickly using Azure Data Factory?
Refer to this link.
Microsoft has provided detailed documentation on ADF connectivity with SAP. You can first create a linked service to SAP, create a dataset, and use that dataset as the source in a Copy activity.
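Since the question is about unpartitioned, terabyte-scale data, the usual lever in the copy activity is the SAP Table source's partition options combined with parallel copies. Below is a minimal sketch under assumptions: the SapTableSource / SapTablePartitionSettings model names come from the azure-mgmt-datafactory SDK and should be verified against your version, and the partition column, bounds, and dataset names are all placeholders to be tuned for the actual table.

from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, SapTableSource, SapTablePartitionSettings, BlobSink
)

# Split the read on an integer column so the copy can fan out in parallel.
sap_source = SapTableSource(
    partition_option="PartitionOnInt",
    partition_settings=SapTablePartitionSettings(
        partition_column_name="DOCNUM",        # placeholder column
        partition_lower_bound="0000000001",    # placeholder bound
        partition_upper_bound="9999999999",    # placeholder bound
        max_partitions_number=32,
    ),
)

copy_sap_to_blob = CopyActivity(
    name="CopySapTableToBlob",
    inputs=[DatasetReference(reference_name="SapTableDataset")],    # placeholder dataset
    outputs=[DatasetReference(reference_name="BlobSinkDataset")],   # placeholder dataset
    source=sap_source,
    sink=BlobSink(),
    parallel_copies=16,    # raise parallelism together with the partition settings
)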

Azure Linked Service for a Synapse workspace

Whenever I create a Source in an activity in a Synapse Pipeline, in the Linked Service tab, I get an option to either create a new Linked Service or to select from the dropdown (as shown below). One selection of that dropdown includes a default Linked Service (shown below) that shows as MySynapseWorkspaceName-WorkspaceDefaultStorage (where MySynapseWorkspaceName is name of a Synapse workspace that you create).
It seems that MySynapseWorkspaceName-WorkspaceDefaultStorage is the linked service that gets created when you specify an Azure Data Lake Storage Gen2 (ADLSGen2) account for your Synapse workspace.
Question: If a dataset for the source or destination (sink) of an activity in a Synapse pipeline is an ADLSGen2 storage account, can we just select the default linked service MySynapseWorkspaceName-WorkspaceDefaultStorage described above for that dataset? Or could choosing this linked service (created for the Synapse workspace) for other datasets cause an issue, meaning we should avoid using it for other datasets inside our Synapse workspace?
From your comment, I understood that you want to know whether the same linked service can be used in both source and sink datasets.
Unfortunately, you cannot use the same linked service in both source and sink. It may cause an issue, and hence you should avoid using the same linked service.

ETL using azure table storage

Is there a way I can transform per-minute data logged in Azure Table storage into hourly, daily and monthly tables?
I have heard of Stream Analytics and Data Lake but don't see how this can be done with those two technologies.
As far as I know, we can do that easily with Azure Data Factory in the Azure portal. Please try to follow my detailed steps below (a hedged scheduling sketch follows the steps).
1. Log in to the new Azure portal.
2. Add a Data Factory.
3. Click [Copy data (preview)] to set the properties; we can set the recurring pattern to minute, hourly, daily, etc., as we like.
4. Choose the source data store; in this demo I choose Azure Table storage.
5. Specify a new Azure Storage connection.
6. Select the tables in the storage account from which to copy data.
7. Apply a filter if we want to.
8. Select the destination data store.
9. Set the table mapping.
10. Select the parallel copy settings.
11. Review the settings summary.
12. We can check from Data Factory that the copy action has completed.
13. Check the result in the Azure storage table.
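The wizard sets the recurrence for you; if you later move to Data Factory V2, the same recurring copy can be scheduled with a schedule trigger via the Python SDK. This is a minimal sketch under assumptions: the pipeline doing the Table storage copy already exists, its name "HourlyAggregationPipeline" and the other identifiers are placeholders, and the trigger-start call is a long-running operation in recent SDK versions.

from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    ScheduleTrigger, ScheduleTriggerRecurrence, TriggerResource,
    TriggerPipelineReference, PipelineReference
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
resource_group = "<resource-group>"      # placeholder
factory_name = "<data-factory-name>"     # placeholder

# Run the aggregation pipeline once an hour; frequency also accepts "Minute", "Day", "Week", "Month".
recurrence = ScheduleTriggerRecurrence(
    frequency="Hour",
    interval=1,
    start_time=datetime(2024, 1, 1, tzinfo=timezone.utc),
    time_zone="UTC",
)

trigger = TriggerResource(properties=ScheduleTrigger(
    recurrence=recurrence,
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(reference_name="HourlyAggregationPipeline")
    )],
))

adf_client.triggers.create_or_update(resource_group, factory_name, "HourlyTrigger", trigger)
adf_client.triggers.begin_start(resource_group, factory_name, "HourlyTrigger").result()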
