Call stored procedure from ADF and sink to Azure SQL database

I'm struggling with calling a stored procedure (created in SSMS against Azure Synapse serverless SQL pools) from ADF and sinking the results to an Azure SQL database.
I have a Copy data activity whose source dataset is linked to my Synapse Analytics serverless SQL pool.
My sink is connected to an Azure SQL database, with the parameter in the sink coming from the Azure SQL database dataset.
This is where I want to write the output of the stored procedure. The problem is that I cannot figure out how to run TRUNCATE TABLE in the pre-copy script.

There is a text box for the pre-copy script in your second screenshot.
Just add TRUNCATE TABLE YourTableName in the box, replacing YourTableName with the actual name of your table.
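For example, a minimal pre-copy script sketch, assuming a hypothetical target table dbo.MyTargetTable (TRUNCATE TABLE fails if the table does not exist yet, so a guard helps on the very first run):

-- dbo.MyTargetTable is a placeholder; use your actual sink table name.
-- Guard so the first pipeline run does not fail before the table exists.
IF OBJECT_ID('dbo.MyTargetTable', 'U') IS NOT NULL
    TRUNCATE TABLE dbo.MyTargetTable;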

Related

Azure Logs Kusto Query Output to database table

I need to save the output from a Kusto query on monitoring logs into a database table, but I am unable to find a way to do it. I am presuming there will be a way to get the output from a Kusto query, save it to storage, and then pull that data into a table using a pipeline.
Any suggestions welcome.
I have reproduced this in my environment and got the expected results as below.
First, I executed the below Kusto query and exported the result to a CSV file on my local machine:
AzureActivity
| project OperationName,Level,ActivityStatus
I then uploaded the CSV file from my local machine into my Blob storage account.
Next, I created an ADF instance, created a new pipeline in it, and added a Copy activity to that pipeline.
Then I created a linked service for Blob storage as the source and a linked service for the SQL database as the sink.
In the source dataset I pointed to the Blob CSV file.
In the sink dataset I pointed to the SQL database table.
In the Copy activity's sink settings I set the table option to Auto create table.
The output in the SQL query editor confirmed the load.
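A hedged sketch of a verification query; the table name below is a placeholder, since the actual name depends on what the Auto create table option generated:

-- Placeholder table name; use the table ADF auto-created in your database.
SELECT TOP (10) OperationName, Level, ActivityStatus
FROM dbo.AzureActivity;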
So what we do now is: we have created a Logic App that runs the query in real time and returns the data via HTTP, and then we save that to the table. No manual intervention.

What to do when my Data source is not supported by Azure Synapse's Data Flow?

I am trying to transform data from Salesforce before loading it to a dedicated SQL pool.
When I try to create a dataset from Synapse's Data flow, I am not able to choose Salesforce as a data store.
Can anyone suggest how to transform data from Salesforce, or any other data source that is not supported by Data flow?
As per the official documentation, Data flows currently do not support Salesforce data as a source or sink.
If you want, you can raise a feature request in the Synapse portal.
As an alternative, you can use the Copy activity in Azure Data Factory to copy data from Salesforce to the dedicated SQL pool, and then transform it using a Data flow in Synapse from dedicated SQL DB to dedicated SQL DB.
Follow the below steps to achieve your requirement:
First, create a Data Factory workspace.
Select the Author hub and create a pipeline. Now drag the Copy activity into the workspace and select the source. You can see that Salesforce is supported when you select a new source dataset. Select it and create a linked service for it.
Now select the sink dataset and click on Azure Synapse Analytics.
Create a linked service for the dedicated SQL database and select it.
Then you can select the table in the dedicated SQL pool and copy your data by running the pipeline (a sketch of a possible staging table is shown below).
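A minimal sketch of such a staging table in the dedicated SQL pool; the table and column names are hypothetical placeholders for whatever your Salesforce copy actually produces:

-- Hypothetical staging table for the copied Salesforce rows.
-- ROUND_ROBIN is a reasonable default distribution for a staging table.
CREATE TABLE dbo.SalesforceAccountStaging
(
    AccountId   NVARCHAR(18)  NOT NULL,
    AccountName NVARCHAR(255) NULL,
    CreatedDate DATETIME2     NULL
)
WITH (DISTRIBUTION = ROUND_ROBIN, HEAP);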
After this copy, go to the Synapse workspace and click on the source of the Data flow.
Select Azure Synapse Analytics as the source and click Continue.
Now click New to create a linked service for the SQL DB. Give the subscription and server name and authenticate with your database.
After creating the linked service, select it and point it at the table in the DB that resulted from the copy.
Now go to the sink, select Azure Synapse Analytics, create another linked service for it in the same way, and select the target table in the DB that you want after the transform.
By following the above process, you can achieve the transformation of Salesforce data into a dedicated SQL DB.
Can anyone suggest how to transform data from Salesforce or any other data source that is not supported by Data flow?
You can try this approach for data stores that are not supported by Data flows, and please refer to the documentation to check the various data stores supported by the Copy activity before applying this process to other data stores.

Loading data into Azure Synapse Analytics from Azure SQL Database

I am following this tutorial to move data from SQL to Azure Synapse: https://learn.microsoft.com/en-us/azure/data-factory/load-azure-sql-data-warehouse?tabs=data-factory
However, once I get to step 5c I cannot select a database name. Do I have to create an Azure Synapse database first to copy data over there? I thought that is what this tutorial would do.
I have a SQL database and I want to move the data into Azure Synapse.
Thanks
Yes, in Azure Data Factory your source and sink need to already be present for database scenarios.
So it is expected that you already have an Azure SQL database and an Azure SQL data warehouse (dedicated SQL pool) in place before proceeding with the Copy activity, as in the sketch below.
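For example, a hedged sketch of pre-creating a target table in the dedicated SQL pool so that step 5c has something to select; all names here are hypothetical:

-- Placeholder target table in the dedicated SQL pool (Synapse).
-- HASH distribution on the key column suits larger fact-style tables.
CREATE TABLE dbo.SalesOrders
(
    OrderId     INT           NOT NULL,
    CustomerId  INT           NOT NULL,
    OrderDate   DATE          NULL,
    TotalAmount DECIMAL(18,2) NULL
)
WITH (DISTRIBUTION = HASH(OrderId));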

How do I bulk insert into an Azure SQLServer Database?

I would like to do bulk inserts into my Azure database from Python, but I can't find documentation for how it's done.
This page says:
The following table summarizes the options for moving data to an Azure SQL Database.
The section linked from that table says:
The steps for the procedure using the Bulk Insert SQL Query are similar to those covered in the sections for moving data from a flat file source to SQL Server on an Azure VM.
And that provides the following query:
BULK INSERT <tablename>
FROM '<datafilename>'
WITH
(
    FIRSTROW = 2,          -- skip the header row
    FIELDTERMINATOR = ',', -- this should be the column separator in your data
    ROWTERMINATOR = '\n'   -- this should be the row separator in your data
)
But presumably that data file has to live somewhere, and I can't find where in the documentation it says where this data file should live. I can create a CSV file and upload it as a blob to Azure Storage, but nobody in the last year has had an answer for how to get it from there into SQL Azure.
How can I bulk insert into SQL Azure?
SQL Server vNext CTP supports T-SQL commands that load from Azure Blob Storage. This will be available in Azure SQL Database soon, so you can use the BULK INSERT command from your example to load data from Azure Blob Storage.
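A hedged sketch of what that looks like once available; the storage account, container, path, and table names below are all placeholders, and the database needs a master key before the credential can be created:

-- Requires a database master key; create one first if none exists:
-- CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

-- Credential holding a SAS token for the storage container (placeholder secret).
CREATE DATABASE SCOPED CREDENTIAL MyBlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token, without the leading ?>';

-- External data source pointing at the blob container (placeholder URL).
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
      CREDENTIAL = MyBlobCredential);

-- Bulk load the CSV from blob storage into a placeholder table.
BULK INSERT dbo.MyTable
FROM 'data/myfile.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage',
      FIRSTROW = 2,
      FIELDTERMINATOR = ',',
      ROWTERMINATOR = '\n');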

Azure Data Factory: Moving data from Table Storage to SQL Azure

While moving data from Table Storage to SQL Azure, is it possible to obtain only the delta (the data that hasn't already been moved) using Azure Data Factory?
A more detailed explanation:
There is an Azure Storage table which contains some data that will be updated periodically, and I want to create a Data Factory pipeline which moves this data to an SQL Azure database. But during each move I only want the newly added data to be written to the SQL DB. Is this possible with Azure Data Factory?
See more information on azureTableSourceQuery and the copy activity at this link: https://azure.microsoft.com/en-us/documentation/articles/data-factory-azure-table-connector/#azure-table-copy-activity-type-properties.
Also see this link for invoking a stored procedure for the SQL sink: https://azure.microsoft.com/en-us/documentation/articles/data-factory-azure-sql-connector/#invoking-stored-procedure-for-sql-sink
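As a rough illustration of the stored-procedure sink pattern from that link (the type, procedure, table, and column names here are all hypothetical):

-- Table type matching the shape of the incoming rows (placeholder columns).
CREATE TYPE dbo.EntityRowType AS TABLE
(
    PartitionKey NVARCHAR(100),
    RowKey       NVARCHAR(100),
    Payload      NVARCHAR(MAX)
);
GO
-- Sink procedure ADF can invoke; it upserts each batch it receives.
CREATE PROCEDURE dbo.spUpsertEntityRows @Rows dbo.EntityRowType READONLY
AS
BEGIN
    MERGE dbo.EntityRows AS target
    USING @Rows AS source
        ON target.PartitionKey = source.PartitionKey
       AND target.RowKey = source.RowKey
    WHEN MATCHED THEN
        UPDATE SET target.Payload = source.Payload
    WHEN NOT MATCHED THEN
        INSERT (PartitionKey, RowKey, Payload)
        VALUES (source.PartitionKey, source.RowKey, source.Payload);
END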
You can filter on Timestamp in the source query on each run (for example, an azureTableSourceQuery such as Timestamp gt datetime'2024-01-01T00:00:00Z') to achieve something similar to a delta copy, but this is not a true delta copy.
