Export data from Azure Synapse SQL - azure

We have Azure Synapse with an external data source on Azure Data Lake Storage Gen2.
We need to export T-SQL query results as a CSV file on a weekly schedule from Azure Synapse to any Blob storage or FTP. I could not find documentation related to exporting from Synapse. Please guide me through this - I've been stuck on it for a long time.

Per this answer, I think the approach is:
Make a Data Flow where
- the source is the Synapse DB and you pass in the query you want
- the sink is a CSV file in ADLS Gen2
Then make an ADF pipeline with a weekly schedule trigger that calls your Data Flow.
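If you'd rather script the weekly trigger than click it together in the ADF UI, a rough sketch with the azure-mgmt-datafactory Python SDK is below; the subscription, resource group, factory, pipeline name ("ExportSynapseToCsv") and schedule are all placeholders for whatever you create above:

```python
# Rough sketch: attach a weekly schedule trigger to an existing ADF pipeline.
# Assumes the pipeline (here called "ExportSynapseToCsv") already exists.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    RecurrenceSchedule,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

trigger = ScheduleTrigger(
    recurrence=ScheduleTriggerRecurrence(
        frequency="Week",
        interval=1,
        start_time=datetime(2024, 1, 1, tzinfo=timezone.utc),
        time_zone="UTC",
        # fire every Monday at 06:00
        schedule=RecurrenceSchedule(week_days=["Monday"], hours=[6], minutes=[0]),
    ),
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="ExportSynapseToCsv")
        )
    ],
)

client.triggers.create_or_update(
    "<resource-group>", "<factory-name>", "WeeklyExportTrigger",
    TriggerResource(properties=trigger),
)
# A trigger is created in stopped state; it has to be started before it fires.
client.triggers.begin_start("<resource-group>", "<factory-name>", "WeeklyExportTrigger").result()
```

(The same trigger can of course be created in the ADF authoring UI instead.)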

Related

Synapse Analytics: save a query result in blob storage

So, I have some Parquet files stored in containers inside an Azure Blob storage account. I used a Data Factory pipeline with the "Copy data" activity to extract the data from an on-premises Oracle database.
I'm using Synapse Analytics to run a SQL query that uses some of the Parquet files stored in the blob container, and I want to save the results of the query in another blob. Which Synapse connector can I use to make this happen? To run the query I'm using the "Develop" menu inside Synapse Analytics.
To persist the results of a serverless SQL query, use CREATE EXTERNAL TABLE AS SELECT (CETAS). This creates a physical copy of the result data in your storage account as a collection of files in a folder, so you cannot control how many files are produced or how they are named.
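As a rough sketch of what that can look like end to end (run from Python with pyodbc against the serverless SQL endpoint), assuming the database, EXTERNAL DATA SOURCE and EXTERNAL FILE FORMAT below already exist, and with all names as placeholders:

```python
# Rough sketch: persist a serverless SQL query result with CETAS.
# All names below (database, data source, file format, locations) are placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<workspace-name>-ondemand.sql.azuresynapse.net;"
    "Database=<your-database>;"
    "UID=<sql-login>;PWD=<password>;",
    autocommit=True,  # run the statement outside an explicit transaction
)

cetas = """
CREATE EXTERNAL TABLE dbo.QueryResult
WITH (
    LOCATION = 'output/query_result/',      -- folder created under the data source root
    DATA_SOURCE = MyAdlsGen2DataSource,     -- existing EXTERNAL DATA SOURCE
    FILE_FORMAT = MyParquetFormat           -- existing EXTERNAL FILE FORMAT
)
AS
SELECT *
FROM OPENROWSET(
    BULK 'input/*.parquet',
    DATA_SOURCE = 'MyAdlsGen2DataSource',
    FORMAT = 'PARQUET'
) AS src;
"""
conn.cursor().execute(cetas)
```

Note that CREATE EXTERNAL TABLE fails if an external table with that name already exists, so a scheduled job would typically DROP EXTERNAL TABLE first (which removes only the metadata, not the files) and write to a fresh folder each run.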

Is there a way to set up an export of Log Analytics tables to an Azure SQL database?

I am looking for a way to automatically export data from a Log Analytics table into an Azure SQL Database table. Does anyone know how to do this on an automated schedule?
To export Log Analytics data to Blob storage or ADLS, you can try one of the options below:
Log Analytics – data export (preview) and example
Archive data from Log Analytics workspace to Azure storage using Logic App
Next, you can go for the Ingest option in ADF.
Configure the source linked service (Blob Storage or Azure Table storage) and the sink (Azure SQL Database),
or any other relevant source or sink option, depending on where the logs are stored.
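If a small scheduled script is also acceptable, another option is to pull the rows with the Log Analytics query SDK and insert them into Azure SQL directly; a rough sketch where the workspace ID, KQL query, target table and connection string are all placeholders:

```python
# Rough sketch: copy recent Log Analytics rows into an Azure SQL table.
# Run it on whatever scheduler you like (Azure Function timer, Automation, cron, ...).
from datetime import timedelta

import pyodbc
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

logs = LogsQueryClient(DefaultAzureCredential())
response = logs.query_workspace(
    workspace_id="<workspace-id>",
    query="AzureActivity | project TimeGenerated, OperationNameValue, ActivityStatusValue",
    timespan=timedelta(days=7),
)

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<server>.database.windows.net;Database=<database>;"
    "UID=<user>;PWD=<password>;"
)
cursor = conn.cursor()
for table in response.tables:
    for row in table.rows:
        # column order follows the KQL "project" clause above
        cursor.execute(
            "INSERT INTO dbo.AzureActivityExport (TimeGenerated, OperationName, Status) "
            "VALUES (?, ?, ?)",
            row[0], row[1], row[2],
        )
conn.commit()
```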

How to execute SQL scripts (SQL Dacpac file) in Azure Data Factory v2?

I have a requirement to create SQL tables in an Azure SQL Database and then load data into those tables from CSV files in Blob storage. The tables need to be created and then dropped after the process completes.
We usually use a dacpac to deploy the database objects from an Azure DevOps pipeline.
Is there a way to execute this dacpac from ADF?
Or is there any activity in Azure Data Factory v2 to execute the scripts, if I maintain these scripts in storage?
Any suggestions are much appreciated!
As far as I know about Data Factory, we cannot import the DACPAC file into the SQL database directly.
Data Factory will treat the dacpac as a file, not as a SQL script.
And Azure Data Factory only supports the following file formats (refer to each format's article for format-based settings): Avro, Binary, Delimited text, Excel, JSON, ORC, Parquet and XML.
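If you can keep the individual .sql scripts in Blob storage instead of (or alongside) the dacpac, one workaround is a small script (run, for example, from an Azure Function or an ADF Custom activity) that downloads a script and executes it. A rough sketch, with every name and connection string a placeholder:

```python
# Rough sketch: download a .sql script from Blob storage and run it against Azure SQL.
# Everything in angle brackets is a placeholder; an ADF pipeline could call this
# (e.g. via an Azure Function or a Custom activity) before/after its copy steps.
import pyodbc
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="sql-scripts",
    blob_name="create_staging_tables.sql",
)
script = blob.download_blob().readall().decode("utf-8")

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=<server>.database.windows.net;Database=<database>;"
    "UID=<user>;PWD=<password>;",
    autocommit=True,
)
cursor = conn.cursor()
# naive GO-batch splitting; adjust if your scripts use GO inside strings or comments
for batch in script.split("\nGO"):
    if batch.strip():
        cursor.execute(batch)
```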
Hope this helps.

Is there a way to load data to Azure Data Lake Storage Gen2 using a Logic App?

I have to load data to Azure Data Lake Storage Gen2 using a Logic App. I tried using the Azure File Storage connector but I couldn't get any filesystem folder in it. Can someone help me with this issue?
Note: without using the Copy activity.
Currently, there is no connector for Data Lake Gen2 in Logic Apps. https://feedback.azure.com/forums/287593-logic-apps/suggestions/37118125-connector-for-azure-data-lake-gen-2.
Here is a workaround which I have tested to work.
1. Create an Azure Data Factory service.
2. Create a pipeline to copy files from Data Lake Gen1 to Data Lake Gen2: https://learn.microsoft.com/en-us/azure/data-factory/load-azure-data-lake-storage-gen2#load-data-into-azure-data-lake-storage-gen2.
3. Use the Data Factory connector in the Logic App to create a pipeline run.
Once the run completes successfully, the related files will be copied to the target folder under Data Lake Gen2.
Isn't ADLS Gen2 just a blob container? Select the Azure Blob Storage connector, then Create Blob task.
I selected "Azure Blob Storage" as action in logic app and then selected my ADLSGen2 storage account name. it is working fine. Do you guys see any issue ??
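For what it's worth, the same point can be demonstrated with the plain Blob SDK, since an ADLS Gen2 filesystem is exposed as a Blob container; the account, container and credential below are placeholders:

```python
# Rough sketch: write a file into an ADLS Gen2 account through the ordinary Blob API,
# i.e. the same thing the "Azure Blob Storage" connector / "Create blob" action does.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential="<account-key-or-sas-token>",
)
container = service.get_container_client("<filesystem-name>")  # Gen2 filesystem == blob container
container.upload_blob(name="folder/export.csv", data=b"col1,col2\n1,2\n", overwrite=True)
```

The one caveat is that the Blob API is not aware of hierarchical-namespace features such as POSIX ACLs, but for simply creating files it works.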

Is it possible to read an Azure Databricks table from Azure Data Factory?

I have a table in an Azure Databricks cluster, and I would like to replicate this data into an Azure SQL Database, to let other users analyze it from Metabase.
Is it possible to access Databricks tables through Azure Data Factory?
No, unfortunately not. Databricks tables are typically temporary and last as long as your job/session is running. See here.
You would need to persist your Databricks table to some storage in order to access it. Change your Databricks job to dump the table to Blob storage as its final action. In the next step of your Data Factory job, you can then read the dumped data from the storage account and process it further.
Another option may be Databricks Delta, although I have not tried this yet...
If you register the table in the Databricks Hive metastore, then ADF could read from it using its ODBC source. Though this would require a (self-hosted) integration runtime (IR).
Alternatively, you could write the table to external storage such as Blob storage or the data lake. ADF can then read that file and push it to your SQL database.
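A rough sketch of that last approach, run as the final step of the Databricks job (the table name, storage account and path are placeholders, and the cluster must already be configured with credentials for that storage account):

```python
# Rough sketch (Databricks notebook/job): persist the table to external storage so ADF can pick it up.
# `spark` is the SparkSession that Databricks provides automatically in notebooks and jobs.
df = spark.table("my_database.my_table")

(df.write
   .mode("overwrite")
   .parquet("abfss://<container>@<storage-account>.dfs.core.windows.net/exports/my_table"))

# An ADF Copy activity can then read these Parquet files and load them into the Azure SQL database.
```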
