Can we copy data from Azure Analysis Services using Azure Data Factory?

I cannot find any connector related to this and wanted to know if this is possible.

By copying data from Azure Analysis Services, do you mean pulling the aggregated data from the OLAP system into a target store?
If yes, then there is no direct connector, but you can pull the data via a linked server through ADF v2.
First, we need to create a linked server between an IaaS/on-premises/SQL MI database instance and the AAS server.
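As a rough sketch (not taken from the original post), the linked server can be defined on that SQL instance with something like the following; the MSOLAP provider is the standard Analysis Services OLE DB provider, while the region, server, model, and credential values are placeholders you would replace with your own:
-- Run on the SQL Server / SQL MI instance that ADF will connect to.
-- Requires the MSOLAP (Analysis Services OLE DB) provider to be installed;
-- <region>, <aas-server>, <model-name> and the credentials are placeholders.
EXEC master.dbo.sp_addlinkedserver
    @server = N'AZUREAS',
    @srvproduct = N'',
    @provider = N'MSOLAP',
    @datasrc = N'asazure://<region>.asazure.windows.net/<aas-server>',
    @catalog = N'<model-name>';

EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname = N'AZUREAS',
    @useself = N'False',
    @locallogin = NULL,
    @rmtuser = N'<aad-user-or-app-id>',
    @rmtpassword = N'<password-or-client-secret>';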
Then, in ADF, create a linked service connecting to that database instance and query Azure Analysis Services through it via OPENQUERY:
select * from openquery(AZUREAS,
'evaluate SUMMARIZECOLUMNS(
''Geography''[City],
"My Measure", [My Measure]
)')
This is how you can copy AAS data into a destination using ADF.
Blog post for the same:
https://datasharkx.wordpress.com/2021/03/16/copy-data-from-ssas-aas-through-azure-data-factory/

You can also use the Azure Analysis Services REST API.

Related

Unable to create a C# object in Azure Data Factory

I'm new to Azure Data Factory. How can I create a C# object in Azure Data Factory? Also, I'm not sure how to create a SQL connection in ADF. Could somebody please guide me on this?
Thanks @Peter Bons for the valuable suggestions on this.
Follow the detailed process below to achieve your requirement.
Create an Azure Data Factory from the portal.
Create SQL Server and Azure Storage linked services with a self-hosted integration runtime.
From Azure Data Factory Studio, launch the Data Factory UI in a new tab.
In the pipeline, create a new dataset and select SQL Server.
Create a new linked service (SQL Server) and test the connection; it will show Connection successful once all the required values are given.
Linked Service Screenshot:
References:
Copy data from a SQL Server database to Azure Blob storage by using Azure Data Factory.
Create an Azure SQL Database linked service using UI.

What to do when my Data source is not supported by Azure Synapse's Data Flow?

I am trying to transform data from Salesforce before loading it to a dedicated SQL pool.
When I try to create a dataset from Synapse's Data Flow, I am not able to choose Salesforce as a data store.
Can anyone suggest how to transform data from Salesforce, or any other data source that is not supported by Data Flow?
As per the official documentation, Data Flows currently do not support Salesforce data as a source or sink.
If you want, you can raise a feature request in the Synapse portal.
As an alternative, you can use the Copy activity in Azure Data Factory to copy data from Salesforce to the dedicated SQL pool, and then transform it using Data Flows in Synapse, from dedicated SQL DB to dedicated SQL DB.
Follow the steps below to achieve your requirement:
First, create a Data Factory workspace.
Select the Author hub and create a pipeline. Now, drag the Copy activity onto the canvas and select the source. You can see that Salesforce is supported when you select a new source dataset. Select it and create a linked service for it.
Now, select the sink dataset and click on Azure Synapse Analytics.
Create a linked service for the dedicated SQL database and select it.
Then, select the target table in the dedicated SQL pool and copy your data by running the pipeline.
After this copy, go to the Synapse workspace and click on the source of the Data Flow.
Select Azure Synapse Analytics as the source and click on continue.
Now, click on New to create a linked service for the SQL DB. Give the subscription and server name and authenticate with your database.
After creating the linked service, select it and point it to the table that resulted from the copy in the DB.
Now, go to the sink, select Azure Synapse Analytics, create another linked service for it as above, and select the resultant table in the DB that you want after the transform.
By following the above process, we can transform the Salesforce data into the dedicated SQL DB.
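If you would rather do the reshaping with plain T-SQL instead of a Data Flow, the staged table can also be transformed directly inside the dedicated SQL pool with a CTAS statement. This is only a sketch; the table and column names below are hypothetical:
-- Hypothetical staging table loaded by the Copy activity; adjust the names,
-- columns and distribution to your own model.
CREATE TABLE dbo.Salesforce_Account_Transformed
WITH (DISTRIBUTION = ROUND_ROBIN, HEAP)
AS
SELECT
    Id                        AS AccountId,
    UPPER(Name)               AS AccountName,
    CAST(CreatedDate AS date) AS CreatedDate
FROM dbo.Salesforce_Account_Staging
WHERE IsDeleted = 0;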
Can anyone suggest how to transform data from Salesforce, or any other data source that is not supported by Data Flow?
You can try this approach for data stores that are not supported by Data Flows; please refer to the list of data stores supported by the Copy activity before applying this process to other data stores.

How to access a Redshift DB through VPN to extract data and load into own Azure environment?

I'm pretty new to the Azure environment, and so far my search for information wasn't very successful.
The problem is as follows:
We want to access a Redshift DB which you can only connect to if you are connected to a specific VPN beforehand; this is the main problem.
We then want to build an automated data pipeline which extracts daily updated data from the Redshift DB and creates our own analytics solution from it.
How can that be set up as a fully automated workflow, in the simplest and most efficient way, with the tools available on the Azure platform?
Thanks for the help.
If the VPN is not the challenge and you just need to extract the data from the Redshift DB and store it in an Azure service like Blob Storage or Azure Synapse Analytics, then the best way is to use Azure Data Factory, a fully managed, serverless data integration service.
You can copy data from Amazon Redshift to any supported sink data store using the Copy activity. For a list of data stores that are supported as sources/sinks by the Copy activity, see the Supported data stores table.
Specifically, the Amazon Redshift connector supports retrieving data from Redshift using a query or the built-in Redshift UNLOAD support.
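For the daily extraction, the Copy activity's source can use such a query rather than a full table dump. A minimal sketch, with hypothetical table and column names:
-- Hypothetical Redshift table and columns; pulls only the rows changed in the last day.
SELECT *
FROM public.sales_orders
WHERE updated_at >= DATEADD(day, -1, GETDATE());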
Note: When copying data to an Azure data store, see Azure Data Center IP Ranges for the Compute IP address and SQL ranges used by the Azure data centers.
In case you need to import data into an Azure SQL Database from AWS Redshift, follow the link.

How do I set up an AutoResolve Integration Runtime in Azure Purview

I am trying to test out Azure Purview and connect it to an Azure SQL Server. Since the SQL Server is hosted in the cloud, I want to use the default AutoResolve Integration Runtime to get connected, but there is not one set up, nor an option to set up a new one. Has anyone else using Purview been able to set up (or needed to set up) an AutoResolve IR?
To connect to Azure SQL DB/MI, you can go directly to the Azure Purview portal, register a new data source, and select Azure SQL DB/MI.
In this article - Manage data sources in Azure Purview (Preview), you learn how to register new data sources, manage collections of data sources, and view sources in Azure Purview (Preview).
You only need to set up a self-hosted integration runtime to scan the data source when connecting to an on-premises SQL Server.
If the data source is located in Azure, you don't need any integration runtime to scan it.
Reference: Register and scan an Azure SQL Database.
CHEEKATLAPRADEEP-MSFT is absolutely correct. To go a step further: since you know what an auto-resolve integration runtime is, you are probably using Azure Data Factory, so in addition to registering your SQL Server, you can also link your Azure Data Factory for data lineage purposes. Based on the pipelines that are executed, it will automatically create the data lineage.
Navigation to Link Data Factory
Data lineage created by linking Data Factory
Keep in mind that you will have to execute pipelines after linking for it to pick up the data lineage. Also, for sources or destinations that are not yet supported, it will not capture the lineage.

Azure Data Lake Store exposed via OData

Is it possible to expose an Azure Data Lake Store via OData? The main goal is to consume this service in Salesforce (via Salesforce Connect).
If so, should it take place through Azure Data Factory?
Update
A bit more context:
We have historical data stored in Azure Data Lake Storage (ADLS) that we want to expose via OData (to be visualised in Salesforce via Salesforce Connect / external objects). After digging into the issue and potential solutions, we don't think ADLS is the right service to be used in this particular case. Instead, we might need to configure a Data Factory pipeline to copy the data we are interested in to a SQL Database and read the data from there via a simple ASP.NET application using an Entity Data Model and the WCF Data Services Entity Framework Provider (we got some insights from this website).
I don't think OData has a connector for ADLS. However, given that OData is basically a REST API, you could probably build an OData API over the existing ADLS REST APIs if they are not providing what you need. I am not sure how ADF would come into the picture.
Maybe it would be useful if you told us what you want to achieve.
