I need to build a cube on Azure Analysis Services by connecting to a Snowflake DB.
It seems Azure Analysis Services does not provide a connector to Snowflake. Can anyone suggest how to overcome this?
First, on your laptop install both the 32-bit and 64-bit ODBC drivers for Snowflake. Then open the "ODBC Data Sources (32-bit)" app and create a new system DSN called "Snowflake" using the Snowflake ODBC driver. Repeat in the "ODBC Data Sources (64-bit)" app, creating another system DSN named identically to the 32-bit one. Make sure you set tracing=0 in both the 32-bit and 64-bit ODBC connection dialog properties, since setting tracing=6 kills cube processing performance.
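If you want to sanity-check the DSN before building the model, here's a minimal sketch using pyodbc; the DSN name matches the one created above, and the credentials are placeholders:

```python
# Quick sanity check that the "Snowflake" system DSN resolves and authenticates.
# Requires pyodbc (pip install pyodbc); UID/PWD below are placeholders.
import pyodbc

conn = pyodbc.connect("DSN=Snowflake;UID=my_user;PWD=my_password", autocommit=True)
cursor = conn.cursor()

# CURRENT_VERSION() is a built-in Snowflake function, handy as a no-op probe.
cursor.execute("SELECT CURRENT_VERSION()")
print(cursor.fetchone()[0])

conn.close()
```

Running this once under 32-bit Python and once under 64-bit Python exercises both drivers, since the ODBC driver bitness must match the calling process.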
Next, on an appropriate VM (preferably an Azure VM in the same Azure region as Snowflake), ensure the On-premises Data Gateway is set up for Azure Analysis Services. (Though Snowflake is not on-premises, it is not a supported cloud data source, so it must go through the gateway.) On that VM, repeat the above ODBC steps.
In Visual Studio, choose File... New... Project... Analysis Services... Tabular... Analysis Services Tabular Project. Choose compatibility level "SQL Server 2017/Azure Analysis Services (1400)" and choose "Integrated workspace".
Then in Tabular Model Explorer, right-click the Data Sources folder and choose "Add Data Source". Choose ODBC as the data source, then pick your DSN name from the dropdown.
Choose which tables you wish to import. Once the model is ready, deploy it to Azure Analysis Services; it should use the ODBC driver on the gateway VM to connect to Snowflake.
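Once deployed, you can also kick off processing programmatically. Here is a rough sketch against the Azure Analysis Services asynchronous refresh REST API; the region, server, and model names are placeholders, so verify them against your own deployment:

```python
# Sketch: trigger a full refresh of the deployed model via the Azure AS
# async-refresh REST API. Region, server, and model names are placeholders.
import requests
from azure.identity import DefaultAzureCredential

REGION, SERVER, MODEL = "westus", "myaasserver", "SnowflakeModel"

# Token scope is the Azure Analysis Services resource.
token = DefaultAzureCredential().get_token("https://*.asazure.windows.net/.default")

resp = requests.post(
    f"https://{REGION}.asazure.windows.net/servers/{SERVER}/models/{MODEL}/refreshes",
    headers={"Authorization": f"Bearer {token.token}"},
    json={"Type": "Full", "CommitMode": "transactional"},
)
resp.raise_for_status()
print(resp.headers.get("Location"))  # poll this URL for refresh status
```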
I have an ODBC data source and I need to use it with Power BI. It works fine locally.
In order to refresh the data and keep users up to date, I need to send this data to the cloud (Azure Blob Storage); then Power BI can connect directly to Blob Storage and consume the data.
As discussed here: stackoverflow-topic, we need an on-premises server to have this ODBC source running.
It sounds quite confusing to me: we need the data in the cloud, using ODBC as the data source and "publishing" the data via Blob Storage, and for that we need a local server.
I'm a beginner with Azure, but... don't we have any other way to configure this process without a local server or an expensive virtual machine on Azure? It looks like we are running in circles here.
Thank you!
Your options:
1. Install Power BI Gateway (standard mode) on a server to host the ODBC driver
2. Install Power BI Gateway (personal mode) on your PC to host the ODBC driver
3. Copy the data to Azure (Blob Storage, Azure SQL Database, etc.) and configure the Power BI model to load the data from there instead of ODBC
4. Manually refresh your data in Power BI Desktop and re-publish it to Power BI whenever the data needs to be updated.
For Option 3 you can use any tool you want to copy the data to Azure. Azure Data Factory is one option, and for that you would need to install the Self-Hosted Integration Runtime to host the ODBC driver. But you can also extract the data to files and copy them to Azure Storage with AzCopy; a sketch of that extract-and-upload flow follows.
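To illustrate Option 3, here is a minimal sketch in Python with pyodbc and azure-storage-blob; the DSN, query, container, and connection string are all placeholders for your own values:

```python
# Minimal sketch of Option 3: extract from the local ODBC source to a CSV
# file, then upload it to Azure Blob Storage for Power BI to consume.
import csv

import pyodbc
from azure.storage.blob import BlobClient

# Extract: read rows from the local ODBC data source.
conn = pyodbc.connect("DSN=MyOdbcSource")
cursor = conn.cursor()
cursor.execute("SELECT * FROM my_table")

with open("extract.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cursor.description])  # header row
    writer.writerows(cursor.fetchall())
conn.close()

# Load: push the file to Blob Storage.
blob = BlobClient.from_connection_string(
    conn_str="<storage-account-connection-string>",
    container_name="powerbi-data",
    blob_name="extract.csv",
)
with open("extract.csv", "rb") as data:
    blob.upload_blob(data, overwrite=True)
```

Scheduled from Task Scheduler or cron, this gives you a refresh pipeline without a gateway, at the cost of the data only being as fresh as the last run.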
I am trying to test out Azure Purview and connect it to an Azure SQL Server. Since the SQL server is hosted in the cloud, I want to use the default AutoResolve integration runtime to get connected, but there is not one set up, nor an option to set up a new one. Has anyone else using Purview been able to set up (or needed to set up) an AutoResolve IR?
To connect to Azure SQL DB/MI you can directly go to the Azure Purview portal and register new data sources and select Azure SQL DB/MI.
In this article - Manage data sources in Azure Purview (Preview), you learn how to register new data sources, manage collections of data sources, and view sources in Azure Purview (Preview).
You only need to set up a self-hosted integration runtime to scan an on-premises SQL Server data source.
If the data source is located in Azure, you don't need any integration runtime to scan it.
Reference: Register and scan an Azure SQL Database.
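If you'd rather script the registration than click through the portal, something along these lines should work against the Purview scanning REST API. Note the endpoint path, api-version, and payload shape here are assumptions to verify against the current Purview REST reference:

```python
# Rough sketch: register an Azure SQL DB source through the Purview
# scanning REST API. The endpoint path, api-version, and payload shape
# are assumptions -- verify them against the Purview REST docs.
import requests
from azure.identity import DefaultAzureCredential

ACCOUNT = "my-purview-account"   # placeholder Purview account name
SOURCE = "MyAzureSqlDb"          # placeholder name for the registered source

# Acquire an AAD token for the Purview data plane.
token = DefaultAzureCredential().get_token("https://purview.azure.net/.default")

resp = requests.put(
    f"https://{ACCOUNT}.purview.azure.com/scan/datasources/{SOURCE}",
    params={"api-version": "2022-02-01-preview"},  # assumed api-version
    headers={"Authorization": f"Bearer {token.token}"},
    json={
        "kind": "AzureSqlDatabase",
        "properties": {"serverEndpoint": "myserver.database.windows.net"},
    },
)
resp.raise_for_status()
print(resp.json())
```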
CHEEKATLAPRADEEP-MSFT is absolutely correct. To go a step further: since you know what an AutoResolve integration runtime is, you are probably using Azure Data Factory, so in addition to registering your SQL Server, you can also link your Azure Data Factory for data lineage purposes. Based on the pipelines that are executed, it will autonomously create the data lineage.
[Screenshot: Navigation to Link Data Factory]
[Screenshot: Data lineage created by linking Data Factory]
Keep in mind, you will have to execute pipelines after linking for it to pick up the data lineage. Also, for sources or destinations not yet supported, it will not capture the data lineage.
I have recently started using MS Power BI and have come across a problem which seems inconsistent, and the answers likewise.
I am connecting to an Azure SQL database, and have therefore chosen this as the data source in the desktop app. Everything seems to be working just fine, and I can create tables, graphs and whatnot. One thing is off, though: when I choose Azure SQL DB as the source, the connection dialog box does not appear to be any different from the one for a plain (non-Azure) SQL DB. Puzzling.
The other thing, which is actually the main issue: in the Power BI service (the website), I can open my published reports, but some of them don't show up, and I get an error message in a pink bar at the top saying the data source is not available because the gateway can't be reached. I am well aware of this, because I have deliberately stopped the gateway service (PBIEgwService) running locally, since I have read in several places that if the data source is Azure, an on-premises gateway is not needed. (E.g.: "Question: Do I need a gateway for cloud data sources like Azure SQL Database?
Answer: No! The service will be able to connect to that data source without a gateway." here: https://learn.microsoft.com/en-us/power-bi/service-gateway-onprem-faq)
So in short: Why does PBI not (always) connect directly to Azure?
And yes, I have checked the credentials. I can connect just fine in PBI desktop.
Are you allowing Azure Services to connect in your Azure SQL firewall?
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-firewall-configure#manage-server-level-ip-firewall-rules-using-the-azure-portal
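For reference, the portal's "Allow Azure services" toggle corresponds to a special server-level firewall rule whose start and end addresses are both 0.0.0.0. Here's a sketch using the azure-mgmt-sql SDK, with the subscription, resource group, and server names as placeholders:

```python
# Sketch: enable "Allow Azure services and resources to access this server"
# programmatically. The rule name and the 0.0.0.0-0.0.0.0 range are how the
# portal toggle is represented; subscription/group/server names are
# placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import FirewallRule

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

client.firewall_rules.create_or_update(
    resource_group_name="my-resource-group",
    server_name="my-sql-server",
    firewall_rule_name="AllowAllWindowsAzureIps",  # the portal toggle's rule name
    parameters=FirewallRule(start_ip_address="0.0.0.0", end_ip_address="0.0.0.0"),
)
```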
I'm using Windows Azure. I have two virtual machines set up in Azure, and both virtual machines host sites. I have a database on an Azure SQL server. I just want to move the Azure SQL database to the virtual machine's SQL Server.
How can I move the database from Azure SQL to the virtual machine?
Follow this link; it describes the step-by-step process:
http://blogs.msdn.com/b/sqlazure/archive/2010/05/17/10014014.aspx
You can also export your Azure SQL Database as a BACPAC file, move it to your VM, and import the BACPAC into the on-prem database. This process is easier and involves fewer steps than exporting via Data Export.
Here is an article on how to export/import BACPAC files: http://fabriccontroller.net/blog/posts/backup-and-restore-your-sql-azure-database-using-powershell/
Not much of the GUI is explained there, but the first 2 screenshots show you how to get to the export and import menus.
I recommend practicing on non-production DBs first, before you go ahead with your critical data.
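If you want to script the round trip instead of clicking through the GUI, here is a rough sketch shelling out to SqlPackage; all server names, credentials, and file paths below are placeholders, and SqlPackage must be installed and on the PATH:

```python
# Sketch: export a BACPAC from Azure SQL Database, then import it into the
# VM's local SQL Server, using the SqlPackage command-line tool.
import subprocess

BACPAC = r"C:\temp\mydb.bacpac"  # placeholder local path for the BACPAC

# Export from Azure SQL Database.
subprocess.run([
    "SqlPackage", "/Action:Export",
    "/SourceConnectionString:Server=myserver.database.windows.net;"
    "Database=mydb;User Id=my_user;Password=my_password;",
    f"/TargetFile:{BACPAC}",
], check=True)

# Import into the local SQL Server on the VM.
subprocess.run([
    "SqlPackage", "/Action:Import",
    f"/SourceFile:{BACPAC}",
    "/TargetConnectionString:Server=localhost;Database=mydb;"
    "Integrated Security=True;",
], check=True)
```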
Is there a way to connect SQL Server 2008 R2 to the Azure Data Marketplace to enable data import?
Are there any ODBC or JDBC drivers for the Azure Data Marketplace?
I'm a bit confused by the question. Is this about publishing data through the Windows Azure Marketplace and sourcing it from SQL Server? Or is it about accessing published data from an application and bringing that data into your own app?
If the former:
You may choose to host your data in SQL Server. When you sign up for data hosting in the Windows Azure Marketplace, you'll provide the requisite connection strings for your servers. You don't have to worry about ODBC/JDBC drivers. See the data publishing documentation for more details.
If the latter: Data may be accessed via HTTP/OData, not ODBC/JDBC. It's a metered consumption model, so you need to subscribe to a particular data feed, which then gives you an access token. Check out this video from TechEd last year to see more about this, along with a .NET code sample. You can easily access data from any other language as well.
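To make the consumption model concrete, here is a rough sketch of pulling a Marketplace OData feed from Python. The feed URL is a placeholder, and the exact auth scheme (your account key as the Basic-auth password) should be verified for your subscription:

```python
# Rough sketch of consuming a Marketplace OData feed over HTTP.
# The feed URL is a placeholder; Marketplace feeds used your account key
# for authentication (shown here as the Basic-auth password -- verify
# the exact scheme for your subscription).
import requests

FEED_URL = "https://api.datamarket.azure.com/<publisher>/<dataset>/v1/<entity>"

resp = requests.get(
    FEED_URL,
    params={"$top": "10", "$format": "json"},  # standard OData query options
    auth=("", "<your-account-key>"),           # Basic auth: key as password
)
resp.raise_for_status()

for row in resp.json()["d"]["results"]:  # OData v2 JSON envelope
    print(row)
```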
If your goal is to access the data feed directly from SQL Server: I'm no expert in CLR stored procedures, but if CLR SPs support code that can access a web service endpoint, I guess you could write a CLR SP to access a data feed, pull the data down, and populate local tables. I have no idea if this is supported or advisable...