ODBC DataSource - How to keep data on Azure cloud to source PowerBI?

I have an ODBC data source and I need to use it with Power BI. It works fine locally.
In order to refresh the data and keep users up to date, I need to send this data to the cloud (Azure Blob Storage) so that Power BI can connect directly to Blob Storage and consume the data.
As discussed here stackoverflow-topic , we need an on-premises server to have this ODBC running.
It sounds quite confusing to me: we need the data in the cloud, using ODBC as the data source and "publishing" the data with Blob Storage, and for that we need a local server.
I'm a complete beginner with Azure, but... don't we have any other way to configure this process without a local server or an expensive virtual machine on Azure? It looks like we are running in circles here.
Thank you!

Your options:
1. Install Power BI Gateway (standard mode) on a server to host the ODBC driver.
2. Install Power BI Gateway (personal mode) on your PC to host the ODBC driver.
3. Copy the data to Azure (Blob Storage, Azure SQL Database, etc.) and configure the Power BI model to load the data from there instead of via ODBC.
4. Manually refresh your data in Power BI Desktop and re-publish it to Power BI whenever the data needs to be updated.
For Option 3 you can use any tool you want to copy the data to Azure. Azure Data Factory is one option; for that you would need to install the Self-Hosted Integration Runtime to host the ODBC driver. But you can also extract the data to files and copy them to Azure Storage with AzCopy, or script the copy yourself, as sketched below.
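For illustration, here is a minimal Python sketch of that last approach, assuming the pyodbc and azure-storage-blob packages; the DSN, query, container, and connection string are placeholders:

```python
import csv
import io

import pyodbc
from azure.storage.blob import BlobServiceClient

# Placeholder DSN and query -- replace with your own ODBC data source.
conn = pyodbc.connect("DSN=MyLocalSource")
cursor = conn.cursor()
cursor.execute("SELECT * FROM sales")

# Write the result set to an in-memory CSV.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow([column[0] for column in cursor.description])
writer.writerows(cursor.fetchall())

# Upload the CSV to Blob Storage, where Power BI can read it.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(container="powerbi-data", blob="sales.csv")
blob.upload_blob(buffer.getvalue(), overwrite=True)
```

Note the script itself still has to run somewhere that can reach the ODBC source (a scheduled task on any on-premises machine is enough), which is the same constraint the thread describes.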

Related

From Azure SQL Database to Snowflake

I am thinking about using Snowflake as a data warehouse. My databases are in Azure SQL Database and I would like to know what tools I need to ETL my data from Azure SQL Database to Snowflake.
I think Snowpark could work for data transformations, but I wonder what other code tools I could use.
Also, I wonder whether I should use Azure Blob Storage as a staging area, or whether Snowflake has its own.
Thanks
You can use Hevo Data, a third-party tool, to migrate data directly from Microsoft SQL Server to Snowflake.
STEPS TO BE FOLLOWED
Make a connection to your Microsoft SQL Server database.
Choose a replication mode.
Create a Snowflake data warehouse configuration.
Alternatively, you can use SnowSQL to connect Microsoft SQL Server to Snowflake: export the data from SQL Server (for example with SSMS), upload it to either Azure Storage or S3, and move the data from storage into Snowflake, as sketched after the references below.
REFERENCES:
Microsoft SQL Server to Snowflake
How to move the data from Azure Blob Storage to Snowflake
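As a sketch of that SnowSQL-style approach, once the exported files are in Blob Storage you stage and bulk-load them in Snowflake. The snippet below uses the snowflake-connector-python package; the account, credentials, stage, and table names are all placeholders:

```python
import snowflake.connector

# Placeholder credentials and object names -- replace with your own.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)
cur = conn.cursor()

# External stage pointing at the Azure Blob Storage container.
cur.execute("""
    CREATE STAGE IF NOT EXISTS azure_stage
    URL = 'azure://<storage_account>.blob.core.windows.net/<container>'
    CREDENTIALS = (AZURE_SAS_TOKEN = '<sas_token>')
""")

# Bulk-load the staged files into the target table.
cur.execute("""
    COPY INTO my_table
    FROM @azure_stage
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")
```

This also answers the staging question: you can point an external stage at your own Blob Storage container as above, or use a Snowflake-managed internal stage and skip the Azure container entirely.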

How to access a Redshift DB through VPN to extract data and load into own Azure environment?

pretty new to the Azure environment and so far my search for information wasn't very successful.
The problem is as follows:
we want to access a Redshift DB which you can only connect to if you are connected to a specific VPN beforehand - this is the main problem
we then want to build an automated data pipeline which extracts daily updated data from the Redshift DB, and to create our own analytics solution from it
how can that be set up in a fully automated workflow, and also in the simplest, most efficient way, with the tools available on the Azure platform?
thanks for the help.
If the VPN is not the challenge and you just need to extract the data from the Redshift DB and store it in an Azure service like Blob Storage or Azure Synapse Analytics, then the best possible way is to use Azure Data Factory. Azure Data Factory is a fully managed, serverless data integration service.
You can copy data using the Copy activity from Amazon Redshift to any supported sink data store. For a list of data stores that are supported as sources/sinks by the Copy activity, see the Supported data stores table.
Specifically, the Amazon Redshift connector supports retrieving data from Redshift using a query or built-in Redshift UNLOAD support.
Note: When copying data to an Azure data store, see Azure Data Center IP Ranges for the Compute IP address and SQL ranges used by the Azure data centers.
In case you need to import data into Azure SQL Database from AWS Redshift, follow the link.
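Purely to illustrate the extract-and-land step that the Copy activity automates, here is a hand-rolled Python equivalent using the psycopg2 and azure-storage-blob packages; the connection details and table are placeholders, and the script must run from a machine that is on the VPN (just as the Self-Hosted Integration Runtime would have to be for a VPN-only cluster):

```python
import csv
import io

import psycopg2
from azure.storage.blob import BlobServiceClient

# Placeholder Redshift connection -- must run from a network that can
# reach the cluster, i.e. behind the VPN.
conn = psycopg2.connect(
    host="<cluster>.redshift.amazonaws.com",
    port=5439,
    dbname="<database>",
    user="<user>",
    password="<password>",
)
cur = conn.cursor()
cur.execute("SELECT * FROM daily_facts WHERE load_date = CURRENT_DATE")

# Serialize the result set as CSV in memory.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow([col.name for col in cur.description])
writer.writerows(cur.fetchall())

# Land the extract in Blob Storage for downstream Azure processing.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
service.get_blob_client(container="landing", blob="daily_facts.csv").upload_blob(
    buffer.getvalue(), overwrite=True
)
```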

Connect Snowflake to Azure Analysis Services to build cube

I need to build a cube on Azure Analysis Services by connecting to a Snowflake DB.
It seems Azure Analysis Services does not provide a connector to Snowflake. Can anyone suggest how to overcome this?
First, on your laptop install both the 32-bit and 64-bit ODBC drivers for Snowflake. Then open "ODBC Data Sources (32-bit)" and create a new system DSN called "Snowflake" using the Snowflake ODBC driver. Repeat in the "ODBC Data Sources (64-bit)" app, creating another system DSN named identically to the 32-bit one. Make sure you set tracing=0 in both the 32-bit and 64-bit ODBC connection dialog properties, since setting tracing=6 kills cube processing performance.
Next, on an appropriate VM (preferably an Azure VM in the same Azure region as Snowflake) ensure the On-premises Data Gateway is set up for Azure Analysis Services. (Though Snowflake is not on-premises, it is not a supported cloud data source, so it must go through the gateway.) On that VM, repeat the above ODBC steps.
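Before building the model, it is worth confirming that each DSN actually resolves; a quick sanity check with the pyodbc package (credentials are placeholders) works on both the laptop and the gateway VM:

```python
import pyodbc

# Uses the system DSN created above; a 64-bit Python tests the 64-bit
# DSN, so run a 32-bit Python to exercise the 32-bit one.
conn = pyodbc.connect("DSN=Snowflake;UID=<user>;PWD=<password>")
print(conn.cursor().execute("SELECT CURRENT_VERSION()").fetchone())
```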
In Visual Studio, choose File... New... Project... Analysis Services... Tabular... Analysis Services Tabular Project. Choose compatibility mode "SQL Server 2017/Azure Analysis Services (1400)" and choose "Integrated workspace".
Then in Tabular Model Explorer right click the Data Sources folder and choose "Add Data Source". Choose ODBC as the data source and then choose your DSN name from the dropdown.
Choose which tables you wish to import. Once the model is ready to deploy, deploy to Azure Analysis Services and it should use the ODBC driver on the gateway VM to connect to Snowflake.

Connecting Power BI to Azure SQL database without on-premises gateway running

I have recently started using MS Power BI and have come across a problem that seems inconsistent, and the answers I have found likewise.
I am connecting to an Azure SQL database, and have therefore chosen this as the data source in the desktop app. Everything seems to be working just fine, and I can create tables, graphs and whatnot. One thing is off, though: when I choose Azure SQL DB as the source, the connection dialog box does not appear to be any different from when I just choose (non-Azure) SQL DB. Puzzling.
The other thing, which is actually the main issue: In Power BI (the website), I can open my published reports, but some of them don't show up, and I get an error message in a pink bar at the top, saying the data source is not available because the gateway can't be reached. I am well aware of this, because I have deliberately stopped the gateway service (PBIEgwService) running locally, because I have read several places that if the data source is Azure, an on-premises gateway is not needed. (E.g.: "Question: Do I need a gateway for cloud data sources like Azure SQL Database?
Answer: No! The service will be able to connect to that data source without a gateway." here: https://learn.microsoft.com/en-us/power-bi/service-gateway-onprem-faq)
So in short: Why does PBI not (always) connect directly to Azure?
And yes, I have checked the credentials. I can connect just fine in PBI desktop.
Are you allowing Azure Services to connect in your Azure SQL firewall?
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-firewall-configure#manage-server-level-ip-firewall-rules-using-the-azure-portal
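The portal toggle described there creates the special server-level rule spanning 0.0.0.0; as a sketch, the same rule can also be created programmatically with the azure-mgmt-sql and azure-identity packages (subscription, resource group, and server name are placeholders):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient

client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

# The 0.0.0.0-0.0.0.0 rule is what "Allow Azure services and resources
# to access this server" sets under the covers.
client.firewall_rules.create_or_update(
    "<resource-group>",
    "<server-name>",
    "AllowAllWindowsAzureIps",
    {"start_ip_address": "0.0.0.0", "end_ip_address": "0.0.0.0"},
)
```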

Connect Azure Data Marketplace to SQL Server 2008 R2

Is there a way to connect SQL Server 2008 R2 to the Azure Data Marketplace to enable data import?
Are there any ODBC or JDBC drivers for the Azure Data Marketplace?
I'm a bit confused by the question. Is this about publishing data through the Windows Azure Marketplace and sourcing it from SQL Server? Or is it about accessing published data from an application and bringing that data into your own app?
If the former:
You may choose to host your data in SQL Server. When you sign up for data hosting in the Windows Azure Marketplace, you'll provide the requisite connection strings for your servers. You don't have to worry about ODBC/JDBC drivers. See the data publishing documentation for more details.
If the latter: Data may be accessed via HTTP/OData, not ODBC/JDBC. It's a metered consumption model, so you need to subscribe to a particular data feed, which then gives you an access token. Check out this video from TechEd last year to see more about this, along with a .NET code sample. You can easily access data from any other language as well.
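To make the consumption model concrete: a Marketplace feed was queried like any OData service over HTTPS, with your account key sent as the Basic-auth password. A minimal Python sketch, with a hypothetical feed URL and placeholder key:

```python
import requests

# Hypothetical dataset URL; real feeds lived under api.datamarket.azure.com.
FEED_URL = "https://api.datamarket.azure.com/<publisher>/<dataset>/v1/<collection>"

response = requests.get(
    FEED_URL,
    params={"$top": "10", "$format": "json"},
    auth=("", "<account-key>"),  # account key as the Basic-auth password
)
response.raise_for_status()
for row in response.json()["d"]["results"]:
    print(row)
```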
If your goal is to access the data feed directly from SQL Server: I'm no expert in CLR stored procedures, but if CLR SPs support code that can access a web service endpoint, I guess you could write a CLR SP to access a data feed, pull the data down, and populate local tables. I have no idea if this is supported or advisable...
