Why do LibreOffice Base, Microsoft Excel and Tableau require an unnecessary permission to query table data from BigQuery?

I have a BigQuery instance and I have shared a view with a service account. This service account has the "bigquery.user" role. I set up the Simba ODBC driver on my Ubuntu machine, installed LibreOffice Base on it, and modified the odbc.ini file to use the above service account. I'm able to connect to BigQuery, but when I try to query the shared view, it throws an error saying "user does not have bigquery.tables.create permission for table ...". It looks like LibreOffice Base is trying to create some temp tables. I tried with MS Excel and the same error is thrown.
My questions:
Isn't the "bigquery.user" role enough to query data from shared datasets/tables?
Why does LibreOffice Base require such extra permissions?
What I tried:
I shared the data with a user account (someuser@gmail.com) and gave it the same role, i.e. bigquery.user. I was able to query data successfully from this account.
I also tried Tableau. Tableau has native support for BigQuery and also supports ODBC connections (to connect to BigQuery, MySQL, etc.). I tried both, i.e. connecting to BigQuery using Tableau's native BigQuery support and using an ODBC connection. It worked with native BigQuery but not with the ODBC connection (maybe it has the same issue as LibreOffice Base and MS Excel).
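For reference, a minimal odbc.ini sketch for this kind of setup, using the Simba ODBC Driver for Google BigQuery (key names are from Simba's documented options; the project, e-mail, and key-file values are placeholders, and the driver path may differ by version). The AllowLargeResults option is worth checking here: when it is enabled, the driver stages query results in a temporary table, which is the operation that needs bigquery.tables.create:

    [BigQuery]
    # Driver path from the Simba install; adjust for your version
    Driver=/opt/simba/googlebigqueryodbc/lib/64/libgooglebigqueryodbc_sb64.so
    # Placeholder project ID
    Catalog=my-gcp-project
    # 0 = service-account authentication
    OAuthMechanism=0
    Email=svc-account@my-gcp-project.iam.gserviceaccount.com
    KeyFilePath=/etc/bq/svc-account.json
    # When large results are allowed, the driver materializes results into a
    # temp table, which is what requires bigquery.tables.create
    AllowLargeResults=0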

Related

ODBC DataSource - How to keep data on the Azure cloud to source Power BI?

I have an ODBC datasource and I need to use it with Power BI. It works fine locally.
In order to refresh the data and keep users up to date, I need to send this data to the cloud (Azure Blob Storage) so that Power BI can connect directly to Blob Storage and consume the data.
As discussed here stackoverflow-topic, we need an on-premises server to have this ODBC running.
It sounds quite confusing to me: we need the data in the cloud, using ODBC as the datasource and "publishing" the data with Blob Storage, and for that we need a local server.
I'm a beginner with Azure Cloud but... isn't there any other way to configure this process without a local server or an expensive Virtual Machine on Azure? It looks like we are running in circles here.
Thank you!
Your options:
Install Power BI Gateway (standard mode) on a server to host the ODBC driver
Install Power BI Gateway (personal mode) on your PC to host the ODBC driver
Copy the data to Azure (Blob Storage or Azure SQL Database, etc), and configure the Power BI Model to load the data from there instead of ODBC
Manually refresh your data in Power BI Desktop and re-publish it to Power BI whenever the data needs to be updated.
For Option 3 you can use any tool you want to copy the data to Azure. Azure Data Factory is one option, and for that you would need to install the Self-Hosted Integration Runtime to host the ODBC driver. But you can also extract the data to files and copy them to Azure Storage with AzCopy.
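A minimal sketch of that AzCopy step, assuming AzCopy v10; the local folder, storage account, container, and SAS token are placeholders:

    # Upload extracted files to Blob Storage
    # Placeholders: C:\exports, mystorageacct, powerbi-data, <SAS-token>
    azcopy copy "C:\exports\*.csv" "https://mystorageacct.blob.core.windows.net/powerbi-data?<SAS-token>"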

Connect Snowflake to Azure Analysis Services to build a cube

I need to build a cube on Azure Analysis Services by connecting to a Snowflake DB.
It seems Azure Analysis Services does not provide a connector to Snowflake. Can anyone suggest how to overcome this?
First, on your laptop install both the 32-bit and 64-bit ODBC drivers for Snowflake. Then open "ODBC Data Sources (32-bit)" and create a new system DSN called "Snowflake" using the Snowflake ODBC driver. Repeat in the "ODBC Data Sources (64-bit)" app, creating another system DSN named identically to the 32-bit one. Make sure you set Tracing=0 in both the 32-bit and 64-bit ODBC connection dialog properties, since setting tracing=6 kills cube processing performance.
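For reference, the DSN settings typically end up looking like the following (a sketch using documented Snowflake ODBC parameters, shown here as key=value pairs; the account locator, database, warehouse, and user are placeholders):

    [Snowflake]
    Driver=SnowflakeDSIIDriver
    # Placeholder account locator and region
    Server=xy12345.east-us-2.azure.snowflakecomputing.com
    Database=ANALYTICS
    Warehouse=COMPUTE_WH
    UID=SVC_AAS
    # Keep tracing at 0; tracing=6 cripples cube processing
    Tracing=0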
Next, on an appropriate VM (preferably an Azure VM in the same Azure region as Snowflake) ensure the On-premises Data Gateway is set up for Azure Analysis Services. (Though Snowflake is not on-premises, it's not a supported cloud data source, so it must use the gateway.) On that VM, repeat the above ODBC steps.
In Visual Studio, choose File... New... Project... Analysis Services... Tabular... Analysis Services Tabular Project. Choose compatibility mode "SQL Server 2017/Azure Analysis Services (1400)" and choose "Integrated workspace".
Then in Tabular Model Explorer right click the Data Sources folder and choose "Add Data Source". Choose ODBC as the data source and then choose your DSN name from the dropdown.
Choose which tables you wish to import. Once the model is ready to deploy, deploy to Azure Analysis Services and it should use the ODBC driver on the gateway VM to connect to Snowflake.
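In a 1400-compatibility model the data source and each table end up as Power Query (M) expressions. A minimal sketch of what an imported table's expression looks like, assuming the "Snowflake" DSN name from above; the database, schema, and table names are placeholders:

    let
        // "Snowflake" is the system DSN created earlier
        Source = Odbc.DataSource("dsn=Snowflake", [HierarchicalNavigation = true]),
        // MY_DB / PUBLIC / MY_TABLE are placeholder object names
        MY_DB = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
        PUBLIC = MY_DB{[Name = "PUBLIC", Kind = "Schema"]}[Data],
        MY_TABLE = PUBLIC{[Name = "MY_TABLE", Kind = "Table"]}[Data]
    in
        MY_TABLE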

Getting an error when trying to connect to on-prem IBM DB2 from Azure using the Microsoft Integration Runtime

We are trying to get data from an on-premises IBM DB2 from Azure Data Factory using the Microsoft Integration Runtime. We are able to connect to the database, and we are able to get the list of tables in the ADF Dataset, but when we try to execute the query we get the below error. We are not able to identify the issue. Please help me with this.
ROUTINE SQLSTATISTICS (SPECIFIC NAME SQLSTATISTICS) HAS RETURNED AN ERROR SQLSTATE WITH DIAGNOSTIC TEXT -805 DSN.DSNASPCC.DSNASTAU.0E5F1F1D09F1404 SQLSTATE=38112 SQLCODE=-443
I suggest you follow this technote to bind db2schema.bnd to the target Db2 database, after ensuring your Db2 client has the same version/fixpack as the Db2 server.
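A sketch of that bind step from a Db2 command-line client; the database alias and user are placeholders, and the path to db2schema.bnd varies by client install:

    # Placeholders: MYDB (database alias), DB2ADMIN (user)
    db2 connect to MYDB user DB2ADMIN
    # db2schema.bnd ships in the client's sqllib/bnd directory
    cd /home/db2inst1/sqllib/bnd
    db2 bind db2schema.bnd blocking all grant public sqlerror continue
    db2 terminate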

Connecting to Azure SQL Server from Access linked tables using AAD

Is it possible to connect to an Azure SQL Server database from Access using the 'SQL Server Native Client' driver and Azure Active Directory authentication?
I can connect using the ODBC Driver for SQL Server, but in Access I have a problem with date and datetime2 columns (they are seen by Access as text fields instead of datetime).
This is a known issue with older SQL Server ODBC drivers.
Use a more recent ODBC driver. An overview of versions can be found on this MS Docs page.
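With a current driver, an AAD connection string looks roughly like this (a sketch assuming ODBC Driver 17 for SQL Server or later, which as far as I know supports the Authentication keyword from version 17.2 on; the angle-bracketed server and database names are placeholders):

    Driver={ODBC Driver 17 for SQL Server};
    Server=tcp:<your-server>.database.windows.net,1433;
    Database=<your-database>;
    Authentication=ActiveDirectoryInteractive;
    Encrypt=yes;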

Connecting from TIBCO Spotfire to SAS datasets on Windows

How do I create a connection from Spotfire to SAS datasets?
Good morning su919. Your question doesn't provide enough detail, but the following link will help you out:
http://support.spotfire.com/sr_spotfire65.asp
Select your version at the top to read the requirements for Spotfire. In particular, for SAS...
SAS Providers for OLE DB 9.22 or higher
It is possible to import SAS data files (*.sas7bdat, *.sd2, *.sd7) into TIBCO Spotfire directly. The requirement for this functionality is that the SAS Providers for OLE DB 9.22 or higher must first be installed on the client machine.
Click here to download SAS Providers for OLE DB (free registration on SAS website required).
NOTE: SAS Providers for OLE DB are not supported for use in Spotfire Web Player (the SAS driver is not thread-safe, which can cause general platform instabilities).
