Unable to pull non-US BigQuery datasets via Simba ODBC Excel driver for Mac

I have installed the ODBC driver manager and the Simba ODBC driver for BigQuery (v2.4.1.1009 from the Google Cloud docs) onto my Mac, and successfully set up authorisation and the .ini configuration file to the point that I am able to see all my tables in the Get Data from Database view in Excel. However, when I try to query any dataset that isn't in the US, I get the following error:
"[Simba][BigQuery] (31750) Dataset is not found. Not found: Dataset
project-name:dataset-name was not found in location US.
All my datasets are in the EU, but I created a test dataset in the US in the same project and was able to successfully query a test table there using the connection, so I'm confident that it's not an authorisation issue.
Having looked thoroughly through the documentation for the driver, I couldn't find any setting to add to the .ini file to specify a processing location or region, nor did I find a way to explicitly specify a dataset location in BigQuery SQL. The documentation does mention a "QueryProperties" setting, which apparently supports the same properties as the connectionProperties field of the JobConfigurationQuery in the BigQuery API's "Job" resource; however, it is listed as only supporting the time_zone property.
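For comparison, the BigQuery API itself does accept an explicit job location; a rough sketch of the kind of query that works outside ODBC, using the official Python client with placeholder project/dataset/table names, would be:

from google.cloud import bigquery

# Placeholder names; relies on Application Default Credentials being set up.
client = bigquery.Client(project="project-name")

sql = "SELECT COUNT(*) AS n FROM `project-name.dataset_name.table_name`"
job = client.query(sql, location="EU")  # the job location is passed explicitly here

for row in job.result():
    print(row.n)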
I've been able to access these datasets regularly before via the equivalent ODBC driver for Windows, also from Simba, but I'm not sure if this is a limitation specific to the Mac version, or if I'm missing a setting somewhere. It was a while ago, but I'm pretty sure I didn't have to specify a region at any point when using the Windows version.
Any help appreciated!

Related

tCosmosDBConnection component is not showing in Talend Open Studio for Data Integration

I'm using Talend Open Studio for Data Integration version 7.1.1. I need to connect to an Azure database to extract data and upload data from Talend to our SQL databases. I have gone through the link below to connect to Azure Cosmos DB.
https://help.talend.com/r/OgamG5JTIU2aMhx2HjGp8g/L1KG9WTDgOCS8RO9RzqIQw
But in my Talend DB connection options I can't find any CosmosDBConnection type or component. I even tried searching the packages but didn't find it. Could someone tell me how to install or enable tCosmosDBConnection, or how to connect to a Cosmos database using Talend?
As per Talend documentation -
tCosmosDBConnection
Creates a connection to a CosmosDB database and reuse that connection in other components.
tCosmosDBConnection Standard properties
These properties are used to configure tCosmosDBConnection running in the Standard Job framework.
The Standard tCosmosDBConnection component belongs to the Cloud and the Databases families.
The component in this framework is available in all Talend products with Big Data.
As you are using Talend Open Studio for Data Integration 7.1.1, which is not one of the Big Data products, you are not able to see these specific components in your palette.
Also, you might want to look at Project Settings -> Designer -> Palette Settings and check which components are available and selected for your studio.

Manual Azure Backup Cosmos DB

I tried to export data from Cosmos DB but was not successful. According to https://learn.microsoft.com/en-us/azure/cosmos-db/storage-explorer, I should be able to export the data inside Cosmos DB using this tool, but there is no export option. I also tried to follow the instructions at https://azure.microsoft.com/en-us/updates/documentdb-data-migration-tool/ and https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#JSON, but I keep running into errors.
Can you help me do this in Data Factory, or give me any steps to manually back up Cosmos DB?
I tried doing the backup through Azure Data Factory, but Data Factory can't seem to connect to Cosmos DB, which is odd because the primary/secondary connection string I used came from the Cosmos DB account details.
Thank you.
Can you help me do this in Data Factory
According to your description, it seems you are having trouble exporting data, not importing it. You could use the Copy activity in ADF, which supports a Cosmos DB connector. For your needs, Cosmos DB is the source dataset, and you add one more sink dataset as the destination, such as JSON files in Blob storage. Just make sure you configure the right authentication information for your Cosmos DB account.
ADF is better suited to batch or daily backups.
or any steps to manually back up Cosmos DB
Yes, Storage Explorer is not for exporting data from Cosmos DB; the Data Migration Tool is the suitable option. Please install the tool and refer to the details at this link: https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#export-to-json-file
The DMT is better suited to a single backup, although it also supports batch execution if you run it from the command line.
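If a scripted one-off export would also work for you, a minimal sketch with the azure-cosmos Python SDK (SQL API; the endpoint, key, and database/container names below are placeholders) looks like this:

import json
from azure.cosmos import CosmosClient

# Placeholders: use your account URI, primary key, and database/container names.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<primary-key>")
container = client.get_database_client("<database>").get_container_client("<container>")

# Read every document and dump the lot to a local JSON file as a one-off backup.
docs = list(container.read_all_items())
with open("cosmos-backup.json", "w") as f:
    json.dump(docs, f, default=str)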
Cosmos DB Data Migration tool can be used to export data from Cosmos DB.
Refer to https://learn.microsoft.com/en-us/azure/cosmos-db/import-data#export-to-json-file
This one worked for me. Since SSL on my MacBook did not work, I ran these steps from an Azure VM that I created.
Steps:
Download the MongoDB Community Server client tools for your OS and a MongoDB-compatible version.
(Or you can download v3.2.22 for Windows x64 directly; please don't download a version beyond 4.2, as it's incompatible.)
After installing the MongoDB client tools, go to the installation directory, open the "bin" subfolder containing mongoexport.exe, then issue the command below to export your data:
mongoexport --host=<host>:<port> -u=<username> -p=<password> --db=<database> --collection=<collection> --ssl --sslAllowInvalidCertificates --out=<output-file.json>
Note 1: You can find the <host>, <username>, and <password> in the Cosmos DB portal under "Connection String".
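If the mongoexport version matching keeps fighting you, the same export can also be scripted with pymongo (the connection string comes from the same portal blade; database and collection names are placeholders):

from pymongo import MongoClient
from bson.json_util import dumps

# Placeholders: paste the primary connection string from the portal and your names.
client = MongoClient("<primary-connection-string>")
collection = client["<database>"]["<collection>"]

# Write every document to a JSON file; json_util handles BSON types like ObjectId.
with open("export.json", "w") as f:
    f.write(dumps(list(collection.find())))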

Cognos Analytics and Cognos Insight

all,
I just tried the latest Cognos Analytics 11 trial version. It seems to connect to the cloud directly, but when I try to connect to a MySQL database on a remote Linux server (I go to Manage -> Data servers -> New, then fill in the server, port, ...) from Windows 8.1, it always raises the following errors:
XQE-JDB-0004 A problem occurred finding the driver class "com.mysql.jdbc.Driver".
It seems JDBC driver has not been installed or configured in the server
My questions are:
For the latest Cognos Analytics 11 trial version in the cloud, where do I configure the server or install the JDBC driver? Or do we need to install Cognos Express server first?
For Cognos Analytics 11, besides the cloud version, can we download the usual desktop version? When I click to access the trial, it seems to connect directly to the cloud version, and I could not find where to download a desktop version of Cognos Analytics.
For another Cognos product, Cognos Insight, the trial version can only import CSV files and does NOT support MySQL databases. Is that right?
Thanks in advance
JDBC drivers need to be added to the <Cognos root>/drivers folder (as of Cognos 11), so there would be no way for you to add the necessary driver to their cloud installation. I am assuming they have only chosen to support a subset of data sources for the cloud trial, but I am not aware of a list of which ones they are allowing/supporting.
I have not heard of an on-premises Cognos Analytics trial, at least one that you are able to get publicly. It is certainly possible that IBM's sales folks would make that happen if it was a potential sales driver for them, but that is conjecture only.
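For an on-premises install, the mechanics of the first point are simply dropping the MySQL Connector/J jar into that drivers folder and restarting the service; a trivial sketch (the paths and jar version are illustrative only):

import shutil

# Illustrative paths: adjust to your Cognos root and the Connector/J version you downloaded.
shutil.copy(
    r"C:\Downloads\mysql-connector-java-5.1.49.jar",
    r"C:\Program Files\ibm\cognos\analytics\drivers",
)
# Restart the Cognos Analytics service afterwards so the query service picks up the driver.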
Cognos Insight is capable of several things, one of which is being able to analyze CSV data brought in locally. More specifically, Cognos Insight supports getting data from the following:
CSV files
Microsoft Excel spreadsheets
ODBC data sources
IBM Cognos BI Reports
IBM Cognos TM1 Cube Views
IBM Cognos TM1 Dimension Subsets
Reference: http://www.ibm.com/developerworks/data/library/cognos/infrastructure/cognos_specific/page627.html
I had the same problem, but with an Oracle database, so I'm not sure if this helps for MySQL, but you could try the following steps:
Install the database Drivers (32 & 64 Bit) on the Cognos Server.
Open the folder cognos_install/v5dataserver/ and rename the file databaseDriverLocations.properties.sample to databaseDriverLocations.properties.
Open this file using a text editor and update the databaseJNIPATH to point to your database drivers.
In my case I configured the following value:
databaseJNIPath=C:\Oracle\product\12.1.0\client_64\bin;C:\Oracle\product\12.1.0\client_32\bin;
See here as well: http://www-01.ibm.com/support/docview.wss?uid=swg21574953

Query excel based database from SQLDeveloper

I am on a Windows 7 machine and have configured an ODBC connection named 'MyExcelDb' to an Excel file. I am able to connect to 'MyExcelDb' programmatically using the type-1 JDBC driver, and everything works fine.
I now want to use Oracle SQL Developer to query the DSN 'MyExcelDb', but I am not able to make this connection, as I cannot find any option for an ODBC connection. I have tried the 'advanced' option in the Oracle section and putting in a custom JDBC URL of 'jdbc:odbc:excelDB', but to no avail.
I have tried adding entries in SQL Developer for third-party drivers like:
- sourceforge.net/projects/xlsql/
- code.google.com/p/sqlsheet/
- hxtt.com/excel.html
Despite this, no new connection option appears, and I am still struggling to make a connection between the two.
I have searched around and found that I could install Oracle and add a TNS entry for the Excel file, and that way I might be able to connect using the Oracle TNS entry in SQL Developer. But installing an Oracle database would be like killing a mosquito with a cannon.
I wonder if there is a simple solution to my problem.
Have you tried following Oracle's guidelines for connecting to Excel files? Did you set up a system DSN as described in step three of the Oracle guide? Importing files into the database is routine and simple, but using SQL Developer in the manner you describe, without the client installed, is another matter. Here is an example of reading an Excel file via PL/SQL.
My question is why use SQL Developer to manipulate a datastore in Excel when Excel is designed to manipulate the data?
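If the goal is simply to run SQL against the workbook, the existing DSN can also be queried directly from a script, without SQL Developer at all; a minimal sketch with Python and pyodbc (the sheet name is a placeholder) would be:

import pyodbc

# Connect through the system DSN named in the question.
conn = pyodbc.connect("DSN=MyExcelDb")
cur = conn.cursor()

# The Excel ODBC driver exposes each worksheet as a "table" named <SheetName>$.
for row in cur.execute("SELECT * FROM [Sheet1$]"):
    print(row)

conn.close()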

LocalDB Export to Excel

I have collected a bunch of data using my locally developed website. Now I need to analyze the data, but it seems like I cannot locate the .mdf file for the LocalDB database my website uses.
Looking at the data connection, it says myusername\localdb#abunchofnonsense.mydomainname.Models.UserDBContext.dbo, and the connection string is data source=(localdb)\v11.0; initial catalog=mydomain.models.userdbcontext; integrated security=true.
Also, I'm using ASP.NET MVC, Visual Studio 2013, and Entity Framework if it helps.
It's probably quite confusing what I'm trying to do here: I collected some data and need to run logistic regression on it. The question is, how can I connect Excel to this LocalDB database so I can export the data I have collected?
You can import your LocalDB tables and data directly into Excel via Get External Data > From Other Sources under the Data tab in Microsoft Excel (2013, to be precise).
A bit late, but perhaps someone can use the answer as I found this thread when running into a similar problem:
The problem is that LocalDB uses a different provider than a "normal" SQL connection. You need to use the "SQL Server Native Client" (in your case, version 11) to connect. The provider should be installed with LocalDB; if not, you can find it here.
To use in Excel, just choose
"From Other Sources"
- "From Data Connection Wizard"
- "Other Advanced"
- Choose your provider
- Enter the rest of your connection details
For existing connections (not tested, but I see no reason why it shouldn't work), you can edit the connection string and add or change "Provider=SQLNCLI11.1" (adjusting for your installed version).
Source: I had the same problem in InstallShield, where you have to change the provider manually, so I just tried the same thing in Excel.
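If the wizard route turns out to be more trouble than it's worth, a small script can also pull the data out of LocalDB and write an .xlsx file directly; a rough sketch with pyodbc and pandas (the server and catalog come from the connection string in the question, the table name is a placeholder, and it assumes the Native Client 11.0 ODBC driver is installed):

import pandas as pd
import pyodbc

# Connection details mirror the connection string quoted in the question.
conn = pyodbc.connect(
    "DRIVER={SQL Server Native Client 11.0};"
    "SERVER=(localdb)\\v11.0;"
    "DATABASE=mydomain.models.userdbcontext;"
    "Trusted_Connection=yes;"
)

# Placeholder table name; writing .xlsx requires openpyxl to be installed.
df = pd.read_sql("SELECT * FROM dbo.SomeTable", conn)
df.to_excel("localdb_export.xlsx", index=False)
conn.close()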
