How to configure DBeaver with Cassandra - cassandra

I am very new to DBeaver. From the forum I learned that the driver needs to be updated for Cassandra 3+ to work with DBeaver.
https://github.com/serge-rider/dbeaver/issues/167
I downloaded this new driver but am unable to set the class path. Can anyone help me with this? Below is my current configuration for the driver and Cassandra.

As I had some problems finding a Cassandra driver after installing DBeaver, I went a different way:
1. Cloned this GitHub project.
2. Built the jar file (mvn package).
3. Added the jar file to the Driver Manager configuration, using the JDBC Class Name and JDBC URL values described in the project's README.
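If you want to sanity-check the built jar outside DBeaver, a minimal JDBC smoke test in Scala is sketched below. The driver class and URL in it are placeholders, not values from this answer; substitute the exact JDBC Class Name and JDBC URL given in the project's README (Cassandra JDBC wrappers typically use a jdbc:cassandra://host:port/keyspace style URL).

import java.sql.DriverManager

// Minimal smoke test for a Cassandra JDBC wrapper jar.
// Compile and run with the built wrapper jar on the classpath.
object CassandraJdbcSmokeTest {
  def main(args: Array[String]): Unit = {
    // Placeholder: replace with the JDBC Class Name from the project's README.
    Class.forName("com.example.cassandra.jdbc.CassandraDriver")

    // Placeholder URL: replace with the JDBC URL format from the README.
    val conn = DriverManager.getConnection("jdbc:cassandra://localhost:9042/system")
    try {
      val rs = conn.createStatement().executeQuery("SELECT release_version FROM system.local")
      while (rs.next()) println(s"Connected, Cassandra version: ${rs.getString(1)}")
    } finally {
      conn.close()
    }
  }
}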

Download and use DBeaver Enterprise Edition (also free, though not open source), which includes a Cassandra driver by default.

Tested on DBeaver Community Edition 7.3.0.202011292131:
1. Go to https://downloads.datastax.com/#odbc-jdbc-drivers
2. Select Simba JDBC Driver for Apache Cassandra.
3. Open DBeaver.
4. Database > Driver Manager.
5. New.
6. Do not fill in Class Name manually; after Add File, click Find Class and select the class it finds.

Here is another solution: you can use CData's trial drivers; the way to configure them is described here:
https://www.cdata.com/kb/tech/cassandra-jdbc-dbvr.rst

Related

gp_dump utility of greenplum not available

Can anyone tell me why the gp_dump utility is not available with the Greenplum database by default? If I have to use it, where do I download it from and how do I enable it? I have gone through a lot of online resources but could not find anything relevant.
Are you using Greenplum v6.x?
The latest version of gpbackup is located here:
https://github.com/greenplum-db/gpbackup/releases
and if interested, the corresponding s3 plugin here:
https://github.com/greenplum-db/gpbackup-s3-plugin
gp_dump is a very old, deprecated backup utility for Greenplum.
The older Python-based gpcrondump/gpdbrestore utilities are still bundled with the Greenplum 4.3.x and 5.x versions, but they do not support Greenplum 6.x and have therefore been removed.
The newer Golang-based gpbackup/gprestore utilities support Greenplum 4.3.22 and later, 5.5 and later, and 6.0 and later, all in the same binary.
Let me know if you have additional questions.
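If you want to drive a backup from code rather than from the command line, here is a rough Scala sketch using sys.process. It assumes the gpbackup/gprestore binaries from the releases page above are installed and on PATH; the database name and timestamp are placeholders.

import scala.sys.process._

// Rough sketch: invoke gpbackup/gprestore as external commands.
object GreenplumBackupSketch {
  def main(args: Array[String]): Unit = {
    // Back up a single database; gpbackup reports the timestamp of the backup set it creates.
    val backupExit = Seq("gpbackup", "--dbname", "demo").!
    println(s"gpbackup exited with code $backupExit")

    // To restore, pass the timestamp reported by the backup run (placeholder shown):
    // Seq("gprestore", "--timestamp", "20240101120000").!
  }
}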
Try using pg_dump, since Greenplum is a fork of PostgreSQL.

SonarQube SQL Azure Database support broken with 5.5?

I had an installation of SonarQube 5.4 running on Ubuntu (latest) against a SQL Azure database.
Since the upgrade to 5.5 it's not working anymore.
For the upgrade I deleted the data/es directory and updated the new conf file with my settings as advised.
The collation was wrong at first so I decided to try with a new db of collation SQL_Latin1_General_CP1_CS_AS.
Now it's looking for database_firewall_rules (table?) but cannot find it.
Any idea how I can solve that? Any table I could create?
Thanks!
You're hitting a known bug in SonarQube 5.5: SONAR-7589. Until 5.5.1 or 5.6 is released, you can temporarily use SonarQube 5.6-RC1 (available here).

Unable to build Spark+cassandra using sbt-assembly

I am trying to build a simple project with Spark+Cassandra for a SQL-analytics demo.
I need to use Cassandra v2.0.14 (can't upgrade it for now). I am unable to find the correct versions of Spark and spark-cassandra-connector. I referred to DataStax's git project at https://github.com/datastax/spark-cassandra-connector, and I know that the Spark and spark-cassandra-connector versions need to match and be compatible with Cassandra. I would therefore appreciate someone pointing out the exact versions of Spark and spark-cassandra-connector to use. I tried v1.1.0 and v1.2.1 for both Spark and spark-cassandra-connector, but I am unable to build the spark-cassandra-connector fat jar with either the supplied sbt (it fails because the downloaded sbt-launch jar just contains a 404 Not Found HTML page) or my local sbt v0.13.8 (it fails with compilation errors on "import sbtassembly.Plugin._" and "import AssemblyKeys._").
The connector works with Cassandra 2.0 and 2.1, but some features may also work fine with 2.2 and 3.0 (not officially supported yet) using the older Java driver 2.1. This is because the C* Java driver supports a wide range of Cassandra versions. The newer driver works with older C* versions, and the older driver versions work with newer C* versions as well, excluding new C* features.
However, there is one minor caveat with using C* 2.0:
Since version 1.3.0, we dropped the thrift client from the connector. This move was to simplify the connectivity code and make it easier to debug: debugging one type of connection should be easier than two. It either connects or it doesn't, with no more surprises of the kind "it writes fine, but can't connect for reading". Unfortunately, not all of the thrift functionality was exposed by the native protocol in C* 2.0, nor by the system tables. Therefore, if you use C* prior to version 2.1.5, automatic split sizing won't work properly and you have to tell the connector the preferred number of splits. This is set in the ReadConf object passed at the creation of the RDD.
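For illustration, here is a minimal Scala sketch of telling the connector the preferred number of splits. It assumes connector 1.3+, where ReadConf exposes a splitCount option (check the ReadConf fields for your exact version); the host, keyspace, and table names are placeholders.

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._            // adds cassandraTable to SparkContext
import com.datastax.spark.connector.rdd.ReadConf

object SplitCountSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("split-count-sketch")
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(conf)

    // With C* older than 2.1.5, automatic split sizing is unavailable,
    // so set the preferred number of splits explicitly via ReadConf.
    val rdd = sc.cassandraTable("my_keyspace", "my_table")
      .withReadConf(ReadConf(splitCount = Some(64)))

    println(rdd.count())
    sc.stop()
  }
}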
As for the interface between the Connector and Spark, there is much less freedom. Spark APIs change quite often and you typically need a connector dedicated to the Spark version you use. See the version table in the README.
(fails because the downloaded sbt-launch jar just contains a 404 not found html)
This looks like an SBT problem, not a connector problem.
I just tried to do sbt clean assembly on v1.2.5, v1.3.0, and v1.4, and it worked fine.
If you can upgrade your version of Spark, then you can connect Spark with Cassandra. Put the following Maven dependencies in your pom file (an sbt equivalent is sketched after this list):
cassandra-all
cassandra-core
cassandra-mapping
cassandra-thrift
cassandra-client
spark-cassandra-connector
spark-cassandra-connector-java
This should work.
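Since the original question uses sbt rather than Maven, here is a rough build.sbt equivalent showing only the core coordinates. The version pairing below is just an example based on the versions tried in the question; check the connector's compatibility table in its README before pinning versions, and add the extra Cassandra artifacts from the list above if your build needs them.

// build.sbt -- example pairing only; verify versions against the
// spark-cassandra-connector compatibility table in its README.
name := "spark-cassandra-demo"

scalaVersion := "2.10.5"

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "1.2.1" % "provided",
  "org.apache.spark"   %% "spark-sql"                 % "1.2.1" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.2.1"
)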

Extension library and JDBC driver

For XPages Extension Library ver. 853 I'm using the "Extlib_8.5.3_DB2_Driver_updateSite201110201044.zip" JDBC driver.
But now I'm going to switch to Domino 9, so I'm going to use Extension Library ver. 901. Where can I download the appropriate DB2 JDBC driver for that version? Or should I still use the one that is for 853?
Thanks
The 9.0.1 Extension Library installs a tool that lets you build a plugin from any JDBC driver. You can download the desired DB2 JDBC driver from IBM and then build the plugin. The tool is accessed in Designer from the Tools menu: JDBC Driver Plug-in Wizard.

Source code for Datastax Community Edition

What is the URL to download the source code for DataStax Community Edition?
I've been plowing through planetcassandra.org, the DataStax website, and Google without success. I have the Cassandra source from the Apache site; I'm looking for the corresponding source for the DataStax Community release.
Thanks!
The community distribution contains the corresponding Apache Cassandra release, plus a free-but-not-open-source version of OpsCenter. So the Apache Cassandra source you already have is the corresponding source for the Cassandra part; OpsCenter's source is not published.
