populate_io_cache_on_flush is not a column defined in this metadata - cassandra

While connecting to Cassandra 1.2.1 using the DataStax Java driver version 1.0.2, I am getting the error:
Exception in thread "main" java.lang.IllegalArgumentException: populate_io_cache_on_flush is not a column defined in this metadata
at com.datastax.driver.core.ColumnDefinitions.getIdx(ColumnDefinitions.java:268)
at com.datastax.driver.core.Row.isNull(Row.java:84)
at com.datastax.driver.core.TableMetadata$Options.<init>(TableMetadata.java:440)
at com.datastax.driver.core.TableMetadata.build(TableMetadata.java:107)
at com.datastax.driver.core.Metadata.buildTableMetadata(Metadata.java:124)
at com.datastax.driver.core.Metadata.rebuildSchema(Metadata.java:88)
at com.datastax.driver.core.ControlConnection.refreshSchema(ControlConnection.java:265)
at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:220)
at the line below:
cluster = Cluster.builder().addContactPoint("localhost").build();
I tried deleting the folder /var/lib/cassandra and then restarting the Cassandra server, which means there is no previous data. The server starts without any error, but I still get the above error when I try to connect to it.

OK, just discovered that the error went away when I used the latest version of Cassandra (1.2.8). So it was most likely a version incompatibility: the driver reads the populate_io_cache_on_flush table option from the system schema tables, and a 1.2.1 node apparently does not expose that column yet.
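For reference, a minimal, self-contained version of the failing connection code (a sketch in Scala against the DataStax Java driver 1.x, assuming a Cassandra 1.2.8+ node on localhost; the schema metadata refresh that threw the exception happens inside build()):
import com.datastax.driver.core.Cluster

object ConnectTest {
  def main(args: Array[String]): Unit = {
    // build() opens the control connection and reads the schema tables,
    // which is where the IllegalArgumentException above was raised
    val cluster = Cluster.builder().addContactPoint("localhost").build()
    try {
      val session = cluster.connect()
      println("Connected to cluster: " + cluster.getMetadata.getClusterName)
    } finally {
      cluster.shutdown() // driver 1.x uses shutdown() rather than close()
    }
  }
}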

Related

Ambari upgrade: java.sql.SQLSyntaxErrorException: Unknown table 'hostcomponentstate' in information_schema

I'm trying to upgrade Ambari 2.1 to 2.5.
Steps followed:
Stopped ambari-server
Took a backup of the Ambari database from MySQL.
Ran ambari-server upgrade.
I'm getting this SQL error:
Exception in thread "main" org.apache.ambari.server.AmbariException: Unknown table 'hostcomponentstate' in information_schema
at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.executeUpgrade(SchemaUpgradeHelper.java:212)
at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.main(SchemaUpgradeHelper.java:427)
Caused by: java.sql.SQLSyntaxErrorException: Unknown table 'hostcomponentstate' in information_schema
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:536)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:513)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:115)
at com.mysql.cj.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:1983)
at com.mysql.cj.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:1936)
at com.mysql.cj.jdbc.StatementImpl.executeQuery(StatementImpl.java:1422)
at com.mysql.cj.jdbc.DatabaseMetaData$7.forEach(DatabaseMetaData.java:3182)
at com.mysql.cj.jdbc.DatabaseMetaData$7.forEach(DatabaseMetaData.java:3170)
at com.mysql.cj.jdbc.IterateBlock.doForAll(IterateBlock.java:50)
at com.mysql.cj.jdbc.DatabaseMetaData.getPrimaryKeys(DatabaseMetaData.java:3223)
at org.apache.ambari.server.orm.DBAccessorImpl.tableHasPrimaryKey(DBAccessorImpl.java:1082)
at org.apache.ambari.server.upgrade.UpgradeCatalog211.executeHostComponentStateDDLUpdates(UpgradeCatalog211.java:204)
at org.apache.ambari.server.upgrade.UpgradeCatalog211.executeDDLUpdates(UpgradeCatalog211.java:108)
at org.apache.ambari.server.upgrade.AbstractUpgradeCatalog.upgradeSchema(AbstractUpgradeCatalog.java:925)
at org.apache.ambari.server.upgrade.SchemaUpgradeHelper.executeUpgrade(SchemaUpgradeHelper.java:209)
... 1 more
Am I missing any step, or is a configuration change required?
You may try upgrading first to Ambari 2.2.x; I am not sure that such a long upgrade path (2.1 straight to 2.5) works flawlessly.
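To narrow down whether the table is really missing or the JDBC driver is at fault, you can reproduce the exact metadata call Ambari makes (DBAccessorImpl.tableHasPrimaryKey calls DatabaseMetaData.getPrimaryKeys, per the stack trace) outside of Ambari. A sketch, assuming MySQL Connector/J on the classpath; the URL, schema name and credentials are placeholders:
import java.sql.DriverManager

object CheckHostComponentState {
  def main(args: Array[String]): Unit = {
    // Placeholder URL and credentials: adjust to your Ambari database
    val url = "jdbc:mysql://localhost:3306/ambari?useSSL=false"
    val conn = DriverManager.getConnection(url, "ambari", "secret")
    try {
      // The same JDBC metadata call that fails in the stack trace above;
      // passing the connection's catalog avoids a null-catalog lookup
      val rs = conn.getMetaData.getPrimaryKeys(conn.getCatalog, null, "hostcomponentstate")
      while (rs.next()) {
        println("primary key column: " + rs.getString("COLUMN_NAME"))
      }
    } finally {
      conn.close()
    }
  }
}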

Liferay: GA4 on an empty MySQL 5.7 results in a fatal exception

I am trying to install Liferay GA4 and a master build for development purposes. However, I keep running into a fatal exception with MySQL 5.7.
As described at: https://issues.liferay.com/browse/LPS-73410
In an empty MySQL 5.7 database, when the server is brought up, the following exception is raised (seen with both drivers, com.mysql.jdbc.Driver and com.mysql.cj.jdbc.Driver):
liferay | 21:45:35,927 ERROR [localhost-startStop-1][MainServlet:275] com.liferay.portal.kernel.events.ActionException: com.liferay.portal.verify.VerifyException: com.liferay.portal.verify.VerifyException: java.sql.SQLSyntaxErrorException: Table 'XXXXX.EVENTS' doesn't exist
liferay | com.liferay.portal.kernel.events.ActionException: com.liferay.portal.verify.VerifyException: com.liferay.portal.verify.VerifyException: java.sql.SQLSyntaxErrorException: Table 'XXXXX.EVENTS' doesn't exist
I was wondering if this is something I can get around with some procedure applied directly to the database... Any thoughts?
I found my way out of this issue with new JDBC defaults:
jdbc.default.driverClassName=com.mysql.cj.jdbc.Driver
jdbc.default.url=jdbc:mysql://${database.host}/${database.schema}?useUnicode=true&characterEncoding=UTF-8&useFastDateParsing=false&useSSL=false&nullNamePatternMatchesAll=true&nullCatalogMeansCurrent=true
From: https://www.e-systems.tech/web/guest/blog/-/blogs/liferay-with-mysql-5-7-driver-changes
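Before pointing Liferay at the database, it can help to sanity-check that a connection built with these URL flags can actually see the schema's tables. A sketch with placeholder host, schema and credentials (with Connector/J 8.x, nullCatalogMeansCurrent=true makes metadata lookups that pass a null catalog resolve against the current schema, restoring the 5.1 driver's default behavior):
import java.sql.DriverManager

object LiferayJdbcCheck {
  def main(args: Array[String]): Unit = {
    // Placeholder host/schema/credentials: match your portal-ext.properties
    val url = "jdbc:mysql://localhost/lportal?useUnicode=true&characterEncoding=UTF-8" +
      "&useSSL=false&nullNamePatternMatchesAll=true&nullCatalogMeansCurrent=true"
    val conn = DriverManager.getConnection(url, "liferay", "secret")
    try {
      // List the tables the driver can see through DatabaseMetaData
      val rs = conn.getMetaData.getTables(null, null, "%", null)
      while (rs.next()) println(rs.getString("TABLE_NAME"))
    } finally {
      conn.close()
    }
  }
}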

Insert data into Microsoft SQL Server using Spark

I am trying to insert data into SQL Server from Spark using the JDBC methods below.
Option 1:
prop.put("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
dataf.write.mode(org.apache.spark.sql.SaveMode.Append).jdbc(url, table_name, prop)
The table is already created; I am appending new data. The job errored with the exception below:
Exception in thread "main"
com.microsoft.sqlserver.jdbc.SQLServerException: CREATE TABLE
permission denied in database
Question: why is CREATE TABLE permission required just to append data?
Option 2:
prop.put("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils.saveTable(dataf, url, table_name, prop)
The above command works from spark-shell, but when the same code is used in a Scala program and packaged with its dependencies, it gives the exception below:
Exception in thread "main" java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(DriverManager.java:315)
I tried setting the driver classpath and the executor classpath, and also --jars; still no luck. I included sqljdbc4.jar in the driver classpath and in --jars, and copied sqljdbc4.jar to all worker nodes as well; still no luck.
Any ideas on this?
After a lot of searching and testing, I found the answer; it might be useful for someone.
Option 1: This is because of a bug in Spark 1.5.x that was resolved in 1.6.x and later. Because of the bug, Spark always tries to create a new table, which is why CREATE TABLE permission is needed even for an append.
Option 2: This happens because the driver name found on the classpath is given priority over the driver property we pass as an argument. The workaround is to create the connection first and then invoke saveTable. A workaround if you are using Spark 1.5.x or lower:
import org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils
val conn = JdbcUtils.createConnection(url, prop) // loads the JDBC driver named in prop
JdbcUtils.saveTable(dataf, url, table_name, prop)
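For what it's worth, on Spark 1.6.x and later the plain append path from Option 1 works without any workaround. A sketch (dataf and table_name are the names from the question; the URL and credentials are placeholders):
import java.util.Properties
import org.apache.spark.sql.SaveMode

val prop = new Properties()
prop.put("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
prop.put("user", "spark_user") // placeholder credentials
prop.put("password", "secret")
val url = "jdbc:sqlserver://dbhost:1433;databaseName=mydb" // placeholder URL
// Appends to the existing table instead of attempting CREATE TABLE first
dataf.write.mode(SaveMode.Append).jdbc(url, table_name, prop)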

DataStax DevCenter fails to connect to the remote Cassandra DB

I've installed DataStax Cassandra and it is up and running on my remote machine. Now I am trying to connect via DataStax DevCenter, but it fails.
Before posting this question I read an identical one here: DataStax Devcenter fails to connect to the remote cassandra database
I went to the cassandra.yaml config file, but the start_native_transport: true option is not in my file. Where should I look for it?
I've also changed rpc_address to 0.0.0.0.
UPDATE:
If I add start_native_transport: true to my cassandra.yaml, Cassandra just crashes on restart. Please see the log below:
ERROR 17:48:32,626 Fatal configuration error error
Can't construct a java object for tag:yaml.org,2002:org.apache.cassandra.config.Config; exception=Cannot create property=start_native_transport for JavaBean=org.apache.cassandra.config.Config#ef28a30; Unable to find property 'start_native_transport' on class: org.apache.cassandra.config.Config
in "<reader>", line 10, column 1:
cluster_name: 'Test Cluster'
^
at org.yaml.snakeyaml.constructor.Constructor$ConstructYamlObject.construct(Constructor.java:372)
at org.yaml.snakeyaml.constructor.BaseConstructor.constructObject(BaseConstructor.java:177)
at org.yaml.snakeyaml.constructor.BaseConstructor.constructDocument(BaseConstructor.java:136)
at org.yaml.snakeyaml.constructor.BaseConstructor.getSingleData(BaseConstructor.java:122)
at org.yaml.snakeyaml.Loader.load(Loader.java:52)
at org.yaml.snakeyaml.Yaml.load(Yaml.java:166)
at org.apache.cassandra.config.DatabaseDescriptor.loadYaml(DatabaseDescriptor.java:141)
at org.apache.cassandra.config.DatabaseDescriptor.<clinit>(DatabaseDescriptor.java:116)
at org.apache.cassandra.service.AbstractCassandraDaemon.setup(AbstractCassandraDaemon.java:124)
at org.apache.cassandra.service.AbstractCassandraDaemon.activate(AbstractCassandraDaemon.java:389)
at org.apache.cassandra.thrift.CassandraDaemon.main(CassandraDaemon.java:107)
Caused by: org.yaml.snakeyaml.error.YAMLException: Cannot create property=start_native_transport for JavaBean=org.apache.cassandra.config.Config#ef28a30; Unable to find property 'start_native_transport' on class: org.apache.cassandra.config.Config
at org.yaml.snakeyaml.constructor.Constructor$ConstructMapping.constructJavaBean2ndStep(Constructor.java:305)
at org.yaml.snakeyaml.constructor.Constructor$ConstructMapping.construct(Constructor.java:184)
at org.yaml.snakeyaml.constructor.Constructor$ConstructYamlObject.construct(Constructor.java:370)
... 10 more
Caused by: org.yaml.snakeyaml.error.YAMLException: Unable to find property 'start_native_transport' on class: org.apache.cassandra.config.Config
at org.yaml.snakeyaml.constructor.Constructor$ConstructMapping.getProperty(Constructor.java:342)
at org.yaml.snakeyaml.constructor.Constructor$ConstructMapping.constructJavaBean2ndStep(Constructor.java:240)
... 12 more
null; Can't construct a java object for tag:yaml.org,2002:org.apache.cassandra.config.Config; exception=Cannot create property=start_native_transport for JavaBean=org.apache.cassandra.config.Config#ef28a30; Unable to find property 'start_native_transport' on class: org.apache.cassandra.config.Config
Invalid yaml; unable to start server. See log for stacktrace.
Thanks for any help!
start_native_transport: true
should be present in cassandra.yaml; if it's not there, add it and try again after restarting the Cassandra server.
What version of Cassandra are you using? DevCenter supports Cassandra versions >= 1.2. Note that the "Unable to find property 'start_native_transport'" error in your log suggests a pre-1.2 release: the native protocol, and this option with it, was only introduced in Cassandra 1.2, so an older Config class does not know the property.
If you still see errors after the change in cassandra.yaml, you can post a link to a Gist. But the YAML format is pretty simple, so I think you'll figure it out.
If you read my previous answer you'll notice that it required rpc_address to be set to a value other than 0.0.0.0. Anyway, the latest version of DevCenter (1.1.1) will work even if all the nodes in your cluster have rpc_address set to 0.0.0.0 (as a side note, I don't think that's generally a good setting).
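Before changing the configuration further, it may also be worth ruling out plain network problems: a quick reachability check from the DevCenter machine against the native transport port (9042 by default) tells you whether anything is listening at all. A sketch, with a placeholder host name:
import java.net.{InetSocketAddress, Socket}

object PortCheck {
  def main(args: Array[String]): Unit = {
    val socket = new Socket()
    try {
      // 9042 is the default native_transport_port in cassandra.yaml
      socket.connect(new InetSocketAddress("your.remote.host", 9042), 3000)
      println("native transport port is reachable")
    } finally {
      socket.close()
    }
  }
}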
DevCenter.ini does not contain Java VM information by default. Adding the VM lines below helped resolve the connection issue:
-vm
C:\Program Files (x86)\JDK64\1.8.0.74\jre\bin\java.exe
NOTE: the path above should point to the java.exe of the appropriate JRE installed on your machine.

cassandra sstable-loader error: "Got an unknow host from describe_ring()"

I am trying to load SSTables into a two-node Cassandra cluster with the sstable-loader utility provided in Cassandra 0.8.4.
1) I loaded the data successfully in a single-node environment.
2) After creating the two-node cluster, the loader throws an exception after gossip:
java.lang.RuntimeException: Got an unknow host from describe_ring()
This is a bug in 0.8.4 (https://issues.apache.org/jira/browse/CASSANDRA-3044). It's fixed in 0.8.5; you can test that by following the link on the release thread here.
