I am trying to run a Cassandra instance through Eclipse on Windows.
In the Eclipse run configuration I have specified the org.apache.cassandra.service.CassandraDaemon class, and I have set the following VM arguments:
-Dcassandra.config=file:C:\nw_servers\apache-cassandra-2.1.3\conf\cassandra.yaml
-Dcassandra-foreground
-ea -Xmx1G
-Dlog4j.configuration=file:C:\nw_servers\apache-cassandra-2.1.3\conf\log4j-server.properties
When I run the CassandraDaemon class in Eclipse I get the error below:
WARN 02:23:07 JNA link failure, one or more native method will be unavailable.
WARN 02:23:07 JNA link failure, one or more native method will be unavailable.
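For context, that run configuration is roughly equivalent to the small Java sketch below, which just sets the same properties as the -D VM arguments and calls the daemon's main method (the launcher class name is made up, and the Cassandra 2.1 jars plus their dependencies still need to be on the Eclipse classpath):

import org.apache.cassandra.service.CassandraDaemon;

// Hypothetical launcher: mirrors the Eclipse run configuration above.
public class EclipseCassandraLauncher {
    public static void main(String[] args) {
        // Same values as the -Dcassandra.config and -Dlog4j.configuration arguments.
        System.setProperty("cassandra.config",
                "file:C:\\nw_servers\\apache-cassandra-2.1.3\\conf\\cassandra.yaml");
        System.setProperty("log4j.configuration",
                "file:C:\\nw_servers\\apache-cassandra-2.1.3\\conf\\log4j-server.properties");
        // -Dcassandra-foreground only needs to be present; any value will do.
        System.setProperty("cassandra-foreground", "true");
        // Start Cassandra exactly as the run configuration does.
        CassandraDaemon.main(new String[0]);
    }
}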
Related
I am learning Spark and trying to run a simple Spark app that outputs 5 DataFrame rows without installing Spark. I know that one can run a Spark app without installing Spark. The app throws the error below and I could not resolve the issue. I have tried a solution posted for a similar Snowflake problem, but in vain. Below is the error log for your reference.
I am using the following software:
Eclipse IDE for Enterprise Java and Web Developers (includes Incubating components)
Version: 2022-09 (4.25.0)
Build id: 20220908-1902
java -version
openjdk version "11.0.12" 2021-07-20
OpenJDK Runtime Environment Microsoft-25199 (build 11.0.12+7)
OpenJDK 64-Bit Server VM Microsoft-25199 (build 11.0.12+7, mixed mode)
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
at org.apache.spark.internal.config.package$.<init>(package.scala:1095)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
at org.apache.spark.SparkConf$.<init>(SparkConf.scala:654)
at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
at org.apache.spark.SparkConf.set(SparkConf.scala:94)
at org.apache.spark.SparkConf.set(SparkConf.scala:83)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:916)
at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:916)
at net.jgp.books.spark.ch01.lab100_csv_to_dataframe.CsvToDataframeApp.start(CsvToDataframeApp.java:32)
at net.jgp.books.spark.ch01.lab100_csv_to_dataframe.CsvToDataframeApp.main(CsvToDataframeApp.java:21)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does not "opens java.nio" to unnamed module #35dab4eb
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
at java.base/java.lang.reflect.Constructor.checkCanSetAccessible(Constructor.java:188)
at java.base/java.lang.reflect.Constructor.setAccessible(Constructor.java:181)
at org.apache.spark.unsafe.Platform.<clinit>(Platform.java:56)
... 16 more
I tried the solutions here: Why am I seeing `java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible` on a Mac
I also tried uninstalling and reinstalling Java. I am a newbie.
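From what I can tell, the solutions in that linked question mostly boil down to opening java.nio to unnamed modules with VM arguments along these lines (the exact list of packages seems to vary by Spark version, and I am assuming they would go into the Eclipse run configuration; I have not been able to verify this in my setup):
--add-opens java.base/java.nio=ALL-UNNAMED
--add-opens java.base/sun.nio.ch=ALL-UNNAMED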
Try adding the below to your Spark session builder:

import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder()
    // your settings
    .config(
        "spark.sql.extensions",
        "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .getOrCreate();
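With the session created, the rest of the app only needs to read the CSV and show a few rows; a minimal sketch, assuming a hypothetical file path:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;

// Read a CSV file with a header row into a DataFrame and print the first 5 rows.
Dataset<Row> df = spark.read()
        .format("csv")
        .option("header", "true")
        .load("data/books.csv"); // hypothetical path
df.show(5);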
I am trying to install Cassandra 3.11.6 on Windows 10. I have set the environment variables and my current Java version is 1.8.0_241. When I run the command "cassandra -f" in the command prompt, I get the message below.
WARNING! Powershell script execution unavailable.
Please use 'powershell Set-ExecutionPolicy Unrestricted'
on this user-account to run cassandra with fully featured
functionality on this platform.
Starting with legacy startup options
Starting Cassandra Server
Unrecognized VM option 'UseParNewGC'
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
How can I get Cassandra to work? Am I missing something?
Note: I know the same question already exists, but its environment is Windows, so I created this one: JNA link failure Error on Cassandra Startup
I try to start Cassandra but I get the warning below:
$ cassandra
...
WARN 09:13:42 JNA link failure, one or more native method will be unavailable.
WARN 09:13:42 JMX is not enabled to receive remote connections. Please see cassandra-env.sh for more info.
Please tell me how to solve this problem.
My environment:
Cassandra v2.2.0 with Homebrew
OS X 10.10
JNA is used for optimizations such as disabling swapping and creating hardlinks during snapshots. It is recommended for production systems. Dev systems should also be fine without JNA support, so you can just ignore the warning.
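For anyone curious what that native linkage looks like, here is a simplified, stand-alone sketch of JNA direct mapping against libc (class and constant names are made up; this is not Cassandra's actual code). It binds a Java method straight to libc's mlockall, the call behind the swap-disabling optimization, and it is this binding step that fails with the "JNA link failure" warning when the library or symbol cannot be resolved:

import com.sun.jna.Native;

public final class CLibSketch {
    static {
        // Bind the native method declared below to the matching symbol in libc.
        // On platforms without libc this registration fails, which is what
        // Cassandra reports as a JNA link failure.
        Native.register(CLibSketch.class, "c");
    }

    private static final int MCL_CURRENT = 1; // lock pages currently mapped
    private static final int MCL_FUTURE = 2;  // lock pages mapped in the future

    // Lock the process's pages in memory so the OS will not swap them out.
    public static native int mlockall(int flags);

    public static void main(String[] args) {
        // Returns 0 on success, -1 on failure (e.g. insufficient privileges).
        int rc = mlockall(MCL_CURRENT | MCL_FUTURE);
        System.out.println("mlockall returned " + rc);
    }
}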
I can't run Cassandra as a daemon. I have set the JAVA_HOME, CASSANDRA_HOME, and PATH variables for Cassandra. To run it I use Apache Commons and the tutorial (link),
but when I try to start it I see in the console: Error: Could not find or load main class org.apache.cassandra.service.CassandraDaemon
Tested on JDK 8 and 7
I do not know what's going on.
I found this error reported as a bug in CASSANDRA-7477.
I'm using Cassandra 1.2.3 on Windows. I have downloaded and copied Jna.jar and Platform.jar to C:\Program Files (x86)\apache-cassandra-1.2.3\lib, but when I run Cassandra I get this message:
INFO 16:20:42,839 JNA link failure, one or more native method will be unavailable.
I haven't found any solution to fix it on Windows.
Cassandra does not support JNA on Windows. It only knows how to link libc, which does not exist there.