I am setting up JMeter with Groovy to test a Cassandra DB.
However, I cannot fix these errors.
Could you help me?
Response message: javax.script.ScriptException:
org.codehaus.groovy.control.MultipleCompilationErrorsException:
startup failed: General error during class generation:
java.lang.NoClassDefFoundError: Unable to load class
com.datastax.driver.core.Session due to missing dependency
org/apache/cassandra/transport/Message$Request
You need to add:
The Cassandra driver itself (cassandra-driver-core)
All of its dependencies, namely:
asm-5.0.3.jar
asm-analysis-5.0.3.jar
asm-commons-5.0.3.jar
asm-tree-5.0.3.jar
asm-util-5.0.3.jar
guava-19.0.jar
jffi-1.2.16.jar
jffi-1.2.16-native.jar
jnr-constants-0.9.9.jar
jnr-ffi-2.1.7.jar
jnr-posix-3.0.44.jar
jnr-x86asm-1.0.2.jar
metrics-core-3.2.2.jar
netty-buffer-4.0.56.Final.jar
netty-codec-4.0.56.Final.jar
netty-common-4.0.56.Final.jar
netty-handler-4.0.56.Final.jar
netty-transport-4.0.56.Final.jar
slf4j-api-1.7.25.jar
to the JMeter classpath.
So you will need to:
Download cassandra-driver-core-3.6.0.jar
Download all the aforementioned dependencies
Copy the driver and the dependencies to the "lib" folder of your JMeter installation
Restart JMeter to pick the libraries up
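Once the jars are in place, a JSR223 Sampler using Groovy along these lines should work. This is only a minimal sketch: the contact point, keyspace name and query are placeholders for your own environment.

import com.datastax.driver.core.Cluster

// Build a Cluster pointing at one of your Cassandra nodes (placeholder address)
def cluster = Cluster.builder()
        .addContactPoint('127.0.0.1')
        .build()
// Connect to a keyspace ('my_keyspace' is a placeholder)
def session = cluster.connect('my_keyspace')
try {
    // Run any CQL statement; this one just reads the server version
    def rows = session.execute('SELECT release_version FROM system.local')
    log.info('Cassandra release: ' + rows.one().getString('release_version'))
} finally {
    // Always release driver resources, or JMeter threads will leak connections
    session.close()
    cluster.close()
}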
More information: Cassandra Load Testing with Groovy
Related
I am running a job from a jar file in Azure Databricks. This jar has a dependency on azure-storage-file-share. Previously, there was no issue installing this dependency using Maven within the Databricks UI. Now it fails with this error message:
Run result unavailable: job failed with error message Library installation failed for library due to user error for maven { coordinates: "com.azure:azure-storage-file-share:12.16.2" } Error messages: Library installation attempted on the driver node of cluster 0131-144423-r1g81134 and failed. Please refer to the following error message to fix the library or contact Databricks support. Error Code: DRIVER_LIBRARY_INSTALLATION_FAILURE. Error Message: java.util.concurrent.ExecutionException: java.io.FileNotFoundException: File file:/local_disk0/tmp/clusterWideResolutionDir/maven/ivy/jars/io.netty_netty-transport-native-kqueue-4.1.86.Final.jar does not exist
To try to work around this, I manually installed this netty library (and several others) as well. I can see in the logs that it was able to download the jar successfully:
downloading https://maven-central.storage-download.googleapis.com/maven2/io/netty/netty-transport-native-kqueue/4.1.86.Final/netty-transport-native-kqueue-4.1.86.Final-osx-x86_64.jar ... [SUCCESSFUL ] io.netty#netty-transport-native-kqueue;4.1.86.Final!netty-transport-native-kqueue.jar (202ms)
However, it still fails. This error message appears in the same log, after the one above:
23/01/31 14:50:09 WARN LibraryState: [Thread 137] Failed to install library maven;azure-storage-file-share;com.azure;12.16.2;; java.util.concurrent.ExecutionException: java.io.FileNotFoundException: File file:/local_disk0/tmp/clusterWideResolutionDir/maven/ivy/jars/io.netty_netty-transport-native-kqueue-4.1.86.Final.jar does not exist
I am experimenting with Spark reading from a Kafka topic, following the "Structured Streaming + Kafka Integration Guide".
Spark version: 3.2.1
Scala version: 2.12.15
Following the guide's instructions for including the dependencies with the spark-shell, I start my shell:
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.1
However, once I run something like the following in my shell:
val df = spark.readStream.format("kafka").option("kafka.bootstrap.servers","http://HOST:PORT").option("subscribe", "my-topic").load()
I get the following exception:
java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArraySerializer
Any ideas how to overcome this issue?
My assumption was that with --packages, all dependencies would be loaded as well. But this does not seem to be the case. From the logs I gather that the package gets loaded successfully, including the kafka-clients dependency:
org.apache.spark#spark-sql-kafka-0-10_2.12 added as a dependency
resolving dependencies :: org.apache.spark#spark-submit-parent-3b04f646-471c-4cc8-88fb-7e32bc3226ed;1.0
confs: [default]
found org.apache.spark#spark-sql-kafka-0-10_2.12;3.2.1 in central
found org.apache.spark#spark-token-provider-kafka-0-10_2.12;3.2.1 in central
found org.apache.kafka#kafka-clients;2.8.0 in central
found org.lz4#lz4-java;1.7.1 in central
found org.xerial.snappy#snappy-java;1.1.8.4 in central
found org.slf4j#slf4j-api;1.7.30 in central
found org.apache.hadoop#hadoop-client-runtime;3.3.1 in central
found org.spark-project.spark#unused;1.0.0 in central
found org.apache.hadoop#hadoop-client-api;3.3.1 in central
found org.apache.htrace#htrace-core4;4.1.0-incubating in central
found commons-logging#commons-logging;1.1.3 in central
found com.google.code.findbugs#jsr305;3.0.0 in central
found org.apache.commons#commons-pool2;2.6.2 in central
The logs seem fine, but you can try to include the kafka-clients dependency in the --packages argument as well.
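For example (untested; the kafka-clients version matches the one the resolver reported in the logs above):
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.1,org.apache.kafka:kafka-clients:2.8.0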
Otherwise, I'd suggest building an uber jar instead of downloading the libraries every time you submit the app.
I have upgraded the WebLogic version on a Linux server by changing the wl_home path in setDomainEnv.sh from 12.1.2 to 12.1.3 and restarting. When restarting, it gives the errors below.
I would appreciate it if anyone can give me an idea about this.
java.lang.IllegalAccessError: tried to access method com.bea.logging.LogBufferHandler.bufferLogObject(Ljava/lang/Object;)V from class weblogic.logging.log4j.WLLog4jMemoryBufferAppender
java.lang.IllegalStateException: Unable to perform operation: post construct on weblogic.diagnostics.lifecycle.LoggingServerService
java.lang.IllegalArgumentException: While attempting to resolve the dependencies of weblogic.diagnostics.lifecycle.DiagnosticFoundationService errors were found
java.lang.IllegalStateException: Unable to perform operation: resolve on weblogic.diagnostics.lifecycle.DiagnosticFoundationService
java.lang.IllegalArgumentException: While attempting to resolve the dependencies of com.oracle.injection.integration.CDIIntegrationService errors were found
java.lang.IllegalStateException: Unable to perform operation: resolve on com.oracle.injection.integration.CDIIntegrationService
Use the wllog4j.jar that is part of the WLS 12.1.3 build, i.e. the one in the WL_HOME/server/lib directory.
If your domain-home/lib folder still holds the older wllog4j.jar that was part of 12.1.2, you will face this issue.
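For example, something along these lines (paths are placeholders; keep a backup of the old jar):
cd $DOMAIN_HOME/lib
mv wllog4j.jar wllog4j.jar.12.1.2.bak   # move the stale 12.1.2 copy out of the way
cp $WL_HOME/server/lib/wllog4j.jar .    # copy in the 12.1.3 version (or skip this and let server/lib be used)
Then restart the domain.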
I'm trying to start a Spark jobserver; here are the steps I'm following:
I configure the local.sh based on the template.
Then I run ./bin/server_deploy.sh and it finishes without any error.
Configure local.conf.
Run ./bin/server_start.sh on the deploy server.
But when I do the last step I get the following error:
Error: Exception thrown by the agent : java.lang.NullPointerException
Note: I'm using Spark 1.4.1 and version 0.5.2 of the jobserver (https://github.com/spark-jobserver/spark-jobserver/tree/v0.5.2).
Any idea how I can fix this (or at least debug it)?
Thanks
The error log does not provide much information.
I encountered the same error. In my case, I had another instance of the jobserver running (and somehow ./bin/server_stop.sh did not catch it). It worked after I manually killed the other process.
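For example (the process name and PID are illustrative; adjust to what ps actually shows):
ps aux | grep -i jobserver      # find the stale instance and note its PID
kill <pid>                      # stop it (kill -9 only if it will not exit)
./bin/server_start.sh           # then start the server again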
Hint: Error: Exception thrown by the agent : java.lang.NullPointerException when starting Java application
I am working on a J2ME project in NetBeans. I am having a problem building the project. Whenever I try to build it, it gives me a preverification error. Below are the logs.
Executable: C:\WTK2.5.2\bin\preverify
Arguments : -classpath "C:\J2ME Client USE\S60CnB\lib\jsr257.jar;C:\J2ME Client USE\S60CnB\lib\JSR257Ext.jar;C:\WTK2.5.2\lib\mmapi.jar;C:\WTK2.5.2\lib\jsr75.jar;C:\WTK2.5.2\lib\cldcapi11.jar;C:\WTK2.5.2\lib\jsr239.jar;C:\WTK2.5.2\lib\midpapi20.jar;C:\WTK2.5.2\lib\jsr179.jar" -d "C:\J2ME Client USE\S60CnB\ec09b2f6.tmp" -target CLDC1.1 "C:\J2ME Client USE\S60CnB\build\NKTej2.jar"
JAR file creation failed with error 1
The preverified classes if any are in tmp26379. See jar log of errors in C:\J2ME Client USE\S60CnB\ec09b2f6.tmp\jarlog.txt
Error preferifying, attempting to print C:\J2ME Client USE\S60CnB\ec09b2f6.tmp\jarlog.txt
====C:\J2ME Client USE\S60CnB\ec09b2f6.tmp\jarlog.txt====
java.io.FileNotFoundException: tmp26379\META-INF\MANIFEST.MF (The system cannot find the path specified)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:120)
at java.io.FileInputStream.<init>(FileInputStream.java:79)
at sun.tools.jar.Main.run(Main.java:148)
at sun.tools.jar.Main.main(Main.java:1147)
Error: No error
jar -cfm "C:\J2ME Client USE\S60CnB\ec09b2f6.tmp\NKTej2.jar" tmp26379\\META-INF\MANIFEST.MF -C tmp26379 .
C:\J2ME Client USE\S60CnB\build.xml:146: Preverification failed (result=1)
Please help me out.
Please give details of your program. Which UI are you using? Have you gone through your program to check that there is no bug? Have you successfully installed the Oracle SDK plugins for NetBeans?
These are the possible problems. If the issue persists, reply...
J2ME tutorial at http://www.tutorialmasterng.blogspot.com