So I installed the local version of DynamoDB from https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/DynamoDBLocal.html and then executed the command mentioned in the same link:
java -Djava.library.path=./DynamoDBLocal_lib -jar DynamoDBLocal.jar -sharedDb
The thing is, I get this error every time:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/eclipse/jetty/server/handler/AbstractHandler
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
at java.lang.Class.getMethod0(Class.java:3018)
at java.lang.Class.getMethod(Class.java:1784)
at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.eclipse.jetty.server.handler.AbstractHandler
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
How do I get rid of this problem?
I'm using Ubuntu 16.04 and trying to get DynamoDB working with Node.js.
If you are using this for local development, I'd highly suggest using a Docker image for DynamoDB: it ensures you don't need any configuration specific to your Linux flavor, and it should work out of the box.
For me, https://hub.docker.com/r/dwmkerr/dynamodb/ works quite well. If you don't have Docker yet, here's a handy guide on installing it on Ubuntu 16.04: https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-16-04
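A minimal sketch of running that image, assuming it exposes DynamoDB Local's default port 8000 and forwards extra arguments (like -sharedDb) to the jar:
docker run -p 8000:8000 dwmkerr/dynamodb -sharedDb
Your Node.js client would then point its endpoint at http://localhost:8000 instead of the real AWS endpoint.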
Related
I am following this tutorial to run the Spark-Pi application using the kubectl command: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/docs/quick-start-guide.md#running-the-examples.
When I use the Spark operator to submit a Spark job with kubectl apply -f examples/spark-pi.yaml, it fails. My Spark version is 2.4.4 and the operator version is v1beta2-1.0.1-2.4.4.
Here are the messages:
Warning SparkApplicationFailed 3m spark-operator SparkApplication spark-pi failed: failed to run spark-submit for SparkApplication default/spark-pi: Exception in thread "main" java.lang.NoClassDefFoundError: io/fabric8/kubernetes/client/Watcher
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:233)
at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:204)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: io.fabric8.kubernetes.client.Watcher
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 33 more
Afterwards, I tried to modify the Dockerfile to add kubernetes-client-4.1.2.jar, reference it from the YAML file, and re-create the Docker image. But the error message is the same. My Dockerfile is:
ARG spark_image=gcr.io/spark-operator/spark:v2.4.4
FROM $spark_image
RUN mkdir -p /opt/spark/jars
ADD https://repo1.maven.org/maven2/io/fabric8/kubernetes-client/4.1.2/kubernetes-client-4.1.2.jar /opt/spark/jars
ENV SPARK_HOME /opt/spark
WORKDIR /opt/spark/work-dir
ENTRYPOINT [ "/opt/entrypoint.sh" ]
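For reference, the rebuild step was along these lines; the image tag is hypothetical, and spec.image in examples/spark-pi.yaml would be pointed at it:
docker build -t my-registry/spark:v2.4.4-fabric8 .
docker push my-registry/spark:v2.4.4-fabric8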
Any help would be appreciated.
The Spark operator requires Kubernetes 1.13+, but the OpenShift cluster I was using runs Kubernetes 1.11, so moving to a cluster with a newer Kubernetes version solved the problem.
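You can check which server version the cluster is actually running (the output format varies across kubectl releases):
kubectl version --short
The Server Version line reports the Kubernetes version of the cluster itself, as opposed to that of the local client.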
I want to deploy my JavaFX application to a Raspberry Pi from within IntelliJ (which also makes debugging easier). For this purpose I installed the following plugin for embedded Linux development: https://plugins.jetbrains.com/plugin/7738-embedded-linux-jvm-debugger-raspberry-pi-beaglebone-black-intel-galileo-ii-and-several-other-iot-devices-
I can connect to my Pi just fine, and deployment also works. However, once the following command is executed, something goes wrong:
Executing Command :sudo java -Dprism.verbose=true -cp classes:lib/'*' runnable.MainApp
A load of class files show up (such as classes/mypackage/MyClass.class), and eventually I get the following stack trace, which I'm not sure what to make of.
The kernel version of the Pi is: Linux raspberrypi 4.4.50-v7+ #970 SMP Mon Feb 20 19:18:29 GMT 2017 armv7l GNU/Linux
GraphicsPipeline.createPipeline failed for com.sun.prism.es2.ES2Pipeline
java.lang.ClassNotFoundException: com.sun.prism.es2.ES2Pipeline
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.sun.prism.GraphicsPipeline.createPipeline(GraphicsPipeline.java:187)
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.init(QuantumRenderer.java:91)
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.run(QuantumRenderer.java:124)
at java.lang.Thread.run(Thread.java:745)
*** Fallback to Prism SW pipeline
And right after that, I get the following:
Prism pipeline name = com.sun.prism.sw.SWPipeline
GraphicsPipeline.createPipeline failed for com.sun.prism.sw.SWPipeline
java.lang.UnsatisfiedLinkError: Can't load library: /home/pi/IdeaProjects/user_interface/arm/libprism_sw.so
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at com.sun.glass.utils.NativeLibLoader.loadLibraryFullPath(NativeLibLoader.java:201)
at com.sun.glass.utils.NativeLibLoader.loadLibraryInternal(NativeLibLoader.java:94)
at com.sun.glass.utils.NativeLibLoader.loadLibrary(NativeLibLoader.java:39)
at com.sun.prism.sw.SWPipeline.lambda$static$472(SWPipeline.java:42)
at java.security.AccessController.doPrivileged(Native Method)
at com.sun.prism.sw.SWPipeline.<clinit>(SWPipeline.java:41)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.sun.prism.GraphicsPipeline.createPipeline(GraphicsPipeline.java:187)
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.init(QuantumRenderer.java:91)
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.run(QuantumRenderer.java:124)
at java.lang.Thread.run(Thread.java:745)
Graphics Device initialization failed for : es2, sw
Error initializing QuantumRenderer: no suitable pipeline found
java.lang.RuntimeException: java.lang.RuntimeException: Error initializing QuantumRenderer: no suitable pipeline found
at com.sun.javafx.tk.quantum.QuantumRenderer.getInstance(QuantumRenderer.java:280)
at com.sun.javafx.tk.quantum.QuantumToolkit.init(QuantumToolkit.java:221)
at com.sun.javafx.tk.Toolkit.getToolkit(Toolkit.java:205)
at com.sun.javafx.application.PlatformImpl.startup(PlatformImpl.java:209)
at com.sun.javafx.application.LauncherImpl.startToolkit(LauncherImpl.java:675)
at com.sun.javafx.application.LauncherImpl.launchApplicationWithArgs(LauncherImpl.java:337)
at com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:328)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at sun.launcher.LauncherHelper$FXHelper.main(LauncherHelper.java:767)
Caused by: java.lang.RuntimeException: Error initializing QuantumRenderer: no suitable pipeline found
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.init(QuantumRenderer.java:94)
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.run(QuantumRenderer.java:124)
at java.lang.Thread.run(Thread.java:745)
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at sun.launcher.LauncherHelper$FXHelper.main(LauncherHelper.java:767)
Caused by: java.lang.RuntimeException: No toolkit found
at com.sun.javafx.tk.Toolkit.getToolkit(Toolkit.java:217)
at com.sun.javafx.application.PlatformImpl.startup(PlatformImpl.java:209)
at com.sun.javafx.application.LauncherImpl.startToolkit(LauncherImpl.java:675)
at com.sun.javafx.application.LauncherImpl.launchApplicationWithArgs(LauncherImpl.java:337)
at com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:328)
... 5 more
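The UnsatisfiedLinkError above says the JVM tried to load libprism_sw.so from an absolute path and failed, so a first diagnostic step is simply to check whether the deployed file exists there at all:
ls -l /home/pi/IdeaProjects/user_interface/arm/libprism_sw.so
If it is missing, the deployment step never copied the JavaFX ARM native libraries alongside the classes.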
While I'm trying to submit a Spark job from Eclipse to a remote Hadoop cluster (YARN), I get the error below:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/Logging
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.deploy.yarn.ExecutorLauncher$.main(ApplicationMaster.scala:674)
at org.apache.spark.deploy.yarn.ExecutorLauncher.main(ApplicationMaster.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.Logging
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 14 more
Please help me with this.
The problem was solved by downloading the spark-assembly jar, saving it onto the HDFS cluster, and adding this property to the Eclipse code:
set("spark.yarn.jar", "hdfs://namenode:8020/
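For submissions outside Eclipse, the same property can go on the spark-submit command line; this is only a sketch, and the assembly path, main class, and application jar below are hypothetical:
spark-submit --master yarn --conf spark.yarn.jar=hdfs://namenode:8020/spark/spark-assembly.jar --class my.Main my-app.jar
Either way, spark.yarn.jar tells YARN where to fetch the Spark assembly, so the containers have the Spark classes on their classpath.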
I have a Hortonworks Hadoop cluster installed on my Windows machine (version: 2.6.0.2.2.4.2-0002), and I downloaded Spark 1.4.0 for Hadoop 2.6 and later. I haven't changed any settings in my Windows environment. When I run the spark-shell command, I get the following error message:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/internal/util/AbstractFileClassLoader
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.spark.repl.Main$.main(Main.scala:30)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
I have Scala installed, with SCALA_HOME=c:\scala\bin.
I've seen sets of changes that supposedly need to be made (e.g., adding some environment variables, etc.) here and there, but none of them looks complete, and they didn't get my Spark cluster working. I'd like to know the full set of changes required to get my Spark cluster up and running.
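As a side note, SCALA_HOME conventionally points at the Scala installation root rather than its bin subdirectory; assuming Scala lives at C:\scala, setting it persistently from a Windows command prompt would look like this:
setx SCALA_HOME "C:\scala"
(with %SCALA_HOME%\bin added to PATH to keep the scala command available).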
I have a Java web app that uses embedded Jetty, and when I try to run it in a Linux environment I get the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/eclipse/jetty/server/Connector
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at br.com.teste.install.Launcher.work(Launcher.java:23)
at br.com.teste.install.Install.main(Install.java:14)
Caused by: java.lang.ClassNotFoundException: org.eclipse.jetty.server.Connector
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 4 more
When I run exactly the same app in a Windows environment, it works without any error. I have no idea why this happens.
Thanks for any help.
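One classic difference between the two platforms is the classpath separator: Windows uses ';' while Linux uses ':'. As a purely illustrative sketch (the jar name and lib directory are hypothetical; the main class comes from the stack trace above):
java -cp "myapp.jar:lib/*" br.com.teste.install.Install
If a launch script carries the Windows ';' separator over to Linux, everything after the first entry silently drops off the classpath, which produces exactly this kind of ClassNotFoundException.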