I am running Maven 3.3.9 in VirtualBox with JDK 8, and when I run the mvn command from the terminal I get the following error:
constituent[25]: file:/usr/share/maven/lib/guice.jar
constituent[26]: file:/usr/share/maven/lib/eclipse-aether-impl.jar
constituent[27]: file:/usr/share/maven/lib/guava.jar
constituent[28]: file:/usr/share/maven/lib/commons-lang3.jar
constituent[29]: file:/usr/share/maven/lib/wagon-http-shaded.jar
constituent[30]: file:/usr/share/maven/lib/javax.inject.jar
constituent[31]: file:/usr/share/maven/lib/maven-compat-3.x.jar
constituent[32]: file:/usr/share/maven/lib/commons-io.jar
constituent[33]: file:/usr/share/maven/lib/plexus-interpolation.jar
constituent[34]: file:/usr/share/maven/lib/commons-lang.jar
constituent[35]: file:/usr/share/maven/lib/plexus-utils.jar
constituent[36]: file:/usr/share/maven/lib/plexus-cipher.jar
constituent[37]: file:/usr/share/maven/lib/wagon-provider-api.jar
constituent[38]: file:/usr/share/maven/lib/wagon-http-shared.jar
constituent[39]: file:/usr/share/maven/lib/eclipse-aether-transport-wagon.jar
constituent[40]: file:/usr/share/maven/lib/sisu-plexus.jar
constituent[41]: file:/usr/share/maven/conf/logging/
---------------------------------------------------
Exception in thread "main" java.lang.NoClassDefFoundError: javax/inject/Provider
at org.apache.maven.cli.MavenCli.container(MavenCli.java:545)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:281)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: java.lang.ClassNotFoundException: javax.inject.Provider
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:271)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:247)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:239)
... 11 more
I read a few forums and they all suggest upgrading the JDK, but I am already on the latest version. :-(
Can someone please guide me with this issue?
The same Maven version works fine on a Windows machine with the same JDK.
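For what it's worth, on Debian/Ubuntu this particular ClassNotFoundException is often caused by the javax.inject.jar entry in the constituents list being a broken symlink rather than a real jar. A minimal diagnostic sketch, assuming the distribution's packaging (the package name libatinject-jsr330-api-java is my assumption for Debian/Ubuntu; verify it with dpkg -S):

# Check whether the jar Maven puts on its classpath actually resolves
ls -l /usr/share/maven/lib/javax.inject.jar
# Identify the package that ships the (possibly dangling) link, then reinstall it
dpkg -S /usr/share/maven/lib/javax.inject.jar
sudo apt-get install --reinstall libatinject-jsr330-api-java maven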
Groovy version:
Groovy Version: 3.0.0-rc-3 JVM: 1.8.0_221 Vendor: Oracle Corporation OS: Linux
On running the command groovysh, I get the following error output:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.groovy.tools.GroovyStarter.rootLoader(GroovyStarter.java:106)
at org.codehaus.groovy.tools.GroovyStarter.main(GroovyStarter.java:124)
Caused by: java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no jansi in java.library.path, /tmp/libjansi-64-3695606470401720252.so: /tmp/libjansi-64-3695606470401720252.so: failed to map segment from shared object: Operation not permitted]
at org.fusesource.hawtjni.runtime.Library.doLoad(Library.java:182)
at org.fusesource.hawtjni.runtime.Library.load(Library.java:140)
at org.fusesource.jansi.internal.CLibrary.<clinit>(CLibrary.java:42)
at org.fusesource.jansi.AnsiConsole.wrapOutputStream(AnsiConsole.java:48)
at org.fusesource.jansi.AnsiConsole.<clinit>(AnsiConsole.java:38)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.lambda$createCallStaticSite$2(CallSiteArray.java:65)
at java.security.AccessController.doPrivileged(Native Method)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.createCallStaticSite(CallSiteArray.java:63)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.createCallSite(CallSiteArray.java:156)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:115)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:119)
at org.apache.groovy.groovysh.Main.installAnsi(Main.groovy:249)
at org.apache.groovy.groovysh.Main.setTerminalType(Main.groovy:235)
at org.apache.groovy.groovysh.Main.main(Main.groovy:120)
... 6 more
As far as I know this kind of issue comes not from Groovy itself, but from your user not being able to write to (or execute from) /tmp, for example if your /tmp is mounted read-only or with the noexec flag; "failed to map segment from shared object" is the classic symptom of a noexec mount.
Background: Jansi requires native libraries that are shipped inside its jar. Java cannot load them directly from the jar, so they have to be extracted "somewhere", which is usually /tmp. Since your /tmp apparently cannot be used that way, you get the "Operation not permitted".
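If you cannot get /tmp remounted, a minimal workaround sketch, assuming your account can create a directory under $HOME (the standard groovy launcher script honors JAVA_OPTS):

# See how /tmp is mounted; look for ro or noexec flags
mount | grep ' /tmp '
# Give the JVM a private temp dir that allows writing and executable mappings
mkdir -p "$HOME/tmp"
JAVA_OPTS="-Djava.io.tmpdir=$HOME/tmp" groovysh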
I tried all the solutions listed on stackoverflow.com and askubuntu.com. I am using Java 8, not the default JDK available. I really hope someone can provide a solution; please let me know if the question is not relevant enough.
Java Env:
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) Server VM (build 25.144-b01, mixed mode)
Error:
Java HotSpot(TM) Server VM warning: ignoring option MaxPermSize=256M; support was removed in 8.0
Graphics Device initialization failed for : es2, sw
Error initializing QuantumRenderer: no suitable pipeline found
java.lang.RuntimeException: java.lang.RuntimeException: Error initializing QuantumRenderer: no suitable pipeline found
at com.sun.javafx.tk.quantum.QuantumRenderer.getInstance(QuantumRenderer.java:280)
at com.sun.javafx.tk.quantum.QuantumToolkit.init(QuantumToolkit.java:221)
at com.sun.javafx.tk.Toolkit.getToolkit(Toolkit.java:205)
at com.sun.javafx.application.PlatformImpl.startup(PlatformImpl.java:209)
at com.sun.javafx.application.LauncherImpl.startToolkit(LauncherImpl.java:675)
at com.sun.javafx.application.LauncherImpl.launchApplicationWithArgs(LauncherImpl.java:337)
at com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:328)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.launcher.LauncherHelper$FXHelper.main(LauncherHelper.java:767)
Caused by: java.lang.RuntimeException: Error initializing QuantumRenderer: no suitable pipeline found
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.init(QuantumRenderer.java:94)
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.run(QuantumRenderer.java:124)
at java.lang.Thread.run(Thread.java:748)
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.launcher.LauncherHelper$FXHelper.main(LauncherHelper.java:767)
Caused by: java.lang.RuntimeException: No toolkit found
at com.sun.javafx.tk.Toolkit.getToolkit(Toolkit.java:217)
at com.sun.javafx.application.PlatformImpl.startup(PlatformImpl.java:209)
at com.sun.javafx.application.LauncherImpl.startToolkit(LauncherImpl.java:675)
at com.sun.javafx.application.LauncherImpl.launchApplicationWithArgs(LauncherImpl.java:337)
at com.sun.javafx.application.LauncherImpl.launchApplication(LauncherImpl.java:328)
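"Error initializing QuantumRenderer: no suitable pipeline found" means JavaFX could bring up neither its hardware (es2) nor its software (sw) rendering pipeline. A diagnostic sketch using the standard Prism system properties (the jar name is a placeholder for your application):

# Print which pipelines Prism tries and why each one fails
java -Dprism.verbose=true -jar YourFxApp.jar
# Explicitly request the software pipeline, e.g. inside a VM or remote
# session without 3D acceleration
java -Dprism.order=sw -jar YourFxApp.jar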
I'm trying to run Spark on Windows 10 with Hadoop, from the official Microsoft terminal, cmd.exe.
I don't have a problem with Hadoop; the installation and startup are OK.
I'm using Java 8 x64 (jdk1.8.0_92).
When I start Spark with the command spark-shell, I get the Java error below:
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:229)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:221)
at scala.tools.jline_embedded.console.ConsoleReader.<init>(ConsoleReader.java:209)
at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.<init>(JLineReader.scala:61)
at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.<init>(JLineReader.scala:33)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:865)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:862)
at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:871)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
at scala.util.Try$.apply(Try.scala:192)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
at scala.collection.immutable.Stream.collect(Stream.scala:435)
at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:877)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:916)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:916)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
at org.apache.spark.repl.Main$.main(Main.scala:49)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I get exactly the same stack trace when opening the Scala console in NetBeans. If I type any Scala expression below the stack trace, it works fine, though.
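Since the REPL "falls back to unsupported" and still evaluates expressions, the trace is noisy but not fatal. One commonly suggested way to skip the failing Windows terminal probe is to pre-select JLine's unsupported terminal; whether the shaded jline_embedded copy honors the standard property name is an assumption on my part:

REM Pass the property to the driver JVM that hosts the REPL
spark-shell --conf "spark.driver.extraJavaOptions=-Djline.terminal=jline.UnsupportedTerminal"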
I have a Hortonworks Hadoop cluster installed on my Windows machine (version: 2.6.0.2.2.4.2-0002), and I downloaded Spark 1.4.0 built for Hadoop 2.6 and later. I haven't changed any settings in my Windows environment. When I run the spark-shell command I get the following error message:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/internal/util/AbstractFileClassLoader
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.spark.repl.Main$.main(Main.scala:30)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
I have Scala installed and SCALA_HOME=c:\scala\bin.
I've seen sets of changes that I supposedly need to make (e.g. adding some environment variables, etc.) here and there, but none of them looks complete, and none got my Spark cluster working. I would like to know the full set of changes required to get my Spark cluster up and running.
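For reference, scala/reflect/internal/util/AbstractFileClassLoader exists under that package only in newer Scala versions, so this error usually points at a mismatched scala-library on the classpath (for example, jars leaking in via SCALA_HOME). A minimal sketch of the environment I would set before retrying, with all paths as placeholders for your own install locations; note that SCALA_HOME conventionally points at the Scala root directory, not its bin subfolder:

REM Run these in cmd.exe before starting the shell
set JAVA_HOME=C:\Java\jdk1.8.0
set SCALA_HOME=C:\scala
set HADOOP_HOME=C:\hdp\hadoop
set SPARK_HOME=C:\spark-1.4.0-bin-hadoop2.6
set PATH=%PATH%;%JAVA_HOME%\bin;%SCALA_HOME%\bin;%SPARK_HOME%\bin
spark-shell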
Complete error while starting Cassandra:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:322)
at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:229)
at org.xerial.snappy.Snappy.<clinit>(Snappy.java:48)
at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:66)
at org.apache.cassandra.net.OutboundTcpConnection.connect(OutboundTcpConnection.java:359)
at org.apache.cassandra.net.OutboundTcpConnection.run(OutboundTcpConnection.java:150)
Caused by: java.lang.UnsatisfiedLinkError: /tmp/snappy-1.0.5-libsnappyjava.so: /usr/lib64/libstdc++.so.6: version `GLIBCXX_3.4.9' not found (required by /tmp/snappy-1.0.5-libsnappyjava.so)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1939)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1864)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1825)
at java.lang.Runtime.load0(Runtime.java:792)
at java.lang.System.load(System.java:1059)
at org.xerial.snappy.SnappyNativeLoader.load(SnappyNativeLoader.java:39)
... 11 more
ERROR 17:23:34,725 Exception in thread Thread[WRITE-/10.141.0.21,5,main]
org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:239)
at org.xerial.snappy.Snappy.<clinit>(Snappy.java:48)
at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:66)
at org.apache.cassandra.net.OutboundTcpConnection.connect(OutboundTcpConnection.java:359)
at org.apache.cassandra.net.OutboundTcpConnection.run(OutboundTcpConnection.java:150)
What I have done so far:
Followed the advice from the link and compiled snappy from source for version 1.0.5.
Replaced snappy-java-1.0.5.jar in $DSE/resource/cassandra/lib/.
Still I face the same issue. In fact, I suspect those two .so files are being picked up from somewhere else. I am on a cluster grid, so I can't ask the administrators to upgrade libstdc++ just for me.
What do you think I could be doing wrong?
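One way to confirm the mismatch before changing anything, as a diagnostic sketch against the exact paths from the error message:

# List the GLIBCXX symbol versions the system libstdc++ actually provides;
# GLIBCXX_3.4.9 must appear for the bundled snappy native to load
strings /usr/lib64/libstdc++.so.6 | grep GLIBCXX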
Use the latest version of snappy-java:
http://code.google.com/p/snappy-java/downloads/list
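A sketch of what that amounts to in practice; the file names are illustrative, and whether your snappy-java build supports the use.systemlib switch is an assumption (newer releases also bundle natives built against an older libstdc++, which sidesteps the GLIBCXX_3.4.9 requirement):

# Swap the bundled jar for a newer snappy-java release
cd "$DSE/resource/cassandra/lib"
mv snappy-java-1.0.5.jar snappy-java-1.0.5.jar.bak
cp /path/to/newer/snappy-java.jar .
# Or keep the bundled jar and tell snappy-java to load a system-installed
# libsnappy instead of extracting its own native into /tmp:
#   add -Dorg.xerial.snappy.use.systemlib=true to Cassandra's JVM options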