I have a Hortonworks Hadoop cluster (version 2.6.0.2.2.4.2-0002) installed on my Windows machine, and I downloaded Spark 1.4.0 pre-built for Hadoop 2.6 and later. I haven't changed any settings in my Windows environment. When I run the spark-shell command, I get the following error message:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/internal/util/AbstractFileClassLoader
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.spark.repl.Main$.main(Main.scala:30)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
I have Scala installed, and SCALA_HOME is set to c:\scala\bin.
I've seen sets of changes I supposedly need to make (e.g. adding some environment variables, etc.) here and there, but none of them looks complete, and none got my Spark cluster working. I'd like to know the full set of required changes to get my Spark cluster up and running.
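As a starting point, here is a minimal sketch of the Windows environment setup that is commonly needed (the install paths are assumptions, and note that SCALA_HOME conventionally points at the Scala install root, not its bin subfolder):
rem A minimal sketch, assuming Scala lives in c:\scala and Spark was
rem unpacked to c:\spark-1.4.0-bin-hadoop2.6 (both paths are assumptions).
setx SCALA_HOME "c:\scala"
setx SPARK_HOME "c:\spark-1.4.0-bin-hadoop2.6"
rem Add %SCALA_HOME%\bin and %SPARK_HOME%\bin to PATH, then open a new
rem cmd.exe window, since setx only affects future sessions.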
Related
I'm trying to start the spark-jobserver. I can't find any reference to the akka library from the error below in the installation steps provided on the spark-jobserver GitHub page.
I'm running Spark in standalone mode on a single server, which acts as both master and worker node.
But when I execute
./job-server/server_start.sh
it shows the following error:
error while starting up loggers
akka.ConfigurationException: Logger specified in config can't be loaded [akka.event.slf4j.Slf4jLogger] due to [java.lang.ClassNotFoundException: akka.event.slf4j.Slf4jLogger]
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:116)
at akka.event.LoggingBus$$anonfun$4$$anonfun$apply$1.applyOrElse(Logging.scala:115)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:216)
at scala.util.Try$.apply(Try.scala:192)
at scala.util.Failure.recover(Try.scala:216)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:115)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:110)
at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:683)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:682)
at akka.event.LoggingBus$class.startDefaultLoggers(Logging.scala:110)
at akka.event.EventStream.startDefaultLoggers(EventStream.scala:26)
at akka.actor.LocalActorRefProvider.init(ActorRefProvider.scala:623)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:157)
at akka.cluster.ClusterActorRefProvider.init(ClusterActorRefProvider.scala:58)
at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
at spark.jobserver.JobServer$.spark$jobserver$JobServer$$makeSupervisorSystem$1(JobServer.scala:154)
at spark.jobserver.JobServer$$anonfun$main$1.apply(JobServer.scala:156)
at spark.jobserver.JobServer$$anonfun$main$1.apply(JobServer.scala:156)
at spark.jobserver.JobServer$.start(JobServer.scala:54)
at spark.jobserver.JobServer$.main(JobServer.scala:156)
at spark.jobserver.JobServer.main(JobServer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: akka.event.slf4j.Slf4jLogger
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:67)
at akka.actor.ReflectiveDynamicAccess$$anonfun$getClassFor$1.apply(DynamicAccess.scala:66)
at scala.util.Try$.apply(Try.scala:192)
at akka.actor.ReflectiveDynamicAccess.getClassFor(DynamicAccess.scala:66)
at akka.event.LoggingBus$$anonfun$4.apply(Logging.scala:113)
... 33 more
Exception in thread "main" akka.ConfigurationException: Could not start logger due to [akka.ConfigurationException: Logger specified in config can't be loaded [akka.event.slf4j.Slf4jLogger] due to [java.lang.ClassNotFoundException: akka.event.slf4j.Slf4jLogger]]
at akka.event.LoggingBus$class.startDefaultLoggers(Logging.scala:144)
at akka.event.EventStream.startDefaultLoggers(EventStream.scala:26)
at akka.actor.LocalActorRefProvider.init(ActorRefProvider.scala:623)
at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:157)
at akka.cluster.ClusterActorRefProvider.init(ClusterActorRefProvider.scala:58)
at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:620)
at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:617)
at akka.actor.ActorSystemImpl._start(ActorSystem.scala:617)
at akka.actor.ActorSystemImpl.start(ActorSystem.scala:634)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
at spark.jobserver.JobServer$.spark$jobserver$JobServer$$makeSupervisorSystem$1(JobServer.scala:154)
at spark.jobserver.JobServer$$anonfun$main$1.apply(JobServer.scala:156)
at spark.jobserver.JobServer$$anonfun$main$1.apply(JobServer.scala:156)
at spark.jobserver.JobServer$.start(JobServer.scala:54)
at spark.jobserver.JobServer$.main(JobServer.scala:156)
at spark.jobserver.JobServer.main(JobServer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Two steps to solve this issue:
The spark-job-server.jar created after executing ./bin/server_package.sh <environment> has to be moved to the jars folder located in $SPARK_HOME.
Additionally, I had to download akka-slf4j_2.11.0-RC3-2.3.0.jar and put it in the $SPARK_HOME/jars folder too.
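A hedged shell sketch of those two steps (the built-jar location and the download source are assumptions; the target folder is the one named above):
./bin/server_package.sh <environment>
# the packaged jar typically lands under the job-server target directory (assumed path):
cp job-server/target/spark-job-server.jar "$SPARK_HOME/jars/"
# the akka-slf4j jar must be downloaded separately (e.g. from a Maven repository):
cp akka-slf4j_2.11.0-RC3-2.3.0.jar "$SPARK_HOME/jars/"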
Hue 3.10
Spark 1.6.0
CDH 5.8.0
When I run the jar using the spark-submit command it works fine, but running it through a Hue workflow gives me the following error:
java.lang.ClassNotFoundException: RowCountFilter
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:256)
at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:207)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:49)
at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Intercepting System.exit(101)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [101]
Can anyone help me figure out what is missing?
Please share your job.properties and coordinator.properties files. Check the lib path (oozie.libpath) in these files and see if the required jar is present.
When Oozie triggers a job, it checks the jars in the lib path and distributes them to all the nodes in the cluster for execution.
You may also want to verify the configs in oozie-site.xml.
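For illustration, a hypothetical job.properties sketch (all hosts and paths are made up; the point is that oozie.libpath must name a directory that actually contains the jar holding RowCountFilter):
# hypothetical cluster endpoints
nameNode=hdfs://namenode-host:8020
jobTracker=resourcemanager-host:8032
# pick up the shared action libs (Spark, etc.) that Oozie ships
oozie.use.system.libpath=true
# directory that must contain the application jar with RowCountFilter
oozie.libpath=${nameNode}/user/hue/oozie/workspaces/lib
oozie.wf.application.path=${nameNode}/user/hue/oozie/workspaces/rowcount-wf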
I am running Maven 3.3.9 in VirtualBox with JDK 8, and when I run the mvn command from the terminal I get the following error:
constituent[25]: file:/usr/share/maven/lib/guice.jar
constituent[26]: file:/usr/share/maven/lib/eclipse-aether-impl.jar
constituent[27]: file:/usr/share/maven/lib/guava.jar
constituent[28]: file:/usr/share/maven/lib/commons-lang3.jar
constituent[29]: file:/usr/share/maven/lib/wagon-http-shaded.jar
constituent[30]: file:/usr/share/maven/lib/javax.inject.jar
constituent[31]: file:/usr/share/maven/lib/maven-compat-3.x.jar
constituent[32]: file:/usr/share/maven/lib/commons-io.jar
constituent[33]: file:/usr/share/maven/lib/plexus-interpolation.jar
constituent[34]: file:/usr/share/maven/lib/commons-lang.jar
constituent[35]: file:/usr/share/maven/lib/plexus-utils.jar
constituent[36]: file:/usr/share/maven/lib/plexus-cipher.jar
constituent[37]: file:/usr/share/maven/lib/wagon-provider-api.jar
constituent[38]: file:/usr/share/maven/lib/wagon-http-shared.jar
constituent[39]: file:/usr/share/maven/lib/eclipse-aether-transport-wagon.jar
constituent[40]: file:/usr/share/maven/lib/sisu-plexus.jar
constituent[41]: file:/usr/share/maven/conf/logging/
---------------------------------------------------
Exception in thread "main" java.lang.NoClassDefFoundError: javax/inject/Provider
at org.apache.maven.cli.MavenCli.container(MavenCli.java:545)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:281)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: java.lang.ClassNotFoundException: javax.inject.Provider
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:271)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:247)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:239)
... 11 more
I read a few forums and they all suggest upgrading the JDK, but I am already on the latest version. :-(
Can someone please guide me with this issue?
The same Maven version works fine on a Windows machine with the same JDK.
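One hedged thing to check: the constituent listing above does include javax.inject.jar, yet the class is still not found, which on Debian/Ubuntu-style installs can happen when that entry is a broken symlink. A quick sketch (package names assume a Debian/Ubuntu guest):
# follow the symlink; a dangling link would explain the missing class
ls -l /usr/share/maven/lib/javax.inject.jar
# if it is broken, reinstalling the providing packages usually repairs it
sudo apt-get install --reinstall libatinject-jsr330-api-java maven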
I'm trying to run Spark on Windows 10 with Hadoop, using the official Microsoft terminal cmd.exe.
I don't have a problem with Hadoop: the installation and startup are OK.
I'm using Java 8 x64 (jdk1.8.0_92).
When I start Spark with the spark-shell command, I get the Java error below:
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.NoClassDefFoundError: Could not initialize class scala.tools.fusesource_embedded.jansi.internal.Kernel32
at scala.tools.fusesource_embedded.jansi.internal.WindowsSupport.getConsoleMode(WindowsSupport.java:50)
at scala.tools.jline_embedded.WindowsTerminal.getConsoleMode(WindowsTerminal.java:204)
at scala.tools.jline_embedded.WindowsTerminal.init(WindowsTerminal.java:82)
at scala.tools.jline_embedded.TerminalFactory.create(TerminalFactory.java:101)
at scala.tools.jline_embedded.TerminalFactory.get(TerminalFactory.java:158)
at scala.tools.jline_embedded.console.ConsoleReader.(ConsoleReader.java:229)
at scala.tools.jline_embedded.console.ConsoleReader.(ConsoleReader.java:221)
at scala.tools.jline_embedded.console.ConsoleReader.(ConsoleReader.java:209)
at scala.tools.nsc.interpreter.jline_embedded.JLineConsoleReader.(JLineReader.scala:61)
at scala.tools.nsc.interpreter.jline_embedded.InteractiveReader.(JLineReader.scala:33)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:865)
at scala.tools.nsc.interpreter.ILoop$$anonfun$scala$tools$nsc$interpreter$ILoop$$instantiate$1$1.apply(ILoop.scala:862)
at scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$mkReader$1(ILoop.scala:871)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15$$anonfun$apply$8.apply(ILoop.scala:875)
at scala.util.Try$.apply(Try.scala:192)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
at scala.tools.nsc.interpreter.ILoop$$anonfun$15.apply(ILoop.scala:875)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$$anonfun$map$1.apply(Stream.scala:418)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1233)
at scala.collection.immutable.Stream$Cons.tail(Stream.scala:1223)
at scala.collection.immutable.Stream.collect(Stream.scala:435)
at scala.tools.nsc.interpreter.ILoop.chooseReader(ILoop.scala:877)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$2.apply(ILoop.scala:916)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply$mcZ$sp(ILoop.scala:916)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.tools.nsc.interpreter.ILoop$$anonfun$process$1.apply(ILoop.scala:911)
at scala.reflect.internal.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:97)
at scala.tools.nsc.interpreter.ILoop.process(ILoop.scala:911)
at org.apache.spark.repl.Main$.main(Main.scala:49)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I get exactly the same stack trace when opening the Scala console in NetBeans. If I type any Scala expression below the stack trace, it works fine though.
When I ran a query on the Hive console in debug mode, I got the error listed below. I'm using hive-1.2.1 and spark-1.5.1; I checked the hive-exec jar, and it does contain the class definition org/apache/hive/spark/client/Job.
Caused by: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99)
at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:96)
at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:327)
... 15 more
And finally the query fails with:
"ERROR spark.SparkTask: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'"*
How can I resolve this issue?
In the hive-1.2.1 pom.xml, the spark.version is 1.3.1. So the easy way is to download a spark-1.3.1-bin-hadoop build from spark.apache.org, then add its path to hive-site.xml like this:
<property>
  <name>spark.home</name>
  <value>/path/spark-1.3.1-bin-hadoop2.4</value>
</property>
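For completeness, a hedged companion snippet: running Hive on Spark also requires switching Hive's execution engine to Spark, via the standard hive.execution.engine property in the same hive-site.xml:
<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>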