I am currently having an issue when running my Spark job remotely on an HDInsight cluster:
My project has a dependency on netty-all, and here is what I explicitly specify for it in the POM file:
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.51.Final</version>
</dependency>
The final built jar includes this package at the specified version, and running the Spark job on my local machine works fine. However, when I try to run it on the remote HDInsight cluster, the job throws the following exception:
java.lang.NoSuchMethodError: io.netty.handler.ssl.SslProvider.isAlpnSupported(Lio/netty/handler/ssl/SslProvider;)Z
I believe this is due to a Netty version mismatch: Spark is picking up the old Netty version (netty-all-4.1.17) from its default classpath on the remote cluster rather than the newer Netty package bundled in the uber jar.
I have tried different ways to resolve this issue, but none of them has worked well:
Relocating classes using the Maven Shade plugin (a sketch of a typical relocation configuration is included after this list):
More details on that attempt and its issues are here - Missing Abstract Class using Maven Shade Plugin for Relocating Classes
Spark configurations:
spark.driver.extraClassPath=<path to netty-all-4.1.50.Final.jar>
spark.executor.extraClassPath=<path to netty-all-4.1.50.Final.jar>
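For reference, a minimal sketch of what a Shade relocation for Netty typically looks like (placed under build/plugins); the shaded package prefix here (myshaded) is just an illustrative placeholder, not the configuration from the linked question:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <relocation>
                        <!-- rewrite io.netty.* classes in the uber jar under a private prefix -->
                        <pattern>io.netty</pattern>
                        <shadedPattern>myshaded.io.netty</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>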
I would like to know if there are any other solutions to this issue, or whether there are any steps missing here.
You will need to ensure you only have Netty 4.1.50.Final or higher on the classpath.
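One way to check and enforce that on the application side with Maven is to pin netty-all through dependencyManagement so every transitive reference resolves to the same version (a sketch; note that this by itself does not override the Netty jars that ship with the cluster's Spark installation):

<dependencyManagement>
    <dependencies>
        <dependency>
            <!-- force all transitive netty-all references to one version -->
            <groupId>io.netty</groupId>
            <artifactId>netty-all</artifactId>
            <version>4.1.51.Final</version>
        </dependency>
    </dependencies>
</dependencyManagement>

On the cluster side, the usual levers are the extraClassPath settings already listed in the question, or Spark's (experimental) spark.driver.userClassPathFirst / spark.executor.userClassPathFirst options, which control whether classes from the user jar take precedence over Spark's own jars.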
Related
I am starting to test Spark.
I installed Spark on my local machine and ran a local cluster with a single worker. When I tried to execute my job from my IDE by setting the SparkConf as follows:
final SparkConf conf = new SparkConf().setAppName("testSparkfromJava").setMaster("spark://XXXXXXXXXX:7077");
final JavaSparkContext sc = new JavaSparkContext(conf);
final JavaRDD<String> distFile = sc.textFile(Paths.get("").toAbsolutePath().toString() + "/dataSpark/datastores.json");
I got this exception:
java.lang.RuntimeException: java.io.InvalidClassException: org.apache.spark.rpc.netty.RequestMessage; local class incompatible: stream classdesc serialVersionUID = -5447855329526097695, local class serialVersionUID = -2221986757032131007
It can be caused by multiple kinds of incompatibility:
Hadoop version;
Spark version;
Scala version;
...
For me, it was the Scala version: I was using 2.11.x in my IDE, but the official doc says:
Spark runs on Java 7+, Python 2.6+ and R 3.1+. For the Scala API, Spark 1.6.1 uses Scala 2.10. You will need to use a compatible Scala version (2.10.x).
and the x in the doc cannot be smaller than 3 if you are using the latest Java (1.8), which is what causes this.
Hope it helps!
Got it all working with the combination of versions below.
Installed Spark 1.6.2
(verify with bin/spark-submit --version)
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.2</version>
</dependency>
and
Scala 2.10.6 and Java 8.
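For completeness, if Scala is declared explicitly in the same POM, it should be pinned to the matching version (a sketch):

<dependency>
    <!-- matches the Scala 2.10.6 installation and the _2.10 Spark artifact above -->
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.10.6</version>
</dependency>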
Note that it did NOT work, and had a similar class-incompatibility issue, with the versions below:
Scala 2.11.8 and Java 8
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.6.2</version>
</dependency>
It looks like your installed Spark version is not the same as the Spark version used in your IDE.
If you are using Maven, just compare the version of the dependency declared in pom.xml with the output of bin/spark-submit --version and make sure they are the same.
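For example, if bin/spark-submit --version reports 2.0.0, a shared property makes it easy to keep the POM in step with the installed engine (a sketch; 2.0.0 and the _2.11 suffix are just illustrative values):

<properties>
    <!-- keep this in sync with the version reported by spark-submit -->
    <spark.version>2.0.0</spark.version>
</properties>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
</dependency>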
I faced this issue because my Spark jar dependency was 2.1.0 while the installed Spark engine version was 2.0.0, hence the version mismatch, and so it threw this exception.
The root cause of this problem is a version mismatch between the Spark jar dependency in the project and the installed Spark engine on which the Spark job runs.
Hence, verify both versions and make them identical.
Example: if the spark-core jar version is 2.1.0, the Spark computation engine version must be 2.1.0;
if the spark-core jar version is 2.0.0, the Spark computation engine version must be 2.0.0.
It worked perfectly for me.
I had this problem.
When I ran the code with spark-submit (instead of running it from the IDE), it worked:
./bin/spark-submit --master spark://HOST:PORT target/APP-NAME.jar
While trying to run a piece of code that used some fat JARs (that share some common submodules) built using sbt assembly, I'm running into this nasty java.lang.NoSuchMethodError.
The JAR is built on EMR itself (and not uploaded from some other environment), so a version conflict in libraries / Spark / Scala etc. is unlikely.
My EMR environment:
Release label: emr-5.11.0
Hadoop distribution: Amazon 2.7.3
Applications: Spark 2.2.1, Zeppelin 0.7.3, Ganglia 3.7.2, Hive 2.3.2, Livy 0.4.0, Sqoop 1.4.6, Presto 0.187
Project configurations:
Scala 2.11.11
Spark 2.2.1
SBT 1.0.3
It turned out that the real culprit was the shared submodules in those jars.
Two fat jars built out of projects containing common submodules were leading to this conflict. Removing one of those jars resolved the issue.
I'm not sure whether this conflict happens only under particular circumstances or would always occur when uploading such jars (with the same submodules) to the Zeppelin interpreter, so I'm still waiting for a proper explanation.
I am trying to run an Oozie Spark action that runs Spark 2.x code. I followed the steps mentioned in Hortonworks' HDP documentation; however, the Spark action fails with the error "jackson version too old 2.4.4".
The Spark2 Oozie sharelib contains the 2.6.5 version of the jackson jars, but Oozie's oozie-sharelib contains the 2.4.4 version.
Hence, sometimes the job runs fine, but sometimes it fails citing the version mismatch or a NoSuchMethodError exception (again due to the mismatched jars).
I don't want to delete the 2.4.4 version jars from Oozie's sharelib, but I'm wondering why these jars are added to the classpath when the Spark action is running. Is there a way to only add jars from /user/oozie/share/lib//spark2 and restrict any other jars from getting added to the classpath?
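For reference, the sharelib selection that the HDP steps rely on is the standard Oozie property oozie.action.sharelib.for.spark, normally set in job.properties or in the Spark action's <configuration> block (a sketch of the property element):

<property>
    <!-- resolve the spark action's jars from the spark2 sharelib directory -->
    <name>oozie.action.sharelib.for.spark</name>
    <value>spark2</value>
</property>

Whether this alone keeps the older jackson jars from the default Oozie sharelib off the classpath depends on the Oozie version and how the launcher classpath is assembled, so treat it as a starting point rather than a guaranteed fix.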
I pulled the latest source from the Spark repository and built it locally. It works great from an interactive shell like spark-shell or spark-sql.
Now I want to connect Zeppelin to my Spark 1.5, according to this install manual. I published the custom Spark build to the local Maven repository and set the custom Spark version in the Zeppelin build command. The build process finished successfully, but when I try to run basic things like sc inside the notebook, it throws:
akka.ConfigurationException: Akka JAR version [2.3.11] does not match the provided config version [2.3.4]
Version 2.3.4 is set in pom.xml and spark/pom.xml, but simply changing them won't even let me get a build.
If I rebuild Zeppelin with the standard -Dspark.version=1.4.1, everything works.
Update 2016-01
Spark 1.6 support has landed in master and is available under the -Pspark-1.6 profile.
Update 2015-09
Spark 1.5 support has landed in master and is available under the -Pspark-1.5 profile.
Work on supporting Spark 1.5 in Apache Zeppelin (incubating) was done in this PR, apache/incubator-zeppelin#269, which will land in master soon.
For now, building from the Spark_1.5 branch with -Pspark-1.5 should do the trick.
I am a new user of Maven, and I am trying to use it to build Apache Spark on Amazon EC2 VMs. I have manually installed Java version 1.7.0 on the VMs. However, when I ran Maven, the following error occurred:
Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) on project spark-core_2.10: Execution scala-test-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.0:testCompile failed. CompileFailed
Since I think a Java version mismatch is the likely cause of the compile problem, I opened up Spark's POM file for Maven, and it declares Java-related versions in two separate places:
<java.version>1.6</java.version>
and
<aws.java.sdk.version>1.8.3</aws.java.sdk.version>
What are the differences between these two versions?
Which one should be edited to resolve the Java version mismatch?
They are two different things:
<java.version>1.6</java.version>
is the Java version used, and
<aws.java.sdk.version>1.8.3</aws.java.sdk.version>
is the version of the AWS SDK for Java that is used.
The minimum requirement of AWS SDK 1.9 is Java 1.6+, so there are no compatibility issues.
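To make the distinction concrete, <java.version> is the property that the build feeds into the compiler configuration, so it is the one that matters for the compile step; the wiring typically looks roughly like this (a sketch of the usual Maven pattern, not the exact Spark POM):

<properties>
    <java.version>1.6</java.version>
</properties>

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
        <!-- the compiler source/target levels come from java.version -->
        <source>${java.version}</source>
        <target>${java.version}</target>
    </configuration>
</plugin>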