Failed to build Spark 2.4.3 against Hadoop 3.2.0

I'm building Spark 2.4.3 to make it compatible with the latest Hadoop, 3.2.0.
The source code was downloaded from https://www.apache.org/dyn/closer.lua/spark/spark-2.4.3/spark-2.4.3.tgz
The build command is ./build/mvn -Pyarn -Phadoop-3.2 -Dhadoop.version=3.2.0 -DskipTests clean package
The build result is:
[INFO] Spark Project Parent POM ........................... SUCCESS [ 1.761 s]
[INFO] Spark Project Tags ................................. SUCCESS [ 1.221 s]
[INFO] Spark Project Sketch ............................... SUCCESS [ 0.551 s]
[INFO] Spark Project Local DB ............................. SUCCESS [ 0.608 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 1.558 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 0.631 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 0.444 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 2.501 s]
[INFO] Spark Project Core ................................. SUCCESS [ 13.536 s]
[INFO] Spark Project ML Local Library ..................... SUCCESS [ 0.549 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 1.614 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 3.332 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [ 14.271 s]
[INFO] Spark Project SQL .................................. SUCCESS [ 13.008 s]
[INFO] Spark Project ML Library ........................... SUCCESS [ 7.923 s]
[INFO] Spark Project Tools ................................ SUCCESS [ 0.187 s]
[INFO] Spark Project Hive ................................. SUCCESS [ 6.664 s]
[INFO] Spark Project REPL ................................. SUCCESS [ 1.285 s]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 4.824 s]
[INFO] Spark Project YARN ................................. SUCCESS [ 3.020 s]
[INFO] Spark Project Assembly ............................. SUCCESS [ 1.558 s]
[INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [ 1.411 s]
[INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [ 1.573 s]
[INFO] Spark Project Examples ............................. SUCCESS [ 1.702 s]
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [ 5.969 s]
[INFO] Spark Avro ......................................... SUCCESS [ 0.702 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:32 min
[INFO] Finished at: 2019-07-31T18:56:24+08:00
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "hadoop-3.2" could not be activated because it does not exist.
I expected an all-in-one compressed archive like spark-2.4.3-bin-hadoop3.2.tgz to be generated under the build directory, just like the binary that can be downloaded from the official site, https://www.apache.org/dyn/closer.lua/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz.
How can I get rid of the warning "The requested profile "hadoop-3.2" could not be activated because it does not exist", and what does it mean?

Caution: what you are trying to do could result in a very unstable environment if you don't know what you are doing.
That said, the Spark 2.4.x stable release does not have a hadoop-3.2 profile; it has hadoop-3.1. That is why Maven reports the profile as nonexistent.
You will need to pull code from master to achieve what you want.
If your sole intention is to make Spark 2.4.3 compatible with Hadoop 3.2, you could look at the profile in master along with the relevant changes and cherry-pick those into your own workspace.
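As a concrete sketch, assuming the hadoop-3.1 profile mentioned above is present in your 2.4.3 tree (you can list the available profiles with ./build/mvn help:all-profiles; whether -Dhadoop.version=3.2.0 actually works against that profile is untested):
$ ./build/mvn -Pyarn -Phadoop-3.1 -Dhadoop.version=3.2.0 -DskipTests clean package
Also note that plain mvn package never produces a spark-2.4.3-bin-*.tgz archive; the official binaries are built with the distribution script in the source tree, so to get the all-in-one tarball you would run something like:
$ ./dev/make-distribution.sh --name hadoop3.1 --tgz -Pyarn -Phadoop-3.1 -Dhadoop.version=3.2.0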

Related

How to put data in Ignite Cache from Spark

I chain several Spark jobs that compute RDDs, and at the end I would like to put some of this data into an Ignite cache. Unfortunately, I got an error:
java.lang.ClassCastException: org.apache.ignite.internal.processors.cache.IgniteCacheProxyImpl cannot be cast to org.apache.ignite.internal.processors.cache.GatewayProtectedCacheProxy
[info] at org.apache.ignite.internal.processors.cache.GatewayProtectedCacheProxy.equals(GatewayProtectedCacheProxy.java:1715)
[info] at scala.collection.mutable.FlatHashTable$class.findElemImpl(FlatHashTable.scala:131)
[info] at scala.collection.mutable.FlatHashTable$class.containsElem(FlatHashTable.scala:124)
[info] at scala.collection.mutable.HashSet.containsElem(HashSet.scala:40)
[info] at scala.collection.mutable.HashSet.contains(HashSet.scala:57)
[info] at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visit(SerializationDebugger.scala:87)
[info] at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visitExternalizable(SerializationDebugger.scala:142)
[info] at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visit(SerializationDebugger.scala:104)
[info] at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visitSerializable(SerializationDebugger.scala:206)
[info] at org.apache.spark.serializer.SerializationDebugger$SerializationDebugger.visit(SerializationDebugger.scala:108)
[info] at org.apache.spark.serializer.SerializationDebugger$.find(SerializationDebugger.scala:67)
[info] at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:41)
[info] at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
[info] at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
[info] at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:400)
[info] at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:393)
[info] at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:162)
[info] at org.apache.spark.SparkContext.clean(SparkContext.scala:2326)
[info] at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:371)
[info] at org.apache.spark.rdd.RDD$$anonfun$map$1.apply(RDD.scala:370)
[info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[info] at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[info] at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
[info] at org.apache.spark.rdd.RDD.map(RDD.scala:370).........
So my question is: how do I put data coming from a Spark RDD into a specific Ignite cache, in our case an Ignite cache with 3rd-party persistence (a cache store implemented on Postgres)?
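A minimal sketch of one common approach, assuming the ignite-spark module is on the classpath: write through an IgniteContext so that no IgniteCache proxy is captured in a Spark closure. The cache name "myCache", the app name, and the String/Int key-value types below are hypothetical placeholders; the cache configured with your Postgres-backed store would go in their place.
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.ignite.spark.IgniteContext
import org.apache.spark.{SparkConf, SparkContext}

object SaveToIgnite {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ignite-save"))
    // IgniteContext starts Ignite clients on the driver and executors, so the
    // cache proxy itself is never serialized inside a Spark closure
    // (serializing the proxy is what triggers the ClassCastException above).
    val ic = new IgniteContext(sc, () => new IgniteConfiguration())
    // Example data; any RDD of key-value pairs works here.
    val pairs = sc.parallelize(1 to 100).map(i => (i.toString, i))
    // fromCache returns an IgniteRDD view of the named cache; savePairs
    // writes the pair RDD into that cache.
    ic.fromCache[String, Int]("myCache").savePairs(pairs)
    sc.stop()
  }
}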

livy-server build failing on Windows when running mvn -e clean install -DskipTests in the cmd prompt

I'm running mvn -e clean install -DskipTests in cmd on Windows to set up Apache Livy, and I get a livy-server build failure. The following are the error logs:
[INFO] Livy Project Parent POM ............................ SUCCESS [ 3.803 s]
[INFO] livy-api ........................................... SUCCESS [ 9.125 s]
[INFO] livy-client-common ................................. SUCCESS [ 5.708 s]
[INFO] livy-test-lib ...................................... SUCCESS [ 7.374 s]
[INFO] livy-rsc ........................................... SUCCESS [ 9.376 s]
[INFO] multi-scala-project-root ........................... SUCCESS [ 0.406 s]
[INFO] livy-core-parent ................................... SUCCESS [ 0.640 s]
[INFO] livy-core_2.11 ..................................... SUCCESS [22:01 min]
[INFO] livy-repl-parent ................................... SUCCESS [ 3.447 s]
[INFO] livy-repl_2.11 ..................................... SUCCESS [ 22.895 s]
[INFO] livy-server ........................................ FAILURE [ 0.203 s]
[INFO] livy-assembly ...................................... SKIPPED
[INFO] livy-client-http ................................... SKIPPED
[INFO] livy-scala-api-parent .............................. SKIPPED
[INFO] livy-scala-api_2.11 ................................ SKIPPED
[INFO] livy-integration-test .............................. SKIPPED
[INFO] livy-coverage-report ............................... SKIPPED
[INFO] livy-examples ...................................... SKIPPED
[INFO] livy-python-api .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:04 min
[INFO] Finished at: 2019-05-03T18:12:32+05:30
[INFO] ------------------------------------------------------------------------
The logs show that an Ant BuildException occurred:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.8:run (default) on project livy-server: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "bash" (in directory "C:\Windows\System32\incubator-livy\server"): CreateProcess error=2, The system cannot find the file specified
[ERROR] around Ant part ...<exec executable="bash">... @ 4:27 in C:\Windows\System32\incubator-livy\server\target\antrun\build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.8:run (default) on project livy-server: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "bash" (in directory "C:\Windows\System32\incubator-livy\server"): CreateProcess error=2, The system cannot find the file specified
around Ant part ...<exec executable="bash">... @ 4:27 in C:\Windows\System32\incubator-livy\server\target\antrun\build-main.xml
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "bash" (in directory "C:\Windows\System32\incubator-livy\server"): CreateProcess error=2, The system cannot find the file specified
around Ant part ...<exec executable="bash">... @ 4:27 in C:\Windows\System32\incubator-livy\server\target\antrun\build-main.xml
Can anyone help resolve this?
Livy doesn't support Windows yet. If you read your log message carefully, it mentions bash, which isn't available in a plain Windows command prompt.
When I ran the command 'mvn clean package -DskipTests', I got the error below:
[WARNING] Rule 2: org.apache.maven.plugins.enforcer.RequireOS failed with message:
OS Arch: amd64 Family: dos Name: windows 10 Version: 10.0 is not allowed by Family=unix
Also, if you download the binaries from the Apache Livy website, they only contain shell scripts, so you will need to try alternatives.
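One workaround sketch, assuming you have WSL (or another bash-providing environment such as Cygwin or Git Bash) with Java and Maven installed inside it: run the build from there instead of cmd, since both the Family=unix enforcer rule and the <exec executable="bash"> step expect a Unix-like environment. The checkout path below is hypothetical:
$ wsl
$ cd ~/incubator-livy
$ mvn clean install -DskipTests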

Build failure while building Apache Zeppelin

I was installing Apache Zeppelin with Spark, and while running the Maven install command I got the following error for the Zeppelin: web Application module:
[ERROR] error Command failed with exit code 1.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Zeppelin ........................................... SUCCESS [ 50.394 s]
[INFO] Zeppelin: Interpreter .............................. SUCCESS [ 31.632 s]
[INFO] Zeppelin: Zengine .................................. SUCCESS [ 24.134 s]
[INFO] Zeppelin: Display system apis ...................... SUCCESS [ 19.607 s]
[INFO] Zeppelin: Spark dependencies ....................... SUCCESS [01:33 min]
[INFO] Zeppelin: Spark .................................... SUCCESS [ 29.058 s]
[INFO] Zeppelin: Markdown interpreter ..................... SUCCESS [ 5.796 s]
[INFO] Zeppelin: Angular interpreter ...................... SUCCESS [ 4.361 s]
[INFO] Zeppelin: Shell interpreter ........................ SUCCESS [ 4.827 s]
[INFO] Zeppelin: Livy interpreter ......................... SUCCESS [ 50.227 s]
[INFO] Zeppelin: HBase interpreter ........................ SUCCESS [ 11.682 s]
[INFO] Zeppelin: Apache Pig Interpreter ................... SUCCESS [ 10.991 s]
[INFO] Zeppelin: PostgreSQL interpreter ................... SUCCESS [ 5.541 s]
[INFO] Zeppelin: JDBC interpreter ......................... SUCCESS [ 6.663 s]
[INFO] Zeppelin: File System Interpreters ................. SUCCESS [ 6.304 s]
[INFO] Zeppelin: Flink .................................... SUCCESS [ 13.449 s]
[INFO] Zeppelin: Apache Ignite interpreter ................ SUCCESS [ 5.955 s]
[INFO] Zeppelin: Kylin interpreter ........................ SUCCESS [ 4.915 s]
[INFO] Zeppelin: Python interpreter ....................... SUCCESS [ 6.109 s]
[INFO] Zeppelin: Lens interpreter ......................... SUCCESS [ 11.360 s]
[INFO] Zeppelin: Apache Cassandra interpreter ............. SUCCESS [ 58.287 s]
[INFO] Zeppelin: Elasticsearch interpreter ................ SUCCESS [ 9.617 s]
[INFO] Zeppelin: BigQuery interpreter ..................... SUCCESS [ 5.584 s]
[INFO] Zeppelin: Alluxio interpreter ...................... SUCCESS [ 9.001 s]
[INFO] Zeppelin: Scio ..................................... SUCCESS [ 48.425 s]
[INFO] Zeppelin: web Application .......................... FAILURE [28:26 min]
[INFO] Zeppelin: Server ................................... SKIPPED
[INFO] Zeppelin: Packaging distribution ................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 37:14 min
[INFO] Finished at: 2017-02-01T16:21:39+05:30
[INFO] Final Memory: 224M/1792M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal com.github.eirslett:frontend-maven-plugin:1.3:yarn (yarn install) on project zeppelin-web: Failed to run task: 'yarn install --no-lockfile --https-proxy=http://sg0227823:***@tulsa-proxy.sabre.com:80 --proxy=http://sg0227823:***@tulsa-proxy.sabre.com:80' failed. (error code 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :zeppelin-web
As @lambda has mentioned in the comments, it was fixed by his PR 2016 (github.com/apache/zeppelin/pull/2016), but it seems the issue is back in the latest version of Zeppelin.
I resolved this by installing system-wide npm and nodejs:
$ sudo yum install nodejs npm
(Note: if you're using RHEL, both packages are available only through the non-default EPEL yum repository, https://fedoraproject.org/wiki/EPEL.)
But then I was getting "Node version 0.10.48 is not supported, please use Node.js 4.0 or higher." when I manually ran 'yarn install --no-lockfile' (don't confuse Hadoop's yarn command with this yarn command, https://yarnpkg.com/en/, which is used by Node.js). This in turn was resolved by installing the latest stable Node.js:
$ sudo npm install n -g
$ sudo n stable
P.S. It was hard to diagnose what exactly was wrong with the yarn command at first, as Maven just spits out 'error code 1', but you can debug by running that same command manually, for example ./zeppelin-web/node/yarn/dist/bin/yarn install --no-lockfile (assuming you're in the root of the Zeppelin codebase), so that you see the exact problem in the output.

I am trying to build Spark to be able to run the programs, but it does not seem to work

This is what happens when I try to run a sample program in Spark:
hduser_@ankit-sve14137cnb:/usr/local/spark$ ./bin/run-example SparkPi 10
Failed to find Spark examples assembly in /usr/local/spark/lib or /usr/local/spark/examples/target
You need to build Spark before running this program
hduser_@ankit-sve14137cnb:/usr/local/spark$ sudo build/mvn -e -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package
Using `mvn` from path: /usr/bin/mvn
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512M; support was removed in 8.0
[INFO] Error stacktraces are turned on.
[INFO] Scanning for projects...
[INFO]
[INFO] Reactor Build Order:
[INFO]
[INFO] Spark Project Parent POM
[INFO] Spark Project Test Tags
[INFO] Spark Project Launcher
[INFO] Spark Project Networking
[INFO] Spark Project Shuffle Streaming Service
[INFO] Spark Project Unsafe
[INFO] Spark Project Core
[INFO] Spark Project Bagel
[INFO] Spark Project GraphX
[INFO] Spark Project Streaming
[INFO] Spark Project Catalyst
[INFO] Spark Project SQL
[INFO] Spark Project ML Library
[INFO] Spark Project Tools
[INFO] Spark Project Hive
[INFO] Spark Project Docker Integration Tests
[INFO] Spark Project REPL
[INFO] Spark Project YARN Shuffle Service
[INFO] Spark Project YARN
[INFO] Spark Project Assembly
[INFO] Spark Project External Twitter
[INFO] Spark Project External Flume Sink
[INFO] Spark Project External Flume
[INFO] Spark Project External Flume Assembly
[INFO] Spark Project External MQTT
[INFO] Spark Project External MQTT Assembly
[INFO] Spark Project External ZeroMQ
[INFO] Spark Project External Kafka
[INFO] Spark Project Examples
[INFO] Spark Project External Kafka Assembly
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Parent POM 1.6.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-parent_2.10 ---
[INFO] Deleting /usr/local/spark/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-parent_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-parent_2.10 ---
[INFO] Add Source directory: /usr/local/spark/src/main/scala
[INFO] Add Test Source directory: /usr/local/spark/src/test/scala
[INFO]
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-parent_2.10 ---
[INFO] Dependencies classpath:
/root/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-parent_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-parent_2.10 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-antrun-plugin:1.8:run (create-tmp-dir) @ spark-parent_2.10 ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /usr/local/spark/target/tmp
[INFO] Executed tasks
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile-first) @ spark-parent_2.10 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default) @ spark-parent_2.10 ---
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-parent_2.10 ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.6:test-jar (prepare-test-jar) @ spark-parent_2.10 ---
[INFO] Building jar: /usr/local/spark/target/spark-parent_2.10-1.6.1-tests.jar
[INFO]
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ spark-parent_2.10 ---
[INFO]
[INFO] --- maven-shade-plugin:2.4.1:shade (default) @ spark-parent_2.10 ---
[INFO] Including org.spark-project.spark:unused:jar:1.0.0 in the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO]
[INFO] --- maven-source-plugin:2.4:jar-no-fork (create-source-jar) @ spark-parent_2.10 ---
[INFO]
[INFO] --- maven-source-plugin:2.4:test-jar-no-fork (create-source-jar) @ spark-parent_2.10 ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Test Tags 1.6.1
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-test-tags_2.10 ---
[INFO] Deleting /usr/local/spark/tags/target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-test-tags_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-test-tags_2.10 ---
[INFO] Add Source directory: /usr/local/spark/tags/src/main/scala
[INFO] Add Test Source directory: /usr/local/spark/tags/src/test/scala
[INFO]
[INFO] --- maven-dependency-plugin:2.10:build-classpath (default-cli) @ spark-test-tags_2.10 ---
[INFO] Dependencies classpath:
/root/.m2/repository/org/scala-lang/scala-reflect/2.10.5/scala-reflect-2.10.5.jar:/root/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/root/.m2/repository/org/scala-lang/scala-library/2.10.5/scala-library-2.10.5.jar:/root/.m2/repository/org/scalatest/scalatest_2.10/2.2.1/scalatest_2.10-2.2.1.jar
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-test-tags_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-test-tags_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /usr/local/spark/tags/src/main/resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-test-tags_2.10 ---
[INFO] Using zinc server for incremental compilation
[error] Required file not found: scala-compiler-2.10.5.jar
[error] See zinc -help for information about locating necessary files
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 3.566 s]
[INFO] Spark Project Test Tags ............................ FAILURE [ 0.466 s]
[INFO] Spark Project Launcher ............................. SKIPPED
[INFO] Spark Project Networking ........................... SKIPPED
[INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
[INFO] Spark Project Unsafe ............................... SKIPPED
[INFO] Spark Project Core ................................. SKIPPED
[INFO] Spark Project Bagel ................................ SKIPPED
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project Docker Integration Tests ............. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SKIPPED
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Twitter ..................... SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Project External MQTT ........................ SKIPPED
[INFO] Spark Project External MQTT Assembly ............... SKIPPED
[INFO] Spark Project External ZeroMQ ...................... SKIPPED
[INFO] Spark Project External Kafka ....................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5.128 s
[INFO] Finished at: 2016-04-02T22:46:45+05:30
[INFO] Final Memory: 38M/223M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-test-tags_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-test-tags_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:224)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:862)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:286)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:197)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:145)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 20 more
Caused by: Compile failed via zinc server
at sbt_inc.SbtIncrementalCompiler.zincCompile(SbtIncrementalCompiler.java:136)
at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:86)
at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303)
at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119)
at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99)
at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
... 21 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :spark-test-tags_2.10
I do have the latest maven:
hduser_@ankit-sve14137cnb:/usr/local$ mvn -version
Apache Maven 3.3.3
Maven home: /usr/share/maven
Java version: 1.8.0_77, vendor: Oracle Corporation
Java home: /home/ankit/Downloads/jdk1.8.0_77/jre
Default locale: en_IN, platform encoding: UTF-8
Below are the paths mentioned in my bashrc file:
export JAVA_HOME=/home/ankit/Downloads/jdk1.8.0_77
export HADOOP_HOME=/home/ankit/Downloads/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
#export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
export CASSANDRA_HOME =$CASSANDRA_HOME:/home/hduser_/cassandra
#export PATH = $PATH:$CASSANDRA_HOME/bin
export SCALA_HOME = $SCALA_HOME:/usr/local/scala
export PATH = $SCALA_HOME/bin:$PATH
I am new to Stack Overflow; could someone please advise?
In Maven 3, if you just had a failed download and have since fixed it (e.g. by uploading the jar to a repository), it will have cached the failure. To force a refresh, add -U to the command line. Try refreshing and let me know how it goes.
If the build has already failed once, you need to force a refresh with Maven 3. The command should be (note the -U option):
mvn -U -DskipTests clean package
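If -U alone doesn't help, a common additional step for "Required file not found: scala-compiler-2.10.5.jar" (an assumption here, not verified against this exact setup) is to delete the cached artifact so Maven re-downloads it, since that error usually means the jar in the local repository is missing or corrupt. Since you build with sudo, the local repository is under /root/.m2, as the classpath lines in your log show:
$ sudo rm -rf /root/.m2/repository/org/scala-lang/scala-compiler/2.10.5
$ sudo build/mvn -U -e -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package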

Attempting to build RQB from source, getting an assertion error from testJsList

LOVE RQB!! I want to get the source building. I have cloned the repo with git, and I believe I have downloaded and installed all of the bits, but when I run "mvn clean install" it goes through some machinations, then finally fails with the following:
Failed tests:
[INFO] testJsList(com.redspr.redquerybuilder.js.client.GwtTestBasics)
Here is the output from the stack:
[INFO] Tests run: 10, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 8.325 sec <<< FAILURE!
[INFO] testJsList(com.redspr.redquerybuilder.js.client.GwtTestBasics) Time elapsed: 0.047 sec <<< FAILURE!
[INFO] junit.framework.AssertionFailedError: Remote test failed at 127.0.0.1 / Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19
[INFO] expected: <22>, actual: <14>
[INFO] at junit.framework.Assert.fail(Assert.java:193)
[INFO] at junit.framework.Assert.failNotEquals(Assert.java:198)
[INFO] at junit.framework.Assert.assertEquals(Assert.java:94)
[INFO] at junit.framework.Assert.assertEquals(Assert.java:43)
[INFO] at com.redspr.redquerybuilder.js.client.GwtTestBasics.testJsList(GwtTestBasics.java:192)
[INFO] at com.redspr.redquerybuilder.js.client.__GwtTestBasics_unitTestImpl.doRunTest(__GwtTestBasics_unitTestImpl.java:7)
[INFO] at junit.framework.TestCase.runTest(TestCase.java:62)
[INFO] at com.google.gwt.junit.client.GWTTestCase.runBare(GWTTestCase.java:188)
[INFO] at com.google.gwt.junit.client.GWTTestCase.__doRunTest(GWTTestCase.java:129)
[INFO] at com.google.gwt.junit.client.impl.GWTRunner.runTest(GWTRunner.java:390)
[INFO] at com.google.gwt.junit.client.impl.GWTRunner.doRunTest(GWTRunner.java:318)
[INFO] at com.google.gwt.junit.client.impl.GWTRunner.access$9(GWTRunner.java:312)
[INFO] at com.google.gwt.junit.client.impl.GWTRunner$TestBlockListener.onSuccess(GWTRunner.java:107)
[INFO] at com.google.gwt.junit.client.impl.GWTRunner$TestBlockListener.onSuccess(GWTRunner.java:1)
[INFO] at com.google.gwt.user.client.rpc.impl.RequestCallbackAdapter.onResponseReceived(RequestCallbackAdapter.java:232)
[INFO] at com.google.gwt.http.client.Request.fireOnResponseReceived(Request.java:258)
[INFO] at com.google.gwt.http.client.RequestBuilder$1.onReadyStateChange(RequestBuilder.java:412)
[INFO] at sun.reflect.GeneratedMethodAccessor62.invoke(Unknown Source)
[INFO] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO] at java.lang.reflect.Method.invoke(Method.java:606)
[INFO] at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103)
[INFO] at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71)
[INFO] at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:172)
[INFO] at com.google.gwt.dev.shell.BrowserChannelServer.reactToMessagesWhileWaitingForReturn(BrowserChannelServer.java:338)
[INFO] at com.google.gwt.dev.shell.BrowserChannelServer.invokeJavascript(BrowserChannelServer.java:219)
[INFO] at com.google.gwt.dev.shell.ModuleSpaceOOPHM.doInvoke(ModuleSpaceOOPHM.java:136)
[INFO] at com.google.gwt.dev.shell.ModuleSpace.invokeNative(ModuleSpace.java:571)
[INFO] at com.google.gwt.dev.shell.ModuleSpace.invokeNativeObject(ModuleSpace.java:279)
[INFO] at com.google.gwt.dev.shell.JavaScriptHost.invokeNativeObject(JavaScriptHost.java:91)
[INFO] at com.google.gwt.core.client.impl.Impl.apply(Impl.java)
[INFO] at com.google.gwt.core.client.impl.Impl.entry0(Impl.java:249)
[INFO] at sun.reflect.GeneratedMethodAccessor61.invoke(Unknown Source)
[INFO] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[INFO] at java.lang.reflect.Method.invoke(Method.java:606)
[INFO] at com.google.gwt.dev.shell.MethodAdaptor.invoke(MethodAdaptor.java:103)
[INFO] at com.google.gwt.dev.shell.MethodDispatch.invoke(MethodDispatch.java:71)
[INFO] at com.google.gwt.dev.shell.OophmSessionHandler.invoke(OophmSessionHandler.java:172)
[INFO] at com.google.gwt.dev.shell.BrowserChannelServer.reactToMessages(BrowserChannelServer.java:293)
[INFO] at com.google.gwt.dev.shell.BrowserChannelServer.processConnection(BrowserChannelServer.java:547)
[INFO] at com.google.gwt.dev.shell.BrowserChannelServer.run(BrowserChannelServer.java:364)
[INFO] at java.lang.Thread.run(Thread.java:744)
This is a bug in the unit tests of RedQueryBuilder 0.6.0: the test sets the time in UTC and then checks that the days, hours, and minutes are the same in local time, which is only true in the UK and Lisbon :( The fix should be in 0.7.0. Thanks for finding and raising the bug.
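As a stopgap until 0.7.0, one untested workaround (an assumption based on the diagnosis above, not a confirmed fix) is to run the build with the system timezone forced to UTC so that local time and the UTC value the test sets coincide; on Linux that would be the standard TZ environment variable:
$ TZ=UTC mvn clean install
Whether this reaches the browser that runs the GWT test depends on how the test runner launches it, so skipping the failing test (mvn clean install -DskipTests) may be the more reliable option.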
