spark-jobserver failing to build with Spark 2.0

I am trying to run spark-jobserver with Spark 2.0. I cloned the spark-2.0-preview branch from the GitHub repository and followed the deployment guide, but when I try to deploy the server using bin/server_deploy.sh I get a compilation error:
Error:
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:4: cannot find symbol
[error] symbol: class DataFrame
[error] location: package org.apache.spark.sql
[error] import org.apache.spark.sql.DataFrame;
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java:13: java.lang.Object cannot be converted to org.apache.spark.sql.Row[]
[error] return sc.sql(data.getString("sql")).collect();
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestLoaderJob.java:25: cannot find symbol
[error] symbol: class DataFrame
[error] location: class spark.jobserver.JHiveTestLoaderJob
[error] final DataFrame addrRdd = sc.sql("SELECT * FROM default.test_addresses");
[error] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JSqlTestJob.java:13: array required, but java.lang.Object found
[error] Row row = sc.sql("select 1+1").take(1)[0];
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Some input files use or override a deprecated API.
[info] /spark-jobserver/job-server-extras/src/main/java/spark/jobserver/JHiveTestJob.java: Recompile with -Xlint:deprecation for details.
[error] (job-server-extras/compile:compileIncremental) javac returned nonzero exit code
Did I forget to add some dependencies?

I had a similar issue. I found that it is a bug caused by the changes in the Spark API from 1.x to 2.x; there is an open issue for it on GitHub: https://github.com/spark-jobserver/spark-jobserver/issues/760
I made a quick fix that solved the issue for me, so I can now deploy the jobserver, and I submitted a pull request for it: https://github.com/spark-jobserver/spark-jobserver/pull/762
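For context, the compile errors come from two Spark 2.x changes that hit the Java code in job-server-extras: org.apache.spark.sql.DataFrame no longer exists as a Java class (sql() now returns Dataset<Row>), and collect()/take() appear to return Object from Java because of type erasure. The snippet below is only a minimal sketch of the kind of change involved, not the contents of the linked pull request; the class and variable names are made up for illustration.

import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

// Hypothetical example of porting the 1.x-style calls to the Spark 2.x Java API.
public class SparkTwoPortSketch {
    public static void run(SparkSession spark) {
        // 1.x: DataFrame addrRdd = sc.sql("SELECT * FROM default.test_addresses");
        // 2.x: sql() returns Dataset<Row> instead of DataFrame
        Dataset<Row> addresses = spark.sql("SELECT * FROM default.test_addresses");

        // 1.x: Row[] rows = sc.sql(...).collect();
        // 2.x (from Java): use collectAsList(), since collect() erases to Object
        List<Row> rows = addresses.collectAsList();

        // 1.x: Row row = sc.sql("select 1+1").take(1)[0];
        // 2.x (from Java): use takeAsList(n) for the same reason
        Row row = spark.sql("select 1+1").takeAsList(1).get(0);

        System.out.println(rows.size() + " rows; 1+1 = " + row.get(0));
    }
}

The actual fix in the pull request may look different (for example it may keep the HiveContext-based entry points); this sketch only shows why the errors above appear and what the 2.x equivalents are.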

Related

Error while compiling OpenTSDB for Cassandra using AsyncCassandra

I am following the link below to build AsyncCassandra:
https://github.com/OpenTSDB/asynccassandra
But I am getting the following error:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /root/asynccassandra/src/org/hbase/async/HBaseClient.java:[304,34] error: cannot find symbol
[ERROR] symbol: variable ConsistencyLevel
location: class HBaseClient
/root/asynccassandra/src/org/hbase/async/HBaseClient.java:[303,25] [deprecation] BoundedExponentialBackoff(long,int,int) in BoundedExponentialBackoff has been deprecated
[ERROR] /root/asynccassandra/src/org/hbase/async/HBaseClient.java:[311,35] error: cannot find symbol
It looks like the source code is missing the import
import com.netflix.astyanax.model.ConsistencyLevel;
But in reality, the Astyanax project it depends on has already been retired and shouldn't be used.
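If you still need the build to go through as-is, the missing symbol could presumably be resolved by adding that import to the affected file, assuming the Astyanax artifact is actually on the build classpath; a minimal, hypothetical sketch (the better path, as noted above, is to move off the retired project):

// Hypothetical excerpt from src/org/hbase/async/HBaseClient.java:
// add the missing import alongside the existing ones; this only compiles
// if com.netflix.astyanax is still declared as a dependency of the build.
import com.netflix.astyanax.model.ConsistencyLevel;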

Scala syntax errors when building Spark

I cloned a fresh copy of the branch-2.0 branch of Spark from GitHub onto a CentOS 7 system. When executing the suggested command to build from source,
./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
I get the following errors:
[INFO] Total time: 5.368 s (Wall Clock)
[INFO] Finished at: 2016-08-18T11:56:49-05:00
[INFO] Final Memory: 71M/1963M
[INFO] ------------------------------------------------------------------------
[error] /home/rprechelt/ada/spark/common/tags/src/main/scala/org/apache/spark/annotation/Since.scala:30: error writing class Since: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org/apache/spark/annotation/Since.class: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org is not a directory
[error] private[spark] class Since(version: String) extends StaticAnnotation
[error] ^
[error] /home/rprechelt/ada/spark/common/tags/src/main/scala/org/apache/spark/annotation/package.scala:25: error writing package object annotation: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org/apache/spark/annotation/package.class: /home/rprechelt/ada/spark/common/tags/target/scala-2.11/classes/org is not a directory
[error] package object annotation
[error] ^
[error] two errors found
[error] Compile failed at Aug 18, 2016 11:56:49 AM [0.358s]
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-sketch_2.10: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
I'm at a loss for what could be occurring here - I've tried building it with different versions of Scala, but they all report the same error.
Does anyone have any suggestions about how I might go about fixing this?
I got the same problem; it happens because the zinc server was started by another user. Try killing the zinc process and then starting the build again.
It's basically an access error: a zinc server started by another user is trying to write to a directory it can't write to. You can find the zinc process (and the user it belongs to) with the command below, then kill it and rerun the build.
sudo ps -ef | grep zinc

Build failed for DataStax spark-cassandra-connector

I was trying to build the spark-cassandra-connector and followed this link:
http://www.planetcassandra.org/blog/kindling-an-introduction-to-spark-with-cassandra/
The link asks you to download the connector from git and build it using sbt, but when I try to run the command ./sbt/sbt assembly, it throws the following exception:
Launching sbt from sbt/sbt-launch-0.13.8.jar
[info] Loading project definition from /home/naresh/Desktop/spark-cassandra-connector/project
Using releases: https://oss.sonatype.org/service/local/staging/deploy/maven2 for releases
Using snapshots: https://oss.sonatype.org/content/repositories/snapshots for snapshots
Scala: 2.10.5 [To build against Scala 2.11 use '-Dscala-2.11=true']
Scala Binary: 2.10
Java: target=1.7 user=1.7.0_79
[info] Set current project to root (in build file:/home/naresh/Desktop/spark-cassandra-connector/)
[warn] Credentials file /home/hduser/.ivy2/.credentials does not exist
[warn] Credentials file /home/hduser/.ivy2/.credentials does not exist
[warn] Credentials file /home/hduser/.ivy2/.credentials does not exist
[warn] Credentials file /home/hduser/.ivy2/.credentials does not exist
[warn] Credentials file /home/hduser/.ivy2/.credentials does not exist
[info] Compiling 140 Scala sources and 1 Java source to /home/naresh/Desktop/spark-cassandra-connector/spark-cassandra-connector/target/scala-2.10/classes...
[error] /home/naresh/Desktop/spark-cassandra-connector/spark-cassandra-connector/src/main/scala/org/apache/spark/sql/cassandra/CassandraCatalog.scala:48: not found: value processTableIdentifier
[error] val id = processTableIdentifier(tableIdentifier).reverse.lift
[error] ^
[error] /home/naresh/Desktop/spark-cassandra-connector/spark-cassandra-connector/src/main/scala/org/apache/spark/sql/cassandra/CassandraCatalog.scala:134: value toSeq is not a member of org.apache.spark.sql.catalyst.TableIdentifier
[error] cachedDataSourceTables.refresh(tableIdent.toSeq)
[error] ^
[error] /home/naresh/Desktop/spark-cassandra-connector/spark-cassandra-connector/src/main/scala/org/apache/spark/sql/cassandra/CassandraSQLContext.scala:94: not found: value BroadcastNestedLoopJoin
[error] BroadcastNestedLoopJoin
[error] ^
[error] three errors found
[info] Compiling 11 Scala sources to /home/naresh/Desktop/spark-cassandra-connector/spark-cassandra-connector-embedded/target/scala-2.10/classes...
[warn] /home/naresh/Desktop/spark-cassandra-connector/spark-cassandra-connector-embedded/src/main/scala/com/datastax/spark/connector/embedded/SparkTemplate.scala:69: value actorSystem in class SparkEnv is deprecated: Actor system is no longer supported as of 1.4.0
[warn] def actorSystem: ActorSystem = SparkEnv.get.actorSystem
[warn] ^
[warn] one warning found
[error] (spark-cassandra-connector/compile:compileIncremental) Compilation failed
[error] Total time: 27 s, completed 4 Nov, 2015 12:34:33 PM
This works for me:
run mvn -DskipTests clean package
You can find the Spark build command in the README.md file in your Spark directory.
Before running that command, you'll need to configure Maven to use more memory than usual by setting MAVEN_OPTS:
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

sbt on linux not accepting -jvm-debug 5005

I'm using sbt launcher version 0.13.7 on Arch Linux from the official Arch repositories. I'm trying to debug a Scala app using IntelliJ.
Everywhere else I looked, including other Stack Overflow questions, seemed to say that Linux versions of sbt can be debugged by simply invoking "sbt -jvm-debug 5005".
However, when I do this, I get a long string of errors:
[warn] The `-` command is deprecated in favor of `onFailure` and will be removed in 0.14.0
[error] Expected letter
[error] Expected symbol
[error] Expected '!'
[error] Expected '+'
[error] Expected '++'
[error] Expected 'debug'
[error] Expected 'info'
[error] Expected 'warn'
[error] Expected 'error'
[error] Expected ';'
[error] Expected end of input.
[error] Expected '--'
[error] Expected 'show'
[error] Expected 'all'
[error] Expected '*'
[error] Expected '{'
[error] Expected project ID
[error] Expected configuration
[error] Expected key
[error] Expected '-'
[error] 5005
[error] ^
[error] Not a valid command: jvm-debug
[error] Not a valid project ID: jvm-debug
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: jvm-debug
[error] jvm-debug
[error] ^
Why is this, and how can I set an sbt project to debug on a port?
It appears that this works:
export SBT_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005" && sbt
However, I would still like to know why "sbt -jvm-debug 5005" isn't working on Arch, if anyone knows.
The question is about sbt on Linux, but I ran into the same issue on OS X.
sbt 0.13.9 on OS X neither accepts -jvm-debug nor honors the SBT_OPTS environment variable as presented in the accepted answer.
However, this works:
export JAVA_OPTS="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5005"

Trouble generating CXF client code (wsdl2java) with Axis2 service

So I have a vendor app whose services were all created with Axis2. For whatever reason, if I generate the client code using Axis2's WSDL2Java it works just dandy, but whenever I use CXF to generate it I get errors all over the place, and I don't know how to make CXF work:
[ERROR] Failed to execute goal org.apache.cxf:cxf-codegen-plugin:2.6.0:wsdl2java (generate-sources) on project unitoffers: Thrown by JAXB:
[ERROR] Thrown by JAXB:
[ERROR] undefined simple or complex type 'xs:EJBException'
[ERROR] at line 1760 column 21 of schema file:/C:/Users/Me/myproject/src/main/resources/wsdl/ObjectHierarchy.wsdl
[ERROR]
[ERROR] undefined simple or complex type 'xs:EJBException'
[ERROR] at line 1831 column 21 of schema file:/C:/Users/Me/myproject/src/main/resources/wsdl/ObjectHierarchy.wsdl
[ERROR]
[ERROR] undefined simple or complex type 'xs:EJBException'
[ERROR] at line 1852 column 21 of schema file:/C:/Users/Me/myproject/src/main/resources/wsdl/ObjectHierarchy.wsdl
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
Is Axis2 doing something really funky that CXF can't interpret? As mentioned, I can generate the code with Axis2, but almost all of my other services are CXF, and I'd prefer not to mix-and-match services in my projects (not to mention that some projects would require having BOTH Axis2 and CXF in order to work, so that would make things messy).
