SBT shell project in terminal gives error - sbt-native-packager

I am a beginner with the Scala programming language. I followed the sbt documentation and typed the example below into my project, but running it in the sbt shell gives an error. How can I resolve it?
ThisBuild / scalaVersion := "2.13.0"
ThisBuild / organization := "com.example"

val scalaTest = "org.scalatest" %% "scalatest" % "3.2.7"
val gigahorse = "com.eed3si9n" %% "gigahorse-okhttp" % "0.5.0"
val playJson  = "com.typesafe.play" %% "play-json" % "2.6.9"

lazy val hello = (project in file("."))
  .aggregate(helloCore)
  .dependsOn(helloCore)
  .settings(
    name := "Hello",
    libraryDependencies += scalaTest % Test,
  )

lazy val helloCore = (project in file("core"))
  .settings(
    name := "Hello Core",
    libraryDependencies ++= Seq(gigahorse, playJson),
    libraryDependencies += scalaTest % Test,
  )
The above is my build.sbt file. My Scala sources follow; first the Weather object in the core project:
package example.core

import gigahorse._, support.okhttp.Gigahorse
import scala.concurrent._, duration._
import play.api.libs.json._

object Weather {
  lazy val http = Gigahorse.http(Gigahorse.config)

  def weather: Future[String] = {
    val baseUrl    = "https://www.metaweather.com/api/location"
    val locUrl     = baseUrl + "/search/"
    val weatherUrl = baseUrl + "/%s/"
    val rLoc = Gigahorse.url(locUrl).get
      .addQueryString("query" -> "New York")
    import ExecutionContext.Implicits.global
    for {
      loc <- http.run(rLoc, parse)
      woeid = (loc \ 0 \ "woeid").get
      rWeather = Gigahorse.url(weatherUrl format woeid).get
      weather <- http.run(rWeather, parse)
    } yield (weather \\ "weather_state_name")(0).as[String].toLowerCase
  }

  private def parse = Gigahorse.asString andThen Json.parse
}
And hello.scala:
package example

import scala.concurrent._, duration._
import core.Weather

object Hello extends App {
  val w = Await.result(Weather.weather, 10.seconds)
  println(s"Hello! The weather in New York is $w.")
  Weather.http.close()
}
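For context, the heart of Weather.weather is sequencing two dependent Futures with a for-comprehension: the second request cannot be built until the first one's result is known. A minimal self-contained sketch of the same pattern using only scala.concurrent (no Gigahorse; the lookup functions and their return values are hypothetical stand-ins):

```scala
import scala.concurrent._
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

object ForFutures {
  // Hypothetical stand-ins for the two HTTP calls in Weather.weather.
  def lookupWoeid(city: String): Future[Int]    = Future(2459115)
  def lookupWeather(woeid: Int): Future[String] = Future("Light Cloud")

  // The for-comprehension desugars to flatMap, so the second lookup
  // only starts once the first has completed with a woeid.
  def weather(city: String): Future[String] =
    for {
      woeid <- lookupWoeid(city)
      state <- lookupWeather(woeid)
    } yield state.toLowerCase
}

val result = Await.result(ForFutures.weather("New York"), 5.seconds)
println(result) // prints "light cloud"
```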
The error:
mehveen#mehveen-Y11C:~$ cd foo-build
mehveen#mehveen-Y11C:~/foo-build$ touch build.sbt
mehveen#mehveen-Y11C:~/foo-build$ sbt
[info] Loading global plugins from /home/mehveen/.sbt/1.0/plugins
[info] Loading settings for project foo-build-build from plugins.sbt ...
[info] Loading project definition from /home/mehveen/foo-build/project
[info] Loading settings for project hello from build.sbt ...
[info] Set current project to Hello (in build file:/home/mehveen/foo-build/)
[info] sbt server started at local:///home/mehveen/.sbt/1.0/server/704a39d101f4d89588ee/sock
sbt:Hello> run
[info] Updating helloCore...
[warn] module not found: com.typesafe.play#play-json_2.13;2.6.9
[warn] ==== local: tried
[warn] /home/mehveen/.ivy2/local/com.typesafe.play/play-json_2.13/2.6.9/ivys/ivy.xml
[warn] ==== public: tried
[warn] https://repo1.maven.org/maven2/com/typesafe/play/play-json_2.13/2.6.9/play-json_2.13-2.6.9.pom
[warn] ==== local-preloaded-ivy: tried
[warn] /home/mehveen/.sbt/preloaded/com.typesafe.play/play-json_2.13/2.6.9/ivys/ivy.xml
[warn] ==== local-preloaded: tried
[warn] file:////home/mehveen/.sbt/preloaded/com/typesafe/play/play-json_2.13/2.6.9/play-json_2.13-2.6.9.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.typesafe.play#play-json_2.13;2.6.9: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn] Note: Unresolved dependencies path:
[warn] com.typesafe.play:play-json_2.13:2.6.9 (/home/mehveen/foo-build/build.sbt#L19)
[warn] +- com.example:hello-core_2.13:0.1.0-SNAPSHOT
[error] sbt.librarymanagement.ResolveException: unresolved dependency: com.typesafe.play#play-json_2.13;2.6.9: not found
[error] at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:332)
[error] at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error] at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:239)
[error] at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error] at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error] at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error] at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:95)
[error] at xsbt.boot.Locks$GlobalLock.xsbt$boot$Locks$GlobalLock$$withChannelRetries$1(Locks.scala:80)
[error] at xsbt.boot.Locks$GlobalLock$$anonfun$withFileLock$1.apply(Locks.scala:99)
[error] at xsbt.boot.Using$.withResource(Using.scala:10)
[error] at xsbt.boot.Using$.apply(Using.scala:9)
[error] at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:60)
[error] at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:50)
[error] at xsbt.boot.Locks$.apply0(Locks.scala:31)
[error] at xsbt.boot.Locks$.apply(Locks.scala:28)
[error] at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error] at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error] at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:238)
[error] at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error] at sbt.librarymanagement.ivy.IvyDependencyResolution.update(IvyDependencyResolution.scala:20)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:56)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:45)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:93)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:68)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$19(LibraryManagement.scala:106)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:224)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:106)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:89)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:149)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:120)
[error] at sbt.Classpaths$.$anonfun$updateTask$5(Defaults.scala:2561)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:40)
[error] at sbt.std.Transform$$anon$4.work(System.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:269)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:278)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:269)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
[error] at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
[error] at java.base/java.lang.Thread.run(Thread.java:834)
[error] (helloCore / update) sbt.librarymanagement.ResolveException: unresolved dependency: com.typesafe.play#play-json_2.13;2.6.9: not found
[error] Total time: 7 s, completed 16 Apr. 2021, 5:00:08 pm
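The warning log above names the root cause: sbt is looking for play-json_2.13 2.6.9, i.e. play-json 2.6.9 compiled for Scala 2.13, and no such artifact was ever published, because the 2.6.x series predates Scala 2.13. A sketch of the usual fix, assuming any play-json release cross-built for 2.13 works for this example (2.7.4 is one such version):

```scala
// build.sbt -- bump play-json to a release that is published for Scala 2.13.
// Assumption: 2.7.4; any 2.13 cross-built version should resolve.
val playJson = "com.typesafe.play" %% "play-json" % "2.7.4"
```

Alternatively, keeping play-json 2.6.9 would require downgrading scalaVersion to a 2.12.x release that 2.6.9 was built against.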

Related

sbt-assembly - throwing java.lang.ArrayIndexOutOfBoundsException: Index 65536 out of bounds for length 132

Problem details:
In my sbt project I use the dependency
"org.apache.cassandra" % "cassandra-all" % "3.0.27"
to gather Cassandra table sizes via the NodeTool/NodeProbe classes, and I run sbt-assembly to create a fat jar that packages this dependency. But once I upgrade the dependency to
"org.apache.cassandra" % "cassandra-all" % "4.0.0"
the sbt assembly command starts throwing the error below.
I have also raised this with the sbt-assembly project: https://github.com/sbt/sbt-assembly/issues/475
[error] java.lang.ArrayIndexOutOfBoundsException: Index 65536 out of bounds for length 132
[error] at org.objectweb.asm.ClassReader.readLabel(ClassReader.java:2679)
[error] at org.objectweb.asm.ClassReader.createLabel(ClassReader.java:2695)
[error] at org.objectweb.asm.ClassReader.readTypeAnnotations(ClassReader.java:2761)
[error] at org.objectweb.asm.ClassReader.readCode(ClassReader.java:1937)
[error] at org.objectweb.asm.ClassReader.readMethod(ClassReader.java:1514)
[error] at org.objectweb.asm.ClassReader.accept(ClassReader.java:744)
[error] at org.objectweb.asm.ClassReader.accept(ClassReader.java:424)
[error] at com.eed3si9n.jarjar.ScalaSigProcessor.process(ScalaSigProcessor.scala:15)
[error] at com.eed3si9n.jarjar.util.JarProcessorChain.process(JarProcessorChain.java:38)
[error] at com.eed3si9n.jarjar.JJProcessor.process(JJProcessor.scala:108)
[error] at com.eed3si9n.jarjarabrams.Shader$.$anonfun$bytecodeShader$6(Shader.scala:75)
[error] at com.eed3si9n.jarjarabrams.Shader$.$anonfun$shadeDirectory$4(Shader.scala:21)
[error] at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
[error] at scala.collection.Iterator.foreach(Iterator.scala:943)
[error] at scala.collection.Iterator.foreach$(Iterator.scala:943)
[error] at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
[error] at scala.collection.IterableLike.foreach(IterableLike.scala:74)
[error] at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
[error] at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
[error] at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
[error] at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
[error] at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
[error] at com.eed3si9n.jarjarabrams.Shader$.shadeDirectory(Shader.scala:17)
[error] at sbtassembly.Assembly$.$anonfun$assembleMappings$11(Assembly.scala:296)
[error] at scala.collection.parallel.AugmentedIterableIterator.map2combiner(RemainsIterator.scala:116)
[error] at scala.collection.parallel.AugmentedIterableIterator.map2combiner$(RemainsIterator.scala:113)
[error] at scala.collection.parallel.immutable.ParVector$ParVectorIterator.map2combiner(ParVector.scala:66)
[error] at scala.collection.parallel.ParIterableLike$Map.leaf(ParIterableLike.scala:1056)
[error] at scala.collection.parallel.Task.$anonfun$tryLeaf$1(Tasks.scala:53)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] at scala.util.control.Breaks$$anon$1.catchBreak(Breaks.scala:67)
[error] at scala.collection.parallel.Task.tryLeaf(Tasks.scala:56)
[error] at scala.collection.parallel.Task.tryLeaf$(Tasks.scala:50)
[error] at scala.collection.parallel.ParIterableLike$Map.tryLeaf(ParIterableLike.scala:1053)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal(Tasks.scala:160)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal$(Tasks.scala:157)
[error] at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.internal(Tasks.scala:440)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute(Tasks.scala:150)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute$(Tasks.scala:149)
[error] at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:440)
[error] at java.base/java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:189)
[error] at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
[error] at java.base/java.util.concurrent.ForkJoinPool.awaitJoin(ForkJoinPool.java:1708)
[error] at java.base/java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:397)
[error] at java.base/java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:721)
[error] at scala.collection.parallel.ForkJoinTasks$WrappedTask.sync(Tasks.scala:379)
[error] at scala.collection.parallel.ForkJoinTasks$WrappedTask.sync$(Tasks.scala:379)
[error] at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.sync(Tasks.scala:440)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal(Tasks.scala:174)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.internal$(Tasks.scala:157)
[error] at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.internal(Tasks.scala:440)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute(Tasks.scala:150)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute$(Tasks.scala:149)
[error] at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:440)
[error] at java.base/java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:189)
[error] at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
[error] at java.base/java.util.concurrent.ForkJoinTask.doJoin(ForkJoinTask.java:396)
[error] at java.base/java.util.concurrent.ForkJoinTask.join(ForkJoinTask.java:721)
[error] at scala.collection.parallel.ForkJoinTasks$WrappedTask.sync(Tasks.scala:379)
[error] at scala.collection.parallel.ForkJoinTasks$WrappedTask.sync$(Tasks.scala:379)
[error] at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.sync(Tasks.scala:440)
[error] at scala.collection.parallel.ForkJoinTasks.executeAndWaitResult(Tasks.scala:423)
[error] at scala.collection.parallel.ForkJoinTasks.executeAndWaitResult$(Tasks.scala:416)
[error] at scala.collection.parallel.ForkJoinTaskSupport.executeAndWaitResult(TaskSupport.scala:60)
[error] at scala.collection.parallel.ExecutionContextTasks.executeAndWaitResult(Tasks.scala:555)
[error] at scala.collection.parallel.ExecutionContextTasks.executeAndWaitResult$(Tasks.scala:555)
[error] at scala.collection.parallel.ExecutionContextTaskSupport.executeAndWaitResult(TaskSupport.scala:84)
[error] at scala.collection.parallel.ParIterableLike$ResultMapping.leaf(ParIterableLike.scala:960)
[error] at scala.collection.parallel.Task.$anonfun$tryLeaf$1(Tasks.scala:53)
[error] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
[error] at scala.util.control.Breaks$$anon$1.catchBreak(Breaks.scala:67)
[error] at scala.collection.parallel.Task.tryLeaf(Tasks.scala:56)
[error] at scala.collection.parallel.Task.tryLeaf$(Tasks.scala:50)
[error] at scala.collection.parallel.ParIterableLike$ResultMapping.tryLeaf(ParIterableLike.scala:955)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute(Tasks.scala:153)
[error] at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask.compute$(Tasks.scala:149)
[error] at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:440)
[error] at java.base/java.util.concurrent.RecursiveAction.exec(RecursiveAction.java:189)
[error] at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
[error] at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
[error] at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
[error] at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
[error] at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
[error] (datareader / assembly / assembledMappings) java.lang.ArrayIndexOutOfBoundsException: Index 65536 out of bounds for length 132
Any assistance on this is appreciated.
sbt-assembly version used:
"com.eed3si9n" % "sbt-assembly" % "1.2.0"
Thanks in advance.
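A note on the usual direction of the fix: the ArrayIndexOutOfBoundsException is thrown by the ASM ClassReader that sbt-assembly's shading step (jarjar-abrams) uses, which typically means the bundled ASM is too old for the class-file versions inside the upgraded jar. A hedged sketch, assuming a newer sbt-assembly release (version below is an assumption, not from the thread) bundles an ASM that can parse the newer classes:

```scala
// project/plugins.sbt -- assumption: upgrading past 1.2.0 pulls in a
// jarjar-abrams/ASM that can read the class files shipped in cassandra-all 4.x.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")
```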

Error while connecting to JanusGraph

I have the following code:
import org.apache.tinkerpop.gremlin.driver.{Client, Cluster, MessageSerializer}
import org.apache.tinkerpop.gremlin.driver.remote.DriverRemoteConnection
import org.apache.tinkerpop.gremlin.structure.util.empty.EmptyGraph
import org.janusgraph.core.{JanusGraph, JanusGraphFactory}

trait InMemoryConnectScala {
  def messageSerializer(): MessageSerializer = {
    import java.util.Collections
    import org.apache.tinkerpop.gremlin.driver.ser.GryoMessageSerializerV1d0
    import org.janusgraph.graphdb.tinkerpop.JanusGraphIoRegistry
    val config = new java.util.HashMap[String, Object]()
    config.put("ioRegistries", Collections.singletonList(classOf[JanusGraphIoRegistry].getName))
    val serializer = new GryoMessageSerializerV1d0()
    serializer.configure(config, null)
    serializer
  }

  def connect(): JanusGraph = {
    import org.apache.commons.configuration.BaseConfiguration
    val conf = new BaseConfiguration()
    conf.setProperty("storage.backend", "inmemory")
    conf.setProperty("type", "remote")
    JanusGraphFactory.open(conf)
  }
}

val clusterBuilder = Cluster.build.port(8182).serializer(messageSerializer()).addContactPoint("localhost")
val cl = clusterBuilder.create()
val client: Client = cl.connect()
val jg = EmptyGraph.instance.traversal.withRemote(DriverRemoteConnection.using(cl))
val res = client.submit("g.V().count()")
I get the following error when it hits the submit method:
12:25:34.979 [pool-1-thread-1] INFO o.a.t.gremlin.driver.ConnectionPool - Opening connection pool on Host{address=localhost/127.0.0.1:8182, hostUri=ws://localhost:8182/gremlin} with core size of 2
[info] AcmTestSpec *** ABORTED ***
[info] java.lang.RuntimeException: java.lang.RuntimeException: java.util.concurrent.TimeoutException: Timed out while waiting for an available host - check the client configuration and connectivity to the server if this message persists
[info] at org.apache.tinkerpop.gremlin.driver.Client.submit(Client.java:214)
[info] at org.apache.tinkerpop.gremlin.driver.Client.submit(Client.java:198)
[info] at AcmTestSpec.beforeAll(AcmTestSpec.scala:407)
[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
[info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info] at AcmTestSpec.run(AcmTestSpec.scala:60)
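A note on what this timeout usually indicates: JanusGraphFactory.open with storage.backend=inmemory starts an embedded graph inside the current JVM; it does not start a Gremlin Server, so nothing is listening on ws://localhost:8182 and the Cluster client times out waiting for a host. A sketch of querying the embedded graph directly instead of through a remote Client (assumes the JanusGraph/TinkerPop dependencies above are on the classpath; not runnable standalone):

```scala
import org.apache.commons.configuration.BaseConfiguration
import org.janusgraph.core.JanusGraphFactory

val conf = new BaseConfiguration()
conf.setProperty("storage.backend", "inmemory")

// Embedded graph: lives in this JVM, no websocket endpoint involved.
val jg = JanusGraphFactory.open(conf)
val g  = jg.traversal()        // local TraversalSource instead of a remote Client
val n  = g.V().count().next()  // same query as client.submit("g.V().count()")
jg.close()
```

Connecting with a Cluster/Client would require running an actual Gremlin Server (for example the one bundled with the JanusGraph distribution) on port 8182.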

Updating groovy to latest version 2.4.11 causes gmaven build failure

When I upgraded Groovy to the latest version, 2.4.11, and tried to build my Maven project, I got the error below.
I have tried the following:
- Updated pom.xml with a dependency on groovy-all 2.4.11, but the gmaven plugin seems to be outdated
- Tried the gmaven-plus plugin
- I cannot downgrade Groovy below 2.0.x because my code requires it
[ERROR] Failed to execute goal org.codehaus.gmaven:gmaven-plugin:1.5:execute (default) on project xxxxxx: Execution default of goal org.codehaus.gmaven:gmaven-plugin:1.5:execute failed: An API incompatibility was encountered while executing org.codehaus.gmaven:gmaven-plugin:1.5:execute: java.lang.NoSuchMethodError: org.codehaus.groovy.ast.ModuleNode.getStarImports()Ljava/util/List;
[ERROR] -----------------------------------------------------
[ERROR] realm = plugin>org.codehaus.gmaven:gmaven-plugin:1.5
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/gmaven/gmaven-plugin/1.5/gmaven-plugin-1.5.jar
[ERROR] urls[1] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/gmaven/runtime/gmaven-runtime-api/1.5/gmaven-runtime-api-1.5.jar
[ERROR] urls[2] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/gmaven/feature/gmaven-feature-api/1.5/gmaven-feature-api-1.5.jar
[ERROR] urls[3] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/gmaven/runtime/gmaven-runtime-loader/1.5/gmaven-runtime-loader-1.5.jar
[ERROR] urls[4] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/gmaven/feature/gmaven-feature-support/1.5/gmaven-feature-support-1.5.jar
[ERROR] urls[5] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/gmaven/runtime/gmaven-runtime-support/1.5/gmaven-runtime-support-1.5.jar
[ERROR] urls[6] = file:/C:/Users/buildapp/.m2/repository/org/sonatype/gshell/gshell-io/2.4/gshell-io-2.4.jar
[ERROR] urls[7] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/plexus/plexus-utils/3.0/plexus-utils-3.0.jar
[ERROR] urls[8] = file:/C:/Users/buildapp/.m2/repository/com/thoughtworks/qdox/qdox/1.12/qdox-1.12.jar
[ERROR] urls[9] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/shared/file-management/1.2.1/file-management-1.2.1.jar
[ERROR] urls[10] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/shared/maven-shared-io/1.1/maven-shared-io-1.1.jar
[ERROR] urls[11] = file:/C:/Users/buildapp/.m2/repository/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
[ERROR] urls[12] = file:/C:/Users/buildapp/.m2/repository/log4j/log4j/1.2.12/log4j-1.2.12.jar
[ERROR] urls[13] = file:/C:/Users/buildapp/.m2/repository/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
[ERROR] urls[14] = file:/C:/Users/buildapp/.m2/repository/com/google/collections/google-collections/1.0/google-collections-1.0.jar
[ERROR] urls[15] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/reporting/maven-reporting-impl/2.0.4.1/maven-reporting-impl-2.0.4.1.jar
[ERROR] urls[16] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/plexus/plexus-interpolation/1.1/plexus-interpolation-1.1.jar
[ERROR] urls[17] = file:/C:/Users/buildapp/.m2/repository/commons-validator/commons-validator/1.2.0/commons-validator-1.2.0.jar
[ERROR] urls[18] = file:/C:/Users/buildapp/.m2/repository/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
[ERROR] urls[19] = file:/C:/Users/buildapp/.m2/repository/commons-digester/commons-digester/1.6/commons-digester-1.6.jar
[ERROR] urls[20] = file:/C:/Users/buildapp/.m2/repository/commons-logging/commons-logging/1.0.4/commons-logging-1.0.4.jar
[ERROR] urls[21] = file:/C:/Users/buildapp/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar
[ERROR] urls[22] = file:/C:/Users/buildapp/.m2/repository/xml-apis/xml-apis/1.0.b2/xml-apis-1.0.b2.jar
[ERROR] urls[23] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-core/1.0-alpha-10/doxia-core-1.0-alpha-10.jar
[ERROR] urls[24] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-sink-api/1.0-alpha-10/doxia-sink-api-1.0-alpha-10.jar
[ERROR] urls[25] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/reporting/maven-reporting-api/2.0.4/maven-reporting-api-2.0.4.jar
[ERROR] urls[26] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-site-renderer/1.0-alpha-10/doxia-site-renderer-1.0-alpha-10.jar
[ERROR] urls[27] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/plexus/plexus-i18n/1.0-beta-7/plexus-i18n-1.0-beta-7.jar
[ERROR] urls[28] = file:/C:/Users/buildapp/.m2/repository/org/codehaus/plexus/plexus-velocity/1.1.7/plexus-velocity-1.1.7.jar
[ERROR] urls[29] = file:/C:/Users/buildapp/.m2/repository/org/apache/velocity/velocity/1.5/velocity-1.5.jar
[ERROR] urls[30] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-decoration-model/1.0-alpha-10/doxia-decoration-model-1.0-alpha-10.jar
[ERROR] urls[31] = file:/C:/Users/buildapp/.m2/repository/commons-collections/commons-collections/3.2/commons-collections-3.2.jar
[ERROR] urls[32] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-module-apt/1.0-alpha-10/doxia-module-apt-1.0-alpha-10.jar
[ERROR] urls[33] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-module-fml/1.0-alpha-10/doxia-module-fml-1.0-alpha-10.jar
[ERROR] urls[34] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-module-xdoc/1.0-alpha-10/doxia-module-xdoc-1.0-alpha-10.jar
[ERROR] urls[35] = file:/C:/Users/buildapp/.m2/repository/org/apache/maven/doxia/doxia-module-xhtml/1.0-alpha-10/doxia-module-xhtml-1.0-alpha-10.jar
[ERROR] urls[36] = file:/C:/Users/buildapp/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar
[ERROR] urls[37] = file:/C:/Users/buildapp/.m2/repository/org/slf4j/slf4j-api/1.5.10/slf4j-api-1.5.10.jar
[ERROR] urls[38] = file:/C:/Users/buildapp/.m2/repository/org/sonatype/gossip/gossip/1.2/gossip-1.2.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[project>com.palamida.appsec:palamida-appsec:6.11.2-10-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR]
[ERROR] -----------------------------------------------------
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException

spark-jobserver cannot build on Spark 1.6.2

I'm trying to run the spark-jobserver 0.6.2 with Spark 1.6.2
Currently what I'm doing is this:
git clone https://github.com/spark-jobserver/spark-jobserver.git
git checkout tags/v0.6.2 -f
sbt job-server/package
At this point the build crashes with this error:
[info] Compiling 35 Scala sources to /test_jobserver/spark-jobserver/job-server/target/scala-2.10/classes...
[error]
[error] while compiling: /test_jobserver/spark-jobserver/job-server/src/spark.jobserver/util/SparkMasterProvider.scala
[error] during phase: jvm
[error] library version: version 2.10.6
[error] compiler version: version 2.10.6
[error] reconstructed args: -deprecation -classpath /test_jobserver/spark-jobserver/job-server/target/scala-2.10/classes:/test_jobserver/spark-jobserver/akka-app/target/scala-2.10/classes:/test_jobserver/spark-jobserver/job-server-api/target/scala-2.10/classes:/home/marco/.ivy2/cache/io.netty/netty-all/jars/netty-all-4.0.29.Final.jar:/home/marco/.ivy2/cache/com.typesafe/config/bundles/config-1.3.0.jar:/home/marco/.ivy2/cache/com.typesafe.akka/akka-cluster_2.10/jars/akka-cluster_2.10-2.3.15.jar:/home/marco/.ivy2/cache/com.typesafe.akka/akka-remote_2.10/jars/akka-remote_2.10-2.3.15.jar:/home/marco/.ivy2/cache/com.typesafe.akka/akka-actor_2.10/jars/akka-actor_2.10-2.3.15.jar:/home/marco/.ivy2/cache/io.netty/netty/bundles/netty-3.8.0.Final.jar:/home/marco/.ivy2/cache/com.google.protobuf/protobuf-java/bundles/protobuf-java-2.5.0.jar:/home/marco/.ivy2/cache/org.uncommons.maths/uncommons-maths/jars/uncommons-maths-1.2.2a.jar:/home/marco/.ivy2/cache/io.spray/spray-json_2.10/bundles/spray-json_2.10-1.3.2.jar:/home/marco/.ivy2/cache/io.spray/spray-can_2.10/bundles/spray-can_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-io_2.10/bundles/spray-io_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-util_2.10/bundles/spray-util_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-http_2.10/bundles/spray-http_2.10-1.3.3.jar:/home/marco/.ivy2/cache/org.parboiled/parboiled-scala_2.10/jars/parboiled-scala_2.10-1.1.7.jar:/home/marco/.ivy2/cache/org.parboiled/parboiled-core/jars/parboiled-core-1.1.7.jar:/home/marco/.ivy2/cache/io.spray/spray-caching_2.10/bundles/spray-caching_2.10-1.3.3.jar:/home/marco/.ivy2/cache/com.googlecode.concurrentlinkedhashmap/concurrentlinkedhashmap-lru/jars/concurrentlinkedhashmap-lru-1.4.2.jar:/home/marco/.ivy2/cache/io.spray/spray-routing_2.10/bundles/spray-routing_2.10-1.3.3.jar:/home/marco/.ivy2/cache/io.spray/spray-httpx_2.10/bundles/spray-httpx_2.10-1.3.3.jar:/home/marco/.ivy2/cache/org.jvnet.mimepull/mimepull/jars/mimepull-1.9.5.jar:/home/ma
rco/.ivy2/cache/com.chuusai/shapeless_2.10/jars/shapeless_2.10-1.2.4.jar:/home/marco/.ivy2/cache/io.spray/spray-client_2.10/bundles/spray-client_2.10-1.3.3.jar:/home/marco/.ivy2/cache/com.yammer.metrics/metrics-core/jars/metrics-core-2.2.0.jar:/home/marco/.ivy2/cache/org.joda/joda-convert/jars/joda-convert-1.8.1.jar:/home/marco/.ivy2/cache/joda-time/joda-time/jars/joda-time-2.9.3.jar:/home/marco/.ivy2/cache/com.typesafe.slick/slick_2.10/bundles/slick_2.10-2.1.0.jar:/home/marco/.ivy2/cache/com.h2database/h2/jars/h2-1.3.176.jar:/home/marco/.ivy2/cache/commons-dbcp/commons-dbcp/jars/commons-dbcp-1.4.jar:/home/marco/.ivy2/cache/commons-pool/commons-pool/jars/commons-pool-1.5.4.jar:/home/marco/.ivy2/cache/org.flywaydb/flyway-core/jars/flyway-core-3.2.1.jar:/home/marco/.ivy2/cache/org.apache.shiro/shiro-core/bundles/shiro-core-1.2.4.jar:/home/marco/.ivy2/cache/commons-beanutils/commons-beanutils/jars/commons-beanutils-1.8.3.jar:/home/marco/.ivy2/cache/org.scoverage/scalac-scoverage-runtime_2.10/jars/scalac-scoverage-runtime_2.10-1.1.1.jar:/home/marco/.ivy2/cache/org.scoverage/scalac-scoverage-plugin_2.10/jars/scalac-scoverage-plugin_2.10-1.1.1.jar:/home/marco/.ivy2/cache/org.apache.spark/spark-core_2.10/jars/spark-core_2.10-1.6.1.jar:/home/marco/.ivy2/cache/org.apache.avro/avro-mapred/jars/avro-mapred-1.7.7-hadoop2.jar:/home/marco/.ivy2/cache/org.apache.avro/avro-ipc/jars/avro-ipc-1.7.7-tests.jar:/home/marco/.ivy2/cache/org.apache.avro/avro-ipc/jars/avro-ipc-1.7.7.jar:/home/marco/.ivy2/cache/org.apache.avro/avro/jars/avro-1.7.7.jar:/home/marco/.ivy2/cache/org.codehaus.jackson/jackson-core-asl/jars/jackson-core-asl-1.9.13.jar:/home/marco/.ivy2/cache/org.codehaus.jackson/jackson-mapper-asl/jars/jackson-mapper-asl-1.9.13.jar:/home/marco/.ivy2/cache/org.xerial.snappy/snappy-java/bundles/snappy-java-1.1.2.jar:/home/marco/.ivy2/cache/org.apache.commons/commons-compress/jars/commons-compress-1.4.1.jar:/home/marco/.ivy2/cache/org.tukaani/xz/jars/xz-1.0.jar:/home/marco/.ivy2/cache
/org.slf4j/slf4j-api/jars/slf4j-api-1.7.10.jar:/home/marco/.ivy2/cache/com.twitter/chill_2.10/jars/chill_2.10-0.5.0.jar:/home/marco/.ivy2/cache/com.twitter/chill-java/jars/chill-java-0.5.0.jar:/home/marco/.ivy2/cache/com.esotericsoftware.kryo/kryo/bundles/kryo-2.21.jar:/home/marco/.ivy2/cache/com.esotericsoftware.reflectasm/reflectasm/jars/reflectasm-1.07-shaded.jar:/home/marco/.ivy2/cache/com.esotericsoftware.minlog/minlog/jars/minlog-1.2.jar:/home/marco/.ivy2/cache/org.objenesis/objenesis/jars/objenesis-1.2.jar:/home/marco/.ivy2/cache/org.apache.xbean/xbean-asm5-shaded/bundles/xbean-asm5-shaded-4.4.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-client/jars/hadoop-client-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-common/jars/hadoop-common-2.2.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-annotations/jars/hadoop-annotations-2.2.0.jar:/home/marco/.ivy2/cache/com.google.code.findbugs/jsr305/jars/jsr305-1.3.9.jar:/home/marco/.ivy2/cache/commons-cli/commons-cli/jars/commons-cli-1.2.jar:/home/marco/.ivy2/cache/org.apache.commons/commons-math/jars/commons-math-2.1.jar:/home/marco/.ivy2/cache/xmlenc/xmlenc/jars/xmlenc-0.52.jar:/home/marco/.ivy2/cache/commons-httpclient/commons-httpclient/jars/commons-httpclient-3.1.jar:/home/marco/.ivy2/cache/commons-codec/commons-codec/jars/commons-codec-1.4.jar:/home/marco/.ivy2/cache/commons-net/commons-net/jars/commons-net-2.2.jar:/home/marco/.ivy2/cache/log4j/log4j/bundles/log4j-1.2.17.jar:/home/marco/.ivy2/cache/commons-lang/commons-lang/jars/commons-lang-2.5.jar:/home/marco/.ivy2/cache/commons-configuration/commons-configuration/jars/commons-configuration-1.6.jar:/home/marco/.ivy2/cache/commons-collections/commons-collections/jars/commons-collections-3.2.1.jar:/home/marco/.ivy2/cache/commons-digester/commons-digester/jars/commons-digester-1.8.jar:/home/marco/.ivy2/cache/commons-beanutils/commons-beanutils-core/jars/commons-beanutils-core-1.8.0.jar:/home/marco/.ivy2/cache/org.apache.hadoop/hadoop-auth/j
(scalac invocation elided: a very long classpath of Ivy-cache jars from /home/marco/.ivy2/cache — Hadoop 2.2.0, Spark 1.6.1 for Scala 2.10, Jackson, Akka, json4s, Tachyon, and many others — followed by the flags -feature -bootclasspath <JDK 8 runtime jars and scala-library.jar> -language:implicitConversions -language:postfixOps)
[error]
[error] last tree to typer: Literal(Constant(collection.Set))
[error] symbol: null
[error] symbol definition: null
[error] tpe: Class(classOf[scala.collection.Set])
[error] symbol owners:
[error] context owners: object DefaultSparkMasterProvider -> package util
[error]
[error] == Enclosing template or block ==
[error]
[error] Template( // val <local DefaultSparkMasterProvider>: <notype> in object DefaultSparkMasterProvider, tree.tpe=spark.jobserver.util.DefaultSparkMasterProvider.type
[error] "java.lang.Object", "spark.jobserver.util.SparkMasterProvider" // parents
[error] ValDef(
[error] private
[error] "_"
[error] <tpt>
[error] <empty>
[error] )
[error] // 2 statements
[error] DefDef( // def getSparkMaster(config: com.typesafe.config.Config): String in object DefaultSparkMasterProvider
[error] <method>
[error] "getSparkMaster"
[error] []
[error] // 1 parameter list
[error] ValDef( // config: com.typesafe.config.Config
[error] <param> <triedcooking>
[error] "config"
[error] <tpt> // tree.tpe=com.typesafe.config.Config
[error] <empty>
[error] )
[error] <tpt> // tree.tpe=String
[error] Apply( // def getString(x$1: String): String in trait Config, tree.tpe=String
[error] "config"."getString" // def getString(x$1: String): String in trait Config, tree.tpe=(x$1: String)String
[error] "spark.master"
[error] )
[error] )
[error] DefDef( // def <init>(): spark.jobserver.util.DefaultSparkMasterProvider.type in object DefaultSparkMasterProvider
[error] <method>
[error] "<init>"
[error] []
[error] List(Nil)
[error] <tpt> // tree.tpe=spark.jobserver.util.DefaultSparkMasterProvider.type
[error] Block( // tree.tpe=Unit
[error] Apply( // def <init>(): Object in class Object, tree.tpe=Object
[error] DefaultSparkMasterProvider.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object
[error] Nil
[error] )
[error] ()
[error] )
[error] )
[error] )
[error]
[error] == Expanded type of tree ==
[error]
[error] ConstantType(value = Constant(collection.Set))
[error]
[error] uncaught exception during compilation: java.io.IOException
[error] File name too long
[error] two errors found
[error] (job-server/compile:compileIncremental) Compilation failed
[error] Total time: 16 s, completed Oct 25, 2016 3:32:36 PM
I haven't found anything on this; does anybody know how to fix it?
Thank you.
Apparently, you can't build this inside an encrypted folder on Ubuntu: eCryptfs limits file names to roughly 143 characters, and scalac can generate class file names longer than that. Moving the project folder to a non-encrypted disk partition did the trick.
For more info, see: https://github.com/scala/pickling/issues/10
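If moving the project off the encrypted partition is not an option, another commonly used workaround (a hedged sketch, for Scala 2.10/2.11 where the `-Xmax-classfile-name` compiler option exists) is to cap the length of the class file names scalac generates, in `build.sbt`:

```scala
// Workaround sketch: eCryptfs-encrypted home directories limit file names
// to ~143 characters, while scalac can emit longer names for deeply nested
// anonymous classes. Capping the length avoids the "File name too long"
// IOException. Available in Scala 2.10/2.11; removed in newer compilers.
scalacOptions ++= Seq("-Xmax-classfile-name", "128")
```

The value 128 is an assumption here; anything safely below the filesystem's limit should work.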

Gradle dependency and space in path

I am creating a Gradle plugin. When I try to add a jar path that contains a space to the project's `compile` dependencies, I get this exception:
* What went wrong:
A problem occurred evaluating root project 'visage-gradle-sample'.
> No signature of method: org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler.compile() is applicable for argument types: (org.gradle.api.internal.file.collections.DefaultConfigurableFileCollection) values: [file collection]
Possible solutions: module(java.lang.Object)
I am setting it in my plugin's `apply` method:
private void configureSetup(project) {
    final javafxHome = System.env["JAVAFX_HOME"]
    def jfxJar = ''
    if (javafxHome) {
        jfxJar = "${javafxHome}${File.separator}rt${File.separator}lib${File.separator}jfxrt.jar"
    } else {
        final javaHome = System.env["JAVA_HOME"]
        jfxJar = "${javaHome}${File.separator}jre${File.separator}lib${File.separator}jfxrt.jar"
    }
    if (!(new File(jfxJar)).exists()) {
        throw new StopExecutionException("JAVAFX_HOME is not set or your JDK does not include the JavaFX jar.")
    }
    project.dependencies {
        compile project.files(jfxJar)
    }
}
I am trying to add the JavaFX 2 jfxrt.jar, which lives at the following path:
C:\Program Files\Oracle\JavaFX 2.1 SDK\rt\lib\jfxrt.jar
The debug log shows:
15:36:31.036 [ERROR] [org.gradle.BuildExceptionReporter] Caused by: groovy.lang.MissingMethodException: No signature of method: org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler.compile() is applicable for argument types: (org.gradle.api.internal.file.collections.DefaultConfigurableFileCollection) values: [file collection]
Possible solutions: module(java.lang.Object)
15:36:31.051 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler.methodMissing(DefaultDependencyHandler.groovy:94)
15:36:31.067 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler.invokeMethod(DefaultDependencyHandler.groovy)
15:36:31.082 [ERROR] [org.gradle.BuildExceptionReporter] at org.visage.gradle.plugin.VisagePlugin$_configureSetup_closure1.doCall(VisagePlugin.groovy:122)
15:36:31.098 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.util.ConfigureUtil.configure(ConfigureUtil.java:141)
15:36:31.114 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.util.ConfigureUtil.configure(ConfigureUtil.java:90)
15:36:31.129 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.project.AbstractProject.dependencies(AbstractProject.java:879)
15:36:31.129 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.Project$dependencies.call(Unknown Source)
15:36:31.160 [ERROR] [org.gradle.BuildExceptionReporter] at org.visage.gradle.plugin.VisagePlugin.configureSetup(VisagePlugin.groovy:121)
15:36:31.176 [ERROR] [org.gradle.BuildExceptionReporter] at org.visage.gradle.plugin.VisagePlugin.this$2$configureSetup(VisagePlugin.groovy)
15:36:31.176 [ERROR] [org.gradle.BuildExceptionReporter] at org.visage.gradle.plugin.VisagePlugin$this$2$configureSetup.callCurrent(Unknown Source)
15:36:31.207 [ERROR] [org.gradle.BuildExceptionReporter] at org.visage.gradle.plugin.VisagePlugin.apply(VisagePlugin.groovy:67)
15:36:31.207 [ERROR] [org.gradle.BuildExceptionReporter] at org.visage.gradle.plugin.VisagePlugin.apply(VisagePlugin.groovy)
15:36:31.223 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.plugins.DefaultProjectsPluginContainer.providePlugin(DefaultProjectsPluginContainer.java:107)
15:36:31.239 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.plugins.DefaultProjectsPluginContainer.addPluginInternal(DefaultProjectsPluginContainer.java:71)
15:36:31.254 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.plugins.DefaultProjectsPluginContainer.apply(DefaultProjectsPluginContainer.java:37)
15:36:31.270 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.plugins.DefaultObjectConfigurationAction.applyPlugin(DefaultObjectConfigurationAction.java:101)
15:36:31.285 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.plugins.DefaultObjectConfigurationAction.access$200(DefaultObjectConfigurationAction.java:32)
15:36:31.301 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.plugins.DefaultObjectConfigurationAction$3.run(DefaultObjectConfigurationAction.java:72)
15:36:31.317 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.plugins.DefaultObjectConfigurationAction.execute(DefaultObjectConfigurationAction.java:114)
15:36:31.348 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.project.AbstractProject.apply(AbstractProject.java:840)
15:36:31.364 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.Project$apply.call(Unknown Source)
15:36:31.379 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.internal.project.ProjectScript.apply(ProjectScript.groovy:34)
15:36:31.395 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.api.Script$apply.callCurrent(Unknown Source)
15:36:31.410 [ERROR] [org.gradle.BuildExceptionReporter] at build_1c4c4h5n90lk41hinuth6in4rk.run(D:\MyWorkBench\jugchennai\visage-gradle-plugin\visage-gradle-sample\build.gradle:8)
15:36:31.410 [ERROR] [org.gradle.BuildExceptionReporter] at org.gradle.groovy.scripts.internal.DefaultScriptRunnerFactory$ScriptRunnerImpl.run(DefaultScriptRunnerFactory.java:52)
15:36:31.426 [ERROR] [org.gradle.BuildExceptionReporter] ... 29 more
15:36:31.442 [ERROR] [org.gradle.BuildExceptionReporter]
15:36:31.457 [LIFECYCLE] [org.gradle.BuildResultLogger]
How do I handle the space in the path?
It doesn't look like the problem is related to a space in the path. I assume your code doesn't apply the `java` plugin, hence there is no `compile` configuration for the `DependencyHandler` to resolve.
By the way, it is never necessary to use `File.separator` when passing paths to Gradle. Just use `/` and Gradle will do the right thing, even on Windows.
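A minimal sketch of the fix, under the assumption that the missing `compile` configuration is the real cause: apply the `java` plugin before touching `project.dependencies`, and build the path with forward slashes. The method shape below is hypothetical and only illustrates the ordering:

```groovy
// Hedged sketch: applying the 'java' plugin creates the 'compile'
// configuration that the dependencies block below refers to. Without it,
// DefaultDependencyHandler.compile() fails with MissingMethodException.
void apply(Project project) {
    project.plugins.apply('java')

    final javafxHome = System.env['JAVAFX_HOME']
    // Forward slashes work on all platforms; File.separator is unnecessary,
    // and spaces in the path (e.g. "C:/Program Files/...") are fine.
    def jfxJar = javafxHome ?
        "${javafxHome}/rt/lib/jfxrt.jar" :
        "${System.env['JAVA_HOME']}/jre/lib/jfxrt.jar"

    project.dependencies {
        compile project.files(jfxJar)
    }
}
```

With the plugin applied first, `compile project.files(...)` resolves normally even though the jar path contains a space.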
