I'm using a Play WsClient to send requests to a Spray server endpoint that fronts a Spark driver program. The problematic call is here:
def serializeDataset(requestUrl: String, recipe: Recipe): Future[(Option[String], String, Int)] = {
  ws.url(requestUrl).post(Json.toJson(recipe)).map { response =>
    val code = (response.json \ "code").as[Int]
    code match {
      case OK => ((response.json \ "uuid").asOpt[String], (response.json \ "schema").as[String], code)
      case _  => ((response.json \ "message").asOpt[String], "", code)
    }
  }
}
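As an aside, unrelated to the crash: if the endpoint ever returns a non-JSON error body, response.json will throw. A slightly more defensive sketch, using the same names as above and assuming the Spray endpoint also sets the HTTP status code (which may not match your server's contract):

ws.url(requestUrl).post(Json.toJson(recipe)).map { response =>
  // Guard on the HTTP status before trusting the body to be JSON
  if (response.status == OK) {
    val code = (response.json \ "code").as[Int]
    ((response.json \ "uuid").asOpt[String], (response.json \ "schema").as[String], code)
  } else {
    (Some(response.body), "", response.status)
  }
}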
When executed, I get this error:
Caused by: java.lang.NoSuchMethodError: io.netty.util.internal.PlatformDependent.newAtomicIntegerFieldUpdater(Ljava/lang/Class;Ljava/lang/String;)Ljava/util/concurrent/atomic/AtomicIntegerFieldUpdater;
at org.asynchttpclient.netty.NettyResponseFuture.<clinit>(NettyResponseFuture.java:52)
at org.asynchttpclient.netty.request.NettyRequestSender.newNettyResponseFuture(NettyRequestSender.java:311)
at org.asynchttpclient.netty.request.NettyRequestSender.newNettyRequestAndResponseFuture(NettyRequestSender.java:193)
at org.asynchttpclient.netty.request.NettyRequestSender.sendRequestWithCertainForceConnect(NettyRequestSender.java:129)
at org.asynchttpclient.netty.request.NettyRequestSender.sendRequest(NettyRequestSender.java:107)
at org.asynchttpclient.DefaultAsyncHttpClient.execute(DefaultAsyncHttpClient.java:216)
at org.asynchttpclient.DefaultAsyncHttpClient.executeRequest(DefaultAsyncHttpClient.java:184)
at play.api.libs.ws.ahc.AhcWSClient.executeRequest(AhcWS.scala:45)
at play.api.libs.ws.ahc.AhcWSRequest$.execute(AhcWS.scala:90)
at play.api.libs.ws.ahc.AhcWSRequest$$anon$2.execute(AhcWS.scala:166)
at play.api.libs.ws.ahc.AhcWSRequest.execute(AhcWS.scala:168)
at play.api.libs.ws.WSRequest$class.post(WS.scala:510)
at play.api.libs.ws.ahc.AhcWSRequest.post(AhcWS.scala:107)
at webservices.DataFrameService.serializeDataset(DataFrameService.scala:36)
It looks like the WSClient is picking up a version of Netty that doesn't include the relevant function.
This issue occurs when I compile the application with the 2.2-SNAPSHOT version of Spark, but not when I compile with the 2.1 version. I have no idea why this change would make a difference. The Spark driver program is a separate project in my sbt build.
My suspicion is that this has something to do with the packaging of the application and its dependencies. Here is what I have tried in sbt to rectify it:
Added an explicit "io.netty" % "netty-all" % "4.0.43.Final" to my dependencies
Added exclude statements to the spark imports like so:
"org.apache.spark" %% "spark-sql" % sparkV exclude("org.jboss.netty","netty") exclude("io.netty","netty")
"org.apache.spark" %% "spark-core" % sparkV exclude("org.jboss.netty","netty") exclude("io.netty","netty")
"org.apache.spark" %% "spark-mllib" % sparkV exclude("org.scalamacros", "quasiquotes") exclude("org.jboss.netty","netty") exclude("io.netty","netty")
"org.apache.spark" %% "spark-hive" % sparkV exclude("org.scalamacros", "quasiquotes") exclude("org.jboss.netty","netty") exclude("io.netty","netty")
Changed the order in which the play-ws module is added to the project dependencies (moved it to the end, moved it to the beginning)
Any help much appreciated.
On further review, I found that there was a lingering dependency on the Spark libraries within the Play project. I removed it and the app seems to be working.
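For anyone who cannot simply remove the Spark dependency, a sketch of pinning a single Netty version across the build (this assumes the clash is between the Netty 4.x that Spark pulls in and the one async-http-client expects; adjust the version to whatever your WS client actually needs):

// build.sbt -- pin Netty so sbt's eviction cannot pick an incompatible version
dependencyOverrides += "io.netty" % "netty-all" % "4.0.43.Final"

The dependencyTree task (built into sbt 1.4+, or available earlier via the sbt-dependency-graph plugin) shows which module drags in the conflicting Netty.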
When I build with my SConstruct file, I get the error below.
scons: *** Found dependency cycle(s):
build/sselser/sselConfigArgs.h -> build/sselser/sselConfigArgs.h
Internal Error: no cycle found for node build/sselser/sselMain (<SCons.Node.FS.File instance at 0x9f61e8>) in state pending
Internal Error: no cycle found for node build/sselser/sselMain.o (<SCons.Node.FS.File instance at 0x9f2e68>) in state pending
File "/nfs/scons/scons-1.3.0/lib/scons-1.3.0/SCons/Taskmaster.py", line 1026, in cleanup
I guess this is due to the dependency on sselMain in sselTransform, as the error occurs while building the sselTransform directory.
Makefile in sselTransform:
UNIT_SUPPORT_FILES += ../sselser/sselMain intest ../../../make/Makenv
MDE_SUPPORT_FILES += ../sselser/sselMain intest ../../../make/Makenv
I need to add the same in the SConscript of the sselTransform directory to resolve this issue.
How can I resolve this issue?
SConscript:
#Set CPPPATH, RPATH, DEFINES and CCFLAGS
env = Environment(CPPPATH = ['.','../sselTransform','../sselSm','../sselSRC'],
    RPATH = ['/l-n/app/colr/lib/infra/SunOS5.10/WS12.0'],
    CPPDEFINES = ['THREADSAFE','_RWSTD_SOLARIS_THREADS','_SVID_GETTOD','DEBUG','sun5'],
    CCFLAGS = ['library=rwtools7_std','features=no%tmplife','-pta','-mt','-xdebugformat=stabs','-g0','-xildoff'])
env['CXX']=CXX
Src = Split('sselManager.C PromoNotifyMgr.C ')
env.StaticLibrary('libSselser-g0.a',Src)
Src1 = Split('sselMain.C sselManager.o PromoNotifyMgr.o ')
env.Program('sselMain',Src1)
configfile = 'sselConfigArgs.h'
CONFIG_PATH = '../../build/include/'
CONFIG=CONFIG_PATH+configfile
env.Command(CONFIG, configfile,
    [Copy('$TARGET', '$SOURCE'),
     Chmod('$TARGET', 0444)])
SConstruct:
SConscript('src/ssel/sselser/SConscript',variant_dir='build/sselser',duplicate=0,exports='env')
Try this?
Notes:
I'm saving the build objects for your two source files and using those in both the program and static library.
I've added the target dir you're copying the header file to earlier in the CPPPATH.
You could have skipped the variables configfile, CONFIG_PATH, CONFIG and just used the strings in your Command.
You are using a VERY old version of SCons. If you're limited to Python 2.7, please try SCons 3.0.1; if you can use Python 3.6, try SCons 4.3.0.
#Set CPPPATH, RPATH, DEFINES and CCFLAGS
env = Environment(
CPPPATH =['.','../include','../sselTransform','../sselSm','../sselSRC'],
RPATH = ['/l-n/app/colr/lib/infra/SunOS5.10/WS12.0'],
CPPDEFINES = ['THREADSAFE','_RWSTD_SOLARIS_THREADS','_SVID_GETTOD','DEBUG','sun5'],
CCFLAGS = ['library=rwtools7_std','features=no%tmplife','-pta','-mt','-xdebugformat=stabs','-g0','-xildoff'])
env['CXX']=CXX
Src = ['sselManager.C','PromoNotifyMgr.C']
objects = []
for s in Src:
    objects.extend(env.StaticObject(s))
env.StaticLibrary('Sselser-g0',objects)
Src1 = ['sselMain.C'] + objects
env.Program('sselMain', Src1)
configfile = 'sselConfigArgs.h'
CONFIG_PATH = '../include/'
CONFIG=CONFIG_PATH+configfile
env.Command(CONFIG, configfile,
    [Copy('$TARGET', '$SOURCE'),
     Chmod('$TARGET', 0o444)])  # 0o444 rather than 0444: valid octal on both Python 2.6+ and Python 3
I am trying to understand how cats-effect's IO.cancelable works. I have the following minimal app, based on the documentation:
import java.util.concurrent.{Executors, ScheduledExecutorService}
import cats.effect._
import cats.implicits._
import scala.concurrent.duration._
object Main extends IOApp {

  def delayedTick(d: FiniteDuration)
                 (implicit sc: ScheduledExecutorService): IO[Unit] = {
    IO.cancelable { cb =>
      val r = new Runnable {
        def run() =
          cb(Right(()))
      }
      val f = sc.schedule(r, d.length, d.unit)
      // Returning the cancellation token needed to cancel
      // the scheduling and release resources early
      val mayInterruptIfRunning = false
      IO(f.cancel(mayInterruptIfRunning)).void
    }
  }

  override def run(args: List[String]): IO[ExitCode] = {
    val scheduledExecutorService =
      Executors.newSingleThreadScheduledExecutor()
    for {
      x <- delayedTick(1.second)(scheduledExecutorService)
      _ <- IO(println(s"$x"))
    } yield ExitCode.Success
  }
}
When I run this:
❯ sbt run
[info] Loading global plugins from /Users/ethan/.sbt/1.0/plugins
[info] Loading settings for project stackoverflow-build from plugins.sbt ...
[info] Loading project definition from /Users/ethan/IdeaProjects/stackoverflow/project
[info] Loading settings for project stackoverflow from build.sbt ...
[info] Set current project to cats-effect-tutorial (in build file:/Users/ethan/IdeaProjects/stackoverflow/)
[info] Compiling 1 Scala source to /Users/ethan/IdeaProjects/stackoverflow/target/scala-2.12/classes ...
[info] running (fork) Main
[info] ()
The program just hangs at this point. I have many questions:
Why does the program hang instead of terminating after 1 second?
Why do we set mayInterruptIfRunning = false? Isn't the whole point of cancellation to interrupt a running task?
Is this the recommended way to define the ScheduledExecutorService? I did not see examples in the docs.
This program waits 1 second, and then returns () (then unexpectedly hangs). What if I wanted to return something else? For example, let's say I wanted to return a string, the result of some long-running computation. How would I extract that value from IO.cancelable? The difficulty, it seems, is that IO.cancelable returns the cancelation operation, not the return value of the process to be cancelled.
Pardon the long post but this is my build.sbt:
name := "cats-effect-tutorial"
version := "1.0"
fork := true
scalaVersion := "2.12.8"
libraryDependencies += "org.typelevel" %% "cats-effect" % "1.3.0" withSources() withJavadoc()
scalacOptions ++= Seq(
"-feature",
"-deprecation",
"-unchecked",
"-language:postfixOps",
"-language:higherKinds",
"-Ypartial-unification")
You need to shut down the ScheduledExecutorService; its worker thread is non-daemon, so the JVM cannot exit while it is alive. Try this:
Resource.make(IO(Executors.newSingleThreadScheduledExecutor))(se => IO(se.shutdown())).use { se =>
  for {
    x <- delayedTick(5.second)(se)
    _ <- IO(println(s"$x"))
  } yield ExitCode.Success
}
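A complementary sketch, my addition rather than part of the answer above: building the executor with a daemon ThreadFactory means its worker thread cannot keep the JVM alive even if a shutdown is missed (Scala 2.12 SAM syntax):

import java.util.concurrent.{Executors, ScheduledExecutorService, ThreadFactory}

val daemonFactory: ThreadFactory = (r: Runnable) => {
  val t = new Thread(r)
  t.setDaemon(true) // daemon threads do not block JVM exit
  t
}
val se: ScheduledExecutorService =
  Executors.newSingleThreadScheduledExecutor(daemonFactory)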
I was able to find an answer to these questions although there are still some things that I don't understand.
Why does the program hang instead of terminating after 1 second?
For some reason, Executors.newSingleThreadScheduledExecutor() causes things to hang. To fix the problem, I had to use Executors.newSingleThreadScheduledExecutor(new Thread(_)). It appears that the only difference is that the first version is equivalent to Executors.newSingleThreadScheduledExecutor(Executors.defaultThreadFactory()), although nothing in the docs makes it clear why this is the case.
Why do we set mayInterruptIfRunning = false? Isn't the whole point of cancellation to interrupt a running task?
I have to admit that I do not understand this entirely. Again, the docs were not especially clarifying on this point. Switching the flag to true does not seem to change the behavior at all, at least in the case of Ctrl-c interrupts.
Is this the recommended way to define the ScheduledExecutorService? I did not see examples in the docs.
Clearly not. The way that I came up with was loosely inspired by this snippet from the cats effect source code.
This program waits 1 second, and then returns () (then unexpectedly hangs). What if I wanted to return something else? For example, let's say I wanted to return a string, the result of some long-running computation. How would I extract that value from IO.cancelable? The difficulty, it seems, is that IO.cancelable returns the cancelation operation, not the return value of the process to be cancelled.
The IO.cancelable { ... } block returns IO[A], and the callback cb has type Either[Throwable, A] => Unit. Logically this suggests that whatever is fed into cb is what the IO.cancelable expression will return (wrapped in IO). So to return the string "hello" instead of (), we rewrite delayedTick:
import java.util.concurrent.ScheduledFuture // needed for the explicit type below

def delayedTick(d: FiniteDuration)
               (implicit sc: ScheduledExecutorService): IO[String] = { // Note IO[String] instead of IO[Unit]
  IO.cancelable[String] { cb => // Note IO.cancelable[String]
    val r = new Runnable {
      def run() =
        cb(Right("hello")) // Note "hello" instead of ()
    }
    val f: ScheduledFuture[_] = sc.schedule(r, d.length, d.unit)
    IO(f.cancel(true)).void
  }
}
You need to explicitly terminate the executor at the end; it is not managed by the Scala or cats-effect runtime, so it won't exit by itself. That's why your app hangs instead of exiting immediately.
mayInterruptIfRunning = false lets a task that is already running finish gracefully. You can set it to true to forcibly interrupt it, but that is not recommended.
There are many ways to create a ScheduledExecutorService; it depends on your needs. For this case it doesn't matter, apart from question 1.
You can return anything from the cancelable IO by calling cb(Right("put your stuff here")); the only thing that can stop you from retrieving the A is your cancellation firing first. You won't get anything if you cancel before the callback runs. Try returning IO(f.cancel(mayInterruptIfRunning)).delayBy(FiniteDuration(2, TimeUnit.SECONDS)).void and you will get what you expected: because 2 seconds > 1 second, your code has enough time to run before it is cancelled.
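Pulling the pieces of this thread together, a minimal end-to-end sketch on cats-effect 1.x (MainFixed is my name for it, and "hello" is the placeholder result from the answer above):

import java.util.concurrent.{Executors, ScheduledExecutorService}
import cats.effect._
import scala.concurrent.duration._

object MainFixed extends IOApp {

  // Resource guarantees shutdown(), so the executor's non-daemon worker
  // thread can no longer keep the JVM alive after `use` completes.
  private val scheduler: Resource[IO, ScheduledExecutorService] =
    Resource.make(IO(Executors.newSingleThreadScheduledExecutor()))(se => IO(se.shutdown()))

  def delayedTick(d: FiniteDuration)(implicit sc: ScheduledExecutorService): IO[String] =
    IO.cancelable[String] { cb =>
      val r = new Runnable { def run() = cb(Right("hello")) }
      val f = sc.schedule(r, d.length, d.unit)
      IO(f.cancel(false)).void // the cancellation token: unschedule the task
    }

  override def run(args: List[String]): IO[ExitCode] =
    scheduler.use { implicit se =>
      for {
        x <- delayedTick(1.second)
        _ <- IO(println(x)) // prints "hello", then the app exits promptly
      } yield ExitCode.Success
    }
}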
I want a Scala string to be ${foo.bar} (literally, for testing some variable substitution later).
I tried:
val str = "${foo.bar}"
val str = """${foo.bar}"""
val str = "\${foo.bar}"
val str = "$${foo.bar}"
val str = "$\{foo.bar}"
All give compile errors like Error:(19, 15) possible missing interpolator: detected an interpolated expression or invalid escape character.
This is not a question about string interpolation (or variable substitution); that normally works without problems, and in the Scala REPL (Scala 2.11.3, Java 1.8) everything works as expected. Somewhere there must be an sbt setting (other than -Xlint, or a hidden -Xlint) that is causing this behavior (from the command line and in IntelliJ).
The s or f interpolator will emit a constant:
$ scala -Xlint
Welcome to Scala 2.12.6 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_144).
Type in expressions for evaluation. Or try :help.
scala> "${foo.bar}"
<console>:12: warning: possible missing interpolator: detected an interpolated expression
"${foo.bar}"
^
res0: String = ${foo.bar}
scala> f"$${foo.bar}"
res1: String = ${foo.bar}
It's usual to use -Xfatal-warnings to turn the warning into an error. IntelliJ reports it as an error at the source position, whereas scalac reports it as a warning, but with a summary error message that will fail a build.
\$ and \{ are invalid escape characters and will not compile. The other versions compile just fine on 2.12.6 though perhaps there are problems in earlier versions.
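If you want to see where a build promotes that warning to an error, a sketch of the usual build.sbt fragment (an assumption about your project's flags, not something visible in the question):

// With both flags on, the "possible missing interpolator" lint from
// -Xlint is turned into a compile error by -Xfatal-warnings.
scalacOptions ++= Seq("-Xlint", "-Xfatal-warnings")

With the flags left on, the warning-free spelling is the interpolated form, val str = f"$${foo.bar}", which constant-folds to ${foo.bar}.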
Hi all,
I'm using SBT to build my project, and here is the structure of my project.
HiveGenerator
├── build.sbt
├── lib
├── project
│   ├── assembly.sbt
│   └── plugins.sbt
└── src
    └── main
        └── scala
            └── Main.scala
But I'm facing the error "java.lang.ClassNotFoundException: package.classname" no matter how many times I build it.
I have used
sbt clean package
sbt clean assembly
but with no luck; my class is always missing from the jar.
Here is my build.sbt:
lazy val root = (project in file(".")).
  settings(
    name := "kafkaToMaprfs",
    version := "1.0",
    scalaVersion := "2.10.5",
    mainClass in Compile := Some("classname")
  )
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-hive_2.10" % "1.6.1",
"org.apache.spark" % "spark-core_2.10" % "1.6.1",
"org.apache.spark" % "spark-sql_2.10" % "1.6.1",
"com.databricks" % "spark-avro_2.10" % "2.0.1",
"org.apache.avro" % "avro" % "1.8.1",
"org.apache.avro" % "avro-mapred" % "1.8.1",
"org.apache.avro" % "avro-tools" % "1.8.1",
"org.apache.spark" % "spark-streaming_2.10" % "1.6.1",
"org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1",
"org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13",
"org.openrdf.sesame" % "sesame-rio-api" % "2.7.2",
"log4j" % "log4j" % "1.2.17",
"com.twitter" % "bijection-avro_2.10" % "0.7.0"
)
mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => MergeStrategy.first
  }
}
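As an aside, the <<= form above is old sbt syntax; on newer sbt-assembly releases the equivalent setting, as a sketch with the same strategy, is:

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}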
Here is my assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-site" % "0.7.0")
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += "OSS Sonatype" at "https://repo1.maven.org/maven2/"
However, I'm not able to build a fat jar (what Maven calls a jar-with-dependencies.jar).
In maven we have
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
Which helped me to accomplish this.
My questions are:
1. Why am I not building a jar with all the classes in it?
2. Which commands should I use to create a jar with dependencies in sbt?
3. Do we have anything equivalent to "descriptorRefs" in sbt to do the magic?
And a last question, to which I didn't find an answer: can we achieve a proper output with sbt alone, or do we always have to use spark-submit to make it happen (not considering local or cluster modes)?
Thanks in advance.
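On questions 2 and 3: with sbt-assembly already wired in through project/assembly.sbt, the assembly task is the closest sbt equivalent of Maven's jar-with-dependencies. A sketch, where "com.example.Main" stands in for your real fully qualified main class:

// build.sbt -- point sbt-assembly at the main class, then run: sbt clean assembly
// The fat jar lands in target/scala-2.10/kafkaToMaprfs-assembly-1.0.jar.
mainClass in assembly := Some("com.example.Main")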
Try deleting your ~/.ivy2/ directory (or moving it out of the way) and rebuilding, letting everything reload from the net. Of course, you'll have to rebuild all of your local builds that contribute to your assembly as well.
I found your post because I had the same problem, and this fixed it. It may not solve your issue, but it does solve some issues of this nature (I've seen it quite a bit).
I have a Play Framework 2.1 application. This app worked. Then I migrated to 2.2, tested it, and it worked. Now I am migrating to 2.3 and I get an error like:
[debug] application - Unforseen error for favicon.svg at /public
java.lang.RuntimeException: no resource
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1$$anonfun$13.apply(Assets.scala:237) ~[na:na]
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1$$anonfun$13.apply(Assets.scala:237) ~[na:na]
at scala.Option.getOrElse(Option.scala:120) [na:na]
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1.apply(Assets.scala:237) ~[na:na]
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1.apply(Assets.scala:236) ~[na:na]
There is a /public folder, but every resource there results in the error above; the app serves such resources as 404 Not Found.
Any help would be great: a cleaning procedure, cached files I can delete, dependencies to re-download, or maybe I have a wrong configuration.
Here are some config files I have for better understanding:
build.sbt:
import com.typesafe.sbt.less.Import.LessKeys
import play.PlayJava
name := """blabla-de"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.1"
libraryDependencies ++= Seq(
  filters, cache, javaCore, javaWs, javaJdbc, javaEbean,
  "org.webjars" % "bootstrap" % "3.0.0",
  "org.webjars" % "rjs" % "2.1.11-1-trireme" % "test",
  "org.webjars" % "squirejs" % "0.1.0" % "test",
  "junit" % "junit" % "4.11" % "test"
)
testOptions += Tests.Argument(TestFrameworks.JUnit, "-v")
LessKeys.compress in Assets := true
includeFilter in (Assets, LessKeys.less) := "*.less"
excludeFilter in (Assets, LessKeys.less) := "_*.less"
pipelineStages := Seq(rjs, digest, gzip)
conf/routes:
# Home page
GET / controllers.Index.index()
GET /about controllers.About.index()
## Contact Page
GET /contact controllers.Contact.index()
POST /contact controllers.Contact.newContact()
## Gallery List
GET /portfolio controllers.Portfolio.index()
## Text(HTML) Page
GET /impressum controllers.Impressum.index()
#GET /legal
GET /privacy controllers.Privacy.index()
# Map static resources from the /public folder to the / URL path
GET /*file controllers.Assets.at(path="/public", file)
project/build.properties:
sbt.version=0.13.5
project/plugins.sbt:
// Comment to get more information during initialization
logLevel := Level.Warn
// The Typesafe repository
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.1")
// web plugins
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-gzip" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.0.0")
Thank You in Advance.
The key is in the migration guide: https://www.playframework.com/documentation/2.4.x/Migration23
The largest new feature for Play 2.3 is the introduction of sbt-web.
In summary sbt-web allows Html, CSS and JavaScript functionality to be
factored out of Play’s core into a family of pure sbt plugins.
The unavailability of public resources must be caused by an improper configuration of sbt-web.
What I can see immediately is that you forgot to enable the SbtWeb plugin. Here is another quote from the migration guide:
declaring addSbtPlugin may not be sufficient for plugins that now
utilize the auto plugin functionality.
So you need to fix the following line in your build.sbt:
lazy val root = (project in file(".")).enablePlugins(PlayJava, SbtWeb)