ClassNotFoundException while building jar in SBT - apache-spark

Hi all,
I'm using SBT to build my project, and here is the structure of my project.
HiveGenerator
├── build.sbt
├── lib
├── project
│   ├── assembly.sbt
│   └── plugins.sbt
└── src
    └── main
        └── scala
            └── Main.scala
But I'm facing the error "java.lang.ClassNotFoundException: package.classname" no matter how many times I build it.
I have used:
sbt clean package
sbt clean assembly
but with no luck. My class is always missing from the jar.
Here is my build.sbt
lazy val root = (project in file(".")).
  settings(
    name := "kafkaToMaprfs",
    version := "1.0",
    scalaVersion := "2.10.5",
    mainClass in Compile := Some("classname")
  )

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-hive_2.10" % "1.6.1",
  "org.apache.spark" % "spark-core_2.10" % "1.6.1",
  "org.apache.spark" % "spark-sql_2.10" % "1.6.1",
  "com.databricks" % "spark-avro_2.10" % "2.0.1",
  "org.apache.avro" % "avro" % "1.8.1",
  "org.apache.avro" % "avro-mapred" % "1.8.1",
  "org.apache.avro" % "avro-tools" % "1.8.1",
  "org.apache.spark" % "spark-streaming_2.10" % "1.6.1",
  "org.apache.spark" % "spark-streaming-kafka_2.10" % "1.6.1",
  "org.codehaus.jackson" % "jackson-mapper-asl" % "1.9.13",
  "org.openrdf.sesame" % "sesame-rio-api" % "2.7.2",
  "log4j" % "log4j" % "1.2.17",
  "com.twitter" % "bijection-avro_2.10" % "0.7.0"
)

mergeStrategy in assembly <<= (mergeStrategy in assembly) { (old) =>
  {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x => MergeStrategy.first
  }
}
Here is my assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")
And here is my plugins.sbt
addSbtPlugin("com.typesafe.sbt" % "sbt-site" % "0.7.0")
resolvers += Resolver.url("bintray-sbt-plugins", url("http://dl.bintray.com/sbt/sbt-plugin-releases"))(Resolver.ivyStylePatterns)
resolvers += "OSS Sonatype" at "https://repo1.maven.org/maven2/"
However, I'm not able to build a fat jar (a jar-with-dependencies, as it is called in Maven).
In Maven we have
<descriptorRefs>
  <descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
which helped me accomplish this.
My questions are:
1. Why am I not building a jar with all the classes in it?
2. Which commands should I use to create a jar with dependencies in sbt?
3. Do we have anything equivalent to "descriptorRefs" in sbt to do the magic?
And one last question, which I didn't find an answer to: can't we achieve a proper output with sbt, or should we always use spark-submit to make it happen (not considering local or cluster modes)?
Thanks in advance.

Try deleting your ~/.ivy2/ directory or moving it out of the way and rebuilding, letting everything reload from the net. Of course, you'll also have to rebuild any of your local builds that contribute to your assembly.
I found your post because I had the same problem, and this fixed it. It may not solve your issue, but it does solve some issues of this nature (I've seen it quite a bit).
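One more note on the build.sbt above: the merge-strategy block uses the deprecated <<= syntax. With sbt-assembly 0.14.x the same strategy is normally written as below; this is a sketch of the newer syntax, not a verified fix for the missing class ("classname" remains a placeholder, as in the question):

mainClass in assembly := Some("classname")  // placeholder, as in the question

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}

After sbt clean assembly, listing the jar with jar tf target/scala-2.10/kafkaToMaprfs-assembly-1.0.jar should show whether the class made it in; if it is absent there too, the source file is probably not under src/main/scala at all.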

Related

Link f2py generated *.so file in a python package using setuptools

I wish to deploy a package to PyPI using setuptools. However, the core part of the package is actually written in Fortran, and I am using f2py to wrap it in Python. Basically, the project's structure looks like this:
my_project
├── license.txt
├── README.md
├── setup.py
└── my_project
    ├── __init__.py
    ├── myfunc.py
    └── hello.so
The module myfunc.py imports hello.so (import my_project.hello) which can then be used by functions inside myfunc.py. This works perfectly on my machine.
Then I tried standard setuptools installation: sudo python3 setup.py install on my Ubuntu, and it gets installed perfectly. But unfortunately, while importing, it throws ModuleNotFoundError: No module named 'hello'.
Now, from what I understand, on Linux-based systems the shared *.so libraries for Python are stored in /usr/lib/python3/dist-packages/. So I manually copied hello.so there, and I got a working package! But of course this only works locally. What I would like to do is tell setuptools to include hello.so inside the Python egg and do the copying automatically, so that when a user runs pip3 install my_package, they get access to the shared library automatically. I can see that numpy has somehow achieved this, but even after looking at their code I haven't been able to decode how they did it. Can someone help me with this? Thanks in advance.
You can achieve this with a setup.py file like this (a simplified version, keeping only the parts relevant for building external modules):
import os
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext

class f2py_Extension(Extension):
    def __init__(self, name, sourcedirs):
        Extension.__init__(self, name, sources=[])
        self.sourcedirs = [os.path.abspath(sourcedir) for sourcedir in sourcedirs]
        self.dirs = sourcedirs

class f2py_Build(build_ext):
    def run(self):
        for ext in self.extensions:
            self.build_extension(ext)

    def build_extension(self, ext):
        # compile
        for ind, to_compile in enumerate(ext.sourcedirs):
            module_loc = os.path.split(ext.dirs[ind])[0]
            module_name = os.path.split(to_compile)[1].split('.')[0]
            os.system('cd %s;f2py -c %s -m %s' % (module_loc, to_compile, module_name))

setup(
    name="foo",
    ext_modules=[f2py_Extension('fortran_external', ['foo/one.F90', 'foo/bar/two.F90'])],
    cmdclass=dict(build_ext=f2py_Build),
)
The essential parts for building an external module are ext_modules and cmdclass in setup(...). ext_modules is just a list of Extension instances, each of which describes a set of extension modules. In the setup.py above, I tell ext_modules I want to create two external modules from the two source files foo/one.F90 and foo/bar/two.F90. Based on ext_modules, cmdclass is responsible for compiling the two modules; in our case, the command for compiling a module is
'cd %s;f2py -c %s -m %s' % (module_loc,to_compile,module_name)
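To see concretely what this template expands to for the two modules above, the loop's variables can be traced with a few lines of plain Python (illustrative only; f2py itself chooses the final .so file name):

import os

# same bookkeeping as in build_extension above
for d in ['foo/one.F90', 'foo/bar/two.F90']:
    module_loc = os.path.split(d)[0]                 # 'foo', then 'foo/bar'
    module_name = os.path.split(d)[1].split('.')[0]  # 'one', then 'two'
    print('cd %s;f2py -c %s -m %s' % (module_loc, os.path.abspath(d), module_name))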
Project structure before installation
├── foo
│   ├── __init__.py
│   ├── bar
│   │   └── two.F90
│   └── one.F90
└── setup.py
Project structure after python setup.py install
├── build
│   └── bdist.linux-x86_64
├── dist
│   └── foo-0.0.0-py3.7-linux-x86_64.egg
├── foo
│   ├── __init__.py
│   ├── __pycache__
│   │   └── __init__.cpython-37.pyc
│   ├── bar
│   │   ├── two.F90
│   │   └── two.cpython-37m-x86_64-linux-gnu.so
│   ├── one.F90
│   └── one.cpython-37m-x86_64-linux-gnu.so
├── foo.egg-info
│   ├── PKG-INFO
│   ├── SOURCES.txt
│   ├── dependency_links.txt
│   └── top_level.txt
└── setup.py
The two source files one.F90 and two.F90 are very simple
one.F90
module test
  implicit none
contains
  subroutine add(a)
    implicit none
    integer :: a
    integer :: b
    b = a + 1
    print *, 'one', b
  end subroutine add
end module test
two.F90
module test
  implicit none
contains
  subroutine add(a)
    implicit none
    integer :: a
    integer :: b
    b = a + 2
    print *, 'two', b
  end subroutine add
end module test
After I installed the package, I can successfully run
>>> from foo.bar.two import test
>>> test.add(5)
two 7
and
>>> from foo.one import test
>>> test.add(5)
one 6
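A side note for the setup in the original question, where a prebuilt hello.so already sits next to myfunc.py: if you do not want to compile anything at install time, setuptools can simply ship the existing binary as package data. A minimal sketch, assuming the my_project layout from the question (a package built this way only works on platforms matching that .so):

from setuptools import setup, find_packages

setup(
    name="my_project",
    packages=find_packages(),
    # ship the prebuilt extension alongside the Python sources
    package_data={"my_project": ["*.so"]},
    # a binary .so cannot be imported from a zipped egg
    zip_safe=False,
)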
Here is an approach based on F2PY's documentation (the example there covers building multiple F2PY modules, and multiple source files per module), making use of numpy.distutils, which supports Fortran source files.
The structure of this minimal example with multiple F2PY extension modules is based on a src directory layout. It is not required, but it has the advantage that the test routine cannot run unless the package has been installed successfully.
Source layout
my_project
|
+-- src
|    |
|    +-- my_project
|         |
|         +-- __init__.py
|         +-- mod1.py
|         +-- funcs_m.f90
|         +-- two
|              |
|              +-- plus2.f90
|              +-- times2.f90
|
+-- test_my_project.py
+-- setup.py
setup.py
from setuptools import find_packages
from numpy.distutils.core import setup, Extension

ext1 = Extension(name='my_project.modf90',
                 sources=['src/my_project/funcs_m.f90'],
                 f2py_options=['--quiet'],
                 )
ext2 = Extension(name='my_project.oldf90',
                 sources=['src/my_project/two/plus2.f90', 'src/my_project/two/times2.f90'],
                 f2py_options=['--quiet'],
                 )

setup(name="my_project",
      version="0.0.1",
      package_dir={"": "src"},
      packages=find_packages(where="src"),
      ext_modules=[ext1, ext2])
__init__.py
The __init__.py file is empty. (Can e.g. import the F2PY modules here if desired)
mod1.py
def add(a, b):
    """ add inputs a and b, and return """
    return a + b
funcs_m.f90
module funcs_m
  implicit none
contains
  subroutine add(a, b, c)
    integer, intent(in) :: a
    integer, intent(in) :: b
    integer, intent(out) :: c
    c = a + b
  end subroutine add
end module funcs_m
plus2.f90
subroutine plus2(x, y)
  integer, intent(in) :: x
  integer, intent(out) :: y
  y = x + 2
end subroutine plus2
times2.f90
subroutine times2(x, y)
  integer, intent(in) :: x
  integer, intent(out) :: y
  y = x * 2
end subroutine times2
test_my_project.py
import my_project.mod1
import my_project.oldf90
import my_project.modf90
print("mod1.add: 1 + 2 = ", my_project.mod1.add(1, 2))
print("modf90.funcs_m.add: 1 + 2 = ", my_project.modf90.funcs_m.add(1, 2))
x = 1
x = my_project.oldf90.plus2(x)
print("oldf90.plus2: 1 + 2 = ", x)
x = my_project.oldf90.times2(x)
print("oldf90.plus2: 3 * 2 = ", x)
Installing
Now, one can use pip to install the package. There are several advantages to using pip (including ease of upgrading, or uninstalling) as opposed to setup.py install (but this can still be used for building the package for distribution!). From the directory containing setup.py:
> python -m pip install .
Testing
And then, to test the just installed package
> python test_my_project.py
mod1.add: 1 + 2 = 3
modf90.funcs_m.add: 1 + 2 = 3
oldf90.plus2: 1 + 2 = 3
oldf90.times2: 3 * 2 = 6
This setup has been tested with success on Windows 10 (with ifort), on Ubuntu 18.04 (with gfortran) and on MacOS High Sierra (with gfortran), all with Python 3.6.3.

How to update scons build, when source file changes

I cannot get the automated build to update the project with SCons. First I change something in the source files, and then SCons tells me:
scons: done reading SConscript files.
scons: Building targets ...
scons: `.' is up to date.
scons: done building targets.
How do I get the automated build to update?
UPDATE 20170601:
leder@PC-LAP127:~/Source/Eiffel/PF_HP-mt$ scons --tree=prune project=pf_hp.ecf
+-.
+-.sconf_temp
+-SConstruct
+-build
| +-build/F_code-unix.tar
| | +-pf_hp.ecf
| | +-project.py
| | +-/home/leder/Source/Eiffel/library/Eiffel-Loop/precomp/linux-x86-64/console-application.ecf
| | +-/home/leder/Source/Eiffel/library/Eiffel-Loop/precomp/console-application.ecf
| +-build/linux-x86-64
| +-build/linux-x86-64/package
| +-build/linux-x86-64/package/bin
| +-build/linux-x86-64/package/bin/pf_hp
| +-[build/F_code-unix.tar]
+-config.log
+-pf_hp.ecf
+-project.py
leder@PC-LAP127:~/Source/Eiffel/PF_HP-mt$ tree -L 2 .
.
├── build
│   ├── F_code-unix.tar
│   ├── linux-x86-64
│   └── version.txt
├── config.log
├── EIFGENs
│   └── classic
├── git_push.sh
├── input.txt
├── LICENSE.gpl
├── LIESMICH.txt
├── pf_hp.ecf
├── pf_hp.ecf.old
├── pf_hp.pecf
├── project.py
├── project.pyc
├── README.txt
├── SConstruct
├── source
│   ├── application_root.e
│   ├── build_info.e
│   ├── folding
│   ├── notes
│   ├── sub-applications
│   └── testing
└── test.sh
9 directories, 17 files
leder@PC-LAP127:~/Source/Eiffel/PF_HP-mt$ less SConstruct
import eiffel_loop.eiffel.SConstruct
UPDATE 20170601, eiffel_loop.eiffel.SConstruct.py:
# author: "Finnian Reilly"
# copyright: "Copyright (c) 2001-2012 Finnian Reilly"
# contact: "finnian at eiffel hyphen loop dot com"
# license: "MIT license (See: en.wikipedia.org/wiki/MIT_License)"
# date: "3 June 2010"
# revision: "0.2"
import os, sys
from os import path

from eiffel_loop.eiffel import project
from eiffel_loop.scons import eiffel
from eiffel_loop.eiffel.ecf import EIFFEL_CONFIG_FILE
from eiffel_loop.eiffel.ecf import FREEZE_BUILD
from eiffel_loop.eiffel.ecf import C_CODE_TAR_BUILD
from eiffel_loop.eiffel.ecf import FINALIZED_BUILD

from SCons.Script import *

# SCRIPT START
arguments = Variables()
arguments.Add (EnumVariable('cpu', 'Set target cpu for compiler', 'x64', allowed_values=('x64', 'x86')))
arguments.Add (
    EnumVariable('action', 'Set build action', 'finalize',
        allowed_values=(
            Split ("freeze finalize finalize_and_test finalize_and_install install_resources make_installers")
        )
    )
)
arguments.Add (BoolVariable ('compile_eiffel', 'Compile Eiffel source (no implies C compile only)', 'yes'))
arguments.Add (BoolVariable ('install', 'Set to \'yes\' to install finalized release', 'no'))
arguments.Add (PathVariable ('project', 'Path to Eiffel configuration file', 'default.ecf'))

#arguments.Add (
#    ListVariable (
#        'MSC_options', 'Visual Studio setenv.cmd options', '', Split ("/Debug /Release /x86 /x64 /ia64 /vista /xp /2003 /2008 /win7")
#    )
#)

env = Environment (variables = arguments)
Help (arguments.GenerateHelpText (env) + '\nproject: Set to name of Eiffel project configuration file (*.ecf)\n')

if env.GetOption ('help'):
    None
else:
    is_windows_platform = sys.platform == 'win32'
    project_py = project.read_project_py ()

#   MSC_options = env.get ('MSC_options').data
#   if MSC_options:
#       project_py.MSC_options = MSC_options
#       print 'MSC_options:', project_py.MSC_options

    ecf_path = env.get ('project')
    action = env.get ('action')
    compile_eiffel = env.get ('compile_eiffel')
    project_py.set_build_environment (env.get ('cpu'))

    env.Append (ENV = os.environ, ISE_PLATFORM = os.environ ['ISE_PLATFORM'])
    if 'ISE_C_COMPILER' in os.environ:
        env.Append (ISE_C_COMPILER = os.environ ['ISE_C_COMPILER'])

    config = EIFFEL_CONFIG_FILE (ecf_path)
    project_files = [ecf_path, 'project.py']

    if action == 'install_resources':
        build = FREEZE_BUILD (config, project_py)
        build.post_compilation ()
    else:
        if action in ['finalize', 'make_installers']:
            tar_build = C_CODE_TAR_BUILD (config, project_py)
            build = FINALIZED_BUILD (config, project_py)
            if compile_eiffel:
                env.Append (EIFFEL_BUILD = tar_build)
                env.Append (BUILDERS = {'eiffel_compile' : Builder (action = eiffel.compile_eiffel)})
                f_code = env.eiffel_compile (tar_build.target (), project_files)
            else:
                f_code = None
        else:
            build = FREEZE_BUILD (config, project_py)
            f_code = None

        env.Append (C_BUILD = build)
        env.Append (BUILDERS = {'c_compile' : Builder (action = eiffel.compile_C_code)})
        if f_code:
            executable = env.c_compile (build.target (), tar_build.target ())
        else:
            executable = env.c_compile (build.target (), project_files)

        if build.precompile_path:
            env.Append (BUILDERS = {'precomp_copier' : Builder (action = eiffel.copy_precompile)})
            precompile_name = path.basename (build.precompile_path)
            precompile_dir = path.dirname (path.dirname (build.precompile_path))
            precomp_ecf = env.precomp_copier (build.precompile_path, path.join (precompile_dir, precompile_name))
            if f_code:
                Depends (tar_build.target (), build.precompile_path)
            else:
                Depends (executable, build.precompile_path)

        eiffel.check_C_libraries (env, build)

        if len (build.SConscripts) > 0:
            print "\nDepends on External libraries:"
            for script in build.SConscripts:
                print "\t" + script
            SConscript (build.SConscripts, exports='env')

        # only make library a dependency if it doesn't exist or object files are being cleaned out
        lib_dependencies = []
        for lib in build.scons_buildable_libs:
            if env.GetOption ('clean') or not path.exists (lib):
                if not lib in lib_dependencies:
                    lib_dependencies.append (lib)

        Depends (executable, lib_dependencies)

        productions = [executable, precomp_ecf]
        if f_code:
            productions.append (tar_build.target ())
        env.NoClean (productions)
If SCons does not know or cannot see which files have changed, an alternative is to run the EiffelStudio compiler every time. It performs a quick incremental recompilation in workbench mode, so you are not penalized by waiting for a recompilation from scratch.
Note: if you are not using a graphical environment, projects can be built with the slightly smaller and slightly faster ecb version of the compiler (instead of the regular ec). This comes at the cost of incompatibility with the IDE, so it is best suited to completely non-interactive compilation setups.
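If you stay with SCons driving the build, SCons can also be told to run the Eiffel step unconditionally and let the EiffelStudio compiler do its own incremental work. A minimal sketch with names taken from the project above; the ec invocation is an assumption and needs adjusting to your configuration:

from SCons.Script import *

env = Environment()
# hypothetical command; the real build uses the eiffel_compile builder shown earlier
f_code = env.Command('build/F_code-unix.tar', ['pf_hp.ecf', 'project.py'],
                     'ec -config $SOURCE -finalize -c_compile')
AlwaysBuild(f_code)  # re-run even when SCons reports "`.' is up to date."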

Play WSClient NoSuchMethodError with Spark 2.2-SNAPSHOT

I'm using a Play WsClient to send requests to a Spray server endpoint that fronts a Spark driver program. The problematic call is here:
def serializeDataset(requestUrl: String, recipe: Recipe): Future[(Option[String], String, Int)] = {
  ws.url(requestUrl).post(Json.toJson(recipe)).map { response =>
    val code = (response.json \ "code").as[Int]
    code match {
      case OK => ((response.json \ "uuid").asOpt[String], (response.json \ "schema").as[String], code)
      case _  => ((response.json \ "message").asOpt[String], "", code)
    }
  }
}
When executed, I get this error
Caused by: java.lang.NoSuchMethodError: io.netty.util.internal.PlatformDependent.newAtomicIntegerFieldUpdater(Ljava/lang/Class;Ljava/lang/String;)Ljava/util/concurrent/atomic/AtomicIntegerFieldUpdater;
at org.asynchttpclient.netty.NettyResponseFuture.<clinit>(NettyResponseFuture.java:52)
at org.asynchttpclient.netty.request.NettyRequestSender.newNettyResponseFuture(NettyRequestSender.java:311)
at org.asynchttpclient.netty.request.NettyRequestSender.newNettyRequestAndResponseFuture(NettyRequestSender.java:193)
at org.asynchttpclient.netty.request.NettyRequestSender.sendRequestWithCertainForceConnect(NettyRequestSender.java:129)
at org.asynchttpclient.netty.request.NettyRequestSender.sendRequest(NettyRequestSender.java:107)
at org.asynchttpclient.DefaultAsyncHttpClient.execute(DefaultAsyncHttpClient.java:216)
at org.asynchttpclient.DefaultAsyncHttpClient.executeRequest(DefaultAsyncHttpClient.java:184)
at play.api.libs.ws.ahc.AhcWSClient.executeRequest(AhcWS.scala:45)
at play.api.libs.ws.ahc.AhcWSRequest$.execute(AhcWS.scala:90)
at play.api.libs.ws.ahc.AhcWSRequest$$anon$2.execute(AhcWS.scala:166)
at play.api.libs.ws.ahc.AhcWSRequest.execute(AhcWS.scala:168)
at play.api.libs.ws.WSRequest$class.post(WS.scala:510)
at play.api.libs.ws.ahc.AhcWSRequest.post(AhcWS.scala:107)
at webservices.DataFrameService.serializeDataset(DataFrameService.scala:36)
It looks like the WSClient is picking up a version of Netty that doesn't include the relevant function.
This issue occurs when I compile the application with the 2.2-SNAPSHOT version of Spark, but not when I compile with the 2.1 version. I don't have an idea as to why this change would make a difference. The Spark driver program is a separate project in my sbt build.
My suspicion is that this has something to do with the packaging of the application and its dependencies. Here is what I have tried in sbt to rectify it:
Added an explicit dependency ("io.netty" % "netty-all" % "4.0.43.Final") to my dependencies
Added exclude statements to the spark imports like so:
"org.apache.spark" %% "spark-sql" % sparkV exclude("org.jboss.netty","netty") exclude("io.netty","netty")
"org.apache.spark" %% "spark-core" % sparkV exclude("org.jboss.netty","netty") exclude("io.netty","netty")
"org.apache.spark" %% "spark-mllib" % sparkV exclude("org.scalamacros", "quasiquotes") exclude("org.jboss.netty","netty") exclude("io.netty","netty")
"org.apache.spark" %% "spark-hive" % sparkV exclude("org.scalamacros", "quasiquotes") exclude("org.jboss.netty","netty") exclude("io.netty","netty")
Changed the order in which the play-ws module is added to the project dependencies (moved it to the end, moved it to the beginning)
Any help much appreciated.
On further review, I found that there was a lingering dependency on the Spark libraries within the Play project. I removed this and it seems to be working.
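For anyone hitting the same error: the symptom usually means two different Netty versions ended up on the classpath. While tracking down the stray dependency, sbt can also pin a single version; a sketch, with the version taken from the question rather than a universally correct choice:

// build.sbt: force every module to resolve the same Netty
dependencyOverrides += "io.netty" % "netty-all" % "4.0.43.Final"

sbt's evicted command can then help show which competing versions were being resolved.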

Play Framework 2.3 no resources after migration

I have a Play Framework 2.1 application. This app worked. Then I migrated to 2.2, tested it, and it worked. Now I am migrating to 2.3, and I get an error like:
[debug] application - Unforseen error for favicon.svg at /public
java.lang.RuntimeException: no resource
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1$$anonfun$13.apply(Assets.scala:237) ~[na:na]
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1$$anonfun$13.apply(Assets.scala:237) ~[na:na]
at scala.Option.getOrElse(Option.scala:120) [na:na]
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1.apply(Assets.scala:237) ~[na:na]
at controllers.Assets$$anonfun$controllers$Assets$$assetInfoFromResource$1.apply(Assets.scala:236) ~[na:na]
There is a /public folder, but all resources result in the above error. The app serves such resources as 404 Not Found.
Any help would be great: is there some cleaning process, cached files I can delete, dependencies I should re-download, or maybe I have a wrong configuration?
Here are some config files I have for better understanding:
build.sbt:
import com.typesafe.sbt.less.Import.LessKeys
import play.PlayJava
name := """blabla-de"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.1"
libraryDependencies ++= Seq(
  filters, cache, javaCore, javaWs, javaJdbc, javaEbean,
  "org.webjars" % "bootstrap" % "3.0.0",
  "org.webjars" % "rjs" % "2.1.11-1-trireme" % "test",
  "org.webjars" % "squirejs" % "0.1.0" % "test",
  "junit" % "junit" % "4.11" % "test"
)
testOptions += Tests.Argument(TestFrameworks.JUnit, "-v")
LessKeys.compress in Assets := true
includeFilter in (Assets, LessKeys.less) := "*.less"
excludeFilter in (Assets, LessKeys.less) := "_*.less"
pipelineStages := Seq(rjs, digest, gzip)
conf/routes:
# Home page
GET / controllers.Index.index()
GET /about controllers.About.index()
## Contact Page
GET /contact controllers.Contact.index()
POST /contact controllers.Contact.newContact()
## Gallery List
GET /portfolio controllers.Portfolio.index()
## Text(HTML) Page
GET /impressum controllers.Impressum.index()
#GET /legal
GET /privacy controllers.Privacy.index()
# Map static resources from the /public folder to the / URL path
GET /*file controllers.Assets.at(path="/public", file)
project/build.properties:
sbt.version=0.13.5
project/plugins.sbt:
// Comment to get more information during initialization
logLevel := Level.Warn
// The Typesafe repository
resolvers += "Typesafe repository" at "http://repo.typesafe.com/typesafe/releases/"
// The Play plugin
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.1")
// web plugins
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.3.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-gzip" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.0.0")
Thank You in Advance.
The key is in the migration guide: https://www.playframework.com/documentation/2.4.x/Migration23
The largest new feature for Play 2.3 is the introduction of sbt-web.
In summary sbt-web allows Html, CSS and JavaScript functionality to be
factored out of Play’s core into a family of pure sbt plugins.
Unavailability of public resources must be caused by improper configuration of sbt-web.
What I can see immediately is that you forgot to enable SbtWeb plugins. Here is another quote from the migration guide:
declaring addSbtPlugin may not be sufficient for plugins that now
utilize the auto plugin functionality.
So you need to fix the following line in your build.sbt:
lazy val root = (project in file(".")).enablePlugins(PlayJava, SbtWeb)

How to avoid having version numbers in .so file name

I'm trying to build a dynamic library on Linux using qmake. Here is my .pro file:
TEMPLATE = lib
TARGET = sqxUiBase
QT += core gui
CONFIG += dll
INCLUDEPATH += ../../public/include
DEPENDPATH += .
UI_DIR += ../GeneratedFiles
RCC_DIR += ../GeneratedFiles
CONFIG(release, debug|release) {
DESTDIR = ../lib/release
LIBS += -L"../lib/release"
MOC_DIR += ../GeneratedFiles/release
OBJECTS_DIR += release
} else {
DESTDIR = ../lib/debug
LIBS += -L"../lib/debug"
MOC_DIR += ../GeneratedFiles/debug
OBJECTS_DIR += debug
}
include(sqxUiBase.pri)
The sqxUiBase.pri file contains the list of files that need to be built.
Now, the problem is that whatever I do, the resulting file is always named sqxUiBase.so.1.0.0, with a bunch of symlinks (sqxUiBase.so, sqxUiBase.so.1 and sqxUiBase.so.1.0) pointing to it. How can I make it so that there's only a sqxUiBase.so file and no links?
What you are looking for is making a plugin.
Add CONFIG += plugin to your project file, and qmake will generate a Makefile that builds a libFoo.so file without the numbered links.
After looking at the qmake source, I found CONFIG += unversioned_libname for *nix and CONFIG += skip_target_version_ext for Windows.
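Putting the two answers together for the .pro file in the question, a sketch (which option applies depends on your qmake version, so treat the comments as assumptions):

# Option 1: build as a plugin; no version suffix and no symlinks
CONFIG += plugin

# Option 2: keep a plain shared-library build but skip the version bookkeeping
unix:CONFIG += unversioned_libname
win32:CONFIG += skip_target_version_ext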
