Is there an sbt-native-packager option equivalent to sbt-pack's packMain?

I have a Scala application library with several main classes.
When I run sbt stage, sbt correctly creates one bash script per main class, but with pre-defined names (derived from each class name).
I would like to control the name of each bash script and the JVM options passed to it.
For example, given two main classes FooBar and BarFoo, I get bin/foo-bar and bin/bar-foo respectively.
I would like to somehow pass maps like
mainClasses := Map(
  "newFooBar" -> "com.example.FooBar",
  "newBarFoo" -> "com.example.BarFoo"
)
mainClassesJVM := Map(
  "newFooBar" -> "-Xmx512m",
  "newBarFoo" -> "-Xmx2g"
)
I found an sbt plugin called sbt-pack that does what I'm trying to achieve, but I was wondering whether I could achieve the same with only the sbt-native-packager plugin.
Example using the sbt-pack plugin:
// [Optional] Specify mappings from program name -> main class (full package path).
// If no value is set, main classes are found automatically.
packMain := Map(
  "newFooBar" -> "com.example.FooBar",
  "newBarFoo" -> "com.example.BarFoo"
)

// [Optional] JVM options per script (program name -> Seq(JVM option, ...))
packJvmOpts := Map(
  "newFooBar" -> Seq("-Xmx512m"),
  "newBarFoo" -> Seq("-Xmx2g")
)
Does anyone know if there is an option to achieve the above using only sbt-native-packager?
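One hedged partial workaround, rather than a built-in equivalent: assuming the JavaAppPackaging archetype is enabled, the generated scripts can be renamed by rewriting the Universal mappings in build.sbt. The scriptRenames map below is hypothetical, and this sketch does not cover per-script JVM options:
// build.sbt - sketch only; scriptRenames is a hypothetical name mapping
val scriptRenames = Map("foo-bar" -> "newFooBar", "bar-foo" -> "newBarFoo")

mappings in Universal := (mappings in Universal).value.map {
  case (file, name) if name.startsWith("bin/") =>
    val script = name.stripPrefix("bin/")
    (file, "bin/" + scriptRenames.getOrElse(script, script))
  case other => other
}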

Related

Groovy: how to use inner Enum of class as parameter type outside of class

Given is a class EnumTest that declares an inner enum MyEnum.
Using MyEnum as parameter type from within the class works as expected.
Using MyEnum as a parameter type outside of EnumTest fails to compile with "unable to resolve class test.EnumTest.MyEnum".
I've browsed related questions, of which the best one was this, but they didn't address the specific issue of using the enum as a type.
Am I missing something very obvious here (as I'm very new to Groovy)? Or is this just another of the language's quirks "enhancements" regarding enums?
Edit: This is just a test demonstrating the issue. The actual issue happens in Jenkins JobDSL, and classpaths and imports seem to be fine there otherwise.
Groovy Version: 2.4.8
JVM: 1.8.0_201
Vendor: Oracle Corporation
OS: Linux
$ tree test
test
├── EnumTest.groovy
├── File2.groovy
└── File3.groovy
EnumTest.groovy:
package test
public class EnumTest {
    public static enum MyEnum {
        FOO, BAR
    }

    def doStuff(MyEnum v) {
        println v
    }
}
File2.groovy:
package test
import test.EnumTest
// prints BAR
new EnumTest().doStuff(EnumTest.MyEnum.BAR)
// prints FOO
println EnumTest.MyEnum.FOO
File3.groovy:
package test
import test.EnumTest
// fails: unable to resolve class test.EnumTest.MyEnum
def thisShouldWorkIMHO(EnumTest.MyEnum v) {
    println v
}
When I'm running the test files using groovy -cp %, the output is as follows:
# groovy -cp . File2.groovy
BAR
FOO
# groovy -cp . File3.groovy
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
/home/lwille/-/test/GroovyTest2.groovy: 6: unable to resolve class EnumTest.MyEnum
@ line 6, column 24.
def thisShouldWorkIMHO(EnumTest.MyEnum v) {
^
1 error
A few things are worth mentioning. First, you don't need to import classes from the same package. Secondly, when you use a package test, you need to execute Groovy from the root folder, e.g. groovy test/File3.groovy, to properly set up the classpath. (There is no need to use -cp . in that case.)
Here's what it should look like.
$ tree test
test
├── EnumTest.groovy
├── File2.groovy
└── File3.groovy
0 directories, 3 files
test/EnumTest.groovy
package test
public class EnumTest {
    public static enum MyEnum {
        FOO, BAR
    }

    def doStuff(MyEnum v) {
        println v
    }
}
test/File2.groovy
package test
// prints BAR
new EnumTest().doStuff(EnumTest.MyEnum.BAR)
// prints FOO
println EnumTest.MyEnum.FOO
test/File3.groovy
package test
// works when executed from the root folder (see below)
def thisShouldWorkIMHO(EnumTest.MyEnum v) {
    println v
}
thisShouldWorkIMHO(EnumTest.MyEnum.BAR)
The console output:
$ groovy test/File2.groovy
BAR
FOO
$ groovy test/File3.groovy
BAR
However, if you want to execute the script from inside the test folder, then you need to specify a classpath that points to the parent folder, e.g.:
$ groovy -cp ../. File3.groovy
BAR
$ groovy -cp . File3.groovy
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
/home/wololock/workspace/groovy-sandbox/src/test/File3.groovy: 4: unable to resolve class EnumTest.MyEnum
@ line 4, column 24.
def thisShouldWorkIMHO(EnumTest.MyEnum v) {
^
1 error
UPDATE: the difference between Groovy 2.4 and 2.5
One thing worth mentioning: the above solution works for Groovy 2.5.x and above. It is important to understand that things like method parameter type checks happen during the compiler's Phase.SEMANTIC_ANALYSIS phase. In Groovy 2.4, class resolution during semantic analysis happens without loading classes, and when an inner class is used, it is critical to load its outer class so the inner class can be resolved. Groovy 2.5 fixed that problem (intentionally or not), and its semantic analysis resolves inner classes without the issue mentioned in this question.
For a more detailed analysis, please check the following Stack Overflow question: GroovyScriptEngine throws MultipleCompilationErrorsException while loading class that uses other class' static inner class, where I investigated a similar issue in a Groovy 2.4 script and explained step by step how to dig down to the root of this problem.
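For anyone stuck on Groovy 2.4, a possible workaround implied by this class-loading behavior (an untested sketch, not part of the original answer) is to pre-compile the outer class so the inner enum can be resolved from the compiled .class files:
$ groovyc -d . test/EnumTest.groovy
$ groovy -cp . test/File3.groovy
The -d flag writes EnumTest.class and EnumTest$MyEnum.class under ./test (because of the package declaration), where the groovy command can then load them during semantic analysis.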

get location of script requiring current script

I need to do some file operations with paths relative to the script that required the current one.
Say we have the following in ~/somewhere/file2.js
const y = require('~/file1.js');
And in ~/file1.js we have:
const x = require('./other/script.js'); //relative to ~/file1.js
And we invoke it like this:
cd ~/somedir
node ~/somewhere/file2.js
then within ~/other/script.js we can do this:
console.log(__dirname); // -> ~/other
console.log(__filename); // -> ~/other/script.js
console.log(process.cwd()); // -> ~/somedir
console.log(process.argv[0]); // -> path/to/node
console.log(path.resolve('.')); // -> ~/somedir
console.log(process.argv[1]); // -> ~/somewhere/file2.js
None of these are the path I need.
How, from ~/other/script.js, can I determine the location of the script that required us, i.e. ~/file1.js?
To put it another way.
~/somewhere/file2.js requires ~/file1.js
and
~/file1.js requires ~/other/script.js
from within ~/other/script.js I need to do file operations relative to ~/file1.js - how can I get its location?
I actually only need the directory in which file1.js sits, so either the filename or the directory will work for me.
You can use module.parent.filename inside other/script.js, or you can pass __dirname as a parameter to your module, e.g. require('./other/script.js')(__dirname) (given that your module exports a function).
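A minimal sketch of the first approach (note that module.parent refers to the module that first required this file; Node caches modules, so later requires from other files won't change it):
// other/script.js
const path = require('path');

// module.parent is the module that first required this file, i.e. ~/file1.js
const callerDir = path.dirname(module.parent.filename);
console.log(callerDir); // -> the directory containing file1.js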

Library builder not triggering on modified file in SCons

I have to build a target using a two steps compilation.
The first step: .c -> .asm
The second step: .asm -> .o
I am creating a library from some .o files.
My implementation is the following:
The first step:
c_to_asm_builder = SCons.Builder.Builder(action = SCons.Defaults.CAction,
                                         emitter = {},
                                         suffix = '.asm',
                                         src_suffix = ['.c', '.cpp'],
                                         src_builder = '',
                                         source_scanner = SCons.Tool.CScanner)
env['BUILDERS']['CTOASM'] = c_to_asm_builder
The second step:
suffixesASM = ['.asm', '.s']
static_obj, shared_obj = SCons.Tool.createObjBuilders(env)
for suffix in suffixesASM:
    static_obj.add_action(suffix, SCons.Defaults.ASAction)
I am then calling the builders as follows:
env.CTOASM(['file1.c', 'file2.c', 'file3.c'], CFLAGS = '-flag')
env.Object(['file1.asm', 'file2.asm', 'file3.asm'], ASFLAGS = '-flag')
I am creating a library like this:
env.Library('name', ['file1.o', 'file2.o'])
Everything works fine for the compilation.
The problem appears when I change file1.c's content. I expect file1.c to pass through these steps:
file1.c -> file1.asm -> file1.o
and then the name.a library to be recreated.
What happens:
Only c_to_asm_builder is retriggered by the change (file1.c -> file1.asm). The Object builder (file1.asm -> file1.o) is not retriggered, and neither are the Library and Program builders.
I don't know what I am missing. I know that for a single-step compilation configured in another project, the Object and Library builders are somehow aware of each other.
How can I make the Library and Program builders aware of the Object and CTOASM builders?
You are not specifying an emitter, so either write one, or explicitly list your expected targets...
asm = env.CTOASM(['file1.asm', 'file2.asm', 'file3.asm'],
                 ['file1.c', 'file2.c', 'file3.c'], CFLAGS = '-flag')
obj = env.Object(['file1.o', 'file2.o', 'file3.o'], asm, ASFLAGS = '-flag')
lib = env.Library('name', obj)
lib = env.Library('name', obj)
Here is a good reference on how to add an emitter to your builder.
https://bitbucket.org/scons/scons/wiki/ToolsForFools
Just scroll down to "Using Emitters".
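For the emitter route, a sketch of the pattern (illustrative only, not tested against this exact build):
import os
import SCons.Builder
import SCons.Defaults
import SCons.Tool

# Emitter: declare one .asm target per .c/.cpp source, so SCons records the
# target nodes and downstream builders can chain off them automatically.
def c_to_asm_emitter(target, source, env):
    target = [os.path.splitext(str(s))[0] + '.asm' for s in source]
    return target, source

c_to_asm_builder = SCons.Builder.Builder(action = SCons.Defaults.CAction,
                                         emitter = c_to_asm_emitter,
                                         suffix = '.asm',
                                         src_suffix = ['.c', '.cpp'],
                                         source_scanner = SCons.Tool.CScanner)
env['BUILDERS']['CTOASM'] = c_to_asm_builder

# Chain the returned nodes so each step's dependencies are tracked end to end:
asm = env.CTOASM(['file1.c', 'file2.c', 'file3.c'])
obj = env.Object(asm)
lib = env.Library('name', obj)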

Passing script results to main program in Scala 2.11 ScriptEngine

Using the Scala scripting engine in 2.11 Milestone 7, how do I get a typed value back from the script engine? I'm getting error messages like "mypackage.Holler cannot be cast to mypackage.Holler".
Here is the use case (reduced to essentials). I want to use scripts to prepare and configure objects of a standard type that I will process in my main program. I have a trait:
package mypackage
trait Holler {
  def shout: Unit
}
I have a user script in Scala, saved in the file /home/me/Foo.scala
object Foo extends mypackage.Holler {
  def shout: Unit = println("Hello World!")
}
When I run this script using IMain.eval(Reader), I expect the object Foo to be returned, since it is the result of the last statement. Here is a program, including a couple of useful printouts, that runs the script:
package mypackage
import javax.script.ScriptEngineManager
import scala.tools.nsc.interpreter.IMain
object Runner {
  def main(args: Array[String]): Unit = {
    // Create the script engine
    val javaxEngine = new ScriptEngineManager().getEngineByName("scala")
    val scalaEngine = javaxEngine.asInstanceOf[IMain]

    // Configure the script engine to use the Java classpath
    val useJavaClassPath = scalaEngine.settings.usejavacp.tryToSet(List("true"))
    println("Use Java CP? " + useJavaClassPath)

    val script = new java.io.FileReader("/home/me/Foo.scala")
    val result = scalaEngine.eval(script)
    println("Script Result Type: " + result.getClass.getName)
    println("Defined Symbols: " + scalaEngine.definedSymbolList)

    val myHoller = result.asInstanceOf[mypackage.Holler]
  }
}
The script runs just fine under the script engine. But the result cannot be cast to Holler. The output of this program is as follows:
Use Java CP? Some(List(true))
Script Result Type: $line3.$read$$iw$$iw$Foo$
Defined Symbols: List(value engine, object iw$Foo)
Exception in thread "main" java.lang.ClassCastException: $line3.$read$$iw$$iw$Foo$ cannot be cast to mypackage.Holler
This tells me that the classpath is successfully recognized by the script engine, and that the Foo object is being constructed. But the trait mypackage.Holler (from the common classpath) inside the script is different from the trait mypackage.Holler in the main program.
If I add the following lines to the script:
Foo.shout
val result: Holler = Foo
I see that the shout method is exercised ("Hello World!" prints out), that "result" is added to the list of defined symbols, and that result is clearly compatible with type Holler.
I can bind a "catcher" object to the script engine. Its code looks like this:
package mypackage
class Catcher {
  var item: Holler = null
}
And I bind with
val mycatcher = new Catcher
scalaEngine.bind("catcher", mycatcher)
scalaEngine.eval("catcher = Foo")
Now "catcher" shows up in the list of defined symbols to the script engine and I can use the catcher to go into the script engine with a command like
scalaScriptEngine.eval("catcher.item = result")
But then I get strange "compile time" ClassCastExceptions saying:
mypackage.Holler cannot be cast to mypackage.Holler
If I make the "item" in the Catcher an Any, then I don't get the exception until I do
mycatcher.item.asInstanceOf[Holler]
in the main program. But I still get pretty much the same exception. It is as if two incompatible class loaders are being used with the same classpath. So how, from the main program, do I access the Foo object as an instance of Holler (which it clearly implements in the script engine)?
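One avenue that may help here (a hedged sketch based on the compiler settings' support for embedded use; not verified against 2.11 Milestone 7, and settings generally need to be adjusted before the first eval): point the engine at the host's classloader with embeddedDefaults, so scripts compile against the same mypackage.Holler the main program loaded.
// before any eval: align the interpreter's classloader context with the host's
scalaEngine.settings.embeddedDefaults[mypackage.Holler]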

How to make interface and implementation files separately?

I would like to write interface (class or instance) and implementation files in Haskell separately, as follows:
File 1 (interface):
class X where
  funcX1 = doFuncX1
  funcX2 = doFuncX2
  ...

instance Y where
  funcY1 = doFuncY1
  funcY2 = doFuncY2
  ...
File 2 (implementation):
doFuncX1 = ...
doFuncX2 = ...
doFuncY1 = ...
...
How can I do that, when file1 must be imported in file2 and vice versa?
You don't need any such cumbersome separation in Haskell. Just mark only what you want to be public in the module export list (module Foo ( X(..) ... ) where ...) and build your project with cabal. If you want to export a library but not release the source code, you can simply publish only the dist folder, with the binary interface files and the Haddock documentation. That's much more convenient than nasty .h and .cpp files that need to be kept in sync manually.
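For instance, a minimal sketch of that export-list approach (all names below are illustrative, not from the question):
module Foo (X(..), Bar(..)) where

data Bar = Bar { barName :: String }

class X a where
  funcX1 :: a -> String

instance X Bar where
  funcX1 = implFuncX1

-- not exported, hence invisible to importers of Foo:
implFuncX1 :: Bar -> String
implFuncX1 = barName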
But of course, nothing prevents you from putting implementations in a separate, non-public file. You just don't need "vice versa" imports for this, only perhaps a common file with the necessary data type declarations. E.g.
Public.hs:
module Public(module Public.Datatypes) where
import Public.Datatypes
import Private.Implementations
instance X Bar where { funcX1 = implFuncX1; ... }
Public/Datatypes.hs:
module Public.Datatypes where
data Bar = Bar { ... }
class X bar where { funcX1 :: ... }
Private/Implementations.hs:
module Private.Implementations(implFuncX1, ...) where
import Public.Datatypes
implFuncX1 :: ...
implFuncX1 = ...
But usually it would be better to simply put everything in Public.hs.
