I know that since Groovy 2.0 there are annotations for static compilation.
However, it's easy to omit such an annotation by accident and still run into trouble.
Is there any way to achieve the opposite compiler behaviour: compile all project files statically by default, and compile dynamically only the files chosen on purpose, with some kind of @CompileDynamic annotation for example?
I have found a (I believe recently introduced) feature which allows doing so with Gradle.
In the build.gradle file for the project containing the Groovy sources we need to add the following lines:
compileGroovy {
    configure(groovyOptions) {
        configurationScript = file("$rootDir/config/groovy/compiler-config.groovy")
    }
}
or compileTestGroovy { ... to apply the same to test sources, as sketched below. Keep in mind that neither static compilation nor type checking works well with the Spock Framework, though. Spock by its nature makes heavy use of Groovy's dynamic features.
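For test sources the equivalent block would simply be (same configuration script as above):

compileTestGroovy {
    configure(groovyOptions) {
        configurationScript = file("$rootDir/config/groovy/compiler-config.groovy")
    }
}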
Then in the root of the project create the folder config/groovy/ and a file named compiler-config.groovy within it. The content of the file is as follows:
import groovy.transform.CompileStatic

withConfig(configuration) {
    ast(CompileStatic)
}
Obviously the path and name of the configurationScript may vary; it's up to you. It shouldn't go into src/main/groovy itself though, as that would mix totally separate concerns.
The same may be done with groovy.transform.TypeChecked or any other annotation, of course.
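For example, a type-checking-only variant of the configuration script would be:

import groovy.transform.TypeChecked

withConfig(configuration) {
    ast(TypeChecked)
}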
To reverse the applied behaviour on certain classes or methods, the @CompileDynamic annotation or @TypeChecked(TypeCheckingMode.SKIP), respectively, may be used, as in the sketch below.
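For instance, a class opted back into dynamic compilation might look like this (class name is made up):

import groovy.transform.CompileDynamic

@CompileDynamic // skips the project-wide static compilation for this class
class DynamicHelper {
    def greet(name) {
        "Hello, $name"
    }
}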
I'm not sure how to achieve the same when Gradle is not in use as the build tool. I may update this answer with such info in the future though.
Not at this time, but there is an open Jira issue here you can follow to watch progress on this feature.
There was also a discussion about methods for doing this on the Groovy developers list.
I am getting the error:
"Multiple dex files define Lcom/android/volley/toolbox/Volley;"
Is it possible to use two Volley libraries, i.e. compile com.android.volley:volley:1.0.0 and dev.dworks.libs:volleyplus:+, in a single project?
If these two Volley libraries have different package names, it is fine to have both as dependencies.
But what is the added value of using two (same? or similar?) libraries? Why not refactor your project to slim down your dependencies?
The solution to multiple dex definitions for a particular package and class is adding a proper packaging option, e.g.:
packagingOptions {
    pickFirst "anyFileWillDo"
    exclude "/secret-data/**"
}
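A concrete (hypothetical) variant for this case might be:

android {
    packagingOptions {
        // path is illustrative; match the actual duplicated file reported by your build
        pickFirst 'com/android/volley/toolbox/Volley.class'
    }
}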
In my opinion, having a duplicate library dependency is really not good practice; it may introduce uncertainty into the final binary. The pickFirst option is not deterministic about choosing the class from the right version of the library: it will only pick up the one it sees first.
Please see here: https://google.github.io/android-gradle-dsl/current/com.android.build.gradle.internal.dsl.PackagingOptions.html
I am attempting to convert parts of an Android app to iOS using Doppl, and I am getting a strange result: Doppl keeps trying to pull in android.arch.lifecycle:reactivestreams, even though I don't want it to.
Specifically, in app/build/j2objcSrcGenMain/android/arch/lifecycle/, there is a reactivestreams/ subdirectory with R.h and R.m files in it. This seems to make Xcode cranky and may explain why I had some oddities with pod install.
My app/build.gradle has compile "android.arch.lifecycle:reactivestreams:$archVer", because my activity is using LiveDataReactiveStreams.fromPublisher(). However:
The activity is not in the translatePattern (and since its code is not showing up in app/build/j2objcSrcGenMain/, I have to assume that the translatePattern is fine)
I do not have a doppl statement related to reactivestreams, because there does not appear to be a Doppl conversion of this library (nor should it be needed here)
AFAIK, nowhere else in this app am I referring to LiveDataReactiveStreams, which AFAIK is the one-and-only public class from the reactivestreams library
So, the questions:
What determines whether Doppl creates R.h and R.m files for some dependency? It's not the existence of a doppl statement, as I have doppl statements for a lot of other dependencies (RxJava, RxAndroid, Retrofit) and those do not get R.h and R.m files. It's not whether the dependency is referenced from generated code, as my repository definitely uses RxJava and Retrofit, yet there are no R files for those.
How can I figure out why Doppl generates R.h and R.m for reactivestreams?
Once I get this cleared up... do I re-run pod install, or is there some other pod command to refresh an existing pod with a new implementation?
Look into 'app/build/generated/source/r/debug' and confirm there's an R.java being created for the Architecture Components. It'll be under 'android/arch/lifecycle/reactivestreams'.
I think there are 2 problems here.
Problem 1
Somehow Doppl/J2objc is of the opinion that this file should be transpiled. It could be either that 'translatePattern' matches it, or that something in the shared code references it. If you can't figure out which, please post a comment and I'll try to help (or post in the Slack group).
Problem 2
Regardless of why that 'R.java' is being sucked into the translate step, because of how stock J2objc is configured, the code is being generated with package folders instead of creating One Big Name. That generated file should be called 'AndroidArchLifecycleReactivestreamsR.h' (and 'AndroidArchLifecycleReactivestreamsR.m'). Xcode really doesn't like package folders. That's why there's a slightly custom J2objc being used with Doppl, so we can have files with big names instead of folders.
In cases where you intentionally use package names that match with what J2objc considers to be "system" classes, you need to provide a header mapping file to force long names. The 'androidbase' doppl library needs to add a lot of files that are in the 'android' package, which J2objc considers "system". We override those names in the mapping file.
build.gradle
https://github.com/doppllib/core-doppl/blob/master/androidbase/build.gradle#L19
mapping file
https://github.com/doppllib/core-doppl/blob/master/androidbase/src/main/java/androidbase.mappings
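For illustration, an entry in such a mapping file would look roughly like this (the entry below is hypothetical; see the linked androidbase.mappings for the real syntax):

android.arch.lifecycle.reactivestreams.R=AndroidArchLifecycleReactivestreamsR.h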
I screwed up.
In my dopplConfig, I have:
translatePattern {
    include '**/api/**'
    include '**/arch/**'
    include '**/RepositoryTest.java'
}
In this case, **/arch/** not only matches my arch package, but also the arch package from the Architecture Components.
Ordinarily, this would not matter, because the Architecture Components source code is not in my project. But R.java gets generated, due to resources, and the translatePattern includes generated source code in addition to lovingly hand-crafted source code. So that's where my extraneous Objective-C was coming from.
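A more specific pattern avoids matching the generated Architecture Components code. Assuming the app's own classes live under something like com/example/app (hypothetical), the fix could be:

translatePattern {
    include '**/api/**'
    include '**/com/example/app/arch/**'
    include '**/RepositoryTest.java'
}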
Many thanks to Kevin Galligan for his assistance with this, out on the #newbiehelp Doppl Slack channel!
I am writing Haxe code which I want to compile to an arbitrary target as a module and then use the results from another module compiled for this same target. I don't want to handle this the “Haxe way” (which is to fully inline all libraries at compile time). Instead I want to be able to write distinct Haxe modules and reference them with full type safety, without inlining between the modules. The natural way to do this would be to have both source Haxe files and a separate directory of “headers” filled with externs describing the public API of my module, with these externs somehow automatically generated so that they don't need to be manually maintained.
I cannot figure out how to get Haxe to emit externs. It would make sense to me if haxe-externs were an actual “target platform” so that I could do something like:
$ haxe ClassName -hxe externsoutdir
It would make less sense but still be acceptable if one of the -D flags like -D dump (which seems to sort of get one part of the way there) or some imaginary, nonexistent -D dump-externs existed. Then you could generate externs while compiling to your favorite target:
$ haxe ClassName -js outfile.js -D shallow-expose -D dump-externs=externsoutdir
The idea is to take a class definition like this:
@:expose
class ClassName {
    function quack() {
        trace('quack');
    }
}
and emit something like this in a separate directory:
extern class ClassName {
    function quack():Void;
}
so that I can consume it from another module like this:
@:expose
class MyClassName extends ClassName {
    override function quack() {
        super.quack();
        trace('…and again I say “quack”');
    }
}
$ haxe -cp path\to\externsoutdir MyClassName -js outfile.js -D shallow-expose
It would only make sense to generate externs for things decorated with @:expose or some other decorator.
I will figure out how to wrap the emitted modules to load each other correctly. That’s easy. The hard part is generating the extern definitions—shouldn’t Haxe already have a way to do this?
Is there already some tool or built-in way I’m missing to do this? When Googling, all I see are projects that supposedly help with generating externs for existing JavaScript libraries. But that’s not my use case…
Update: --gen-hx-classes was removed sometime around Haxe 4.0.0-rc3. Apparently the functionality still exists secretly as -D gen-hx-classes, but beware: if you rely on this, it seems like it's going away.
I believe --gen-hx-classes option might be what you're looking for. Oddly I don't see it in the compiler flags list.
I use it in a modular JavaScript build system that is similar to what you're talking about.
I believe it creates a directory of .hx files that are externs for every class generated by the build (including those from the Haxe standard library). Actually, getting duplicates of the classes in the standard library may be a problem you will face.
You may also need to use @:keep (or the related macro) to ensure dead code elimination doesn't remove things the other build will need.
You might also need to exclude a class from one or the other builds, e.g. --macro 'exclude("haxe.io.Input")' (or excludeFile, which is more performant for a whole list of exclusions).
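Putting these pieces together, a hypothetical hxml for one module might look like this (module and path names are made up, and the flag forms are the Haxe 3.x ones, per the update above):

# module1.hxml -- hypothetical build for one module (Haxe 3.x flags)
-cp src
-js out/module1.js
-D shallow-expose
# generate .hx "headers" for the classes in this build (output location varies by version)
--gen-hx-classes
# exclude classes the other module already provides
--macro exclude('haxe.io.Input')
Module1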
I'd like to extend String's asType method to handle LocalDateTime. I know how to override this method; however, I've no idea where I should put it in the project structure for it to work globally, for all strings in my project. Is it enough to put such an extension anywhere on the classpath? I know that there's a special convention for extensions (META-INF/services); how does that work for method overriding?
All documentation regarding this topic can be found here. And the exact relevant part can be found here.
Module extension and module descriptor
For Groovy to be able to load your extension methods, you must declare your extension helper classes. You must create a file named org.codehaus.groovy.runtime.ExtensionModule into the META-INF/services directory:

moduleName=Test module for specifications
moduleVersion=1.0-test
extensionClasses=support.MaxRetriesExtension
staticExtensionClasses=support.StaticStringExtension

The module descriptor requires 4 keys:
moduleName: the name of your module
moduleVersion: the version of your module. Note that the version number is only used to check that you don't load the same module in two different versions.
extensionClasses: the list of extension helper classes for instance methods. You can provide several classes, given that they are comma separated.
staticExtensionClasses: the list of extension helper classes for static methods. You can provide several classes, given that they are comma separated.
Note that it is not required for a module to define both static helpers and instance helpers, and that you may add several classes to a single module. You can also extend different classes in a single module without problem. It is even possible to use different classes in a single extension class, but it is recommended to group extension methods into classes by feature set.
Module extension and classpath
It’s worth noting that you can’t use an extension which is compiled at
the same time as code using it. That means that to use an extension,
it has to be available on classpath, as compiled classes, before the
code using it gets compiled. Usually, this means that you can’t have
the test classes in the same source unit as the extension class
itself. Since in general, test sources are separated from normal
sources and executed in another step of the build, this is not an
issue.
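Applied to the original question, a minimal sketch could look like this (package and class names are made up; note that the docs above describe adding new extension methods, so whether an extension module can shadow an already-existing method like asType should be verified for your Groovy version):

// src/main/groovy/myextensions/StringExtension.groovy
package myextensions

import java.time.LocalDateTime
import java.time.format.DateTimeFormatter
import org.codehaus.groovy.runtime.DefaultGroovyMethods

class StringExtension {
    // instance extension method: the first parameter is the receiver (the String)
    static Object asType(String self, Class clazz) {
        if (clazz == LocalDateTime) {
            return LocalDateTime.parse(self, DateTimeFormatter.ISO_LOCAL_DATE_TIME)
        }
        // delegate every other coercion to Groovy's default behaviour
        return DefaultGroovyMethods.asType(self, clazz)
    }
}

with the matching descriptor in META-INF/services/org.codehaus.groovy.runtime.ExtensionModule:

moduleName=string-astype-module
moduleVersion=1.0
extensionClasses=myextensions.StringExtension

Once this is compiled and on the classpath before the consuming code (per the classpath note above), '2000-01-01T10:00:00' as LocalDateTime should dispatch to the extension.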
I've been tasked with creating conformance tests of user input. The task is fairly tricky and we need very high levels of reliability. The server runs on PHP, the client runs on JS, and I thought Haxe might reduce duplicative work.
However, I'm having trouble with dead code removal. Since I am just creating helper functions (utilObject.isMeaningOfLife(42)), I don't have a main program that calls each one. I tried adding @:keep to a utility class, but it was cut out anyway.
I tried to specify that utility class through the -main switch, but I had to add a dummy main() method, and this doesn't scale beyond that single class.
You can force the inclusion of all the files defined in a given package and its sub-packages in the build using a compiler argument:
haxe --macro "include('my.package')" ...
This is a shortcut to the haxe.macro.Compiler.include function.
As you can see, the signature of this function allows you to include recursively and also to exclude packages:
static include (pack:String, rec:Bool = true, ?ignore:Array<String>, ?classPaths:Array<String>):Void
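For example, using the ignore parameter to skip a sub-package might look like this (package names are hypothetical):

haxe --macro "include('my.package', true, ['my.package.internal'])" ...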
I think you don't have to use @:keep for each library class in that case.
I'm not sure if this is what you are looking for; I hope it helps.
Otherwise, these could be helpful checks:
Is it bad that the code is cut away if you don't use it?
Could it be that some code is inlined in the final output?
Compile your code using the compiler flag -dce std, as mentioned in the comments.
If you use the static analyzer, try building without it.
Add @:keep and reference the class+function somewhere.
Otherwise, provide a minimal setup if you can reproduce the issue.