Just starting a new Gradle project.
This test passes:
def 'Launcher.main should call App.launch'() {
    given:
    GroovyMock(Application, global: true)

    when:
    Launcher.main()

    then:
    1 * Application.launch(App, null) >> null
}
... until, to get another test using a (Java) Mock to work, I have to add these dependencies:
testImplementation 'net.bytebuddy:byte-buddy:1.10.8'
testImplementation 'org.objenesis:objenesis:3.1'
(NB I assume these versions are OK for Groovy 3.+, which I'm now using ... both are the most up-to-date available at Maven Repo).
With these dependencies the above test fails:
java.lang.InstantiationError: javafx.application.Application
at org.objenesis.instantiator.sun.SunReflectionFactoryInstantiator.newInstance(SunReflectionFactoryInstantiator.java:48)
at org.objenesis.ObjenesisBase.newInstance(ObjenesisBase.java:73)
at org.objenesis.ObjenesisHelper.newInstance(ObjenesisHelper.java:44)
at org.spockframework.mock.runtime.MockInstantiator$ObjenesisInstantiator.instantiate(MockInstantiator.java:45)
at org.spockframework.mock.runtime.MockInstantiator.instantiate(MockInstantiator.java:31)
at org.spockframework.mock.runtime.GroovyMockFactory.create(GroovyMockFactory.java:57)
at org.spockframework.mock.runtime.CompositeMockFactory.create(CompositeMockFactory.java:42)
at org.spockframework.lang.SpecInternals.createMock(SpecInternals.java:47)
at org.spockframework.lang.SpecInternals.createMockImpl(SpecInternals.java:298)
at org.spockframework.lang.SpecInternals.createMockImpl(SpecInternals.java:288)
at org.spockframework.lang.SpecInternals.GroovyMockImpl(SpecInternals.java:215)
at core.AppSpec.Launcher.main should call App.launch(first_tests.groovy:30)
I confess that I have only the sketchiest notion of what "bytebuddy" and "objenesis" actually do, although I assume it is fiendishly clever. Edit: having just visited their respective home pages my notion is now slightly less sketchy, and yes, it is fiendishly clever.
If an orthodox solution to this is not available, is it by any chance possible to turn off the use of these dependencies for an individual feature (i.e. test), perhaps with some annotation?
Edit
This is an MCVE:
Specs: Java 11.0.5, OS Linux Mint 18.3.
build.gradle:
plugins {
    id 'groovy'
    id 'java'
    id 'application'
    id 'org.openjfx.javafxplugin' version '0.0.8'
}
repositories { mavenCentral() }
javafx {
    version = "11.0.2"
    modules = [ 'javafx.controls', 'javafx.fxml' ]
}
dependencies {
    implementation 'org.codehaus.groovy:groovy:3.+'
    testImplementation 'junit:junit:4.12'
    testImplementation 'org.spockframework:spock-core:2.0-M2-groovy-3.0'
    testImplementation 'net.bytebuddy:byte-buddy:1.10.8'
    testImplementation 'org.objenesis:objenesis:3.1'
    // in light of kriegaex's comments:
    implementation group: 'cglib', name: 'cglib', version: '3.3.0'
}
test { useJUnitPlatform() }
application {
    mainClassName = 'core.Launcher'
}
installDist {}
main.groovy:
class Launcher {
    static void main(String[] args) {
        Application.launch(App, null)
    }
}

class App extends Application {
    void start(Stage primaryStage) {
    }
}
first_tests.groovy:
class AppSpec extends Specification {
    def 'Launcher.main should call App.launch'() {
        given:
        GroovyMock(Application, global: true)

        when:
        Launcher.main()

        then:
        1 * Application.launch(App, null) >> null
    }
}
The reason why this project needs something to call the Application subclass is explained here: it's so that it is possible to do an installDist which bundles in JavaFX.
Don't we have to use a global GroovyMock?
If you want to check the interaction, yes. But actually you are testing the JavaFX launcher rather than your application. So I doubt that there is any benefit. I would focus on testing the App class instead. Also imagine for a moment that you would write the classes with main methods in Java instead of Groovy. Groovy mocks would not work when called from Java code, especially not global ones. Then you would end up testing via Powermockito from Spock, which would also work but still you would test the JavaFX launcher rather than your application.
Also isn't it slightly extreme to say any use of Groovy mocks is wrong?
I did not say that. I said: "probably something is wrong with your application design". The reason I said that is because the use of Groovy mocks and things like mocking static methods are test code smells. You can check the smell and then decide it is okay, which IMO in most cases it is not. Besides, instead of application design the problem can also be in the test itself, which in this case I would say it is. But that is arguable, so I am going to present a solution to you further below.
In this case the global Application mock is technically your only way if you insist on testing the JavaFX launcher, because even a global mock on App would not work: the launcher uses reflection to call the App constructor, and that is not intercepted by the mock framework.
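For illustration, the launcher's instantiation step is roughly equivalent to this sketch (simplified, not the actual OpenJFX source; the core.App class name matches the question's MCVE):

import javafx.application.Application;

public class LauncherReflectionSketch {
    public static void main(String[] args) throws Exception {
        // The JavaFX launcher loads and instantiates the Application
        // subclass reflectively, so a mock on App's constructor is
        // never consulted.
        Class<? extends Application> appClass =
                Class.forName("core.App").asSubclass(Application.class);
        Application app = appClass.getDeclaredConstructor().newInstance();
        app.init(); // start(primaryStage) then runs on the FX application thread
    }
}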
you say that Spock spock-core:2.0-M2-groovy-3.0 is a "pre-release". I can't see anything on this page (...) which says that. How do you know?
You found out already by checking out the GitHub repository, but I was just seeing it in the unusual version number containing "M2" like "milestone 2" which is similar to "RC" (or "CR") for release candidates (or candidate releases).
As for the technical problem: you can simply not declare Objenesis in your Gradle script, because it is an optional dependency; then the test compiles and runs fine, as you already noticed yourself. But assuming you need optional dependencies like Objenesis, CGLIB (actually cglib-nodep), ByteBuddy and ASM for other tests in your suite, you can just tell Spock not to use Objenesis in this case. So assuming you have a Gradle build file like this:
plugins {
    id 'groovy'
    id 'java'
    id 'application'
    id 'org.openjfx.javafxplugin' version '0.0.8'
}
repositories { mavenCentral() }
javafx {
    version = "11.0.2"
    modules = ['javafx.controls', 'javafx.fxml']
}
dependencies {
    implementation 'org.codehaus.groovy:groovy:3.+'
    testImplementation 'org.spockframework:spock-core:2.0-M2-groovy-3.0'
    // Optional Spock dependencies, versions matching the ones listed at
    // https://mvnrepository.com/artifact/org.spockframework/spock-core/2.0-M2-groovy-3.0
    testImplementation 'net.bytebuddy:byte-buddy:1.9.11'
    testImplementation 'org.objenesis:objenesis:3.0.1'
    testImplementation 'cglib:cglib-nodep:3.2.10'
    testImplementation 'org.ow2.asm:asm:7.1'
}
test { useJUnitPlatform() }
application {
    mainClassName = 'de.scrum_master.app.Launcher'
}
installDist {}
My version of your MCVE would look like this (sorry, I added my own package names and also imports, because otherwise it is not really an MCVE):
package de.scrum_master.app

import javafx.application.Application
import javafx.scene.Scene
import javafx.scene.control.Label
import javafx.scene.layout.StackPane
import javafx.stage.Stage

class App extends Application {
    @Override
    void start(Stage stage) {
        def javaVersion = System.getProperty("java.version")
        def javafxVersion = System.getProperty("javafx.version")
        Label l = new Label("Hello, JavaFX $javafxVersion, running on Java $javaVersion.")
        Scene scene = new Scene(new StackPane(l), 640, 480)
        stage.setScene(scene)
        stage.show()
    }
}
package de.scrum_master.app

import javafx.application.Application

class Launcher {
    static void main(String[] args) {
        Application.launch(App, null)
    }
}
package de.scrum_master.app

import javafx.application.Application
import spock.lang.Specification

class AppSpec extends Specification {
    def 'Launcher.main should call App.launch'() {
        given:
        GroovyMock(Application, global: true, useObjenesis: false)

        when:
        Launcher.main()

        then:
        1 * Application.launch(App, null)
    }
}
The decisive detail here is the useObjenesis: false parameter.
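To see why that flag matters, here is a minimal sketch of what Objenesis does (the Widget class is made up for illustration):

import org.objenesis.ObjenesisStd;

public class ObjenesisSketch {
    // A class with no default constructor; Objenesis can still create it
    static class Widget {
        Widget(String required) { /* never called by Objenesis */ }
    }

    public static void main(String[] args) {
        // Objenesis instantiates objects without invoking any constructor
        Widget w = new ObjenesisStd().newInstance(Widget.class);
        System.out.println(w != null); // true
    }
}

For an abstract class like javafx.application.Application, this constructor-bypassing mechanism is what fails with the InstantiationError from the question's stack trace; with useObjenesis: false, Spock instantiates via the normal constructor instead, which Application (having a public no-arg constructor) supports.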
Update: Just for reference, this is how you would do it with a launcher class implemented in Java using PowerMockito.
Attention, this solution needs the Sputnik runner from Spock 1.x, which was removed in 2.x. So in Spock 2 this currently does not work, because Spock 2 is based on JUnit 5 and can no longer use @RunWith(PowerMockRunner) and @PowerMockRunnerDelegate(Sputnik), and PowerMock currently does not support JUnit 5. But I tested it with Spock 1.3-groovy-2.5 and Groovy 2.5.8.
package de.scrum_master.app

import javafx.application.Application
import org.junit.runner.RunWith
import org.powermock.core.classloader.annotations.PrepareForTest
import org.powermock.modules.junit4.PowerMockRunner
import org.powermock.modules.junit4.PowerMockRunnerDelegate
import org.spockframework.runtime.Sputnik
import spock.lang.Specification

import static org.mockito.Mockito.*
import static org.powermock.api.mockito.PowerMockito.*

@RunWith(PowerMockRunner)
@PowerMockRunnerDelegate(Sputnik)
@PrepareForTest(Application)
class JavaAppSpec extends Specification {
    def 'JavaLauncher.main should launch JavaApp'() {
        given:
        mockStatic(Application)

        when:
        JavaLauncher.main()

        then:
        verifyStatic(Application, times(1))
        Application.launch(JavaApp)
    }
}
In Spring it's possible to define bean dependencies in separate modules, which are then resolved via the classpath at runtime. Is it possible to do something similar in Quarkus?
For example, a multi-module setup that looks like this:
- service
- service-test
- service-artifact
In Spring it's possible to define a @Configuration in the service module that resolves concrete dependencies at runtime via the classpath of its current context, either service-test or service-artifact, allowing injection of dummy or test dependencies when under test, and real ones in the production artifact.
For example, a class in service requires an instance of SomeInterface. The implementation of SomeInterface is defined in either the -test or -artifact module. The service module has no direct dependency on either the -test or -artifact modules.
Some code:
In the service module:
@ApplicationScoped
class OrderService(private val repository: OrderRepository) {
    fun process(order: Order) {
        repository.save(order)
    }
}

interface OrderRepository {
    fun save(order: Order)
}
In the service-test module:
class InMemoryOrderRepository : OrderRepository {
    val orders = mutableListOf<Order>()

    override fun save(order: Order) {
        orders.add(order)
    }
}

class OrderServiceTestConfig {
    @ApplicationScoped
    fun orderRepository(): OrderRepository {
        return InMemoryOrderRepository()
    }
}
@QuarkusTest
class OrderServiceTest {
    @Inject
    private lateinit var service: OrderService

    @Test
    fun `injected order service with resolved repository dependency`() {
        // This builds and runs OK
        service.process(Order("some_test_order"))
    }
}
When I try to replicate a Spring-style setup like the above in Quarkus, ArC validation fails with an UnsatisfiedResolutionException on the build of the service module, even though every place the service is actually consumed provides the correct dependencies; a test successfully resolves the dependency and passes.
How do I separate the dependency interface from its implementation in Quarkus while keeping ArC validation happy?
(Note: this behaviour occurs with Java and Maven also.)
I have included a maven example here. Note that ./mvnw install fails with the UnsatisfiedResolutionException but that it's possible to build and run the test successfully using ./mvnw test.
Build files:
root project build.gradle.kts:
import org.jetbrains.kotlin.gradle.tasks.KotlinCompile

plugins {
    kotlin("jvm") version "1.3.72"
    kotlin("plugin.allopen") version "1.3.72"
}

allprojects {
    group = "my-group"
    version = "1.0.0-SNAPSHOT"

    repositories {
        mavenLocal()
        mavenCentral()
    }
}

subprojects {
    apply {
        plugin("kotlin")
        plugin("kotlin-allopen")
    }

    java {
        sourceCompatibility = JavaVersion.VERSION_11
        targetCompatibility = JavaVersion.VERSION_11
    }

    allOpen {
        annotation("javax.ws.rs.Path")
        annotation("javax.enterprise.context.ApplicationScoped")
        annotation("io.quarkus.test.junit.QuarkusTest")
    }

    apply {
        plugin("kotlin")
    }

    dependencies {
        implementation("org.jetbrains.kotlin:kotlin-reflect")
        implementation("org.jetbrains.kotlin:kotlin-stdlib-jdk8")
    }

    tasks.withType<KotlinCompile> {
        kotlinOptions.jvmTarget = JavaVersion.VERSION_11.toString()
        kotlinOptions.javaParameters = true
    }
}
build.gradle.kts for service:
import io.quarkus.gradle.tasks.QuarkusDev

plugins {
    id("io.quarkus") version "1.9.1.Final"
}

apply {
    plugin("io.quarkus")
}

dependencies {
    implementation(project(":common:model"))
    implementation(enforcedPlatform("io.quarkus:quarkus-universe-bom:1.9.1.Final"))
    implementation("io.quarkus:quarkus-kotlin")
}
build.gradle.kts for service-test:
import io.quarkus.gradle.tasks.QuarkusDev

plugins {
    id("io.quarkus") version "1.9.1.Final"
}

apply {
    plugin("io.quarkus")
}

dependencies {
    implementation(project(":service"))
    implementation(enforcedPlatform("io.quarkus:quarkus-universe-bom:1.9.1.Final"))
    implementation("io.quarkus:quarkus-kotlin")
    testImplementation("io.quarkus:quarkus-junit5")
}
Try to use instance injection (Java example):
import javax.enterprise.inject.Instance;
...
@Inject
Instance<MyBeanClass> bean;
...
bean.get();    // for a single bean
bean.stream(); // for a collection
Unfortunately, Quarkus creates and injects beans in a somewhat different way than Spring.
It uses "simplified bean discovery", which means that beans are scanned on the classpath at build time, but only those carrying an annotation considered a "discovery mode" are taken into account.
Those are: @ApplicationScoped, @SessionScoped, @ConversationScoped, @RequestScoped, @Interceptor and @Decorator, described in more detail here.
In addition to that, beans must not have visibility boundaries.
If you'd like to use beans from other modules, create a configuration class within that module annotated with @Dependent and create beans with producer methods (@Produces) carrying one of the above annotations; see the sketch below.
But notice that, despite this, some beans are treated by Quarkus in a special way (e.g. all beans that have the @Path annotation); those should preferably be annotated with @ApplicationScoped, using either constructor or field injection. Beans created by @Produces methods won't allow for all the magic that Quarkus is doing.
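For the regular case, a sketch of such a producer configuration for the question's OrderRepository might look like this (a Java equivalent of the Kotlin config above; the class name is illustrative):

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.context.Dependent;
import javax.enterprise.inject.Produces;

@Dependent
public class OrderServiceTestProducers {

    // Producer method: makes an OrderRepository bean available to ArC
    // when this module is on the application's classpath
    @Produces
    @ApplicationScoped
    OrderRepository orderRepository() {
        return new InMemoryOrderRepository();
    }
}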
If you'd like some more Quarkus-dependent beans, e.g. a bean that binds a configuration (using @ConfigMapping annotated beans), you additionally need either a beans.xml in your META-INF directory or, what seems easier, a Jandex index added by your build system:
plugins {
    id("io.quarkus") version "2.14.1.Final"
    id("org.kordamp.gradle.jandex") version "1.0.0"
}
Summary: don't use configuration beans as in Spring, only constructor/field injection, and to have beans discovered from different modules, add the Jandex index file using the plugin.
I am wondering how you would use TypeScript IoC, specifically in a Node app.
With an external module-based architecture there are hardly any classes in the app, just pure modules, because my app heavily depends on node_modules.
How would I integrate an IoC solution in such a case? Any thoughts?
Here is the specific case I want to use IoC for:
I have a mongoose model:
interface IStuffModel extends IStuff, mongoose.Document { }
var Stuff= mongoose.model<IStuffModel>('Stuff', Schemas.stuffSchema);
export = Stuff;
And a related fake class:
export class Stuff implements IStuff {
    // do stuff
}
How would I integrate an IoC solution in such a case?
Here is a very popular library that I recommend: https://github.com/inversify/InversifyJS
External modules
Using external modules doesn't change the code at all. Instead of
kernel.bind(new TypeBinding<FooBarInterface>("FooBarInterface", FooBar));
you bind the concrete implementation for each environment.
Production
import {ProdFooBar} from "./prodFooBar";
kernel.bind(new TypeBinding<FooBarInterface>("FooBarInterface", ProdFooBar));
Test
import {MockFooBar} from "./mockFooBar";
kernel.bind(new TypeBinding<FooBarInterface>("FooBarInterface", MockFooBar));
As Basarat indicated in his answer, I have developed an IoC container called InversifyJS with advanced dependency injection features like contextual bindings.
You need to follow 3 basic steps to use it:
1. Add annotations
The annotation API is based on Angular 2.0:
import { injectable, inject } from "inversify";

@injectable()
class Katana implements IKatana {
    public hit() {
        return "cut!";
    }
}

@injectable()
class Shuriken implements IShuriken {
    public throw() {
        return "hit!";
    }
}

@injectable()
class Ninja implements INinja {
    private _katana: IKatana;
    private _shuriken: IShuriken;

    public constructor(
        @inject("IKatana") katana: IKatana,
        @inject("IShuriken") shuriken: IShuriken
    ) {
        this._katana = katana;
        this._shuriken = shuriken;
    }

    public fight() { return this._katana.hit(); };
    public sneak() { return this._shuriken.throw(); };
}
2. Declare bindings
The binding API is based on Ninject:
import { Kernel } from "inversify";
import { Ninja } from "./entities/ninja";
import { Katana } from "./entities/katana";
import { Shuriken} from "./entities/shuriken";
var kernel = new Kernel();
kernel.bind<INinja>("INinja").to(Ninja);
kernel.bind<IKatana>("IKatana").to(Katana);
kernel.bind<IShuriken>("IShuriken").to(Shuriken);
export default kernel;
3. Resolve dependencies
The resolution API is based on Ninject:
import kernel from "./inversify.config";
var ninja = kernel.get<INinja>("INinja");
expect(ninja.fight()).eql("cut!"); // true
expect(ninja.sneak()).eql("hit!"); // true
The latest release (2.0.0) supports many use cases:
Kernel modules
Kernel middleware
Use classes, string literals or Symbols as dependency identifiers
Injection of constant values
Injection of class constructors
Injection of factories
Auto factory
Injection of providers (async factory)
Activation handlers (used to inject proxies)
Multi injections
Tagged bindings
Custom tag decorators
Named bindings
Contextual bindings
Friendly exceptions (e.g. Circular dependencies)
You can learn more about it at https://github.com/inversify/InversifyJS
In the particular context of Node.js there is a hapi.js example that uses InversifyJS.
I would like feedback on the best practices for defining plugin tasks that depend on external state (i.e. defined in the build.gradle that referenced the plugin). I'm using extension objects and closures to defer accessing those settings until they're needed and available. I'm also interested in sharing state between tasks, e.g. configuring the outputs of one task to be the inputs of another.
The code uses project.afterEvaluate to define the tasks once the required settings have been configured through the extension object. This seems more complex than it should be. If I move the code out of afterEvaluate, compileFlag ends up null rather than the external setting. If the code is changed again to use the << or doLast syntax, then it does get the external flag... but then it fails to work with type: Exec and other similarly helpful types.
I feel that I'm fighting Gradle in some ways, which means I don't yet understand how to work well with it. The following is simplified pseudo-code of what I'm using. This works, but I'm looking to see if it can be simplified, or indeed what the best practices are. Also, the exception shouldn't be thrown unless the tasks are actually being executed.
apply plugin: MyPlugin

class MyPluginExtension {
    String compileFlag = null
}

class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.extensions.create("myPluginConfig", MyPluginExtension)
        project.afterEvaluate {
            // Closure delays getting and checking flag until strictly needed
            def compileFlag = {
                if (project.myPluginConfig.compileFlag == null) {
                    throw new InvalidUserDataException(
                        "Must set compileFlag: myPluginConfig { compileFlag = '-flag' }")
                }
                return project.myPluginConfig.compileFlag
            }
            // Inputs for translateTask
            def javaInputs = {
                project.files(project.fileTree(
                    dir: project.projectDir, includes: ['**/*.java']))
            }
            // This is the output of the first task and input to the second
            def translatedOutputs = {
                project.files(javaInputs().collect { file ->
                    return file.path.replace('src/', 'build/dir/')
                })
            }
            // Translates all java files into 'translatedOutputs'
            project.tasks.create(name: 'translateTask', type: Exec) {
                inputs.files javaInputs()
                outputs.files translatedOutputs()
                executable '/bin/echo'
                inputs.files.each { file ->
                    args file.path
                }
            }
            // Compiles 'translatedOutputs' to binary
            project.tasks.create(name: 'compileTask', type: Exec, dependsOn: 'translateTask') {
                inputs.files translatedOutputs()
                outputs.file project.file(project.buildDir.path + '/compiledBinary')
                executable '/bin/echo'
                args compileFlag()
                translatedOutputs().each { file ->
                    args file.path
                }
            }
        }
    }
}
I'd look at this problem another way. It seems like what you want to put in your extension is really owned by each of your tasks. If you had something that was a "global" plugin configuration option, would it be treated as an input necessarily?
Another way of doing this would have been to use your own SourceSets and wire those into your custom tasks. That's not quite easy enough yet, IMO. We're still pulling together the JVM and native representations of sources.
I'd recommend extracting your Exec tasks as custom tasks with a @TaskAction that does the heavy lifting (even if it just calls project.exec {}). You can then annotate your inputs with @Input, @InputFiles, etc. and your outputs with @OutputFiles, @OutputDirectory, etc. Those annotations will help auto-wire your dependencies and inputs/outputs (I think that's where some of the fighting is coming from).
Another thing you're missing: if compileFlag affects the final output, you'd want to detect changes to it and force a rebuild (but not a re-translate).
I simplified the body of the plugin class by using the Groovy .with method.
I'm not completely happy with this (I think the translatedFiles could be done differently), but I hope it shows you some of the best practices. I made this a working example (as long as you have a src/something.java) by implementing the translate as a copy/rename and the compile as something that just creates an 'executable' file (its contents are just the list of the inputs). I've also left your extension class in place to demonstrate the "global" plug-in config. Also take a look at what happens when compileFlag is not set (I wish the error were a little better).
The translateTask isn't going to be incremental (although, I think you could probably figure out a way to do that). So you'd probably need to delete the output directory each time. I wouldn't mix other output into that directory if you want to keep that simple.
HTH
apply plugin: 'base'
apply plugin: MyPlugin

class MyTranslateTask extends DefaultTask {
    @InputFiles FileCollection srcFiles
    @OutputDirectory File translatedDir

    @TaskAction
    public void translate() {
        // println "toolhome is ${project.myPluginConfig.toolHome}"
        // translate java files by renaming them
        project.copy {
            includeEmptyDirs = false
            from(srcFiles)
            into(translatedDir)
            rename '(.+).java', '$1.m'
        }
    }
}

class MyCompileTask extends DefaultTask {
    @Input String compileFlag
    @InputFiles FileCollection translatedFiles
    @OutputDirectory File outputDir

    @TaskAction
    public void compile() {
        // write inputs to the executable file
        project.file("$outputDir/executable") << "${project.myPluginConfig.toolHome} $compileFlag ${translatedFiles.collect { it.path }}"
    }
}

class MyPluginExtension {
    File toolHome = new File("/some/sane/default")
}

class MyPlugin implements Plugin<Project> {
    void apply(Project project) {
        project.with {
            extensions.create("myPluginConfig", MyPluginExtension)
            tasks.create(name: 'translateTask', type: MyTranslateTask) {
                description = "Translates all java files into translatedDir"
                srcFiles = fileTree(dir: projectDir, includes: [ '**/*.java' ])
                translatedDir = file("${buildDir}/dir")
            }
            tasks.create(name: 'compileTask', type: MyCompileTask) {
                description = "Compiles translated files into outputDir"
                translatedFiles = fileTree(tasks.translateTask.outputs.files.singleFile) {
                    include '**/*.m'
                    builtBy tasks.translateTask
                }
                outputDir = file("${buildDir}/compiledBinary")
            }
        }
    }
}
myPluginConfig {
    toolHome = file("/some/custom/path")
}

compileTask {
    compileFlag = '-flag'
}
I want to use the Sauce Labs Java REST API to send Pass/Fail status back to the Sauce Labs dashboard. I am using Geb+Spock, and my Gradle build creates a test results directory where results are output in XML. My problem is that the results XML file doesn't seem to be generated until after the Spock specification's cleanupSpec() exits. This causes my code to report the results of the previous test run, rather than the current one. Clearly not what I want!
Is there some way to get to the results from within cleanupSpec() without relying on the XML? Or a way to get the results to file earlier? Or some alternative that will be much better than either of those?
Some code:
In build.gradle, I specify the testResultsDir. This is where the XML file is written after the Spock specifications exit:
drivers.each { driver ->
    task "${driver}Test"(type: Test) {
        cleanTest
        systemProperty "geb.env", driver
        testResultsDir = file("$buildDir/test-results/${driver}")
        systemProperty "proj.test.resultsDir", testResultsDir
    }
}
Here is the setupSpec() and cleanupSpec() in my LoginSpec class:
class LoginSpec extends GebSpec {
    @Shared def SauceREST client = new SauceREST("redactedName", "redactedKey")
    @Shared def sauceJobID
    @Shared def allSpecsPass = true

    def setupSpec() {
        sauceJobID = driver.getSessionId().toString()
    }

    def cleanupSpec() {
        def String specResultsDir = System.getProperty("proj.test.resultsDir") ?: "./build/test-results"
        def String specResultsFile = this.getClass().getName()
        def String specResultsXML = "${specResultsDir}/TEST-${specResultsFile}.xml"
        def testsuiteResults = new XmlSlurper().parse(new File(specResultsXML))

        // read error and failure counts from the XML
        def errors = testsuiteResults.@errors.text()?.toInteger()
        def failures = testsuiteResults.@failures.text()?.toInteger()
        if ((errors + failures) > 0) { allSpecsPass = false }

        if (allSpecsPass) {
            client.jobPassed(sauceJobID)
        } else {
            client.jobFailed(sauceJobID)
        }
    }
}
The rest of this class contains login specifications that do not interact with SauceLabs. When I read the XML, it turns out that it was written at the end of the previous LoginSpec run. I need a way to get to the values of the current run.
Thanks!
Test reports are generated after a Specification has finished execution, and the generation is performed by the build system, in your case Gradle. Spock has no knowledge of that, so you are unable to get that information from within the test.
You can, on the other hand, quite easily get that information from Gradle. The Test task has two methods that might be of interest to you here: addTestListener() and afterSuite(). The cleaner solution is to use the first method: implement a test listener and put your logic in the listener's afterSuite() (and not in the task configuration). You would probably need to put that listener implementation in buildSrc, as it looks like you have a dependency on SauceREST, and you would need to build and compile your listener class before being able to use it as an argument to addTestListener() in the build.gradle of your project.
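For illustration, such a listener might look roughly like this (a sketch, assuming it lives in buildSrc; the SauceREST calls mirror the ones in the question, and how the Sauce job id reaches the build JVM is deliberately left out):

import com.saucelabs.saucerest.SauceREST;
import org.gradle.api.tasks.testing.TestDescriptor;
import org.gradle.api.tasks.testing.TestListener;
import org.gradle.api.tasks.testing.TestResult;

public class SauceResultListener implements TestListener {
    private final SauceREST client = new SauceREST("redactedName", "redactedKey");
    private final String sauceJobId;

    public SauceResultListener(String sauceJobId) {
        this.sauceJobId = sauceJobId;
    }

    @Override public void beforeSuite(TestDescriptor suite) { }
    @Override public void beforeTest(TestDescriptor test) { }
    @Override public void afterTest(TestDescriptor test, TestResult result) { }

    @Override
    public void afterSuite(TestDescriptor suite, TestResult result) {
        // The root suite (no parent) aggregates the results of the whole run
        if (suite.getParent() == null) {
            if (result.getFailedTestCount() > 0) {
                client.jobFailed(sauceJobId);
            } else {
                client.jobPassed(sauceJobId);
            }
        }
    }
}

It would then be registered in build.gradle with something like test { addTestListener(new SauceResultListener(jobId)) }, however the job id is obtained.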
Following on from erdi's suggestion, I've created a Sauce Gradle helper library, which provides a Test Listener that parses the test XML output and invokes the Sauce REST API to set the pass/fail status.
The library can be included by adding the following to your build.gradle file:
import com.saucelabs.gradle.SauceListener

buildscript {
    repositories {
        mavenCentral()
        maven {
            url "https://repository-saucelabs.forge.cloudbees.com/release"
        }
    }
    dependencies {
        classpath group: 'com.saucelabs', name: 'saucerest', version: '1.0.2'
        classpath group: 'com.saucelabs', name: 'sauce_java_common', version: '1.0.14'
        classpath group: 'com.saucelabs.gradle', name: 'sauce-gradle-plugin', version: '0.0.1'
    }
}

gradle.addListener(new SauceListener("YOUR_SAUCE_USERNAME", "YOUR_SAUCE_ACCESS_KEY"))
You will also need to output the Selenium session id for each test, so that the SauceListener can associate the Sauce job with the pass/fail status. To do this, write the following line to stdout:
SauceOnDemandSessionID=SELENIUM_SESSION_ID
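For example, with the RemoteWebDriver from the Geb configuration this could be done once per run, e.g. from setupSpec() (a sketch; the helper class and method names are made up for illustration):

import org.openqa.selenium.remote.RemoteWebDriver;

public class SauceSessionReporter {
    // Prints the marker line the SauceListener scans stdout for
    public static void reportSession(RemoteWebDriver driver) {
        System.out.println("SauceOnDemandSessionID=" + driver.getSessionId());
    }
}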
Extended GroovyClassLoader and overridden loadClass method
If I make lookupScriptFiles true in the loadClass() method, the script runs and doesn't require an import statement referencing a Groovy class in a different package.
I have extended GroovyClassLoader and overridden the loadClass method; in loadClass I pass the argument lookupScriptFiles = true.
When this is true, it successfully compiles even though first.groovy doesn't have an import statement.
When lookupScriptFiles = false, it throws a compilation error, as expected.
My source code snippet:
C:\>cat first.groovy
def cos = new Second()
==============================================================
C:\>cat Second.groovy
package com.test

class Second {
    Second() {
        println "Anish"
    }
}
=========================================================
C:\bin>echo %CLASSPATH%
C:\zGroovy\bin;C:\vsexclude\opt\groovy-1.7.2\embeddable\groovy-all-1.7.2.jar
===============================================
C:\vsexclude\trees\bac-4.2\workspace\zGroovy\bin>java GCtest
path------>>C:\first.groovy
Anish
=================================
import groovy.lang.GroovyClassLoader;
import org.codehaus.groovy.control.CompilationFailedException;
import org.codehaus.groovy.control.CompilerConfiguration;

/**
 * @author Anish
 */
public class GCloader extends GroovyClassLoader {
    public GCloader(ClassLoader parent) {
        super(parent, new CompilerConfiguration());
    }

    @Override
    public Class<?> loadClass(final String name, boolean lookupScriptFiles,
            boolean preferClassOverScript, boolean resolve)
            throws ClassNotFoundException, CompilationFailedException {
        // return loadFiles(name, true, preferClassOverScript, resolve);
        return super.loadClass(name, true, preferClassOverScript, resolve);
    }
}
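For reference, a driver class matching the console output above might look like this (hypothetical; the actual GCtest is not shown in the question):

import groovy.lang.Script;
import java.io.File;

public class GCtest {
    public static void main(String[] args) throws Exception {
        GCloader loader = new GCloader(GCtest.class.getClassLoader());
        File scriptFile = new File("C:\\first.groovy");
        System.out.println("path------>>" + scriptFile.getPath());
        // Compile the script through the custom classloader and run it;
        // this prints "Anish" when 'new Second()' is evaluated
        Class<?> scriptClass = loader.parseClass(scriptFile);
        Script script = (Script) scriptClass.getDeclaredConstructor().newInstance();
        script.run();
    }
}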
Assuming your question is:
If I set lookupScriptFiles to true, can I remove the import statements from my groovy scripts?
Then the answer is no. The classloader will try to look up scripts it doesn't know about, but you will still need imports (here, import com.test.Second in first.groovy) to tell it in which package to look for each class.
Update
So, you have two groovy files in the same directory, one of which you have arbitrarily added a package statement to.
I assume you are loading the classes straight from scripts (yet another thing you don't say in your question).
If this is the case, then you will need to tell the classloader to look up the other scripts to compile them to classes.
If you don't, as you have seen, it will not work (imports or no imports).
However, putting two Groovy files in the same folder and adding a package line to just one of them is awful coding practice, and I'm surprised you got anything working at all.