Custom task build order in Gradle - jaxb

I have a multi-project Gradle build with a custom-defined xjc task to build the JAXB-generated objects, and I am having trouble getting the steps to build in the correct order.
I have 3 projects: common, ref and product. ref depends on common, and product depends on ref and common. The naming is important to my problem, as it seems Gradle does things in alphabetical order; I have stripped out some other dependencies as they do not affect the problem.
Within each project the order should be jaxb, java compile and then scala compile.
In the top level build.gradle I specify the jaxb task to be:
task jaxb() {
    description 'Converts xsds to classes'
    def jaxbTargetFile = file(generatedSources)
    def jaxbSourceFile = file(jaxbSourceDir)
    def jaxbEpisodesFile = file(jaxbEpisodeDir)
    def bindingRootDir = file(rootDir.getPath() + '/')
    inputs.dir jaxbSourceDir
    outputs.dir jaxbTargetFile
    doLast {
        ant.taskdef(name: 'xjc', classname: 'com.sun.tools.xjc.XJCTask', classpath: configurations.jaxb.asPath)
        jaxbTargetFile.mkdirs()
        jaxbEpisodesFile.mkdirs()
        for (xsd in bindingsMap) {
            if (!episodeMap.containsKey(xsd.key)) {
                ant.fail("Entry not found in the episodeMap for xsd $xsd.key")
            }
            def episodeFile = projectDir.getPath() + '/' + jaxbEpisodeDir + '/' + episodeMap.get(xsd.key)
            println("Processing xsd $xsd.key with binding $xsd.value producing $episodeFile")
            ant.xjc(destdir: "$jaxbTargetFile", extension: true, removeOldOutput: true) {
                schema(dir: "$jaxbSourceFile", includes: "$xsd.key")
                binding(dir: "$bindingRootDir", includes: "$xsd.value")
                arg(value: '-npa')
                arg(value: '-verbose')
                arg(value: '-episode')
                arg(value: episodeFile)
            }
        }
    }
}
In the individual build.gradle file for product I specify (with similar in ref):
dependencies {
    compile project(':common')
    compile project(':ref')
}
and in all three projects I specify:
compileJava.dependsOn(jaxb)
When I run publish (or jar) in the product project I can see the following output:
common:jaxb
common:compileJava
common:compileScala
common:jar
product:jaxb
ref:jaxb
ref:compileJava
ref:compileScala
ref:jar
product:compileJava
product:compileScala
This gives me an error, because the xsd in product refers to ref; since ref has not yet run jaxb, there are no episode binding files for ref, and product regenerates the imported classes with the wrong package name.
How can I ensure that ref's jaxb runs before product's jaxb?

If your product's jaxb task depends on the jaxb tasks of ref and common, you should declare this dependency explicitly:
(in product build.gradle)
task jaxb(dependsOn: [':common:jaxb', ':ref:jaxb']) {
    ...
}
Set the same kind of dependency in ref (on common).
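Since the jaxb task is already defined for every project in the top-level build.gradle, an alternative sketch (Task.dependsOn accepts task paths as strings) is to add the dependency to the existing task rather than redefining it:

// in ref/build.gradle
jaxb.dependsOn ':common:jaxb'

// in product/build.gradle
jaxb.dependsOn ':common:jaxb', ':ref:jaxb'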

Related

Jaxb generated sources with nullable fields

I have a Kotlin project where I'm using JAXB-generated source files from an xsd. The problem with these generated sources is that they have nullable fields, but IDEA does not know about it. This can lead to bugs in production. To fix it, we can add the @Nullable annotation to all getters in the generated sources.
How can we do it gracefully?
I made this solution; it works for me, but maybe somebody knows a better approach?
Gradle Kotlin DSL task:
tasks.register("nullableForXsdFields") {
    group = "code generation"
    description = "Add Nullable annotation to generated jaxb files"
    actions.add {
        val xjcFiles = fileTree("$buildDir/generated-sources/main/xjc")
        xjcFiles.forEach { xjcFile ->
            var content = xjcFile.readText()
            // prefix the return type of every public getter with @Nullable
            Regex("(public) (\\w+|<|>|\\*) (get)").findAll(content).distinct()
                .forEach { match ->
                    content = content.replace(
                        match.groups[0]!!.value,
                        match.groups[0]!!.value.replace("public ", "public @Nullable ")
                    )
                }
            // add the import next to one that every generated file already has
            content = content.replace(
                "import javax.xml.bind.annotation.XmlType;",
                "import javax.xml.bind.annotation.XmlType;\nimport org.jetbrains.annotations.Nullable;"
            )
            xjcFile.writeText(content)
        }
    }
}
tasks.getByName("xjcGeneration").finalizedBy("nullableForXsdFields")
tasks.getByName("compileKotlin").dependsOn("nullableForXsdFields")
tasks.getByName("compileJava").dependsOn("nullableForXsdFields")
xjcGeneration is my own plugin's task that generates the sources from the xsd.
I faced the same problem, so I've created an extension for the Maven JAXB plugin,
com.github.labai:labai-jsr305-jaxb-plugin.
This extension marks all generated packages as nullable by default, and then marks with @NotNull only those fields which are mandatory in the xsd schema.
You can find more details on how to use it with Maven on GitHub:
https://github.com/labai/labai-jsr305

Gradle extend task / Create task based on task

New to Groovy and Gradle. I want to extend the following task (part of the java plugin):
jar {
    from { configurations.runtime.collect { zipTree(it) } }
    manifest.attributes(
        'Implementation-Title': archivesBaseName,
        'Implementation-Version': version,
        'Main-Class': mainClassName,
        'Built-By': POM_DEVELOPER_NAME
    )
}
The extended task will have:
archiveName = 'PJ-latest.jar'
destinationDir = project.getRootDir()
In plain Java I would have called super, but I'm not sure how to do that in Gradle. My best attempt is to create a task of type Jar, give it the same parameters as jar, and also the dependencies:
task assembleCompiler(type: Jar) {
    dependsOn compileJava, processResources, classes
    archiveName = 'PJ-latest.jar'
    destinationDir = project.getRootDir()
    from { configurations.runtime.collect { zipTree(it) } }
    manifest.attributes(
        'Implementation-Title': archivesBaseName,
        'Implementation-Version': version,
        'Main-Class': mainClassName,
        'Built-By': POM_DEVELOPER_NAME
    )
}
You'll have to configure the second task separately, like you already did. If you want to share this code, turn it into a plugin.
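Short of a full plugin, one way to share the configuration is to extract it into a plain closure and apply it to both tasks. A minimal sketch, reusing the names from the question:

def commonJarConfig = {
    from { configurations.runtime.collect { zipTree(it) } }
    manifest.attributes(
        'Implementation-Title': archivesBaseName,
        'Implementation-Version': version,
        'Main-Class': mainClassName,
        'Built-By': POM_DEVELOPER_NAME
    )
}

task assembleCompiler(type: Jar) {
    dependsOn compileJava, processResources, classes
    archiveName = 'PJ-latest.jar'
    destinationDir = project.getRootDir()
}

// apply the shared settings to the built-in jar task and the new one
[jar, assembleCompiler].each { it.configure commonJarConfig }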

How to access a Gradle configurations object correctly

First off, this is my first foray into Gradle/Groovy (using Gradle 1.10). I'm setting up a multi-project environment where I'm creating a jar artifact in one project and then want to define an Exec task, in another project, which depends on the created jar. I'm setting it up something like this:
// This is from the jar building project
jar {
    ...
}
configurations {
    loaderJar
}
dependencies {
    loaderJar files(jar.archivePath)
    ...
}

// From the project which consumes the built jar
configurations {
    loaderJar
}
dependencies {
    loaderJar project(path: ":gfxd-demo-loader", configuration: "loaderJar")
}

// This is my test task
task foo << {
    configurations.loaderJar.each { println it }
    println configurations.loaderJar.collect { it }[0]
    // The following line breaks!!!
    println configurations.loaderJar[0]
}
When executing the foo task it fails with:
> Could not find method getAt() for arguments [0] on configuration ':loaderJar'.
In my foo task I'm just testing to see how to access the jar. So the question is: why does the very last println fail? If a Configuration object is a Collection/Iterable, then surely I should be able to index into it?
Configuration is-a java.util.Iterable, but not a java.util.Collection. As can be seen in the Groovy GDK docs, the getAt method (which corresponds to the [] operator) is defined on Collection, but not on Iterable. Hence, you can't index into a Configuration.
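If indexed access is needed, one workaround is to convert the configuration to a collection first. A minimal sketch:

task foo << {
    // Groovy's toList() is defined on Iterable, so this gives indexed access
    println configurations.loaderJar.toList()[0]
    // or, when the configuration is expected to resolve to exactly one file:
    println configurations.loaderJar.singleFile
}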

publish artifact overwrite other artifact in Gradle

I am experimenting with Gradle to build a few jars. Rather than maintain a list of the classes that hold EJBs so that I can deploy them separately, I thought it might be neat to scan the classes when making the jar.
Rather than load the classes and use reflection to get the annotations, I thought it may be simpler to scan the classes with ASM, hence the chunky ClassReader in one of the tasks.
I don't think this is the issue, so it can be ignored; basically I have 2 tasks that I use to define the contents of the jars. Both report that different content is going into them via the eachFile printout, yet when I look in the publish repository location, both files and the associated sha1s are identical.
Either Gradle is broken or, more likely, I've done something crazy but can't see what it is. Can anyone help?
By the way, if I disable the publish of either of the jar files, the one that does get created is correct, so I think it's something wrong with the publish rather than the jarring up, but I could be wrong.
// ASM is used to interpret the class files; this avoids having to load all
// classes in the VM and use reflection
import org.objectweb.asm.*

task ejbJar(type: Jar) {
    //outputs.upToDateWhen { false }
    from "${project.buildDir}/classes/main"
    eachFile { println "EJB server: ${name}" }
    include getEjbClassFiles(project.buildDir)
}

task clientEjbJar(type: Jar) {
    //outputs.upToDateWhen { false }
    from "${project.buildDir}/classes/main/com/company/core/versioner"
    eachFile { println "Client EJB ${name}" }
    include '**/*'
}

artifacts {
    archives clientEjbJar
    archives ejbJar
}

String[] getEjbClassFiles(base) {
    def includedFiles = []
    def baseDir = project.file("${base}/classes/main")
    def parentPath = baseDir.toPath()
    if (baseDir.isDirectory()) {
        baseDir.eachFileRecurse(groovy.io.FileType.FILES) { file ->
            if (file.name.endsWith('.class')) {
                // get hold of annotations in there --- org.objectweb.asm.Opcodes.ASM4
                new ClassReader(file.bytes).accept(
                    new ClassVisitor(Opcodes.ASM4) {
                        public AnnotationVisitor visitAnnotation(String desc, boolean visible) {
                            if (desc.equals("Ljavax/ejb/Stateless;") ||
                                desc.equals("Ljavax/ejb/Stateful;")) {
                                includedFiles += parentPath.relativize(file.toPath())
                            }
                            return null // no interest in actually visiting the annotation values
                        }
                    },
                    ClassReader.SKIP_DEBUG | ClassReader.EXPAND_FRAMES | ClassReader.SKIP_FRAMES | ClassReader.SKIP_CODE
                )
            }
        }
    }
    return includedFiles
}

publishing {
    publications {
        mypub(IvyPublication) {
            artifact(ejbJar) {
                name 'ejb'
            }
            artifact(clientEjbJar) {
                name 'client-ejb'
            }
        }
    }
    repositories {
        ivy {
            name 'personal'
            url "${ant['developer.repository']}/"
            layout 'pattern', {
                artifact "[organisation]/[module]/[artifact]/[revision]/[type]/[artifact]-[revision].[ext]"
                ivy "[organisation]/[module]/[type]/[revision]/[type]/[type]-[revision].[ext]"
            }
        }
    }
}
I did break the thing down into a simpler form as I thought it may be a Gradle bug.
The simplified form was:
apply plugin: 'java'
apply plugin: 'ivy-publish'

task bigJar(type: Jar) {
    from "${rootDir}/src/main/resources"
    include '**/*'
}

task smallJar(type: Jar) {
    from "${rootDir}/src/main/resources/A/B"
    include '**/*'
}

group 'ICantBeEmpty'

artifacts {
    archives bigJar
    archives smallJar
}

publishing {
    publications {
        mypub(IvyPublication) {
            artifact(bigJar) { name 'biggie' }
            artifact(smallJar) { name 'smallie' }
        }
    }
    repositories {
        ivy {
            name 'personal'
            url "c:/temp/gradletest"
            layout 'pattern', {
                artifact "[organisation]/[module]/[artifact]/[revision]/[type]/[artifact]-[revision].[ext]"
                ivy "[organisation]/[module]/[type]/[revision]/[type]/[type]-[revision].[ext]"
            }
        }
    }
}
This results in two files: c:/temp/gradletest/ICantBeEmpty/report-bug/biggie/unspecified/biggie-unspecified.jar and c:/temp/gradletest/ICantBeEmpty/report-bug/smallie/unspecified/smallie-unspecified.jar.
Both of these files are identical; however, I think I know why (see my later answer).
Whilst looking at some configurations I noticed some odd behaviour that led me to a resolution of this issue, and it is a Gradle bug.
In my build I had a scratch task doing:
configurations.archives.artifacts.each { println it }
This gave me 5 different lines of output; however, changing it to this:
configurations.archives.artifacts.each { println it.file }
produced the same filename 5 times.
It turns out this is related to my issue: although the artifacts are there as separate entities, the name used to uniquely identify them was the same, so the same file was always chosen during a publish. The name of an artifact defaults to ${baseName}-${appendix}-${version}-${classifier}.${extension} in the java plugin. This means that if neither appendix nor classifier is specified, two artifacts from the same project will have the same name.
I tested this using the above sample code by adding an appendix name:
task bigJar(type: Jar) {
    appendix = 'big'
    from "${rootDir}/src/main/resources"
    include '**/*'
}

task smallJar(type: Jar) {
    appendix = 'small'
    from "${rootDir}/src/main/resources/A/B"
    include '**/*'
}
Using this rather than the code from the question produces 2 different jars.
It's not a complete answer, but it is a good enough workaround: if I add a new publication definition, I can publish the artifacts that I want to the location that I want. The only downside is that it will create another Gradle task, which isn't ideal.
publications {
    mypub(IvyPublication) {
        artifact(ejbJar) {
            name 'ejb'
        }
    }
    newpub(IvyPublication) {
        artifact(clientEjbJar) {
            name 'client-ejb'
        }
    }
}
The above answer works in the short term; however, it does reveal yet another shortcoming in the Gradle world.
I'm not sure Gradle is all it could be at the moment, and so far no one has answered my questions, so maybe it's not that actively developed!
I'm no expert in this part of Gradle, but the functionality you are using is marked as "incubating"; you are using the new publishing feature, which might or might not be complete. Perhaps you should use the old way of doing things. You also seem to be mixing both ways by using the artifacts closure.
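For reference, a minimal (untested) sketch of the old-style publishing via the uploadArchives task, assuming the same ivy layout as in the simplified example above:

uploadArchives {
    repositories {
        ivy {
            url "c:/temp/gradletest"
            layout 'pattern', {
                artifact "[organisation]/[module]/[artifact]/[revision]/[type]/[artifact]-[revision].[ext]"
                ivy "[organisation]/[module]/[type]/[revision]/[type]/[type]-[revision].[ext]"
            }
        }
    }
}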

How to Report Results to Sauce Labs using Geb/Spock?

I want to use the Sauce Labs Java REST API to send Pass/Fail status back to the Sauce Labs dashboard. I am using Geb+Spock, and my Gradle build creates a test results directory where results are output in XML. My problem is that the results XML file doesn't seem to be generated until after the Spock specification's cleanupSpec() exits. This causes my code to report the results of the previous test run, rather than the current one. Clearly not what I want!
Is there some way to get to the results from within cleanupSpec() without relying on the XML? Or a way to get the results to file earlier? Or some alternative that will be much better than either of those?
Some code:
In build.gradle, I specify the testResultsDir. This is where the XML file is written after the Spock specifications exit:
drivers.each { driver ->
    task "${driver}Test"(type: Test) {
        cleanTest
        systemProperty "geb.env", driver
        testResultsDir = file("$buildDir/test-results/${driver}")
        systemProperty "proj.test.resultsDir", testResultsDir
    }
}
Here are the setupSpec() and cleanupSpec() methods in my LoginSpec class:
class LoginSpec extends GebSpec {
    @Shared SauceREST client = new SauceREST("redactedName", "redactedKey")
    @Shared def sauceJobID
    @Shared def allSpecsPass = true

    def setupSpec() {
        sauceJobID = driver.getSessionId().toString()
    }

    def cleanupSpec() {
        String specResultsDir = System.getProperty("proj.test.resultsDir") ?: "./build/test-results"
        String specResultsFile = this.getClass().getName()
        String specResultsXML = "${specResultsDir}/TEST-${specResultsFile}.xml"
        def testsuiteResults = new XmlSlurper().parse(new File(specResultsXML))
        // read error and failure counts from the XML
        def errors = testsuiteResults.@errors.text()?.toInteger()
        def failures = testsuiteResults.@failures.text()?.toInteger()
        if ((errors + failures) > 0) { allSpecsPass = false }
        if (allSpecsPass) {
            client.jobPassed(sauceJobID)
        } else {
            client.jobFailed(sauceJobID)
        }
    }
}
The rest of this class contains login specifications that do not interact with SauceLabs. When I read the XML, it turns out that it was written at the end of the previous LoginSpec run. I need a way to get to the values of the current run.
Thanks!
Test reports are generated after a Specification has finished execution and the generation is performed by the build system, so in your case by Gradle. Spock has no knowledge of that so you are unable to get that information from within the test.
You can, on the other hand, quite easily get that information from Gradle. The Test task has two methods that might be of interest to you here: addTestListener() and afterSuite(). It seems that the cleaner solution is to use the first method: implement a test listener and put your logic in the listener's afterSuite() (not in the task configuration). You would probably need to put that listener implementation in buildSrc, as you have a dependency on SauceREST and you would need to build and compile your listener class before being able to use it as an argument to addTestListener() in the build.gradle of your project.
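A minimal sketch of that idea, with the listener registered inline for brevity (the SauceREST calls are illustrative; as noted above, a real listener class would live in buildSrc):

tasks.withType(Test) {
    addTestListener(new TestListener() {
        void beforeSuite(TestDescriptor suite) {}
        void beforeTest(TestDescriptor test) {}
        void afterTest(TestDescriptor test, TestResult result) {}
        void afterSuite(TestDescriptor suite, TestResult result) {
            // the root suite has no parent and aggregates the whole run
            if (suite.parent == null) {
                if (result.resultType == TestResult.ResultType.SUCCESS) {
                    // client.jobPassed(sauceJobID)
                } else {
                    // client.jobFailed(sauceJobID)
                }
            }
        }
    })
}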
Following on from erdi's suggestion, I've created a Sauce Gradle helper library, which provides a Test Listener that parses the test XML output and invokes the Sauce REST API to set the pass/fail status.
The library can be included by adding the following to your build.gradle file:
import com.saucelabs.gradle.SauceListener
buildscript {
    repositories {
        mavenCentral()
        maven {
            url "https://repository-saucelabs.forge.cloudbees.com/release"
        }
    }
    dependencies {
        classpath group: 'com.saucelabs', name: 'saucerest', version: '1.0.2'
        classpath group: 'com.saucelabs', name: 'sauce_java_common', version: '1.0.14'
        classpath group: 'com.saucelabs.gradle', name: 'sauce-gradle-plugin', version: '0.0.1'
    }
}
gradle.addListener(new SauceListener("YOUR_SAUCE_USERNAME", "YOUR_SAUCE_ACCESS_KEY"))
You will also need to output the Selenium session id for each test, so that the SauceListener can associate the Sauce Job with the pass/fail status. To do this, include the following output in the stdout:
SauceOnDemandSessionID=SELENIUM_SESSION_ID
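For example, a minimal sketch of emitting that line from a Geb/Spock specification, reusing the session id lookup from the question's setupSpec() (this assumes the browser is backed by a RemoteWebDriver, as in the question):

class LoginSpec extends GebSpec {
    def setupSpec() {
        // print the session id so the SauceListener can match job to result
        def sessionId = driver.getSessionId().toString()
        println "SauceOnDemandSessionID=${sessionId}"
    }
}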
