Gradle Incremental Tasks: Adding already generated code to the classpath - groovy

I have created a custom Gradle task that generates some Java code. To optimize execution, this task uses the @InputDirectory and @OutputDirectory annotations, so that the code does not have to be generated on each build.
However, I do want this task to add the generated code to the classpath. I am currently doing this as follows:
class JaxbTask extends DefaultTask {

    @OutputDirectory
    File destdir = project.file("${project.buildDir}/generated-sources/mygen")

    @InputDirectory
    File schemaRoot = project.file("${project.projectDir}/src/main/resources/myschema/")

    @TaskAction
    def main() {
        // ...
        project.sourceSets.main.java.srcDirs += destdir
        // ...
    }
}
The problem is that the TaskAction is not executed, and the source directory is not added to the compile path, when the generated code is up to date. Is there any way to make sure that the modification of the source path is always performed?

A task should never try to configure the build model. Configuration is the responsibility of build scripts and plugins, and needs to happen in the configuration phase (before any task has run).
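A minimal sketch of what that configuration-phase wiring could look like in the build script, assuming the JaxbTask class from the question is available there (the task name is made up, and this is one possible approach, not the only one). The srcDirs line then disappears from the task action:

// build.gradle -- wire things up at configuration time, not inside the task action
def generatedDir = file("$buildDir/generated-sources/mygen")

task generateJaxb(type: JaxbTask)   // hypothetical name for the generator task

// the generated sources are part of the main source set from the start
sourceSets.main.java.srcDir generatedDir

// compilation always depends on the generator; its up-to-date checks still apply
compileJava.dependsOn generateJaxb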

Related

How to prevent rawproto file generation or delete them automatically?

The Android Gradle plugin generates tons of .rawproto files in the build/android-profile directory. What are they used for? Is there a way to disable this madness or automatically delete them?
I've been bugged by it for a long time, and now that I've noticed there are gigabytes of this hogging my smallish SSD, I've decided to figure out a way to disable it. Before it started occupying too much space, the most annoying thing for me was gradlew clean leaving a build folder behind.
Only tested with com.android.tools.build:gradle:3.0.1, so YMMV.
TL;DR
For global application, read the last section; for per-project use, put this in the root project's build.gradle:
com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
        new com.android.build.gradle.internal.profile.RecordingBuildListener(
                com.android.builder.profile.ProcessProfileWriter.get());
// and then `gradlew --stop && gradlew clean` to verify no build folder is left behind
Investigation
Thanks to https://stackoverflow.com/a/43910084/253468 linked by @JeffRichards mentioning ProcessProfileWriterFactory.java, I've put a breakpoint there and checked who's calling it by running gradlew -Dorg.gradle.debug=true --info (not to be confused with --debug) and attaching a remote debugger.
I followed the trail and found that ProcessProfileWriter.finishAndMaybeWrite creates the folder and writes. Backtracing on method calls I found that ProfilerInitializer.recordingBuildListener controls whether it's called ... and that is initialized directly by BasePlugin (apply plugin: 'com.android.*').
So in order to prevent anything from happening, I opted to disable the guard by pre-initializing that static field. Thankfully Groovy (and hence Gradle) doesn't give a * about JVM visibility modifiers, so without reflection here's the magic line:
com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
        new com.android.build.gradle.internal.profile.RecordingBuildListener(
                com.android.builder.profile.ProcessProfileWriter.get());
I know, it's a bit verbose, but it works, and if you import stuff it looks better:
ProfilerInitializer.recordingBuildListener = new RecordingBuildListener(ProcessProfileWriter.get());
Applying the magic
In a single-project build (one build.gradle) you must apply this before
apply plugin: 'com.android.application'
In multi-project builds (most template projects: an app folder, settings.gradle, and many build.gradles) I suggest you apply it right after the buildscript block:
buildscript {
    // ...
    dependencies {
        classpath 'com.android.tools.build:gradle:3.0.1'
    }
}
// magic line here
Make sure it's before any apply plugin:s, and not inside a buildscript block.
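Put together, the top of such a module-level build.gradle would look roughly like this (a sketch; the plugin id is just an example, any com.android.* plugin applies):

buildscript {
    // repositories and the com.android.tools.build:gradle classpath go here
}

// magic line here (the ProfilerInitializer assignment shown above)

apply plugin: 'com.android.application' // or 'com.android.library', etc.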
Applying the magic globally
Obviously if it bothers us in one project, it will in all of them, so following Gradle's manual, create a file at ~/.gradle/init.gradle or %USERPROFILE%\.gradle\init.gradle (mind you, this folder can be relocated with GRADLE_USER_HOME) with the following contents:
// NB: any changes to this script require a new daemon (`gradlew --stop` or `gradlew --no-daemon <tasks>`)
rootProject { rootProject -> // see https://stackoverflow.com/a/48087543/253468
    // listen for lifecycle events on the project's plugins
    rootProject.plugins.whenPluginAdded { plugin ->
        // check if any Android plugin is being applied (not necessarily just 'com.android.application')
        // this plugin is actually exactly for this purpose: to get notified
        if (plugin.class.name == 'com.android.build.gradle.api.AndroidBasePlugin') {
            logger.info 'Turning off `build/android-profile/profile-*.(rawproto|json)` generation.'
            // execute the hack in the context of the buildscript, not in this initscript
            new GroovyShell(plugin.class.classLoader).evaluate("""
                com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
                    new com.android.build.gradle.internal.profile.RecordingBuildListener(
                        com.android.builder.profile.ProcessProfileWriter.get());
            """)
        }
    }
}

Gradle ant build file not available during configuration phase

I can't figure out how to execute an Ant target when the Ant build.xml file is not available during the configuration phase, because it is a remote resource (Maven Remote Resources Plugin).
So basically first I need to get this remote resource like this:
configurations {
    remoteResources
}

dependencies.remoteResources 'group:my-remote-resource:version'

task getRemoteResources(type: Copy) {
    from(zipTree(configurations.remoteResources.first()))
    into("$buildDir/remote-resources")
    // replace placeholders
    filter(org.apache.tools.ant.filters.ReplaceTokens, tokens: remotePlaceholders)
}
Only then do I have a build.xml in "$buildDir/remote-resources".
But I can't use ant.importBuild as that expects build.xml to be available during the configuration, which is not my case.
I was thinking of moving the remote resource "download" into the initialization phase, but I have a multi-module project, and although only some sub-projects use this remote resource, they each have their own placeholders to replace.
Is there any way to execute Ant targets in this special case?
EDIT: I found a pretty nice solution utilising Ant's ProjectHelper and Project classes, so I guess that is my answer.
So here is the final solution (as already mentioned, credits go to groovy-almanac):
import org.apache.tools.ant.Project
import org.apache.tools.ant.ProjectHelper

task executeAntTarget(dependsOn: getRemoteResources) << {
    def antFile = new File("$buildDir/remote-resources/build.xml")

    def antProject = new Project()
    antProject.init()
    ProjectHelper.projectHelper.parse(antProject, antFile)

    antProject.executeTarget('<target-to-execute>')
}

Set the project properties in subclassed gradle task

I am defining a Gradle task "launchIPad2Simulator" that subclasses an already defined task "launchIPadSimulator" from the robovm Gradle plugin. The goal is to set the project properties that define which simulator will run.
// Run the iPad 2 simulator
task launchIPad2Simulator2(type: org.robovm.gradle.tasks.IPadSimulatorTask) {
    project.setProperty("robovm.device.name", "iPad-2")
    project.setProperty("robovm.arch", "x86")
}
But the problem is that I must first define the properties in the gradle.properties file. They don't even need to have any value assigned. The whole content of the gradle.properties file is:
robovm.device.name
robovm.arch
I would rather have the gradle.properties file empty, but if the above task is then run, this error is shown: Error:(112, 0) No such property: robovm.device.name for class: org.gradle.api.internal.project.DefaultProject_Decorated.
Also, if the properties are only defined in the task as follows (with gradle.properties empty), they are ignored.
// Run the iPad 2 simulator
task launchIPad2Simulator2(type: org.robovm.gradle.tasks.IPadSimulatorTask) {
    project.properties.put("robovm.device.name", "iPad-2")
    project.properties.put("robovm.arch", "x86")
}
So what is the correct way to dynamically set the project properties in subclassed task?
=== Edit ===
OK, now I see that setting the project properties is also not good, because with multiple tasks they get overwritten. So maybe these shouldn't be project properties in the first place.
The temporary solution for now is command-line invocation of tasks:
// simulator with properties launched from command line
task launchIPad2Simulator1(type: Exec) {
    commandLine 'gradle', '-Probovm.device.name=iPad-2', '-Probovm.arch=x86', 'launchIPadSimulator'
}
Try
task launchIPad2Simulator2(type: org.robovm.gradle.tasks.IPadSimulatorTask) {
    project.ext."robovm.device.name" = "iPad-2"
    project.ext."robovm.arch" = "x86"
}
This is the Gradle syntax for adding dynamic properties to the project object.
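For completeness, a rough sketch of how such a value can be read back, e.g. from an ad-hoc task (the property names come from the question; whether the RoboVM plugin picks them up this way is an assumption):

task printSimulatorConfig {
    doLast {
        // ext properties are visible through the normal project property lookup
        println project.hasProperty('robovm.device.name')   // true
        println project.property('robovm.device.name')      // iPad-2
    }
}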

Jenkins Groovy Postbuild use static file instead of script

Is it possible to load an external groovy script into the Groovy Postbuild plugin instead of pasting the script content into each job? We have approximately 200 jobs, so updating them all is rather time-consuming. I know that I could write a script to update the config files directly (as in this post: Add Jenkins Groovy Postbuild step to all jobs), but these jobs run 24x7, so finding a window when I can restart Jenkins or reload the config is problematic.
Thanks!
Just put the following in the "Groovy script:" field:
evaluate(new File("... groovy script file name ..."));
Also, you might want to go even further.
What if the script name or path changes?
Using the Template plugin you can create a single "template" job, define the call to the groovy script (the line above) in there, and in all jobs that need it add the post-build action "Use publishers from another project" referencing this template project.
Update: This is what really solved it for me: https://issues.jenkins-ci.org/browse/JENKINS-21480
"I am able to do just that by doing the following. Enter these lines in lieu of the script in the "Groovy script" box:"
// Delegate to an external script
// The filename must match the class name
import JenkinsPostBuild
def postBuild = new JenkinsPostBuild(manager)
postBuild.run()
"Then in the "Additional groovy classpath" box enter the path to that file."
We do it in the following fashion.
We have a file c:\somepath\MyScriptLibClass.groovy (accessible to Jenkins) which contains the code of a groovy class MyScriptLibClass. The class contains a number of functions designed to act like static methods (to be mixed in later).
We include these functions by writing the following statement at the beginning of system groovy and postbuild groovy steps:
[ // include lib scripts
    'MyScriptLibClass'
].each{ this.metaClass.mixin(new GroovyScriptEngine('c:\\somepath').loadScriptByName(it + '.groovy')) }
This could look a bit ugly, but you need to write it only once per script. You can include more than one script and also use inheritance between library classes.
Here you see that all methods from the library class are mixed into the current script. So if your class looks like:
class MyScriptLibClass {
    def setBuildName( String str ){
        binding?.variables['manager'].build.displayName = str
    }
}
in Groovy Postbuild you could write just:
[ // include lib scripts
    'MyScriptLibClass'
].each{ this.metaClass.mixin(new GroovyScriptEngine('c:\\somepath').loadScriptByName(it + '.groovy')) }
setBuildName( 'My Greatest Build' )
and it will change your current build's name.
There are also other ways to load external groovy classes, and it is not necessary to use mixing in. For instance, you can take a look at Compiling and using Groovy classes from Java at runtime?
How did I solve this:
Create file $JENKINS_HOME/scripts/PostbuildActions.groovy with following content:
public class PostbuildActions {
    void setBuildName(Object manager, String str) {
        // use the manager object passed in from the Groovy Postbuild step
        manager.build.displayName = str
    }
}
In this case in Groovy Postbuild you could write:
File script = new File("${manager.envVars['JENKINS_HOME']}/scripts/PostbuildActions.groovy")
Object actions = new GroovyClassLoader(getClass().getClassLoader()).parseClass(script).newInstance();
actions.setBuildName(manager, 'My Greatest Build');
If you wish to have the Groovy script in your Code Repository, and loaded onto the Build / Test Slave in the workspace, then you need to be aware that Groovy Postbuild runs on the Master.
For us, the master is a Unix Server, while the Build/Test Slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the Slave, and use a FilePath to the file.
The following worked for us:
// Get an Instance of the Build object, and from there
// the channel from the Master to the Workspace
build = Thread.currentThread().executable
channel = build.workspace.channel;

// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")

// Some have suggested that the "Not NULL" check is redundant
// I've kept it for completeness
if (fp != null) {
    // 'Evaluate' requires a string, so read the file contents to a String
    script = fp.readToString();
    // Execute the script
    evaluate(script);
}
I've just faced the same task and tried to use @Blaskovicz's approach.
Unfortunately it did not work for me, but I found upgraded code here (Zach Auclair).
Published here with minor changes:
postbuild task
//imports
import hudson.model.*
import groovy.lang.GroovyClassLoader;
import groovy.lang.GroovyObject;
import java.io.File;
// define git file
def postBuildFile = manager.build.getEnvVars()["WORKSPACE"] + "/Jenkins/SimpleTaskPostBuildReporter.GROOVY"
def file = new File(postBuildFile)
// load custom class from file
Class groovy = this.class.classLoader.parseClass(file);
// create custom object
GroovyObject groovyObj = (GroovyObject) groovy.newInstance(manager);
// do report
groovyObj.report();
postbuild class file in git repo (SimpleTaskPostBuildReporter.GROOVY)
class SimpleTaskPostBuildReporter {
    def manager

    public SimpleTaskPostBuildReporter(Object manager) {
        if (manager == null) {
            throw new RuntimeException("Manager object can't be null")
        }
        this.manager = manager
    }

    public def report() {
        // do work with manager object
    }
}
I haven't tried this exactly.
You could try the Jenkins Job DSL plugin, which allows you to rebuild jobs from within Jenkins using a Groovy DSL and supports post-build groovy steps directly. From the wiki:
Groovy Postbuild
Executes Groovy scripts after a build.
groovyPostBuild(String script, Behavior behavior = Behavior.DoNothing)
Arguments:
script - The Groovy script to execute after the build. See the plugin's page for details on what can be done.
behavior - optional. If the script fails, allows you to mark the build as failed, unstable, or do nothing. The behavior argument uses an enum, which currently has three values: DoNothing, MarkUnstable, and MarkFailed.
Examples:
This example will run a groovy script that prints hello, world and if that fails, it won't affect the build's status:
groovyPostBuild('println "hello, world"')
This example will run a groovy script, and if that fails will mark the build as failed:
groovyPostBuild('// some groovy script', Behavior.MarkFailed)
This example will run a groovy script, and if that fails will mark the build as unstable:
groovyPostBuild('// some groovy script', Behavior.MarkUnstable)
(Since 1.19)
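For illustration, a rough sketch of how that could sit inside a Job DSL seed script (the job name, build step and script body are made-up examples):

// seed script for the Job DSL plugin -- a sketch, not a drop-in solution
job('example-job') {
    steps {
        shell('make test') // hypothetical build step
    }
    publishers {
        // one shared external script instead of pasting it into 200 jobs
        groovyPostBuild('evaluate(new File("/path/to/shared/postbuild.groovy"))', Behavior.MarkUnstable)
    }
}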
There is a facility to use a template job (this is the bit I haven't tried), which could be the job itself, so you only need to add the post-build step. If you don't use a template you need to recode the whole project.
My approach is to have a script that regenerates or creates all jobs from scratch, just so I don't have to apply the same upgrade multiple times. Regenerated jobs keep their build history.
I was able to get the following to work (I also posted on this jira issue).
in my postbuild task
this.class.classLoader.parseClass("/home/jenkins/GitlabPostbuildReporter.groovy")
GitlabPostbuildReporter.newInstance(manager).report()
in my file on disk at /home/jenkins/GitlabPostbuildReporter.groovy
class GitlabPostbuildReporter {
    def manager

    public GitlabPostbuildReporter(manager) {
        if (manager == null) {
            throw new RuntimeException("Manager object mustn't be null")
        }
        this.manager = manager
    }

    public def report() {
        // do work with manager object
    }
}

create root level gradle task to be used by several subprojects?

I want to create a task in my root project that will create a custom TAR for some subprojects.
task archive(type: Tar) {
    project.logger.info("Tar bundle :: ")
    baseName = project.baseName
    destinationDir = new File(project.buildDir.path)
    archiveDir = new File(project.buildDir.path)
    compression = Compression.GZIP
    from (archiveDir)
}
I have found this task, but what I need is for a subproject to send in the subproject.name and have the task use the buildDir and the source dir from the subproject calling the task, then save the archive to the /target directory of the subproject as well.
Is this possible without creating a new task for each subproject?
If every subproject is supposed to create its own Tar, then you need to create one task per subproject. This is as easy as wrapping the task declaration in subprojects { ... }. You'll also have to remove all occurrences of project. from your code, plus the whole baseName line. (All of this is redundant anyway.)
new File(project.buildDir.path) is the same as buildDir. Instead of logger.info("Tar bundle :: "), you probably want doFirst { logger.info("Tar bundle :: ") }. Otherwise, this message will get logged for each and every build, no matter which tasks get executed.
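Putting that advice together, a minimal sketch of the root build.gradle could look like this (the from path is an assumption; point it at whatever each subproject should package):

// root build.gradle -- declares one archive task per subproject (a sketch)
subprojects {
    task archive(type: Tar) {
        destinationDir = buildDir      // each subproject's own build directory
        compression = Compression.GZIP
        from 'src/dist'                // assumed source directory; adjust to your layout
        doFirst { logger.info("Tar bundle :: ") }
    }
}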
