create root level gradle task to be used by several subprojects? - groovy

I want to create a task in my root project that will create a custom TAR for some subprojects.
task archive(type: Tar) {
    project.logger.info("Tar bundle :: ")
    baseName = project.baseName
    destinationDir = new File(project.buildDir.path)
    archiveDir = new File(project.buildDir.path)
    compression = Compression.GZIP
    from(archiveDir)
}
I have found this task, but what I need is for a subproject to pass in its subproject.name and have the task use the buildDir and the source dir of the subproject calling the task, then save the archive to the subproject's /target as well.
Is this possible without creating a new task for each subproject?

If every subproject is supposed to create its own Tar, then you need to create one task per subproject. This is as easy as wrapping the task declaration in subprojects { ... }. You'll also have to remove all occurrences of project. from your code, as well as the whole baseName line. (All of this is redundant anyway.)
new File(project.buildDir.path) is the same as buildDir. Instead of logger.info("Tar bundle :: "), you probably want doFirst { logger.info("Tar bundle :: ") }; otherwise, this message gets logged on every build, no matter which tasks get executed.
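A minimal sketch of that approach in the root build.gradle (the source directory here is an assumption; adjust it to whatever each subproject should bundle):
subprojects {
    // one archive task is created per subproject, and each task resolves
    // buildDir, projectDir etc. against the subproject it belongs to
    task archive(type: Tar) {
        destinationDir = buildDir
        compression = Compression.GZIP
        from "${projectDir}/src"   // hypothetical source dir to bundle
        doFirst { logger.info("Tar bundle :: ") }
    }
}
With the base plugin applied, baseName already defaults to the subproject's name, which is exactly what the question asks for.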

Related

How to prevent rawproto file generation or delete them automatically?

Android gradle plugin generates tons of .rawproto files in build/android-profile directory. What are they used for? Is there a way to disable this madness or automatically delete them?
I've been bugged by it for a long time, and now that I noticed there are gigabytes of this hogging my smallish SSD, I've decided to figure out a way to disable it. For me, even before it occupied too much space, the most annoying thing was gradlew clean leaving a build folder behind.
Only tested with com.android.tools.build:gradle:3.0.1, so YMMV.
TL;DR
For global application, read the last section; for per-project use, put this in the rootProject's build.gradle:
com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
    new com.android.build.gradle.internal.profile.RecordingBuildListener(
        com.android.builder.profile.ProcessProfileWriter.get());
// and then `gradlew --stop && gradlew clean` to verify no build folder is left behind
Investigation
Thanks to https://stackoverflow.com/a/43910084/253468 linked by @JeffRichards mentioning ProcessProfileWriterFactory.java, I've put a breakpoint there and checked who's calling it by running gradlew -Dorg.gradle.debug=true --info (not to be confused with --debug) and attaching a remote debugger.
I followed the trail and found that ProcessProfileWriter.finishAndMaybeWrite creates the folder and writes. Backtracing on method calls I found that ProfilerInitializer.recordingBuildListener controls whether it's called ... and that is initialized directly by BasePlugin (apply plugin: 'com.android.*').
So in order to prevent anything from happening, I opted to try to disable the guard by pre-initializing that static field. Thankfully Groovy (and hence Gradle) doesn't give a * about JVM visibility modifiers, so without reflection here's the magic line:
com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
    new com.android.build.gradle.internal.profile.RecordingBuildListener(
        com.android.builder.profile.ProcessProfileWriter.get());
I know, it's a bit verbose, but it works, and if you import stuff it looks better:
ProfilerInitializer.recordingBuildListener = new RecordingBuildListener(ProcessProfileWriter.get());
Applying the magic
In a single-project build (one build.gradle) you must apply this before
apply plugin: 'com.android.application'
In multi-project builds (most template projects: app folder, settings.gradle, and many build.gradles) I suggest you apply it around the buildscript block:
buildscript {
    // ...
    dependencies {
        classpath 'com.android.tools.build:gradle:3.0.1'
    }
}
// magic line here
Make sure it's before any `apply plugin:` line, and not inside a buildscript block.
Applying the magic globally
Obviously, if it bothers us in one project, it will in all of them, so following Gradle's manual, create a file at ~/.gradle/init.gradle or %USERPROFILE%\.gradle\init.gradle (mind you, this folder can be relocated with GRADLE_USER_HOME) with the following contents:
// NB: any changes to this script require a new daemon (`gradlew --stop` or `gradlew --no-daemon <tasks>`)
rootProject { rootProject -> // see https://stackoverflow.com/a/48087543/253468
    // listen for lifecycle events on the project's plugins
    rootProject.plugins.whenPluginAdded { plugin ->
        // check if any Android plugin is being applied (not necessarily just 'com.android.application')
        // this plugin exists exactly for this purpose: to get notified
        if (plugin.class.name == 'com.android.build.gradle.api.AndroidBasePlugin') {
            logger.info 'Turning off `build/android-profile/profile-*.(rawproto|json)` generation.'
            // execute the hack in the context of the buildscript, not in this initscript
            new GroovyShell(plugin.class.classLoader).evaluate("""
                com.android.build.gradle.internal.profile.ProfilerInitializer.recordingBuildListener =
                    new com.android.build.gradle.internal.profile.RecordingBuildListener(
                        com.android.builder.profile.ProcessProfileWriter.get());
            """)
        }
    }
}

Gradle ant build file not available during configuration phase

I can't figure out how to execute an Ant target when the Ant build.xml file is not available during the configuration phase, because it's a remote resource (Maven Remote Resources Plugin).
So basically first I need to get this remote resource like this:
configurations {
    remoteResources
}
dependencies.remoteResources 'group:my-remote-resource:version'
task getRemoteResources(type: Copy) {
    from(zipTree(configurations.remoteResources.first()))
    into("$buildDir/remote-resources")
    // replace placeholders
    filter(org.apache.tools.ant.filters.ReplaceTokens, tokens: remotePlaceholders)
}
Only then do I have build.xml in "$buildDir/remote-resources".
But I can't use ant.importBuild as that expects build.xml to be available during the configuration, which is not my case.
I was thinking of moving the remote-resource "download" into the initialization phase, but I have a multi-module project, and although only some sub-projects use this remote resource, they each have their own placeholders to replace.
Is there any way how to execute ant targets in this special case?
EDIT: I found a pretty nice solution utilising Ant's ProjectHelper and Project classes. So I guess that is my answer.
So here is the final solution (as already mentioned, credits go to groovy-almanac):
import org.apache.tools.ant.Project
import org.apache.tools.ant.ProjectHelper

task executeAntTarget(dependsOn: getRemoteResources) << {
    def antFile = new File("$buildDir/remote-resources/build.xml")
    def antProject = new Project()
    antProject.init()
    ProjectHelper.projectHelper.parse(antProject, antFile)
    antProject.executeTarget('<target-to-execute>')
}

Gradle: Can't access configuration defined in one project from another project

I'm pretty new to both gradle and groovy.
The Problem
I've got a very simple multi-project structure just like below:
Root project 'gradle_test'
+--- Project ':sub1'
\--- Project ':sub2'
This is what the 'build.gradle' file looks like for the sub1 project:
// build.gradle of sub1 project
task testConfiguration {
    println project(':sub2').configurations.sub2FooConfiguration
}
And finally, this is the 'build.gradle' file of the sub2 project:
// build.gradle of sub2 project
configurations {
    sub2FooConfiguration
}
Very minimal. Now, if I run gradle :sub1:testConfiguration, I get the following error:
A problem occurred evaluating project ':sub1'.
> Could not find property 'sub2FooConfiguration' on configuration container.
However, everything just works if the testConfiguration task in sub1 project is modified like this:
// notice the "<<" (I believe this is calling the 'doLast' method on the task instance)
task testConfiguration << {
    println project(':sub2').configurations.sub2FooConfiguration
}
The Question
My assumption about the difference between the two versions of the 'testConfiguration' task is that in the first instance a configuration closure is passed to the task, whereas in the modified version a 'normal' closure is passed to the 'doLast' method.
So, first of all, is my assumption correct?
Secondly, why don't I have access to the 'sub2' project in the first instance?
And finally, is it possible to access 'sub2' project in the first instance (i.e. in the configuration closure)?
[Update] A Further Question
Given the accepted answer provided by "Invisible Arrow", I'd like to ask a further question regarding the best practice of referencing a configuration of another project (i.e. a task in sub1 needs to use an archive produced by sub2 project).
Should I declare evaluation dependency between the two projects?
Or Should I only reference sub2's configuration at execution time (e.g. in doLast())?
Or, should I create a dependency configuration between the two projects?
Yes, there is a difference between the two.
There are essentially 3 phases to a build which are Initialization, Configuration and Execution. This is described in detail under the Build Lifecycle chapter in the Gradle documentation.
In your case, the first instance falls under the Configuration phase, which is always evaluated irrespective of whether the task is executed or not. That means all statements within the closure are executed when you start a build.
task testConfiguration {
    // This always runs during a build,
    // irrespective of whether the task is executed or not
    println project(':sub2').configurations.sub2FooConfiguration
}
The second instance falls under the Execution phase. Note that << is a shorthand for doLast, and this closure is called when the task is executed.
task testConfiguration << {
    // Called during actual execution of the task,
    // and called only if the task was scheduled to be executed.
    // Note that the Configuration phase for both projects is complete at this point,
    // which is why :sub1 is able to access :sub2's configurations.sub2FooConfiguration
    println project(':sub2').configurations.sub2FooConfiguration
}
Now coming to why the first instance gave the error. This was because the Configuration phase of sub2 project was not evaluated yet. Hence the sub2FooConfiguration wasn't yet created.
Why? Because there is no explicit evaluation dependency between sub1 and sub2. In your case, sub1 needs sub2 as an evaluation dependency, hence we can add that dependency in sub1's build.gradle before the task declaration, as follows:
evaluationDependsOn(':sub2')

task testConfiguration {
    println project(':sub2').configurations.sub2FooConfiguration
}
This will ensure that sub2 is always evaluated before sub1 (evaluation meaning the Configuration phase for the projects). sub1 will now be able to access configurations.sub2FooConfiguration in the task declaration closure. This is explained in detail in the Multi-project Builds chapter.
In the second instance, configurations.sub2FooConfiguration was accessible, as the call was in the execution block of the task (which is after the Configuration phase for both projects).
PS: Note that if you reversed the names of the projects, the first instance might actually work, as Gradle configures projects alphabetically when there are no explicit dependencies. But, of course, you should never rely on this, and should ensure that dependencies between projects are declared explicitly.
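As for the follow-up question: when a task in sub1 consumes an archive produced by sub2, a dependency configuration between the projects is generally the cleanest of the three options, because resolving it also wires up task dependencies on sub2's archive-producing tasks. A minimal sketch in sub1's build.gradle (the configuration and task names on the sub1 side are assumptions):
configurations {
    fooArchive
}
dependencies {
    // points at the artifacts published by sub2's configuration
    fooArchive project(path: ':sub2', configuration: 'sub2FooConfiguration')
}
task useArchive {
    doLast {
        // resolves at execution time and triggers whatever tasks build the archive
        println configurations.fooArchive.files
    }
}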

Gradle Incremental Tasks: Adding already generated code to the classpath

I have created a custom Gradle task that generates some Java code. To optimize execution, this task uses the @InputDirectory and @OutputDirectory annotations, so that the code does not have to be generated on each build.
However, I do want this task to add the generated code to the classpath. I am currently doing this as follows:
class JaxbTask extends DefaultTask {

    @OutputDirectory
    File destdir = project.file("${project.buildDir}/generated-sources/mygen")

    @InputDirectory
    File schemaRoot = project.file("${project.projectDir}/src/main/resources/myschema/")

    @TaskAction
    def main() {
        ..
        project.sourceSets.main.java.srcDirs += destdir
        ..
    }
}
The problem is that the TaskAction is not executed, and the source directory is not added to the compile path, when the generated code is up to date. Is there any way to make sure that the modification of the source path is always performed?
A task should never try to configure the build model. Configuration is the responsibility of build scripts and plugins, and needs to happen in the configuration phase (before any task has run).
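In other words, move the srcDirs modification out of the task action and into the build script, where it runs on every build regardless of whether the task is up to date. A minimal sketch, assuming the java plugin is applied and the task is registered as 'jaxb':
task jaxb(type: JaxbTask)

// configuration phase: the generated directory is always on the compile path,
// even when the jaxb task is UP-TO-DATE and its action never runs
sourceSets.main.java.srcDir jaxb.destdir

// ensure the generator runs before compilation
compileJava.dependsOn jaxb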

Jenkins Groovy Postbuild use static file instead of script

Is it possible to load an external groovy script into the Groovy Postbuild plugin instead of pasting the script content into each job? We have approximately 200 jobs, so updating them all is rather time-consuming. I know that I could write a script to update the config files directly (as in this post: Add Jenkins Groovy Postbuild step to all jobs), but these jobs run 24x7, so finding a window when I can restart Jenkins or reload the config is problematic.
Thanks!
Just put the following in the "Groovy script:" field:
evaluate(new File("... groovy script file name ..."));
Also, you might want to go even further.
What if script name or path changes?
Using the Template plugin you can create a single "template" job, define the call to the groovy script (the line above) in there, and in all jobs that need it add a post-build action called "Use publishers from another project" referencing this template project.
Update: This is what really solved it for me: https://issues.jenkins-ci.org/browse/JENKINS-21480
"I am able to do just that by doing the following. Enter these lines in lieu of the script in the "Groovy script" box:"
// Delegate to an external script
// The filename must match the class name
import JenkinsPostBuild
def postBuild = new JenkinsPostBuild(manager)
postBuild.run()
"Then in the "Additional groovy classpath" box enter the path to that file."
We do it in the following fashion.
We have a file c:\somepath\MyScriptLibClass.groovy (accessible to Jenkins) which contains the code of a groovy class MyScriptLibClass. The class contains a number of functions designed to act like static methods (to be mixed in later).
We include these functions by writing the following statement at the beginning of system groovy and postbuild groovy steps:
[ // include lib scripts
    'MyScriptLibClass'
].each{ this.metaClass.mixin(new GroovyScriptEngine('c:\\somepath').loadScriptByName(it+'.groovy')) }
This may look a bit ugly, but you need to write it only once per script. You can include more than one script and also use inheritance between library classes.
Here you see that all methods from the library class are mixed into the current script. So if your class looks like:
class MyScriptLibClass {
    def setBuildName( String str ){
        binding?.variables['manager'].build.displayName = str
    }
}
in Groovy Postbuild you could write just:
[ // include lib scripts
    'MyScriptLibClass'
].each{ this.metaClass.mixin(new GroovyScriptEngine('c:\\somepath').loadScriptByName(it+'.groovy')) }
setBuildName( 'My Greatest Build' )
and it will change your current build's name.
There are also other ways to load external groovy classes, and it is not necessary to use mixins. For instance, you can take a look here: Compiling and using Groovy classes from Java at runtime?
How did I solve this:
Create the file $JENKINS_HOME/scripts/PostbuildActions.groovy with the following content:
public class PostbuildActions {
    void setBuildName(Object manager, String str){
        manager.build.displayName = str
    }
}
In this case in Groovy Postbuild you could write:
File script = new File("${manager.envVars['JENKINS_HOME']}/scripts/PostbuildActions.groovy")
Object actions = new GroovyClassLoader(getClass().getClassLoader()).parseClass(script).newInstance();
actions.setBuildName(manager, 'My Greatest Build');
If you wish to have the Groovy script in your Code Repository, and loaded onto the Build / Test Slave in the workspace, then you need to be aware that Groovy Postbuild runs on the Master.
For us, the master is a Unix Server, while the Build/Test Slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the Slave, and use a FilePath to the file.
The following worked for us:
// Get an instance of the Build object, and from there
// the channel from the Master to the Workspace
build = Thread.currentThread().executable
channel = build.workspace.channel

// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")

// Some have suggested that the "Not NULL" check is redundant;
// I've kept it for completeness
if (fp != null) {
    // 'evaluate' requires a String, so read the file contents to a String
    script = fp.readToString()
    // Execute the script
    evaluate(script)
}
I've just been faced with the same task and tried to use @Blaskovicz's approach.
Unfortunately it did not work for me, but I found upgraded code here (Zach Auclair).
Published here with minor changes:
postbuild task
//imports
import hudson.model.*
import groovy.lang.GroovyClassLoader;
import groovy.lang.GroovyObject;
import java.io.File;
// define git file
def postBuildFile = manager.build.getEnvVars()["WORKSPACE"] + "/Jenkins/SimpleTaskPostBuildReporter.GROOVY"
def file = new File(postBuildFile)
// load custom class from file
Class groovy = this.class.classLoader.parseClass(file);
// create custom object
GroovyObject groovyObj = (GroovyObject) groovy.newInstance(manager);
// do report
groovyObj.report();
postbuild class file in git repo (SimpleTaskPostBuildReporter.GROOVY)
class SimpleTaskPostBuildReporter {
    def manager
    public SimpleTaskPostBuildReporter(Object manager){
        if(manager == null) {
            throw new RuntimeException("Manager object can't be null")
        }
        this.manager = manager
    }
    public def report() {
        // do work with manager object
    }
}
I haven't tried this exactly.
You could try the Jenkins Job DSL plugin, which allows you to rebuild jobs from within Jenkins using a Groovy DSL, and supports Groovy post-build steps directly. From the wiki:
Groovy Postbuild
Executes Groovy scripts after a build.
groovyPostBuild(String script, Behavior behavior = Behavior.DoNothing)
Arguments:
script: The Groovy script to execute after the build. See the plugin's page for details on what can be done.
behavior: optional. If the script fails, allows you to mark the build as failed, unstable, or do nothing. The behavior argument uses an enum, which currently has three values: DoNothing, MarkUnstable, and MarkFailed.
Examples:
This example will run a groovy script that prints hello, world, and if that fails, it won't affect the build's status:
groovyPostBuild('println "hello, world"')
This example will run a groovy script, and if that fails, will mark the build as failed:
groovyPostBuild('// some groovy script', Behavior.MarkFailed)
This example will run a groovy script, and if that fails, will mark the build as unstable:
groovyPostBuild('// some groovy script', Behavior.MarkUnstable)
(Since 1.19)
There is a facility to use a template job (this is the bit I haven't tried), which could be the job itself, so you would only need to add the post-build step. If you don't use a template, you need to recode the whole project.
My approach is to have a script that regenerates or creates all jobs from scratch, just so I don't have to apply the same upgrade multiple times. Regenerated jobs keep their build history.
I was able to get the following to work (I also posted on this jira issue).
in my postbuild task
this.class.classLoader.parseClass("/home/jenkins/GitlabPostbuildReporter.groovy")
GitlabPostbuildReporter.newInstance(manager).report()
in my file on disk at /home/jenkins/GitlabPostbuildReporter.groovy
class GitlabPostbuildReporter {
    def manager
    public GitlabPostbuildReporter(manager){
        if(manager == null) {
            throw new RuntimeException("Manager object mustn't be null")
        }
        this.manager = manager
    }
    public def report() {
        // do work with manager object
    }
}
