Gradle ant build file not available during configuration phase - groovy

I can't figure out how to execute an Ant target when the Ant build.xml file is not available during the configuration phase, because it is a remote resource (Maven Remote Resources Plugin).
So basically I first need to fetch the remote resource like this:
configurations {
    remoteResources
}
dependencies.remoteResources 'group:my-remote-resource:version'
task getRemoteResources(type: Copy) {
    from(zipTree(configurations.remoteResources.first()))
    into("$buildDir/remote-resources")
    // replace placeholders
    filter(org.apache.tools.ant.filters.ReplaceTokens, tokens: remotePlaceholders)
}
Only then do I have build.xml in
"$buildDir/remote-resources"
But I can't use ant.importBuild, as that expects build.xml to be available at configuration time, which is not the case here.
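For reference, the naive call would be something like the line below (pointing at the output of getRemoteResources), but Gradle evaluates it during the configuration phase, before that task has run:
// Fails at configuration time: build.xml has not been fetched/copied yet
ant.importBuild("$buildDir/remote-resources/build.xml")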
I was thinking of moving the remote resource "download" into the initialization phase, but I have a multi-module project, and although only some sub-projects use this remote resource, they each have their own placeholders to replace.
Is there any way to execute Ant targets in this special case?
EDIT: I found a pretty nice solution utilising Ant's ProjectHelper and Project classes, so I guess that is my answer.

So here is the final solution (as already mentioned, credits go to groovy-almanac).
import org.apache.tools.ant.Project
import org.apache.tools.ant.ProjectHelper

task executeAntTarget(dependsOn: getRemoteResources) << {
    def antFile = new File("$buildDir/remote-resources/build.xml")
    def antProject = new Project()
    antProject.init()
    ProjectHelper.projectHelper.parse(antProject, antFile)
    antProject.executeTarget('<target-to-execute>')
}

Related

Is additional context configuration required when upgrading cucumber-jvm from version 4 to version 6?

I am using cucumber-jvm to perform some functional tests in Kotlin.
I have the standard empty runner class:
@RunWith(Cucumber::class)
@CucumberOptions(features=[foo],
    glue=[bar],
    plugin=[baz],
    strict=true,
    monochrome=true)
class Whatever
The actual steps are defined in another class with the @ContextConfiguration Spring Framework annotation.
This class also uses other Spring features like @Autowired or @Qualifier:
@ContextConfiguration(locations=["x/y/z/config.xml"])
class MyClass {
    ...
    @Before
    ...
    @Given("some feature file stuff")
    ...
    // etc
}
This all works fine in cucumber version 4.2.0; however, upgrading to version 6.3.0 breaks things. After updating the imports to match the new cucumber project layout, the tests now fail with this error:
io.cucumber.core.backend.CucumberBackendException: Please annotate a glue class with some context configuration.
It provides examples of what it means...
For example:
@CucumberContextConfiguration
@SpringBootTest(classes = TestConfig.class)
public class CucumberSpringConfiguration {}
Or:
@CucumberContextConfiguration
@ContextConfiguration( ... )
public class CucumberSpringConfiguration {}
It looks like it's telling me I can just add @CucumberContextConfiguration to MyClass.
But why?
I get the point of @CucumberContextConfiguration (it's explained well here), but why do I need it now with version 6 when version 4 got on fine without it? I can't see any feature that was deprecated and replaced by this.
Any help would be appreciated :)
Since the error matches exactly the error I was getting when running Cucumber tests with Spring Boot, I am sharing my fix.
One probable reason is that Cucumber can't find the CucumberSpringConfiguration
class in the glue path.
Solution 1:
Move the CucumberSpringConfiguration class inside the glue path (which in my case was inside the steps package).
Solution 2:
Add the CucumberSpringConfiguration package path to the glue path.
In my project structure, the CucumberSpringConfiguration class lived in a configurations package, outside the glue path, so running the feature files from the command prompt (mvn clean test) threw the error:
"Please annotate a glue class with some context configuration."
So I applied solution 2, i.e. added the configurations package to the glue path in my runner class annotation.
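A minimal sketch of that runner change (the package names here are hypothetical placeholders; use your own):
import io.cucumber.junit.Cucumber
import io.cucumber.junit.CucumberOptions
import org.junit.runner.RunWith

@RunWith(Cucumber)
@CucumberOptions(
    features = ["src/test/resources/features"],
    // glue lists both the step-definition package and the package
    // that contains the @CucumberContextConfiguration class
    glue = ["com.example.steps", "com.example.configurations"]
)
class RunCucumberTest {}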
The CucumberSpringConfiguration class itself only needs the context configuration annotations.
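A minimal sketch of such a class, reusing the Spring XML location from the question; swap in @SpringBootTest or whatever context setup your project already uses:
import io.cucumber.spring.CucumberContextConfiguration
import org.springframework.test.context.ContextConfiguration

@CucumberContextConfiguration
@ContextConfiguration(locations = ["x/y/z/config.xml"])
class CucumberSpringConfiguration {}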
Just an extra bit of info:
To run the tests from the command prompt (mvn clean test) you also need the appropriate test plugin configured in pom.xml (typically maven-surefire-plugin) so that the runner class is picked up.
https://github.com/cucumber/cucumber-jvm/pull/1959 removed the context configuration auto-discovery. The author concluded that it hid user errors and removing it would provide more clarity and reduce complexity. It also listed the scenarios where the context configuration auto-discovery used to apply.
Note that it was introduced after https://github.com/cucumber/cucumber-jvm/pull/1911, which you had mentioned.
Had the same error, but while running Cucumber tests from a jar built with Gradle.
The solution was to add a rule to the (shadow) jar task to merge all files named "META-INF/services/io.cucumber.core.backend.BackendProviderService" (there can be several of them across the different Cucumber libs: cucumber-java, cucumber-spring).
For Gradle (Shadow plugin; AppendingTransformer is imported from com.github.jengelman.gradle.plugins.shadow.transformers) it is:
shadowJar {
    ....
    transform(AppendingTransformer) {
        resource = 'META-INF/services/io.cucumber.core.backend.BackendProviderService'
    }
}
For Maven something like this:
<transformers>
    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/services/io.cucumber.core.backend.BackendProviderService</resource>
    </transformer>
</transformers>
A bit more explanation can be found in this answer.

Jenkins Script to Restrict all builds to a given node

We've recently added a second slave node to our Jenkins build environment running a different OS (Linux instead of Windows) for specific builds. Unsurprisingly, this means we need to restrict builds via the "Restrict where this project can be run" setting. However we have a lot of builds (100+) so the prospect of clicking through them all to change this setting manually isn't thrilling me.
Can someone provide a groovy script to achieve this via the Jenkins script console? I've used similar scripts in the past for changing other settings but I can't find any reference for this particular setting.
Managed to figure out the script for myself based on previous scripts and the Jenkins source. Script is as follows:
import hudson.model.*
import hudson.model.labels.*
import hudson.maven.*
import hudson.tasks.*
import hudson.plugins.git.*

hudsonInstance = hudson.model.Hudson.instance
allItems = hudsonInstance.allItems
// Only touch items that are actually buildable jobs
buildableItems = allItems.findAll { job -> job instanceof BuildableItemWithBuildWrappers }
buildableItems.each { item ->
    item.allJobs.each { job ->
        // Pin the job to the given node label
        job.assignedLabel = new LabelAtom('windows-x86')
    }
}
Replace 'windows-x86' with whatever your node label needs to be. You could also do conditional changes based on item.name to filter out some jobs, if necessary.
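For example, a name-based filter on top of the script above might look like this (the 'linux-' prefix is just an illustrative convention):
buildableItems.each { item ->
    // Leave jobs that belong on the Linux node alone
    if (item.name.startsWith('linux-')) {
        return
    }
    item.allJobs.each { job ->
        job.assignedLabel = new LabelAtom('windows-x86')
    }
}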
You could try the Jenkins Job-DSL plugin, which lets you create a job that alters your other jobs. It works by providing a build step in a Groovy-based DSL for modifying other jobs.
This one here would add a label to the job 'xxxx'. I've cheated a bit by using the job itself as a template.
job {
    using 'xxxx'
    name 'xxxx'
    label 'Linux'
}
You might need to adjust it if some of your jobs are of different types.

Gradle Incremental Tasks: Adding already generated code to the classpath

I have created a custom Gradle task that generates some Java code. To optimize execution this plugin uses the @InputDirectory and @OutputDirectory annotations, so that the code does not have to be regenerated on each build.
However, I do want this task to add the generated code to the classpath. I am currently doing this by:
class JaxbTask extends DefaultTask {

    @OutputDirectory
    File destdir = project.file("${project.buildDir}/generated-sources/mygen")

    @InputDirectory
    File schemaRoot = project.file("${project.projectDir}/src/main/resources/myschema/")

    @TaskAction
    def main() {
        ..
        project.sourceSets.main.java.srcDirs += destdir
        ..
    }
}
The problem is that the TaskAction is not executed when the generated code is up to date, so the source directory is not added to the compile path. Is there any way to make sure that the modification of the source path is always performed?
A task should never try to configure the build model. Configuration is the responsibility of build scripts and plugins, and needs to happen in the configuration phase (before any task has run).
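A minimal sketch of that separation, assuming the JaxbTask from the question is registered under an arbitrary task name (generateJaxb below) and the java plugin is applied: the build script wires the output directory into the main source set at configuration time, and the task action only generates code.
// JaxbTask as defined in the question (e.g. in buildSrc)
task generateJaxb(type: JaxbTask)

// Configuration phase: the generated directory is always part of the main
// source set, whether or not generateJaxb needs to run this build.
sourceSets.main.java.srcDir generateJaxb.destdir

// Compilation triggers (and waits for) code generation when it is out of date.
compileJava.dependsOn generateJaxb
With this in place, the project.sourceSets.main.java.srcDirs += destdir line inside the task action is no longer needed.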

Jenkins Groovy Postbuild use static file instead of script

Is it possible to load an external groovy script into the groovy post build plugin instead of pasting the script content into each job? We have approximately 200 jobs so updating them all is rather time consuming. I know that I could write a script to update the config files directly (as in this post: Add Jenkins Groovy Postbuild step to all jobs), but these jobs run 24x7 so finding a window when I can restart Jenkins or reload the config is problematic.
Thanks!
Just put the following in the "Groovy script:" field:
evaluate(new File("... groovy script file name ..."));
Also, you might want to go even further.
What if the script name or path changes?
Using the Template plugin you can create a single "template" job, define the call to the groovy script (the line above) in there, and in all jobs that need it add the post-build action called "Use publishers from another project", referencing this template project.
Update: This is what really solved it for me: https://issues.jenkins-ci.org/browse/JENKINS-21480
"I am able to do just that by doing the following. Enter these lines in lieu of the script in the "Groovy script" box:"
// Delegate to an external script
// The filename must match the class name
import JenkinsPostBuild
def postBuild = new JenkinsPostBuild(manager)
postBuild.run()
"Then in the "Additional groovy classpath" box enter the path to that file."
We do it in the following fashion.
We have a file c:\somepath\MyScriptLibClass.groovy (accessible to Jenkins) which contains the code of a groovy class MyScriptLibClass. The class contains a number of functions designed to act like static methods (to be mixed in later).
We include these functions by writing the following statement at the beginning of system groovy and postbuild groovy steps:
[ // include lib scripts
'MyScriptLibClass'
].each{ this.metaClass.mixin(new GroovyScriptEngine('c:\\somepath').loadScriptByName(it+'.groovy')) }
This may look a bit ugly, but you only need to write it once per script. You could include more than one script and also use inheritance between library classes.
Here you can see that all methods from the library class are mixed into the current script. So if your class looks like:
class MyScriptLibClass {
    def setBuildName( String str ){
        binding?.variables['manager'].build.displayName = str
    }
}
in Groovy Postbuild you could write just:
[ // include lib scripts
'MyScriptLibClass'
].each{ this.metaClass.mixin(new GroovyScriptEngine('c:\\somepath').loadScriptByName(it+'.groovy')) }
setBuildName( 'My Greatest Build' )
and it will change your current build's name.
There are also other ways to load external groovy classes, and it is not necessary to use mixins. For instance, you can take a look at Compiling and using Groovy classes from Java at runtime?
How did I solve this:
Create file $JENKINS_HOME/scripts/PostbuildActions.groovy with following content:
public class PostbuildActions {
    void setBuildName(Object manager, String str) {
        // use the manager instance passed in from the postbuild script
        manager.build.displayName = str
    }
}
In this case in Groovy Postbuild you could write:
File script = new File("${manager.envVars['JENKINS_HOME']}/scripts/PostbuildActions.groovy")
Object actions = new GroovyClassLoader(getClass().getClassLoader()).parseClass(script).newInstance();
actions.setBuildName(manager, 'My Greatest Build');
If you wish to have the Groovy script in your Code Repository, and loaded onto the Build / Test Slave in the workspace, then you need to be aware that Groovy Postbuild runs on the Master.
For us, the master is a Unix Server, while the Build/Test Slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the Slave, and use a FilePath to the file.
The following worked for us:
import hudson.FilePath

// Get an instance of the Build object, and from there
// the channel from the Master to the Workspace
build = Thread.currentThread().executable
channel = build.workspace.channel
// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")
// Some have suggested that the "Not NULL" check is redundant
// I've kept it for completeness
if(fp != null)
{
// 'Evaluate' requires a string, so read the file contents to a String
script = fp.readToString();
// Execute the script
evaluate(script);
}
I just faced the same task and tried to use @Blaskovicz's approach.
Unfortunately it did not work for me, but I found updated code here (Zach Auclair).
Publishing it here with minor changes:
postbuild task
//imports
import hudson.model.*
import groovy.lang.GroovyClassLoader;
import groovy.lang.GroovyObject;
import java.io.File;
// define git file
def postBuildFile = manager.build.getEnvVars()["WORKSPACE"] + "/Jenkins/SimpleTaskPostBuildReporter.GROOVY"
def file = new File(postBuildFile)
// load custom class from file
Class groovy = this.class.classLoader.parseClass(file);
// create custom object
GroovyObject groovyObj = (GroovyObject) groovy.newInstance(manager);
// do report
groovyObj.report();
postbuild class file in git repo (SimpleTaskPostBuildReporter.GROOVY)
class SimpleTaskPostBuildReporter {
    def manager
    public SimpleTaskPostBuildReporter(Object manager) {
        if (manager == null) {
            throw new RuntimeException("Manager object can't be null")
        }
        this.manager = manager
    }
    public def report() {
        // do work with manager object
    }
}
I haven't tried this exactly.
You could try the Jenkins Job DSL plugin, which allows you to rebuild jobs from within Jenkins using a Groovy DSL and supports post-build groovy steps directly. From the wiki:
Groovy Postbuild
Executes Groovy scripts after a build.
groovyPostBuild(String script, Behavior behavior = Behavior.DoNothing)
Arguments:
script: The Groovy script to execute after the build. See the plugin's page for details on what can be done.
behavior: optional. If the script fails, allows you to mark the build as failed, unstable, or do nothing. The behavior argument uses an enum, which currently has three values: DoNothing, MarkUnstable, and MarkFailed.
Examples:
This example will run a groovy script that prints hello, world and if that fails, it won't affect the build's status:
groovyPostBuild('println "hello, world"')
This example will run a groovy script, and if that fails will mark the build as failed:
groovyPostBuild('// some groovy script', Behavior.MarkFailed)
This example will run a groovy script, and if that fails will mark the build as unstable:
groovyPostBuild('// some groovy script', Behavior.MarkUnstable)
(Since 1.19)
There is a facility to use a template job (this is the bit I haven't tried) which could be the job itself so you only need to add the post build step. If you don't use a template you need to recode the whole project.
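A rough sketch of that template idea, combining the using call shown earlier with the groovyPostBuild publisher quoted above (the job name and script path are placeholders, and the exact nesting of groovyPostBuild can differ between Job DSL versions):
job {
    name 'existing-job'
    using 'existing-job' // use the existing job itself as the template
    publishers {
        // delegate the post-build logic to an external script on the master
        groovyPostBuild('evaluate(new File("/path/to/PostbuildActions.groovy"))')
    }
}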
My approach is to have a script to regenerate or create all jobs from scratch just so I don't have to apply the same upgrade multiple times. Regenerated jobs keep their build history
I was able to get the following to work (I also posted on this jira issue).
In my postbuild task:
// parseClass needs a File here; a bare String would be compiled as source text
def reporterClass = this.class.classLoader.parseClass(new File("/home/jenkins/GitlabPostbuildReporter.groovy"))
reporterClass.newInstance(manager).report()
In my file on disk at /home/jenkins/GitlabPostbuildReporter.groovy:
class GitlabPostbuildReporter {
    def manager
    public GitlabPostbuildReporter(manager) {
        if (manager == null) {
            throw new RuntimeException("Manager object mustn't be null")
        }
        this.manager = manager
    }
    public def report() {
        // do work with manager object
    }
}

How to refer to the values to be declared in build.gradle

I'm totally new to Gradle, TeamCity and Groovy.
I'm trying to write a plugin which will get the value from svninfo. If the developers want to override the value (in build.gradle), they can do something like this:
globalVariables {
    virtualRepo = "virtualRepo"
    baseName = "baseName"
    version = "version"
    group = "group"
}
Here I provide the extension called globalVariables.
Now, the jars to be produced should be named according to the values from build.gradle.
How do I get the values from build.gradle in the plugin in order to name the jar?
Not sure I understand the question. It's the plugin that installs the extension object, and it's the plugin that needs to do something with it.
Note that the plugin has to defer reading information from the extension object because the latter might only get populated after the plugin has run. (A plugin runs when apply plugin: is called in a build script. globalVariables {} might come after that.) There are several techniques for deferring configuration. In your particular case, if I wanted to use the information provided by the extension object to configure a Jar task, I might use jar.doFirst { ... } or gradle.projectsEvaluated { jar. ... }.
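For illustration, a minimal sketch of a plugin that defers reading the extension via gradle.projectsEvaluated; the extension and property names simply mirror the globalVariables block above, and the overall structure is an assumption about how your plugin is laid out:
import org.gradle.api.Plugin
import org.gradle.api.Project
import org.gradle.api.tasks.bundling.Jar

class GlobalVariablesExtension {
    String virtualRepo
    String baseName
    String version
    String group
}

class GlobalVariablesPlugin implements Plugin<Project> {
    void apply(Project project) {
        def globals = project.extensions.create('globalVariables', GlobalVariablesExtension)
        // Defer reading the extension: at apply() time the build script's
        // globalVariables { ... } block has usually not been evaluated yet.
        project.gradle.projectsEvaluated {
            project.tasks.withType(Jar) { jar ->
                jar.baseName = globals.baseName
                jar.version = globals.version
            }
        }
    }
}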
Before you go about writing plugins, make sure to study the Writing Custom Plugins chapter in the Gradle user guide. A search on Stack Overflow or on http://forums.gradle.org should turn up more information on techniques for deferring configuration. Another valuable source of information is the Gradle codebase (e.g. the plugins in the code-quality subproject).
