How to retrieve Jenkins build parameters using the Groovy API?

I have a parameterized job that uses the Perforce plugin and would like to retrieve the build parameters/properties as well as the p4.change property that's set by the Perforce plugin.
How do I retrieve these properties with the Jenkins Groovy API?

Update: Jenkins 2.x solution:
With the Jenkins 2 Pipeline DSL, you can directly access any parameter with the trivial syntax based on the params (Map) built-in:
echo " FOOBAR value: ${params.'FOOBAR'}"
The returned value will be a String or a boolean depending on the Parameter type itself. The syntax is the same for scripted or declarative syntax. More info at: https://jenkins.io/doc/book/pipeline/jenkinsfile/#handling-parameters
If your parameter name is itself in a variable:
def paramName = "FOOBAR"
def paramValue = params.get(paramName) // or: params."${paramName}"
echo """ FOOBAR value: ${paramValue}"
Original Answer for Jenkins 1.x:
For Jenkins 1.x, the syntax is based on the build.buildVariableResolver built-ins:
// ... or if you want the parameter by name ...
def hardcoded_param = "FOOBAR"
def resolver = build.buildVariableResolver
def hardcoded_param_value = resolver.resolve(hardcoded_param)
Please note the official Jenkins Wiki page covers this in more detail as well, especially how to iterate over the build parameters:
https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+System+Groovy+script
The salient part is reproduced below:
// get parameters
def parameters = build?.actions.find{ it instanceof ParametersAction }?.parameters
parameters.each {
println "parameter ${it.name}:"
println it.dump()
}

For resolving a single parameter (I guess what's most commonly needed), this is the simplest I found:
build.buildVariableResolver.resolve("myparameter")
in your Groovy System script build step.

Regarding parameters:
See this answer first. To get a list of all the builds for a project (obtained as per that answer):
project.builds
When you find your particular build, you need to get all actions of type ParametersAction with build.getActions(hudson.model.ParametersAction). You then query the returned object for your specific parameters.
Regarding p4.change: I suspect it is also stored as an action. In the Jenkins Groovy console, get all the actions for a build that contains p4.change and examine them; this will give you an idea of what to look for in your code (see the sketch below).
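For example, here is a minimal sketch of both ideas as a system Groovy script (assuming the usual build binding of a system Groovy build step; the parameter name MY_PARAM is a placeholder):
import hudson.model.ParametersAction
// Grab the first ParametersAction, if the build has one, and read a single parameter.
def paramsAction = build.getAction(ParametersAction)
def myValue = paramsAction?.getParameter('MY_PARAM')?.value
println "MY_PARAM = ${myValue}"
// Dump every action on the build; the Perforce plugin's p4.change data should
// show up in one of these if it is stored as an action.
build.actions.each { action ->
    println "${action.class.name}: ${action}"
}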

I've just got this working, so specifically, using the Groovy Postbuild plugin, you can do the following:
def paramText
def actionList = manager.build.getActions(hudson.model.ParametersAction)
if (actionList.size() != 0)
{
def pA = actionList.get(0)
paramText = pA.createVariableResolver(manager.build).resolve("MY_PARAM_NAME")
}

In cases when a parameter name cannot be hardcoded I found this would be the simplest and best way to access parameters:
def myParam = env.getProperty(dynamicParamName)
In cases, when a parameter name is known and can be hardcoded the following 3 lines are equivalent:
def myParam = env.getProperty("myParamName")
def myParam = env.myParamName
def myParam = myParamName

To get the parameterized build params from the current build in your Groovy script (using Pipeline), all you need to do is:
Say you had a variable called VARNAME.
def myVariable = env.VARNAME

Get all of the parameters:
System.getenv().each{
println it
}
Or more sophisticated:
def myvariables = getBinding().getVariables()
for (v in myvariables) {
echo "${v} " + myvariables.get(v)
}
You will need to disable "Use Groovy Sandbox" for both.

If you are trying to get all parameters passed to Jenkins job you can use the global variable params in your groovy pipeline to fetch it.
http://jenkins_host:8080/pipeline-syntax/globals
params
Use something like below.
def dumpParameter()
{
params.each {
println it.key + " = " + it.value
}
}

Thanks patrice-n! This code worked to get both queued and running jobs and their parameters:
import hudson.model.Job
import hudson.model.ParametersAction
import hudson.model.Queue
import jenkins.model.Jenkins
println("================================================")
for (Job job : Jenkins.instanceOrNull.getAllItems(Job.class)) {
if (job.isInQueue()) {
println("------------------------------------------------")
println("InQueue " + job.name)
Queue.Item queue = job.getQueueItem()
if (queue != null) {
println(queue.params)
}
}
if (job.isBuilding()) {
println("------------------------------------------------")
println("Building " + job.name)
def build = job.getBuilds().getLastBuild()
def parameters = build?.getAllActions().find{ it instanceof ParametersAction }?.parameters
parameters.each {
def dump = it.dump()
println "parameter ${it.name}: ${dump}"
}
}
}
println("================================================")

The following can be used to retrieve an environment parameter:
println System.getenv("MY_PARAM")

The following snippet worked for me to get a parameter value in a parameterized project:
String myParameter = this.getProperty('binding').getVariable('MY_PARAMETER')
The goal was to dynamically lock a resource based on the selected project parameter.
In "[✓] This build requires lockable resources" I have the following "[✓] Groovy Expression":
if (resourceName == 'resource_lock_name') {
Binding binding = this.getProperty('binding')
String profile = binding.getVariable('BUILD_PROFILE')
return profile == '-Poradb' // acquire lock if "oradb" profile is selected
}
return false
In "[✓] This project is parameterized" section I have a "Choice Parameter" named e.g. BUILD_PROFILE
Example of Choices are:
-Poradb
-Ph2db
-DskipTests -T4
The lock on "resource_lock_name" will be acquired only if "-Poradb" is selected when building project with parameters
[-] "Use Groovy Sandbox" must be unchecked for this syntax to work

Reading file from Workspace in Jenkins with Groovy script

I want to add a Build step with the Groovy plugin to read a file and trigger a build fail depending on the content of the file.
How can I inject the workspace file path in the groovy plugin ?
myFileDirectory = // Get workspace filepath here ???
myFileName = "output.log"
myFile = new File(myFileDirectory + myFileName)
lastLine = myFile.readLines().get(myFile.readLines().size().toInteger() - 1)
if (lastLine ==~ /.Fatal Error.*/ ){
println "Fatal error found"
System.exit(1)
} else{
println "nothing to see here"
}
I realize this question was about creating a plugin, but since the new Jenkins 2 Pipeline builds use Groovy, I found myself here while trying to figure out how to read a file from a workspace in a Pipeline build. So maybe I can help someone like me out in the future.
Turns out it's very easy: there is a readFile step, and I should have RTFM'd:
env.WORKSPACE = pwd()
def version = readFile "${env.WORKSPACE}/version.txt"
If you are trying to read a file from the workspace during a pipeline build step, there's a method for that:
readFile('name-of-file.groovy')
For reference, see https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#readfile-read-file-from-workspace.
Based on your comments, you would be better off with the Text-finder plugin.
It allows you to search file(s), as well as the console, for a regular expression and then mark the build either unstable or failed if a match is found.
As for the Groovy, you can use the following to access ${WORKSPACE} environment variable:
def workspace = manager.build.getEnvVars()["WORKSPACE"]
Although this question is only about finding the directory path ($WORKSPACE), I had a requirement to read a file from the workspace and parse it into a JSON object in order to read Sonar issues (ignoring minor/notes issues).
Might help someone, this is how I did it-
from readFile
jsonParse(readFile('xyz.json'))
and jsonParse method-
@NonCPS
def jsonParse(text) {
return new groovy.json.JsonSlurperClassic().parseText(text);
}
This will also require script approval in Manage Jenkins -> In-process Script Approval.
May this help to someone if they have the same requirement.
This will read a file that contains the Jenkins Job name and run them iteratively from one single job.
Please change below code accordingly in your Jenkins.
pipeline {
agent any
stages {
stage('Hello') {
steps {
script{
git branch: 'Your Branch name', credentialsId: 'Your credentials ID', url: 'Your BitBucket Repo URL'
// To read the file from the workspace which will contain the Jenkins job names
def filePath = readFile "${WORKSPACE}/Your File Location"
// To read the file line by line
def lines = filePath.readLines()
// To iterate over and run the Jenkins jobs one by one
for (line in lines) {
build(job: "$line/branchName",
parameters:
[string(name: 'vertical', value: "${params.vert}"),
string(name: 'environment', value: "${params.env}"),
string(name: 'branch', value: "${params.branch}"),
string(name: 'project', value: "${params.project}")
]
)
}
}
}
}
}
}
If you already have the Groovy (Postbuild) plugin installed, I think it's a valid desire to get this done with (generic) Groovy instead of installing a (specialized) plugin.
That said, you can get the workspace using manager.build.workspace.getRemote(). Don't forget to add File.separator between the path and the file name.
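A rough sketch of that approach as a Groovy Postbuild script, reusing the output.log and "Fatal Error" check from the question (note that new File only works if the workspace lives on the node where this script runs):
def workspacePath = manager.build.workspace.getRemote()
def logFile = new File(workspacePath + File.separator + "output.log")
// Fail the build if the last line of the log reports a fatal error.
if (logFile.exists() && logFile.readLines().last() ==~ /.*Fatal Error.*/) {
    manager.buildFailure()
}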
As mentioned in a different post, Read .txt file from workspace groovy script in Jenkins, I was struggling to make this work for the pom modules of a file in the workspace, for use in the
Extended Choice Parameter. Here is my solution, with the printlns left in:
import groovy.util.XmlSlurper
import java.util.Map
import jenkins.*
import jenkins.model.*
import hudson.*
import hudson.model.*
try{
//get Jenkins instance
def jenkins = Jenkins.instance
//get job Item
def item = jenkins.getItemByFullName("The_JOB_NAME")
println item
// get workspacePath for the job Item
def workspacePath = jenkins.getWorkspaceFor (item)
println workspacePath
def file = new File(workspacePath.toString()+"\\pom.xml")
def pomFile = new XmlSlurper().parse(file)
def pomModules = pomFile.modules.children().join(",")
return pomModules
} catch (Exception ex){
println ex.message
}

Delete jenkins builds during Promote / promotion step

Jenkins: Version 1.525
Jenkins Server URL: http://my.jenkins.server.com:9040
Linux Red Hat 5.3
Artifactory: Free version
Artifactory Server URL: http://my.artifactory.server:8081/Artifactory
I'm successfully able to build in Jenkins and upload artifacts to my Artifactory server under a respective repository.
When a build occurs, the artifacts (ProjectA-1.0.0.25.tar.gz) go to Artifactory under the libs-snapshot-local repository. Here 1.0.0 denotes the major, minor and interim version of a given release for an application/Jenkins job: "ProjectA" in this case. Let's say 25 is the build number.
When ProjectA build gets stable in Development, we promote a given build of that application release to INT or any other higher environment (QA/PrePROD etc).
During this promotion process, we just select which build to promote and using Jenkins Promoted Build Plugin, we are able to do it successfully.
Now, what we need is:
During the promote process, I want to call a Groovy script, which will delete all Jenkins builds from Jenkins and Artifactory (libs-snapshot-local) for ProjectA's release 1.0.0 after "ProjectA-1.0.0.25.tar.gz" is successfully promoted to INT. Promotion part is working fine right now; All I need is a Groovy script which will delete Jenkins builds (1.0.0.1 to 1.0.0.24 and >= 1.0.0.25) in Jenkins and its associated artifacts from Artifactory repository (libs-snapshot-local).
The idea in our company is: once a release version's build is promoted for an application, we want to delete all the other builds/artifacts we have (in Jenkins/Artifactory) forever, using a Groovy script. Someone will ask, what if I want to promote a different build#? Well, in our case we don't want that. The simple rule is: if someone promotes ProjectA-1.0.0.25.tar.gz, then delete ProjectA's builds/artifacts in Jenkins and Artifactory where the build/artifact is other than 1.0.0.25, and continue with the new release 1.1.0.
The script with the following capability would be great.
1. Use property files (jenkins.properties / artifactory.properties) - which will contain some variables for hostname/username/password etc., if any.
2. Use REST API to perform the deletion for a given application/job and given release (for ex 1.0.0)
3. Can be used for both Jenkins/Artifactory deletion - i.e. at the command prompt I say use this (Jenkins) property file or that (Artifactory) one - as in both cases, the application and its release value will be the same.
4. We know that for promoting a build to INT (using Jenkins promote plugin), we'll always delete from Jenkins server and from Artifactory server only at libs-snapshot-release.
Now if someone does promotion to QA (at a later time), then artifactory repository will be (libs-stage-local)
In other words, we should call the Groovy script, pass some variables/values (REST) and tell it which application/job to delete and which build release version it is. Then it'll delete all builds except the one which a user will pass (i.e. 1.0.0.25).
I'm new to both Groovy and using the REST API for doing this "deletion" piece of work for Jenkins/Artifactory. If someone already has a sample script that does this kind of activity and can share it, I'll tweak it according to my settings and see if I get the behaviour mentioned above during the promotion step. I have some time crunch in getting a working version of this script and would appreciate script code doing the same task (instead of being pointed at big documentation links; I know those would make me a better Groovy coder, but they would delay the whole purpose of this post).
Thanks a lot.
Found one way (not using REST API calls at this time but soon I'll update, or you can help).
Solution 1 - To delete all builds of a Jenkins job except one build (the one we selected for promotion): during the promotion, we call a "scriptler" script under the BUILD section of Jenkins which has the following code, or create a separate job and call this script by passing 2 parameters (jobName and buildNumber - string params in the Jenkins job).
-bash-3.2$ cat bulkDeleteBuildsExceptOne.groovy
/*** BEGIN META {
"name" : "Bulk Delete Builds except the given build number",
"comment" : "For a given job and a given build numnber, delete all build except the user provided one.",
"parameters" : [ 'jobName', 'buildNumber' ],
"core": "1.410",
"authors" : [
{ name : "Arun Sangal" }
]
} END META **/
// NOTE: Uncomment parameters below if not using Scriptler >= 2.0, or if you're just pasting the script in manually.
// ----- Logic in this script takes 5000 as the infinite number, decrease / increase this value from your own experience.
// The name of the job.
//def jobName = "some-job"
// The range of build numbers to delete.
//def buildNumber = "5"
def lastBuildNumber = buildNumber.toInteger() - 1;
def nextBuildNumber = buildNumber.toInteger() + 1;
import jenkins.model.*;
import hudson.model.Fingerprint.RangeSet;
def jij = jenkins.model.Jenkins.instance.getItem(jobName);
println("Keeping Job_Name: ${jobName} and build Number: ${buildNumber}");
println ""
def setBuildRange = "1-${lastBuildNumber}"
//println setBuildRange
def range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each { it.delete() }
println("Builds have been deleted - Range: " + setBuildRange)
setBuildRange = "${nextBuildNumber}-5000"
//println setBuildRange
range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each { it.delete() }
println("Builds have been deleted - Range: " + setBuildRange)
https://github.com/gigaaks/jenkins-scripts/blob/master/scriptler/bulkDeleteBuildsExceptOne.groovy
-OR
http://scriptlerweb.appspot.com/script/show/101001 (Scriptler Web site) - This can be visible in Jenkins Scriptler plugin under Remote catalogs scripts section.
Would have been a little easier if GITHUB people provided an easy button/link to PUSH my changes to main jenkinsci branch/repository.
Though I'm still looking for 2 things:
How can I make the above script parameterized in plain Groovy? Using CliBuilder, I get a class-not-found error (a rough sketch of the intended usage follows below).
How to do this using a Jenkins REST API call. Later, I'll do the same using an Artifactory REST API call.
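For the CliBuilder point, this is roughly what the argument parsing could look like when the script is run as a standalone Groovy script (a hedged sketch; the class-not-found error usually means CliBuilder is simply not on the classpath of the Groovy runtime Jenkins uses):
// Hedged sketch of plain-Groovy argument parsing with groovy.util.CliBuilder.
def cli = new groovy.util.CliBuilder(usage: 'bulkDeleteBuildsExceptOne.groovy -j jobName -b buildNumber')
cli.j(longOpt: 'jobName', args: 1, required: true, 'Jenkins job name')
cli.b(longOpt: 'buildNumber', args: 1, required: true, 'build number to keep')
def opts = cli.parse(args)
if (!opts) return            // CliBuilder already printed the usage message
def jobName = opts.j
def buildNumber = opts.b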
OK. There was a little tweak. I found that if a Jenkins job generates multiple release/version builds/artifacts from a single job (i.e. if it's using the Build Name Setter plugin), uses Major.minor.interim (2.75.0 for example) as its release, generates builds 1-150 for that release, and later, once that release has gone to the INT/QA env, the same job creates builds 1-N for the next release (i.e. 2.75.1 or 2.76.0 etc.), then the following script will do the trick.
See this link:
Do not delete a Jenkins build if it's marked as "Keep this build forever" - Groovy script to delete Jenkins builds
bulkDeleteJenkinsBuildsExceptOne_OfAGivenRelease.groovy
/*** BEGIN META {
"name" : "Bulk Delete Builds except the given build number",
"comment" : "For a given job and a given build numnber, delete all builds of a given release version (M.m.interim) only and except the user provided one. Sometimes a Jenkins job use Build Name setter plugin and same job generates 2.75.0.1 and 2.76.0.43",
"parameters" : [ 'jobName', 'releaseVersion', 'buildNumber' ],
"core": "1.409",
"authors" : [
{ name : "Arun Sangal" }
]
} END META **/
// NOTE: Uncomment parameters below if not using Scriptler >= 2.0, or if you're just pasting the script in manually.
// ----- Logic in this script takes 5000 as the infinite number, decrease / increase this value from your own experience.
// The name of the job.
//def jobName = "some-job"
// The release / version of a Jenkins job - i.e. in case you use "Build name" setter plugin in Jenkins for getting builds like 2.75.0.1, 2.75.0.2, .. , 2.75.0.15 etc.
// and over the time, change the release/version value (2.75.0) to a newer value i.e. 2.75.1 or 2.76.0 and start builds of this new release/version from #1 onwards.
//def releaseVersion = "2.75.0"
// The range of build numbers to delete.
//def buildNumber = "5"
def lastBuildNumber = buildNumber.toInteger() - 1;
def nextBuildNumber = buildNumber.toInteger() + 1;
import jenkins.model.*;
import hudson.model.Fingerprint.RangeSet;
def jij = jenkins.model.Jenkins.instance.getItem(jobName);
//def build = jij.getLastBuild();
println ""
println("- Jenkins Job_Name: ${jobName} -- Version: ${releaseVersion} -- Keep Build Number: ${buildNumber}");
println ""
println " -- Range before given build number: ${buildNumber}"
println ""
def setBuildRange = "1-${lastBuildNumber}"
def range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each {
if ( it.getDisplayName().find(/${releaseVersion}.*/)) {
println " ## Deleting >>>>>>>>>: " + it.getDisplayName();
// Trying to find - how to NOT delete a build in Jenkins if it's marked as "keep this build forever". If someone has an idea, please update this script with a newer version in GitHub.
//if ( !build.isKeepLog()) {
it.delete();
//} else {
// println "build -- can't be deleted as :" + build.getWhyKeepLog();
//}
}
}
println ""
println " -- Range after given build number: ${buildNumber}"
println ""
setBuildRange = "${nextBuildNumber}-5000"
range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each {
if ( it.getDisplayName().find(/${releaseVersion}.*/)) {
println " ## Deleting >>>>>>>>>: " + it.getDisplayName();
it.delete();
}
}
println ""
println("- Builds have been successfully deleted for the above mentioned release: ${releaseVersion}")
println ""
For using the REST API to call the above scriptler script OR a Jenkins job, it would be something like the following (I'm still wondering where I'm passing the POST action).
The main line is: def artifactSearchUri = "api/build/${jobName}/${buildNumber}" ... which we need to tweak like: ="api/build/Some_Jenkins_Job_That_You_Will_Create/buildWithParameters?jobName=Test_AppSvc&releaseVersion=2.75.0&buildNumber=15"
import groovy.json.*
def artifactoryURL= properties["jenkins.ARTIFACTORY_URL"]
def artifactoryUser = properties["artifactoryUser"]
def artifactoryPassword = properties["artifactoryPassword"]
def authString = "${artifactoryUser}:${artifactoryPassword}".getBytes().encodeBase64().toString()
def jobName = properties["jobName"]
def buildNumber = properties["buildNumber"]
def artifactSearchUri = "api/build/${jobName}/${buildNumber}"
def conn = "${artifactoryURL}/${artifactSearchUri}".toURL().openConnection()
conn.setRequestProperty("Authorization", "Basic " + authString);
println "Searching artifactory with: ${artifactSearchUri}"
def searchResults
if( conn.responseCode == 200 ) {
searchResults = new JsonSlurper().parseText(conn.content.text)
} else {
throw new Exception ("Failed to find the build info for ${jobName}/${buildNumber}: ${conn.responseCode} - ${conn.responseMessage}")
}
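To actually trigger that wrapper job over REST, a hedged sketch along the same lines (buildWithParameters needs a POST; depending on the Jenkins security settings a CSRF crumb or API token may also be required, and the jenkins.JENKINS_URL, jenkinsUser and jenkinsApiToken properties below are placeholders, not names used anywhere above):
def jenkinsURL = properties["jenkins.JENKINS_URL"]            // placeholder property
def jenkinsAuth = "${properties["jenkinsUser"]}:${properties["jenkinsApiToken"]}".getBytes().encodeBase64().toString()
def triggerUri = "job/Some_Jenkins_Job_That_You_Will_Create/buildWithParameters" +
        "?jobName=Test_AppSvc&releaseVersion=2.75.0&buildNumber=15"
def triggerConn = "${jenkinsURL}/${triggerUri}".toURL().openConnection()
triggerConn.setRequestMethod("POST")
triggerConn.setRequestProperty("Authorization", "Basic " + jenkinsAuth)
println "Triggered the deletion job, response: ${triggerConn.responseCode}"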
For deleting Artifactory builds, we have to combine the above logic with the following Groovy script, which I'm still trying to get working. I know I'm close.
BLOG: http://browse.feedreader.com/c/Gridshore/11546011
Script: https://github.com/jettro/small-scripts/blob/master/groovy/artifactory/Artifactory.groovy
package artifactory
import groovy.text.SimpleTemplateEngine
import groovyx.net.http.RESTClient
import net.sf.json.JSON
/**
* This groovy class is meant to be used to clean up your Artifactory server or get more information about its
* contents. The API of Artifactory is documented very well at the following location
* {@see http://wiki.jfrog.org/confluence/display/RTF/Artifactory%27s+REST+API}
*
* At the moment there is one major use of this class, cleaning your repository.
*
* Reading data about the repositories is done against /api/repository, if you want to remove items you need to use
* '/api/storage'
*
* Artifactory returns a strange Content Type in the response. We want to use a generic JSON library. Therefore we need
* to map the incoming type to the standard application/json. An example of the mapping is below, all the other
* mappings can be found in the obtainServerConnection method.
* 'application/vnd.org.jfrog.artifactory.storage.FolderInfo+json' => server.parser.'application/json'
*
* The class makes use of a config object. The config object is a map with a minimum of the following fields:
* def config = [
* server: 'http://localhost:8080',
* repository: 'libs-release-local',
* versionsToRemove: ['/3.2.0-build-'],
* dryRun: true]
*
* The versionsToRemove is an array of strings that are the start of builds that should be removed. To give an idea of
* the build numbers we use: 3.2.0-build-1 or 2011.10-build-1. The -build- is important for the solution. This is how
* we identify an artifact instead of a group folder.
*
* The final option to notice is the dryRun option. This way you can get an overview of what will be deleted. If set
* to false, it will delete the selected artifacts.
*
* Usage example
* -------------
* def config = [
* server: 'http://localhost:8080',
* repository: 'libs-release-local',
* versionsToRemove: ['/3.2.0-build-'],
* dryRun: false]
*
* def artifactory = new Artifactory(config)
*
* def numberRemoved = artifactory.cleanArtifactsRecursive('nl/gridshore/toberemoved')
*
* if (config.dryRun) {
*     println "$numberRemoved folders would have been removed."
* } else {
*     println "$numberRemoved folders were removed."
* }
*
* @author Jettro Coenradie
*/
private class Artifactory {
def engine = new SimpleTemplateEngine()
def config
def Artifactory(config) {
this.config = config
}
/**
* Print information about all the available repositories in the configured Artifactory
*/
def printRepositories() {
def server = obtainServerConnection()
def resp = server.get(path: '/artifactory/api/repositories')
if (resp.status != 200) {
println "ERROR: problem with the call: " + resp.status
System.exit(-1)
}
JSON json = resp.data
json.each {
println "key :" + it.key
println "type : " + it.type
println "descritpion : " + it.description
println "url : " + it.url
println ""
}
}
/**
* Return information about the provided path for the configured artifactory and server.
*
* @param path String representing the path to obtain information for
*
* @return JSON object containing information about the specified folder
*/
def JSON folderInfo(path) {
def binding = [repository: config.repository, path: path]
def template = engine.createTemplate('''/artifactory/api/storage/$repository/$path''').make(binding)
def query = template.toString()
def server = obtainServerConnection()
def resp = server.get(path: query)
if (resp.status != 200) {
println "ERROR: problem obtaining folder info: " + resp.status
println query
System.exit(-1)
}
return resp.data
}
/**
* Recursively removes all folders containing builds that start with the configured paths.
*
* @param path String containing the folder to check; its children are recursively checked as well.
* @return Number of folders that were removed.
*/
def cleanArtifactsRecursive(path) {
def deleteCounter = 0
JSON json = folderInfo(path)
json.children.each {child ->
if (child.folder) {
if (isArtifactFolder(child)) {
config.versionsToRemove.each {toRemove ->
if (child.uri.startsWith(toRemove)) {
removeItem(path, child)
deleteCounter++
}
}
} else {
if (!child.uri.contains("ro-scripts")) {
deleteCounter += cleanArtifactsRecursive(path + child.uri)
}
}
}
}
return deleteCounter
}
private RESTClient obtainServerConnection() {
def server = new RESTClient(config.server)
server.parser.'application/vnd.org.jfrog.artifactory.storage.FolderInfo+json' = server.parser.'application/json'
server.parser.'application/vnd.org.jfrog.artifactory.repositories.RepositoryDetailsList+json' = server.parser.'application/json'
return server
}
private def isArtifactFolder(child) {
child.uri.contains("-build-")
}
private def removeItem(path, child) {
println "folder: " + path + child.uri + " DELETE"
def binding = [repository: config.repository, path: path + child.uri]
def template = engine.createTemplate('''/artifactory/$repository/$path''').make(binding)
def query = template.toString()
if (!config.dryRun) {
def server = new RESTClient(config.server)
server.delete(path: query)
}
}
}
FINAL Answer: This includes deleting the build artifacts from Artifactory as well, using Artifactory's REST API. This script deletes Jenkins/Artifactory builds/artifacts of a given release/version, because over time a given Jenkins job can create multiple release/version builds, for example: 2.75.0.1, 2.75.0.2, 2.75.0.3, ..., 2.75.0.54, 2.76.0.1, 2.76.0.2, ..., 2.76.0.16, 2.76.1.1, 2.76.1.2, ..., 2.76.1.5. In this case, for every new release of that job, we start the build# from 1 again. If you have to delete all builds except one (or even all; change the script a little for your own needs) without touching builds of older/other releases, then use the following script.
Scriptler Catalog link: http://scriptlerweb.appspot.com/script/show/103001
Enjoy!
/*** BEGIN META {
"name" : "Bulk Delete Builds except the given build number",
"comment" : "For a given job and a given build numnber, delete all builds of a given release version (M.m.interim) only and except the user provided one. Sometimes a Jenkins job use Build Name setter plugin and same job generates 2.75.0.1 and 2.76.0.43",
"parameters" : [ 'jobName', 'releaseVersion', 'buildNumber' ],
"core": "1.409",
"authors" : [
{ name : "Arun Sangal - Maddys Version" }
]
} END META **/
import groovy.json.*
import jenkins.model.*;
import hudson.model.Fingerprint.RangeSet;
import hudson.model.Job;
import hudson.model.Fingerprint;
//these should be passed in as arguments to the script
if(!artifactoryURL) throw new Exception("artifactoryURL not provided")
if(!artifactoryUser) throw new Exception("artifactoryUser not provided")
if(!artifactoryPassword) throw new Exception("artifactoryPassword not provided")
def authString = "${artifactoryUser}:${artifactoryPassword}".getBytes().encodeBase64().toString()
def artifactorySettings = [artifactoryURL: artifactoryURL, authString: authString]
if(!jobName) throw new Exception("jobName not provided")
if(!buildNumber) throw new Exception("buildNumber not provided")
def lastBuildNumber = buildNumber.toInteger() - 1;
def nextBuildNumber = buildNumber.toInteger() + 1;
def jij = jenkins.model.Jenkins.instance.getItem(jobName);
def promotedBuildRange = new Fingerprint.RangeSet()
promotedBuildRange.add(buildNumber.toInteger())
def promoteBuildsList = jij.getBuilds(promotedBuildRange)
assert promoteBuildsList.size() == 1
def promotedBuild = promoteBuildsList[0]
// The release / version of a Jenkins job - i.e. in case you use "Build name" setter plugin in Jenkins for getting builds like 2.75.0.1, 2.75.0.2, .. , 2.75.0.15 etc.
// and over the time, change the release/version value (2.75.0) to a newer value i.e. 2.75.1 or 2.76.0 and start builds of this new release/version from #1 onwards.
def releaseVersion = promotedBuild.getDisplayName().split("\\.")[0..2].join(".")
println ""
println("- Jenkins Job_Name: ${jobName} -- Version: ${releaseVersion} -- Keep Build Number: ${buildNumber}");
println ""
/** delete the indicated build and its artifacts from artifactory */
def deleteBuildFromArtifactory(String jobName, int deleteBuildNumber, Map<String, String> artifactorySettings){
println " ## Deleting >>>>>>>>>: - ${jobName}:${deleteBuildNumber} from artifactory"
def artifactSearchUri = "api/build/${jobName}?buildNumbers=${deleteBuildNumber}&artifacts=1"
def conn = "${artifactorySettings['artifactoryURL']}/${artifactSearchUri}".toURL().openConnection()
conn.setRequestProperty("Authorization", "Basic " + artifactorySettings['authString']);
conn.setRequestMethod("DELETE")
if( conn.responseCode != 200 ) {
println "Failed to delete the build artifacts from artifactory for ${jobName}/${deleteBuildNumber}: ${conn.responseCode} - ${conn.responseMessage}"
}
}
/** delete all builds in the indicated range that match the releaseVersion */
def deleteBuildsInRange(String buildRange, String releaseVersion, Job theJob, Map<String, String> artifactorySettings){
def range = RangeSet.fromString(buildRange, true);
theJob.getBuilds(range).each {
if ( it.getDisplayName().find(/${releaseVersion}.*/)) {
println " ## Deleting >>>>>>>>>: " + it.getDisplayName();
deleteBuildFromArtifactory(theJob.name, it.number, artifactorySettings)
it.delete();
}
}
}
//delete all the matching builds before the promoted build number
deleteBuildsInRange("1-${lastBuildNumber}", releaseVersion, jij, artifactorySettings)
//delete all the matching builds after the promoted build number
deleteBuildsInRange("${nextBuildNumber}-${jij.nextBuildNumber}", releaseVersion, jij, artifactorySettings)
println ""
println("- Builds have been successfully deleted for the above mentioned release: ${releaseVersion}")
println ""
First of all, this is just not normally done. Why would you delete:
A few extra Jenkins builds which are not harming you?
Artifacts from an artifact repository (which is, after all, named Artifactory)?
Now that said, I understand you might still have a good reason to do so (which would be interesting to know). This is an alternate approach I can propose:
Jenkins: I am assuming you are using Maven. In that case, you can use the M2 Release plugin to create "release builds". These builds will have a special suitcase-style icon next to them and will be marked "keep this build forever". You can play around with how many days to keep artifacts, how many builds to keep, etc. in Jenkins and define your own policy so that your requirement is addressed.
Artifactory: I use Nexus, so the implementation might be different, but you can set it so that snapshot builds are overwritten every time. So at all times you have n release builds and exactly 1 snapshot. The second policy is "delete snapshot when released". This ensures a same-numbered snapshot and release do not co-exist in the repo. This is exactly how it should be, and there should be no reason to delete "released" artifacts from a repository like Artifactory; that's the whole point of a release.

How to Report Results to Sauce Labs using Geb/Spock?

I want to use the Sauce Labs Java REST API to send Pass/Fail status back to the Sauce Labs dashboard. I am using Geb+Spock, and my Gradle build creates a test results directory where results are output in XML. My problem is that the results XML file doesn't seem to be generated until after the Spock specification's cleanupSpec() exits. This causes my code to report the results of the previous test run, rather than the current one. Clearly not what I want!
Is there some way to get to the results from within cleanupSpec() without relying on the XML? Or a way to get the results to file earlier? Or some alternative that will be much better than either of those?
Some code:
In build.gradle, I specify the testResultsDir. This is where the XML file is written after the Spock specifications exit:
drivers.each { driver ->
task "${driver}Test"(type: Test) {
cleanTest
systemProperty "geb.env", driver
testResultsDir = file("$buildDir/test-results/${driver}")
systemProperty "proj.test.resultsDir", testResultsDir
}
}
Here is the setupSpec() and cleanupSpec() in my LoginSpec class:
class LoginSpec extends GebSpec {
@Shared def SauceREST client = new SauceREST("redactedName", "redactedKey")
@Shared def sauceJobID
@Shared def allSpecsPass = true
def setupSpec() {
sauceJobID = driver.getSessionId().toString()
}
def cleanupSpec() {
def String specResultsDir = System.getProperty("proj.test.resultsDir") ?: "./build/test-results"
def String specResultsFile = this.getClass().getName()
def String specResultsXML = "${specResultsDir}/TEST-${specResultsFile}.xml"
def testsuiteResults = new XmlSlurper().parse( new File( specResultsXML ))
// read error and failure counts from the XML
def errors = testsuiteResults.@errors.text()?.toInteger()
def failures = testsuiteResults.@failures.text()?.toInteger()
if ( (errors + failures) > 0 ) { allSpecsPass = false }
if ( allSpecsPass ) {
client.jobPassed(sauceJobID)
} else {
client.jobFailed(sauceJobID)
}
}
}
The rest of this class contains login specifications that do not interact with SauceLabs. When I read the XML, it turns out that it was written at the end of the previous LoginSpec run. I need a way to get to the values of the current run.
Thanks!
Test reports are generated after a Specification has finished execution and the generation is performed by the build system, so in your case by Gradle. Spock has no knowledge of that so you are unable to get that information from within the test.
You can on the other hand quite easily get that information from Gradle. Test task has two methods that might be of interest to you here: addTestListener() and afterSuite(). It seems that the cleaner solution here would be to use the first method, implement a test listener and put your logic in afterSuite() of the listener (and not the task configuration). You would probably need to put that listener implementation in buildSrc as it looks like you have a dependency on SauceREST and you would need to build and compile your listener class before being able to use it as an argument to addTestListener() in build.gradle of your project.
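A hedged sketch of what that listener wiring could look like in build.gradle (the SauceREST call is left as a comment, since in practice the listener class would live in buildSrc so it can depend on SauceREST):
// Hypothetical listener reporting the overall result of each Test task.
class SauceResultListener implements TestListener {
    void beforeSuite(TestDescriptor suite) {}
    void beforeTest(TestDescriptor test) {}
    void afterTest(TestDescriptor test, TestResult result) {}
    void afterSuite(TestDescriptor suite, TestResult result) {
        if (suite.parent == null) {   // root suite => overall result of the test task
            boolean passed = result.failedTestCount == 0
            // e.g. create a SauceREST client here and call jobPassed(...) / jobFailed(...)
            println "Overall test result: ${passed ? 'PASSED' : 'FAILED'}"
        }
    }
}
tasks.withType(Test) {
    addTestListener(new SauceResultListener())
}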
Following on from erdi's suggestion, I've created a Sauce Gradle helper library, which provides a Test Listener that parses the test XML output and invokes the Sauce REST API to set the pass/fail status.
The library can be included by adding the following to your build.gradle file:
import com.saucelabs.gradle.SauceListener
buildscript {
repositories {
mavenCentral()
maven {
url "https://repository-saucelabs.forge.cloudbees.com/release"
}
}
dependencies {
classpath group: 'com.saucelabs', name: 'saucerest', version: '1.0.2'
classpath group: 'com.saucelabs', name: 'sauce_java_common', version: '1.0.14'
classpath group: 'com.saucelabs.gradle', name: 'sauce-gradle-plugin', version: '0.0.1'
}
}
gradle.addListener(new SauceListener("YOUR_SAUCE_USERNAME", "YOUR_SAUCE_ACCESS_KEY"))
You will also need to output the Selenium session id for each test, so that the SauceListener can associate the Sauce Job with the pass/fail status. To do this, include the following output in the stdout:
SauceOnDemandSessionID=SELENIUM_SESSION_ID
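For example, a hedged tweak to the setupSpec() shown above would print that marker line (the job-name suffix is optional metadata; adjust it to your own naming):
def setupSpec() {
    sauceJobID = driver.getSessionId().toString()
    // Emit the marker line the SauceListener scans for in the test output.
    println "SauceOnDemandSessionID=${sauceJobID} job-name=LoginSpec"
}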

Creating a Jenkins environment variable using Groovy

My project takes in a version number (separated by '.' or '_'). I tried writing a Groovy script that creates a Jenkins environment variable using only the first two of these numbers:
//Get the version parameter
def env = System.getenv()
def version = env['currentversion']
def m = version =~/\d{1,2}/
env = ['miniVersion':m[0].m[1]]
Am I doing this correctly? Can I even create a new environment variable? Is there a better solution to this?
Jenkins 1.x
The following groovy snippet should pass the version (as you've already supplied), and store it in the job's variables as 'miniVersion'.
import hudson.model.*
def env = System.getenv()
def version = env['currentversion']
def m = version =~/\d{1,2}/
def minVerVal = m[0]+"."+m[1]
def pa = new ParametersAction([
new StringParameterValue("miniVersion", minVerVal)
])
// add variable to current job
Thread.currentThread().executable.addAction(pa)
The variable will then be accessible from other build steps. e.g.
echo miniVersion=%miniVersion%
Outputs:
miniVersion=12.34
I believe you'll need to use the "System Groovy Script" (on the Master node only) as opposed to the "Groovy Plugin" - https://wiki.jenkins-ci.org/display/JENKINS/Groovy+plugin#Groovyplugin-GroovyScriptvsSystemGroovyScript
Jenkins 2.x
I believe the previous (Jenkins 1.x) behaviour stopped working because of this Security Advisory...
Solution (paraphrased from the Security Advisory)
It's possible to restore the previous behaviour by setting the system property hudson.model.ParametersAction.keepUndefinedParameters to true. This is potentially very unsafe and intended as a short-term workaround only.
java -Dhudson.model.ParametersAction.keepUndefinedParameters=true -jar jenkins.war
To allow specific, known safe parameter names to be passed to builds, set the system property hudson.model.ParametersAction.safeParameters to a comma-separated list of safe parameter names.
e.g.
java -Dhudson.model.ParametersAction.safeParameters=miniVersion,FOO,BAR -jar jenkins.war
And in groovy these two lines should be written this way:
System.setProperty("hudson.model.ParametersAction.keepUndefinedParameters","true");
System.setProperty("hudson.model.ParametersAction.safeParameters","miniVersion,FOO,BAR");
You can also define a variable without the EnvInject Plugin within your Groovy System Script:
import hudson.model.*
def build = Thread.currentThread().executable
def pa = new ParametersAction([
new StringParameterValue("FOO", "BAR")
])
build.addAction(pa)
Then you can access this variable in the next build step, which (for example) is a Windows batch command:
@echo off
Setlocal EnableDelayedExpansion
echo FOO=!FOO!
This echo will show you "FOO=BAR".
Regards
For me, the following also worked in Jenkins 2 (2.73.3)
Replace
def pa = new ParametersAction([new StringParameterValue("FOO", foo)])
build.addAction(pa)
with
def pa = new ParametersAction([new StringParameterValue("FOO", foo)], ["FOO"])
build.addAction(pa)
ParametersAction seems to have a second constructor which allows to pass in "additionalSafeParameters" https://github.com/jenkinsci/jenkins/blob/master/core/src/main/java/hudson/model/ParametersAction.java
As other answers state, setting a new ParametersAction is the way to inject one or more environment variables, but when a job is already parameterised, adding a new action won't take effect. Instead you'll see two links to the build parameters pointing to the same set of parameters, and the one you wanted to add will be null.
Here is a snippet updating the parameters list in both cases (a parametrised and non-parametrised job):
import hudson.model.*
def build = Thread.currentThread().executable
def env = System.getenv()
def version = env['currentversion']
def m = version =~/\d{1,2}/
def minVerVal = m[0]+"."+m[1]
def newParams = null
def pl = new ArrayList<StringParameterValue>()
pl.add(new StringParameterValue('miniVersion', minVerVal))
def oldParams = build.getAction(ParametersAction.class)
if(oldParams != null) {
newParams = oldParams.createUpdated(pl)
build.actions.remove(oldParams)
} else {
newParams = new ParametersAction(pl)
}
build.addAction(newParams)
The Jenkins EnvInject Plugin might be able to help you. It allows injecting environment variables into the build environment.
I know it has some ability to do scripting, so it might be able to do what you want. I have only used it to set simple properties (e.g. "LOG_PATH=${WORKSPACE}\logs").
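If I remember correctly, EnvInject can also evaluate a Groovy script and inject whatever map the script returns, which would cover the version-splitting case from the question. A rough sketch for that field (assuming a currentversion environment variable such as "12.34.56" exists):
// Whatever map this script returns is injected as environment variables.
def version = System.getenv('currentversion')   // e.g. "12.34.56"
def parts = version.tokenize('._')              // split on '.' or '_'
return [miniVersion: parts[0] + '.' + parts[1]]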
After searching around a bit, the best solution in my opinion makes use of hudson.model.EnvironmentContributingAction.
import hudson.model.EnvironmentContributingAction
import hudson.model.AbstractBuild
import hudson.EnvVars
class BuildVariableInjector {
def build
def out
def BuildVariableInjector(build, out) {
this.build = build
this.out = out
}
def addBuildEnvironmentVariable(key, value) {
def action = new VariableInjectionAction(key, value)
build.addAction(action)
//Must call this for action to be added
build.getEnvironment()
}
class VariableInjectionAction implements EnvironmentContributingAction {
private String key
private String value
public VariableInjectionAction(String key, String value) {
this.key = key
this.value = value
}
public void buildEnvVars(AbstractBuild build, EnvVars envVars) {
if (envVars != null && key != null && value != null) {
envVars.put(key, value);
}
}
public String getDisplayName() {
return "VariableInjectionAction";
}
public String getIconFileName() {
return null;
}
public String getUrlName() {
return null;
}
}
}
I use this class in a system groovy script (using the groovy plugin) within a job.
import hudson.model.*
import java.io.File;
import jenkins.model.Jenkins;
def jenkinsRootDir = build.getEnvVars()["JENKINS_HOME"];
def parent = getClass().getClassLoader()
def loader = new GroovyClassLoader(parent)
def buildVariableInjector = loader.parseClass(new File(jenkinsRootDir + "/userContent/GroovyScripts/BuildVariableInjector.groovy")).newInstance(build, getBinding().out)
def projectBranchDependencies = []
//Some logic to set projectBranchDependencies variable
buildVariableInjector.addBuildEnvironmentVariable("projectBranchDependencies", projectBranchDependencies.join(","));
You can then access the projectBranchDependencies variable at any other point in your build, in my case, from an ANT script.
Note: I borrowed / modified the ideas for parts of this implementation from a blog post, but at the time of this posting I was unable to locate the original source in order to give due credit.
Just had the same issue. Wanted to dynamically trigger parametrized downstream jobs based on the outcome of some groovy scripting.
Unfortunately on our Jenkins it's not possible to run System Groovy scripts. Therefore I had to do a small workaround:
Run groovy script which creates a properties file where the environment variable to be set is specified
def props = new File("properties.text")
if (props.text == 'foo=bar') {
props.text = 'foo=baz'
} else {
props.text = 'foo=bar'
}
Use the EnvInject plugin to inject the variable written by this script:
Inject environment variable
Property file path: properties.text
After that I was able to use the variable 'foo' as parameter for the parametrized trigger plugin. Some kind of workaround. But works!
My environment predates tooling such as Jenkins and runs on batch files (I know, I'm old). So those batch files (and their sub-batch files) use environment variables. This was my piece of Groovy script which injects environment variables. The names and parameters used are dummy ones.
// The process/batch which uses environment variables
def buildLabel = "SomeVersionNr"
def script = "startBuild.bat"
def processBuilder = new ProcessBuilder(script, buildLabel)
//Inject our environment variables
Map<String, String> env = processBuilder.environment()
env.put("ProjectRoot", "someLocation")
env.put("SomeVar", "Some")
Process p = processBuilder.start()
p.waitFor()
Of course if you set Jenkins up from scratch you would probably do it differently and share the variables in another way, or pass parameters around but this might come handy.
On my side it only worked this way by replacing an existing parameter.
def artifactNameParam = new StringParameterValue('CopyProjectArtifactName', 'bla bla bla')
build.replaceAction(new ParametersAction(artifactNameParam))
Additionally, this script must be run as a system Groovy script.
Groovy must be manually installed on that system and its bin dir added to the path. I also had to add jenkins-core.jar to the lib folder.
It was then possible to modify a parameter in a Groovy script and read the modified value in a later batch script to continue the work.
For me the following worked on Jenkins 2.190.1 and was much simpler than some of the other workarounds:
import hudson.EnvVars
import hudson.model.Environment
matcher = manager.getLogMatcher('^.*Text we want comes next: (.*)$');
if (matcher.matches()) {
def myVar = matcher.group(1);
def envVar = new EnvVars([MY_ENV_VAR: myVar]);
def newEnv = Environment.create(envVar);
manager.build.environments.add(0, newEnv);
// now the matched text from the LogMatcher is passed to an
// env var we can access at $MY_ENV_VAR in post build steps
}
This was using the Groovy Script plugin with no additional changes to Jenkins.
You can also create a global environment variable for Jenkins if you want to use the value more widely. This is written up at greater length elsewhere.
import hudson.EnvVars;
import hudson.slaves.EnvironmentVariablesNodeProperty;
import hudson.slaves.NodeProperty;
import hudson.slaves.NodePropertyDescriptor;
import hudson.util.DescribableList;
import jenkins.model.Jenkins;
public createGlobalEnvironmentVariables(String key, String value){
Jenkins instance = Jenkins.getInstance();
DescribableList<NodeProperty<?>, NodePropertyDescriptor> globalNodeProperties = instance.getGlobalNodeProperties();
List<EnvironmentVariablesNodeProperty> envVarsNodePropertyList = globalNodeProperties.getAll(EnvironmentVariablesNodeProperty.class);
EnvironmentVariablesNodeProperty newEnvVarsNodeProperty = null;
EnvVars envVars = null;
if ( envVarsNodePropertyList == null || envVarsNodePropertyList.size() == 0 ) {
newEnvVarsNodeProperty = new hudson.slaves.EnvironmentVariablesNodeProperty();
globalNodeProperties.add(newEnvVarsNodeProperty);
envVars = newEnvVarsNodeProperty.getEnvVars();
} else {
envVars = envVarsNodePropertyList.get(0).getEnvVars();
}
envVars.put(key, value)
instance.save()
}
createGlobalEnvironmentVariables('Var1','Dummy')

External Content with Groovy BuilderSupport

I've built a custom builder in Groovy by extending BuilderSupport. It works well when configured like nearly every builder code sample out there:
def builder = new MyBuilder()
builder.foo {
"Some Entry" (property1:value1, property2: value2)
}
This, of course, works perfectly. The problem is that I don't want the information I'm building to be in the code. I want to have this information in a file somewhere that is read in and built into objects by the builder. I cannot figure out how to do this.
I can't even make this work by moving the simple entry around in the code.
This works:
def textClosure = { "Some Entry" (property1:value1, property2: value2) }
builder.foo(textClosure)
because textClosure is a closure.
If I do this:
def text = '"Some Entry" (property1:value1, property2: value2)'
def textClosure = { text }
builder.foo(textClosure)
the builder only gets called for the "foo" node. I've tried many variants of this, including passing the text block directly into the builder without wrapping it in a closure. They all yield the same result.
Is there some way I take a piece of arbitrary text and pass it into my builder so that it will be able to correctly parse and build it?
Your problem is that a String is not Groovy code. The way ConfigSlurper handles this is to compile the text into an instance of Script using GroovyClassLoader#parseClass. e.g.,
// create a Binding subclass that delegates to the builder
class MyBinding extends Binding {
def builder
Object getVariable(String name) {
return { Object... args -> builder.invokeMethod(name,args) }
}
}
// parse the script and run it against the builder
new File("foo.groovy").withInputStream { input ->
Script s = new GroovyClassLoader().parseClass(input).newInstance()
s.binding = new MyBinding(builder:builder)
s.run()
}
The subclass of Binding simply returns a closure for all variables that delegates the call to the builder. So assuming foo.groovy contains:
foo {
"Some Entry" (property1:value1, property2: value2)
}
It would be equivalent to your code above.
I think the problem you described is better solved with a slurper or parser.
See:
http://groovy.codehaus.org/Reading+XML+using+Groovy%27s+XmlSlurper
http://groovy.codehaus.org/Reading+XML+using+Groovy%27s+XmlParser
for XML based examples.
In your case. Given the XML file:
<foo>
<entry name='Some Entry' property1="value1" property2="value2"/>
</foo>
You could slurp it with:
def text = new File("test.xml").text
def foo = new XmlSlurper().parseText(text)
def allEntries = foo.entry
allEntries.each {
println it.@name
println it.@property1
println it.@property2
}
Originally, I wanted to be able to specify
"Some Entry" (property1:value1, property2: value2)
in an external file. I'm specifically trying to avoid XML and XML-like syntax to make these files easier for regular users to create and modify. My current solution uses ConfigSlurper and the file now looks like:
"Some Entry"
{
property1 = value1
property2 = value2
}
ConfigSlurper gives me a map like this:
["Some Entry":[property1:value1,property2:value2]]
It's pretty simple to use these values to create my objects, especially since I can just pass the property/value map into the constructor.
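For illustration, a minimal sketch of that last step, where MyEntry is a hypothetical class standing in for whatever the builder actually creates, and entries.groovy is the config file shown above:
class MyEntry {
    String name
    def property1
    def property2
}
def config = new ConfigSlurper().parse(new File('entries.groovy').toURI().toURL())
def entries = config.collect { name, props ->
    new MyEntry([name: name] + props)   // the property/value map goes straight into the constructor
}
entries.each { println "${it.name}: ${it.property1}, ${it.property2}" }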
