I have an issue where tags from my Gauge Gradle task are not being passed to BeforeSpec, whereas the tags are passed to the spec files.
Any idea what the issue is?
I'm on Gauge version 0.9.1
Plugins
html-report (3.1.0)
java (0.6.2)
Gradle gauge task:
task runTestsInQA(type: GaugeTask) {
doFirst {
println 'Running tests for the V1 in QA environment...'
gauge {
specsDir = 'specs'
tags = 'V1'
env = 'qa'
additionalFlags = '--verbose'
}
}
}
My BeforeSpec code:
@BeforeSpec(tags = "V1")
public void beforeSpec(ExecutionContext context)
{
System.out.println("Tags in scenario "+context.getAllTags());
}
Here, the print statement prints an empty array [ ].
This gist contains an example project with gauge + java + gradle.
In your case, note that the BeforeSpec hook is a tagged execution hook, so it will get executed only when the respective tags are passed.
Also note that if you wish to get tags for a scenario, you are better off using the BeforeScenario hook, since you can then get all the tags (scenarios inherit spec tags).
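As an illustration, a minimal sketch of such an untagged BeforeScenario hook (the class name is a placeholder; getAllTags is the same call used in the question above):

import com.thoughtworks.gauge.BeforeScenario;
import com.thoughtworks.gauge.ExecutionContext;

public class ScenarioHooks {
    // Untagged hook: runs before every scenario, regardless of which tags
    // the Gradle task passes on the command line.
    @BeforeScenario
    public void beforeScenario(ExecutionContext context) {
        // Scenario tags include the tags inherited from the spec.
        System.out.println("Tags in scenario " + context.getAllTags());
    }
}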
In the configuration phase of the task I register a directory with builtBy: thisTask. I expect Gradle to detect automatically whether the sources have changed, but the task is always executed.
Here is the task:
subprojects {
def srcMainMirah = file('src/main/mirah')
if (srcMainMirah.exists()) {
idea.module.sourceDirs += srcMainMirah
task compileMirah {
def classesMirahMain = file("$buildDir/classes-mirah/main")
inputs.sourceDir srcMainMirah
def thisTask = delegate
sourceSets.main {
output.dir(classesMirahMain, builtBy: thisTask)
java.srcDir srcMainMirah
}
dependsOn tasks.compileJava
doFirst {
def classpath = files("$buildDir/classes/main").plus(configurations.compile)
mirahc(srcMainMirah, classesMirahMain, classpath)
}
}
}
}
It is for compiling sources in mirah language, which produces *.class files just like java compiler does.
Declaring inputs alone is insufficient for Gradle to determine whether a task is up-to-date. You must also declare the task's outputs:
A task with no defined outputs will never be considered up-to-date. For scenarios where the outputs of a task are not files, or for more complex scenarios, the TaskOutputs.upToDateWhen() method allows you to calculate programmatically if the task's outputs should be considered up to date.
A task with only outputs defined will be considered up-to-date if those outputs are unchanged since the previous build.
From section 17.9.1 here.
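Applied to the compileMirah task above, a minimal sketch of the missing declaration (assuming classesMirahMain is the only directory the task writes to):

task compileMirah {
    def classesMirahMain = file("$buildDir/classes-mirah/main")
    inputs.dir srcMainMirah          // what the task reads
    outputs.dir classesMirahMain     // what the task writes; enables the up-to-date check
    // for non-file outputs you could instead use: outputs.upToDateWhen { ... }
    // ... rest of the task unchanged ...
}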
We have an Ivy repository, and we are using Gradle for our dependency management and build framework. When an artifact is determined to be production-ready, we don't want to have to build it again, so we want to just "promote" an existing artifact via a web application that is leveraging Gradle and the tooling API to do most of the heavy lifting for us.
Currently, I'm copying the artifacts to a local folder and running another build.gradle that just re-publishes it. We are publishing it to a new folder in our existing repository, and a folder in the release repository.
In doing so, it is only publishing the ivy.xml to both locations.
I'm guessing this is due to where the artifacts are located.
PromotionService.groovy
void promote(Project project, Build build, String newVersion) {
def artifactLocation = "/path/to/repository"
// we are generating this build.gradle and copying it
def buildFileText = new File('promote.gradle').getText('UTF-8')
def artifacts = buildDao.findArtifactsByBuild(build)
def localBuildFolderPath = "/path/to/local/gradle/build"
def localBuildFolder = new File(localBuildFolderPath)
localBuildFolder.mkdirs()
// remove everything currently in the directory
def buildFiles = localBuildFolder.listFiles()
buildFiles.each {
it.delete()
}
def newFile = new File("/path/to/local/gradle/build.gradle")
newFile.parentFile.mkdirs()
if (newFile.exists())
newFile.delete()
newFile << buildFileText
artifacts.each { VersionedArtifact it ->
def folder = new File("${artifactLocation}/${it.module}/${build.branch}/${it.version}")
def files = folder.listFiles()
files.each { File from ->
// remove version number from file name
String fromName = from.name
def matcher = fromName =~ /(.*?)-(\d)+\.(\d)+\.(\d)+(\.\d+)?\.(.*)/
fromName = "${matcher[0][1]}.${matcher[0][6]}"
File to = new File("${localBuildFolderPath}/${it.module}/${fromName}")
to.parentFile.mkdirs()
if (to.exists()) to.delete()
// wrapper for Guava's Files.copy()
FileUtil.copy(from, to)
}
ProjectConnection connection = GradleConnector.newConnector().forProjectDirectory(new File("${workingDir}/gradle")).connect()
connection.newBuild()
.forTasks("publishReleaseBranchPublicationToIvyRepository", "publishReleaseRepoPublicationToReleaseRepository")
.withArguments("-PMODULE=${it.module}", "-PVERSION=${it.version}", "-PNEWVERSION=${newVersion}")
.run()
}
}
build.gradle
apply plugin: 'groovy'
apply plugin: 'ivy-publish'
publishing {
publications {
releaseBranch(IvyPublication) {
organisation 'our-organization'
module MODULE
revision VERSION
descriptor.status = 'release'
configurations { archive {
} }
}
releaseRepo(IvyPublication) {
organisation 'our-organization'
module MODULE
revision NEWVERSION
descriptor.status = 'release'
configurations { archive {
}}
}
}
repositories {
ivy {
name 'ivy'
url "/path/to/ivy/repo"
layout "pattern", {
ivy "[organisation]/[module]/release/[revision]/[module]-[revision].xml"
artifact "[organisation]/[module]/release/[revision]/[artifact](-[classifier])-[revision].[ext]"
}
}
ivy {
name 'release'
url "/path/to/release/repo"
layout "pattern", {
ivy "[organisation]/[module]/[revision]/[module]-[revision].xml"
artifact "[organization]/[module]/[revision]/[artifact](-[classifier])-[revision].[ext]"
}
}
}
}
Edit: Made it clearer we're writing a web application to promote artifacts.
It's not clear to me why the promotion is implemented using the tooling API, rather than as a regular Gradle task or plugin. Anyway, the IvyPublications are neither configured using IvyPublication#from, nor using IvyPublication#artifact. Hence they won't have any artifacts.
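For example, a minimal sketch of attaching the copied jar to one of the publications (the file name and the assignment to the 'archive' conf are assumptions based on the layout above):

releaseBranch(IvyPublication) {
    organisation 'our-organization'
    module MODULE
    revision VERSION
    descriptor.status = 'release'
    configurations { archive {} }
    // no component is published with from(...), so attach the previously
    // copied file explicitly and assign it to the archive conf
    artifact("${MODULE}.jar") {
        conf 'archive'
    }
}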
Jenkins: Version 1.525
Jenkins Server URL: http://my.jenkins.server.com:9040
Linux Red Hat 5.3
Artifactory: Free version
Artifactory Server URL: http://my.artifactory.server:8081/Artifactory
I'm successfully able to build in Jenkins and upload artifacts to my Artifactory server under a respective repository.
When a build occurs, the artifact (ProjectA-1.0.0.25.tar.gz) goes to Artifactory under the libs-snapshot-local repository. Here 1.0.0 denotes the major, minor and interim version of a given release for an application/Jenkins job, "ProjectA" in this case, and 25 is the build number.
When a ProjectA build gets stable in Development, we promote that build of the application release to INT or any other higher environment (QA/PreProd etc.).
During this promotion process, we just select which build to promote and, using the Jenkins Promoted Builds plugin, we are able to do it successfully.
Now, what we need is:
During the promote process, I want to call a Groovy script which will delete all Jenkins builds from Jenkins and Artifactory (libs-snapshot-local) for ProjectA's release 1.0.0 after "ProjectA-1.0.0.25.tar.gz" is successfully promoted to INT. The promotion part is working fine right now; all I need is a Groovy script which will delete the Jenkins builds for that release other than 1.0.0.25 (i.e. 1.0.0.1 to 1.0.0.24 and anything above 1.0.0.25) in Jenkins, and their associated artifacts from the Artifactory repository (libs-snapshot-local).
The idea in our company is: once a release version's build is promoted for an application, we want to delete all other builds/artifacts we have for it (in Jenkins/Artifactory) forever, using a Groovy script. Someone will ask, what if I want to promote a different build number; well, in our case we don't want that. The simple rule is: if someone promotes ProjectA-1.0.0.25.tar.gz, then delete ProjectA's builds/artifacts in Jenkins and Artifactory other than 1.0.0.25 and continue with the new release 1.1.0.
A script with the following capabilities would be great:
1. Use property files (jenkins.properties / artifactory.properties) which contain variables such as hostname/username/password etc., if any.
2. Use the REST API to perform the deletion for a given application/job and a given release (for example 1.0.0).
3. Can be used for both the Jenkins and the Artifactory deletion - i.e. at the command prompt I say use this (Jenkins) property file or that one (Artifactory) - as in both cases the application and its release value will be the same.
4. We know that when promoting a build to INT (using the Jenkins promote plugin), we'll always delete from the Jenkins server and, on the Artifactory server, only from libs-snapshot-release.
Now if someone does a promotion to QA (at a later time), then the Artifactory repository will be libs-stage-local.
In other words, we should call the Groovy script and pass some variables/values (REST) telling it which application/job to delete and what its build release version is. Then it'll delete all builds except the one which the user passes (i.e. 1.0.0.25).
I'm new to both Groovy and using the REST API for this "deletion" piece of work for Jenkins/Artifactory. If someone already has a sample script that does this kind of activity and can share it, I'll tweak it according to my settings and see if I can get the behaviour described above during the promotion step. I'm under a time crunch to get a working version of this script, so I'd appreciate some script code doing the task rather than pointers to large documentation pages (I know reading those would make me a better Groovy coder, but it would delay the whole purpose of this post).
Thanks a lot.
Found one way (not using REST API calls at this time, but I'll update soon - or you can help).
Solution 1 - to delete all builds of a Jenkins job except one build (the one we selected for promotion): during the promotion, call a "Scriptler" script under the BUILD section of Jenkins containing the following code, or create a separate job and call this script from it, passing two string parameters (jobName and buildNumber).
-bash-3.2$ cat bulkDeleteBuildsExceptOne.groovy
/*** BEGIN META {
"name" : "Bulk Delete Builds except the given build number",
"comment" : "For a given job and a given build numnber, delete all build except the user provided one.",
"parameters" : [ 'jobName', 'buildNumber' ],
"core": "1.410",
"authors" : [
{ name : "Arun Sangal" }
]
} END META **/
// NOTE: Uncomment parameters below if not using Scriptler >= 2.0, or if you're just pasting the script in manually.
// ----- Logic in this script takes 5000 as the infinite number, decrease / increase this value from your own experience.
// The name of the job.
//def jobName = "some-job"
// The range of build numbers to delete.
//def buildNumber = "5"
import jenkins.model.*;
import hudson.model.Fingerprint.RangeSet;

def lastBuildNumber = buildNumber.toInteger() - 1;
def nextBuildNumber = buildNumber.toInteger() + 1;

def jij = jenkins.model.Jenkins.instance.getItem(jobName);
println("Keeping Job_Name: ${jobName} and build Number: ${buildNumber}");
println ""
def setBuildRange = "1-${lastBuildNumber}"
//println setBuildRange
def range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each { it.delete() }
println("Builds have been deleted - Range: " + setBuildRange)
setBuildRange = "${nextBuildNumber}-5000"
//println setBuildRange
range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each { it.delete() }
println("Builds have been deleted - Range: " + setBuildRange)
https://github.com/gigaaks/jenkins-scripts/blob/master/scriptler/bulkDeleteBuildsExceptOne.groovy
-OR
http://scriptlerweb.appspot.com/script/show/101001 (Scriptler Web site) - This can be visible in Jenkins Scriptler plugin under Remote catalogs scripts section.
It would have been a little easier if GitHub provided an easy button/link to push my changes to the main jenkinsci branch/repository.
Though I'm still looking for 2 things:
1. How can I make the script parameterized in Groovy? Using CliBuilder, I get a class-not-found error (see the sketch after this list).
2. How to do this using a Jenkins REST API call. Later, I'll do the same using an Artifactory REST API call.
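For what it's worth, a minimal CliBuilder sketch for running the script standalone (this assumes Groovy's bundled groovy.util.CliBuilder and its commons-cli dependency are on the classpath; inside Scriptler the parameters are injected as script variables, so CliBuilder isn't needed there):

// Hypothetical standalone wrapper - option names mirror the Scriptler parameters.
def cli = new groovy.util.CliBuilder(usage: 'bulkDeleteBuildsExceptOne.groovy -j <jobName> -b <buildNumber>')
cli.j(longOpt: 'jobName', args: 1, required: true, 'name of the Jenkins job')
cli.b(longOpt: 'buildNumber', args: 1, required: true, 'build number to keep')
def opts = cli.parse(args)
if (!opts) return                 // CliBuilder has already printed the usage message
def jobName = opts.j
def buildNumber = opts.b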
OK, there was a little tweak. I found that if a Jenkins job generates builds/artifacts for multiple releases/versions from a single job (i.e. if it's using the Build Name Setter plugin), using Major.minor.interim (2.75.0 for example) as the release and producing builds 1-150 for that release, and later, once that release has gone to the INT/QA environment, the same job starts producing builds 1-N for the next release (i.e. 2.75.1 or 2.76.0 etc.), then the following script will do the trick.
See this link:
Do not delete a Jenkins build if it's marked as "Keep this build forever" - Groovy script to delete Jenkins builds
bulkDeleteJenkinsBuildsExceptOne_OfAGivenRelease.groovy
/*** BEGIN META {
"name" : "Bulk Delete Builds except the given build number",
"comment" : "For a given job and a given build numnber, delete all builds of a given release version (M.m.interim) only and except the user provided one. Sometimes a Jenkins job use Build Name setter plugin and same job generates 2.75.0.1 and 2.76.0.43",
"parameters" : [ 'jobName', 'releaseVersion', 'buildNumber' ],
"core": "1.409",
"authors" : [
{ name : "Arun Sangal" }
]
} END META **/
// NOTE: Uncomment parameters below if not using Scriptler >= 2.0, or if you're just pasting the script in manually.
// ----- Logic in this script takes 5000 as the infinite number, decrease / increase this value from your own experience.
// The name of the job.
//def jobName = "some-job"
// The release / version of a Jenkins job - i.e. in case you use "Build name" setter plugin in Jenkins for getting builds like 2.75.0.1, 2.75.0.2, .. , 2.75.0.15 etc.
// and over the time, change the release/version value (2.75.0) to a newer value i.e. 2.75.1 or 2.76.0 and start builds of this new release/version from #1 onwards.
//def releaseVersion = "2.75.0"
// The range of build numbers to delete.
//def buildNumber = "5"
import jenkins.model.*;
import hudson.model.Fingerprint.RangeSet;

def lastBuildNumber = buildNumber.toInteger() - 1;
def nextBuildNumber = buildNumber.toInteger() + 1;

def jij = jenkins.model.Jenkins.instance.getItem(jobName);
//def build = jij.getLastBuild();
println ""
println("- Jenkins Job_Name: ${jobName} -- Version: ${releaseVersion} -- Keep Build Number: ${buildNumber}");
println ""
println " -- Range before given build number: ${buildNumber}"
println ""
def setBuildRange = "1-${lastBuildNumber}"
def range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each {
if ( it.getDisplayName().find(/${releaseVersion}.*/)) {
println " ## Deleting >>>>>>>>>: " + it.getDisplayName();
// Trying to find - how to NOT delete a build in Jenkins if it's marked as "keep this build forever". If someone has an idea, please update this script with a newer version in GitHub.
//if ( !build.isKeepLog()) {
it.delete();
//} else {
// println "build -- can't be deleted as :" + build.getWhyKeepLog();
//}
}
}
println ""
println " -- Range after given build number: ${buildNumber}"
println ""
setBuildRange = "${nextBuildNumber}-5000"
range = RangeSet.fromString(setBuildRange, true);
jij.getBuilds(range).each {
if ( it.getDisplayName().find(/${releaseVersion}.*/)) {
println " ## Deleting >>>>>>>>>: " + it.getDisplayName();
it.delete();
}
}
println ""
println("- Builds have been successfully deleted for the above mentioned release: ${releaseVersion}")
println ""
To use the REST API to call the above Scriptler script OR a Jenkins job, the call would look like the snippet below; I'm still wondering where the POST action gets passed (a sketch of the POST call follows the snippet).
The main line is: def artifactSearchUri = "api/build/${jobName}/${buildNumber}" ... which we need to tweak to something like: ="api/build/Some_Jenkins_Job_That_You_Will_Create/buildWithParameters?jobName=Test_AppSvc&releaseVersion=2.75.0&buildNumber=15"
import groovy.json.*
def artifactoryURL= properties["jenkins.ARTIFACTORY_URL"]
def artifactoryUser = properties["artifactoryUser"]
def artifactoryPassword = properties["artifactoryPassword"]
def authString = "${artifactoryUser}:${artifactoryPassword}".getBytes().encodeBase64().toString()
def jobName = properties["jobName"]
def buildNumber = properties["buildNumber"]
def artifactSearchUri = "api/build/${jobName}/${buildNumber}"
def conn = "${artifactoryURL}/${artifactSearchUri}".toURL().openConnection()
conn.setRequestProperty("Authorization", "Basic " + authString);
println "Searching artifactory with: ${artifactSearchUri}"
def searchResults
if( conn.responseCode == 200 ) {
searchResults = new JsonSlurper().parseText(conn.content.text)
} else {
throw new Exception ("Failed to find the build info for ${jobName}/${buildNumber}: ${conn.responseCode} - ${conn.responseMessage}")
}
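As a rough sketch, triggering that parameterized job over Jenkins' remote API is a plain POST. The job name, the parameter names and the jenkins.JENKINS_URL property below are placeholders, and newer Jenkins versions may additionally require a CSRF crumb header:

def jenkinsURL = properties["jenkins.JENKINS_URL"]   // assumed property, analogous to jenkins.ARTIFACTORY_URL above
def triggerUri = "job/Some_Jenkins_Job_That_You_Will_Create/buildWithParameters" +
        "?jobName=Test_AppSvc&releaseVersion=2.75.0&buildNumber=15"
def triggerConn = "${jenkinsURL}/${triggerUri}".toURL().openConnection()
triggerConn.setRequestMethod("POST")                  // this is where the POST action is set
triggerConn.setRequestProperty("Authorization", "Basic " + authString)
if (!(triggerConn.responseCode in [200, 201])) {
    throw new Exception("Failed to trigger ${triggerUri}: ${triggerConn.responseCode} - ${triggerConn.responseMessage}")
}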
and for deleting the Artifactory builds, we have to combine the above logic with the following Groovy script, which I'm still trying to get working. I know I'm close.
BLOG: http://browse.feedreader.com/c/Gridshore/11546011
Script: https://github.com/jettro/small-scripts/blob/master/groovy/artifactory/Artifactory.groovy
package artifactory
import groovy.text.SimpleTemplateEngine
import groovyx.net.http.RESTClient
import net.sf.json.JSON
/**
* This Groovy class is meant to be used to clean up your Artifactory server or get more information about its
* contents. The Artifactory API is documented very well at the following location
* {@see http://wiki.jfrog.org/confluence/display/RTF/Artifactory%27s+REST+API}
*
* At the moment there is one major use of this class, cleaning your repository.
*
* Reading data about the repositories is done against /api/repository, if you want to remove items you need to use
* '/api/storage'
*
* Artifactory returns a strange Content Type in the response. We want to use a generic JSON library. Therefore we need
* to map the incoming type to the standard application/json. An example of the mapping is below, all the other
* mappings can be found in the obtainServerConnection method.
* 'application/vnd.org.jfrog.artifactory.storage.FolderInfo+json' => server.parser.'application/json'
*
* The class makes use of a config object. The config object is a map with a minimum of the following fields:
* def config = [
* server: 'http://localhost:8080',
* repository: 'libs-release-local',
* versionsToRemove: ['/3.2.0-build-'],
* dryRun: true]
*
* The versionsToRemove is an array of strings that are the start of builds that should be removed. To give an idea of
* the build numbers we use: 3.2.0-build-1 or 2011.10-build-1. The -build- is important for the solution. This is how
* we identify an artifact instead of a group folder.
*
* The final option to notice is the dryRun option. This way you can get an overview of what will be deleted. If set
* to false, it will delete the selected artifacts.
*
* Usage example
* -------------
* def config = [
* server: 'http://localhost:8080',
* repository: 'libs-release-local',
* versionsToRemove: ['/3.2.0-build-'],
* dryRun: false]
*
* def artifactory = new Artifactory(config)
*
* def numberRemoved = artifactory.cleanArtifactsRecursive('nl/gridshore/toberemoved')
*
* if (config.dryRun) {
*     println "$numberRemoved folders would have been removed."
* } else {
*     println "$numberRemoved folders were removed."
* }
*
* @author Jettro Coenradie
*/
private class Artifactory {
def engine = new SimpleTemplateEngine()
def config
def Artifactory(config) {
this.config = config
}
/**
* Print information about all the available repositories in the configured Artifactory
*/
def printRepositories() {
def server = obtainServerConnection()
def resp = server.get(path: '/artifactory/api/repositories')
if (resp.status != 200) {
println "ERROR: problem with the call: " + resp.status
System.exit(-1)
}
JSON json = resp.data
json.each {
println "key :" + it.key
println "type : " + it.type
println "descritpion : " + it.description
println "url : " + it.url
println ""
}
}
/**
* Return information about the provided path for the configured artifactory and server.
*
* @param path String representing the path to obtain information for
*
* @return JSON object containing information about the specified folder
*/
def JSON folderInfo(path) {
def binding = [repository: config.repository, path: path]
def template = engine.createTemplate('''/artifactory/api/storage/$repository/$path''').make(binding)
def query = template.toString()
def server = obtainServerConnection()
def resp = server.get(path: query)
if (resp.status != 200) {
println "ERROR: problem obtaining folder info: " + resp.status
println query
System.exit(-1)
}
return resp.data
}
/**
* Recursively removes all folders containing builds that start with the configured paths.
*
* @param path String containing the folder to check; its children are recursively checked as well.
* @return Number with the amount of folders that were removed.
*/
def cleanArtifactsRecursive(path) {
def deleteCounter = 0
JSON json = folderInfo(path)
json.children.each {child ->
if (child.folder) {
if (isArtifactFolder(child)) {
config.versionsToRemove.each {toRemove ->
if (child.uri.startsWith(toRemove)) {
removeItem(path, child)
deleteCounter++
}
}
} else {
if (!child.uri.contains("ro-scripts")) {
deleteCounter += cleanArtifactsRecursive(path + child.uri)
}
}
}
}
return deleteCounter
}
private RESTClient obtainServerConnection() {
def server = new RESTClient(config.server)
server.parser.'application/vnd.org.jfrog.artifactory.storage.FolderInfo+json' = server.parser.'application/json'
server.parser.'application/vnd.org.jfrog.artifactory.repositories.RepositoryDetailsList+json' = server.parser.'application/json'
return server
}
private def isArtifactFolder(child) {
child.uri.contains("-build-")
}
private def removeItem(path, child) {
println "folder: " + path + child.uri + " DELETE"
def binding = [repository: config.repository, path: path + child.uri]
def template = engine.createTemplate('''/artifactory/$repository/$path''').make(binding)
def query = template.toString()
if (!config.dryRun) {
def server = new RESTClient(config.server)
server.delete(path: query)
}
}
}
FINAL answer: this one also deletes the build artifacts from Artifactory, using Artifactory's REST API. The script deletes the Jenkins/Artifactory builds/artifacts of a given release/version only (over time a given Jenkins job can create builds for multiple releases/versions, for example 2.75.0.1, 2.75.0.2, ..., 2.75.0.54, 2.76.0.1, 2.76.0.2, ..., 2.76.0.16, 2.76.1.1, 2.76.1.2, ..., 2.76.1.5; for every new release of that job the build number starts again at 1). If you have to delete all builds except one, or even all of them (change the script a little for your own needs), without touching builds of older/other releases, then use the following script.
Scriptler Catalog link: http://scriptlerweb.appspot.com/script/show/103001
Enjoy!
/*** BEGIN META {
"name" : "Bulk Delete Builds except the given build number",
"comment" : "For a given job and a given build numnber, delete all builds of a given release version (M.m.interim) only and except the user provided one. Sometimes a Jenkins job use Build Name setter plugin and same job generates 2.75.0.1 and 2.76.0.43",
"parameters" : [ 'jobName', 'releaseVersion', 'buildNumber' ],
"core": "1.409",
"authors" : [
{ name : "Arun Sangal - Maddys Version" }
]
} END META **/
import groovy.json.*
import jenkins.model.*;
import hudson.model.Fingerprint.RangeSet;
import hudson.model.Job;
import hudson.model.Fingerprint;
//these should be passed in as arguments to the script
if(!artifactoryURL) throw new Exception("artifactoryURL not provided")
if(!artifactoryUser) throw new Exception("artifactoryUser not provided")
if(!artifactoryPassword) throw new Exception("artifactoryPassword not provided")
def authString = "${artifactoryUser}:${artifactoryPassword}".getBytes().encodeBase64().toString()
def artifactorySettings = [artifactoryURL: artifactoryURL, authString: authString]
if(!jobName) throw new Exception("jobName not provided")
if(!buildNumber) throw new Exception("buildNumber not provided")
def lastBuildNumber = buildNumber.toInteger() - 1;
def nextBuildNumber = buildNumber.toInteger() + 1;
def jij = jenkins.model.Jenkins.instance.getItem(jobName);
def promotedBuildRange = new Fingerprint.RangeSet()
promotedBuildRange.add(buildNumber.toInteger())
def promoteBuildsList = jij.getBuilds(promotedBuildRange)
assert promoteBuildsList.size() == 1
def promotedBuild = promoteBuildsList[0]
// The release / version of a Jenkins job - i.e. in case you use "Build name" setter plugin in Jenkins for getting builds like 2.75.0.1, 2.75.0.2, .. , 2.75.0.15 etc.
// and over the time, change the release/version value (2.75.0) to a newer value i.e. 2.75.1 or 2.76.0 and start builds of this new release/version from #1 onwards.
def releaseVersion = promotedBuild.getDisplayName().split("\\.")[0..2].join(".")
println ""
println("- Jenkins Job_Name: ${jobName} -- Version: ${releaseVersion} -- Keep Build Number: ${buildNumber}");
println ""
/** delete the indicated build and its artifacts from artifactory */
def deleteBuildFromArtifactory(String jobName, int deleteBuildNumber, Map<String, String> artifactorySettings){
println " ## Deleting >>>>>>>>>: - ${jobName}:${deleteBuildNumber} from artifactory"
def artifactSearchUri = "api/build/${jobName}?buildNumbers=${deleteBuildNumber}&artifacts=1"
def conn = "${artifactorySettings['artifactoryURL']}/${artifactSearchUri}".toURL().openConnection()
conn.setRequestProperty("Authorization", "Basic " + artifactorySettings['authString']);
conn.setRequestMethod("DELETE")
if( conn.responseCode != 200 ) {
println "Failed to delete the build artifacts from artifactory for ${jobName}/${deleteBuildNumber}: ${conn.responseCode} - ${conn.responseMessage}"
}
}
/** delete all builds in the indicated range that match the releaseVersion */
def deleteBuildsInRange(String buildRange, String releaseVersion, Job theJob, Map<String, String> artifactorySettings){
def range = RangeSet.fromString(buildRange, true);
theJob.getBuilds(range).each {
if ( it.getDisplayName().find(/${releaseVersion}.*/)) {
println " ## Deleting >>>>>>>>>: " + it.getDisplayName();
deleteBuildFromArtifactory(theJob.name, it.number, artifactorySettings)
it.delete();
}
}
}
//delete all the matching builds before the promoted build number
deleteBuildsInRange("1-${lastBuildNumber}", releaseVersion, jij, artifactorySettings)
//delete all the matching builds after the promoted build number
deleteBuildsInRange("${nextBuildNumber}-${jij.nextBuildNumber}", releaseVersion, jij, artifactorySettings)
println ""
println("- Builds have been successfully deleted for the above mentioned release: ${releaseVersion}")
println ""
First of all, this is just not done. Why would you delete:
A few extra Jenkins builds which are not harming you.
Artifacts from an artifact repository (which is named Artifactory!!)
Now that said, I understand you might still have a good reason to do so (which would be interesting to know). This is an alternate approach I can propose:
Jenkins: I am assuming you are using Maven. In that case, you can use the M2 Release plugin to create "release builds". These builds will have a special suitcase-like icon next to them and will be marked "keep this build forever". You can play around with how many days to keep artifacts, how many builds to keep, etc. in Jenkins and define your own policy so that your requirement is addressed.
Artifactory: I use Nexus, so the implementation might be different, but you can set it so that snapshot builds are overwritten every time. That way, at all times you have n release builds and exactly one snapshot. The second policy is "delete snapshot when released". This ensures the same-numbered snapshot and release do not co-exist in the repo. This is exactly how it should be, and there should be no reason to delete "released" artifacts from a repository like Artifactory. That's the whole point of a release.
I'm using Cucumber for my tests. How do I rerun only the failed tests?
Run Cucumber with rerun formatter:
cucumber -f rerun --out rerun.txt
It will output locations of all failed scenarios to this file.
Then you can rerun them by using
cucumber @rerun.txt
Here is my simple and neat solution.
Step 1: Write your Cucumber runner class as shown below, with rerun:target/rerun.txt. Cucumber writes the locations of the failed scenarios to rerun.txt, for example:
features/MyScenarios.feature:25
features/MyScenarios.feature:45
We can use this file later in Step 2. Name this runner MyScenarioTests.java; it is your main file for running your tagged scenarios. If any scenarios fail, this run writes their locations to rerun.txt under the target directory.
import org.junit.runner.RunWith;
// Cucumber-JVM 1.2.x-3.x package names; Cucumber-JVM 4+ moved these to io.cucumber.junit
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;

@RunWith(Cucumber.class)
@CucumberOptions(
monochrome = true,
features = "classpath:features",
plugin = {"pretty", "html:target/cucumber-reports",
"json:target/cucumber.json",
"rerun:target/rerun.txt"} //Creates a text file with failed scenarios
,tags = "@mytag"
)
public class MyScenarioTests {
}
Step 2: Create another runner class as shown below. Let's call it FailedScenarios.java. Whenever you notice failed scenarios, run this file. It uses target/rerun.txt as the input for which scenarios to run.
import org.junit.runner.RunWith;
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;

@RunWith(Cucumber.class)
@CucumberOptions(
monochrome = true,
features = "@target/rerun.txt", //Cucumber picks the failed scenarios from this file
format = {"pretty", "html:target/site/cucumber-pretty",
"json:target/cucumber.json"}
)
public class FailedScenarios {
}
Every time you notice failed scenarios, run the file from Step 2.
task cucumber() {
dependsOn assemble, compileTestJava
doLast {
javaexec {
main = "io.cucumber.core.cli.Main"
classpath = configurations.cucumberRuntime + sourceSets.main.output + sourceSets.test.output
args = ['--plugin', 'json:target/cucumber-reports/json/cucumber.json',
'--plugin', "rerun:target/rerun.txt",
'--glue', 'steps',
'src/test/resources']
}
}
}
task cucumberRerunFailed() {
doLast {
javaexec {
main = "io.cucumber.core.cli.Main"
classpath = configurations.cucumberRuntime + sourceSets.main.output + sourceSets.test.output
args = ['--plugin', 'json:target/cucumber-reports/json/cucumber.json',
'#target/rerun.txt']
}
}
}
I know this is old, but I found my way here first and only later found a much more up-to-date answer (not the accepted one; Cyril Duchon-Doris' answer):
https://stackoverflow.com/a/41174252/215789
Since cucumber 3.0 you can use --retry to specify how many times to retry a scenario that failed.
https://cucumber.io/blog/open-source/announcing-cucumber-ruby-3-0-0/
Just tack it on to your cucumber command:
cucumber ... --retry 2
You need at least version 1.2.0 in order to use the new @target/rerun.txt feature. After that, just create a runner that runs at the end and uses this file. Also, if you are using Jenkins, you can put a tag on the randomly failing features so the build doesn't fail unless a scenario has failed after being run twice.