Is there a way to define a block/environment with custom open and close methods? Currently, I have:
script {
    withCredentials([usernamePassword(credentialsId: '...', usernameVariable: 'CONFIG_USER', passwordVariable: 'CONFIG_PASS')]) {
        def sql = Sql.newInstance("...", CONFIG_USER, CONFIG_PASS, "com.mysql.jdbc.Driver")
        sql.rows("SELECT * FROM visualization").each { row ->
            println "row ${row.branch}"
        }
        sql.close()
    }
}
I would like to be able to do:
with sqlConnection() { sql ->
    sql.rows("SELECT * FROM visualization").each { row ->
        println "row ${row.branch}"
    }
}
Where it automatically opens/closes the connection accordingly. I am new to Groovy, so it's mainly the syntax I'm concerned about. In Python I would do this with an object's __enter__/__exit__ methods.
If I understand you correctly, you want a new method sqlConnection() that does the withCredentials part?
You can use a closure parameter to run your own code before and after the caller's block.
def sqlConnection(Closure withSqlClosure) {
    withCredentials([usernamePassword(credentialsId: '...', usernameVariable: 'CONFIG_USER', passwordVariable: 'CONFIG_PASS')]) {
        Sql.newInstance("...", CONFIG_USER, CONFIG_PASS, "com.mysql.jdbc.Driver").withCloseable { sql ->
            withSqlClosure(sql)
        }
    }
}
It can be used like this:
sqlConnection() { sql ->
    sql.rows("SELECT * FROM visualization").each { row ->
        println "row ${row.branch}"
    }
}
So everything before the call to the closure (withSqlClosure(sql)) corresponds to __enter__; everything after the call is your __exit__. Note that you will need to look out for exceptions: usually you will want to wrap the closure call in a try { ... } finally { ... } statement. Here I used withCloseable, which does that for us (assuming Sql.newInstance returns a Closeable).
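If you'd rather make that explicit (or if the returned object were not Closeable), a minimal sketch of the try/finally form, under the same assumptions as above, would be:
def sqlConnection(Closure withSqlClosure) {
    withCredentials([usernamePassword(credentialsId: '...', usernameVariable: 'CONFIG_USER', passwordVariable: 'CONFIG_PASS')]) {
        def sql = Sql.newInstance("...", CONFIG_USER, CONFIG_PASS, "com.mysql.jdbc.Driver")
        try {
            withSqlClosure(sql)   // the caller's block; the "__enter__" work has already happened
        } finally {
            sql.close()           // always runs, even on exceptions -- your "__exit__"
        }
    }
}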
To aid your IDE and enable @CompileStatic, you should also add a @ClosureParams annotation:
def sqlConnection(
        @ClosureParams(value = groovy.transform.stc.SimpleType,
                       options = ["your.sql.type"]) Closure withSqlClosure) {
    withCredentials([usernamePassword(credentialsId: '...', usernameVariable: 'CONFIG_USER', passwordVariable: 'CONFIG_PASS')]) {
        Sql.newInstance("...", CONFIG_USER, CONFIG_PASS, "com.mysql.jdbc.Driver").withCloseable { sql ->
            withSqlClosure(sql)
        }
    }
}
Here your.sql.type is the return type of Sql.newInstance.
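Since Sql.newInstance returns groovy.sql.Sql, the annotation would presumably end up looking like this (a sketch; the body is unchanged from above):
import groovy.transform.stc.ClosureParams
import groovy.transform.stc.SimpleType

def sqlConnection(
        @ClosureParams(value = SimpleType, options = ["groovy.sql.Sql"]) Closure withSqlClosure) {
    // ... same body as above
}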
Related
I'm developing a Jenkins shared library right now.
I haven't been able to figure out how to easily "wrap" code inside a function without copy-pasting the whole thing. For example: if a developer sets a value to true, I want to wrap the whole code in another function. Right now I want to use this to allow e.g. the gitlabIntegration to be turned off from the Jenkinsfile.
Example:
// vars/stageWrapper.groovy
def call(Map parameters = [:], body) {
    stage(stageName) {
        if (pushtoGitlab) {
            gitlabCommitStatus(stageName) {
                if (!containerName) body()
                else {
                    container(containerName) {
                        body()
                    }
                }
            }
        } else {
            if (!containerName) body()
            else {
                container(containerName) {
                    body()
                }
            }
        }
    }
}
The wrapper should let the user select whether the stage is pushed to GitLab via the gitlabCommitStatus wrapper, and switch to a specified container (or use the default container if none is specified).
To realize this I currently repeat the code, which I really don't like...
Is there any way of achieving the same, but without repeating the same code over and over?
Thank You!
In Groovy you can reuse a Closure in different DSL builders by setting its delegate to the builder's delegate.
Something like this should work:
def containerNameBody = { body ->
    if (!containerName)
        body()
    else
        container(containerName) {
            body()
        }
}

def call(Map parameters = [:], body) {
    stage(stageName) {
        containerNameBody.delegate = delegate
        if (pushtoGitlab)
            gitlabCommitStatus(stageName) {
                containerNameBody body
            }
        else
            containerNameBody body
    }
}
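For completeness, a sketch of how such a step might be invoked from a Jenkinsfile, assuming call() extracts stageName, pushtoGitlab and containerName from the parameters map (that extraction is omitted in the snippets above):
// hypothetical Jenkinsfile usage of the shared-library step
stageWrapper(stageName: 'Build', pushtoGitlab: true, containerName: 'maven') {
    sh 'mvn -B package'
}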
How about the following approach: pass the parameters down into the function, then decide what to do inside the function based on their values.
def gitHub(gitHubOn) {
    // run the GitHub-specific steps only when gitHubOn is true
}

def gitLab(gitLabOn) {
    // run the GitLab-specific steps only when gitLabOn is true
}

def call(Map parameters = [:], body) {
    // some code....
    foo = bar
    gitLab(parameters.gitLabOn)
    gitHub(parameters.gitHubOn)
    body()
}
I'm a little bit confused about a simple use case:
I want to wrap one closure inside another one, but only under certain conditions.
For the moment, I have only managed to do this:
if (condition) {
    my_root_closure {
        my_main_closure {
            do_stuff()
        }
    }
} else {
    my_main_closure {
        do_stuff()
    }
}
I would like to do this without repeating the my_main_closure block.
To avoid the repetition, you could create a new closure that calls my_main_closure and store it in a variable:
def mmc = {
    my_main_closure {
        do_stuff()
    }
}

if (condition) {
    my_root_closure(mmc)
} else {
    mmc()
}
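If my_root_closure takes a trailing block rather than a closure argument, the stored closure can also simply be invoked inside that block; a sketch:
if (condition) {
    my_root_closure {
        mmc()   // run the stored closure inside the root block
    }
} else {
    mmc()
}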
On Jenkins, using the Job DSL plugin, I'm trying to prepare one script that will create jobs configured for different environments (dev and preprod). Depending on which environment a job runs in, different parameters are needed.
In this situation, what is the shortest way to define that the dev environment's parameters include the same parameters as preprod plus, additionally, two more?
An example of the code which I use is presented below.
def environments = ["DEV", "PREPROD"]
def names = ["name1", "name2", "name3"]

def jobParameters = {
    string {
        name("browser")
        defaultValue("CHROME")
        description("Browser on which one tests will run")
        trim(true)
    }
    string {
        name("parameter1")
        defaultValue("")
        description("")
        trim(true)
    }
}

def jobParametersDev = {
    jobParameters
    string {
        name("parameter2")
        defaultValue("")
        description("")
        trim(true)
    }
    string {
        name("parameter3")
        defaultValue("")
        description("")
        trim(true)
    }
}

def createJob(name, env, runCommand, jobParameters) {
    job("Job-${-> name}-${-> env}") {
        description("My first job for ${-> name}")
        parameters(jobParameters)
        steps {
            shell {
                command(runCommand)
            }
        }
    }
}

for (name in names) {
    for (env in environments) {
        if (env == 'DEV') {
            def runCommand = "python35 -u ./TestSuite-${-> name}.py %parameter1% %parameter2%,%parameter3% %browser%"
            createJob(name, env, runCommand, jobParametersDev)
        } else {
            def runCommand = "python35 -u ./TestSuite-${-> name}.py %parameter1% ${-> env} %browser%"
            createJob(name, env, runCommand, jobParameters)
        }
    }
}
To summarise, the last thing I tried was:
def jobParametersDev = {
    jobParameters
    ...
}
But it doesn't work... only the values defined directly in jobParametersDev are visible.
How can I add the shared values? If it's not necessary, I don't want to duplicate the same code.
I will be really grateful for any help.
You cannot simply call one closure within another like that. But you can chain method calls; you just need to pass the job reference.
def generateParameters = { job ->
    job.parameters {
        stringParam('param3', '', '')
        // more params here...
    }
}

def generateDevParameters = { job ->
    generateParameters(job)
    job.parameters {
        stringParam('param4', '', '')
        // more params here...
    }
}

def createJob(name, generateParameters) {
    def j = job(name) {
        // more config here...
    }
    generateParameters(j)
}

createJob('test1', generateParameters)
createJob('test2', generateDevParameters)
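Adapted to the loop from the question, that could look roughly like this (a sketch; the runCommand handling from the question is omitted and would need to be threaded through createJob as before):
def environments = ["DEV", "PREPROD"]
def names = ["name1", "name2", "name3"]

for (name in names) {
    for (env in environments) {
        // pick the parameter generator per environment, then reuse the same job template
        def gen = (env == 'DEV') ? generateDevParameters : generateParameters
        createJob("Job-${name}-${env}", gen)
    }
}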
Let's say I have a DSL like this
setup {name = "aDSLScript"}
println "this is common groovy code"
doStuff {println "I'm doing dsl stuff"}
Usually one would have a delegating class implementing the methods 'setup' and 'doStuff'. Besides that, one can write common Groovy code that gets executed (the println calls).
What I am searching for is a way to execute this in two steps. In the first step only the setup method should be processed (not even the printlns). The second step handles the other parts.
At the moment I have two delegating classes: one implements 'setup', the other implements 'doStuff'. But both execute the println statement, of course.
You can create a single class to intercept the method calls from the script and let it coordinate the subsequent method invocations. I did it through reflection, but you can go declarative if you want. These are the model and interceptor classes:
class FirstDelegate {
    def setup(closure) { "firstDelegate.setup" }
}

class SecondDelegate {
    def doStuff(closure) { "secondDelegate.doStuff" }
}

class MethodInterceptor {
    def invokedMethods = []

    def methodMissing(String method, args) {
        invokedMethods << [method: method, args: args]
    }

    def delegate() {
        def lookupCalls = { instance ->
            def invokes = instance.metaClass.methods.findResults { method ->
                invokedMethods.findResult { invocation ->
                    invocation.method == method.name ?
                            [method: method, invocation: invocation] : null
                }
            }
            invokes.collect { invoked ->
                invoked.method.invoke(instance, invoked.invocation.args)
            }
        }
        return lookupCalls(new FirstDelegate()) + lookupCalls(new SecondDelegate())
    }
}
Here be scripts and assertions:
import org.codehaus.groovy.control.CompilerConfiguration

def dsl = '''
    setup {name = "aDSLScript"}
    println "this is common groovy code"
    doStuff {println "Ima doing dsl stuff"}
'''

def compiler = new CompilerConfiguration()
compiler.scriptBaseClass = DelegatingScript.class.name

def shell = new GroovyShell(this.class.classLoader, new Binding(), compiler)
script = shell.parse dsl

interceptor = new MethodInterceptor()
script.setDelegate interceptor
script.run()

assert interceptor.invokedMethods*.method == ['setup', 'doStuff']
assert interceptor.delegate() == ['firstDelegate.setup', 'secondDelegate.doStuff']
Notice I didn't bother intercepting the println call, which is a DefaultGroovyMethods method and thus a little more cumbersome to handle.
Also, having the class MethodInterceptor implement the delegate() method is not a good idea, since it allows the user-defined script to call it.
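One way around that, as a sketch under the same assumptions, is to let the script's delegate only record calls and keep the dispatch in host code the script cannot reach:
// Assumption: split recording from dispatching so the script can only record
class CallRecorder {
    def invokedMethods = []
    def methodMissing(String method, args) {
        invokedMethods << [method: method, args: args]
    }
}

def recorder = new CallRecorder()
script.setDelegate recorder
script.run()
// now replay recorder.invokedMethods against FirstDelegate / SecondDelegate
// using the same lookup logic as in MethodInterceptor.delegate(), but from host code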
I found a way to split up the execution of the DSL script. I used a CompilationCustomizer to remove every statement from the AST except the doFirst {} call. So the first run only executes doFirst; the second run does everything else. Here's some code:
// imports needed for the compiler configuration and the AST-level customizer below
import org.codehaus.groovy.ast.*
import org.codehaus.groovy.ast.expr.*
import org.codehaus.groovy.ast.stmt.*
import org.codehaus.groovy.classgen.GeneratorContext
import org.codehaus.groovy.control.*
import org.codehaus.groovy.control.customizers.CompilationCustomizer

class DoFirstProcessor {
    def doFirst(Closure c) {
        c()
    }
}

class TheRestProcessor {
    def doStuff(Closure c) {
        c()
    }

    def methodMissing(String name, args) {
        // nothing to do
    }
}

def dsl = """
    println 'this is text that will not be printed out in first line!'
    doFirst { println 'First things first: e.g. setting up environment' }
    doStuff { println 'doing some stuff now' }
    println 'That is it!'
"""
class HighlanderCustomizer extends CompilationCustomizer {
    def methodName

    HighlanderCustomizer(def methodName) {
        super(CompilePhase.SEMANTIC_ANALYSIS)
        this.methodName = methodName
    }

    @Override
    void call(SourceUnit sourceUnit, GeneratorContext generatorContext, ClassNode classNode) throws CompilationFailedException {
        def methods = classNode.getMethods()
        methods.each { MethodNode m ->
            m.code.each { Statement st ->
                if (!(st instanceof BlockStatement)) {
                    return
                }
                def removeStmts = []
                st.statements.each { Statement bst ->
                    if (bst instanceof ExpressionStatement) {
                        def ex = bst.expression
                        if (ex instanceof MethodCallExpression) {
                            if (!ex.methodAsString.equals(methodName)) {
                                removeStmts << bst
                            }
                        } else {
                            removeStmts << bst
                        }
                    } else {
                        removeStmts << bst
                    }
                }
                st.statements.removeAll(removeStmts)
            }
        }
    }
}
def cc = new CompilerConfiguration()
cc.addCompilationCustomizers new HighlanderCustomizer("doFirst")
cc.scriptBaseClass = DelegatingScript.class.name
def doFirstShell = new GroovyShell(new Binding(), cc)
def doFirstScript = doFirstShell.parse dsl
doFirstScript.setDelegate new DoFirstProcessor()
doFirstScript.run()
cc.compilationCustomizers.clear()
def shell = new GroovyShell(new Binding(), cc)
def script = shell.parse dsl
script.setDelegate new TheRestProcessor()
script.run()
I did another variation of this where I execute the DSL in one step. See my blog post about it: http://hackserei.metacode.de/?p=247
In a piece of Gradle build script, the amount of code I'm repeating is increasing. All tasks have a big part in common, except for a few lines:
task copyZipFile() {
    doLast {
        def remoteBuildProperties = getRemoteBuildProperties(project)
        ant {
            taskdef(name: 'ftp',
                    classname: 'org.apache.tools.ant.taskdefs.optional.net.FTP',
                    classpath: configurations.ftpAntTask.asPath)
            ftp(server: remoteBuildProperties['host.name'],
                userid: remoteBuildProperties['username'],
                password: remoteBuildProperties['password'],
                remotedir: 'some-folder', // This value differs from call to call
                passive: 'true') {
                // The next two lines also are different per case, and might be less or more lines
                fileset(dir: rootProject.buildDir) { include(name: 'build.zip') }
                fileset(dir: rootProject.projectDir) { include(name: 'build.properties') }
            }
        }
    }
}
I don't like to repeat myself, so I'd like to reduce this code to a new helper method that does this trick, and a simple caller, something like:
task copyZipFile() {
    doLast {
        def remoteBuildProperties = getRemoteBuildProperties(project)
        upload(remoteBuildProperties, 'some-folder') {
            fileset(dir: rootProject.buildDir) { include(name: 'build.zip') }
            fileset(dir: rootProject.projectDir) { include(name: 'build.properties') }
        }
    }
}
How would I achieve this?
You can pass the inner closure to your upload method as the final parameter. Set the delegate to the original builder delegate so the inner closure calls get handled properly. For example:
def upload(remoteBuildProperties, folder, body) {
    ant {
        taskdef(name: 'ftp',
                classname: 'org.apache.tools.ant.taskdefs.optional.net.FTP',
                classpath: configurations.ftpAntTask.asPath)
        ftp(server: remoteBuildProperties['host.name'],
            userid: remoteBuildProperties['username'],
            password: remoteBuildProperties['password'],
            remotedir: folder,
            passive: 'true') {
            body.delegate = delegate
            body()
        }
    }
}
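Depending on what the closure body references, it can also help to make the delegate win over the closure's owner when resolving names; this is a common Groovy builder idiom rather than something the snippet above strictly needs:
body.resolveStrategy = Closure.DELEGATE_FIRST  // resolve names against the ftp builder's delegate first
body.delegate = delegate
body()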