I use a shared Jenkins library and clone another Git repo from there. That repo contains a Jenkinsfile similar to the following:
#!/usr/bin/env groovy
@Library('mylib')
import jLib.*

someStage {
    myparam = someValue
}
I want to read "someValue".
Currently I'm doing this with a regexp, but that way I can only retrieve a String and not more complex values such as a map.
In the documentation of Jenkins shared libraries, values are loaded from a Jenkinsfile the following way:
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
}
How can I extract values from the Jenkinsfile in the workspace in a similar manner? Where does the body come from?
Thank you for your time
I found a solution.
Map configFromJenkinsfile(String jenkinsfile) {
    // collects the untyped assignments made inside the first { ... } block
    def jenkinsfileConfig = [:]
    // grab the body of the first curly-brace block in the Jenkinsfile text
    def matches = jenkinsfile =~ /(?s)\{(.*?)\}/
    // evaluate that body with the map as binding, so every assignment lands in the map
    new GroovyShell(new Binding(jenkinsfileConfig)).evaluate(matches[0][1])
    return jenkinsfileConfig
}
I tested it by passing the Jenkinsfile as a String (after reading it with String projectJenkinsFile = readFile "Jenkinsfile").
I execute the part of the Jenkinsfile containing the assignments (between '{' and '}') in a GroovyShell with my map as the binding. All assignments without a type are caught in the binding and can be accessed afterwards.
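To make this concrete, here is a minimal usage sketch (hypothetical, not part of the original solution); it assumes the closure in the Jenkinsfile assigns literal values such as myparam = 'foo' or a map literal, since anything else would not resolve inside the GroovyShell:
String projectJenkinsFile = readFile "Jenkinsfile"   // standard Pipeline step
Map config = configFromJenkinsfile(projectJenkinsFile)
echo "myparam = ${config.myparam}"   // whatever was assigned inside someStage { ... }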
I have some JSON data from which I'm able to fetch values with a JSON Extractor post-processor.
The extracted variables look something like this:
group_1 = ["Scooter"]
group_2 = ["Bus","Jeep","Car"]
group_ALL = ["Scooter"],["Bus","Jeep","Car"]
group_matchNr = 2
I want to accumulate all of this data into a single array variable,
Example: groups=["Scooter","Bus","Jeep","Car"]
so that I can loop through the individual values in my API calls with a ForEach Controller.
Example API call in the ForEach loop: https://vehicles.com/${groups}
My question: I'm trying to write a Groovy script that collects all the data into the groups variable, but I'm unable to. I also need help configuring the ForEach Controller.
Groovy script I tried in a JSR223 PostProcessor, which did not work:
import groovy.json.JsonSlurper;
groups= vars.get("group_ALL");
def jsonSlurper = new JsonSlurper();
jsonData = JSON.stringify(vars.get("groups"))
String tempRemoveBr = jsonData.replaceAll("[","");
String tempRemoveBr1 = tempRemoveBr.replaceAll("]","");
String[] arrVehicles = tempRemoveBr1.split(",")
for (int i=0;i<arrVehicles.size();i++)
{
log.info("arrVehicles[i]")
}
vars.put("arrVehicles",arrVehicles)
ForEach Controller configured as:
Input variable : arrVehicles
start index : 0
output variable name : vehicleN
Most probably you could achieve it using the JSON JMESPath Extractor; however, I cannot come up with a proper configuration without seeing your full response.
If you want to process the existing variables, here is a Groovy script you could use:
// parse the two extracted variables back into lists
def group1 = new groovy.json.JsonSlurper().parseText(vars.get('group_1'))
def group2 = new groovy.json.JsonSlurper().parseText(vars.get('group_2'))
// concatenate the lists and store the result as a JSON string in a new variable
def arrVehicles = group1 + group2
vars.put('arrVehicles', new groovy.json.JsonBuilder(arrVehicles).toString())
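If the combined list should drive the ForEach Controller directly, one option (a sketch, not part of the answer above) is to store each value in a numbered variable, because the ForEach Controller iterates over variables named <prefix>_1, <prefix>_2, and so on:
def slurper = new groovy.json.JsonSlurper()
def all = slurper.parseText(vars.get('group_1')) + slurper.parseText(vars.get('group_2'))
// expose arrVehicles_1, arrVehicles_2, ... for the ForEach Controller
all.eachWithIndex { vehicle, i ->
    vars.put('arrVehicles_' + (i + 1), vehicle.toString())
}
The ForEach Controller would then use arrVehicles as the input variable prefix and, for example, vehicleN as the output variable, referenced as ${vehicleN} in the request path.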
More information:
Apache Groovy - Parsing and Producing JSON
Apache Groovy: What Is Groovy Used For?
I have one common Groovy file that contains a few constant variables and functions...
I also have more Groovy files with pipelineJob definitions that use the variables and functions from the common file.
What is the best way to import all the data from the common file into the other files?
I have not tested this with Jenkins, but if Jenkins executes the Groovy script as if by invoking groovy -cp .... myScript.groovy it should work:
utils.groovy:
// notice there's no "def", otherwise the def would be local only
name = 'Joe'
class MyUtils {
    static String greeting(String name) {
        "Hello $name"
    }
}
src/main.groovy
def shell = new GroovyShell(getBinding())
shell.evaluate(new File('utils.groovy'))
println MyUtils.greeting(name)
Running it:
$ groovy src/main.groovy
Hello Joe
Because the Script base class by default also has an evaluate method, you can actually just call that instead of using a GroovyShell, and the result should be identical:
src/main.groovy
evaluate(new File('utils.groovy'))
println MyUtils.greeting(name)
If it doesn't work, it's probably because the Script base class has been changed... the first approach should work in all cases.
What does parameters.script mean in the following Groovy code?
example.groovy
class Example {
    def call(Map parameters) {
        def script = parameters.script
        ...
    }
}
It retrieves the value stored in the Map under the key 'script'. The following are all equivalent:
def script = parameters.script
script = parameters['script']
script = parameters.get('script')
from https://www.timroes.de/2015/06/28/groovy-tutorial-for-java-developers-part3-collections/:
Beside using brackets you can also use dot notation to access entries
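A tiny self-contained illustration (the map contents here are made up; only the access syntax matters):
def parameters = [script: 'myScript', retries: 3]
assert parameters.script == 'myScript'
assert parameters['script'] == 'myScript'
assert parameters.get('script') == 'myScript'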
I want to use external Groovy scripts for scripting.
To avoid copying a lot of code, I want to share classes.
I have:
- external_test.groovy
- Input.groovy
Running external_test.groovy in IntelliJ works.
Input is a simple class:
package helpers

class Input {
    String serviceConfig
    String httpMethod
    String path
    LinkedHashMap headers = [:]
    String payload
    Boolean hasResponseJson
}
When the script is executed by Camunda, it cannot find the class:
import helpers.Input
...
And throws an Exception:
unable to resolve class helpers.Input # line 16, column 9. new helpers.Input(serviceConfig: "camundaService", ^ 1 error
The file is listed in the deployment.
Am I missing something, or is this not supported?
I found a post in the Camunda forum that helped me solve this:
https://forum.camunda.org/t/groovy-files-cant-invoke-methods-in-other-groovy-files-which-are-part-of-same-deployment/7750/5
Here is the solution (which is not really satisfying, as it needs a lot of boilerplate code):
static def getScript(fileName, execution) {
    // look up the deployment that the current process definition belongs to
    def processDefinitionId = execution.getProcessDefinitionId()
    def deploymentId = execution.getProcessEngineServices().getRepositoryService().getProcessDefinition(processDefinitionId).getDeploymentId()
    // read the deployed Groovy resource and turn it into a String
    def resource = execution.getProcessEngineServices().getRepositoryService().getResourceAsStream(deploymentId, fileName)
    def scannerResource = new Scanner(resource, 'UTF-8')
    def resourceAsString = scannerResource.useDelimiter('\\Z').next()
    scannerResource.close()
    // parse the resource so its methods can be called on the returned Script object
    GroovyShell shell = new GroovyShell()
    return shell.parse(resourceAsString)
}
def helper = getScript("helpers/helper_classes.groovy", execution)
helper.myFunction("hello")
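For completeness, a hypothetical helpers/helper_classes.groovy matching the call above could look like this (the file name and myFunction come from the usage example; the body is made up):
// hypothetical content of helpers/helper_classes.groovy
def myFunction(String message) {
    // script-level methods become callable on the Script object returned by shell.parse(...)
    println "helper received: ${message}"
}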
When I try to use SnakeYAML to dump YAML containing Groovy interpolated strings, it ends up printing a class name instead.
For example:
@Grab(group='org.yaml', module='snakeyaml', version='1.16')
import org.yaml.snakeyaml.Yaml
Yaml yaml = new Yaml();
def a = "a"
def list = ["$a"]
def s = yaml.dump(list)
Prints:
- !!org.codehaus.groovy.runtime.GStringImpl
metaClass: !!groovy.lang.MetaClassImpl {}
I'm guessing it has something to do with the fact that GStrings only get transformed to Strings when they are used, and I suspect SnakeYAML uses some sort of introspection to determine the class of the object.
Is there a better solution than calling toString() on all GStrings?
Try to create a new Representer:
public class GroovyRepresenter extends Representer {
    public GroovyRepresenter() {
        // map GString to a Represent implementation that emits a plain string
        // (for example the RepresentString inherited from SafeRepresenter)
        this.representers.put(GString.class, new StringRepresenter());
    }
}
Yaml yaml = new Yaml(new GroovyRepresenter())
...
You could add type information to your variables, so the GString is coerced to a plain String before it is dumped:
Yaml yaml = new Yaml();
def a = "a"
String aStr = "$a"
def list = [aStr]
def s = yaml.dump(list)