How to remember parameter values used on last build in Jenkins/Hudson - groovy

I need to remember the last parameter values when I begin a new build with parameters.
I have two string parameters:
${BRANCH}
${ServerSpecified}
On the first build execution I need those values to be blank, but for the second execution I need the values from the first execution, for the third execution the values from the second, and so on...
Do I need to install a plugin? I have tried using a dynamic parameter with Groovy, but I can't extract the last value. Does anybody know how to do this, or have any other ideas?

There is a Rebuild plugin that allows you to re-build any job of interest. It also lets you modify one or more of the original build parameters before rebuilding.

In order to retrieve parameters from previous executions, you can follow this approach in your pipeline:
def defaultValueForMyParameter = "My_Default_Value"

node('master') {
    parameterValue = params.MY_PARAMETER ?: defaultValueForMyParameter
}

pipeline {
    parameters {
        string(name: 'MY_PARAMETER', defaultValue: parameterValue, description: "whatever")
    }
    ...
}
This code keeps track of the last value used for the parameter, allowing you to change it before or during the run. If the parameter does not exist in the job yet, it will be created and the default value will be assigned to it.
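If you also need to read the values that the previous build actually ran with, a scripted sketch along the following lines can pull them from that run's ParametersAction (an illustrative, untested snippet; currentBuild.rawBuild usually requires script approval on sandboxed instances):
import hudson.model.ParametersAction

// Look up the BRANCH value used by the previous build, defaulting to blank.
def previousBranch = ''
def previousRun = currentBuild.previousBuild   // RunWrapper of the previous build, or null
if (previousRun != null) {
    def parametersAction = previousRun.rawBuild.getAction(ParametersAction)
    previousBranch = parametersAction?.getParameter('BRANCH')?.value ?: ''
}
echo "BRANCH used by the last build: ${previousBranch}"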

Yes, it looks like you are trying to reinvent something like the Version Number Plugin:
This plugin creates a new version number and stores it in the
environment variable whose name you specify in the configuration.
So you can have as many variables as you want.

No one has mentioned the Persistent Parameter plugin, which is the one I use.
It supports string parameters, choice parameters, and more.
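For reference, a declarative sketch using that plugin might look like the following (the persistentString symbol is contributed by the Persistent Parameter plugin, not Jenkins core, and the exact parameter types available can vary by plugin version):
// Sketch assuming the Persistent Parameter plugin is installed; each
// parameter keeps whatever value the previous build used.
pipeline {
    agent any
    parameters {
        persistentString(name: 'BRANCH', defaultValue: '', description: 'Remembers the value from the previous build')
        persistentString(name: 'ServerSpecified', defaultValue: '', description: 'Remembers the value from the previous build')
    }
    stages {
        stage('Build') {
            steps {
                echo "Building ${params.BRANCH} on ${params.ServerSpecified}"
            }
        }
    }
}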

Related

Custom library not recognized inside DSL file in groovyscript part

Context: I'm implementing a Jenkins Pipeline. To define the pipeline's parameters, I implemented a Job DSL file.
In the DSL file, I have a parameter of ActiveChoiceParam type, called ORDERS. This parameter will allow me to choose one or more order numbers at the same time.
Problem: What I want to do is set the values that get rendered for the ORDERS parameter from a custom library. Basically I have a directory my_libraries with a file, orders.groovy. In this file there is an order class, with a references list property that contains my values.
The code in the DSL file is as follows:
def order = new my_libraries.order()

pipelineJob("111_name_of_pipeline") {
    description("pipeline_description")
    keepDependencies(false)
    parameters {
        stringParam("BRANCH", "master", "The branch")
        activeChoiceParam("ORDERS") {
            description("orders references")
            choiceType('CHECKBOX')
            groovyScript {
                script("return order.references")
                fallbackScript("return ['error']")
            }
        }
    }
}
It is also worth mentioning that my custom library works well. For example, if I use a choiceParam as below it works, but of course that is not the behaviour I want, since I need to select multiple choices.
choiceParam("ORDERS", order.references, "Orders references list but single-choice")
How can I make order.references available in the script part of groovyScript?
I've tried using a global instance of the order class and instantiating the class directly in the groovyScript, but with no positive result.
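One workaround to consider (an untested sketch): because the groovyScript body is evaluated later by the Active Choices plugin, outside the scope of the seed job's order variable, you can serialize the list into the script text while the Job DSL is being processed, for example:
// Hypothetical sketch: inline the list literal into the Active Choices script
// at seed-job time, so the rendered script no longer references 'order'.
def order = new my_libraries.order()
def ordersLiteral = order.references.inspect()   // e.g. ['101', '102', '103']

activeChoiceParam("ORDERS") {
    description("orders references")
    choiceType('CHECKBOX')
    groovyScript {
        script("return ${ordersLiteral}")
        fallbackScript("return ['error']")
    }
}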

Conditionally Set Environment Azure DevOps

I am working with an Azure Pipeline in which I need to conditionally set the environment property. I am invoking the pipeline from a REST API call by passing Parameters in the request body, which is documented here. When I try to access that parameter at compile time to set the environment conditionally, though, the variable comes through as empty (I assume it is not accessible at compile time?).
Does anybody know a good way to solve this via the pipeline or the API call?
After some digging I have found the answer to my question, and I hope this helps someone else in the future.
As it turns out, the Build REST API does support template parameters that can be used at compile time; the documentation just doesn't explicitly tell you. The Runs endpoint supports this as well.
My payload for my request ended up looking like:
{
    "Parameters": "{\"Env\":\"QA\"}",
    "templateParameters": { "SkipApproval": "Y" },
    "Definition": {
        "Id": 123
    },
    "SourceBranch": "main"
}
and my pipeline consumed those values at compile time via the following (abbreviated) version of my pipeline:
parameters:
- name: SkipApproval
  default: ''
  type: string
...
${{if eq(parameters.SkipApproval, 'Y')}}:
  environment: NoApproval-All
${{if ne(parameters.SkipApproval, 'Y')}}:
  environment: digitalCloud-qa
This is a common area of confusion for YAML pipelines. Run-time variables need to be accessed using a different syntax:
$[ variables.myVariable ]
YAML pipelines go through several phases.
Compilation - This is where all of the YAML documents (templates, etc.) comprising the final pipeline are compiled into a single document. Final values for parameters and variables using the ${{ }} syntax are inserted into the document.
Runtime - Run-time variables using the $[] syntax are plugged in.
Execution - The final pipeline is run by the agents.
This is a simplification; another explanation from Microsoft is a bit more detailed:
First, expand templates and evaluate template expressions.
Next, evaluate dependencies at the stage level to pick the first stage(s) to run.
For each stage selected to run, two things happen:
All resources used in all jobs are gathered up and validated for authorization to run.
Evaluate dependencies at the job level to pick the first job(s) to run.
For each job selected to run, expand multi-configs (strategy: matrix or strategy: parallel in YAML) into multiple runtime jobs.
For each runtime job, evaluate conditions to decide whether that job is eligible to run.
Request an agent for each eligible runtime job.
...
This ordering helps answer a common question: why can't I use certain variables in my template parameters? Step 1, template expansion, operates solely on the text of the YAML document. Runtime variables don't exist during that step. After step 1, template parameters have been resolved and no longer exist.
[ref: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/runs?view=azure-devops]

Add a map into Jenkins build flow plugin as parameters

I have a question about the Jenkins Build Flow plugin.
There is a default variable called params in the build flow DSL which looks like a map.
What I want to do is pass this map to the jobs I build later; however, build flow doesn't accept a map as parameters.
For example:
build("test_job", params)
The most clumsy way I know is just to paste them in one by one, like build("test_job", "Key1":params[1], "key2":params[2])
Any better ideas for this case?
Br,
Tim
The order is the key here!
You can do this (at least it works for me): use the map of parameters as the first argument:
job_params = [:]
job_params['BRANCH'] = 'The Branch Name'
build( job_params, 'pipelinetester' )
And it works!
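Following the same pattern, a sketch of what the original question was after, forwarding the whole params map, might look like this (untested; it assumes the Build Flow params variable can be copied entry by entry into a plain map):
// Untested sketch: copy every flow parameter into a plain map and pass the
// map as the first argument to build().
def job_params = [:]
params.each { key, value ->
    job_params[key] = value
}
build(job_params, "test_job")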
Try this method:
build("jobname", parameter_name: "your parameter value")
Example:
In your case, if you are using name as the parameter and your value is "abc",
then use
build("job-name", name: "abc")
You can do this by archiving your map from project 1 and copying it using this plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Copy+Artifact+Plugin
Or you can use a shared folder via the plugin:
https://wiki.jenkins-ci.org/display/JENKINS/CopyArchiver+Plugin

Name lookup of the local variable

I have a question about name lookup in Groovy. Consider the following build script:
apply([plugin: 'java'])

def dependenciesClosure = {
    delegate.ext.name = "DelegateName"
    println name
    println delegate.toString()
    project(':api')
}

dependenciesClosure();
dependencies(dependenciesClosure)
The gradle check command produces the output
webapp
project ':webapp'
DelegateName
org.gradle.api.internal.artifacts.dsl.dependencies.DefaultDependencyHandler_Decorated@397ef2
Taking that into account, non-local variable name lookup is performed on the delegate object first and, if the name is not found there, on the enclosing project object. Is that correct?
Correct: Gradle uses a delegate-first resolve strategy within configuration closures. In this case, the delegate is an instance of DependencyHandler. You can see what any given block delegates to by looking at the Gradle DSL documentation.
Edit: To confirm your last point, yes, the build script itself delegates to an instance of Project.
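For illustration, the same delegate-first behaviour can be reproduced in plain Groovy, independent of Gradle (a standalone sketch using Closure.DELEGATE_FIRST):
// Standalone sketch of delegate-first name resolution, mirroring the
// strategy Gradle applies to its configuration closures.
class Handler {
    String name = "DelegateName"
    void greet() { println "hello from the delegate" }
}

name = "ScriptName"          // script-level property (the owner side)

def block = {
    println name             // resolved against the delegate first
    greet()                  // only the delegate has this method
}

block.delegate = new Handler()
block.resolveStrategy = Closure.DELEGATE_FIRST
block()                      // prints "DelegateName" then "hello from the delegate"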

How to get a value from a Puppet resource

I have a problem with my Puppet script.
I would like to get a value that is set in my resource. I declare the resource like this:
define checkxml (
    $account = '',
    $pwd     = template('abc/abc.erb'),
) {
    if empty($pwd) {
        fail('pwd empty')
    }
}
I call it via:
checkxml{"$agtaccount":
account => $agtaccount,
}
I want to get the value of $pwd, which gets its value from the template. If I print the value inside the resource definition it is fine, I get the right value, so the template works.
My problem is getting access to this value after declaring the resource. I saw getparam from stdlib, but it doesn't work for me:
getparam(Checkxml["$agtaccount"], "pwd")
If I try to get the account parameter instead of pwd it works. I think that because I don't declare pwd explicitly when calling the resource, I can't get it back.
How can I retrieve it?
Thanks for your help.
Ugh, this looks dangerous. First off, I'd recommend steering clear of that function and the concept it embodies: it exposes you to evaluation-order dependencies, which can always lead to inconsistent manifest behaviour.
As for retrieving the value itself: that will likely not work when the default is used, because at catalog-building time no value has yet been bound to the parameter.
The resolution of final parameter values is rather involved, so there are lots of things that can go wrong in a manifest that relies on such introspective functionality.
I recommend retrieving the desired value in a more central location (where exactly depends on your manifest structure) and using it both when declaring the Checkxml["$agtaccount"] resource and in the other places where you currently try to extract it.
