I am trying to supply a parameter as the credentialId under the git step of my workflow. I define the following variables as environment variables in my job folder:
stashProject=ssh://git@stash.finra.org:7999/rpt
gitProdCredential=289b9074-c29a-463d-a793-6e926174066c
I have the following lines in my inline Groovy CPS DSL workflow script:
sh 'echo retrieving code using credential: ${gitProdCredential}'
git url: '${stashProject}/etl.git', credentialsId: '${gitProdCredential}', branch: 'feature/workflow'
You can see that the variables are being evaluated properly, since gitProdCredential is echoed and the git retrieval does attempt to fetch from the correct URL, based on the following output:
retrieving code using credential: 289b9074-c29a-463d-a793-6e926174066c
hudson.plugins.git.GitException: Failed to fetch from ssh://git@stash.finra.org:7999/rpt/etl.git
stderr: Permission denied (publickey).
But you can also see it is not authenticating properly. If, however, I hardcode the gitProdCredential like so
git url:'${stashProject}/etl', credentialId: '289b9074-c29a-463d-a793-6e926174066c', branch: 'feature/workflow'
It runs just fine and clones my repo. So somehow the credential ID variable isn't being evaluated properly in the git DSL step, even though it appears to be evaluated correctly in the rest of the workflow. Please advise if I'm missing something.
This is mainly a Groovy issue.
'${gitProdCredential}'
is a literal string with the text ${gitProdCredential}. Probably you meant
"${gitProdCredential}"
or more simply just
gitProdCredential
since there is no point creating a string expression which interpolates a (String-valued) expression and includes nothing else. In this case, however, the variable is not a Groovy variable but an environment variable, so you needed to use
env.gitProdCredential
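To see the difference in plain Groovy (a minimal illustration, independent of Jenkins; the sample value is made up):

def gitProdCredential = 'some-credential-id'
println 'credential: ${gitProdCredential}'   // single quotes: no interpolation, prints ${gitProdCredential} literally
println "credential: ${gitProdCredential}"   // double quotes: GString interpolation, prints the value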
You were probably misled by the fact that
sh 'echo retrieving code using credential: ${gitProdCredential}'
works. But this works only because it is running a Bourne shell script
echo retrieving code using credential: ${gitProdCredential}
and this shell happens to allow environment variables to be expanded using a syntax similar to that which Groovy uses in GString.
As to the incidental expansion of '${stashProject}/etl.git', this is apparently happening in the Git plugin, and is arguably a bug (values passed from a Workflow script should be used as is): some Jenkins plugins expand environment variables in configuration inputs, again using a syntax similar to that used by Groovy.
In summary, what you meant to write was
git url: "${env.stashProject}/etl.git", credentialsId: env.gitProdCredential, branch: 'feature/workflow'
By the way, when using sufficiently new versions of the Credentials plugin, when creating a new credentials item (but not thereafter) you can click the Advanced button to specify a mnemonic ID, which makes working with scripted projects like Workflow more pleasant.
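For instance, if the credentials item had been created with a mnemonic ID (the ID stash-prod-ssh below is purely hypothetical), the step could simply read:

git url: "${env.stashProject}/etl.git", credentialsId: 'stash-prod-ssh', branch: 'feature/workflow'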
I have a Bitbucket pipeline to push a Docker image. I've defined the variable $DOCKERHUB_USERNAME=example.
In my build step I have the line:
VERSION=$(npm run version --workspace=@example/core-web --silent)
When this runs, though, it's replacing @example with @$DOCKERHUB_USERNAME:
VERSION=$(npm run version --workspace=@$DOCKERHUB_USERNAME/core-web --silent)
How can I escape that text so Bitbucket doesn't try to replace it with the variable that's set to the same text? They just coincidentally match; they are not related.
If an environment variable is marked as a secret variable, Bitbucket activates a security feature that masks any accidental print of its value in the logs, replacing it with its variable name.
See https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/#Secured-variable-masking
Note this has no effect on the actual instructions being run: the value is only masked in the pipeline logs that are shown to you.
You should avoid such weak secrets. Using dictionary words that can legitimately show up in the logs will cause this security feature to expose the value of your secret so that it could be inferred even if it was never deliberately printed.
If you do not want to set up a secured value because it is not truly a secret, simply configure the variable as a regular public variable.
In Azure Pipelines, I see that you can access the environment variables from scripts in node.js during a pipeline run. However, I want to actually return a value and then capture/use it.
Does anyone know how to do this? I can't find any references on how to do this in documentation.
For consistency's sake it'd be nice to use node scripts for everything and not go back and forth between node and bash.
Thanks
Okay, I finally figured this out. The Azure documentation is a bit confusing on the topic, but my approach is as follows. In this example, I'm going to make a rather pointless but simple script that sets a variable whose value is the name of the source branch, all lower-cased.
1) Define your variable
Defining a variable can be done simply (though there is a lot of depth to how variables are used; I suggest consulting the Azure documentation on variable creation for more). At the top of your pipeline YAML file you can define it like so:
variables:
  lowerCaseBranchName: ''
This creates an empty variable for use across your jobs. We'll use this variable as our example.
2) Create your script
"Returning a value" from your script simply means outputting it via node's stdout, the output of which will be consumed by the task to set it as a pipeline variable.
An important thing to remember is that any environment variables from the pipeline can be used within Node; they are just renamed and moved under Node's process.env global. For instance, the commonly used Build.SourceBranchName variable in Azure Pipelines is accessible in your Node script as process.env.BUILD_SOURCEBRANCHNAME. This transformation (upper-casing the name and replacing dots with underscores) is uniform across all environment variables.
Here's an example node.js script:
const lowerCaseBranchName = process.env.BUILD_SOURCEBRANCHNAME.toLowerCase();
process.stdout.write(lowerCaseBranchName);
3) Consume the output in the relevant step in Azure Pipelines
To employ that script in a job step, call it with a script task. Remember that a script task in this case is a Bash script (though you can use other shells) that runs node as a command while setting the value of our variable:
- script: |
    echo "##vso[task.setvariable variable=lowerCaseBranchName]$(node path/to/your/script)"
  displayName: 'Get lower case branch name'
Breaking down the syntax
The variable-setting syntax is, in my opinion, extremely ugly, but it is pretty easy to use once you understand it. The basic syntax for setting a variable in a script is the following:
##vso[task.setvariable variable=SOME_VARIABLE_NAME]SOME_VARIABLE_VALUE
Above, SOME_VARIABLE_NAME is the name of our variable (lowerCaseBranchName) as defined in our azure pipeline configuration at the beginning. Likewise, SOME_VARIABLE_VALUE is the value we want to set that variable to.
You could add a line above this one to capture the script output in a shell variable first and then use it to set the pipeline variable; however, I chose to just inline the node call, as you can see in the example above, using the $() command-substitution syntax.
That's it. In following tasks, the pipeline variable lowerCaseBranchName can be utilized using any of the variable syntaxes, such as $(lowerCaseBranchName).
Final result
Defining our variable in our yaml file:
variables:
  lowerCaseBranchName: ''
Our nodejs script:
const lowerCaseBranchName = process.env.BUILD_SOURCEBRANCHNAME.toLowerCase();
process.stdout.write(lowerCaseBranchName);
Our pipeline task implementation/execution of said script:
- script: |
    echo "##vso[task.setvariable variable=lowerCaseBranchName]$(node path/to/your/script)"
  displayName: 'Get lower case branch name'
A following task using its output:
- script: |
    echo "$(lowerCaseBranchName)"
  displayName: 'Output lower case branch name'
This will print the lower-cased branch name to the pipeline console when it runs.
Hope this helps somebody! Happy devops-ing!
I'm trying to get my build repository name as an uppercase string by combining predefined variables and expressions on Azure DevOps, as follows:
variables:
  repoNameUpper: $[upper(variables['Build.Repository.Name'])]

- script: |
    echo $(repoNameUpper)
Yet I get no output from it; what am I doing wrong here?
Yes, I know I could set a variable to achieve what I need using a bash script, yet I think it would not be so cool.
It's because Build.Repository.Name is agent-scoped: it can be used as an environment variable in a script and as a parameter in a build task. In other words, it is not known at plan compile time, only at job execution time.
You can find more info in this GitHub issue.
I would like to print all available properties (and their values) in env object inside Jenkinsfile.
When I do
print env
I get:
org.jenkinsci.plugins.workflow.cps.EnvActionImpl@112cebc2
So it looks like toString is not implemented there. How can I access the properties in this object if I don't know their names?
Make sure you're not running the pipeline script in sandboxed mode and you should be able to use:
env.getEnvironment()
Note, if you're running in sandbox mode in a pipeline, you should approve the method at the script approval page: http://jenkins-host/scriptApproval/
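For example, a minimal scripted-pipeline sketch that dumps every entry (assuming the sandbox is off or the getEnvironment signature has been approved):

node {
    env.getEnvironment().each { name, value ->
        echo "${name} = ${value}"
    }
}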
To retrieve all env properties using a Jenkinsfile written in either the declarative or scripted DSL, you can use:
sh 'env'
or
sh 'printenv'
As said over here: https://stackoverflow.com/a/42138466/618253
A scripted-pipeline way of doing things:
node {
    echo sh(returnStdout: true, script: 'env')
}
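If you actually need this inside a declarative pipeline, roughly the same idea can be wrapped in a script block (a sketch, not the only way to do it):

pipeline {
    agent any
    stages {
        stage('Print environment') {
            steps {
                script {
                    echo sh(returnStdout: true, script: 'env')
                }
            }
        }
    }
}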
I really like being able to run Groovy scripts in Hudson (or Jenkins, but I use Hudson).
For example, see my question about Hudson parameter names: "In Groovy, how do I get the list of parameter names for a given job?"
The thing is, now I'd like to use these Groovy scripts like a keyboard macro or utility. I want to be visiting one of my jobs, hit a special keystroke, and voilà, the Groovy script runs. I leave it as an exercise for myself to pick up parameters from the environment of the current job.
Does anybody out there do this sort of thing, and if so, what strategy has been useful? So far, all I know how to do is bring up the script console, paste in my code, edit the variable containing the name of the Hudson job, and hit "run". Kinda clunky. Suggestions appreciated.
You can use the Jenkins remote access API to do this. The Jenkins wiki describes how to use remote access:
User can execute groovy scripts remotely sending post request to
/script/ url or /scriptText/ to have response returned without the
html wrapping.
$ curl -d "script=<your_script_here>" http://jenkins/script
$ # or
$ curl -d "script=<your_script_here>" http://jenkins/scriptText
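As an illustration of the kind of Groovy you might submit as <your_script_here>, the following sketch lists the parameter names of a job; the job name my-job is a placeholder:

import hudson.model.ParametersDefinitionProperty
import jenkins.model.Jenkins

def job = Jenkins.instance.getItemByFullName('my-job')
def params = job?.getProperty(ParametersDefinitionProperty.class)
params?.parameterDefinitions?.each { println it.name }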
Also, the Jenkins CLI offers the possibility to execute Groovy
scripts remotely using the groovy command, or to execute Groovy
interactively via groovysh.