I have a bitbucket pipeline to push a docker image. I've defined the variable $DOCKERHUB_USERNAME=example
In my build step I have the line:
VERSION=$(npm run version --workspace=@example/core-web --silent)
When this runs, though, it's replacing @example with @$DOCKERHUB_USERNAME:
VERSION=$(npm run version --workspace=@$DOCKERHUB_USERNAME/core-web --silent)
How can I escape that text so Bitbucket doesn't try to replace it with the variable that's set to the same text? It is just coincidentally the same value; they are not related.
If an environment variable is marked as a secured variable, Bitbucket activates a security feature that masks any accidental print of its value in the logs, replacing it with its variable name.
See https://support.atlassian.com/bitbucket-cloud/docs/variables-and-secrets/#Secured-variable-masking
Note this has no effect on the actual instructions being run: the value is only masked in the pipeline logs that are shown to you.
You should avoid such weak secrets. If the secret is a dictionary word that can legitimately show up in the logs, this masking feature ends up revealing it: the value can be inferred from the masked occurrences even if it was never deliberately printed.
If the value is not truly a secret, do not mark it as secured; simply configure it as a regular, unsecured variable.
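For clarity, here is a minimal bitbucket-pipelines.yml sketch (not from the original question) showing that the command still receives the literal text and only the rendered log line is masked:

pipelines:
  default:
    - step:
        script:
          # The log may display this as --workspace=@$DOCKERHUB_USERNAME/core-web,
          # but the command itself runs with the literal workspace name.
          - VERSION=$(npm run version --workspace=@example/core-web --silent)
          - echo "Version is $VERSION"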
Within our pipelines we would like to set a variable based on some user-defined capabilities. For example, agent-1 may store all Python versions under "C:/Python" whereas agent-2 may store all Python versions under "C:/Documents/Python", and a script may need to know the contents of this folder. So, to handle this, we set some user capabilities recording where it's stored.
Agent 1: PYTHON_DIR = C:/Python
Agent 2: PYTHON_DIR = C:/Documents/Python
We would like to extract these from in our azure-pipelines.yml for use in future script steps.
We initially tried using the syntax:
variables:
  PYTHON_EXE: $(PYTHON_DIR)\Python38\...\python.exe
but this simply echoes out as
$(PYTHON_DIR)\Python38\...\python.exe, even after an agent reboot.
In Azure Pipelines, I see that you can access the environment variables from scripts in node.js during a pipeline run. However, I want to actually return a value and then capture/use it.
Does anyone know how to do this? I can't find any references on how to do this in documentation.
For consistency's sake it'd be nice to use node scripts for everything and not go back and forth between node and bash.
Thanks
Okay, I finally figured this out. Azure documentation is a bit confusing on the topic, but my approach is as follows. In this example, I'm going to make a rather pointless simple script that sets a variable whose value is the name of the source branch, but all lower case.
1) Define your variable
Defining a variable can be done simply (though there is a lot of depth to how variables are used, and I suggest consulting the Azure documentation on variable creation for more). At the top of your pipeline YAML file, you can define it as such:
variables:
  lowerCaseBranchName: ''
This creates an empty variable for use across your jobs. We'll use this variable as our example.
2) Create your script
"Returning a value" from your script simply means outputting it via node's stdout, the output of which will be consumed by the task to set it as a pipeline variable.
An important thing to remember is that any environment variables from the pipeline can be used within node; they are just reformatted and exposed under node's process.env global. For instance, the commonly used Build.SourceBranchName variable in Azure Pipelines is accessible in your node script as process.env.BUILD_SOURCEBRANCHNAME. This transformation is uniform across all pipeline variables: dots become underscores and the name is upper-cased.
Here's an example node.js script:
const lowerCaseBranchName = process.env.BUILD_SOURCEBRANCHNAME.toLowerCase();
process.stdout.write(lowerCaseBranchName);
3) Consume the output in the relevant step in azure pipelines
To employ that script in a job step, call it with a script task. Remember that a script task is, in this case, a bash script (though you can use other shells) that runs node as a command and uses its output to set the value of our variable:
- script: |
    echo "##vso[task.setvariable variable=lowerCaseBranchName]$(node path/to/your/script)"
  displayName: 'Get lower case branch name'
Breaking down the syntax
This variable-setting syntax is, in my opinion, extremely ugly, but it is pretty easy to use once you understand it. The basic syntax for setting a variable in a script is the following:
##vso[task.setvariable variable=SOME_VARIABLE_NAME]SOME_VARIABLE_VALUE
Above, SOME_VARIABLE_NAME is the name of our variable (lowerCaseBranchName) as defined in our azure pipeline configuration at the beginning. Likewise, SOME_VARIABLE_VALUE is the value we want to set that variable to.
You could add a line above this one that stores the script output in its own shell variable and then use that to set the pipeline variable; however, I chose to inline the script call using the $() command-substitution syntax, as you can see in the example above.
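For illustration, here is a minimal sketch of that two-step variant (same placeholder script path as above; not part of the original answer):

- script: |
    LOWER_BRANCH=$(node path/to/your/script)
    echo "##vso[task.setvariable variable=lowerCaseBranchName]$LOWER_BRANCH"
  displayName: 'Get lower case branch name (two-step variant)'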
That's it. In following tasks, the variable lowerCaseBranchName can be used with any of the variable syntaxes, such as $(lowerCaseBranchName).
Final result
Defining our variable in our yaml file:
variables:
  lowerCaseBranchName: ''
Our nodejs script:
const lowerCaseBranchName = process.env.BUILD_SOURCEBRANCHNAME.toLowerCase();
process.stdout.write(lowerCaseBranchName);
Our pipeline task implementation/execution of said script:
- script: |
    echo "##vso[task.setvariable variable=lowerCaseBranchName]$(node path/to/your/script)"
  displayName: 'Get lower case branch name'
A following task using its output:
- script: |
    echo "$(lowerCaseBranchName)"
  displayName: 'Output lower case branch name'
This will print the lower-cased branch name to the pipeline console when it runs.
Hope this helps somebody! Happy devops-ing!
I'm trying to get my build repository name as an uppercase string by combining predefined variables and expressions on Azure DevOps as follows:
variables:
  repoNameUpper: $[upper(variables['Build.Repository.Name'])]

- script: |
    echo $(repoNameUpper)
Yet I get no output from it, what am I doing wrong here?
Yes, I know I could set a variable to achieve what I need using a bash script, yet I think it would not be so cool.
It's because Build.Repository.Name is agent-scoped: it can be used as an environment variable in a script and as a parameter in a build task. In other words, it is not known at plan compile time, only at job execution time.
You can find more info in this GitHub issue.
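For reference, a minimal sketch of the bash-script workaround the asker alluded to, using the task.setvariable logging command shown earlier so the value is computed at job execution time (step names are placeholders):

- script: |
    REPO_NAME="$(Build.Repository.Name)"
    REPO_UPPER=$(echo "$REPO_NAME" | tr '[:lower:]' '[:upper:]')
    echo "##vso[task.setvariable variable=repoNameUpper]$REPO_UPPER"
  displayName: 'Set upper-cased repo name at runtime'

- script: |
    echo $(repoNameUpper)
  displayName: 'Output upper-cased repo name'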
I am trying to supply a parameter as the credentialId under the git step of my workflow. I define the following variables as environment variables in my job folder:
stashProject=ssh://git@stash.finra.org:7999/rpt
gitProdCredential=289b9074-c29a-463d-a793-6e926174066c
I have the following lines my inline Groovy CPS DSL workflow script:
sh 'echo retrieving code using credential: ${gitProdCredential}'
git url: '${stashProject}/etl.git', credentialsId: '${gitProdCredential}', branch: 'feature/workflow'
You can see that the variables are being evaluated properly: the gitProdCredential is echoed, and the git retrieval does attempt to fetch from my correct URL, based on the following output:
retrieving code using credential: 289b9074-c29a-463d-a793-6e926174066c
hudson.plugins.git.GitException: Failed to fetch from ssh://git@stash.finra.org:7999/rpt/etl.git
stderr: Permission denied (publickey).
But you can also see it is not authenticating properly. If, however, I hardcode the gitProdCredential like so
git url: '${stashProject}/etl', credentialsId: '289b9074-c29a-463d-a793-6e926174066c', branch: 'feature/workflow'
It runs just fine and clones my repo. So somehow the credentialsId value isn't being evaluated properly in the git DSL function, even though it appears to be in the rest of the workflow. Please advise if I'm missing something.
This is mainly a Groovy issue.
'${gitProdCredential}'
is a literal string with the text ${gitProdCredential}. Probably you meant
"${gitProdCredential}"
or more simply just
gitProdCredential
since there is no point creating a string expression which interpolates a (String-valued) expression and includes nothing else. In this case, however, the variable is not a Groovy variable but an environment variable, so you needed to use
env.gitProdCredential
You were probably misled by the fact that
sh 'echo retrieving code using credential: ${gitProdCredential}'
works. But this works only because it is running a Bourne shell script
echo retrieving code using credential: ${gitProdCredential}
and this shell happens to allow environment variables to be expanded using a syntax similar to that which Groovy uses in GString.
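A quick Groovy sketch of that quoting difference (illustrative only, using a plain Groovy variable rather than an environment variable):

// Single-quoted strings are literal; double-quoted GStrings interpolate.
def gitProdCredential = '289b9074-c29a-463d-a793-6e926174066c'
println '${gitProdCredential}'   // prints: ${gitProdCredential}
println "${gitProdCredential}"   // prints: 289b9074-c29a-463d-a793-6e926174066c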
As to the incidental expansion of '${stashProject}/etl.git', this is apparently happening in the Git plugin, and is arguably a bug (values passed from a Workflow script should be used as is): some Jenkins plugins expand environment variables in configuration inputs, again using a syntax similar to that used by Groovy.
In summary, what you meant to write was
git url: "${env.stashProject}/etl.git", credentialsId: env.gitProdCredential, branch: 'feature/workflow'
By the way, when using sufficiently new versions of the Credentials plugin, when creating a new credentials item (but not thereafter) you can click the Advanced button to specify a mnemonic ID, which makes working with scripted projects like Workflow more pleasant.
We are using Jenkins to automate several of our build and test processes. For some of our processes, the engineer starting the build needs to specify a parameter. But the range of possible and optimal values for that parameter changes throughout the course of the day.
What I would like to do is let the engineer specify a value - if they know an optimal value - or leave it blank and have a value be calculated by an early build step. If the value is calculated, I would like the calculating build step to update the parameter value of the job. That way, all subsequent build steps don't have to worry about using the parameter or calculating it, they just use the parameter regardless.
It looks like the Groovy Script Plugin might be able to do this, but I can't see how I can SET the build parameters, just GET them.
Found the answer: use the EnvInject Plugin. One of the features is a build step that allows you to "inject" parameters into the build job from a settings file. I used one build step to create the settings file, then another build step to inject the new values. Then, all subsequent build steps and post-build operations used the new value.
Update with an example:
To add a new parameter (REPORT_FILE) based on an existing one (JOB_NAME), inject a map with new or modified parameters in the Groovy Script box:
// Setting a map for new build parameters
def paramsMap = [:]
// Set REPORT_FILE based on JOB_NAME
def filename = JOB_NAME.replace(' ','_') + ".html"
paramsMap.put("REPORT_FILE", filename)
// Add or modify other parameters...
return paramsMap
Jenkins does have the ability to parameterize builds. For a string parameter, the developer can leave the field blank, and your build scripts can then check whether the environment variable for the parameter is set. If it is not set, the script can perform whatever calculation is needed (I don't think Jenkins has "pre-build steps") and pass the value along, as sketched below. For a choice parameter, the first line can be something like (Default), and again the build script can test its value and act accordingly.
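For illustration, a minimal shell sketch of that check (the parameter name OPTIMAL_VALUE and the calculation script are hypothetical):

#!/bin/sh
# OPTIMAL_VALUE is a hypothetical string parameter; Jenkins exposes it to the build as an environment variable.
if [ -z "$OPTIMAL_VALUE" ]; then
  # Parameter left blank: fall back to a calculated value (calculate_value.sh is a placeholder).
  OPTIMAL_VALUE=$(./calculate_value.sh)
fi
echo "Using OPTIMAL_VALUE=$OPTIMAL_VALUE"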
Note on (Default)
I tried leaving the first line of the choice box blank, and Jenkins saved it correctly the first time; but when I came back to reconfigure the build, Jenkins ran some kind of trim on the options and the leading blank line was removed, so I settled on (Default).
I hope this helps,
Zachary