Jenkinsfile - Pipeline job stuck at input() step - terraform

I'm running a Terraform pipeline through a Jenkinsfile, where I'm using an input(...) block for user approval before the apply. This is the code snippet:
stage('tf_plan') {
    agent {
        label 'Jenkins-Linux-Dev'
    }
    steps {
        sh(
            label: 'Terraform Plan',
            script: '''
#!/usr/bin/env bash
terraform plan -input=false -no-color -out=plan.tfplan
'''
        )
    }
}
stage('tf_approve') {
    when { expression { return env.Action == 'apply' } }
    options {
        timeout(time: 1, unit: 'MINUTES')
    }
    steps {
        input(
            message: 'Proceed with above Terraform Plan??',
            ok: 'Proceed'
        )
    }
}
stage('tf_apply') {
    agent {
        label 'Jenkins-Linux-Dev'
    }
    when { expression { return env.Action == 'apply' } }
    steps {
        sh(
            label: 'Terraform Apply',
            script: '''
#!/usr/bin/env bash
terraform apply -auto-approve -input=false -no-color plan.tfplan
'''
        )
    }
}
stage('tf_plan') works absolutely fine, but when env.Action == 'apply', the build doesn't move past stage('tf_approve'). It's stuck at the Proceed or Abort prompt and doesn't advance no matter which option I click. Any idea what the problem might be?
Any help would be very much appreciated.
-S

My setup:
Jenkins 2.277.1
Groovy 2.3
Pipeline 2.6
Pipeline Utility Steps 2.6.1
And the following code works fine:
pipeline {
    agent any
    parameters {
        choice(choices: ['-', 'apply'], name: 'Action')
    }
    stages {
        stage('Trigger Promotion') {
            when { expression { return env.Action == 'apply' } }
            options {
                timeout(time: 1, unit: 'MINUTES')
            }
            steps {
                script {
                    input(
                        message: 'Proceed with above Terraform Plan??',
                        ok: 'Proceed'
                    )
                }
            }
        }
    }
}
Therefore, I don't think the issue is with the input step itself. More information is needed about what Jenkins and its workers are doing at that moment; try grabbing the logs of the Jenkins main node.
P.S.: I'd suggest avoiding PascalCase variable names in Groovy; that convention is usually reserved for declaring classes.

Related

how to call a groovy function in an active choice parameter in Jenkins pipeline

I have a requirement where the user has to select multiple resource names from the input block. I tried an Active Choices parameter inside the user input step, and it works when I hardcode the values, but the output is empty when I call a Groovy function to generate the values dynamically. This function returns a list of resource names based on the environment passed earlier at the start of the job, so hardcoding the values isn't an option for my situation. Any ideas on how to call a Groovy function in the Active Choices parameter block?
pipeline {
    agent any
    stages {
        stage('test user input') {
            steps {
                timestamps {
                    script {
                        def userInput = input(
                            parameters: [[
                                $class: 'ChoiceParameter',
                                choiceType: 'PT_CHECKBOX',
                                description: 'Please select the values',
                                filterLength: 1,
                                filterable: false,
                                name: 'testvalues',
                                randomName: 'choice-parameter-37737065277176157',
                                script: [
                                    $class: 'GroovyScript',
                                    fallbackScript: [classpath: [], sandbox: false, script: '''return ['error']'''],
                                    script: [classpath: [], sandbox: false, script: '''
                                        def getvalues() {
                                            return ['values1', 'values2', 'values3', 'values4']
                                        }
                                        def value = getvalues()
                                        return value
                                    ''']
                                ]
                            ]]
                        )
                        println("input: " + userInput)
                    }
                }
            }
        }
    }
}

Jenkins pipeline skip stage if copying fails

Let's say we have a simple pipeline setup like this:
pipeline {
    agent any
    stages {
        stage('Stage1') {
            steps {
                sh '''
                    echo 'Copying files'
                    cp ./file1 ./directory1
                '''
            }
        }
        stage('Stage2') {
            steps {
                sh '''
                    echo 'This stage should still work and run'
                    cp ./directory2/files ./directory2/subdirectory
                '''
            }
        }
        stage('Stage3') { ... }
        ...
    }
}
Whenever I don't have the files in Stage1 or Stage2, it fails the build saying:
'cp cannot stat ./file1 ./directory1' or 'cp cannot stat ./directory2/files ./directory2/subdirectory'
Of course, if the files exist, both stages work perfectly fine. The problem is that when one stage fails, every subsequent stage fails and doesn't even run: if Stage1 fails because the files are missing, everything after it is skipped, and if Stage2 fails, we know Stage1 succeeded, but Stage3 and onwards still fail without running.
Is there a way to make it so that if the cp command fails and the cp cannot stat shows, to just skip the stage and proceed to the next one? Or at least make it so that only that stage fails and it can proceed to build the next stage(s)?
Here is a simple way of skipping a stage when a file does not exist, using the when directive:
pipeline {
    agent any
    stages {
        stage('Stage1') {
            when { expression { fileExists './file1' } }
            steps {
                sh '''
                    echo 'Copying files'
                    cp ./file1 ./directory1
                '''
            }
        }
        stage('Stage2') {
            when { expression { fileExists './directory2/files' } }
            steps {
                sh '''
                    echo 'This stage should still work and run'
                    cp ./directory2/files ./directory2/subdirectory
                '''
            }
        }
        stage('Stage3') {
            steps {
                echo "stage 3"
            }
        }
    }
}
In the above case you have to specify the path twice, once in the when directive and once in the sh step; it is better to handle that in another way, e.g. using variables or closures.
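One way to avoid repeating the path is to define it once as an environment variable; a minimal sketch (the SRC_FILE name is made up for illustration):

```groovy
pipeline {
    agent any
    environment {
        SRC_FILE = './file1'  // defined once, used in both places
    }
    stages {
        stage('Stage1') {
            // the same variable drives both the condition and the copy
            when { expression { fileExists env.SRC_FILE } }
            steps {
                sh 'cp "$SRC_FILE" ./directory1'
            }
        }
    }
}
```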
Because of the restrictions of declarative pipeline, I would recommend using a scripted pipeline instead.
This can be achieved using catchError:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                script {
                    catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                        echo 'Copying files'
                        sh 'cp ./file1 ./directory1'
                    }
                }
            }
        }
        stage('2') {
            steps {
                script {
                    catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                        echo 'This stage should still work and run'
                        sh 'cp ./directory2/files ./directory2/subdirectory'
                    }
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
With the above pipeline script, all stages will be executed. If the cp command fails in stage 1 or stage 2, that particular stage is marked as failed, but all remaining stages still run.
Modified Answer
The following pipeline script includes an sh """ """ block that does not need to be inside the catchError block. Include only those commands inside catchError whose errors you want to catch.
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh """
                    echo 'Hello World!!!!!!!!'
                    curl https://www.google.com/
                """
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    echo 'Copying files'
                    sh 'cp ./file1 ./directory1'
                }
            }
        }
        stage('2') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    echo 'This stage should still work and run'
                    sh 'cp ./directory2/files ./directory2/subdirectory'
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
You could just check whether the file exists before you try to copy it, using a conditional like this:
[ -f ./directory2/files ] && cp ./directory2/files ./directory2/subdirectory || echo "File does not exist"
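Note that the `A && B || C` one-liner has a subtle pitfall: if the file exists but the cp itself fails (e.g. the destination is unwritable), the echo branch runs as well. An explicit if is a safer sketch of the same idea (paths taken from the question):

```shell
#!/usr/bin/env sh
# Copy only when the source file exists; otherwise report and move on.
if [ -f ./directory2/files ]; then
    cp ./directory2/files ./directory2/subdirectory
else
    echo "File does not exist"
fi
```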

Access Jenkins Credentials Environment Variables from Node.js

I have set up some credentials environment variables using the Credentials plugin for Jenkins.
I am using them in my Jenkinsfile like this:
pipeline {
    agent any
    environment {
        DEV_GOOGLE_CLIENT_ID = credentials('DEV_GOOGLE_CLIENT_ID')
        DEV_GOOGLE_CLIENT_SECRET = credentials('DEV_GOOGLE_CLIENT_SECRET')
    }
    stages {
        stage('Install dependencies') {
            steps {
                dir("./codes") {
                    sh 'npm install'
                }
            }
        }
        stage('Stop previous forever process') {
            steps {
                dir("./codes") {
                    sh 'forever stop dev || ls'
                }
            }
        }
        stage('Clean forever logs') {
            steps {
                dir("./codes") {
                    sh 'forever cleanlogs'
                }
            }
        }
        stage('Test') {
            steps {
                dir("./codes") {
                    sh 'npm run test'
                }
            }
        }
    }
}
In my Node.js code, I'm trying to access those environment variables with process.env.DEV_GOOGLE_CLIENT_SECRET, but it's not working; I'm getting undefined ...
Thank you
Which type of credentials did you use: secret text, or username and password?
When using username and password, you can get the username and the password separately from each other like this:
pipeline {
    agent any
    environment {
        DEV_GOOGLE_CLIENT = credentials('DEV_GOOGLE_CLIENT')
    }
    stages {
        stage('Get username and password') {
            steps {
                echo "username is $DEV_GOOGLE_CLIENT_USR"
                echo "password is $DEV_GOOGLE_CLIENT_PSW"
            }
        }
    }
}
I don't know how to call $DEV_GOOGLE_CLIENT_USR and $DEV_GOOGLE_CLIENT_PSW in node.js, sorry.
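For what it's worth, nothing special should be needed on the Node.js side: the _USR/_PSW variables are ordinary environment variables in any sh step of that stage, so a node process started there can read them via process.env. A small sketch (the readCredential helper is made up; the variable name follows the answer above):

```javascript
// Hypothetical helper: read a required environment variable or fail loudly.
function readCredential(name) {
  const value = process.env[name];
  if (value === undefined) {
    throw new Error(`Missing environment variable: ${name}`);
  }
  return value;
}

// Simulate what Jenkins would inject for a username/password credential:
process.env.DEV_GOOGLE_CLIENT_USR = 'demo-user';

console.log(readCredential('DEV_GOOGLE_CLIENT_USR')); // → demo-user
```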
You pass your credentials along this chain: Jenkins credentials > pipeline environment variable > node.js command line parameter > node.js environment variable.
1 Jenkins credentials
Make sure to create a Jenkins credentials as secret text first: https://www.jenkins.io/doc/book/using/using-credentials/
2 pipeline environment variable + node.js command line parameter
pipeline {
    agent any
    environment {
        JENKINS_SECRET_TEXT = credentials('JENKINS_SECRET_TEXT')
    }
    stages {
        stage('Pass secret as command line parameter') {
            steps {
                sh 'SECRET_ENV_VAR="$JENKINS_SECRET_TEXT" node app.js'
            }
        }
    }
}
3 node.js
console.log("SECRET_ENV_VAR:", process.env.SECRET_ENV_VAR);

building jenkins pipeline in nodejs

I need to implement this:
pipeline {
    agent none
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'python:2-alpine'
                }
            }
            steps {
                sh 'python -m py_compile sources/add2vals.py sources/calc.py'
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'qnib/pytest'
                }
            }
            steps {
                sh 'py.test --verbose --junit-xml test-reports/results.xml sources/test_calc.py'
            }
            post {
                always {
                    junit 'test-reports/results.xml'
                }
            }
        }
    }
}
on a Node.js Express project and run unit tests with Mocha and Chai.
This is my code:
pipeline {
    agent { docker { image 'node:6.3' } }
    stages {
        stage('build') {
            steps {
                sh 'npm --version'
            }
        }
    }
}
Can anyone tell me how I should do that? The example is in Python, so I have no idea what I need to do.
I would take a look at the resources on the Jenkins blog. What you are looking at is the Jenkinsfile which sits in the root of your project directory.
https://jenkins.io/doc/tutorials/build-a-node-js-and-react-app-with-npm/
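As a starting point, the Python example can be adapted along these lines; a minimal sketch, assuming the node:6.3 image from your snippet and that your npm test script runs Mocha with a JUnit reporter (e.g. mocha-junit-reporter) writing to test-reports/results.xml — both of those are assumptions about your project:

```groovy
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { docker { image 'node:6.3' } }
            steps {
                sh 'npm install'
            }
        }
        stage('Test') {
            agent { docker { image 'node:6.3' } }
            steps {
                // assumed: "npm test" runs mocha with a JUnit reporter
                // that writes test-reports/results.xml
                sh 'npm test'
            }
            post {
                always {
                    junit 'test-reports/results.xml'
                }
            }
        }
    }
}
```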

How can I create parallel stages in Jenkins scripted pipeline?

I am trying to implement parallelization in my Jenkins pipeline code so that I can run two stages in parallel. I know this is possible in a declarative pipeline, but I am using a scripted pipeline.
I've attempted to implement it by doing something like this:
parallel(
    stage('StageA') {
        echo "This is branch a"
    },
    stage('StageB') {
        echo "This is branch b"
    }
)
When I run this and look at it in Blue Ocean, the stages do not run in parallel; instead, StageB is executed after StageA.
Is it possible to have parallel stages in a scripted Jenkins pipeline? If so, how?
Try this syntax for scripted pipeline:
parallel(
    "StageA": {
        echo "This is branch a"
    },
    "StageB": {
        echo "This is branch b"
    }
)
In Blue Ocean, the two branches should now run side by side, which is what you expect, right?
If you want to see the stages (and console output) in the classic view, you can use stage like this:
parallel(
    "StageA": {
        stage("stage A") {
            echo "This is branch a"
        }
    },
    "StageB": {
        stage("stage B") {
            echo "This is branch b"
        }
    }
)
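If one failing branch should also abort the others, scripted parallel accepts a failFast entry in the same map:

```groovy
parallel(
    failFast: true,  // cancel the remaining branches as soon as one fails
    "StageA": {
        stage("stage A") {
            echo "This is branch a"
        }
    },
    "StageB": {
        stage("stage B") {
            echo "This is branch b"
        }
    }
)
```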
This worked for me:
stage('Check code quality') {
    parallel {
        stage('Run prospector') {
            when {
                expression { params.SKIP_PROSPECTOR == false }
            }
            steps {
                checkout scm
                sh 'echo "Running prospector..."'
                sh 'make dockerized-run-prospector'
            }
        }
        stage('Run Tests') {
            when {
                expression { params.SKIP_TESTS == false }
            }
            steps {
                checkout scm
                sh 'echo "Running tests..."'
                sh 'make dockerized-test'
            }
        }
    }
}
This runs the stages under the parent stage in parallel. Blue Ocean shows them side by side.
