NodeJs Jenkins plug-in is not working with dockerfile agent - node.js

I'm trying to use the NodeJS plugin on Jenkins. I followed the NodeJS documentation and it works fine with its example code, which uses agent any:
pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        nodejs(nodeJSInstallationName: 'NodeJs test') {
          sh 'npm config ls'
        }
      }
    }
  }
}
But if I use a dockerfile agent, like the code below,
pipeline {
  options {
    timeout(time: 1, unit: 'HOURS')
  }
  environment {
    docker_image_name = "myapp-test"
    HTTP_PROXY = "${params.HTTP_PROXY}"
    JENKINS_USER_ID = "${params.JENKINS_USER_ID}"
    JENKINS_GROUP_ID = "${params.JENKINS_GROUP_ID}"
  }
  agent {
    dockerfile {
      additionalBuildArgs '--tag myapp-test --build-arg "JENKINS_USER_ID=${JENKINS_USER_ID}" --build-arg "JENKINS_GROUP_ID=${JENKINS_GROUP_ID}" --build-arg "http_proxy=${HTTP_PROXY}" --build-arg "https_proxy=${HTTP_PROXY}"'
      filename 'Dockerfile'
      dir '.'
      label env.docker_image_name
    }
  }
  stages {
    stage('Build') {
      steps {
        nodejs(nodeJSInstallationName: 'NodeJs test') {
          sh 'npm config ls'
        }
      }
    }
  }
}
it returns an npm: command not found error.
My guess is that it couldn't find the path to Node.js. I wanted to try export PATH=$PATH:?? as well, but I don't know the Node.js path either.
How can I make the NodeJS plugin work with the dockerfile agent?

The NodeJS plugin won't inject itself into a Docker container. However, you could add an ARG build argument to your Dockerfile that takes the version of NodeJS to install. You would then need to get rid of the nodejs step.
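For illustration only, a minimal sketch of that idea, assuming a Debian-based image and the NodeSource install script; the Dockerfile below and the NODE_VERSION argument are not from the original setup:
# Dockerfile (sketch): take the Node.js version as a build argument and install it
FROM debian:bullseye
ARG NODE_VERSION=16
RUN apt-get update && apt-get install -y curl \
    && curl -fsSL https://deb.nodesource.com/setup_${NODE_VERSION}.x | bash - \
    && apt-get install -y nodejs
The pipeline would then pass something like --build-arg "NODE_VERSION=16" in additionalBuildArgs and call sh 'npm config ls' directly, without the nodejs(...) wrapper.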

Thank you fredericrous for the answer. Unfortunately, in my system the Dockerfile can't be modified. But based on your information that
the NodeJS plugin won't inject itself into a Docker container,
I decided to run the NodeJS plugin on another agent instead of the dockerfile agent (running multiple agents).
With the code below I managed to run it successfully.
pipeline {
  options {
    timeout(time: 1, unit: 'HOURS')
  }
  environment {
    docker_image_name = "myapp-test"
    HTTP_PROXY = "${params.HTTP_PROXY}"
    JENKINS_USER_ID = "${params.JENKINS_USER_ID}"
    JENKINS_GROUP_ID = "${params.JENKINS_GROUP_ID}"
  }
  agent {
    dockerfile {
      additionalBuildArgs '--tag myapp-test --build-arg "JENKINS_USER_ID=${JENKINS_USER_ID}" --build-arg "JENKINS_GROUP_ID=${JENKINS_GROUP_ID}" --build-arg "http_proxy=${HTTP_PROXY}" --build-arg "https_proxy=${HTTP_PROXY}"'
      filename 'Dockerfile'
      dir '.'
      label env.docker_image_name
    }
  }
  stages {
    stage('Build') {
      steps {
        sh 'ls'
      }
    }
  }
}
stage('Test') {
  node('master') {
    checkout scm
    try {
      nodejs(nodeJSInstallationName: 'NodeJs test') {
        sh 'npm config ls'
      }
    } finally {
      sh 'echo done'
    }
  }
}

Related

how to write the correct pipeline jenkins docker groovy node

I am rewriting my pipeline as a scripted (node) pipeline, and I need to understand how to perform this step in node. Right now an error is coming from stage('Deploy'):
node {
  checkout scm
  def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
  customImage.inside {
    sh "python ${env.CMD_PARAMS}"
  }
  stage('Deploy') {
    post {
      always {
        allure([
          includeProperties: false,
          jdk: '',
          properties: [],
          reportBuildPolicy: 'ALWAYS',
          results: [[path: 'report']]
        ])
        cleanWs()
      }
    }
  }
}
and this is the old pipeline
pipeline {
  agent { label "slave_first" }
  stages {
    stage("Create container image") {
      steps {
        catchError {
          script {
            docker.build("python-web-tests:${env.BUILD_ID}", "-f Dockerfile .")
          }
        }
      }
    }
    stage("Running and debugging the test") {
      steps {
        sh 'ls'
        sh 'docker run --rm -e REGION=${REGION} -e DATA=${DATA} -e BUILD_DESCRIPTION=${BUILD_URL} -v ${WORKSPACE}:/tmp python-web-tests:${BUILD_ID} /bin/bash -c "python ${CMD_PARAMS} || exit_code=$?; chmod -R 777 /tmp; exit $exit_code"'
      }
    }
  }
  post {
    always {
      allure([
        includeProperties: false,
        jdk: '',
        properties: [],
        reportBuildPolicy: 'ALWAYS',
        results: [[path: 'report']]
      ])
      cleanWs()
    }
  }
}
I tried to transfer the method of creating the Allure report, but nothing worked. With the version above, almost everything turned out fine. Can I also add environment variables to the build, for example the ones specified with -e DATA=${DATA}? How do I add them?
I don't recommend switching from a declarative to a scripted pipeline.
You lose the ability to use the tooling connected with the declarative approach, such as syntax checkers.
If you still want to use the scripted approach, try this:
node('slave_first') {
  stage('Build') {
    checkout scm
    def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
    customImage.inside {
      sh "python ${env.CMD_PARAMS}"
    }
  }
  stage('Deploy') {
    allure([
      includeProperties: false,
      jdk: '',
      properties: [],
      reportBuildPolicy: 'ALWAYS',
      results: [[path: 'report']]
    ])
    cleanWs()
  }
}
There are no post and always directives in scripted pipelines. It's on you to catch all exceptions and set the status of the job. I guess you were using this page: https://www.jenkins.io/doc/book/pipeline/syntax/, but that's a mistake:
this page only covers the declarative approach, and only in a few cases does it show scripted code in the examples.
Also, I don't know if you have a default agent label set in your Jenkins config, but looking at your declarative pipeline I think you missed the 'slave_first' argument in the node step.
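For example, the post/always behaviour can be approximated in a scripted pipeline with try/catch/finally. This is only a sketch built on the snippet above, not a tested job:
node('slave_first') {
  try {
    stage('Build') {
      checkout scm
      def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
      customImage.inside {
        sh "python ${env.CMD_PARAMS}"
      }
    }
  } catch (err) {
    // scripted pipelines have no post { failure { ... } }, so set the status yourself
    currentBuild.result = 'FAILURE'
    throw err
  } finally {
    // this plays the role of post { always { ... } }
    stage('Deploy') {
      allure([
        includeProperties: false,
        jdk: '',
        properties: [],
        reportBuildPolicy: 'ALWAYS',
        results: [[path: 'report']]
      ])
      cleanWs()
    }
  }
}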
those that are specified -e DATA=${DATA} how do I add it
That's a Docker question, not a Jenkins one. If you want to launch a Docker image and then also have access to reports located in that container, you should mount the workspace folder (or file) where those output files land. You should also pass the location of those files to Allure.
I suggest you try this (a sketch of these steps follows below):
mount a subfolder of the workspace into the Docker container
cat the test report file to check it's visible
add the Allure report step, passing that file location to it
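A rough sketch of those three steps in a scripted stage; the report path and variable names are assumptions, so adjust them to your project:
stage('Test') {
  // 1. mount a workspace subfolder into the container and pass env vars with -e
  //    (assumes the tests write their Allure results into /tmp/report inside the container)
  sh 'mkdir -p report'
  sh 'docker run --rm -e DATA=${DATA} -e REGION=${REGION} -v ${WORKSPACE}/report:/tmp/report python-web-tests:${BUILD_ID} /bin/bash -c "python ${CMD_PARAMS}"'
  // 2. check that the report files actually landed in the workspace
  sh 'ls -la report'
  // 3. point allure at that location
  allure([
    includeProperties: false,
    jdk: '',
    properties: [],
    reportBuildPolicy: 'ALWAYS',
    results: [[path: 'report']]
  ])
}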

Jenkins pipeline - Set environment variable in nodejs code

I have a Jenkins pipeline which executes a few Node.js files, like below:
stage('validate_paramters') {
  steps {
    sh 'node ${WORKSPACE}/file1.js'
  }
}
stage('test') {
  steps {
    sh 'node ${WORKSPACE}/file2.js'
  }
}
How can I set variables in file1 that can be accessed inside file2? I tried the approach below, but it gives undefined as the value.
file1.js -
process.env['OPERATIONS'] = "10"
file2.js -
var operations = process.env.OPERATIONS
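One thing worth noting: each sh 'node ...' call starts a separate OS process, so process.env changes made in file1.js are gone once that process exits, which is why file2.js sees undefined. A minimal sketch of one workaround (the stage names match the snippet above, the rest is illustrative): have file1.js print the value, capture it in the pipeline, and hand it to file2.js as a real environment variable.
stage('validate_paramters') {
  steps {
    script {
      // assumes file1.js prints the value, e.g. console.log("10")
      env.OPERATIONS = sh(script: 'node ${WORKSPACE}/file1.js', returnStdout: true).trim()
    }
  }
}
stage('test') {
  steps {
    // file2.js can now read process.env.OPERATIONS
    sh 'node ${WORKSPACE}/file2.js'
  }
}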

Jenkins. Invalid agent type "docker" specified. Must be one of [any, label, none]

My JenkinsFile looks like:
pipeline {
  agent {
    docker {
      image 'node:12.16.2'
      args '-p 3000:3000'
    }
  }
  stages {
    stage('Build') {
      steps {
        sh 'node --version'
        sh 'npm install'
        sh 'npm run build'
      }
    }
    stage('Deliver') {
      steps {
        sh 'readlink -f ./package.json'
      }
    }
  }
}
I used to have Jenkins locally and this configuration worked, but I deployed it to a remote server and get the following error:
WorkflowScript: 3: Invalid agent type "docker" specified. Must be one of [any, label, none] @ line 3, column 9.
docker {
I could not find a solution to this problem on the Internet, please help me
You have to install 2 plugins: Docker plugin and Docker Pipeline.
Go to Jenkins root page > Manage Jenkins > Manage Plugins > Available and search for the plugins. (Learnt from here).
instead of
agent {
  docker {
    image 'node:12.16.2'
    args '-p 3000:3000'
  }
}
try
agent {
  any {
    image 'node:12.16.2'
    args '-p 3000:3000'
  }
}
that worked for me.
For those who are using CasC, you might want to include the following in your plugin declaration:
docker:latest
docker-commons:latest
docker-workflow:latest

Passing parameters from Jenkins CI to npm script

When I run a Jenkins build, I would like to pass COMMIT_HASH and BRANCH_NAME to one of my JavaScript files, publish.js, so that I can remove the hard-coded values for tags and consumerVersion.
Here is my code:
Jenkinsfile
stage('Publish Pacts') {
  steps {
    script {
      sh 'npm run publish:pact -Dpact.consumer.version=${COMMIT_HASH} -Dpact.tag=${env.BRANCH_NAME}'
    }
  }
}
package.json
"scripts": {
"publish:pact": "node ./src/test/pact/publish.js"
}
./src/test/pact/publish.js
let publisher = require('@pact-foundation/pact-node');
let path = require('path');

let opts = {
  providerBaseUrl: `http://localhost:${global.port}`,
  pactFilesOrDirs: [path.resolve(process.cwd(), 'pacts')],
  pactBroker: 'http://localhost:80',
  tags: ["prod", "test"], // $BRANCH_NAME
  consumerVersion: "2.0.0" // $COMMIT_HASH
};

publisher.publishPacts(opts).then(() => {
  console.log("Pacts successfully published");
  done()
});
Does anyone know how to do this?
You can pass CLI arguments to your node script; they end up in process.argv.
Note that npm passes CLI arguments on to the script after two dashes (--).
To illustrate this, consider this example:
Jenkinsfile
stage('Publish Pacts') {
  steps {
    script {
      sh 'npm run publish:pact -- ${COMMIT_HASH} ${BRANCH_NAME}'
    }
  }
}
package.json
"scripts": {
"publish:pact": "node ./src/test/pact/publish.js"
}
publish.js
// process.argv[0] = path to node binary
// process.argv[1] = path to script
console.log('COMMIT_HASH:',process.argv[2]);
console.log('BRANCH_NAME:',process.argv[3]);
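Applied back to the original publish.js, the captured arguments could replace the hard-coded values roughly like this (a sketch, not part of the original answer):
// publish.js (sketch): read the consumer version and tag from the CLI arguments
let publisher = require('@pact-foundation/pact-node');
let path = require('path');

let consumerVersion = process.argv[2]; // COMMIT_HASH
let branchName = process.argv[3];      // BRANCH_NAME

let opts = {
  pactFilesOrDirs: [path.resolve(process.cwd(), 'pacts')],
  pactBroker: 'http://localhost:80',
  tags: [branchName],
  consumerVersion: consumerVersion
};

publisher.publishPacts(opts).then(() => {
  console.log("Pacts successfully published");
});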
I left the cli flags out for simplicity.
Hope this helps

jenkins pipeline nodeJs

My Jenkinsfile script started throwing an npm not found error (it works for Maven but fails at npm).
pipeline {
  environment {
    JENKINS = 'true'
  }
  agent any
  stages {
    stage('change permissions') {
      steps {
        sh "chmod 777 ./mvnw"
      }
    }
    stage('clean') {
      steps {
        sh './mvnw clean install'
      }
    }
    stage('yarn install') {
      steps {
        sh 'npm install -g yarn'
        sh 'yarn install'
      }
    }
    stage('yarn webpack:build') {
      steps {
        sh 'yarn webpack:build'
      }
    }
    stage('backend tests') {
      steps {
        sh './mvnw test'
      }
    }
    stage('frontend tests') {
      steps {
        sh 'yarn test'
      }
    }
  }
}
To fix that, I am trying to set up NodeJS on my Jenkins node. I installed the NodeJS plugin and wrote the script
pipeline {
  agent any
  stages {
    stage('Build') {
      steps {
        nodejs(nodeJSInstallationName: 'Node 6.x', configId: '<config-file-provider-id>') {
          sh 'npm config ls'
        }
      }
    }
  }
}
as shown in https://wiki.jenkins.io/display/JENKINS/NodeJS+Plugin.
I also set up NodeJS in the Global Tool Configuration.
I also tried the solution from "installing node on jenkins 2.0 using the pipeline plugin", and it throws an
Expected to find ‘someKey "someValue"’ @ line 4, column 7.
node {
error, but I am still getting the npm not found error on Jenkins. I am new to Jenkins, so any help is appreciated.
Thanks in advance.
I was able to fix the issue by following this link: https://medium.com/@gustavo.guss/jenkins-starting-with-pipeline-doing-a-node-js-test-72c6057b67d4
It's a bit of a puzzle. ;)
There is a small reference trick: you need to configure Jenkins so that it can see your NodeJS configuration name.
In Global Tool Configuration, you define your NodeJS configuration name; your Jenkinsfile then references that name.
Here is an adapted Jenkinsfile example with the reference:
pipeline {
  agent any
  tools { nodejs "node" }
  stages {
    stage('Cloning Git') {
      steps {
        git 'https://github.com/xxxx'
      }
    }
    stage('Install dependencies') {
      steps {
        sh 'npm i -save express'
      }
    }
    stage('Test') {
      steps {
        sh 'node server.js'
      }
    }
  }
}
Complete case to study: Post at Medium by Gustavo Apolinario
Hope it helps!
If you need a different version of Node.js and npm, you can install the NodeJS plugin for Jenkins.
Go to Manage Jenkins -> Global Tool Configuration and find the NodeJS section.
Select the version you need and name it as you prefer. You can also add npm packages that need to be installed globally.
In a declarative pipeline, just reference the correct version of Node.js to use:
stage('Review node and npm installations') {
  steps {
    nodejs(nodeJSInstallationName: 'node13') {
      sh 'npm -v' // substitute with your code
      sh 'node -v'
    }
  }
}
Full example here: https://pillsfromtheweb.blogspot.com/2020/05/how-to-use-different-nodejs-versions-on.html
