I am building a Jenkins pipeline to test and deploy my Node.js application using Docker containers, but the pipeline gets stuck when a test fails. The behaviour I would expect is that the pipeline finishes without executing the next stages, instead of hanging.
Jenkinsfile:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh '''docker build --tag my-web:$BUILD_NUMBER .
                docker stop my-web && docker rm my-web
                echo "Build step finished"'''
            }
        }
        stage('Unit test') {
            steps {
                sh '''docker build -t my-web-test -f Dockerfile.test .
                docker run --rm my-web-test
                '''
            }
        }
        stage('Run') {
            steps {
                sh '''docker run --name my-web -p 3000:3000 my-web:$BUILD_NUMBER node /var/www/index.js &
                '''
                echo 'RUNNING'
            }
        }
        stage('End') {
            steps {
                echo 'End of pipeline'
            }
        }
    }
}
Dockerfile.test:
FROM node:alpine
RUN mkdir /var/test
WORKDIR /var/test
COPY package.json /var/test/
RUN npm install && npm install -g mocha
COPY src/ /var/test/
CMD ["mocha", "tests/", "--recursive"]
When I trigger the pipeline:
If I remove the Unit test stage from the pipeline, everything works fine and the application starts running.
If I keep the Unit test stage, the test stage runs and reports 14 tests passed and 1 failed, but the pipeline hangs at this step, so the Run stage is never triggered and the pipeline stays in Running status.
14 passing (2s)
1 failing
1) Checking user first-time-login
Should redirect to change-password page:
Error: expected "Location" of "/dashboard/change-password", got "/dashboard"
at Test._assertHeader (node_modules/supertest/lib/test.js:249:12)
at Test._assertFunction (node_modules/supertest/lib/test.js:283:11)
at Test.assert (node_modules/supertest/lib/test.js:173:18)
at localAssert (node_modules/supertest/lib/test.js:131:12)
at /var/test/node_modules/supertest/lib/test.js:128:5
at Test.Request.callback (node_modules/superagent/lib/node/index.js:728:3)
at IncomingMessage.<anonymous> (node_modules/superagent/lib/node/index.js:916:18)
at endReadableNT (_stream_readable.js:1154:12)
at processTicksAndRejections (internal/process/task_queues.js:77:11)
Newer versions of Mocha (4.0+) no longer force the process to exit after the test run, so if something keeps an open handle, such as the server started by the tests, the process keeps running and the next stage is never reached. Run Mocha with the --exit flag:
mocha --exit
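For the setup above, the simplest place to add the flag is the CMD in Dockerfile.test (the rest of that file can stay as posted):
CMD ["mocha", "tests/", "--recursive", "--exit"]
With --exit, a failing run makes mocha return a non-zero exit code, so docker run --rm my-web-test fails, the Unit test stage is marked as failed, and the remaining stages are skipped instead of the pipeline hanging.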
Related question:
I'm running a Docker Compose file on an EC2 instance that contains MySQL and Jenkins images. I also run a Node.js app with pm2; when I start the Node.js server manually on the EC2 instance, everything works properly.
But when I try to deploy the Node.js app through the Jenkins container, the latest code is not deployed. While debugging why, I found one interesting thing:
When I run the pipeline, all commands execute inside the Jenkins container's workspace as the jenkins user (container path: /var/jenkins_home/workspace/main).
My actual Node.js app is located at /home/ubuntu/node-app, but when I deploy via the Jenkins pipeline, the pipeline runs in a different path (/var/jenkins_home/workspace/main).
So my question is: is it possible to execute the pipeline deployment commands against the /home/ubuntu/node-app path rather than the Docker container path?
If changing the path is not possible, how can I point the Jenkins Docker container to the EC2 public IP?
I've shared my Jenkinsfile and Docker Compose configuration below for reference.
Jenkinsfile code:
stages {
    stage('Build') {
        steps {
            sh 'npm install && npm run build'
        }
    }
    stage('Deploy') {
        steps {
            sh "pwd"
            sh 'git pull origin main'
            sh 'pm2 stop server || true'
            sh 'npm install'
            sh 'npm run build'
            sh 'pm2 start build/server.js'
        }
    }
}
Jenkins Docker Compose configuration:
jenkins:
  image: 'jenkins/jenkins:lts'
  container_name: 'jenkins'
  restart: always
  ports:
    - '8080:8080'
    - '50000:50000'
  volumes:
    - jenkins-data:/etc/gitlab
    - /var/run/docker.sock:/var/run/docker.sock
Edit 1:
I tried to change the path in the Jenkinsfile as follows:
cd /home/ubuntu/node-app
but I get the following error:
/var/jenkins_home/workspace/main#tmp/durable-44039790/script.sh: 1: cd: can't cd to /home/ubuntu/node-app
Note: the path /var/jenkins_home/workspace/main is only visible on the EC2 machine after running the command below; normally this path does not exist on the EC2 machine:
docker exec -it jenkins bash
I also tried the following fix:
stage('Deploy') {
    steps {
        sh "cd /home/ubuntu/node-app"
        sh 'git pull origin main'
        sh 'pm2 stop server || true'
        sh 'npm install'
        sh 'npm run build'
        sh 'pm2 start build/server.js'
    }
}
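As an aside, the cd fails for two reasons: /home/ubuntu/node-app only exists on the EC2 host, not inside the Jenkins container's filesystem, and even if it did exist, each sh step starts a fresh shell, so a cd in one sh call does not carry over to the next. If the container really had to see that directory, one possible (untested) workaround would be to bind-mount it in the Compose file, for example:
volumes:
  - jenkins-data:/etc/gitlab
  - /var/run/docker.sock:/var/run/docker.sock
  - /home/ubuntu/node-app:/home/ubuntu/node-app
Even then, node, npm and pm2 would also have to be available inside the container, which is why running the job on an agent on the host, as in the solution below, is the cleaner approach.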
Finally I found a solution for this issue.
The actual issue is that I hadn't created any agent (slave) for the Jenkins pipeline, so the pipeline jobs were running on the built-in (master) node, which in this case is inside the Jenkins Docker container; that is why the jobs were stored under /var/jenkins_home/workspace/main.
I then added a slave agent on the EC2 host and set the customWorkspace path to /home/ubuntu/node-app in the Jenkinsfile. Now my Jenkins pipeline runs in the custom workspace /home/ubuntu/node-app.
My updated Jenkinsfile code:
pipeline {
    agent {
        node {
            label 'agent1'
            customWorkspace '/home/ubuntu/node-app'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'npm install && npm run build'
            }
        }
        stage('Deploy') {
            steps {
                sh 'pm2 restart server'
            }
        }
    }
}
I am building a very simple CI/CD setup using AWS CodePipeline (CodeCommit, CodeBuild, CodeDeploy).
I have a simple Node.js app with a unit test like the one below:
const Allsayings = require('./testAllsayings');

function logPass(tName){
    console.log("PASS - " + tName);
}
function logFail(tName){
    console.log("FAIL - " + tName);
}

// T01 - Search for a saying and succeed
let say01 = new Allsayings();
say01.addQuote("Everyone is looking for something.");
say01.addQuote("Let's try to figure this out together, so help me please");
let output01 = say01.findSaying("Not here");
if (output01.search("Before you embark") > -1){
    logPass("T01");
} else {
    logFail("T01");
}
I want the pipeline to halt (stop the deployment and any further progression) when the unit test fails.
My buildspec:
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 16
    commands:
      - echo Installing
  pre_build:
    commands:
      - echo Installing source NPM dependencies.
      - cd serverSide
      - npm install
  build:
    commands:
      - echo Build started on `date`
      - npm install pm2@latest -g
      # buildspec is able to get into your serverSide folder?
      - ls
      - echo "for debugging ... starting test"
      - node testAllsayings.js
      - echo "test is successful ... "
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - '**/*'
However, my problem is that when I run the pipeline, the CodeBuild stage completes successfully even though I made my unit test fail. Here is part of the CodeBuild log:
[Container] 2022/10/03 00:45:05 Running command echo "for debugging ... starting test"
for debugging ... starting test
[Container] 2022/10/03 00:45:05 Running command node testAllsayings.js
Fail - T01
[Container] 2022/10/03 00:45:05 Running command echo "test is successful ... "
test is successful ...
I read this, and I moved the node testAllsayings.js command to the pre_build phase, but everything still ran without stopping the build or deployment stages.
So I found a solution. For CodeBuild to detect that the unit test failed, the test script has to exit with a non-zero exit code. I added the line below, and now CodeBuild stops when the unit test fails.
function logFail(tName){
    console.log("FAIL - " + tName);
    process.exitCode = 1; // mark the process to exit with a non-zero code
}
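A slightly more explicit variant (a sketch, not the poster's exact code) is to count failures and set the exit code once at the end, which also allows a summary to be printed after all test cases have run:
let failures = 0;
function logPass(tName){
    console.log("PASS - " + tName);
}
function logFail(tName){
    console.log("FAIL - " + tName);
    failures++;
}

// ... run the test cases as above ...

// Any failure makes `node testAllsayings.js` return a non-zero exit code,
// which fails the corresponding CodeBuild command and stops the pipeline.
if (failures > 0) {
    console.log(failures + " test(s) failed");
    process.exitCode = 1;
}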
I'm pretty new to Docker and Jenkins, but I wanted to see if I could get a Node app automatically deployed and running on my Raspberry Pi. In an ideal world, I'd like Jenkins to pull code down from GitHub and use a Jenkinsfile and Dockerfile to build and run the Docker image (hopefully this is possible).
Jenkinsfile:
pipeline {
    agent {
        dockerfile true
    }
    environment {
        CI = 'true'
        HOME = '.'
    }
    stages {
        stage('Install dependencies') {
            steps {
                sh 'npm install'
            }
        }
        stage('Test') {
            steps {
                sh './scripts/test'
            }
        }
        stage('Build Container') {
            steps {
                sh 'docker build -t test-app:${BUILD_NUMBER} .'
            }
        }
    }
}
Dockerfile:
# Create image based on the official Node image
FROM node:12
# Create a directory where our app will be placed
RUN mkdir -p /usr/src/app
# Change directory so that our commands run inside this new directory
WORKDIR /usr/src/app
# Copy dependency definitions
COPY package.json /usr/src/app
# Install dependencies
RUN npm install
# Get all the code needed to run the app
COPY . /usr/src/app
# Expose the port the app runs in
EXPOSE 3000
# Serve the app
CMD ["npm", "start"]
However, when I try to run this in Jenkins, I get the following error: ../script.sh: docker: not found. This happens for any docker command. I also tried a command starting with sudo, and it complained sudo: not found. Is there a step missing, or am I going about this the wrong way? (Note: Docker is installed on the Raspberry Pi. I can log in as the jenkins user and run docker commands; it just doesn't work through the web UI.) Any advice would be appreciated.
Thanks!
Apparently this section was breaking it:
agent {
    dockerfile true
}
When I set this instead:
agent any
the build finished, including the docker commands, without any issues. I guess I just don't understand how that piece works. Any explanations would be helpful!
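For context: agent { dockerfile true } tells Jenkins to build an image from the Dockerfile in the repository and run every stage inside a container created from that image. The node:12 base image does not contain the docker CLI (or sudo), which is why those commands were not found; with agent any the steps run directly on the Pi, where Docker is installed. If you still want npm and the tests to run inside the container while the image build happens on the host, one possible arrangement (a sketch, not tested on a Raspberry Pi) is to use per-stage agents:
pipeline {
    agent none
    stages {
        stage('Install dependencies and test') {
            // Runs inside the image built from your Dockerfile
            agent { dockerfile true }
            steps {
                sh 'npm install'
                sh './scripts/test'
            }
        }
        stage('Build Container') {
            // Runs directly on the Pi, where the docker CLI is available
            agent any
            steps {
                sh 'docker build -t test-app:${BUILD_NUMBER} .'
            }
        }
    }
}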
I'm trying to build a simple Jenkins continuous-integration pipeline script for one of my Laravel projects. It involves a couple of simple steps:
build
compile assets
test
deploy
The process works fine until it reaches the asset compilation; at that point the whole process hangs and never runs to the end.
As background, I'm using CentOS 7 with Node 10.0.0 for the asset compilation. Here is the Jenkins pipeline snippet:
node {
    stage('Install dependencies') {
        // Run Composer
        sh 'rm -rf vendor'
        sh 'composer install'
        //sh 'cp .env.example .env'
        sh 'php artisan key:generate'
    }
    stage('Compile Assets') {
        env.NODE_ENV = "test"
        print "Environment will be : ${env.NODE_ENV}"
        sh 'rm -rf node_modules'
        sh 'node -v'
        sh 'yarn install --ignore-engines'
    }
    stage('Run PHP Tests') {
        sh "vendor/bin/phpunit"
    }
}
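One way to make a hang like this easier to diagnose (a sketch, not a fix for the underlying cause) is to wrap the suspect stage in Jenkins' timeout step, so the build fails with an explicit timeout instead of staying in a running state indefinitely; the 15-minute limit below is an arbitrary assumption:
stage('Compile Assets') {
    env.NODE_ENV = "test"
    print "Environment will be : ${env.NODE_ENV}"
    // Abort the stage if the asset build has not finished within 15 minutes
    timeout(time: 15, unit: 'MINUTES') {
        sh 'rm -rf node_modules'
        sh 'node -v'
        sh 'yarn install --ignore-engines'
    }
}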
This is my first Jenkins pipeline project. I created a simple Node.js application and uploaded it to GitHub (public repo), and all I am trying to do in my Jenkinsfile is run npm install in the Build stage. I believe Jenkins is finding the Jenkinsfile; it just isn't finding npm. I am using the official Jenkins Docker image to run my Jenkins server. Here are the two plugins I have installed:
1) NodeJS Plugin and 2) Pipeline NPM Integration Plugin
and here is the file:
pipeline {
    agent any
    stages {
        stage ("Build") {
            steps {
                sh "npm install"
            }
        }
    }
}
and this is the error I get when I run 'Build Now':
[second project] Running shell script
+ npm install
/var/jenkins_home/workspace/second project#tmp/durable-ef33ffd4/script.sh: 2: /var/jenkins_home/workspace/second project#tmp/durable-ef33ffd4/script.sh:
npm: not found
can someone help?
I suppose your npm binary isn't on the PATH.
Try specifying the full path to npm; usually it's /usr/bin:
pipeline {
    agent any
    stages {
        stage ("Build") {
            steps {
                sh "/usr/bin/npm install"
            }
        }
    }
}
You can check the npm path in a console with the command which npm.
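Since the question mentions the NodeJS Plugin is already installed, another option is to let that plugin put node and npm on the PATH through the tools directive. This is a sketch: it assumes a NodeJS installation named 'node16' has been configured under Manage Jenkins > Global Tool Configuration; use whatever name you configured there.
pipeline {
    agent any
    tools {
        // Must match the name of a NodeJS installation configured in Jenkins
        nodejs 'node16'
    }
    stages {
        stage ("Build") {
            steps {
                sh "npm install"
            }
        }
    }
}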
Maybe you have already figured this out, but did you mount your machine's Docker socket into the container when you started the Jenkins container?
Specifically, you need to pass -v /var/run/docker.sock:/var/run/docker.sock to your docker run command.
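For example, a minimal docker run along those lines might look like the following (the ports, volume name and image tag are the usual defaults and are assumptions here; depending on the host, the jenkins user inside the container may also need permission on the Docker socket):
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  -v /var/run/docker.sock:/var/run/docker.sock \
  jenkins/jenkins:lts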
Then, in your pipeline, you can run npm inside a container built from the official Node Docker image, such as node:10.11.0-alpine. Here is an example:
pipeline {
    agent {
        docker {
            image 'node:10.11.0-alpine'
        }
    }
    stages {
        stage ("Build") {
            steps {
                sh "npm install"
            }
        }
    }
}
If you are on Windows, try running CMD as Administrator and then installing npm; it should work for you.