Which plugin is required to have docker.build work in Jenkins?

I am using Docker on Mac and have Jenkins running in a Docker container.
The client is interacting with the Docker daemon on the host machine.
I have the following plugins installed:
docker-plugin
workflow-aggregator
I do have the docker client / command working in the container. I have also checked it using sh, and even Docker cloud can spin up agents.
But the Jenkinsfile below is constantly throwing an error.
def image
pipeline {
    agent {
        label "container"
    }
    stages {
        stage('Build') {
            steps {
                script {
                    image = docker.build("username/image:$BUILD")
                }
            }
        }
    }
}
This is the error message:
groovy.lang.MissingPropertyException: No such property: docker for class: groovy.lang.Binding

Error: No such property: docker for class: groovy.lang.Binding
"No such property: docker" indicates that the Docker Pipeline plugin is not installed.
It's a little confusing, because the names of these three plugins are very similar to each other's IDs:
Docker plugin has the ID docker-plugin
Pipeline plugin has the ID workflow-aggregator
Docker Pipeline plugin has the ID docker-workflow

If you have this issue:
groovy.lang.MissingPropertyException: No such property: docker for class: groovy.lang.Binding
we most likely encountered the same issue. To fix it, I only had to install the Docker Pipeline plugin in Jenkins, so all you have to do is go to:
Jenkins Homepage > Manage Jenkins > Manage Plugins > Available
Search for Docker Pipeline, install it, restart Jenkins, and you are ready to go.
For more info, see the Docker Pipeline plugin documentation.
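If Jenkins itself runs from a Docker image, the plugin can also be preinstalled at image build time with jenkins-plugin-cli, which ships with the official image. A minimal sketch, assuming the jenkins/jenkins:lts base:

```dockerfile
FROM jenkins/jenkins:lts
# Preinstall the Docker Pipeline plugin (ID: docker-workflow),
# which provides the global `docker` variable used by docker.build
RUN jenkins-plugin-cli --plugins docker-workflow
```

This avoids having to click through the plugin manager and restart Jenkins after every container rebuild.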

Related

Is it possible to install and run docker inside node container in Jenkins?

This is a somewhat complicated situation, but I have Jenkins installed inside a Docker container. I'm trying to run some tests for a Node.js app, but the test environment requires docker and docker-compose to be available. At the moment, the Jenkins configuration is done through pipeline code.
So far, I've tried pulling docker inside a stage, as follows:
pipeline {
    agent {
        docker {
            image 'node'
        }
    }
    stages {
        stage("Checkout") {
            steps {
                git url: ....
            }
        }
        stage("Docker") {
            steps {
                script {
                    def image = docker.image('docker')
                    image.pull()
                    image.inside() {
                        sh 'docker --version'
                        sh 'docker-compose --version'
                    }
                }
            }
        }
    }
}
with the error returning 'docker: not found'. I was expecting the script to succeed, because I've tried exactly the same thing with 'agent any' and had no problem; inside the node image, however, it doesn't seem to work.
I'm also not sure if this is the right way to do it, because, as I understand it, running Docker inside Docker is not recommended. One method I have found is to mount the host's Docker socket when starting the container: docker run -v /var/run/docker.sock:/var/run/docker.sock ... However, I am currently running through docker-compose, with the installation steps from https://www.jenkins.io/doc/book/installing/docker/ (instead of individual docker commands, I've combined both jenkins and jenkins-blueocean into a docker-compose file), and that did not work.
At this moment I'm out of ideas, and any solutions or other suggestions as to how to run both Node.js and Docker in the same environment would be greatly appreciated.
You can try to use the docker-in-docker image: https://hub.docker.com/_/docker
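Alternatively, instead of full docker-in-docker, the socket-mount approach mentioned in the question can be expressed directly in the pipeline. A minimal sketch, assuming the agent image has the docker CLI installed (the stock node image does not ship it, so this is an assumption about a customized image):

```groovy
pipeline {
    agent {
        docker {
            image 'node'
            // Mount the host's Docker socket so `docker` commands inside
            // the container talk to the host daemon; the docker CLI itself
            // must still be present in the agent image.
            args '-v /var/run/docker.sock:/var/run/docker.sock'
        }
    }
    stages {
        stage("Docker") {
            steps {
                sh 'docker --version'
            }
        }
    }
}
```

With this setup, containers started from inside the pipeline are siblings of the Jenkins container rather than children, which is the usually recommended pattern.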

Jenkins npm: not found, despite node plugin

I have a multibranch pipeline that uses many agents. The agents are all the same (supposedly) - they are all openstack instances created from the same snapshot. All of these instances have node and npm installed globally, which I confirmed by ssh'ing in to each one under various usernames and checking node -v and npm -v, and getting expected version numbers. I had been running into problems with Jenkins throwing npm: not found, so I followed this answer from "Jenkins unable to find npm", which was to use the node plugin. In my pipeline:
pipeline {
    agent {
        node {
            label 'agent-1'
        }
    }
    options {
        disableConcurrentBuilds()
    }
    tools {
        nodejs 'NodeJS 14.10.1'
    }
    stages {
        stage ('Parallel tests') {
            steps {
                script {
                    parallel parallelStages
                }
            }
        }
    }
}
Where parallelStages is a collection that contains a bunch of parallel stages, each one running on its own agent, and requiring use of npm. To avoid question bloat, I'll leave that code out, but you can see it in my other question here. Adding nodejs under tools helped solve this npm: not found problem the first time.
I just added a new stage to parallelStages, with a newly minted agent. I am again running into that old issue of npm: not found on this new agent only. I am using the nodejs plugin in my tools. I manually checked that npm is available on my new agent. When ssh'ing to the agent, which npm gave the path of /home/jenkins_user/.nvm/versions/node/v14.16.1/bin/npm. This is the case for all my agents. However, when putting sh 'which npm' in the steps for each of the parallel stages, the working nodes return /home/jenkins_user/tools/jenkins.plugins.nodejs.tools.NodeJSInstallation/NodeJS_14.10.1/bin/npm, and the non-working node returns nothing, and the stage fails there.
Just to be sure, I added this path to the tool locations under "Node Properties" in the Jenkins UI.
The error persists. I checked that Jenkins is using the same credentials for all agents, and those credentials indeed work. Why is Jenkins not finding npm? Is this a problem with Jenkins? With my agent? I'm struggling to find the disconnect here.
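One thing worth checking: the NodeJS plugin also provides a build-wrapper step that installs the configured tool on whatever node actually runs the enclosed block, which can help when parallel stages run on different agents than the top-level one. A sketch of one such parallel stage (the stage name and label are placeholders; 'NodeJS 14.10.1' is the tool name from the question):

```groovy
stage('Tests on new agent') {
    agent { label 'new-agent' }
    steps {
        // Ensure the NodeJS tool is installed on this specific node
        // before npm is invoked, rather than relying on the top-level
        // tools block having run there.
        nodejs(nodeJSInstallationName: 'NodeJS 14.10.1') {
            sh 'which npm && npm test'
        }
    }
}
```

If the wrapped step still fails on only one agent, that points at the agent itself (permissions, disk space, or a failed tool download) rather than the pipeline definition.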

Update Docker tag using Docker task on Azure DevOps pipeline

I'm trying to change the tag of a Docker image using a Docker task on an Azure DevOps pipeline, without success.
Consider the following Docker image hosted on an Azure container registry:
My task is configured as follows:
$(DockerImageName) value is agents/standard-linux-docker2:310851
I'm trying to change the Docker image tag (e.g. to latest), but so far I haven't been able to make it work. I've also tried setting the arguments, without success.
Task fails with the following error message:
Error response from daemon: No such image: agents/standard-linux-docker2:310851
/usr/bin/docker failed with return code: 1
What am I missing here?
Try using the Azure CLI task and run the following command, adjusting the names to your registry and repository.
az acr import --name xxxxxacr --source xxxxxacr.azurecr.io/xxx/xxx-api:stage --image xxxxyyyyyyy/yyyyyyyy-api:prod --force
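The original error also hints at the underlying cause: docker tag only operates on images that exist locally, so the image has to be pulled before it can be re-tagged and pushed. The equivalent Docker CLI sequence, shown as a sketch with a placeholder registry name (myregistry), would be:

```shell
# Pull the existing tag from the registry first, so it exists locally
docker pull myregistry.azurecr.io/agents/standard-linux-docker2:310851
# Re-tag it locally, then push the new tag back to the registry
docker tag myregistry.azurecr.io/agents/standard-linux-docker2:310851 \
           myregistry.azurecr.io/agents/standard-linux-docker2:latest
docker push myregistry.azurecr.io/agents/standard-linux-docker2:latest
```

The az acr import approach above avoids the local pull entirely, since the copy happens server-side inside the registry.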

Jenkins - env: ‘node’: No such file or directory

I have a jenkins server that is configured using
https://github.com/shierro/jenkins-docker-examples/tree/master/05-aws-ecs
I am running a Blue Ocean pipeline using a simple Jenkinsfile and the Jenkins NodeJS plugin:
pipeline {
    agent any
    tools {
        nodejs 'node10'
    }
    stages {
        stage ('Checkout Code') {
            steps {
                checkout scm
            }
        }
        stage ('Install dependencies') {
            steps {
                sh "echo $PATH"
                sh "npm install"
            }
        }
    }
}
I made sure to add the node10 global tool as well, which is used above.
When the pipeline gets to the step sh "npm install", I run into this error.
This is the output of the command echo $PATH,
so I think it's not a PATH issue.
It also wasn't able to add the global package.
More info that might help:
Docker Jenkins server: FROM jenkins/jenkins:2.131-alpine
Blue ocean version: 1.7.0
NodeJS Plugin: 1.2.6
Multiple server restarts already
Any ideas why the Jenkins server does not know where node is?
Big thanks in advance!
Thanks to @JoergS for some insight! The culprit in this case is using an Alpine image as the Docker base. Switching from jenkins/jenkins:2.131-alpine to jenkins/jenkins:2.131 solved the NodeJS plugin issue.
I have faced the same issue with jenkinsci/blueocean. I resolved it by installing nodejs with the command below (inside the Docker container), not as a Jenkins plugin:
apk add nodejs
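To make that survive container rebuilds, the package can be baked into a derived image instead of installed by hand. A sketch assuming the jenkinsci/blueocean base (which is Alpine-based; the separate npm package is an assumption that holds on recent Alpine releases):

```dockerfile
FROM jenkinsci/blueocean
# Package installation needs root; switch back to the jenkins user afterwards
USER root
RUN apk add --no-cache nodejs npm
USER jenkins
```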
I have faced the same issue with jenkinsci/blueocean. No Jenkins NodeJS plugin is needed:
pipeline {
    agent any
    stages {
        stage ('Checkout Code') {
            steps {
                checkout scm
            }
        }
        stage ('Install dependencies') {
            steps {
                sh "apk add nodejs"
                sh "echo $PATH"
                sh "npm install"
            }
        }
    }
}
Make a symbolic link like this:
sudo ln -s /var/lib/jenkins/tools/jenkins.plugins.nodejs.tools.NodeJSInstallation/node/bin/node /usr/bin/node
I want to highlight Mitch Downey's comment; it can't remain just a comment, because after spending 4 hours with no solution, this comment helped me resolve the issue:
My issue ended up being with the jenkinsci/blueocean image. I was able
to just replace that image with jenkins/jenkins:lts and the NodeJS
plugin began working as expected

Security profiles in Docker (docker build --security-opt)

I'm trying to build a Docker image from centos:7 that restricts which system commands any user (including root) can execute inside the container. My intention is to build a Docker image with the security profile I need, and then use it as the base image for other application images, so they inherit the security profile. Is this doable? Am I missing something?
Here is a sample security profile I'm testing:
{
    "defaultAction": "SCMP_ACT_ALLOW",
    "syscalls": [
        {
            "name": "mkdir",
            "action": "SCMP_ACT_ERRNO"
        },
        {
            "name": "chown",
            "action": "SCMP_ACT_ERRNO"
        }
    ]
}
When I run:
docker build -t test . --security-opt seccomp:policy.json
it throws an error:
Error response from daemon: The daemon on this platform does not support setting security options on build
Thoughts on how to get past this or other approaches I could use?
From GitHub:
"Docker Engine does not support the parameter --security-opt seccomp= when executing docker build."
@cason: you can supply a custom default profile to the daemon:
--seccomp-profile /path/to/profile.json
https://github.com/moby/moby/issues/34454#issuecomment-321135510
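Since --security-opt is rejected at build time, the daemon-level option makes the custom profile the default for every container the daemon runs, including build containers. Besides the dockerd command-line flag quoted above, the same setting can go in /etc/docker/daemon.json (the path to the profile is a placeholder):

```json
{
    "seccomp-profile": "/path/to/profile.json"
}
```

Note this applies the restrictions daemon-wide rather than per base image, so it does not give the inherit-from-base-image behavior asked about; the profile travels with the host configuration, not the image.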