Clone of 'git@******/common-ui-layout.git' into submodule path 'common-ui-layout' failed - gitlab

Looking for help on Jenkins: I have written a Jenkinsfile where, in one stage, I run sh 'git submodule update --init --recursive', and the job fails with a Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password) error. However, I can see in the log that the common-ui-layout directory is present with full access. From my analysis, it seems that when the submodule command runs it is either not being authenticated or not finding the common-ui-layout folder. I am posting my Jenkinsfile here; please provide a fix for this issue.
pipeline {
    agent {
        label 'agent.com'
    }
    stages {
        stage("submodule clone") {
            steps {
                checkout([
                    $class: 'GitSCM',
                    branches: [[name: 'master']],
                    doGenerateSubmoduleConfigurations: false,
                    extensions: [[
                        $class: 'SubmoduleOption',
                        disableSubmodules: false,
                        parentCredentials: true,
                        recursiveSubmodules: true,
                        reference: '',
                        trackingSubmodules: false
                    ]],
                    submoduleCfg: [],
                    userRemoteConfigs: [[
                        credentialsId: '<***ID****>',
                        url: 'https://*****gitlab.com/****/common-ui-layout.git'
                    ]]
                ])
            }
        }
        stage("fetch data") {
            steps {
                git branch: 'patch-1',
                    credentialsId: '<***ID****>',
                    url: 'https://****.gitlab.com/*****/****.git'
                sh "pwd"
                sh "ls -lat"
            }
        }
        stage("Installing pre-req") {
            steps {
                sh '''
                    yarn install;
                    yarn global add @angular/cli
                '''
            }
        }
        stage('Build app') {
            steps {
                sh "yarn install";
                sh "pwd";
                sh 'git submodule update --init --recursive';
                //sh "git submodule update --recursive --remote";
                sh "yarn run ng build";
                println "BUILD NUMBER = $BUILD_NUMBER"
                println "Build Success.."
            }
        }
    }
}
Please refer to the error snippet here.

The above error was fixed after adding the RSA key. After that I got another error when running the git submodule update --init --recursive command in the pipeline; it fails with the error below:
Cloning into 'common-ui-layout'...
fatal: could not read Username for 'https://xxx.xxx.com': No such device or address
Please refer to the snippet here.
Please suggest where I am going wrong?
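One possible fix, sketched under the assumption that '<***ID****>' is a username/password (or token) credential in Jenkins: the raw sh step runs without any Git credentials, and on a non-interactive agent the HTTPS clone of the submodule has no way to prompt for a username, which is exactly what the "could not read Username" error says. Wrapping the command in withCredentials with a one-off credential helper lets Git reuse the same Jenkins credential:
withCredentials([usernamePassword(credentialsId: '<***ID****>', usernameVariable: 'GIT_USER', passwordVariable: 'GIT_PASS')]) {
    sh '''
        # hand the Jenkins credential to git for this one invocation
        git -c credential.helper='!f() { echo username=$GIT_USER; echo password=$GIT_PASS; }; f' submodule update --init --recursive
    '''
}
Alternatively, the same SubmoduleOption block used in the "submodule clone" stage (with parentCredentials: true) can be attached to the checkout in the "fetch data" stage, so that Jenkins performs the submodule update itself instead of a raw shell call.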

Related

How to write the correct pipeline: Jenkins, Docker, Groovy, node

I am rewriting my pipeline in node; I need to understand how to perform a step with git in node. Now an error is coming from stage('Deploy'):
node {
    checkout scm
    def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
    customImage.inside {
        sh "python ${env.CMD_PARAMS}"
    }
    stage('Deploy') {
        post {
            always {
                allure([
                    includeProperties: false,
                    jdk: '',
                    properties: [],
                    reportBuildPolicy: 'ALWAYS',
                    results: [[path: 'report']]
                ])
                cleanWs()
            }
        }
    }
And this is the old pipeline:
pipeline {
    agent { label "slave_first" }
    stages {
        stage("Creating the container image") {
            steps {
                catchError {
                    script {
                        docker.build("python-web-tests:${env.BUILD_ID}", "-f Dockerfile .")
                    }
                }
            }
        }
        stage("Running and debugging the test") {
            steps {
                sh 'ls'
                sh 'docker run --rm -e REGION=${REGION} -e DATA=${DATA} -e BUILD_DESCRIPTION=${BUILD_URL} -v ${WORKSPACE}:/tmp python-web-tests:${BUILD_ID} /bin/bash -c "python ${CMD_PARAMS} || exit_code=$?; chmod -R 777 /tmp; exit $exit_code"'
            }
        }
    }
    post {
        always {
            allure([
                includeProperties: false,
                jdk: '',
                properties: [],
                reportBuildPolicy: 'ALWAYS',
                results: [[path: 'report']]
            ])
            cleanWs()
        }
    }
}
I tried to transfer the method of creating the Allure report, but nothing worked. Using the version above, almost everything turned out fine. Can you still add environment variables to the build, for example those that are specified with -e DATA=${DATA}? How do I add that?
I don't recommend switching from a declarative to a scripted pipeline.
You lose the ability to use the tooling connected with the declarative approach, such as syntax checkers.
If you still want to use the scripted approach, try this:
node('slave_first') {
    stage('Build') {
        checkout scm
        def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
        customImage.inside {
            sh "python ${env.CMD_PARAMS}"
        }
    }
    stage('Deploy') {
        allure([
            includeProperties: false,
            jdk: '',
            properties: [],
            reportBuildPolicy: 'ALWAYS',
            results: [[path: 'report']]
        ])
        cleanWs()
    }
}
There are no post and always directives in scripted pipelines; it is on you to catch all exceptions and set the status of the job. I guess you were using this page: https://www.jenkins.io/doc/book/pipeline/syntax/, but that's a mistake.
That page only covers the declarative approach, and only in a few cases do its examples show scripted code.
Also, I don't know whether you have a default agent label set in your Jenkins config, but looking at your declarative pipeline I think you missed the 'slave_first' argument in the node call.
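As a minimal sketch of what catching it yourself can look like (reusing the allure and cleanWs steps from the snippet above; the try/finally block plays the role of post { always { ... } }):
node('slave_first') {
    try {
        stage('Build') {
            checkout scm
            // build and test steps go here
        }
    } finally {
        // runs whether the stage above passed or failed, like post { always }
        allure([reportBuildPolicy: 'ALWAYS', results: [[path: 'report']]])
        cleanWs()
    }
}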
"those that are specified -e DATA=${DATA} how do I add it"
That's a Docker question, not a Jenkins one. If you want to launch a Docker image and then also have access to reports located in that container, you should mount the workspace folder/file where those output files land. You should also pass the location of those files to Allure.
I suggest you try the following (see the sketch after this list):
mount some subfolder of the workspace into the docker container
cat the test report file to check that it is visible
add the allure report step, passing that file location to it
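A minimal sketch of those three steps, assuming the image name and environment variables from the question, and assuming the tests write their results to /tmp/report inside the container:
stage('Run tests') {
    // 1. mount a workspace subfolder into the container and pass the -e variables through
    sh 'docker run --rm -e REGION=${REGION} -e DATA=${DATA} -v ${WORKSPACE}/report:/tmp/report python-web-tests:${BUILD_ID} /bin/bash -c "python ${CMD_PARAMS}"'
    // 2. confirm the report files are visible on the agent
    sh 'ls -la report'
    // 3. point the allure step at the mounted location
    allure([reportBuildPolicy: 'ALWAYS', results: [[path: 'report']]])
}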

Gitlab checkout.groovy fails after upgrade to 12.1.17 from 12.0.1

In Jenkins I had an analysis job. The job used to check out and build the merge request sent to the target branch. However, after upgrading GitLab from version 12.0.1 to 12.1.17, I am unable to check out the source branch.
Below is the Groovy script I was using.
#!/usr/bin/env groovy
def call() {
    if (env.gitlabMergeRequestId) {
        sh "echo '${env.gitlabMergeRequestId}'"
        sh "echo 'Merge request detected. Merging...'"
        def credentialsId = scm.userRemoteConfigs[0].credentialsId
        checkout([
            $class: 'GitSCM',
            branches: [[name: "${env.gitlabSourceNamespace}/${env.gitlabSourceBranch}"]],
            extensions: [
                [$class: 'PruneStaleBranch'],
                [$class: 'CleanCheckout'],
                [
                    $class: 'PreBuildMerge',
                    options: [
                        fastForwardMode: 'NO_FF',
                        mergeRemote: env.gitlabTargetNamespace,
                        mergeTarget: env.gitlabTargetBranch
                    ]
                ]
            ],
            userRemoteConfigs: [
                [
                    credentialsId: credentialsId,
                    name: env.gitlabTargetNamespace,
                    url: env.gitlabTargetRepoHttpURL
                ],
                [
                    credentialsId: credentialsId,
                    name: env.gitlabSourceNamespace,
                    url: env.gitlabSourceRepoHttpURL
                ]
            ]
        ])
    } else {
        sh "echo 'No merge request detected. Checking out current branch'"
        checkout([
            $class: 'GitSCM',
            branches: scm.branches,
            extensions: [
                [$class: 'PruneStaleBranch'],
                [$class: 'CleanCheckout']
            ],
            userRemoteConfigs: scm.userRemoteConfigs
        ])
    }
}
I was able to solve it by specifying the full ref in branches:
branches: [[name: "refs/heads/${env.gitlabSourceBranch}"]]
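For context, here is a sketch of how that one-line change sits in the merge-request branch of the checkout above (everything except the branches entry is unchanged from the question):
checkout([
    $class: 'GitSCM',
    branches: [[name: "refs/heads/${env.gitlabSourceBranch}"]],
    extensions: [
        [$class: 'PruneStaleBranch'],
        [$class: 'CleanCheckout'],
        [$class: 'PreBuildMerge', options: [
            fastForwardMode: 'NO_FF',
            mergeRemote: env.gitlabTargetNamespace,
            mergeTarget: env.gitlabTargetBranch
        ]]
    ],
    userRemoteConfigs: [
        [credentialsId: credentialsId, name: env.gitlabTargetNamespace, url: env.gitlabTargetRepoHttpURL],
        [credentialsId: credentialsId, name: env.gitlabSourceNamespace, url: env.gitlabSourceRepoHttpURL]
    ]
])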

Jenkins. Invalid agent type "docker" specified. Must be one of [any, label, none]

My Jenkinsfile looks like:
pipeline {
    agent {
        docker {
            image 'node:12.16.2'
            args '-p 3000:3000'
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'node --version'
                sh 'npm install'
                sh 'npm run build'
            }
        }
        stage('Deliver') {
            steps {
                sh 'readlink -f ./package.json'
            }
        }
    }
}
I used to have Jenkins locally and this configuration worked, but I deployed it to a remote server and get the following error:
WorkflowScript: 3: Invalid agent type "docker" specified. Must be one of [any, label, none] @ line 3, column 9.
docker {
I could not find a solution to this problem on the Internet; please help me.
You have to install two plugins: Docker plugin and Docker Pipeline.
Go to the Jenkins root page > Manage Jenkins > Manage Plugins > Available and search for the plugins. (Learnt from here.)
Instead of
agent {
    docker {
        image 'node:12.16.2'
        args '-p 3000:3000'
    }
}
try
agent {
    any {
        image 'node:12.16.2'
        args '-p 3000:3000'
    }
}
That worked for me.
For those that are using CasC, you might want to include these in the plugin declaration:
docker:latest
docker-commons:latest
docker-workflow:latest

replacing file variables by envsubst in jenkins pipeline

I want to replace some variables of the form $variablename in a file at runtime from a Jenkins pipeline script. It seems envsubst is the best fit for my use case. When I execute it from the command line on a Linux server it works fine, but when I execute it through a Jenkins pipeline sh step, nothing happens.
sonar-scanner.properties:
sonar.projectKey=Project:MavenTest$BRANCHNAME
sonar.projectName=MavenTest$BRANCHNAME
Example of Command line on linux box:
$ export BRANCHNAME=develop
$ envsubst '$BRANCHNAME'
Output:
sonar.projectKey=Project:MavenTestdevelop
sonar.projectName=MavenTestdevelop
But when I execute it through the Jenkinsfile as a script, nothing is changed in the file.
Jenkins script:
node {
    stage('checkout') {
        checkout([
            $class: 'GitSCM',
            branches: [[name: ':^(?!origin/master$|origin/develop$).*']],
            doGenerateSubmoduleConfigurations: false,
            extensions: [],
            submoduleCfg: [],
            userRemoteConfigs: [[
                credentialsId: 'c0ce73db-3864-4360-9c17-d87caf8a9ea5',
                url: 'http://172.16.4.158:17990/scm/ctoo/testmaven.git'
            ]]
        ])
    }
    stage('initialize variables') {
        // Configuring the BRANCHNAME variable
        sh 'git name-rev --name-only HEAD > GIT_BRANCH'
        sh label: '', script: 'cut -d \'/\' -f 3 GIT_BRANCH > BRANCH'
        branchname = readFile('BRANCH').trim()
        env.BRANCHNAME = branchname
    }
    stage('build & SonarQube analysis') {
        withSonarQubeEnv('Sonar') {
            sh "envsubst '$BRANCHNAME' <sonar-scanner.properties"
        }
    }
}
Output:
[Pipeline] sh
envsubst repotest
sonar.projectKey=Project:MavenTest$BRANCHNAME
sonar.projectName=MavenTest$BRANCHNAME
Can someone please help me?
Hi, I have no idea about envsubst, but this can be achieved by passing the sonar parameters via the command line to the scanner; see the example below:
withSonarQubeEnv('Sonar') {
    sh "<sonarscanner path> -Dsonar.projectKey=Project:MavenTest$BRANCHNAME"
}
I had this problem and solved it by using the escape character \, for example:
sh "envsubst '\${SERVER_NAME}' < ./config/nginx/nginx.conf.template > ./config/nginx/nginx.conf"

How can I use the Jenkins Copy Artifacts Plugin from within the pipelines (jenkinsfile)?

I am trying to find an example of using the Jenkins Copy Artifacts Plugin from within Jenkins pipelines (workflows).
Can anyone point to sample Groovy code that uses it?
With a declarative Jenkinsfile, you can use the following pipeline:
pipeline {
    agent any
    stages {
        stage('push artifact') {
            steps {
                sh 'mkdir archive'
                sh 'echo test > archive/test.txt'
                zip zipFile: 'test.zip', archive: false, dir: 'archive'
                archiveArtifacts artifacts: 'test.zip', fingerprint: true
            }
        }
        stage('pull artifact') {
            steps {
                copyArtifacts filter: 'test.zip', fingerprintArtifacts: true, projectName: env.JOB_NAME, selector: specific(env.BUILD_NUMBER)
                unzip zipFile: 'test.zip', dir: './archive_new'
                sh 'cat archive_new/test.txt'
            }
        }
    }
}
Before version 1.39 of CopyArtifact, you must replace the second stage with the following (thanks @Yeroc):
stage('pull artifact') {
    steps {
        step([$class: 'CopyArtifact',
              filter: 'test.zip',
              fingerprintArtifacts: true,
              projectName: '${JOB_NAME}',
              selector: [$class: 'SpecificBuildSelector', buildNumber: '${BUILD_NUMBER}']
        ])
        unzip zipFile: 'test.zip', dir: './archive_new'
        sh 'cat archive_new/test.txt'
    }
}
With CopyArtifact, I use '${JOB_NAME}' as the project name, which is the currently running project.
The default selector used by CopyArtifact takes the last successful build of the project, never the current one (because the current build is not yet successful, or not). With SpecificBuildSelector you can choose '${BUILD_NUMBER}', which contains the build number of the currently running project.
This pipeline works with parallel stages and can manage huge files (I'm using a 300 MB file; it does not work with stash/unstash).
This pipeline works perfectly with my Jenkins 2.74, provided you have all the needed plugins.
If you are using agents on your controller and you want to copy artifacts between them, you can use stash/unstash, for example:
stage 'build'
node {
    git 'https://github.com/cloudbees/todo-api.git'
    stash includes: 'pom.xml', name: 'pom'
}
stage name: 'test', concurrency: 3
node {
    unstash 'pom'
    sh 'cat pom.xml'
}
You can see this example in this link:
https://dzone.com/refcardz/continuous-delivery-with-jenkins-workflow
If the builds are not running in the same pipeline, you can use the CopyArtifact plugin directly; here is an example: https://www.cloudbees.com/blog/copying-artifacts-between-builds-jenkins-workflow, and example code:
node {
    // setup env..
    // copy the deployment unit from another Job...
    step([$class: 'CopyArtifact',
          projectName: 'webapp_build',
          filter: 'target/orders.war'])
    // deploy 'target/orders.war' to an app host
}
name = "/" + "${env.JOB_NAME}"
def archiveName = 'relNum'
try {
step($class: 'hudson.plugins.copyartifact.CopyArtifact', projectName: name, filter: archiveName)
} catch (none) {
echo 'No artifact to copy from ' + name + ' with name relNum'
writeFile file: archiveName, text: '3'
}
def content = readFile(archiveName).trim()
echo 'value archived: ' + content
Try that using the Copy Artifact plugin.
