Why does the Nexus Artifact Uploader store artifacts in a separate directory? - groovy

I want to publish an artifact to Nexus 3 from a Gradle project.
Due to requirements, I publish from Jenkins instead of running gradle publish with the maven-publish plugin in build.gradle.
I created and ran a Jenkins Pipeline script, but the artifact's jar and pom end up stored separately. Why?
I referred to the plugin documentation:
https://plugins.jenkins.io/nexus-artifact-uploader
Here is the pipeline script:
pipeline {
    ...
    stage("publish to nexus") {
        steps {
            script {
                pom = readMavenPom file: "build/pom.xml"
                artifactPath = "build/libs/gs-managing-transactions-0.1.0.jar"
                artifactExists = fileExists artifactPath
                if (artifactExists) {
                    nexusArtifactUploader(
                        nexusVersion: NEXUS_VERSION,
                        protocol: NEXUS_PROTOCOL,
                        nexusUrl: NEXUS_URL,
                        groupId: pom.groupId,
                        version: pom.version,
                        repository: NEXUS_REPOSITORY,
                        credentialsId: NEXUS_CREDENTIAL_ID,
                        artifacts: [
                            [artifactId: pom.artifactId, classifier: '', file: artifactPath, type: pom.packaging],
                            [artifactId: pom.artifactId, classifier: '', file: "build/pom.xml", type: "pom"]
                        ]
                    )
                } else {
                    error "*** File: ${artifactPath}, could not be found"
                }
            }
        }
    }
    ...
}
Part of the execution log:
[Pipeline] nexusArtifactUploader
...
Uploading: http://localhost:7777/repository/maven-snapshots/com/sample/sample-spring-managing-transactions/0.0.1-SNAPSHOT/sample-spring-managing-transactions-0.0.1-20190621.123700-7-debug.jar
...
Uploading artifact gs-managing-transactions-0.1.0.jar completed.
...
Uploading: http://localhost:7777/repository/maven-snapshots/com/sample/sample-spring-managing-transactions/0.0.1-SNAPSHOT/sample-spring-managing-transactions-0.0.1-20190621.123701-8-debug.pom
Uploading artifact pom.xml completed.
Why are the artifacts stored with different snapshot timestamps and build numbers, like "sample-spring-managing-transactions-0.0.1-20190621.123700-7-debug.jar" and "sample-spring-managing-transactions-0.0.1-20190621.123701-8-debug.pom"?
I expected the following:
"sample-spring-managing-transactions-0.0.1-20190621.123700-7-debug.jar" "sample-spring-managing-transactions-0.0.1-20190621.123701-7-debug.pom"

Solved. I had missed this note in the plugin documentation (plugins.jenkins.io/nexus-artifact-uploader): "Uploading maven artifacts snapshots is not supported by this plugin." Each file is uploaded as a separate deployment, so the snapshot repository assigns the jar and the pom different timestamps and build numbers.
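For snapshot versions, one possible workaround (not from the original post; a sketch assuming Maven is available on the agent and that a "nexus-snapshots" server entry with credentials exists in the agent's settings.xml) is to let Maven deploy the jar and its pom in a single invocation, so Nexus assigns both the same snapshot timestamp and build number:

stage("publish snapshot to nexus") {
    steps {
        // deploy-file uploads the jar and its pom as one deployment;
        // the coordinates are read from the pom passed via -DpomFile
        sh """
            mvn deploy:deploy-file \\
                -Durl=${NEXUS_PROTOCOL}://${NEXUS_URL}/repository/${NEXUS_REPOSITORY} \\
                -DrepositoryId=nexus-snapshots \\
                -Dfile=build/libs/gs-managing-transactions-0.1.0.jar \\
                -DpomFile=build/pom.xml
        """
    }
}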

Related

How to write a correct pipeline in Jenkins with Docker and a Groovy node

I am rewriting my pipeline as a scripted node block, and I need to understand how to perform the post step in node. Right now an error is coming from stage('Deploy'):
node {
    checkout scm
    def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
    customImage.inside {
        sh "python ${env.CMD_PARAMS}"
    }
    stage('Deploy') {
        post {
            always {
                allure([
                    includeProperties: false,
                    jdk: '',
                    properties: [],
                    reportBuildPolicy: 'ALWAYS',
                    results: [[path: 'report']]
                ])
                cleanWs()
            }
        }
    }
}
And this is the old pipeline:
pipeline {
    agent { label "slave_first" }
    stages {
        stage("Creating the image container") {
            steps {
                catchError {
                    script {
                        docker.build("python-web-tests:${env.BUILD_ID}", "-f Dockerfile .")
                    }
                }
            }
        }
        stage("Running and debugging the test") {
            steps {
                sh 'ls'
                sh 'docker run --rm -e REGION=${REGION} -e DATA=${DATA} -e BUILD_DESCRIPTION=${BUILD_URL} -v ${WORKSPACE}:/tmp python-web-tests:${BUILD_ID} /bin/bash -c "python ${CMD_PARAMS} || exit_code=$?; chmod -R 777 /tmp; exit $exit_code"'
            }
        }
    }
    post {
        always {
            allure([
                includeProperties: false,
                jdk: '',
                properties: [],
                reportBuildPolicy: 'ALWAYS',
                results: [[path: 'report']]
            ])
            cleanWs()
        }
    }
}
I tried to transfer the method of creating the Allure report, but nothing worked. With the version above, almost everything works. Can I still add environment variables to the build, for example the ones specified with -e DATA=${DATA}? How do I add them?
I don't recommend switching from a declarative to a scripted pipeline.
You lose the tooling that comes with the declarative approach, such as syntax checkers.
If you still want the scripted approach, try this:
node('slave_first') {
    stage('Build') {
        checkout scm
        def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
        customImage.inside {
            sh "python ${env.CMD_PARAMS}"
        }
    }
    stage('Deploy') {
        allure([
            includeProperties: false,
            jdk: '',
            properties: [],
            reportBuildPolicy: 'ALWAYS',
            results: [[path: 'report']]
        ])
        cleanWs()
    }
}
There is no post or always directive in scripted pipelines; it's on you to catch exceptions and set the job status yourself (see the sketch below). I guess you were using this page: https://www.jenkins.io/doc/book/pipeline/syntax/, but that's a mistake: it only covers the declarative approach, with scripted code hidden in a few examples.
Also, I don't know whether you have a default agent label set in your Jenkins config, but looking at your declarative pipeline I think you missed the 'slave_first' argument in the node call.
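A sketch of the usual try/catch/finally substitute for post { always { ... } } in scripted pipelines (the step values are taken from the pipelines above):

node('slave_first') {
    try {
        stage('Build') {
            checkout scm
            def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
            customImage.inside {
                sh "python ${env.CMD_PARAMS}"
            }
        }
    } catch (e) {
        // mirror what declarative does when a stage fails
        currentBuild.result = 'FAILURE'
        throw e
    } finally {
        // runs regardless of the outcome, like post { always { ... } }
        allure([
            includeProperties: false,
            jdk: '',
            properties: [],
            reportBuildPolicy: 'ALWAYS',
            results: [[path: 'report']]
        ])
        cleanWs()
    }
}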
"those that are specified -e DATA=${DATA} how do I add it"
That's a Docker question, not a Jenkins one. If you want to launch a Docker image and then access reports located in the container, you should mount the workspace folder where those output files land, and pass their location to Allure.
I suggest you try this (sketched below):
mount a subfolder of the workspace into the Docker container
cat the test report file to check that it's visible
add the Allure report step, passing that file location to it
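A minimal sketch of both points (the report path is a placeholder; inside() accepts extra docker run arguments, which is where the -e and -v flags go):

node('slave_first') {
    stage('Build') {
        checkout scm
        def customImage = docker.build("python-web-tests:${env.BUILD_ID}")
        // extra docker run arguments: environment variables plus a mount,
        // so the report folder survives after the container exits
        customImage.inside("-e DATA=${env.DATA} -e REGION=${env.REGION} -v ${env.WORKSPACE}/report:/tmp/report") {
            sh "python ${env.CMD_PARAMS}"
            sh 'ls /tmp/report'   // check that the report files are visible
        }
    }
    stage('Deploy') {
        // 'report' matches the workspace folder mounted above
        allure([
            includeProperties: false,
            jdk: '',
            properties: [],
            reportBuildPolicy: 'ALWAYS',
            results: [[path: 'report']]
        ])
        cleanWs()
    }
}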

How to set conditions in a parallel build to proceed to the next stage if one step succeeds

I am creating a declarative pipeline in Jenkins. There are 6 stages in it.
First Stage: Scenario Upload
Second Stage: Pull code from Git
Third Stage: Maven Build
Fourth Stage: It's a parallel stage. The first step launches the mobile emulator and the second step checks whether the device is connected.
Fifth Stage: I want to start this stage only when the second step's build succeeds; otherwise stop the job.
Sixth Stage: Send email
I am stuck with point 5 (Fifth Stage). Please help
pipeline {
    agent any
    stages {
        stage("Scenario Upload") {
            steps {
                script {
                    def inputFile = input message: 'Upload file', parameters: [file(name: 'CyclosAppStatus.xlsx')]
                    new hudson.FilePath(new File("$workspace/Cucumber_BDD master/Result/CyclosAppStatus.xlsx")).copyFrom(inputFile)
                    inputFile.delete()
                }
            }
        }
        stage('Git Pull Code') {
            steps {
                git credentialsId: '708a126a-66bb-4eb5-8826-55cedf6497c3', url: 'https://github.com/divakar-ragupathy/Mobile_Automation_BDD.git'
            }
        }
        stage('Maven Clean Build') {
            steps {
                bat label: '', script: '''Echo Maven Clean Build...
                    cd %WORKSPACE%\\ADB_Devices
                    mvn clean compile'''
            }
        }
        stage('Building Android Setup') {
            steps {
                parallel(
                    Invoke_Emulator: {
                        bat label: '', script: '''Echo Invoking Emulator...
                            @echo off
                            set emulName=%Emulator_Name%
                            echo %emulName%
                            for /f "tokens=1 delims=:" %%e in ("%emulName%") do (
                                %ANDROID_AVD_PATH%emulator -avd "%%e" -no-boot-anim -no-snapshot-save -no-snapshot-load
                            )
                            endlocal'''
                    },
                    Checking_Device: {
                        bat label: '', script: '''Echo Checking Connected Device...
                            cd %WORKSPACE%\\ADB_Devices
                            mvn exec:java -Dexec.mainClass=com.expleo.adbListner.CheckConnectedAdbDevices -Dlog4j.configuration=file:///%WORKSPACE%\\ADB_Devices\\src\\log4j.properties -Dexec.args="%Emulator_Name%"'''
                    }
                )
            }
        }
    }
}
If you declare a variable without the def keyword, it is global. You can use that to store the result of the device check in the earlier stage. In the fifth stage you can then use a when block to test that condition.
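A minimal sketch of that idea (the flag name and the placeholder check command are hypothetical):

stage('Building Android Setup') {
    steps {
        script {
            // no 'def': the variable is global and visible in later stages
            deviceConnected = (bat(returnStatus: true, script: 'echo replace with the device check') == 0)
        }
    }
}
stage('Run Tests') {
    when {
        expression { deviceConnected }   // stage is skipped unless the check succeeded
    }
    steps {
        echo 'Device is connected, continuing...'
    }
}

If the job should fail outright instead of merely skipping the stage, an else branch calling error(...) inside the script block does that.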

Could not resolve org.nodejs : Local build error

I have an Angular app forked from another repository, where they manage all their builds in a pipeline.
I wanted to build it on my local system (laptop) and push the built app to the hosting server.
This is their build.gradle:
node {
    version = "9.4.0"
    npmVersion = "5.6.0"
    download = true
}

task cleanProd(type: Delete) {
    delete "dist"
}

task testProd(type: NodeTask, dependsOn: npmInstall) {
    script = file("${projectDir}/node_modules/@angular/cli/bin/ng")
    args = ["test", "--browsers", "PhantomJS", "--watch=false", "--singleRun=true"]
}

task assembleProd(type: NodeTask, dependsOn: ['npmInstall', 'testProd']) {
    script = file("${projectDir}/node_modules/@angular/cli/bin/ng")
    args = ["build", "--prod", "--vendor-chunk=true"]
}

task copyDist(type: Copy) {
    from "dist/"
    into "dist/fancy-ui-${project.version}"
}

task buildProd(dependsOn: [assembleProd])
I executed the command gradlew cleanProd buildProd copyDist and am stuck with the exception below:
Build Version = build-713-ge359ca9
:cleanProd UP-TO-DATE
:nodeSetup FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':nodeSetup'.
> Could not resolve all dependencies for configuration 'detachedConfiguration1'.
> Could not resolve org.nodejs:node:9.4.0.
Required by:
:portal-ui:build-713-ge359ca9
> Could not resolve org.nodejs:node:9.4.0.
> Could not get resource 'https://nodejs.org/dist/v9.4.0/ivy.xml'.
> Could not GET 'https://nodejs.org/dist/v9.4.0/ivy.xml'.
> nodejs.org
I have all the necessary HTTP proxies in place and there is no connectivity issue as such; it's just that the resource https://nodejs.org/dist/v9.4.0/ivy.xml does not load. The same code and configuration build fine on the Jenkins server.
I had the same problem with Node 10.14.1. Newer Gradle versions ask the Ivy repository that the node plugin registers for ivy.xml metadata, which nodejs.org does not serve. There is a workaround that solves the problem by telling Gradle to derive the metadata from the artifact itself:
repositories.whenObjectAdded {
    if (it instanceof IvyArtifactRepository) {
        metadataSources {
            artifact()
        }
    }
}
Extracted from https://github.com/srs/gradle-node-plugin/issues/301
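In context, the snippet can sit near the top of build.gradle, before the node { } configuration, so the rule is registered before the plugin adds its Ivy repository for the Node.js download (a sketch based on the question's settings):

// build.gradle
repositories.whenObjectAdded {
    if (it instanceof IvyArtifactRepository) {
        metadataSources {
            artifact()   // use the downloaded artifact itself as the metadata source
        }
    }
}

node {
    version = "9.4.0"
    npmVersion = "5.6.0"
    download = true
}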

How can I use the Jenkins Copy Artifacts Plugin from within the pipelines (jenkinsfile)?

I am trying to find an example of using the Jenkins Copy Artifacts Plugin from within Jenkins pipelines (workflows).
Can anyone point to a sample Groovy code that is using it?
With a declarative Jenkinsfile, you can use the following pipeline:
pipeline {
    agent any
    stages {
        stage('push artifact') {
            steps {
                sh 'mkdir archive'
                sh 'echo test > archive/test.txt'
                zip zipFile: 'test.zip', archive: false, dir: 'archive'
                archiveArtifacts artifacts: 'test.zip', fingerprint: true
            }
        }
        stage('pull artifact') {
            steps {
                copyArtifacts filter: 'test.zip', fingerprintArtifacts: true, projectName: env.JOB_NAME, selector: specific(env.BUILD_NUMBER)
                unzip zipFile: 'test.zip', dir: './archive_new'
                sh 'cat archive_new/test.txt'
            }
        }
    }
}
Before version 1.39 of CopyArtifact, you must replace the second stage with the following (thanks @Yeroc):
stage('pull artifact') {
    steps {
        step([$class: 'CopyArtifact',
              filter: 'test.zip',
              fingerprintArtifacts: true,
              projectName: '${JOB_NAME}',
              selector: [$class: 'SpecificBuildSelector', buildNumber: '${BUILD_NUMBER}']
        ])
        unzip zipFile: 'test.zip', dir: './archive_new'
        sh 'cat archive_new/test.txt'
    }
}
With CopyArtifact, I use '${JOB_NAME}' as the project name, which is the currently running project.
The default selector used by CopyArtifact picks the last successful build of the project, never the current one (which is not yet successful, or may never be). With SpecificBuildSelector you can choose '${BUILD_NUMBER}', which holds the current build's number.
This pipeline works with parallel stages and can manage huge files (I'm using a 300 MB file; it does not work with stash/unstash).
It works perfectly with my Jenkins 2.74, provided you have all the needed plugins.
If you are using multiple agents and want to copy artifacts between them within the same pipeline, you can use stash/unstash, for example:
stage 'build'
node {
    git 'https://github.com/cloudbees/todo-api.git'
    stash includes: 'pom.xml', name: 'pom'
}

stage name: 'test', concurrency: 3
node {
    unstash 'pom'
    sh 'cat pom.xml'
}
You can see this example at this link:
https://dzone.com/refcardz/continuous-delivery-with-jenkins-workflow
If the builds are not running in the same pipeline, you can use the CopyArtifact plugin directly; here is an example: https://www.cloudbees.com/blog/copying-artifacts-between-builds-jenkins-workflow and example code:
node {
    // setup env...
    // copy the deployment unit from another Job...
    step([$class: 'CopyArtifact',
          projectName: 'webapp_build',
          filter: 'target/orders.war'])
    // deploy 'target/orders.war' to an app host
}
name = "/" + "${env.JOB_NAME}"
def archiveName = 'relNum'
try {
step($class: 'hudson.plugins.copyartifact.CopyArtifact', projectName: name, filter: archiveName)
} catch (none) {
echo 'No artifact to copy from ' + name + ' with name relNum'
writeFile file: archiveName, text: '3'
}
def content = readFile(archiveName).trim()
echo 'value archived: ' + content
Try that; it uses the Copy Artifact plugin and falls back to writing a default value when there is no artifact to copy.

How to build Groovy JAR w/ Gradle and publish it to in-house repo

I have a Groovy project and am trying to build it with Gradle. First I want a package task that creates a JAR by compiling it against its dependencies. Then I need to generate a Maven POM for that JAR and publish the JAR/POM to an in-house Artifactory repo. The build.gradle:
apply plugin: "groovy"
apply plugin: "maven-publish"
repositories {
maven {
name "artifactory01"
url "http://myartifactory/artifactory/libs-release"
}
}
dependencies {
compile "long list starts here"
}
// Should compile up myapp-<version>.jar
jar {
}
// Should publish myapp-<version>.jar and its (generated) POM to our in-house Maven/Artifactory repo.
publishing {
publications {
myPublication(MavenPublication) {
from components.java
artifact sourceJar {
classifier "source"
}
pom.withXml {
// ???
}
}
}
}
task wrapper(type: Wrapper) {
gradleVersion = '1.11'
}
However, I don't believe I have set up versioning correctly with my jar task (for instance, how do I get it to create myapp-1.2.1 vs. myapp-1.2.2?). I also don't think my publications configuration is set up correctly: what should go in pom.withXml?
You're more than welcome to use the Artifactory plugin for that.
The documentation can be found in our user guide, and below is a full working example of a Gradle build.
Run gradle build artifactoryPublish to build and publish the project.
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath(group: 'org.jfrog.buildinfo', name: 'build-info-extractor-gradle', version: '3.0.1')
    }
}

apply plugin: 'java'
apply plugin: 'maven-publish'
apply plugin: 'com.jfrog.artifactory'

group = 'com.jfrog.example'
version = '1.2-SNAPSHOT'
status = 'SNAPSHOT'

dependencies {
    compile 'org.slf4j:slf4j-api:1.7.5'
    testCompile 'junit:junit:4.11'
}

task sourcesJar(type: Jar, dependsOn: classes) {
    classifier = 'sources'
    from sourceSets.main.allSource
}

publishing {
    publications {
        main(MavenPublication) {
            from components.java
            artifact sourcesJar
        }
    }
}

artifactory {
    contextUrl = 'http://myartifactory/artifactory'
    resolve {
        repository {
            repoKey = 'libs-release'
        }
    }
    publish {
        repository {
            repoKey = 'libs-snapshot-local'
            username = 'whatever'
            password = 'whatever123'
        }
        defaults {
            publications 'main'
        }
    }
}
package is a keyword in Java/Groovy, and you'd have to use a different syntax to declare a task with that name.
In any case, the task declaration for package should be removed, as the jar task already serves that purpose. The jar task configuration (jar { from ... }) should be at the outermost level (not nested inside another task), but from configurations.compile is unlikely to be what you want: it puts the Jars of compile dependencies inside your Jar (which regular Java class loaders can't deal with) rather than merging their contents into it. (Are you even sure you need a fat Jar? See the sketch below.)
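For reference, a common fat-jar idiom in case a fat Jar really is the goal (not part of the original answer; this merges dependency contents instead of nesting Jar files):

jar {
    // unpack each compile dependency and merge its classes/resources into this Jar
    from {
        configurations.compile.collect { it.isDirectory() ? it : zipTree(it) }
    }
}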
Likewise, the publish task declaration should be removed and replaced with publishing { publications { ... } }.
Also, the buildscript block should probably be removed, and repositories { ... } and dependencies { ... } moved to the outermost level. (buildscript { dependencies { ... } } declares dependencies of the build script itself, e.g. Gradle plugins, not of the code to be compiled/run.)
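On the two remaining questions: the Jar's file name comes from the project name plus the version property, so setting version = '1.2.1' makes the jar task produce myapp-1.2.1.jar. And pom.withXml is only needed to inject extra elements into the generated POM; for a plain library it can be dropped entirely. A small sketch (the description text is a placeholder):

version = '1.2.1'   // the jar task then produces myapp-1.2.1.jar

publishing {
    publications {
        myPublication(MavenPublication) {
            from components.java
            pom.withXml {
                // append an optional element to the generated POM
                asNode().appendNode('description', 'My in-house library')
            }
        }
    }
}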
I suggest checking out the many self-contained example builds in the samples directory of the full Gradle distribution (gradle-all).
