How do I get my snapshots in Nexus to appear in the m2eclipse dependency search?

I've been working through the Nexus guide this weekend and I've got everything set up, to the point that I can publish a snapshot to my local Nexus install.
I can't seem to work out how to get m2eclipse to see the snapshot and offer it as an option in the Add Dependencies search panel. How do I do that? Thanks!
In case it's of any use, my settings.xml is as follows:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0
                              http://maven.apache.org/xsd/settings-1.0.0.xsd">
  <localRepository />
  <interactiveMode />
  <usePluginRegistry />
  <offline />
  <pluginGroups />
  <servers>
    <server>
      <id>localSnap</id>
      <username>deployment</username>
      <password>*****</password>
    </server>
  </servers>
  <mirrors>
    <mirror>
      <!-- This sends everything else to /public -->
      <id>nexus</id>
      <mirrorOf>*</mirrorOf>
      <url>http://localhost:8080/nexus/content/groups/public</url>
    </mirror>
  </mirrors>
  <profiles>
    <profile>
      <id>nexus</id>
      <!-- Enable snapshots for the built in central repo to direct -->
      <!-- all requests to nexus via the mirror -->
      <repositories>
        <repository>
          <id>central</id>
          <url>http://central</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>true</enabled>
          </snapshots>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>central</id>
          <url>http://central</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>true</enabled>
          </snapshots>
        </pluginRepository>
      </pluginRepositories>
    </profile>
  </profiles>
  <activeProfiles>
    <!-- make the profile active all the time -->
    <activeProfile>nexus</activeProfile>
  </activeProfiles>
</settings>

I have the answer now: you have to set up Nexus to publish the index (see http://www.sonatype.com/people/2009/09/nexus-scheduled-tasks). Set up the scheduled task to publish the index for clients like m2eclipse, but you must wait until the WAITING status appears in the scheduled-tasks section in Nexus. After an Eclipse restart it should work. Regards, Jakub
BTW: your proxied repositories must have "Download Remote Indexes" set to true, and Nexus must be able to search through the remote indexes... but I suppose you know that.
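To make the snapshot repository itself known to m2eclipse (and not only via the public group), one option is to list it explicitly in the nexus profile of settings.xml so that its index is downloaded once Nexus publishes it. A minimal sketch, assuming the default Nexus hosted-snapshots path and reusing the localSnap id from the settings above; adjust to your install:

<repository>
  <id>localSnap</id>
  <!-- assumed default path of the hosted snapshots repository -->
  <url>http://localhost:8080/nexus/content/repositories/snapshots</url>
  <releases>
    <enabled>false</enabled>
  </releases>
  <snapshots>
    <enabled>true</enabled>
  </snapshots>
</repository>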

Related

Why does Azure DevOps Artifacts only store Maven dependencies from Maven Central? (MuleSoft)

I have configured the Azure DevOps Maven task to connect to an Artifacts feed to store the artifacts and dependencies, but I only see the Maven Central dependencies; none of the others are stored.
Here is the MuleSoft pom.xml:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mycompany</groupId>
  <artifactId>poc</artifactId>
  <version>1.0.1-SNAPSHOT</version>
  <packaging>mule-application</packaging>
  <name>poc</name>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
    <app.runtime>4.3.0</app.runtime>
    <mule.maven.plugin.version>3.3.5</mule.maven.plugin.version>
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-clean-plugin</artifactId>
        <version>3.0.0</version>
      </plugin>
      <plugin>
        <groupId>org.mule.tools.maven</groupId>
        <artifactId>mule-maven-plugin</artifactId>
        <version>${mule.maven.plugin.version}</version>
        <extensions>true</extensions>
        <configuration>
          <sharedLibraries>
            <sharedLibrary>
              <groupId>org.apache.activemq</groupId>
              <artifactId>artemis-jms-client-all</artifactId>
            </sharedLibrary>
            <sharedLibrary>
              <groupId>org.apache.activemq</groupId>
              <artifactId>activemq-broker</artifactId>
            </sharedLibrary>
            <sharedLibrary>
              <groupId>com.microsoft.sqlserver</groupId>
              <artifactId>mssql-jdbc</artifactId>
            </sharedLibrary>
          </sharedLibraries>
          <classifier>mule-application</classifier>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <dependency>
      <groupId>org.mule.connectors</groupId>
      <artifactId>mule-http-connector</artifactId>
      <version>1.5.19</version>
      <classifier>mule-plugin</classifier>
    </dependency>
    <dependency>
      <groupId>org.mule.connectors</groupId>
      <artifactId>mule-sockets-connector</artifactId>
      <version>1.2.0</version>
      <classifier>mule-plugin</classifier>
    </dependency>
    <dependency>
      <groupId>org.mule.modules</groupId>
      <artifactId>mule-apikit-module</artifactId>
      <version>1.3.13</version>
      <classifier>mule-plugin</classifier>
    </dependency>
    <dependency>
      <groupId>org.mule.connectors</groupId>
      <artifactId>mule-jms-connector</artifactId>
      <version>1.7.0</version>
      <classifier>mule-plugin</classifier>
    </dependency>
    <dependency>
      <groupId>org.apache.activemq</groupId>
      <artifactId>activemq-broker</artifactId>
      <version>5.15.4</version>
    </dependency>
    <dependency>
      <groupId>org.apache.activemq</groupId>
      <artifactId>artemis-jms-client-all</artifactId>
      <version>2.10.1</version>
    </dependency>
    <dependency>
      <groupId>org.mule.connectors</groupId>
      <artifactId>mule-db-connector</artifactId>
      <version>1.8.1</version>
      <classifier>mule-plugin</classifier>
    </dependency>
    <dependency>
      <groupId>com.microsoft.sqlserver</groupId>
      <artifactId>mssql-jdbc</artifactId>
      <version>6.2.2.jre8</version>
    </dependency>
  </dependencies>
  <distributionManagement>
    <repository>
      <id>azure-maven</id>
      <url>https://pkgs.dev.azure.com/therevillsgames/therevillsgames/_packaging/azure-maven/maven/v1</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>true</enabled>
      </snapshots>
    </repository>
  </distributionManagement>
  <repositories>
    <repository>
      <id>azure-maven</id>
      <url>https://pkgs.dev.azure.com/therevillsgames/therevillsgames/_packaging/azure-maven/maven/v1</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>true</enabled>
      </snapshots>
    </repository>
    <repository>
      <id>anypoint-exchange-v2</id>
      <name>Anypoint Exchange</name>
      <url>https://maven.anypoint.mulesoft.com/api/v2/maven</url>
      <layout>default</layout>
    </repository>
    <repository>
      <id>mulesoft-releases</id>
      <name>MuleSoft Releases Repository</name>
      <url>https://repository.mulesoft.org/releases/</url>
      <layout>default</layout>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>mulesoft-releases</id>
      <name>mulesoft release repository</name>
      <layout>default</layout>
      <url>https://repository.mulesoft.org/releases/</url>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</project>
The artifacts are stored in Azure Artifacts but, as shown, only those from Maven Central, and we need the others there too.
Is there a way to get the MuleSoft dependencies stored too?
In Azure DevOps Artifacts there are only four public upstream sources: npmjs.com, NuGet.org, Maven Central, and PyPI; that's why the dependencies from MuleSoft are not stored.
As of this time, custom upstream sources are only available for npm.
You can check this document for detailed information.
As said, in Azure DevOps Artifacts there are only four public upstream sources, but you can add a cache for the others:
https://learn.microsoft.com/it-it/azure/devops/pipelines/release/caching?view=azure-devops#maven

Build dependency with a specific profile

I have a web project with a dependency that I want to build with a specific profile, so that it generates some additional files I need in the web project.
To be more specific: a web project with a NetBeans application as a dependency. The NetBeans project has a deployment profile that creates the update center (just a folder with files in it). I want this update center to be added to the war file for deployment.
Is there a way to make the web project build the dependency with this profile so I get the files I need?
Are there other options to make this work?
Update: Example
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
                             http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <!-- The Basics -->
  <groupId>...</groupId>
  <artifactId>...</artifactId>
  <version>...</version>
  <dependencies>
    <dependency>
      <groupId>project-of-interest</groupId>
      <artifactId>project-id</artifactId>
      <version>4.0</version>
      <type>jar</type>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-war-plugin</artifactId>
        <version>2.6</version>
        <configuration>
          <webResources>
            <resource>
              <!-- this is relative to the pom.xml directory -->
              <directory>resource2</directory>
            </resource>
          </webResources>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
project-of-interest has a deployment profile in which the files I need are generated and somehow resource2 points to the location of those files.
My main issue is making sure the files I need are available.
You can build with a specific profile using the Maven command:
mvn clean install -P $profile
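If the goal is to get the profile-generated files into the WAR, another option is a two-step pattern: have project-of-interest's deployment profile attach the update center as a zip artifact, and unpack it into resource2 before the WAR is assembled. A sketch for the web project's POM, assuming a hypothetical update-center classifier under which the zip is attached:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-update-center</id>
      <!-- run before the war is packaged -->
      <phase>generate-resources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>project-of-interest</groupId>
            <artifactId>project-id</artifactId>
            <version>4.0</version>
            <!-- hypothetical: the zip attached by the deployment profile -->
            <classifier>update-center</classifier>
            <type>zip</type>
            <outputDirectory>${project.basedir}/resource2</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>

Note this only works if the dependency has already been built and installed with the deployment profile active (e.g. mvn clean install -P deployment in project-of-interest); Maven will not rebuild a dependency with a different profile as part of the web project's build.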

Linux executable fails using javafx-maven-plugin

I have a multi-module Maven project with JavaFX up and running. I can create a jar file containing all classes that is executable through a Maven assembly, so I know the packaged bundle works.
For convenience, I want to create a native bundle/executable using the javafx-maven-plugin:
<profile>
  <id>build-installer</id>
  <properties>
    <native.output.dir>${project.build.directory}/jfx/native/${project.build.finalName}</native.output.dir>
    <native.output.dir.app>${native.output.dir}/app</native.output.dir.app>
    <native.output.dir.security>${native.output.dir}/runtime/jre/lib/security</native.output.dir.security>
    <native.app.jar>${native.output.dir.app}/${project.build.finalName}-jfx.jar</native.app.jar>
  </properties>
  <dependencies>
    <dependency>
      <groupId>ch.sahits.game</groupId>
      <artifactId>OpenPatricianDisplay</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>com.zenjava</groupId>
        <artifactId>javafx-maven-plugin</artifactId>
        <version>8.1.2</version>
        <configuration>
          <mainClass>ch.sahits.game.OpenPatrician</mainClass>
          <verbose>true</verbose>
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>native</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-antrun-plugin</artifactId>
        <version>1.7</version>
        <executions>
          <execution>
            <id>create zip archive</id>
            <phase>install</phase>
            <goals>
              <goal>run</goal>
            </goals>
            <configuration>
              <target>
                <echo>Creating self-contained zip</echo>
                <zip destfile="${project.build.directory}/OpenPatrician-${project.version}.zip" basedir="${native.output.dir}" />
              </target>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</profile>
This works fine on Windows and creates an exe file that can be run. However, when executing the same thing on Linux, Maven runs through but the executable fails to start properly, with these two messages:
OpenPatricianDisplay-0.5.0-SNAPSHOT No main class specified
OpenPatricianDisplay-0.5.0-SNAPSHOT Failed to launch JVM
Taking a look at the cfg files of the Windows and Linux bundles shows that they are different. Replacing the Linux one with the one from Windows produces a different error, so I do not think the difference itself is the cause.
Creating a single-module JavaFX demo app with the plugin on Linux works. To figure out whether it is the Maven plugin or the underlying packager, I tried following the Ant examples. The Hello World example works fine (chapter 10.4.1); however, with the example that uses external jar files (chapter 10.4.3), even the build fails:
BUILD FAILED
/home/andi/eclipse/intellij/jdk1.8.0_60/demo/javafx_samples/src/Ensemble8/build.xml:34: You must specify at least one fileset to be packed.
The build.xml
<?xml version="1.0" encoding="UTF-8" ?>
<project name="Ensemble8 JavaFX Demo Application" default="default" basedir="."
         xmlns:fx="javafx:com.sun.javafx.tools.ant">
  <property name="JAVA_HOME" value="/usr/lib/jvm/java-8-oracle"/>
  <path id="CLASSPATH">
    <pathelement location="lib/lucene-core-3.2.0.jar"/>
    <pathelement location="lib/lucene-grouping-3.2.0.jar"/>
    <pathelement path="classes"/>
  </path>
  <property name="build.src.dir" value="src"/>
  <property name="build.classes.dir" value="classes"/>
  <property name="build.dist.dir" value="dist"/>
  <target name="default" depends="clean,compile">
    <taskdef resource="com/sun/javafx/tools/ant/antlib.xml"
             uri="javafx:com.sun.javafx.tools.ant"
             classpath="${JAVA_HOME}/lib/ant-javafx.jar"/>
    <fx:application id="ensemble8"
                    name="Ensemble8"
                    mainClass="ensemble.EnsembleApp"/>
    <fx:resources id="appRes">
      <fx:fileset dir="${build.dist.dir}" includes="ensemble8.jar"/>
      <fx:fileset dir="lib"/>
      <fx:fileset dir="${build.classes.dir}"/>
    </fx:resources>
    <fx:jar destfile="${build.dist.dir}/ensemble8.jar">
      <fx:application refid="ensemble8"/>
      <fx:resources refid="appRes"/>
    </fx:jar>
    <fx:deploy outdir="." embedJNLP="true"
               outfile="ensemble8"
               nativeBundles="all">
      <fx:application refId="ensemble8"/>
      <fx:resources refid="appRes"/>
      <fx:info title="Ensemble8 JavaFX Demo Application"
               vendor="Oracle Corporation"/>
    </fx:deploy>
  </target>
  <target name="clean">
    <mkdir dir="${build.classes.dir}"/>
    <mkdir dir="${build.dist.dir}"/>
    <delete>
      <fileset dir="${build.classes.dir}" includes="**/*"/>
      <fileset dir="${build.dist.dir}" includes="**/*"/>
    </delete>
  </target>
  <target name="compile" depends="clean">
    <javac includeantruntime="false"
           srcdir="${build.src.dir}"
           destdir="${build.classes.dir}"
           fork="yes"
           executable="${JAVA_HOME}/bin/javac"
           source="1.8"
           debug="on"
           classpathref="CLASSPATH">
    </javac>
    <!-- Copy resources to build.classes.dir -->
    <copy todir="${build.classes.dir}">
      <fileset dir="src/app/resources"/>
      <fileset dir="src/generated/resources"/>
      <fileset dir="src/samples/resources"/>
    </copy>
  </target>
</project>
So it looks like the examples are not up to date with Java 1.8.0_60. The only difference from the example's build.xml is the path to JAVA_HOME.
Does anyone have an idea on:
a) how to approach the issue with the Ant build to prove or disprove that the packager is the problem, or
b) even better, what might be the problem when running the Maven plugin?
Environment:
Linux Mint 17.2 KDE
JDK 1.8.0_60
Ant 1.9.3
Maven 3.0.5
javafx-maven-plugin 8.1.4
This is at least a partial answer, for the issue with the Ant build. As it turns out, the documentation is outdated; I figured it out by taking a look at the Ant task definition.
The <fx:jar> element requires some more children for it to work:
<fx:application id="ensemble8"
                name="Ensemble8"
                mainClass="ensemble.EnsembleApp"/>
<fx:resources id="appRes">
  <fx:fileset dir="${build.dist.dir}" includes="ensemble8.jar"/>
  <fx:fileset dir="lib"/>
  <fx:fileset dir="${build.classes.dir}"/>
</fx:resources>
<fx:jar destfile="${build.dist.dir}/ensemble8.jar">
  <fx:application refid="ensemble8"/>
  <fx:resources refid="appRes"/>
  <fx:fileset dir="${build.classes.dir}"/>
  <!-- Customize jar manifest (optional) -->
  <manifest>
    <attribute name="Implementation-Vendor" value="Samples Team"/>
    <attribute name="Implementation-Version" value="1.0"/>
    <attribute name="Main-Class" value="ensemble.EnsembleApp" />
  </manifest>
</fx:jar>
Especially the <manifest> and the <fx:fileset>. With those in place I can create the demo application as a native bundle that is executable.
EDIT: The original issue with the javafx-maven-plugin turned out to be a problem in the packager itself and the lookup of the configuration file. Updating to version 8.1.5 and adding <bundler>linux.app</bundler> to the <configuration> is a workaround until the issue is fixed in the JDK.
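Applied to the profile above, that workaround would look roughly like this (a sketch; the executions stay as before):

<plugin>
  <groupId>com.zenjava</groupId>
  <artifactId>javafx-maven-plugin</artifactId>
  <version>8.1.5</version>
  <configuration>
    <mainClass>ch.sahits.game.OpenPatrician</mainClass>
    <verbose>true</verbose>
    <!-- workaround: build only the plain Linux app image -->
    <bundler>linux.app</bundler>
  </configuration>
</plugin>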

An error when I try to save a CSV file to HDFS using Apache Spark

I am just trying to write a program which needs to save a CSV file to HDFS. The code works fine in Eclipse, but when I try to execute the jar outside of Eclipse, it hits me with this error:
2014-10-14 12:41:31 INFO SecurityManager:58 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(aroman)
Exception in thread "main" java.lang.ExceptionInInitializerError
at com.tekcomms.c2d.utils.MyWatchService.saveIntoHdfs(MyWatchService.java:362)
at com.tekcomms.c2d.utils.MyWatchService.processDataCastFile(MyWatchService.java:332)
at com.tekcomms.c2d.utils.MyWatchService.processCreateEvent(MyWatchService.java:224)
at com.tekcomms.c2d.utils.MyWatchService.watch(MyWatchService.java:180)
at com.tekcomms.c2d.main.FeedAdaptor.main(FeedAdaptor.java:40)
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:136)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:152)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
at com.tekcomms.c2d.utils.MySparkUtils.<clinit>(MySparkUtils.java:29)
... 5 more
This is the part responsible for writing to HDFS:
import java.util.Arrays;
import java.util.List;

import org.apache.log4j.Logger;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class MySparkUtils {

    final static Logger LOGGER = Logger.getLogger(MySparkUtils.class);

    private static JavaSparkContext sc;

    static {
        SparkConf conf = new SparkConf().setAppName("MySparkUtils");
        String master = MyWatchService.getSPARK_MASTER();
        conf.setMaster(master);
        // this is horrible! how can I get rid of it?
        String[] jars = {"target/feed-adapter-0.0.1-SNAPSHOT.jar"};
        conf.setJars(jars);
        sc = new JavaSparkContext(conf);
        LOGGER.debug("spark context initialized!");
    }

    public static boolean saveWithinHDFS(String path, StringBuffer sb) {
        LOGGER.debug("Trying to save in HDFS. path: " + path);
        boolean isOk = false;
        String[] aStrings = sb.toString().split("\n");
        List<String> jsonDatab = Arrays.asList(aStrings);
        JavaRDD<String> dataRDD = sc.parallelize(jsonDatab);
        dataRDD.saveAsTextFile(path);
        isOk = true; // only reached if saveAsTextFile did not throw
        return isOk;
    }
}
And this is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.tekcomms.c2d</groupId>
  <artifactId>feed-adapter</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>feed-adaptor</name>
  <description>a poc about to scan every second a remote filesystem seeking new csv files from datacast, load the csv file into memory, scan every line of csv matching with a set of pattern rules (matching_phone, matching_mac) if found a match, i will create a string buffer with that previous info, if there is no match, i will create another string buffer with that discarded data. Finally i have to copy those files into HDFS.</description>
  <developers>
    <developer>
      <name>Alonso Isidoro Román</name>
      <email>XXX</email>
      <timezone>+1 Madrid</timezone>
      <organization>XXXX</organization>
      <url>about.me/alonso.isidoro.roman</url>
    </developer>
  </developers>
  <dependencies>
    <!-- StringUtils... -->
    <dependency>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
      <version>2.6</version>
    </dependency>
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.17</version>
    </dependency>
    <dependency> <!-- Spark dependency -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.0.0</version>
      <scope>compile</scope>
      <optional>false</optional>
    </dependency>
  </dependencies>
  <repositories>
    <repository>
      <id>Akka repository</id>
      <url>http://repo.akka.io/releases</url>
    </repository>
    <!--
    <repository>
      <id>cloudera-repos</id>
      <name>Cloudera Repos</name>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
    -->
    <!--
    <repository>
      <id>CLOUDERA</id>
      <url>https://repository.cloudera.com/artifactory/repo/org/apache/spark/spark-core_2.10/0.9.0-cdh5.0.0-beta-2/</url>
    </repository>
    <repository>
      <id>cdh.repo</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos</url>
      <name>Cloudera Repositories</name>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
    <repository>
      <id>cdh.snapshots.repo</id>
      <url>https://repository.cloudera.com/artifactory/libs-snapshot-local</url>
      <name>Cloudera Snapshots Repository</name>
      <snapshots>
        <enabled>true</enabled>
      </snapshots>
      <releases>
        <enabled>false</enabled>
      </releases>
    </repository>
    <repository>
      <id>central</id>
      <url>http://repo1.maven.org/maven2/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
    -->
    <repository>
      <id>cloudera-repos</id>
      <name>Cloudera Repos</name>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <transformers>
                <transformer
                    implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                  <mainClass>com.tekcomms.c2d.main.FeedAdaptor</mainClass>
                </transformer>
              </transformers>
              <filters>
                <filter>
                  <artifact>*:*</artifact>
                  <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                  </excludes>
                </filter>
              </filters>
            </configuration>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
</project>
What am I doing wrong?
EDIT
Finally, the problem was figuring out the exact jars for the HDFS cluster (wrong versions!). Another problem was a very restrictive umask on the HDFS side: my local user was not able to write to HDFS because of permissions.
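For anyone hitting the same thing, a sketch of what the fix amounts to; the version below is hypothetical and must match the Hadoop/CDH release actually running on the cluster. Separately, the "No configuration setting found for key 'akka.version'" error from a shaded jar is commonly caused by Akka's reference.conf files being clobbered during shading; merging them with an AppendingTransformer in the maven-shade-plugin is the usual remedy:

<!-- 1) Pin the HDFS client jars to the cluster's release (version is hypothetical) -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.3.0-cdh5.0.0</version>
</dependency>

<!-- 2) In the shade plugin's <transformers> section: merge reference.conf
     files instead of letting one overwrite the others -->
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
  <resource>reference.conf</resource>
</transformer>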

Enabling Release Management in Bamboo

I'm running JIRA and Bamboo, which are connected by an application link. In Bamboo, there is a Maven project with the following POM:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.experiments</groupId>
  <artifactId>howto-release</artifactId>
  <version>${ci.version}</version>
  <packaging>jar</packaging>
  <name>howto-release</name>
  <url>http://maven.apache.org</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <ci.version>2.6-SNAPSHOT</ci.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <scm>
    <connection>scm:svn:http://svn.xxx.lan/svn/howto-release/trunk</connection>
    <developerConnection>scm:svn:http://svn.xxx.lan/svn/howto-release/trunk</developerConnection>
    <url>http://svn.xxx.lan/svn/howto-release/trunk</url>
  </scm>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-release-plugin</artifactId>
        <version>2.0-beta-8</version>
        <configuration>
          <tagBase>http://svn.xxx.lan/svn/howto-release/tag</tagBase>
          <username>release-user</username>
          <password>release-user</password>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
The Maven goal definition in Bamboo looks like this:
clean install -Dci.version=${bamboo.custom.brmp.name}
Now I want to use the built-in "release management" feature, as described here:
http://www.youtube.com/watch?v=OH-Iq4z8Mj8
http://blogs.atlassian.com/2010/09/bamboo_jira_release_management_plugin_part_2/
Unfortunately, it doesn't work as described by the Atlassian guys. There is no "enable release management" checkbox under "Plan configuration / Miscellaneous".
I also have trouble with the Release button in JIRA, which currently issues an error message like this: "Bamboo server has not returned any Plans."
Does anybody know how to deal with the automated release management plugin in Bamboo, and what I have to do to make the options visible (see "Plan configuration / Miscellaneous")?
I'm running the following versions of JIRA and Bamboo:
JIRA version v5.1.5#784-sha1:6c72993
Bamboo Version 4.4.0 build 3501 - 28 Jan 13.
Your help is appreciated!!
Thank you.
Katarina
