Usage: java cucumber.api.cli.Main [options] [ [FILE|DIR][:LINE[:LINE]*] ]+
Options:
-g, --glue PATH Where glue code (step definitions and hooks) is loaded from.
-f, --format FORMAT[:PATH_OR_URL] How to format results. Goes to STDOUT unless PATH_OR_URL is specified.
Built-in FORMAT types: junit, html, pretty, progress, json.
FORMAT can also be a fully qualified class name.
-t, --tags TAG_EXPRESSION Only run scenarios tagged with tags matching TAG_EXPRESSION.
-n, --name REGEXP Only run scenarios whose names match REGEXP.
-d, --[no-]-dry-run Skip execution of glue code.
-m, --[no-]-monochrome Don't colour terminal output.
-s, --[no-]-strict Treat undefined and pending steps as errors.
--snippets Snippet name: underscore, camelcase
--dotcucumber PATH_OR_URL Where to write out runtime information. PATH_OR_URL can be a file system
path or a URL.
-v, --version Print version.
-h, --help You're looking at it.
Exception in thread "main" cucumber.runtime.CucumberException: Unknown option: --plugin
at cucumber.runtime.RuntimeOptions.parse(RuntimeOptions.java:119)
at cucumber.runtime.RuntimeOptions.<init>(RuntimeOptions.java:50)
at cucumber.runtime.RuntimeOptions.<init>(RuntimeOptions.java:44)
at cucumber.api.cli.Main.run(Main.java:20)
at cucumber.api.cli.Main.main(Main.java:16)
I am getting this error when running my feature file.
My POM dependency is given below, and I am using Spring version 3.2.4 with Cucumber version 1.1.5.
Looks like you are using a very old version of cucumber-jvm that is looking for --format instead of --plugin.
The latest cucumber-jvm usage text can be found here.
Get the latest cucumber-jvm from the Maven repository as described here or here.
Either it is related to a JAR library version mismatch or to a plugin (cucumber-eclipse-plugin) mismatch.
Look here: https://groups.google.com/forum/#!topic/cukes/1urjr3ASq78
I was also facing a similar issue. The Cucumber JARs were old, so I updated them as below using the Maven repository:
<properties>
    <cucumber.version>6.10.3</cucumber.version>
</properties>
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-java</artifactId>
    <version>${cucumber.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-junit</artifactId>
    <version>${cucumber.version}</version>
    <scope>test</scope>
</dependency>
NOTE: You can find the latest Cucumber Java version in the Maven repository at this link: Cucumber jars maven dependency
Related
I am building Spark 2.3 Scala code using Maven and am getting the following error:
error: missing or invalid dependency detected while loading class file SparkSession.class.
This is a snippet of my POM file; please advise.
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
You might want to check your Java and Scala versions. They should be 1.6 or higher and 2.11 respectively. It could also be a mismatch with other dependencies like spark-sql; make sure you have the same Spark version across all Spark dependencies.
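For example, a minimal sketch of keeping the Spark artifacts on one version via a shared property (the values below simply mirror the versions from the question and are otherwise placeholders):

<properties>
    <!-- single place to keep the Spark version so all Spark artifacts stay in sync -->
    <spark.version>2.3.0</spark.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_${scala.binary.version}</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>

With this in place, upgrading Spark means changing a single property instead of hunting through each dependency.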
I am trying to work with spark-sql but I get the following errors:
error: missing or invalid dependency detected while loading class file
'package.class'. Could not access term annotation in package
org.apache.spark, because it (or its dependencies) are missing. Check
your build definition for missing or conflicting dependencies. (Re-run
with -Ylog-classpath to see the problematic classpath.) A full
rebuild may help if 'package.class' was compiled against an
incompatible version of org.apache.spark. warning: Class
org.apache.spark.annotation.InterfaceStability not found - continuing
with a stub. error: missing or invalid dependency detected while
loading class file 'SparkSession.class'. Could not access term
annotation in package org.apache.spark, because it (or its
dependencies) are missing. Check your build definition for missing or
conflicting dependencies. (Re-run with -Ylog-classpath to see the
problematic classpath.) A full rebuild may help if
'SparkSession.class' was compiled against an incompatible version of
org.apache.spark.
My configuration:
Scala 2.11.8
Spark-core_2.11-2.1.0
Spark-sql_2.11-2.1.0
Note: I use SparkSession.
After digging into the error message, I figured out how to solve this kind of error.
For example:
Error - Symbol 'term org.apache.spark.annotation' is missing... A full rebuild may help if 'SparkSession.class' was compiled against an incompatible version of org.apache.spark
Open SparkSession.class and search for "import org.apache.spark.annotation."; you will find import org.apache.spark.annotation.{DeveloperApi, Experimental, InterfaceStability}. These classes are clearly missing from the classpath, so you need to find the artifact that contains them.
So open https://search.maven.org and search with c:"DeveloperApi" AND g:"org.apache.spark"; you will find that the missing artifact is spark-tags, as @Prakash answered.
In my situation, just adding the spark-catalyst and spark-tags dependencies to pom.xml worked.
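For reference, this is roughly what I added (the _2.11 and 2.1.0 coordinates are assumed to match the spark-core version from my configuration above):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-catalyst_2.11</artifactId>
    <version>2.1.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-tags_2.11</artifactId>
    <version>2.1.0</version>
</dependency>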
But it's weird: why does Maven not auto-resolve the transitive dependencies here?
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.2.0</version>
    <scope>provided</scope>
</dependency>
If I use the above dependency, only spark-core_2.11-2.2.0.jar ends up in the Maven dependencies, while if I change the version to 2.1.0 or 2.3.0, all the transitive dependencies are there.
You need to include the following artifacts to avoid the dependency issues:
spark-unsafe_2.11-2.1.1
spark-tags_2.11-2.1.1
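In pom.xml terms that would look roughly like the following (artifact names taken from the list above; adjust the versions to your own Spark release):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-unsafe_2.11</artifactId>
    <version>2.1.1</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-tags_2.11</artifactId>
    <version>2.1.1</version>
</dependency>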
I am building a project using the maven-jaxb2-plugin
<groupId>org.jvnet.jaxb2.maven2</groupId>
<artifactId>maven-jaxb2-plugin</artifactId>
I set the episode parameter
<episode>true</episode>
However I see that no .episode file is being generated. I need this file so that this project can be used as an episode in another dependent project.
In the Maven build logs I see these warnings, which indicate that the episode and episodeFile parameters supported by XJC are not supported by the plugin:
[WARNING] Unknown plugin option: -episode
[WARNING] Unknown plugin option: C:\projecdir\src\main\generated-sources\META-INF\sun-jaxb.episode
Is this a bug in the plugin? Also, is there any known workaround for it?
Update: Something strange happens depending on whether useDependenciesAsEpisodes is true or false.
useDependenciesAsEpisodes = true
When this was true, it showed the warning above. The build worked, but no episode file was generated.
useDependenciesAsEpisodes = false
When I set this to false, the episode file is generated.
However, I had some additional arguments passed for a plugin, which break the build:
<args>
    <arg>-typeId=7000</arg>
    ...
</args>
Error:
Caused by: com.sun.tools.xjc.BadCommandLineException: unrecognized parameter -typeId=7000
at com.sun.tools.xjc.Options.parseArguments(Options.java:817)
at org.jvnet.mjiip.v_2.OptionsFactory.createOptions(OptionsFactory.java:91)
... 24 more
I have to remove the additional args for this to work. However, I need those arguments for some JAXB plugins (like shown here) used along with maven-jaxb2-plugin:
<configuration>
    <extension>true</extension>
    <args>
        <arg>-XtoString</arg>
        <arg>-Xequals</arg>
        <arg>-XhashCode</arg>
        <arg>-Xcopyable</arg>
    </args>
    <plugins>
        <plugin>
            <groupId>org.jvnet.jaxb2_commons</groupId>
            <artifactId>jaxb2-basics</artifactId>
            <version><!-- version --></version>
        </plugin>
    </plugins>
</configuration>
Author of the maven-jaxb2-plugin and jaxb2-basics here.
Episode generation is tested quite thoroughly on each release, so I'm pretty sure it works.
useDependenciesAsEpisodes just adds all your dependency JARs as "episode" JARs; this should not influence episode generation.
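For context, here is a minimal configuration sketch of the combination being discussed, using only elements already mentioned in the question (an illustration, not a verified fix):

<plugin>
    <groupId>org.jvnet.jaxb2.maven2</groupId>
    <artifactId>maven-jaxb2-plugin</artifactId>
    <configuration>
        <!-- ask the plugin to generate the META-INF/sun-jaxb.episode file -->
        <episode>true</episode>
        <!-- scan dependency JARs for episode files; per the above, this should be independent of episode generation -->
        <useDependenciesAsEpisodes>true</useDependenciesAsEpisodes>
    </configuration>
</plugin>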
-typeId=7000 is a bit suspicious: which XJC plugin processes this option? If none does, it may potentially interfere with -episode, so this is where I'd look first.
What really helps is to see the mvn clean install -e -X log. It should log the arguments finally passed to XJC, so there you could spot irregularities.
If nothing helps, put a minimal reproducing test project together and commit/send me a PR for it in https://github.com/highsource/maven-jaxb2-plugin-support, e.g. under e/episode-file-not-being-generated.
I am trying to use Spock via GMaven (Maven 3.1.1) with Groovy 2.3 support, and I am having difficulty getting the SNAPSHOT dependency. I get the same error even when I try to run the Spock Example project, which has a similar dependency defined.
<dependency>
    <groupId>org.spockframework</groupId>
    <artifactId>spock-core</artifactId>
    <version>1.0-groovy-2.3-SNAPSHOT</version>
    <scope>test</scope>
</dependency>
I have the SNAPSHOT repository specified like the Spock Example does:
<repositories>
    <!-- Only required if a snapshot version of Spock is used -->
    <repository>
        <id>spock-snapshots</id>
        <url>http://oss.sonatype.org/content/repositories/snapshots/</url>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>
But even when I run mvn clean test for the Spock Example project, I get:
[ERROR] Failed to execute goal on project spock-example: Could not resolve dependencies for project org.spockframework:spock-example:jar:1.0-SNAPSHOT: Failure to find org.spockframework:spock-core:jar:1.0-groovy-2.3-SNAPSHOT in was cached in the local repository, resolution will not be reattempted until the update interval of nexus_sprn has elapsed or updates are forced -> [Help 1]
I succeed if I simply use the 0.7-groovy-2.0 version, but I want Groovy 2.3 since it appears @CompileStatic does not work properly for my project in Groovy 2.0.
EDIT:
I just noticed a warning that happens just before the build fails:
[WARNING] The POM for org.spockframework:spock-core:jar:1.0-groovy-2.3-SNAPSHOT is missing, no dependency information available
So based on Mr. Niederwieser's comment, I found that the settings.xml my current project requires uses a mirror that does not know about the Spock SNAPSHOT location. In addition to re-configuring my company's proxy settings, I had to do the following in my global settings.xml so that the mirror is not used for the Spock snapshot repository.
<mirrors>
    <mirror>
        <id>nexus_sprn</id>
        <mirrorOf>*,!nexus_public,!project-lib-dir,!spock-snapshots</mirrorOf>
        <url>MIRROR_URL_REMOVED</url>
    </mirror>
</mirrors>
...
<repository>
    <id>spock-snapshots</id>
    <url>http://oss.sonatype.org/content/repositories/snapshots/</url>
    <snapshots>
        <enabled>true</enabled>
    </snapshots>
</repository>
Not sure of the proper etiquette here, but I will leave this answer for future readers unless the general consensus is that it is unnecessary.
NOTE: Please do not comment on all the perils of cross-compiling. Thank you.
I have a situation where we need to have Java 6 source compiled for a Java 5 JVM (to be sure that the JAX-WS usage is correct). Previously we did this with an Ant script (which apparently can do it), but after migrating to Maven we have found that it ends up with javac complaining:
$ javac -source 1.6 -target 1.5
javac: source release 1.6 requires target release 1.6
Is there any Java distribution for Linux (Ubuntu 11.10, x86) where the javac can do this?
EDIT: It appears not, as the limitation is in javac itself, which is the same across distributions. The solution (which made this need go away) was to change from the default javac compiler to the Eclipse compiler in maven-compiler-plugin.
EDIT: I've found that the Eclipse compiler generates byte code for anonymous inner classes that the javadoc utility disagrees with. I am preparing a bug report for this issue.
According to the documentation (Java 5, Java 6), the Oracle SDK should be able to do this when you follow the instructions in the Cross-Compilation Example.
Java 6 should support any version between 1.3 and 1.6 as -target; the documentation doesn't say what happens when you use generics and other "compatible" features in the source. The compiler should be able to strip them.
Another culprit in the game might be javac: The compiler might be able to handle this set of arguments but the command line tool might take offense.
In this case, write your own command line using the Java Compiler API. That might allow you to pull off some tricks that you can't achieve otherwise.
You can also try the Eclipse compiler (see "Using the batch compiler").
This might fail because of how Java works: Java X code can run on Java Y as long as X <= Y. So while you can easily compile Java 1.4 code for a Java 6 VM, the reverse is not always true.
If everything else fails, write a preprocessor that reads the source and strips unsupported elements (like @Override on interfaces). As long as you compile the code with the annotations once in a while with Java 6, the converted code should be safe as well (unless your code stripper has a bug ...)
This answer is an implementation of what @ThorbjørnRavnAndersen explained in the comments as a solution. Using the example code from here, and fixing a few typos, I was able to come up with an example using the Eclipse compiler.
Calculator.java
package example;
// there needs to be a package to avoid a "@WebService.targetNamespace must be specified on classes with no package"
// when running this
import javax.jws.WebService;
import javax.jws.WebMethod;
import javax.xml.ws.Endpoint;

@WebService
public class Calculator {

    @WebMethod
    public int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        // create and publish an endpoint
        Calculator calculator = new Calculator();
        Endpoint endpoint = Endpoint.publish("http://localhost:8080/calculator", calculator);
    }
}
pom.xml
<project>
    <modelVersion>4.0.0</modelVersion>
    <groupId>fi.eis.applications</groupId>
    <artifactId>ws-calculator</artifactId>
    <version>1.0-SNAPSHOT</version>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.3</version>
                <configuration>
                    <source>1.6</source>
                    <target>1.5</target>
                    <compilerId>eclipse</compilerId>
                </configuration>
                <dependencies>
                    <dependency>
                        <groupId>org.codehaus.plexus</groupId>
                        <artifactId>plexus-compiler-eclipse</artifactId>
                        <version>2.6</version>
                    </dependency>
                </dependencies>
            </plugin>
        </plugins>
    </build>
</project>
You can compile this with mvn clean compile and then run it with java example.Calculator from the target/classes folder. It will start up a web service on port 8080, which you can test with the URL http://localhost:8080/calculator?wsdl in your browser.