Cucumber: NoSuchMethodError: cucumber.runtime.formatter.Plugins

I am getting the exception in the subject line in IntelliJ and have no idea why.
This is the runner file:
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/main/resources/features",
        glue = {"DemoDefinitions"},
        tags = "@tests"
)
public class CucumberRunner {}
This is the definitions file:
import cucumber.api.java.en.Given;
public class DemoDefinitions {
    @Given("Login to Azure Succeeded")
    public void login_to_Azure_Succeeded() {
        // Write code here that turns the phrase above into concrete actions
        throw new cucumber.api.PendingException();
    }
}
This is the feature file:
@tests
Feature: PoC Feature

  Scenario: PoC Operations Scenario
    Given Login to Azure Succeeded
And the Maven dependencies are:
<dependencies>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-core</artifactId>
<version>4.4.0</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>4.2.6</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>4.2.6</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
</dependency>
</dependencies>
When I execute the runner class I get:
java.lang.NoSuchMethodError:
cucumber.runtime.formatter.Plugins.<init>(Ljava/lang/ClassLoader;Lcucumber/runtime/formatter/PluginFactory;Lcucumber/api/event/EventPublisher;Lio/cucumber/core/options/PluginOptions;)V
When I run the feature file itself I get:
Undefined scenarios:
/C:/Users/talt/IdeaProjects/Poc/src/main/resources/features/poc.feature:5
PoC Operations Scenario
1 Scenarios (1 undefined)
1 Steps (1 undefined)
Can you please advise what the problem is?

You are mixing different versions of Cucumber. Take a careful look at the version numbers.
You are also including more dependencies than strictly needed. Merely using cucumber-java and cucumber-junit would be sufficient; both cucumber-core and junit are transitive dependencies.
After you fix your dependencies, make sure to reimport the Maven project.
<dependencies>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-java</artifactId>
<version>4.4.0</version>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>4.4.0</version>
</dependency>
</dependencies>
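One way to confirm that only a single Cucumber version ends up on the classpath after the fix (not part of the original answer, just a standard Maven check) is to inspect the dependency tree:
mvn dependency:tree -Dincludes=io.cucumber
Every io.cucumber artifact listed should show the same version.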

The solution was to set glue to an empty string:
glue = {""}

We should not mix direct and transitive dependencies, especially their versions! Doing so can cause unpredictable outcomes. Below are a few errors people have reported due to the wrong use of dependencies:
The import cucumber.api.junit cannot be resolved
java.lang.NoClassDefFoundError: gherkin/IGherkinDialectProvider
import cucumber.api.DataTable; cannot be resolved
Solution: Add the right set of Cucumber dependencies.
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-junit</artifactId>
<version>4.2.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.cucumber</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>4.2.6</version>
<scope>test</scope>
</dependency>
Second, you could say we did not have the correct path pointing to the glue code. But just making it empty should not be the solution, even though it worked; we should have the correct path here, not an empty string, as sketched below.
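For illustration, if DemoDefinitions lived in a package such as com.example.definitions (a hypothetical name, not taken from the question), the runner could point glue at that package rather than at the class name or an empty string:
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/main/resources/features",
        glue = {"com.example.definitions"}, // package containing DemoDefinitions (hypothetical)
        tags = "@tests"
)
public class CucumberRunner {}
As posted, DemoDefinitions has no package declaration (default package), which is why an empty glue string happened to work.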

Related

Spark error with google/guava library: java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.refreshAfterWrite

I have a simple Spark project in which the pom.xml dependencies are only the basic Scala, ScalaTest/JUnit, and Spark:
<dependencies>
<dependency>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.binary.version}</artifactId>
<version>3.0.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>compile</scope>
</dependency>
</dependencies>
When attempting to run a basic Spark program, the SparkSession init fails on this line:
SparkSession.builder.master(master).appName("sparkApp").getOrCreate
Here is the output / error:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
18/04/07 18:06:15 INFO SparkContext: Running Spark version 2.2.1
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.refreshAfterWrite(JLjava/util/concurrent/TimeUnit;)Lcom/google/common/cache/CacheBuilder;
at org.apache.hadoop.security.Groups.<init>(Groups.java:96)
at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2424)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2424)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2424)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:295)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910)
I have run Spark locally many dozens of times on other projects; what might be wrong with this simple one? Is there a dependency on the $HADOOP_HOME environment variable or similar?
Update: By downgrading the Spark version to 2.0.1 I was able to compile. That does not fix the problem (we need the newer version), but it helps point out the source of the problem.
Another update: In a different project, the hack of downgrading to 2.0.1 does help, i.e. execution proceeds further, but then a similar exception occurs when writing out to Parquet:
18/05/07 11:26:11 ERROR Executor: Exception in task 0.0 in stage 2741.0 (TID 2618)
java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.build(Lcom/google/common/cache/CacheLoader;)Lcom/google/common/cache/LoadingCache;
at org.apache.hadoop.io.compress.CodecPool.createCache(CodecPool.java:62)
at org.apache.hadoop.io.compress.CodecPool.<clinit>(CodecPool.java:74)
at org.apache.parquet.hadoop.CodecFactory$BytesCompressor.<init>(CodecFactory.java:92)
at org.apache.parquet.hadoop.CodecFactory.getCompressor(CodecFactory.java:169)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:303)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:262)
at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.<init>(ParquetFileFormat.scala:562)
at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anon$1.newInstance(ParquetFileFormat.scala:139)
at org.apache.spark.sql.execution.datasources.BaseWriterContainer.newOutputWriter(WriterContainer.scala:131)
at org.apache.spark.sql.execution.datasources.DefaultWriterContainer.writeRows(WriterContainer.scala:247)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(InsertIntoHadoopFsRelationCommand.scala:143)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(InsertIntoHadoopFsRelationCommand.scala:143)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:86)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
This error occurs due to a version mismatch between Google's Guava library and Spark. Spark shades Guava, but many libraries still use Guava. You can try shading the Guava dependencies as described in this Apache-Spark-User-List post.
Adding the shade plugin to your pom file and relocating the Google package can resolve this issue. More information can be found here and here:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<relocations>
<relocation>
<pattern>com.google.common</pattern>
<shadedPattern>shade.com.google.common</shadedPattern>
</relocation>
</relocations>
</configuration>
</execution>
</executions>
</plugin>
If this also doesn't help, then adding the Guava library at version 15.0 works nicely. The reason this workaround works lies in dependencyManagement. The nice SO answer is here:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>15.0</version>
</dependency>
</dependencies>
</dependencyManagement>
I am getting this error in Spring Boot: java.lang.TypeNotPresentException: Type com.google.common.cache.CacheBuilderSpec
com.google.common.cache.CacheBuilder.build()Lcom/google/common/cache/Cache
The issue is due to the "com.google.guava:guava" API. In Spring Boot this API comes in transitively under some other API, perhaps "spring-boot-starter-web" or "springfox-swagger2", so we need to first exclude the Guava API from the springfox-swagger2 jar and then add an updated version of the Guava API.
Solution:
1. Add the Guava dependency at the top of all the dependencies so that Spring Boot can get the latest version:
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>19.0</version>
</dependency>
2. Find out the Spring Boot dependency where the "guava" artifactId is included, then exclude the "guava" artifact from that dependency and add the Guava dependency as above; a sketch is shown below.
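For illustration only, excluding the transitive Guava artifact from springfox-swagger2 (assuming that is where it comes from in your build; the version shown is just an example, so check your own dependency tree first) might look like this:
<dependency>
<groupId>io.springfox</groupId>
<artifactId>springfox-swagger2</artifactId>
<version>2.9.2</version>
<exclusions>
<exclusion>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
</exclusion>
</exclusions>
</dependency>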

Apache Spark: Streaming without HDFS checkpoint

I'm implementing a Spark job which makes use of reduceByKeyAndWindow; therefore I need to add checkpointing.
From Spark's website I see that:
Checkpointing can be enabled by setting a directory in a fault-tolerant, reliable file system (e.g., HDFS, S3, etc.) to which the checkpoint information will be saved.
My application is just for academic purposes, so I don't want to set up HDFS for checkpointing but just a local file. Doing so on macOS works fine (setting a temporary dir as the checkpoint dir); the problem comes when doing it on Windows, which throws a permissions exception.
I already tried starting Eclipse as administrator and creating the directory manually, setting setWritable, setReadable and setExecutable to true. Any hint on how to overcome the problem on Windows?
Thanks!
Update: Here's my code and the exception. Just to clarify again, it works fine on Mac but not on Windows.
SparkConf conf = new SparkConf().setAppName("testApp").setMaster("local[2]");
JavaSparkContext ctx = new JavaSparkContext(conf);
JavaStreamingContext jsc = new JavaStreamingContext(ctx, new Duration(1000));
jsc.checkpoint(Files.createTempDir().getAbsolutePath());
Exception:
Exception in thread "pool-7-thread-3" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(Unknown Source)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
at org.apache.hadoop.util.Shell.run(Shell.java:379)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:783)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:772)
at org.apache.spark.streaming.CheckpointWriter$CheckpointWriteHandler.run(Checkpoint.scala:135)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Solved by adding the latest Hadoop libraries to my project.
If using Maven, the following set of dependencies does the trick:
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-twitter_2.11</artifactId>
<version>1.3.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-core</artifactId>
<version>1.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.6.0</version>
</dependency>
On Windows, you can solve this problem as follows:
Download winutils.exe to a folder, say MY_UTILS/bin
Create an environment variable HADOOP_HOME and point it to MY_UTILS
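Alternatively (not part of the original answer), the same location can be supplied programmatically before the Spark/streaming context is created, assuming winutils.exe sits under MY_UTILS/bin:
// Point Hadoop at the directory that contains bin\winutils.exe (hypothetical path)
System.setProperty("hadoop.home.dir", "C:\\MY_UTILS");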

PrimeFaces upgrade to version 4.0 (or 5.0) causes "Property not found on type"

After having upgraded PrimeFaces from 2.2.1 to version 4.0 (same with version 5.0), when navigating to the /abtest.xhtml page deployed on Tomcat 7, I get the following error:
javax.servlet.ServletException: /abtest.xhtml: Property 'onFlowProcess' not found on type com.web.faces.abtest.ABTestWizard
javax.faces.webapp.FacesServlet.service(FacesServlet.java:325)
In pom.xml I have:
<dependency>
<groupId>org.primefaces</groupId>
<artifactId>primefaces</artifactId>
<version>4.0</version>
</dependency>
<dependency>
<groupId>com.sun.faces</groupId>
<artifactId>jsf-api</artifactId>
<version>2.2.7</version>
</dependency>
<dependency>
<groupId>com.sun.faces</groupId>
<artifactId>jsf-impl</artifactId>
<version>2.2.7</version>
</dependency>
In abtest.xhtml I have:
<ui:composition xmlns="http://www.w3.org/1999/xhtml"
...
xmlns:p="http://primefaces.prime.com.tr/ui"
...
<p:wizard id="abtestwizard" widgetVar="wiz" flowListener="#{abwizard.onFlowProcess}" showStepStatus="false" effect="false" showNavBar="false" onback="jQuery('.equalHeight').equalHeights();" onnext="jQuery('.equalHeight').equalHeights();">
And in com.web.faces.abtest.ABTestWizard:
@Component("abwizard")
@Scope("view")
public class ABTestWizard implements Serializable {
...
public String onFlowProcess(org.primefaces.event.FlowEvent event) { ...
}
...
}
The flowListener attribute of the p:wizard tag seems to be set properly, so I have no idea why this error occurs.
I would appreciate any idea or hint to solve the problem.
Do you have this in your pom.xml as well?
<repository>
<id>prime-repo</id>
<name>PrimeFaces Maven Repository</name>
<url>http://repository.primefaces.org</url>
<layout>default</layout>
</repository>
You also might be missing some imports in your bean. Which imports do you have listed?
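For reference, a Spring-managed wizard bean of the shape shown in the question would typically need imports along these lines (a sketch; the Spring annotations are assumed from the @Component/@Scope usage above):
import java.io.Serializable;
import org.primefaces.event.FlowEvent;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;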

How to overcome maven org.eclipse.persistence JAR signature error

I'm trying to add the modelgen jar to my Maven project to enable automatic metamodel class generation; however, I am getting the following error when I try to compile the project:
Failed to execute goal org.bsc.maven:maven-processor-plugin:2.2.4:process (process) on project Project-per: Error executing: java.lang.RuntimeException: java.lang.SecurityException: class "org.eclipse.persistence.internal.jpa.modelgen.CanonicalModelProperties"'s signer information does not match signer information of other classes in the same package
As best as I can determine, there is most likely a mismatch in the JAR signatures causing this. However, I don't understand why or how to fix the problem.
My POM (relevant parts only):
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>eclipselink</artifactId>
<version>2.5.0-RC2</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.nosql</artifactId>
<version>2.5.0-RC2</version>
<type>jar</type>
</dependency>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.jpa.modelgen</artifactId>
<version>2.4.2</version>
</dependency>
NB: Everything works fine with the nosql and eclipselink artifacts; it's just when I add the modelgen one.
OK, as best I can tell I was right: the problem was caused by a signature difference in the JAR files. How this can happen with a library as popular as EclipseLink is beyond me, but hey.
My solution was to use a snapshot, as they are generally not signed. Given that I was unable to obtain the snapshot from Central, I used the following repo to obtain it:
<repository>
<id>sonatype-snapshots</id>
<url>https://oss.sonatype.org/content/repositories/snapshots/</url>
</repository>
<dependency>
<groupId>org.eclipse.persistence</groupId>
<artifactId>org.eclipse.persistence.jpa.modelgen</artifactId>
<version>2.6.0-SNAPSHOT</version>
</dependency>
Add the following pluginExecution filter (m2e lifecycle-mapping configuration) to your pom:
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.bsc.maven</groupId>
<artifactId>maven-processor-plugin</artifactId>
<versionRange>[2.2.4,)</versionRange>
<goals>
<goal>process</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore />
</action>
</pluginExecution>

java.lang.NoClassDefFoundError: org/atmosphere/cpr/AsyncSupportListenerAdapter

I'm trying to push a message from the server to the client this way:
PushContext pushContext = PushContextFactory.getDefault().getPushContext();
pushContext.push("/registrationEvent", "There was another registration");
My problem is that I have the following error:
java.lang.NoClassDefFoundError: Could not initialize class org.primefaces.push.PushContextFactory
But I think this is due to an issue at the initialization of the project:
java.lang.NoClassDefFoundError: org/atmosphere/cpr/AsyncSupportListenerAdapter
I've tried to add the Atmosphere jar file... without success. Have I done something wrong? I'm using GlassFish 3.1.
Thanks!
The PrimeFaces migration guide states: "PrimeFaces Push is reimplemented, PushContext is deprecated, use EventBus instead along with the new Push API."
In this case, in pom.xml, use Atmosphere version 2.2.1; you are probably using an old Atmosphere version. If you are trying to use PrimeFaces 5.0, use the dependencies below (a sketch of the EventBus usage follows after them):
<dependency>
<groupId>org.primefaces</groupId>
<artifactId>primefaces</artifactId>
<version>5.0</version>
</dependency>
<dependency>
<groupId>org.primefaces.extensions</groupId>
<artifactId>primefaces-extensions</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.atmosphere</groupId>
<artifactId>atmosphere-runtime</artifactId>
<version>2.2.1</version>
</dependency>
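With those versions in place, a minimal sketch of the EventBus-based push that the migration guide refers to (replacing the deprecated PushContext call from the question) would look roughly like this:
import org.primefaces.push.EventBus;
import org.primefaces.push.EventBusFactory;

// Publish to the same channel the old PushContext code used
EventBus eventBus = EventBusFactory.getDefault().eventBus();
eventBus.publish("/registrationEvent", "There was another registration");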
