io.github.bonigarcia.wdm.online.httpclient- Error HTTP 403 executing - webdrivermanager-java

While I am replacing the traditional way of initiating a browser instance with the WebDriverManager library, I am getting this error: io.github.bonigarcia.wdm.online.httpclient - Error HTTP 403 executing
Imported dependency:
<dependency>
    <groupId>io.github.bonigarcia</groupId>
    <artifactId>webdrivermanager</artifactId>
    <version>5.3.0</version>
    <scope>test</scope>
</dependency>
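A common cause of the HTTP 403 is rate limiting by the online repository WebDriverManager queries when resolving the driver version. Below is a minimal sketch of the WebDriverManager-based setup; the wdm.gitHubToken property is the workaround described in the WebDriverManager documentation for rate-limited lookups (verify the property name for your version), and the token value is a placeholder:
import io.github.bonigarcia.wdm.WebDriverManager;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class WdmSetupSketch {
    public static void main(String[] args) {
        // Optional: a GitHub personal access token helps when driver-version
        // lookups are rate limited (placeholder value, replace with your own).
        System.setProperty("wdm.gitHubToken", "<your-github-token>");

        // Resolves and caches the matching chromedriver binary, replacing the
        // manual webdriver.chrome.driver system property.
        WebDriverManager.chromedriver().setup();

        WebDriver driver = new ChromeDriver();
        driver.get("https://example.org");
        driver.quit();
    }
}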

Related

Azure blob storage client in java intellij not initializing

Running into an error while initializing the context for an Azure blob container storage.
Would greatly appreciate any help.
org.apache.catalina.core.StandardContext.listenerStart Exception sending context initialized event to listener instance of class [listeners.AzureBackupManagerContextListener]
java.lang.NoClassDefFoundError:
[....]
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.databind.AnnotationIntrospector$XmlExtensions
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1412)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1220)
... 76 more
I also hit this issue and was able to solve it by adding the FasterXML Jackson dependencies:
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.13.3</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>2.13.3</version>
</dependency>

Cucumber: NoSuchMethodError: cucumber.runtime.formatter.Plugins

I am getting the exception in subject on IntelliJ and have no idea why.
This is the runner file:
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/main/resources/features",
        glue = {"DemoDefinitions"},
        tags = "@tests"
)
public class CucumberRunner {}
This is the definitions file:
import cucumber.api.java.en.Given;

public class DemoDefinitions {
    @Given("Login to Azure Succeeded")
    public void login_to_Azure_Succeeded() {
        // Write code here that turns the phrase above into concrete actions
        throw new cucumber.api.PendingException();
    }
}
This is the feature file:
@tests
Feature: PoC Feature
  Scenario: PoC Operations Scenario
    Given Login to Azure Succeeded
And the Maven dependencies are:
<dependencies>
    <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-core</artifactId>
        <version>4.4.0</version>
    </dependency>
    <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-java</artifactId>
        <version>4.2.6</version>
    </dependency>
    <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-junit</artifactId>
        <version>4.2.6</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
    </dependency>
</dependencies>
When I execute the runner class I get:
java.lang.NoSuchMethodError:
cucumber.runtime.formatter.Plugins.<init>(Ljava/lang/ClassLoader;Lcucumber/runtime/formatter/PluginFactory;Lcucumber/api/event/EventPublisher;Lio/cucumber/core/options/PluginOptions;)V
When I run the feature file itself I get:
Undefined scenarios:
/C:/Users/talt/IdeaProjects/Poc/src/main/resources/features/poc.feature:5
PoC Operations Scenario
1 Scenarios (1 undefined) 1 Steps (1 undefined)
Can you please advise what is the problem?
You are mixing different versions of Cucumber. Take a careful look at the version numbers.
You are also including more dependencies than strictly needed. Merely using cucumber-java and cucumber-junit would be sufficient; both cucumber-core and junit are transitive dependencies.
After you fix your dependencies, make sure to reimport the Maven project.
<dependencies>
    <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-java</artifactId>
        <version>4.4.0</version>
    </dependency>
    <dependency>
        <groupId>io.cucumber</groupId>
        <artifactId>cucumber-junit</artifactId>
        <version>4.4.0</version>
    </dependency>
</dependencies>
The solution was to set glue to an empty string:
glue = {""}
You should not mix direct and transitive dependencies, especially their versions! Doing so can cause unpredictable outcomes. Below are a few errors reported by people due to incorrect use of dependencies:
The import cucumber.api.junit cannot be resolved
java.lang.NoClassDefFoundError: gherkin/IGherkinDialectProvider
import cucumber.api.DataTable; cannot be resolved
Solution: add the right set of Cucumber dependencies.
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-junit</artifactId>
    <version>4.2.6</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-picocontainer</artifactId>
    <version>4.2.6</version>
    <scope>test</scope>
</dependency>
Second, you could say we did not have the correct path pointing to glue. But just making it empty should not be the solution, even though it worked; we should have the correct path here, not an empty string.
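In other words, glue should point at the package that actually contains the step definitions. A minimal sketch, assuming the steps were moved into a hypothetical stepdefs package:
import cucumber.api.CucumberOptions;
import cucumber.api.junit.Cucumber;
import org.junit.runner.RunWith;

// Runner whose glue points at the package holding the step definitions.
// "stepdefs" is only an example; an empty glue happens to work when the
// steps live in the default package, as in the original question.
@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/main/resources/features",
        glue = {"stepdefs"},
        tags = "@tests"
)
public class CucumberRunner {}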

I'm trying to implement the Lapis JSF Exporter using PrimeFaces 5.3

With JSF version 2.1.20 an error occurs and the project does not start:
Apr 26, 2016 5:29:21 PM org.apache.catalina.loader.WebappClassLoaderBase checkThreadLocalMapForLeaks SEVERE: The web application [] created a ThreadLocal with key of type [javax.faces.context.FacesContext$1] (value [javax.faces.context.FacesContext$1@1e0429c3]) and a value of type [org.apache.myfaces.context.servlet.StartupFacesContextImpl] (value [org.apache.myfaces.context.servlet.StartupFacesContextImpl@2351d09a]) but failed to remove it when the web application was stopped. Threads are going to be renewed over time to try and avoid a probable memory leak
With JSF version 2.1.28 the project starts, but using the Lapis JSF Exporter the following error occurs:
java.lang.UnsupportedOperationException
Source: https://github.com/rdicroce/jsfexporter
After checking, I found that my pom.xml was using the MyFaces api and impl together with the com.sun.faces JSF api and impl, which conflict; after removing the JSF (com.sun.faces) dependencies it worked.
My pom.xml:
<dependency>
    <groupId>org.apache.myfaces.core</groupId>
    <artifactId>myfaces-api</artifactId>
    <version>${myfaces.version}</version>
</dependency>
<dependency>
    <groupId>org.apache.myfaces.core</groupId>
    <artifactId>myfaces-impl</artifactId>
    <version>${myfaces.version}</version>
</dependency>
Comment out or remove:
<dependency>
    <groupId>com.sun.faces</groupId>
    <artifactId>jsf-api</artifactId>
    <version>2.1.28</version>
</dependency>
<dependency>
    <groupId>com.sun.faces</groupId>
    <artifactId>jsf-impl</artifactId>
    <version>2.1.28</version>
</dependency>

Apache Spark: Streaming without HDFS checkpoint

I'm implementing a Spark job which makes use of reduceByKeyAndWindow, so I need to add checkpointing.
From Spark's website I see that:
Checkpointing can be enabled by setting a directory in a fault-tolerant, reliable file system (e.g., HDFS, S3, etc.) to which the checkpoint information will be saved.
My application is just for academic purposes, so I don't want to set up HDFS for checkpointing, just a local file. Doing so on macOS works fine (setting a temporary dir as the checkpoint dir); the problem comes when doing it on Windows, which throws a permissions exception.
I already tried starting Eclipse as administrator and creating the directory manually, setting setWritable, setReadable and setExecutable to true. Any hint on how to overcome the problem on Windows?
Thanks!
Update: Here's my code and the exception. Just to clarify again, it works fine on Mac but not on Windows.
SparkConf conf = new SparkConf().setAppName("testApp").setMaster("local[2]");
JavaSparkContext ctx = new JavaSparkContext(conf);
JavaStreamingContext jsc = new JavaStreamingContext(ctx, new Duration(1000));
// Guava's Files.createTempDir() supplies a temporary local checkpoint directory
jsc.checkpoint(Files.createTempDir().getAbsolutePath());
Exception:
Exception in thread "pool-7-thread-3" java.lang.NullPointerException
at java.lang.ProcessBuilder.start(Unknown Source)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:404)
at org.apache.hadoop.util.Shell.run(Shell.java:379)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:678)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:661)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:639)
at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:905)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:886)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:783)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:772)
at org.apache.spark.streaming.CheckpointWriter$CheckpointWriteHandler.run(Checkpoint.scala:135)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Solved by adding the latest Hadoop libraries to my project.
If using Maven, the following set of dependencies does the trick.
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-twitter_2.11</artifactId>
        <version>1.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>1.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.6.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.6.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.6.0</version>
    </dependency>
</dependencies>
On Windows, you can solve this problem as follows:
1. Download winutils.exe to a folder, say MY_UTILS/bin.
2. Create an environment variable HADOOP_HOME and point it to MY_UTILS.
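If you prefer not to set the environment variable, the same hint can be given programmatically before the streaming context is created; a sketch reusing the code from the question, where the path is an assumption standing in for wherever MY_UTILS actually lives:
// Point Hadoop at the folder containing bin\winutils.exe (placeholder path);
// Hadoop's Shell utility reads the hadoop.home.dir system property.
System.setProperty("hadoop.home.dir", "C:\\MY_UTILS");

SparkConf conf = new SparkConf().setAppName("testApp").setMaster("local[2]");
JavaSparkContext ctx = new JavaSparkContext(conf);
JavaStreamingContext jsc = new JavaStreamingContext(ctx, new Duration(1000));
jsc.checkpoint(Files.createTempDir().getAbsolutePath());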

java.lang.NoClassDefFoundError: org/atmosphere/cpr/AsyncSupportListenerAdapter

I'm trying to push a message from the server to the client this way:
PushContext pushContext = PushContextFactory.getDefault().getPushContext();
pushContext.push("/registrationEvent", "There was another registration");
My problem is that I get the following error:
java.lang.NoClassDefFoundError: Could not initialize class org.primefaces.push.PushContextFactory
But I think this is due to an issue at the initialization of the project:
java.lang.NoClassDefFoundError: org/atmosphere/cpr/AsyncSupportListenerAdapter
I've tried to add the Atmosphere jar file... without success. Have I done something wrong? I'm using GlassFish 3.1.
Thanks!
The PrimeFaces migration guide states: "PrimeFaces Push is reimplemented, PushContext is deprecated, use EventBus instead along with the new Push API."
In this case, in pom.xml, use Atmosphere version 2.2.1; you are probably using an old Atmosphere version. If you are using PrimeFaces 5.0, use the dependencies below:
<dependency>
    <groupId>org.primefaces</groupId>
    <artifactId>primefaces</artifactId>
    <version>5.0</version>
</dependency>
<dependency>
    <groupId>org.primefaces.extensions</groupId>
    <artifactId>primefaces-extensions</artifactId>
    <version>2.1.0</version>
</dependency>
<dependency>
    <groupId>org.atmosphere</groupId>
    <artifactId>atmosphere-runtime</artifactId>
    <version>2.2.1</version>
</dependency>
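And a minimal sketch of the EventBus call that replaces the deprecated PushContext, assuming PrimeFaces 5.0 with atmosphere-runtime 2.2.1 on the classpath; the channel and message are taken from the question, and the class name is only an example:
import org.primefaces.push.EventBus;
import org.primefaces.push.EventBusFactory;

public class RegistrationNotifier {

    // Publish on the same channel the question used; clients subscribed to
    // that channel (e.g. via <p:socket channel="/registrationEvent">) receive it.
    public void notifyRegistration() {
        EventBus eventBus = EventBusFactory.getDefault().eventBus();
        eventBus.publish("/registrationEvent", "There was another registration");
    }
}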
