Migration to Java 11: how to resolve java.lang.ClassNotFoundException: javax.activation.MimeTypeParseException (JAXB)

I'm migrating my Java project from version 8 to version 11.
Since javax.activation has been removed from JDK 11, I added it to my Maven pom.xml:
<dependency>
    <groupId>javax.activation</groupId>
    <artifactId>activation</artifactId>
    <version>1.1.1</version>
</dependency>
The dependency is listed under "External Libraries", but every time I run "mvn clean install" to build my project, I get this error:
...
Caused by: java.lang.ClassNotFoundException: javax.activation.MimeTypeParseException
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:271)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:247)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:239)
... 51 more
Do you have any idea, please?
Thanks
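One hedged possibility, given that the stack trace goes through plexus-classworlds (i.e. the class is missing from a Maven plugin's own classloader, not from the project's classpath): declare the activation artifact as a dependency of whichever plugin fails, in addition to the project-level dependency. The plugin coordinates below are placeholders for the plugin named in the full error output:
<plugin>
    <!-- placeholder coordinates: use the plugin that actually fails in your build -->
    <groupId>some.plugin.group</groupId>
    <artifactId>some-failing-plugin</artifactId>
    <version>x.y.z</version>
    <dependencies>
        <!-- makes javax.activation.MimeTypeParseException visible to the plugin's class realm -->
        <dependency>
            <groupId>javax.activation</groupId>
            <artifactId>javax.activation-api</artifactId>
            <version>1.2.0</version>
        </dependency>
    </dependencies>
</plugin>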

Related

JHipster with JDK 11

I created a project with JHipster 6.0.0-beta.0 and Java version "11.0.2" 2019-01-15 LTS,
and I get the following error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:compile (default-compile) on project mx: Fatal error compiling: java.lang.NoClassDefFoundError: javax/xml/bind/JAXBException: javax.xml.bind.JAXBException -> [Help 1]
Any suggestions on how to solve it?
The solution is:
Add the following under <!-- For JPA static metamodel generation -->:
<path>
    <groupId>org.glassfish.jaxb</groupId>
    <artifactId>jaxb-runtime</artifactId>
    <version>${jaxb-runtime.version}</version>
</path>
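For context, a sketch of where that <path> entry typically sits in a JHipster-generated pom.xml: inside the annotationProcessorPaths of the maven-compiler-plugin, next to the JPA metamodel processor. The hibernate-jpamodelgen entry and the property names here are illustrative, not taken from the question:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.8.0</version>
    <configuration>
        <annotationProcessorPaths>
            <!-- For JPA static metamodel generation -->
            <path>
                <groupId>org.hibernate</groupId>
                <artifactId>hibernate-jpamodelgen</artifactId>
                <version>${hibernate.version}</version>
            </path>
            <!-- provides the JAXB classes the annotation processors need on JDK 11 -->
            <path>
                <groupId>org.glassfish.jaxb</groupId>
                <artifactId>jaxb-runtime</artifactId>
                <version>${jaxb-runtime.version}</version>
            </path>
        </annotationProcessorPaths>
    </configuration>
</plugin>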
In your pom.xml, what is the value of the Java version tag?

JDK 11 - package javax.xml.bind.annotation is declared in the unnamed module, but module javax.xml.bind.annotation does not read it

I read all the other topics (they may work for JDK 10) and I still have a problem running javax.xml.bind on JDK 11.
My dependencies:
<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.4.0-b180830.0359</version> <!-- 2.2.12, 2.3.1 -->
</dependency>
<dependency>
    <groupId>org.glassfish.jaxb</groupId>
    <artifactId>jaxb-runtime</artifactId>
    <version>2.4.0-b180830.0438</version> <!-- 2.3.1 -->
    <!--<scope>compile</scope>-->
    <!--<scope>runtime</scope>-->
</dependency>
<dependency>
    <groupId>javax.activation</groupId>
    <artifactId>javax.activation-api</artifactId>
    <version>1.2.0</version>
</dependency>
<dependency>
    <groupId>javax.annotation</groupId>
    <artifactId>javax.annotation-api</artifactId>
    <version>1.3.2</version>
</dependency>
My IDE (the newest IntelliJ) still doesn't see the import:
import javax.xml.bind.annotation.XmlAccessType;
Also, my module-info.java doesn't see:
requires java.xml.bind;
And when I use Maven I get an error:
[ERROR] ...ExceptionType.java:[6,22] package javax.xml.bind.annotation is not visible
(package javax.xml.bind.annotation is declared in the unnamed module, but module javax.xml.bind.annotation does not read it)
Has anybody run it on JDK 11?
Edit:
1) App version with the Java module system:
I have a similar problem to this one:
https://youtrack.jetbrains.com/issue/IDEA-197956
but my error message is a little different.
2) App version without the Java module system:
Maven compiles correctly. The problem is that IntelliJ doesn't see my imports:
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.Marshaller;
import javax.xml.bind.Unmarshaller;
and says "cannot resolve symbol".
Also, under File > Project Structure > Modules > Dependencies, IntelliJ didn't add javax.xml.bind and org.glassfish.jaxb.
Why?
I added this to the pom:
<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.4.0-b180830.0359</version> <!-- 2.2.8, 2.4.0-b180830.0359 -->
</dependency>
<dependency>
    <groupId>org.glassfish.jaxb</groupId>
    <artifactId>jaxb-runtime</artifactId>
    <version>2.4.0-b180830.0438</version> <!-- 2.3.0, 2.4.0-b180830.0438 -->
</dependency>
JDK: OpenJDK 11
IntelliJ version: IntelliJ IDEA 2018.2.4 (Ultimate Edition), Build #IU-182.4505.22, built on September 18, 2018, JRE: 1.8.0_152-release-1248-b8 amd64
JVM: OpenJDK 64-Bit Server VM by JetBrains s.r.o.
Maven: 3.5.4
It seems to be an IntelliJ bug.
Switching back to JDK 1.8 fixed this issue for me. How to switch in Ubuntu: https://askubuntu.com/questions/740757/switch-between-multiple-java-versions
We managed to find the cause of the problem. In IntelliJ I have a multi-module Maven project and I use the JLupin Platform Development Tool, so in the *.iml files I don't have the line:
<module org.jetbrains.idea.maven.project.MavenProjectsManager.isMavenModule="true" type="JAVA_MODULE" version="4">
but instead have, for example:
<module org.jetbrains.idea.maven.project.MavenProjectsManager.isMavenModule="true" type="JLP_NATIVE_MICROSERVICE_IMPLEMENTATION_MODULE_TYPE" version="4">
During development the names of the modules were changed, and the IDE asked whether to remove the modules (because it did not recognize
type="JLP_NATIVE_MICROSERVICE_IMPLEMENTATION_MODULE_TYPE").
Although the chosen option was "No", the IDE then automatically:
marked the Maven module as ignored, and
replaced the line in the .iml file from:
<module org.jetbrains.idea.maven.project.MavenProjectsManager.isMavenModule="true" type="JLP_NATIVE_MICROSERVICE_IMPLEMENTATION_MODULE_TYPE" version="4">
to:
<module type="JLP_NATIVE_MICROSERVICE_IMPLEMENTATION_MODULE_TYPE" version="4">
For the above reason, the project did not correctly read the Maven artifacts. After unifying the names and marking the project as not ignored, it works correctly.
The problem is not related to Java packages.
EDIT: it's resolved as above.

Maven dependency error in Spark 2.3

I am building Spark 2.3 Scala code using Maven and getting the following error:
error: missing or invalid dependency detected while loading class file SparkSession.class.
This is a snippet of the pom file; please advise:
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
You might want to check your Java and Scala versions. They should be 1.6 or higher and 2.11 respectively. It could also be a mismatch with other dependencies such as spark-sql. Make sure you use the same version across all Spark dependencies, as sketched below.
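A minimal sketch of that advice, with assumed versions (Spark 2.3.0, Scala 2.11 suffix): keep the artifact suffix and the Spark version identical across all Spark modules, for example spark-core and spark-sql:
<!-- both artifacts share the _2.11 suffix and the same Spark version (2.3.0 assumed for illustration) -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.3.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.3.0</version>
</dependency>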

Cucumber Extent reporting not working

I'm not using Maven and downloaded the cucumber-extentsreport 3.0.1 jar file.
I also added
plugin = "com.cucumber.listener.ExtentCucumberFormatter:output/report.html"
in the Cucumber test runner file,
but it shows:
cucumber.runtime.CucumberException: java.lang.NoClassDefFoundError: com/aventstack/extentreports/reporter/ExtentHtmlReporter
at cucumber.runtime.formatter.PluginFactory.instantiate(PluginFactory.java:114)
at cucumber.runtime.formatter.PluginFactory.create(PluginFactory.java:87)
at cucumber.runtime.RuntimeOptions.getPlugins(RuntimeOptions.java:245)
at cucumber.runtime.RuntimeOptions$1.invoke(RuntimeOptions.java:291)
at com.sun.proxy.$Proxy9.done(Unknown Source)
at cucumber.runtime.junit.JUnitReporter.done(JUnitReporter.java:227)
at cucumber.api.junit.Cucumber.run(Cucumber.java:101)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:86)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:459)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:678)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:382)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:192)
Caused by: java.lang.NoClassDefFoundError: com/aventstack/extentreports/reporter/ExtentHtmlReporter
at com.cucumber.listener.ExtentCucumberFormatter.setExtentHtmlReport(ExtentCucumberFormatter.java:61)
at com.cucumber.listener.ExtentCucumberFormatter.<init>(ExtentCucumberFormatter.java:34)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at cucumber.runtime.formatter.PluginFactory.instantiate(PluginFactory.java:107)
... 12 more
From the error, it looks like you are missing the extentreports dependency. Add this to your pom.xml:
<!-- pom.xml -->
<dependency>
    <groupId>com.aventstack</groupId>
    <artifactId>extentreports</artifactId>
    <version>3.1.1</version>
</dependency>
Here are the dependencies of the plugin: https://github.com/email2vimalraj/CucumberExtentReporter/blob/master/pom.xml
I just figured it out. I was also getting this error again and again, and then I saw that among the Maven dependencies in the pom.xml file this dependency had the tag
<scope>test</scope>
Removing this line works (see the sketch below).
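For illustration only, assuming it was the cucumber-extentsreport entry that carried the scope (the version is assumed, not stated in this answer), the dependency would end up as a plain default-scope entry:
<dependency>
    <groupId>com.vimalselvam</groupId>
    <artifactId>cucumber-extentsreport</artifactId>
    <version>3.0.1</version>
    <!-- the <scope>test</scope> line was removed here -->
</dependency>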
Do the following for this issue:
Add the dependency (version 4.0.9):
<dependency>
    <groupId>com.aventstack</groupId>
    <artifactId>extentreports</artifactId>
    <version>4.0.9</version>
</dependency>
And these others:
<dependency>
    <groupId>com.aventstack</groupId>
    <artifactId>extentreports-cucumber3-adapter</artifactId>
    <version>1.0.2</version>
</dependency>
<dependency>
    <groupId>com.vimalselvam</groupId>
    <artifactId>cucumber-extentsreport</artifactId>
    <version>3.1.1</version>
</dependency>
Works fine for me.

Exception in thread "main" java.lang.IllegalStateException: Library directory '/Users/dbl/spark/lib_managed/jars' does not exist

I built Spark 1.6 SNAPSHOT from source with no issues:
$ mvn3 clean package -DskipTests
I'm running:
OS X 10.10.5.
Java 1.8
Maven 3.3.3
Spark 1.6 SNAPSHOT
Scala 2.11.7
Zinc 0.3.5.3
Hadoop 3.0 SNAPSHOT
I added the following dependency to my pom.xml (to try to resolve the warning about native libraries):
<dependency>
    <groupId>com.googlecode.netlib-java</groupId>
    <artifactId>netlib</artifactId>
    <version>1.1</version>
</dependency>
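As an aside, and only as an untested assumption: for the native BLAS/LAPACK warning, the coordinates the netlib-java project itself documents are the pom-packaged com.github.fommil.netlib:all rather than com.googlecode.netlib-java:netlib, for example:
<!-- assumed alternative for native linear-algebra support; coordinates per the netlib-java docs -->
<dependency>
    <groupId>com.github.fommil.netlib</groupId>
    <artifactId>all</artifactId>
    <version>1.1.2</version>
    <type>pom</type>
</dependency>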
Environment variables:
HADOOP_INSTALL=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT
HADOOP_CONF_DIR=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/etc/hadoop
HADOOP_OPTS=-Djava.library.path=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
CLASSPATH=/users/davidlaxer/trunk/core/src/test/java/:/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-dist-3.0.0-SNAPSHOT.jar:/Users/davidlaxer/clojure/target:/Users/davidlaxer/hadoop/lib/native:
SPARK_LIBRARY_PATH=/Users/davidlaxer/hadoop/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/lib/native
When I try to launch Spark with spark-shell, I get the following error:
./spark-shell
Exception in thread "main" java.lang.IllegalStateException: Library directory '/Users/davidlaxer/spark/lib_managed/jars' does not exist.
at org.apache.spark.launcher.CommandBuilderUtils.checkState(CommandBuilderUtils.java:249)
at org.apache.spark.launcher.AbstractCommandBuilder.buildClassPath(AbstractCommandBuilder.java:227)
at org.apache.spark.launcher.AbstractCommandBuilder.buildJavaCommand(AbstractCommandBuilder.java:115)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildSparkSubmitCommand(SparkSubmitCommandBuilder.java:196)
at org.apache.spark.launcher.SparkSubmitCommandBuilder.buildCommand(SparkSubmitCommandBuilder.java:121)
at org.apache.spark.launcher.Main.main(Main.java:86)
I reverted to Spark 1.5 and didn't have the problem:
git clone git://github.com/apache/spark.git -b branch-1.5
