I'm having trouble with log4j and hbm2ddl.
When I put an SMTPAppender in my log4j.xml I get a ClassNotFoundException.
Any hints on how to solve this?
These are my config files and the stack trace.
Stack trace:
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Building Unnamed - Mail-logging-and-hbm2ddl:Mail-logging-and-hbm2ddl:jar:1.0
[INFO] task-segment: [package]
[INFO] ------------------------------------------------------------------------
[INFO] [resources:resources {execution: default-resources}]
[WARNING] Using platform encoding (windows-1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 3 resources
[INFO] Copying 2 resources
[INFO] [compiler:compile {execution: default-compile}]
[INFO] Nothing to compile - all classes are up to date
[INFO] Preparing hibernate3:hbm2ddl
[WARNING] Removing: hbm2ddl from forked lifecycle, to prevent recursive invocation.
[INFO] [resources:resources {execution: default-resources}]
[WARNING] Using platform encoding (windows-1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 3 resources
[INFO] Copying 2 resources
[INFO] [hibernate3:hbm2ddl {execution: default}]
[INFO] Configuration XML file loaded: file:/D:/DEV/PROJECTS/Mail%20logging%20and%20hbm2ddl/src/main/resources/hibernate.cfg.xml
[FATAL ERROR] org.codehaus.mojo.hibernate3.exporter.Hbm2DDLExporterMojo#execute() caused a linkage error (java.lang.NoClassDefFoundError) and may be out-of-date. Check the realms:
[FATAL ERROR] Plugin realm = app0.child-container[org.codehaus.mojo:hibernate3-maven-plugin:2.2]
urls[0] = file:/d:/Settings/U190552/.m2/repository/org/codehaus/mojo/hibernate3-maven-plugin/2.2/hibernate3-maven-plugin-2.2.jar
urls[1] = file:/d:/Settings/U190552/.m2/repository/log4j/log4j/1.2.14/log4j-1.2.14.jar
urls[2] = file:/d:/Settings/U190552/.m2/repository/org/hibernate/hibernate-tools/3.2.3.GA/hibernate-tools-3.2.3.GA.jar
urls[3] = file:/d:/Settings/U190552/.m2/repository/org/beanshell/bsh/2.0b4/bsh-2.0b4.jar
urls[4] = file:/d:/Settings/U190552/.m2/repository/freemarker/freemarker/2.3.8/freemarker-2.3.8.jar
urls[5] = file:/d:/Settings/U190552/.m2/repository/org/hibernate/jtidy/r8-20060801/jtidy-r8-20060801.jar
urls[6] = file:/d:/Settings/U190552/.m2/repository/org/hibernate/hibernate-core/3.3.1.GA/hibernate-core-3.3.1.GA.jar
urls[7] = file:/d:/Settings/U190552/.m2/repository/antlr/antlr/2.7.6/antlr-2.7.6.jar
urls[8] = file:/d:/Settings/U190552/.m2/repository/commons-collections/commons-collections/3.1/commons-collections-3.1.jar
urls[9] = file:/d:/Settings/U190552/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar
urls[10] = file:/d:/Settings/U190552/.m2/repository/xml-apis/xml-apis/1.0.b2/xml-apis-1.0.b2.jar
urls[11] = file:/d:/Settings/U190552/.m2/repository/org/slf4j/slf4j-api/1.5.6/slf4j-api-1.5.6.jar
urls[12] = file:/d:/Settings/U190552/.m2/repository/org/codehaus/mojo/hibernate3/maven-hibernate3-api/2.2/maven-hibernate3-api-2.2.jar
urls[13] = file:/d:/Settings/U190552/.m2/repository/org/codehaus/plexus/plexus-utils/1.1/plexus-utils-1.1.jar
urls[14] = file:/d:/Settings/U190552/.m2/repository/org/apache/geronimo/specs/geronimo-jta_1.0.1B_spec/1.1.1/geronimo-jta_1.0.1B_spec-1.1.1.jar
urls[15] = file:/d:/Settings/U190552/.m2/repository/org/slf4j/slf4j-log4j12/1.5.6/slf4j-log4j12-1.5.6.jar
urls[16] = file:/d:/Settings/U190552/.m2/repository/org/slf4j/jcl-over-slf4j/1.5.6/jcl-over-slf4j-1.5.6.jar
urls[17] = file:/d:/Settings/U190552/.m2/repository/org/codehaus/mojo/hibernate3/maven-hibernate3-jdk14/2.2/maven-hibernate3-jdk14-2.2.jar
urls[18] = file:/d:/Settings/U190552/.m2/repository/org/codehaus/mojo/hibernate3/maven-hibernate3-jdk15/2.2/maven-hibernate3-jdk15-2.2.jar
urls[19] = file:/d:/Settings/U190552/.m2/repository/org/hibernate/hibernate-entitymanager/3.4.0.GA/hibernate-entitymanager-3.4.0.GA.jar
urls[20] = file:/d:/Settings/U190552/.m2/repository/org/hibernate/ejb3-persistence/1.0.2.GA/ejb3-persistence-1.0.2.GA.jar
urls[21] = file:/d:/Settings/U190552/.m2/repository/org/hibernate/hibernate-commons-annotations/3.1.0.GA/hibernate-commons-annotations-3.1.0.GA.jar
urls[22] = file:/d:/Settings/U190552/.m2/repository/org/hibernate/hibernate-annotations/3.4.0.GA/hibernate-annotations-3.4.0.GA.jar
urls[23] = file:/d:/Settings/U190552/.m2/repository/javax/transaction/jta/1.1/jta-1.1.jar
urls[24] = file:/d:/Settings/U190552/.m2/repository/javassist/javassist/3.4.GA/javassist-3.4.GA.jar
urls[25] = file:/d:/Settings/U190552/.m2/repository/jboss/jboss-common/4.0.2/jboss-common-4.0.2.jar
urls[26] = file:/d:/Settings/U190552/.m2/repository/slide/webdavlib/2.0/webdavlib-2.0.jar
urls[27] = file:/d:/Settings/U190552/.m2/repository/xerces/xercesImpl/2.6.2/xercesImpl-2.6.2.jar
[FATAL ERROR] Container realm = plexus.core
urls[0] = file:/D:/DEV/TOOLS/apache-maven-2.2.1/lib/maven-2.2.1-uber.jar
[INFO] ------------------------------------------------------------------------
[ERROR] FATAL ERROR
[INFO] ------------------------------------------------------------------------
[INFO] javax/mail/internet/AddressException
javax.mail.internet.AddressException
[INFO] ------------------------------------------------------------------------
[INFO] Trace
java.lang.NoClassDefFoundError: javax/mail/internet/AddressException
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2389)
at java.lang.Class.getConstructor0(Class.java:2699)
at java.lang.Class.newInstance0(Class.java:326)
at java.lang.Class.newInstance(Class.java:308)
at org.apache.log4j.xml.DOMConfigurator.parseAppender(DOMConfigurator.java:174)
at org.apache.log4j.xml.DOMConfigurator.findAppenderByName(DOMConfigurator.java:150)
at org.apache.log4j.xml.DOMConfigurator.findAppenderByReference(DOMConfigurator.java:163)
at org.apache.log4j.xml.DOMConfigurator.parseChildrenOfLoggerElement(DOMConfigurator.java:425)
at org.apache.log4j.xml.DOMConfigurator.parseRoot(DOMConfigurator.java:394)
at org.apache.log4j.xml.DOMConfigurator.parse(DOMConfigurator.java:829)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:712)
at org.apache.log4j.xml.DOMConfigurator.doConfigure(DOMConfigurator.java:618)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:470)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:122)
at org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:73)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:209)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:221)
at org.hibernate.cfg.Configuration.<clinit>(Configuration.java:151)
at org.codehaus.mojo.hibernate3.configuration.AnnotationComponentConfiguration.createConfiguration(AnnotationComponentConfiguration.java:93)
at org.codehaus.mojo.hibernate3.configuration.AbstractComponentConfiguration.getConfiguration(AbstractComponentConfiguration.java:51)
at org.codehaus.mojo.hibernate3.exporter.Hbm2DDLExporterMojo.doExecute(Hbm2DDLExporterMojo.java:87)
at org.codehaus.mojo.hibernate3.HibernateExporterMojo.execute(HibernateExporterMojo.java:152)
at org.apache.maven.plugin.DefaultPluginManager.executeMojo(DefaultPluginManager.java:490)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoals(DefaultLifecycleExecutor.java:694)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalWithLifecycle(DefaultLifecycleExecutor.java:556)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoal(DefaultLifecycleExecutor.java:535)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeGoalAndHandleFailures(DefaultLifecycleExecutor.java:387)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.executeTaskSegments(DefaultLifecycleExecutor.java:348)
at org.apache.maven.lifecycle.DefaultLifecycleExecutor.execute(DefaultLifecycleExecutor.java:180)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:328)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:138)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:362)
at org.apache.maven.cli.compat.CompatibleMain.main(CompatibleMain.java:60)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.codehaus.classworlds.Launcher.launchEnhanced(Launcher.java:315)
at org.codehaus.classworlds.Launcher.launch(Launcher.java:255)
at org.codehaus.classworlds.Launcher.mainWithExitCode(Launcher.java:430)
at org.codehaus.classworlds.Launcher.main(Launcher.java:375)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:115)
Caused by: java.lang.ClassNotFoundException: javax.mail.internet.AddressException
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at org.codehaus.classworlds.RealmClassLoader.loadClassDirect(RealmClassLoader.java:195)
at org.codehaus.classworlds.DefaultClassRealm.loadClass(DefaultClassRealm.java:255)
at org.codehaus.classworlds.DefaultClassRealm.loadClass(DefaultClassRealm.java:274)
at org.codehaus.classworlds.RealmClassLoader.loadClass(RealmClassLoader.java:214)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
... 47 more
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2 seconds
[INFO] Finished at: Fri Dec 31 11:42:20 CET 2010
[INFO] Final Memory: 10M/24M
[INFO] ------------------------------------------------------------------------
log4j.xml
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
<appender name="email" class="org.apache.log4j.net.SMTPAppender">
<param name="Threshold" value="error" />
<param name="BufferSize" value="10" />
<param name="SMTPHost" value="smtp.host" />
<param name="From" value="site#domain.com" />
<param name="To" value="CDB#mail" />
<param name="Subject" value="[Site] Error - TST" />
<param name="LocationInfo" value="false" />
<layout class="org.apache.log4j.HTMLLayout">
<param name="LocationInfo" value="false" />
</layout>
</appender>
<root>
<priority value="DEBUG" />
<appender-ref ref="email" />
</root>
</log4j:configuration>
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Mail-logging-and-hbm2ddl</groupId>
<artifactId>Mail-logging-and-hbm2ddl</artifactId>
<version>1.0</version>
<dependencies>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate</artifactId>
<version>3.2.6.ga</version>
<exclusions>
<!-- We need a higher version of ehcache -->
<exclusion>
<groupId>net.sf.ehcache</groupId>
<artifactId>ehcache</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
<version>1.4</version>
</dependency>
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>2.5.4</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-aop</artifactId>
<version>2.5.4</version>
</dependency>
</dependencies>
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>false</filtering>
</resource>
<resource>
<directory>src/main/resources-${targetprofile}</directory>
<filtering>false</filtering>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.0.2</version>
<configuration>
<source>${javaVersion}</source>
<target>${javaVersion}</target>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>hibernate3-maven-plugin</artifactId>
<version>2.2</version>
<executions>
<execution>
<phase>process-classes</phase>
<goals>
<goal>hbm2ddl</goal>
</goals>
</execution>
</executions>
<configuration>
<componentProperties>
<propertyfile>
src/main/resources-${targetprofile}/configuration.properties
</propertyfile>
<export>false</export>
<drop>true</drop>
<outputfilename>
${project.artifactId}-${project.version}-schema.sql
</outputfilename>
</componentProperties>
</configuration>
</plugin>
</plugins>
</build>
<properties>
<javaVersion>1.6</javaVersion>
</properties>
</project>
This other question gave me the answer.
So the answer is to add the javax.mail dependency to the plugin as follows:
<build>
....
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>hibernate3-maven-plugin</artifactId>
<version>2.2</version>
<executions>
<execution>
<phase>process-classes</phase>
<goals>
<goal>hbm2ddl</goal>
</goals>
</execution>
</executions>
<configuration>
<componentProperties>
<propertyfile>
src/main/resources-${targetprofile}/configuration.properties
</propertyfile>
<export>false</export>
<drop>true</drop>
<outputfilename>
${project.artifactId}-${project.version}-schema.sql
</outputfilename>
</componentProperties>
</configuration>
<dependencies>
<dependency>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
<version>1.4.3</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
From the error message as well as the debug log, it appears that the javax.mail dependency is not among the dependencies present when hbm2ddl is run. Since the contents above are poorly formatted and possibly incomplete, it is difficult to say why. One possibility is that the javax.mail dependency is not included; another is that it is included but with the wrong (say, runtime) scope. Keep in mind that a Maven plugin runs in its own classloader realm, so a dependency declared in the project's <dependencies> section is not automatically visible to the hibernate3-maven-plugin; it has to be listed in the plugin's own <dependencies> block, as in the answer above.
You could try running the goal with the SMTPAppender removed from log4j.xml to see if it works. This will help narrow down the problem; a minimal console-only configuration for that test is sketched below.
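For that test, here is a minimal log4j.xml sketch that logs to the console only, so no JavaMail classes are needed. The appender and layout classes are standard log4j 1.2; the conversion pattern is just an example:
<?xml version="1.0" encoding="UTF-8"?>
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/">
  <!-- ConsoleAppender needs nothing beyond log4j itself, unlike SMTPAppender -->
  <appender name="console" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="%d{ISO8601} %-5p %c - %m%n" />
    </layout>
  </appender>
  <root>
    <priority value="DEBUG" />
    <appender-ref ref="console" />
  </root>
</log4j:configuration>
If the hbm2ddl goal succeeds with this configuration, the SMTPAppender (and thus the missing JavaMail jar on the plugin classpath) is confirmed as the culprit.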
Related
I have a test that runs on the Spock framework. I am trying to set up Allure reports with it. I don't see an example for Spock integration here: https://github.com/allure-examples. So I took the JUnit 5 Maven-based example, https://github.com/allure-examples/allure-junit5-maven, and am trying to set it up. I modified the dependency from
<dependency>
<groupId>io.qameta.allure</groupId>
<artifactId>allure-junit5</artifactId>
<version>${allure.version}</version>
</dependency>
to
<dependency>
<groupId>io.qameta.allure</groupId>
<artifactId>allure-spock</artifactId>
<version>2.13.10</version>
</dependency>
since I am using Spock here to run the tests.
Below is the pom I am using:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>sample</groupId>
<version>0.0.1-SNAPSHOT</version>
<artifactId>sample</artifactId>
<name>sample-test</name>
<packaging>jar</packaging>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
<spock.version>2.0-M5-groovy-3.0</spock.version>
<dbunit.version>2.5.1</dbunit.version>
<hamcrest.version>1.3</hamcrest.version>
<geb.version>0.13.1</geb.version>
<selenium.version>2.51.0</selenium.version>
<groovy.version>3.0.8</groovy.version>
</properties>
<repositories>
<!--other repositories if any-->
<repository>
<id>project.local</id>
<name>project</name>
<url>file:${project.basedir}/../repo</url>
</repository>
</repositories>
<dependencies>
<!--mandatory for the groovy CLI scripts -->
<dependency>
<groupId>org.mod4j.org.apache.commons</groupId>
<artifactId>cli</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>org.codehaus.groovy</groupId>
<artifactId>groovy-all</artifactId>
<version>${groovy.version}</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>com.github.jankroken</groupId>
<artifactId>commandline</artifactId>
<version>1.7.0</version>
</dependency>
<!-- Mandatory dependencies for using Spock -->
<dependency>
<groupId>org.spockframework</groupId>
<artifactId>spock-core</artifactId>
<version>${spock.version}</version>
<scope>test</scope>
</dependency>
<!-- Mandatory dependencies for tests with DB interaction -->
<dependency>
<groupId>org.dbunit</groupId>
<artifactId>dbunit</artifactId>
<version>${dbunit.version}</version>
<!-- not scoped for test since Import/export use this library -->
</dependency>
<dependency>
<groupId>com.oracle</groupId>
<artifactId>ojdbc7</artifactId>
<version>12.1.0.1</version>
</dependency>
<dependency>
<groupId>com.oracle</groupId>
<artifactId>xdb6</artifactId>
<version>12.1.0.1</version>
</dependency>
<dependency>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId>
<version>1.4</version>
</dependency>
<dependency>
<groupId>org.jdom</groupId>
<artifactId>jdom</artifactId>
<version>1.1</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.16</version>
</dependency>
<dependency>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
<version>1.2</version>
</dependency>
<dependency>
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
<version>3.2.2</version>
</dependency>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.6</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>3.3.1</version>
</dependency>
<!-- JSON serialization/de-serialization library needed for JSONDataSet-->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.7.3</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.7.3</version>
</dependency>
<!-- h2databse library used to query CSV files with SQL -->
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.4.191</version>
</dependency>
<!-- Geb testing support -->
<dependency>
<groupId>org.gebish</groupId>
<artifactId>geb-spock</artifactId>
<version>${geb.version}</version>
<scope>test</scope>
</dependency>
<!-- httpcomponents upgrade to fix error in HTMLUnit-->
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5</version>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-support</artifactId>
<version>${selenium.version}</version>
<scope>test</scope>
</dependency>
<!-- Selenium Web Driver Manager-->
<dependency>
<groupId>io.github.bonigarcia</groupId>
<artifactId>webdrivermanager</artifactId>
<version>1.4.5</version>
</dependency>
<!-- Hadoop dependencies -->
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.7.1</version>
</dependency>
<!-- Oozie dependencies -->
<dependency>
<groupId>org.apache.oozie</groupId>
<artifactId>oozie-client</artifactId>
<version>4.2.0</version>
</dependency>
<!-- Hive dependencies -->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>2.1.0</version>
<exclusions>
<exclusion>
<artifactId>jdk.tools</artifactId>
<groupId>jdk.tools</groupId>
</exclusion>
</exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.maven.plugins/maven-surefire-report-plugin -->
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-report-plugin</artifactId>
<version>3.0.0-M5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.maven.plugins/maven-site-plugin -->
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-site-plugin</artifactId>
<version>3.9.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/io.qameta.allure/allure-spock -->
<dependency>
<groupId>io.qameta.allure</groupId>
<artifactId>allure-spock</artifactId>
<version>2.13.10</version>
</dependency>
<dependency>
<groupId>org.junit.platform</groupId>
<artifactId>junit-platform-surefire-provider</artifactId>
<version>1.3.2</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.30</version>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-api</artifactId>
<version>5.8.0-M1</version>
</dependency>
<dependency>
<groupId>org.junit.jupiter</groupId>
<artifactId>junit-jupiter-engine</artifactId>
<version>5.8.0-M1</version>
</dependency>
</dependencies>
<build>
<sourceDirectory>${project.basedir}/src/main/groovy</sourceDirectory>
<testSourceDirectory>${project.basedir}/src/test/groovy</testSourceDirectory>
<resources>
<resource>
<directory>${project.basedir}/src/main/resources</directory>
</resource>
</resources>
<plugins>
<!-- Mandatory plugins for using Spock -->
<plugin>
<!-- The gmavenplus plugin is used to compile Groovy code. To learn more about this plugin,
visit https://github.com/groovy/GMavenPlus/wiki -->
<groupId>org.codehaus.gmavenplus</groupId>
<artifactId>gmavenplus-plugin</artifactId>
<version>1.12.1</version>
<executions>
<execution>
<goals>
<goal>addSources</goal>
<goal>addTestSources</goal>
<goal>compile</goal>
<goal>compileTests</goal>
</goals>
</execution>
</executions>
</plugin>
<!-- Optional plugins for using Spock -->
<!-- Only required if names of spec classes don't match default Surefire patterns (`*Test` etc.) -->
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M5</version>
<configuration>
<useFile>false</useFile>
<argLine>
-Dfile.encoding=UTF-8
-javaagent:"${settings.localRepository}/org/aspectj/aspectjweaver/1.9.6/aspectjweaver-1.9.6.jar"
</argLine>
<includes>
<include>**/*Spec.groovy</include>
<include>**/*Spec.java</include>
<include>**/*Test.groovy</include>
<include>**/*Test.java</include>
</includes>
<systemPropertyVariables>
<geb.build.reportsDir>target/test-reports/geb</geb.build.reportsDir>
<allure.results.directory>${project.build.directory}/allure-results</allure.results.directory>
<junit.jupiter.extensions.autodetection.enabled>true</junit.jupiter.extensions.autodetection.enabled>
</systemPropertyVariables>
</configuration>
<dependencies>
<dependency>
<groupId>org.junit.platform</groupId>
<artifactId>junit-platform-surefire-provider</artifactId>
<version>1.3.2</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>1.9.6</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>io.qameta.allure</groupId>
<artifactId>allure-maven</artifactId>
<version>2.10.0</version>
<configuration>
<reportVersion>2.13.10</reportVersion>
<resultsDirectory>${project.build.directory}/allure-results</resultsDirectory>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-site-plugin</artifactId>
<version>3.9.1</version>
</plugin>
</plugins>
</build>
<reporting>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-report-plugin</artifactId>
<version>3.0.0-M5</version>
</plugin>
</plugins>
</reporting>
</project>
I am running a specific test class using this command:
mvn -f pom.xml test -Dtest=CalcSpec -Dmaven.test.failure.ignore surefire-report:report
But I am getting this error while running it:
[WARNING] Error injecting: org.apache.maven.plugin.surefire.SurefirePlugin
java.lang.NoClassDefFoundError: org/apache/maven/surefire/api/testset/TestSetFailedException
at java.lang.Class.getDeclaredConstructors0 (Native Method)
at java.lang.Class.privateGetDeclaredConstructors (Class.java:2671)
at java.lang.Class.getDeclaredConstructors (Class.java:2020)
at com.google.inject.spi.InjectionPoint.forConstructorOf (InjectionPoint.java:245)
at com.google.inject.internal.ConstructorBindingImpl.create (ConstructorBindingImpl.java:115)
at com.google.inject.internal.InjectorImpl.createUninitializedBinding (InjectorImpl.java:706)
at com.google.inject.internal.InjectorImpl.createJustInTimeBinding (InjectorImpl.java:930)
at com.google.inject.internal.InjectorImpl.createJustInTimeBindingRecursive (InjectorImpl.java:852)
at com.google.inject.internal.InjectorImpl.getJustInTimeBinding (InjectorImpl.java:291)
at com.google.inject.internal.InjectorImpl.getBindingOrThrow (InjectorImpl.java:222)
at com.google.inject.internal.InjectorImpl.getProviderOrThrow (InjectorImpl.java:1040)
at com.google.inject.internal.InjectorImpl.getProvider (InjectorImpl.java:1071)
at com.google.inject.internal.InjectorImpl.getProvider (InjectorImpl.java:1034)
at com.google.inject.internal.InjectorImpl.getInstance (InjectorImpl.java:1086)
at org.eclipse.sisu.space.AbstractDeferredClass.get (AbstractDeferredClass.java:48)
at com.google.inject.internal.ProviderInternalFactory.provision (ProviderInternalFactory.java:85)
at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision (InternalFactoryToInitializableAdapter.java:57)
at com.google.inject.internal.ProviderInternalFactory$1.call (ProviderInternalFactory.java:66)
at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision (ProvisionListenerStackCallback.java:112)
at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision (ProvisionListenerStackCallback.java:127)
at com.google.inject.internal.ProvisionListenerStackCallback.provision (ProvisionListenerStackCallback.java:66)
at com.google.inject.internal.ProviderInternalFactory.circularGet (ProviderInternalFactory.java:61)
at com.google.inject.internal.InternalFactoryToInitializableAdapter.get (InternalFactoryToInitializableAdapter.java:47)
at com.google.inject.internal.InjectorImpl$1.get (InjectorImpl.java:1050)
at org.eclipse.sisu.inject.Guice4$1.get (Guice4.java:162)
at org.eclipse.sisu.inject.LazyBeanEntry.getValue (LazyBeanEntry.java:81)
at org.eclipse.sisu.plexus.LazyPlexusBean.getValue (LazyPlexusBean.java:51)
at org.codehaus.plexus.DefaultPlexusContainer.lookup (DefaultPlexusContainer.java:263)
at org.codehaus.plexus.DefaultPlexusContainer.lookup (DefaultPlexusContainer.java:255)
at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getConfiguredMojo (DefaultMavenPluginManager.java:520)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:124)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: java.lang.ClassNotFoundException: org.apache.maven.surefire.api.testset.TestSetFailedException
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass (SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass (ClassRealm.java:271)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass (ClassRealm.java:247)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass (ClassRealm.java:239)
at java.lang.Class.getDeclaredConstructors0 (Native Method)
at java.lang.Class.privateGetDeclaredConstructors (Class.java:2671)
at java.lang.Class.getDeclaredConstructors (Class.java:2020)
at com.google.inject.spi.InjectionPoint.forConstructorOf (InjectionPoint.java:245)
at com.google.inject.internal.ConstructorBindingImpl.create (ConstructorBindingImpl.java:115)
at com.google.inject.internal.InjectorImpl.createUninitializedBinding (InjectorImpl.java:706)
at com.google.inject.internal.InjectorImpl.createJustInTimeBinding (InjectorImpl.java:930)
at com.google.inject.internal.InjectorImpl.createJustInTimeBindingRecursive (InjectorImpl.java:852)
at com.google.inject.internal.InjectorImpl.getJustInTimeBinding (InjectorImpl.java:291)
at com.google.inject.internal.InjectorImpl.getBindingOrThrow (InjectorImpl.java:222)
at com.google.inject.internal.InjectorImpl.getProviderOrThrow (InjectorImpl.java:1040)
at com.google.inject.internal.InjectorImpl.getProvider (InjectorImpl.java:1071)
at com.google.inject.internal.InjectorImpl.getProvider (InjectorImpl.java:1034)
at com.google.inject.internal.InjectorImpl.getInstance (InjectorImpl.java:1086)
at org.eclipse.sisu.space.AbstractDeferredClass.get (AbstractDeferredClass.java:48)
at com.google.inject.internal.ProviderInternalFactory.provision (ProviderInternalFactory.java:85)
at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision (InternalFactoryToInitializableAdapter.java:57)
at com.google.inject.internal.ProviderInternalFactory$1.call (ProviderInternalFactory.java:66)
at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision (ProvisionListenerStackCallback.java:112)
at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision (ProvisionListenerStackCallback.java:127)
at com.google.inject.internal.ProvisionListenerStackCallback.provision (ProvisionListenerStackCallback.java:66)
at com.google.inject.internal.ProviderInternalFactory.circularGet (ProviderInternalFactory.java:61)
at com.google.inject.internal.InternalFactoryToInitializableAdapter.get (InternalFactoryToInitializableAdapter.java:47)
at com.google.inject.internal.InjectorImpl$1.get (InjectorImpl.java:1050)
at org.eclipse.sisu.inject.Guice4$1.get (Guice4.java:162)
at org.eclipse.sisu.inject.LazyBeanEntry.getValue (LazyBeanEntry.java:81)
at org.eclipse.sisu.plexus.LazyPlexusBean.getValue (LazyPlexusBean.java:51)
at org.codehaus.plexus.DefaultPlexusContainer.lookup (DefaultPlexusContainer.java:263)
at org.codehaus.plexus.DefaultPlexusContainer.lookup (DefaultPlexusContainer.java:255)
at org.apache.maven.plugin.internal.DefaultMavenPluginManager.getConfiguredMojo (DefaultMavenPluginManager.java:520)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:124)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 22.275 s
[INFO] Finished at: 2021-05-14T07:51:58-04:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test (default-test) on project pic-test: Execution default-test of goal org.apache.maven.plugins:maven-surefire-plu
gin:3.0.0-M5:test failed: A required class was missing while executing org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test: org/apache/maven/surefire/api/testset/TestSetFailedException
[ERROR] -----------------------------------------------------
[ERROR] realm = plugin>org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/C:/Users/tji/.m2/repository/org/apache/maven/plugins/maven-surefire-plugin/3.0.0-M5/maven-surefire-plugin-3.0.0-M5.jar
[ERROR] urls[1] = file:/C:/Users/tji/.m2/repository/org/junit/platform/junit-platform-surefire-provider/1.3.2/junit-platform-surefire-provider-1.3.2.jar
[ERROR] urls[2] = file:/C:/Users/tji/.m2/repository/org/apiguardian/apiguardian-api/1.0.0/apiguardian-api-1.0.0.jar
[ERROR] urls[3] = file:/C:/Users/tji/.m2/repository/org/junit/platform/junit-platform-launcher/1.3.2/junit-platform-launcher-1.3.2.jar
[ERROR] urls[4] = file:/C:/Users/tji/.m2/repository/org/junit/platform/junit-platform-engine/1.3.2/junit-platform-engine-1.3.2.jar
[ERROR] urls[5] = file:/C:/Users/tji/.m2/repository/org/junit/platform/junit-platform-commons/1.3.2/junit-platform-commons-1.3.2.jar
[ERROR] urls[6] = file:/C:/Users/tji/.m2/repository/org/opentest4j/opentest4j/1.1.1/opentest4j-1.1.1.jar
[ERROR] urls[7] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/surefire-api/2.22.0/surefire-api-2.22.0.jar
[ERROR] urls[8] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/surefire-logger-api/2.22.0/surefire-logger-api-2.22.0.jar
[ERROR] urls[9] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/common-java5/2.22.0/common-java5-2.22.0.jar
[ERROR] urls[10] = file:/C:/Users/tji/.m2/repository/org/aspectj/aspectjweaver/1.9.6/aspectjweaver-1.9.6.jar
[ERROR] urls[11] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/maven-surefire-common/3.0.0-M5/maven-surefire-common-3.0.0-M5.jar
[ERROR] urls[12] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/surefire-extensions-api/3.0.0-M5/surefire-extensions-api-3.0.0-M5.jar
[ERROR] urls[13] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/surefire-booter/3.0.0-M5/surefire-booter-3.0.0-M5.jar
[ERROR] urls[14] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/surefire-extensions-spi/3.0.0-M5/surefire-extensions-spi-3.0.0-M5.jar
[ERROR] urls[15] = file:/C:/Users/tji/.m2/repository/org/apache/maven/shared/maven-artifact-transfer/0.11.0/maven-artifact-transfer-0.11.0.jar
[ERROR] urls[16] = file:/C:/Users/tji/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/3.1.0/maven-common-artifact-filters-3.1.0.jar
[ERROR] urls[17] = file:/C:/Users/tji/.m2/repository/commons-codec/commons-codec/1.11/commons-codec-1.11.jar
[ERROR] urls[18] = file:/C:/Users/tji/.m2/repository/org/codehaus/plexus/plexus-java/1.0.5/plexus-java-1.0.5.jar
[ERROR] urls[19] = file:/C:/Users/tji/.m2/repository/org/ow2/asm/asm/7.2/asm-7.2.jar
[ERROR] urls[20] = file:/C:/Users/tji/.m2/repository/com/thoughtworks/qdox/qdox/2.0-M9/qdox-2.0-M9.jar
[ERROR] urls[21] = file:/C:/Users/tji/.m2/repository/org/apache/maven/surefire/surefire-shared-utils/3.0.0-M4/surefire-shared-utils-3.0.0-M4.jar
[ERROR] urls[22] = file:/C:/Users/tji/.m2/repository/org/codehaus/plexus/plexus-utils/1.1/plexus-utils-1.1.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[maven.api, parent: null]]
[ERROR]
[ERROR] -----------------------------------------------------
[ERROR] : org.apache.maven.surefire.api.testset.TestSetFailedException
What could be the issue here? And is there any example out there for integrating Allure reports with the Spock testing framework?
You have to add surefire-api as an explicit dependency of the surefire plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M5</version>
<dependencies>
<!-- https://mvnrepository.com/artifact/org.junit.platform/junit-platform-surefire-provider -->
<dependency>
<groupId>org.junit.platform</groupId>
<artifactId>junit-platform-surefire-provider</artifactId>
<version>1.3.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.maven.surefire/surefire-api -->
<dependency>
<groupId>org.apache.maven.surefire</groupId>
<artifactId>surefire-api</artifactId>
<version>3.0.0-M5</version>
</dependency>
...
</dependencies>
</plugin>
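For context, the realm dump above shows why the explicit dependency helps: junit-platform-surefire-provider 1.3.2 drags in surefire-api 2.22.0 (urls[7]), which predates the org.apache.maven.surefire.api.* packages that Surefire 3.0.0-M5 expects. Since that provider is deprecated and Surefire 3.0.0-M5 supports the JUnit Platform (and therefore Spock 2.x) natively, an alternative worth trying is to drop the provider entirely. A sketch, assuming nothing else in the build needs it:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>3.0.0-M5</version>
  <configuration>
    <!-- keep the existing argLine, includes and systemPropertyVariables here -->
  </configuration>
  <dependencies>
    <!-- no junit-platform-surefire-provider: Surefire 3.0.0-M5 discovers
         JUnit Platform engines on the test classpath by itself -->
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjweaver</artifactId>
      <version>1.9.6</version>
    </dependency>
  </dependencies>
</plugin>
If you go this route, also remove junit-platform-surefire-provider from the project-level <dependencies> section, where a Surefire provider does not belong in any case.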
I am trying to resolve a spark-submit classpath runtime issue for an Apache Tika (> v1.14) parsing job. The problem seems to involve the spark-submit classpath vs. my uber-jar.
Platforms: CDH 5.15 (Spark 2.3 added via the CDH docs) and CDH 6 (Spark 2.2 bundled with CDH 6)
I've tried / reviewed:
(Cloudera) Where does spark-submit look for Jar files?
(stackoverflow) resolving-dependency-problems-in-apache-spark
(stackoverflow) Apache Tika ArchiveStreamFactory.detect error
Highlights:
Java 8 / Scala 2.11
I'm building an uber-jar and calling that uber-jar via spark-submit
I've tried adding the --jars option to the spark-submit call (see further down in this post)
I've tried adding --conf spark.driver.userClassPathFirst=true and --conf spark.executor.userClassPathFirst=true to the spark-submit call (see further down in this post)
Results if I include the --conf flag(s) in the spark-submit call:
$ spark-submit --master local[*] --class com.example.App --conf spark.executor.userClassPathFirst=true ./target/uber-tikaTest-1.19.jar
18/09/25 13:35:55 ERROR util.Utils: Exception encountered
java.lang.NullPointerException
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:72)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:70)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:70)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1307)
at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2136)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:312)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/09/25 13:35:55 ERROR util.Utils: Exception encountered
java.lang.NullPointerException
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:72)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:70)
at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply(ParallelCollectionRDD.scala:70)
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1307)
at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2136)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2245)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2169)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2027)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:312)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Below the following error message are files for:
the build-and-run.sh script (calls spark-submit; notes about options included)
the sample app
the pom.xml
the mvn dependency:tree output (which shows the "missing" commons-compress library is included within the uber-jar)
The error at runtime:
18/09/25 11:47:39 ERROR executor.Executor: Exception in task 1.0 in stage 0.0 (TID 1)
java.lang.NoSuchMethodError: org.apache.commons.compress.archivers.ArchiveStreamFactory.detect(Ljava/io/InputStream;)Ljava/lang/String;
at org.apache.tika.parser.pkg.ZipContainerDetector.detectArchiveFormat(ZipContainerDetector.java:160)
at org.apache.tika.parser.pkg.ZipContainerDetector.detect(ZipContainerDetector.java:104)
at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:84)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:116)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:159)
at com.example.App$.tikaAutoDetectParser(App.scala:55)
at com.example.App$$anonfun$1.apply(App.scala:69)
at com.example.App$$anonfun$1.apply(App.scala:69)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1799)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1158)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1158)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2071)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2071)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
18/09/25 11:47:39 ERROR executor.Executor: Exception in task 5.0 in stage 0.0 (TID 5)
java.lang.NoSuchMethodError: org.apache.commons.compress.archivers.ArchiveStreamFactory.detect(Ljava/io/InputStream;)Ljava/lang/String;
at org.apache.tika.parser.pkg.ZipContainerDetector.detectArchiveFormat(ZipContainerDetector.java:160)
at org.apache.tika.parser.pkg.ZipContainerDetector.detect(ZipContainerDetector.java:104)
at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:84)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:116)
at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:159)
at com.example.App$.tikaAutoDetectParser(App.scala:55)
at com.example.App$$anonfun$1.apply(App.scala:69)
at com.example.App$$anonfun$1.apply(App.scala:69)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1799)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1158)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1158)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2071)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2071)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
build-and-run.sh:
Notes:
I've tried adding the --conf flags for userClassPathFirst in both the master and yarn configs below.
I've tried using the --jars flag to specify the uber-jar generated from mvn compile with the pom.xml (provided further down in the post).
mvn compile
if true
then
spark-submit --master local[*] --class com.example.App ./target/uber-tikaTest-1.19.jar
fi
# tried using the userClassPathFirst flags for driver and executor in both spark-submit calls above and below
# --conf spark.driver.userClassPathFirst=true \
# --conf spark.executor.userClassPathFirst=true \
if false
then
spark-submit --class com.example.App \
--master yarn \
--packages org.apache.commons:commons-compress:1.18 \
--jars ./target/uber-tikaTest-1.19.jar \
--num-executors 2 \
--executor-memory 1024m \
--executor-cores 2 \
--driver-memory 2048m \
--driver-cores 1 \
./target/uber-tikaTest-1.19.jar
fi
Sample App:
package com.example
////////// Tika Imports
import org.apache.tika.metadata.Metadata
import org.apache.tika.parser.AutoDetectParser
import org.apache.tika.sax.BodyContentHandler
////////// Java HTTP Imports
import java.net.URL;
import java.net.HttpURLConnection
import scala.collection.JavaConverters._
import scala.collection.mutable._
////////// Spark Imports
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.storage.StorageLevel
import org.apache.spark.sql.{Row,SparkSession}
object App {
case class InputStreamData(sourceURL: String, headerFields: Map[String,List[String]], inputStream: java.io.InputStream)
def openUrlStream(sourceURL:String,apiKey:String):(InputStreamData) = {
try {
val url = new URL(sourceURL)
val urlConnection = url.openConnection().asInstanceOf[HttpURLConnection]
urlConnection.setInstanceFollowRedirects(true)
val headerFields = urlConnection.getHeaderFields()
val input = urlConnection.getInputStream()
InputStreamData(sourceURL, headerFields.asScala.map(x => (x._1,x._2.asScala.toList)), input)
}
catch {
case e: Exception => {
println("**********************************************************************************************")
println("PARSEURL: INVALID URL: " + sourceURL)
println(e.toString())
println("**********************************************************************************************")
InputStreamData(sourceURL, Map("ERROR" -> List("ERROR")), null)
}
}
}
def tikaAutoDetectParser(inputStream:java.io.InputStream):String = {
var parser = new AutoDetectParser();
var handler = new BodyContentHandler(-1);
var metadata = new Metadata();
parser.parse(inputStream, handler, metadata);
return handler.toString()
}
def main(args : Array[String]) {
var sparkConf = new SparkConf().setAppName("tika-1.19-test")
val sc = new SparkContext(sparkConf)
val spark = SparkSession.builder.config(sparkConf).getOrCreate()
println("HELLO!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
var urls = List("http://www.pdf995.com/samples/pdf.pdf", "https://www.amd.com/en", "http://jeroen.github.io/images/testocr.png")
var rdd = sc.parallelize(urls)
var parsed = rdd.map(x => tikaAutoDetectParser(openUrlStream(x,"").inputStream))
println(parsed.count)
}
}
pom.xml (builds uber-jar):
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>tikaTest</artifactId>
<version>1.19</version>
<name>${project.artifactId}</name>
<description>Testing tika 1.19 with CDH 6 and 5.x, Spark 2.x, Scala 2.11.x</description>
<inceptionYear>2018</inceptionYear>
<licenses>
<license>
<name>My License</name>
<url>http://....</url>
<distribution>repo</distribution>
</license>
</licenses>
<repositories>
<repository>
<id>cloudera</id>
<url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
</repositories>
<profiles>
<profile>
<id>scala-2.11.12</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<scalaVersion>2.11.12</scalaVersion>
<scalaBinaryVersion>2.11.12</scalaBinaryVersion>
</properties>
<dependencies>
<!-- ************************************************************************** -->
<!-- GOOD DEPENDENCIES +++++++++++++++++++++++++++++++++++++ -->
<!-- ************************************************************************** -->
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-compress -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-compress</artifactId>
<version>1.18</version>
</dependency>
<!-- *************** CDH flavored dependencies ***********************************************-->
<!-- https://www.cloudera.com/documentation/spark2/latest/topics/spark2_packaging.html#versions -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.0.cloudera3</version>
<!-- have tried scope provided / compile -->
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.2.0.cloudera3</version>
<!-- have tried scope provided / compile -->
<!--<scope>provided</scope>-->
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.tika/tika-core -->
<dependency>
<groupId>org.apache.tika</groupId>
<artifactId>tika-core</artifactId>
<version>1.19</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.tika/tika-parsers -->
<dependency>
<groupId>org.apache.tika</groupId>
<artifactId>tika-parsers</artifactId>
<version>1.19</version>
</dependency>
<!-- https://mvnrepository.com/artifact/javax.ws.rs/javax.ws.rs-api -->
<dependency>
<groupId>javax.ws.rs</groupId>
<artifactId>javax.ws.rs-api</artifactId>
<version>2.1.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.12</version>
</dependency>
<!-- **************************************************************************************************************************
**************************** alternative dependencies that have been tried and yield same Tika error***************************
*******************************************************************************************************************************-->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
<!--
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.0</version>
</dependency>
-->
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
<!--
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.2.0</version>
</dependency>
-->
</dependencies>
</profile>
</profiles>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<args>
<!-- work-around for https://issues.scala-lang.org/browse/SI-8358 -->
<arg>-nobootcp</arg>
</args>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<finalName>uber-${project.artifactId}-${project.version}</finalName>
</configuration>
</plugin>
</plugins>
</build>
</project>
mvn dependency tree:
Notes:
$ mvn dependency:tree -Ddetail=true | grep compress
[INFO] +- org.apache.commons:commons-compress:jar:1.18:compile
[INFO] | +- com.ning:compress-lzf:jar:1.0.3:compile
$ mvn dependency:tree -Ddetail=true | grep commons
[INFO] +- org.apache.commons:commons-compress:jar:1.18:compile
[INFO] | | | \- commons-collections:commons-collections:jar:3.2.2:compile
[INFO] | | | +- commons-cli:commons-cli:jar:1.2:compile
[INFO] | | | +- commons-httpclient:commons-httpclient:jar:3.1:compile
[INFO] | | | +- commons-configuration:commons-configuration:jar:1.6:compile
[INFO] | | | | +- commons-digester:commons-digester:jar:1.8:compile
[INFO] | | | | | \- commons-beanutils:commons-beanutils:jar:1.7.0:compile
[INFO] | | | | \- commons-beanutils:commons-beanutils-core:jar:1.8.0:compile
[INFO] | +- org.apache.commons:commons-lang3:jar:3.5:compile
[INFO] | +- org.apache.commons:commons-math3:jar:3.4.1:compile
[INFO] | +- commons-net:commons-net:jar:2.2:compile
[INFO] | +- org.apache.commons:commons-crypto:jar:1.0.0:compile
[INFO] | | +- org.codehaus.janino:commons-compiler:jar:3.0.8:compile
[INFO] | | \- commons-lang:commons-lang:jar:2.6:compile
[INFO] | +- commons-codec:commons-codec:jar:1.11:compile
[INFO] | | \- org.apache.commons:commons-collections4:jar:4.2:compile
[INFO] | +- org.apache.commons:commons-exec:jar:1.3:compile
[INFO] | +- commons-io:commons-io:jar:2.6:compile
[INFO] | +- org.apache.commons:commons-csv:jar:1.5:compile
What is the source of exception?
This is the result of a dependency conflict.
Why can't I solve it through my POM.XML?
Because it is not an internal conflict within your jar file. It is a conflict with Apache Spark.
What is wrong exactly?
Spark 2.x distributions ship old versions of commons-compress, while the Tika library depends on version 1.18 of commons-compress.
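To confirm the clash, you can check which copy of commons-compress your Spark distribution ships; assuming a standard binary install with a jars directory, something like:
$ ls $SPARK_HOME/jars | grep commons-compress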
Solution
Use the --driver-class-path argument of spark-shell or spark-submit to point to the right version of the commons-compress library:
spark-submit \
  --driver-class-path ~/.m2/repository/org/apache/commons/commons-compress/1.18/commons-compress-1.18.jar \
  --class {your.main.class} \
  ....
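If changing the submit command is not an option, another approach (my own sketch, not part of the answer above) is to have the maven-shade-plugin already configured in the POM relocate commons-compress inside the uber jar, so your classes link against your own copy rather than Spark's:
<!-- goes inside the shade plugin's <configuration>, next to the existing <filters> -->
<relocations>
    <relocation>
        <pattern>org.apache.commons.compress</pattern>
        <shadedPattern>shaded.org.apache.commons.compress</shadedPattern>
    </relocation>
</relocations>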
I was facing the exact same problem and found the answer in this post.
I have been trying to read data from Couchbase, but the read fails due to an authentication issue.
import com.couchbase.client.java.document.JsonDocument
import org.apache.spark.sql.SparkSession
import com.couchbase.spark._
object SparkRead {
def main(args: Array[String]): Unit = {
// The SparkSession is the main entry point into spark
val spark = SparkSession
.builder()
.appName("KeyValueExample")
.master("local[*]") // use the JVM as the master, great for testing
.config("spark.couchbase.nodes", "***********") // connect to couchbase on hostname
.config("spark.couchbase.bucket.beer-sample","") // open the travel-sample bucket with empty password
.config("spark.couchbase.username", "couchdb")
.config("spark.couchbase.password", "******")
.config("spark.couchbase.connectTimeout","30000")
.config("spark.couchbase.kvTimeout","10000")
.config("spark.couchbase.socketConnect","10000")
.getOrCreate()
spark.sparkContext
.couchbaseGet[com.couchbase.client.java.document.JsonDocument](Seq("airline_10123")) // Load documents from couchbase
.collect() // collect all data from the spark workers
.foreach(println) // print each document content
}
}
Below is the build file (build.sbt):
name := "KafkaSparkCouchReadWrite"
organization := "my.clairvoyant"
version := "1.0.0-SNAPSHOT"
scalaVersion := "2.11.11"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.1.0",
"org.apache.spark" %% "spark-streaming" % "2.1.0",
"org.apache.spark" %% "spark-sql" % "2.1.0",
"org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.2.0",
"com.couchbase.client" %% "spark-connector" % "2.1.0",
"org.glassfish.hk2" % "hk2-utils" % "2.2.0-b27",
"org.glassfish.hk2" % "hk2-locator" % "2.2.0-b27",
"javax.validation" % "validation-api" % "1.1.0.Final",
"org.apache.kafka" %% "kafka" % "0.11.0.0",
"com.googlecode.json-simple" % "json-simple" % "1.1").map(_.excludeAll(ExclusionRule("org.glassfish.hk2"),ExclusionRule("javax.validation")))
ERROR LOG
17/12/12 15:18:35 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.33.220, 52402, None)
17/12/12 15:18:35 INFO SharedState: Warehouse path is 'file:/Users/sampat/Desktop/GitClairvoyant/cpdl3-poc/KafkaSparkCouchReadWrite/spark-warehouse/'.
17/12/12 15:18:35 INFO CouchbaseCore: CouchbaseEnvironment: {sslEnabled=false, sslKeystoreFile='null', sslKeystorePassword=false, sslKeystore=null, bootstrapHttpEnabled=true, bootstrapCarrierEnabled=true, bootstrapHttpDirectPort=8091, bootstrapHttpSslPort=18091, bootstrapCarrierDirectPort=11210, bootstrapCarrierSslPort=11207, ioPoolSize=8, computationPoolSize=8, responseBufferSize=16384, requestBufferSize=16384, kvServiceEndpoints=1, viewServiceEndpoints=12, queryServiceEndpoints=12, searchServiceEndpoints=12, ioPool=NioEventLoopGroup, kvIoPool=null, viewIoPool=null, searchIoPool=null, queryIoPool=null, coreScheduler=CoreScheduler, memcachedHashingStrategy=DefaultMemcachedHashingStrategy, eventBus=DefaultEventBus, packageNameAndVersion=couchbase-java-client/2.4.2 (git: 2.4.2, core: 1.4.2), dcpEnabled=false, retryStrategy=BestEffort, maxRequestLifetime=75000, retryDelay=ExponentialDelay{growBy 1.0 MICROSECONDS, powers of 2; lower=100, upper=100000}, reconnectDelay=ExponentialDelay{growBy 1.0 MILLISECONDS, powers of 2; lower=32, upper=4096}, observeIntervalDelay=ExponentialDelay{growBy 1.0 MICROSECONDS, powers of 2; lower=10, upper=100000}, keepAliveInterval=30000, autoreleaseAfter=2000, bufferPoolingEnabled=true, tcpNodelayEnabled=true, mutationTokensEnabled=false, socketConnectTimeout=1000, dcpConnectionBufferSize=20971520, dcpConnectionBufferAckThreshold=0.2, dcpConnectionName=dcp/core-io, callbacksOnIoPool=false, disconnectTimeout=25000, requestBufferWaitStrategy=com.couchbase.client.core.env.DefaultCoreEnvironment$2#7b7b3edb, queryTimeout=75000, viewTimeout=75000, kvTimeout=2500, connectTimeout=5000, dnsSrvEnabled=false}
17/12/12 15:18:37 WARN Endpoint: [null][KeyValueEndpoint]: Authentication Failure.
17/12/12 15:18:37 INFO Endpoint: [null][KeyValueEndpoint]: Got notified from Channel as inactive, attempting reconnect.
17/12/12 15:18:37 WARN ResponseStatusConverter: Unknown ResponseStatus with Protocol HTTP: 401
17/12/12 15:18:37 WARN ResponseStatusConverter: Unknown ResponseStatus with Protocol HTTP: 401
Exception in thread "main" com.couchbase.client.java.error.InvalidPasswordException: Passwords for bucket "beer-sample" do not match.
at com.couchbase.client.java.CouchbaseAsyncCluster$OpenBucketErrorHandler.call(CouchbaseAsyncCluster.java:601)
at com.couchbase.client.java.CouchbaseAsyncCluster$OpenBucketErrorHandler.call(CouchbaseAsyncCluster.java:584)
at rx.internal.operators.OperatorOnErrorResumeNextViaFunction$4.onError(OperatorOnErrorResumeNextViaFunction.java:140)
at rx.internal.operators.OnSubscribeMap$MapSubscriber.onError(OnSubscribeMap.java:88)
Sample code:
import com.couchbase.client.java.document.JsonDocument
import com.couchbase.spark._
import org.apache.spark.sql.SparkSession
object SparkReadCouchBase {
def main(args: Array[String]): Unit = {
val spark = SparkSession
.builder()
.appName("KeyValueExample")
.master("local[*]") // use the JVM as the master, great for testing
.config("spark.couchbase.nodes", "127.0.0.1") // connect to couchbase on hostname
.config("spark.couchbase.bucket.travel-sample","") // open the travel-sample bucket with empty password
.config("com.couchbase.username", "*******")
.config("com.couchbase.password", "*******")
.config("com.couchbase.kvTimeout","10000")
.config("com.couchbase.connectTimeout","30000")
.config("com.couchbase.socketConnect","10000")
.getOrCreate()
println("=====================================================================================")
spark.sparkContext
.couchbaseGet[JsonDocument](Seq("airline_10123")) // Load documents from couchbase
.collect() // collect all data from the spark workers
.foreach(println) // print each document content
println("=====================================================================================")
}
}
POM.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.*******.*****</groupId>
<artifactId>KafkaSparkCouch</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<properties>
<java.version>1.8</java.version>
<spark.version>2.2.0</spark.version>
<scala.version>2.11.8</scala.version>
<scala.parent.version>2.11</scala.parent.version>
<kafka.client.version>0.11.0.0</kafka.client.version>
<fat.jar.name>SparkCouch</fat.jar.name>
<scala.binary.version>2.11</scala.binary.version>
<main.class>com.*******.demo.spark.couchbase.SparkReadCouchBaseTest</main.class>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.parent.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_${scala.parent.version}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_${scala.parent.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.parent.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>com.couchbase.client</groupId>
<artifactId>spark-connector_${scala.parent.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>${kafka.client.version}</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
</plugin>
<!--Create fat-jar file-->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<version>2.15.2</version>
<executions>
<execution>
<id>compile</id>
<goals>
<goal>compile</goal>
</goals>
<phase>compile</phase>
</execution>
<execution>
<id>test-compile</id>
<goals>
<goal>testCompile</goal>
</goals>
<phase>test-compile</phase>
</execution>
<execution>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.1.6</version>
<configuration>
<scalaCompatVersion>${scala.binary.version}</scalaCompatVersion>
<scalaVersion>${scala.version}</scalaVersion>
</configuration>
<!-- other settings-->
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>mavencentral</id>
<url>http://repo1.maven.org/maven2/</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
<repository>
<id>scala</id>
<name>Scala Tools</name>
<url>http://scala-tools.org/repo-releases/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>scala</id>
<name>Scala Tools</name>
<url>http://scala-tools.org/repo-releases/</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</pluginRepository>
</pluginRepositories>
<name>KafkaSparkCouch</name>
</project>
You will need to set the following Couchbase configurations as system properties:
System.setProperty("com.couchbase.connectTimeout", "30000");
System.setProperty("com.couchbase.kvTimeout", "10000");
System.setProperty("com.couchbase.socketConnect", "10000");
I am trying out sample code to save data to HBase from a Spark DataFrame.
I am not sure where I went wrong, but the code is not working for me.
Below is the code I tried. I am able to get an RDD for the existing table, but I could not save the DataFrame back. I tried a couple of approaches, which I have mentioned in the comments.
Code:
import scala.reflect.runtime.universe
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.SaveMode
case class Person(id: String, name: String)
object PheonixTest extends App {
val conf = new SparkConf;
conf.setMaster("local");
conf.setAppName("test")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc);
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set(TableInputFormat.INPUT_TABLE, "table1")
hbaseConf.addResource(new Path("/Users/srini/softwares/hbase-1.1.2/conf/hbase-site.xml"));
import org.apache.phoenix.spark._;
val phDf = sqlContext.phoenixTableAsDataFrame("table1", Array("id", "name"), conf = hbaseConf)
println("===========>>>>>>>>>>>>>>>>>> " + phDf.show());
val rdd = sc.parallelize(Seq("sr,Srini","sr2,Srini2"))
import sqlContext.implicits._;
val df = rdd.map { x => {val array = x.split(","); Person(array(0), array(1))} }.toDF;
//df.write.format("org.apache.phoenix.spark").mode("overwrite") .option("table", "table1").option("zkUrl", "localhost:2181").save()
//df.rdd.saveToP
df.save("org.apache.phoenix.spark", SaveMode.Overwrite, Map("table" -> "table1", "zkUrl" -> "localhost:2181"))
sc.stop()
}
Pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.srini.plug</groupId>
<artifactId>data-ingestion</artifactId>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.4</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.5.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.dataformat</groupId>
<artifactId>jackson-dataformat-xml</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>com.splunk</groupId>
<artifactId>splunk</artifactId>
<version>1.5.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.5.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.5.2</version>
</dependency>
<dependency>
<groupId>org.scalaj</groupId>
<artifactId>scalaj-collection_2.10</artifactId>
<version>1.5</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>12.0</version>
</dependency>
<dependency>
<groupId>org.apache.hbase</groupId>
<artifactId>hbase-client</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.phoenix</groupId>
<artifactId>phoenix-spark</artifactId>
<version>4.6.0-HBase-1.1</version>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.10</artifactId>
<version>1.4.1</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>ext-release-local</id>
<url>http://splunk.artifactoryonline.com/splunk/ext-releases-local</url>
</repository>
</repositories>
<build>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<executions>
<execution>
<id>compile</id>
<goals>
<goal>compile</goal>
</goals>
<phase>compile</phase>
</execution>
<execution>
<id>test-compile</id>
<goals>
<goal>testCompile</goal>
</goals>
<phase>test-compile</phase>
</execution>
<execution>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.5</source>
<target>1.5</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.5.3</version>
<executions>
<execution>
<id>create-archive</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptorRefs>
<descriptorRef>
jar-with-dependencies
</descriptorRef>
</descriptorRefs>
<archive>
<manifest>
<mainClass>com.srini.ingest.SplunkSearch</mainClass>
</manifest>
</archive>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Error:
16/01/02 18:26:29 INFO ClientCnxn: Session establishment complete on server localhost/127.0.0.1:2181, sessionid = 0x152031ff8da001c, negotiated timeout = 90000
16/01/02 18:27:18 INFO RpcRetryingCaller: Call exception, tries=10, retries=35, started=48344 ms ago, cancelled=false, msg=
16/01/02 18:27:38 INFO RpcRetryingCaller: Call exception, tries=11, retries=35, started=68454 ms ago, cancelled=false, msg=
16/01/02 18:27:58 INFO RpcRetryingCaller: Call exception, tries=12, retries=35, started=88633 ms ago, cancelled=false, msg=
16/01/02 18:28:19 INFO RpcRetryingCaller: Call exception, tries=13, retries=35, started=108817 ms ago, cancelled=false, msg=
Two issues I notice:
ZooKeeper URL. If you are sure ZooKeeper is running locally, update your hosts file with an entry like the one below and pass the hostname to HBaseConfiguration:
ipaddress hostname
Phoenix by default upper-cases your table names and columns, so change the above code to:
val phDf = sqlContext.phoenixTableAsDataFrame("TABLE1", Array("ID", "NAME"), conf = hbaseConf)
I am learning how to write BDD test scripts in Java using Cucumber. However, I keep getting the above error and I am not sure why. I have the Cukes gherkin artifact as a dependency.
POM
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>Cucumber</groupId>
<artifactId>Cucumber</artifactId>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>gherkin</artifactId>
<version>1.2.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-junit</artifactId>
<version>1.2.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-jvm-deps</artifactId>
<version>1.0.5</version>
<scope>test</scope>
<exclusions>
<exclusion>
<groupId>com.thoughtworks.xstream</groupId>
<artifactId>xstream</artifactId>
</exclusion>
<exclusion>
<groupId>com.googlecode.java-diff-utils</groupId>
<artifactId>diffutils</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-core</artifactId>
<version>1.2.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-java</artifactId>
<version>1.2.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>1.2.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.picocontainer</groupId>
<artifactId>picocontainer</artifactId>
<version>2.15</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>codehaus</id>
<url>http://repository.codehaus.org</url>
</repository>
</repositories>
<profiles>
<profile>
<id>junit-4.12</id>
<properties>
<junit.version>4.12</junit.version>
</properties>
</profile>
</profiles>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.5</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-clean-plugin</artifactId>
<version>2.6.1</version>
<configuration>
<filesets>
<fileset>
<directory>.</directory>
<includes>
<include>**/*.ser</include>
</includes>
</fileset>
</filesets>
</configuration>
</plugin>
</plugins>
</build>
</project>
Feature
Feature: Letter
Scenario: Check Letter
Given I have the letter "A"
When Icheck the letter "A"
Then I should see an output
Steps
package cucumber.steps;
import cucumber.api.CucumberOptions;
import cucumber.api.java.en.*;
import cucumber.api.junit.Cucumber;
import org.junit.Assert;
import org.junit.runner.RunWith;
/**
* Created by Dustin on 8/31/2015.
*/
@RunWith(Cucumber.class)
@CucumberOptions(
plugin = {"json-pretty", "html:target/cucumber"},
features = "src/main/java/cucumber/steps/LetterStepDefs"
)
public class LetterStepDefs {
private String letter;
private String message;
@Given("^I have the letter \"([^\"]*)\"$")
public void I_have_the_letter(String letter) throws Throwable {
// Express the Regexp above with the code you wish you had
this.letter = letter;
}
@When("^Icheck the letter \"([^\"]*)\"$")
public void Icheck_the_letter(String letter) throws Throwable {
// Express the Regexp above with the code you wish you had
try
{
Assert.assertEquals(this.letter, letter);
}
catch (Exception exc)
{
message = exc.getMessage();
}
}
@Then("^I should see an output$")
public void I_should_see_an_output() throws Throwable {
// Express the Regexp above with the code you wish you had
System.out.println(message);
}
}
Output
Testing started at 4:41 PM ...
Connected to the target VM, address: '127.0.0.1:58473', transport: 'socket'
JUnit version 4.12
Exception in thread "main" java.lang.NoClassDefFoundError: gherkin/formatter/Formatter
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.junit.internal.Classes.getClass(Classes.java:16)
at org.junit.runner.JUnitCommandLineParseResult.parseParameters(JUnitCommandLineParseResult.java:100)
at org.junit.runner.JUnitCommandLineParseResult.parseArgs(JUnitCommandLineParseResult.java:50)
at org.junit.runner.JUnitCommandLineParseResult.parse(JUnitCommandLineParseResult.java:44)
at org.junit.runner.JUnitCore.runMain(JUnitCore.java:72)
at org.junit.runner.JUnitCore.main(JUnitCore.java:36)
Caused by: java.lang.ClassNotFoundException: gherkin.formatter.Formatter
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 20 more
Disconnected from the target VM, address: '127.0.0.1:58473', transport: 'socket'
Process finished with exit code 1
Any help is much appreciated!
I was working with Cucumber and some Selenium scripts today and came across a similar issue whenever I was using the gherkin3 jar file.
Once I switched back to using gherkin 2.12.2, the issue went away.
You can download the jar from the following location:
http://search.maven.org/#search%7Cga%7C1%7Cgherkin
It is certainly worth trying this and checking whether you get the same issue.
I would also try running your feature file without any step methods, to check whether it returns the method stubs you need to create, similar to what is detailed in the document here:
http://www.toolsqa.com/cucumber/first-cucumber-selenium-java-test/
You don't need the glue option detailed in the example, though, when you just want to run the feature file.
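If you manage gherkin through Maven rather than downloading the jar, the equivalent change is a sketch like this, pinning the version that worked above (verify it matches what your cucumber-jvm 1.2.4 artifacts expect):
<dependency>
    <groupId>info.cukes</groupId>
    <artifactId>gherkin</artifactId>
    <version>2.12.2</version>
    <scope>test</scope>
</dependency>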