Jira plugin with Apache POI

I'm having some problems including Apache POI in a simple Jira plugin. While trying to build a simple proof of concept that generates an Excel file (OOXML, actually), I'm getting some dependency / class cast exception issues. My plugin extends AbstractSearchRequestView, and the following code snippet tries to output an empty xlsx file.
public void writeSearchResults(SearchRequest sr, SearchRequestParams srp, Writer writer) throws SearchException
{
    XSSFWorkbook wb = new XSSFWorkbook();
    // WriterOutputStream is presumably Commons IO's Writer-to-OutputStream adapter
    WriterOutputStream out = new WriterOutputStream(writer);
    try {
        wb.write(out);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
Now I have my export option available in the Jira issue search screen, but when running it I get the following ClassCastException:
java.lang.ClassCastException: com.ctc.wstx.stax.WstxEventFactory cannot be cast to javax.xml.stream.XMLEventFactory
My POM file's POI dependencies look like this:
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi</artifactId>
  <version>3.15</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml</artifactId>
  <version>3.15</version>
</dependency>
I have tried to configure everything as in this question / answer, but I get the same issue.

Try the below set of dependencies.
<dependency>
  <groupId>org.apache.xmlbeans</groupId>
  <artifactId>xmlbeans</artifactId>
  <version>2.6.0</version>
  <exclusions>
    <exclusion>
      <groupId>stax</groupId>
      <artifactId>stax-api</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi</artifactId>
  <version>3.14</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml</artifactId>
  <version>3.14</version>
  <exclusions>
    <exclusion>
      <groupId>stax</groupId>
      <artifactId>stax-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>xml-apis</groupId>
      <artifactId>xml-apis</artifactId>
    </exclusion>
    <exclusion>
      <groupId>dom4j</groupId>
      <artifactId>dom4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>ooxml-schemas</artifactId>
  <version>1.3</version>
</dependency>
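If the exception persists after aligning the dependencies, it can help to confirm at runtime which StAX implementation the plugin's classloader actually resolves, since this ClassCastException means two copies of the javax.xml.stream API are visible. A minimal diagnostic sketch, assuming nothing beyond the standard JDK API; run it from inside the plugin rather than standalone, because the problem is specific to the plugin's classloader:

import javax.xml.stream.XMLEventFactory;

public class StaxCheck {
    public static void main(String[] args) {
        // Which implementation class backs the javax.xml.stream API here?
        XMLEventFactory factory = XMLEventFactory.newInstance();
        System.out.println(factory.getClass().getName());
        // And which jar it was loaded from (null for JDK-internal classes)
        System.out.println(factory.getClass().getProtectionDomain().getCodeSource());
    }
}

In the broken state the newInstance() call itself reproduces the cast failure, which at least confirms the factory is being loaded through the wrong classloader.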

Related

Spark job with Oozie: TFS FileSystem implementation error

I am new to Spark.
I need to run a Spark job within Oozie.
Individually I am able to run the Spark job, but with Oozie, after the job is launched I get the following error:
2017-01-12 13:51:57,696 INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
java.lang.UnsupportedOperationException: Not implemented by the TFS FileSystem implementation
at org.apache.hadoop.fs.FileSystem.getScheme(FileSystem.java:216)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2564)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.getFileSystem(MRAppMaster.java:497)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:281)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1499)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1496)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1429)
Spark version: spark-1.5.2-bin-hadoop2.6, Hadoop: hadoop-2.6.2, HBase: hbase-1.1.5, Oozie: oozie-4.2.0
A snapshot of my pom.xml:
<dependency>
  <groupId>org.apache.zookeeper</groupId>
  <artifactId>zookeeper</artifactId>
  <version>3.4.8</version>
  <type>pom</type>
</dependency>
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-common</artifactId>
  <version>1.1.5</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.1.5</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-server</artifactId>
  <version>1.1.5</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-testing-util</artifactId>
  <version>1.1.5</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>1.5.2</version>
  <exclusions>
    <exclusion>
      <artifactId>javax.servlet</artifactId>
      <groupId>org.eclipse.jetty.orbit</groupId>
    </exclusion>
  </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-yarn_2.10 -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-yarn_2.11</artifactId>
  <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.mongodb.mongo-hadoop/mongo-hadoop-core -->
<dependency>
  <groupId>org.mongodb.mongo-hadoop</groupId>
  <artifactId>mongo-hadoop-core</artifactId>
  <version>1.5.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.6.2</version>
  <exclusions>
    <exclusion>
      <artifactId>servlet-api</artifactId>
      <groupId>javax.servlet</groupId>
    </exclusion>
    <exclusion>
      <artifactId>jetty-util</artifactId>
      <groupId>org.mortbay.jetty</groupId>
    </exclusion>
    <exclusion>
      <artifactId>jsp-api</artifactId>
      <groupId>javax.servlet.jsp</groupId>
    </exclusion>
  </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-client -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.6.2</version>
  <exclusions>
    <exclusion>
      <artifactId>jetty-util</artifactId>
      <groupId>org.mortbay.jetty</groupId>
    </exclusion>
  </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-mapreduce-client-core -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-mapreduce-client-core</artifactId>
  <version>2.6.2</version>
</dependency>
<dependency>
  <groupId>org.mongodb</groupId>
  <artifactId>mongo-java-driver</artifactId>
  <version>3.2.1</version>
</dependency>
<!-- hadoop dependency -->
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-core -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.2.1</version>
  <exclusions>
    <exclusion>
      <artifactId>jetty-util</artifactId>
      <groupId>org.mortbay.jetty</groupId>
    </exclusion>
  </exclusions>
</dependency>
So far I have searched several blogs. What I understand from reading them is that there is some issue with the Tachyon jar embedded in spark-assembly-1.5.2-hadoop2.6.0.jar.
I tried removing tachyon-0.5.0.jar and tachyon-client-0.5.0.jar from the Oozie shared library (they were present under the Spark library), but then I started getting this error:
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], main() threw exception, org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I
java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.DEFAULT_DRIVER_MEM_MB()I
Please help me debug and solve it.
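The reasoning above matches the usual diagnosis for this error: the old Tachyon client registers itself in META-INF/services/org.apache.hadoop.fs.FileSystem, and Hadoop's ServiceLoader calls getScheme() on every registered implementation, which the Tachyon TFS class does not implement. A hedged diagnostic sketch to confirm which jars on the classpath register a Hadoop FileSystem (standard JDK API only):

import java.net.URL;
import java.util.Enumeration;

public class FsServiceScan {
    public static void main(String[] args) throws Exception {
        // Lists every classpath entry that registers a Hadoop FileSystem;
        // a tachyon jar appearing here is the one the ServiceLoader trips over.
        Enumeration<URL> urls = Thread.currentThread().getContextClassLoader()
                .getResources("META-INF/services/org.apache.hadoop.fs.FileSystem");
        while (urls.hasMoreElements()) {
            System.out.println(urls.nextElement());
        }
    }
}

The second NoSuchMethodError after deleting the Tachyon jars suggests the Oozie Spark sharelib and the installed Spark 1.5.2 are out of sync, so aligning the sharelib with the cluster's Spark version is worth checking as well.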

Can't build ParagraphVectors in Linux

I'm using the Doc2Vec algorithm with Deeplearning4j and it works fine when I run it on my Windows 10 PC. However, when I try to run it on a Linux box, I get the following error:
java.lang.NoClassDefFoundError: Could not initialize class org.nd4j.linalg.factory.Nd4j
at org.deeplearning4j.models.embeddings.inmemory.InMemoryLookupTable$Builder.<init>(InMemoryLookupTable.java:581) ~[run.jar:?]
at org.deeplearning4j.models.sequencevectors.SequenceVectors$Builder.presetTables(SequenceVectors.java:801) ~[run.jar:?]
at org.deeplearning4j.models.paragraphvectors.ParagraphVectors$Builder.build(ParagraphVectors.java:663) ~[run.jar:?]
I've tried this on a couple of Linux machines, both of which were running Xubuntu and on which I had sudo permissions.
Here is the code for creating my ParagraphVectors:
InputStream is = new ByteArrayInputStream(baos.toByteArray());
LabelAwareSentenceIterator iter = new LabelAwareListSentenceIterator(is, DELIM);
iter.setPreProcessor(new SentencePreProcessor() {
    @Override
    public String preProcess(String sentence) {
        return new InputHomogenization(sentence).transform();
    }
});
TokenizerFactory tokenizerFactory = new DefaultTokenizerFactory();
vec = new ParagraphVectors.Builder().minWordFrequency(minWordFrequency).batchSize(batchSize)
        .iterations(iterations).layerSize(layerSize).stopWords(stopWords).windowSize(windowSize)
        .learningRate(learningRate).tokenizerFactory(tokenizerFactory).iterate(iter).build();
vec.fit();
And here is my pom.xml (versions are all 0.7.1, but I had been using 0.4-rc3.9 and got the same error):
<dependency>
  <groupId>org.deeplearning4j</groupId>
  <artifactId>deeplearning4j-ui-model</artifactId>
  <version>${dl4j.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.deeplearning4j</groupId>
  <artifactId>deeplearning4j-nlp</artifactId>
  <version>${dl4j.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.nd4j</groupId>
  <artifactId>nd4j-native</artifactId>
  <version>${nd4j.version}</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.datavec/datavec-api -->
<dependency>
  <groupId>org.datavec</groupId>
  <artifactId>datavec-api</artifactId>
  <version>${nd4j.version}</version>
</dependency>
First of all, always stick to the latest version. Could you post the full stack trace? This is definitely not the root cause. Maybe try using nd4j-native-platform instead? Usually this is a problem with missing native artifacts.
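For context: nd4j-native-platform (same groupId, org.nd4j) bundles the native binaries for all common platforms, whereas nd4j-native tends to resolve only the binaries matching the machine where the jar was built, which would fit the Windows-works/Linux-fails pattern above. A small sketch to surface the real initialization failure that "Could not initialize class" masks, assuming the nd4j artifacts listed in the question:

import org.nd4j.linalg.factory.Nd4j;

public class Nd4jCheck {
    public static void main(String[] args) {
        // Forces backend initialization up front; a missing native artifact
        // fails here with its underlying cause instead of surfacing later
        // as a bare NoClassDefFoundError inside ParagraphVectors.
        Nd4j.create(1);
        System.out.println("Backend: " + Nd4j.getBackend().getClass().getName());
    }
}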

Apache POI usage with Apache Felix

I'm trying to import Apache POI into an Atlassian Jira plugin for reading Excel files.
At the beginning, I started by adding just:
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi</artifactId>
  <version>${poi.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml-schemas</artifactId>
  <version>${poi.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml</artifactId>
  <version>${poi.version}</version>
</dependency>
poi.version is 3.14.
Then when I started the plugin, it gave:
Unresolved constraint in bundle com.tezExtensions [165]: Unable to resolve 165.0: missing requirement [165.0] osgi.wiring.package; (osgi.wiring.package=com.sun.javadoc)
Then I edited the POM with some instructions I found in another Stack Overflow question, Pax Exam issue with Apache POI wrapped bundle:
<instructions>
  <Atlassian-Plugin-Key>${atlassian.plugin.key}</Atlassian-Plugin-Key>
  <!-- Add package to export here -->
  <Export-Package>com.sony.poc.api,</Export-Package>
  <_exportcontents>
    org.apache.poi.*;version=${poi.version},
    org.openxmlformats.schemas.*;version=${poi.schema.version},
    schemasMicrosoftComOfficeExcel.*;version=${poi.schema.version},
    schemasMicrosoftComOfficeOffice.*;version=${poi.schema.version},
    schemasMicrosoftComOfficePowerpoint.*;version=${poi.schema.version},
    schemasMicrosoftComVml.*;version=${poi.schema.version},
    org.etsi.uri.*;version=${poi.security.version}
  </_exportcontents>
  <!-- Add package import here -->
  <Import-Package>
    com.sun.javadoc;resolution:=optional,
    com.sun.tools.javadoc;resolution:=optional,
    org.apache.crimson.jaxp;resolution:=optional,
    org.apache.tools.ant;resolution:=optional,
    org.apache.tools.ant.taskdefs;resolution:=optional,
    org.apache.tools.ant.types;resolution:=optional,
    junit.framework.*;resolution:=optional,
    junit.textui.*;resolution:=optional,
    org.junit.*;resolution:=optional,
    org.apache.xml.security.*;resolution:=optional,
    org.apache.jcp.xml.dsig.internal.dom.*;resolution:=optional,
    org.springframework.osgi.*;resolution:="optional",
    org.eclipse.gemini.blueprint.*;resolution:="optional",
    *
  </Import-Package>
  <DynamicImport-Package>
    org.apache.xmlbeans.*,
    schemaorg_apache_xmlbeans.*
  </DynamicImport-Package>
</instructions>
With this configuration, I get this error:
Unable to resolve 165.0: missing requirement [165.0] osgi.wiring.package; (osgi.wiring.package=org.apache.xml.resolver)
Does anyone have an idea?
Finally, I found a solution.
I added these as dependencies:
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi</artifactId>
  <version>${poi.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-scratchpad</artifactId>
  <version>${poi.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml</artifactId>
  <version>${poi.version}</version>
  <exclusions>
    <exclusion>
      <groupId>stax</groupId>
      <artifactId>stax-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>xml-apis</groupId>
      <artifactId>xml-apis</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.codehaus.woodstox</groupId>
  <artifactId>woodstox-core-asl</artifactId>
  <version>4.4.1</version>
</dependency>
And I also added these to the inside of the Import-Package tag:
*;resolution:=optional, com.ctc.wstx.stax.*
That's all.
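For anyone hitting the same wall, a quick way to confirm the wiring from inside the running plugin is to resolve a Woodstox class reflectively; com.ctc.wstx.stax.WstxInputFactory is a real Woodstox entry point, but the probe itself is only an illustrative sketch, not part of the original fix:

public class WoodstoxProbe {
    public static void check() {
        try {
            // Resolves only if the Import-Package wiring above actually
            // made the Woodstox packages visible to this bundle.
            Class<?> c = Class.forName("com.ctc.wstx.stax.WstxInputFactory");
            System.out.println("Woodstox loaded from: "
                    + c.getProtectionDomain().getCodeSource());
        } catch (ClassNotFoundException e) {
            System.out.println("Woodstox not visible to this bundle: " + e);
        }
    }
}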

Error while implementing XSSF example

I want to read large Excel files (.xlsx) with Java. Apache has an example of how to do this here.
So I just copied the whole class, added the path to my .xlsx file, and tried to execute it, but I get this error:
Error:(96, 69) java: cannot access org.openxmlformats.schemas.spreadsheetml.x2006.main.CTRst
class file for org.openxmlformats.schemas.spreadsheetml.x2006.main.CTRst not found
This error is triggered by this line in the method endElement():
lastContents = new XSSFRichTextString(sst.getEntryAt(idx)).toString();
I use this dependency for Maven:
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml</artifactId>
  <version>3.9</version>
</dependency>
How do I fix this error?
With help from Gagravarr I was able to fix this issue. I just needed to add the following dependencies to Maven:
<dependency>
  <groupId>commons-codec</groupId>
  <artifactId>commons-codec</artifactId>
  <version>1.9</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi</artifactId>
  <version>3.9</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml</artifactId>
  <version>3.9</version>
</dependency>
<dependency>
  <groupId>org.apache.poi</groupId>
  <artifactId>poi-ooxml-schemas</artifactId>
  <version>3.9</version>
</dependency>
<dependency>
  <groupId>stax</groupId>
  <artifactId>stax-api</artifactId>
  <version>1.0.1</version>
</dependency>
<dependency>
  <groupId>org.apache.xmlbeans</groupId>
  <artifactId>xmlbeans</artifactId>
  <version>2.6.0</version>
</dependency>
<dependency>
  <groupId>dom4j</groupId>
  <artifactId>dom4j</artifactId>
  <version>1.6.1</version>
</dependency>
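Before rerunning the example, a one-line check can confirm the schemas jar actually made it onto the runtime classpath; a minimal sketch using only the class name from the compiler error:

public class SchemaCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        // Throws ClassNotFoundException if poi-ooxml-schemas is still missing
        Class<?> c = Class.forName(
                "org.openxmlformats.schemas.spreadsheetml.x2006.main.CTRst");
        System.out.println(c.getProtectionDomain().getCodeSource().getLocation());
    }
}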

Using apache-curator to store Kafka offsets in Spark Streaming on YARN: User class threw exception: java.lang.NoSuchMethodError

User class threw exception: java.lang.NoSuchMethodError: org.apache.curator.framework.api.CreateBuilder.creatingParentContainersIfNeeded()Lorg/apache/curator/framework/api/ProtectACLCreateModeStatPathAndBytesable;
When run in local mode, no exception is thrown.
apache-spark: 2.2.0
apache-curator: 4.0.1, zookeeper-client: 3.4.5, zookeeper-server: 3.4.5
Code that throws the exception:
curator.create().creatingParentsIfNeeded().withMode(CreateMode.PERSISTENT).forPath(path, data)
Maven POM:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.11</artifactId>
  <exclusions>
    <exclusion>
      <groupId>org.apache.curator</groupId>
      <artifactId>curator-framework</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.curator</groupId>
      <artifactId>curator-client</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.11</artifactId>
  <exclusions>
    <exclusion>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
</dependency>
<dependency>
  <groupId>org.apache.curator</groupId>
  <artifactId>curator-framework</artifactId>
  <version>4.0.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.zookeeper</groupId>
      <artifactId>zookeeper</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.zookeeper</groupId>
  <artifactId>zookeeper</artifactId>
  <version>3.4.5</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>
I tried to exclude the Curator dependency in spark-core, but I still get the same exception.
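When a NoSuchMethodError shows up only on YARN, the usual explanation is that the cluster classpath supplies an older Curator (Hadoop and Spark distributions commonly ship a 2.x version) ahead of the application's 4.0.1, so excluding it from the application's own dependencies changes nothing. A hedged diagnostic to run inside the driver, printing which jar actually provides the class named in the error:

public class CuratorProbe {
    public static void main(String[] args) throws ClassNotFoundException {
        Class<?> cb = Class.forName("org.apache.curator.framework.api.CreateBuilder");
        // Shows whether the cluster's jar or the application's won the classpath race
        System.out.println(cb.getProtectionDomain().getCodeSource().getLocation());
    }
}

If it resolves to a cluster-provided 2.x jar, relocating (shading) Curator inside the application jar or setting spark.driver.userClassPathFirst=true are the usual remedies.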
