I am currently working with the Spring Framework. I was asked to produce a report as PDF output that retrieves data from a SQL database, but the project I'm working on uses Apache POI, where the output is Excel XLSX.
Is there a way to convert Excel XLSX format to PDF?
I already have Apache POI and recently added iTextPDF.
Apache POI
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.15</version>
</dependency>
iTextPDF
<properties>
<itext.version>7.2.3</itext.version>
</properties>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>itext7-core</artifactId>
<version>${itext.version}</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>kernel</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>io</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>layout</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>forms</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>pdfa</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>sign</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>barcodes</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>font-asian</artifactId>
<version>${itext.version}</version>
</dependency>
<dependency>
<groupId>com.itextpdf</groupId>
<artifactId>hyph</artifactId>
<version>${itext.version}</version>
</dependency>
EDIT: This is the code that generates the XLSX file
@RequestMapping(value = "/download", method = RequestMethod.POST)
@PreAuthorize("hasAuthority('CTRL_REPORT_READ')")
@ResponseBody
public void download(@Valid @ModelAttribute ReportForm reportForm, BindingResult result, RedirectAttributes redirectAttrs, HttpServletRequest request, HttpServletResponse response) throws IOException {
logger.debug("IN: reimbursement/download report-POST");
XSSFWorkbook workbook = null;
try {
workbook = new XSSFWorkbook();
downloadExcel(reportForm, workbook);
response.setContentType("application/vnd.ms-excel");
SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMdd_HHmmss");
String fileName = "Reimbursement Report " + sdf.format(new Date()) + ".xlsx";
response.setHeader("Content-disposition", "attachment; filename=" + fileName);
workbook.write(response.getOutputStream());
} catch (Exception e) {
logger.error("Error occurs when add new data in method reimbursement with error message : " + e.getMessage());
redirectAttrs.addFlashAttribute(AppDataConstant.ERROR_FLASH_RESP, "Download reimbursement report failed");
} finally {
if (workbook != null) {
workbook.close();
}
}
}
Thank you very much in advance.
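Apache POI itself does not render spreadsheets to PDF, but since iText 7 is already on the classpath, one workable approach is to read the cell values back with POI and lay them out as an iText table. Below is a minimal sketch of that idea, assuming the first sheet is a plain rectangular grid and ignoring styling, merged regions and images; the class name WorkbookPdfWriter is illustrative, not part of the project above.

import java.io.IOException;
import java.io.OutputStream;

import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;

import com.itextpdf.kernel.pdf.PdfDocument;
import com.itextpdf.kernel.pdf.PdfWriter;
import com.itextpdf.layout.Document;
import com.itextpdf.layout.element.Cell;
import com.itextpdf.layout.element.Paragraph;
import com.itextpdf.layout.element.Table;

public class WorkbookPdfWriter {

    // Renders the first sheet of the given workbook as a simple PDF table.
    public static void write(Workbook workbook, OutputStream out) throws IOException {
        Sheet sheet = workbook.getSheetAt(0);
        // DataFormatter returns cell values as Excel would display them (dates, numbers, formulas).
        DataFormatter formatter = new DataFormatter();
        int columnCount = sheet.getRow(sheet.getFirstRowNum()).getLastCellNum();

        PdfDocument pdf = new PdfDocument(new PdfWriter(out));
        Document document = new Document(pdf);
        Table table = new Table(columnCount);
        for (Row row : sheet) {
            for (int i = 0; i < columnCount; i++) {
                org.apache.poi.ss.usermodel.Cell xlsxCell = row.getCell(i);
                String text = xlsxCell == null ? "" : formatter.formatCellValue(xlsxCell);
                table.addCell(new Cell().add(new Paragraph(text)));
            }
        }
        document.add(table);
        document.close(); // also closes the underlying PdfDocument
    }
}

In the controller above you would then set the content type to application/pdf, give the file a .pdf extension, and call WorkbookPdfWriter.write(workbook, response.getOutputStream()) instead of workbook.write(...).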
Related
I have an issue with Azure block blob storage. If I access the storage and create a container from a jar, it works fine, but if I run it via a spark-submit command, it doesn't work. I'm trying to capture the traffic between my code and Azure to see where it goes wrong, but Fiddler doesn't capture that kind of traffic, although it can capture traffic to other sites such as www.google.com.
This works:
import java.net.*;
import java.io.*;
public class Example
{
public static void main(String[] args) throws Exception
{
System.setProperty("proxySet", "true");
System.setProperty("proxyHost", "127.0.0.1");
System.setProperty("proxyPort", "9090");
System.setProperty("javax.net.ssl.trustStore", "C:\\data\\keys\\FiddlerKeystore");
System.setProperty("javax.net.ssl.trustStorePassword", "password");
URL x = new URL("https://www.google.com");
HttpURLConnection hc = (HttpURLConnection)x.openConnection();
hc.setRequestProperty("User-Agent","Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2");
InputStream is = hc.getInputStream();
int u = 0;
byte[] kj = new byte[1024];
while((u = is.read(kj)) != -1)
{
System.out.write(kj,0,u);
}
is.close();
}
}
Now, if I do the same with the Azure code, Fiddler doesn't capture anything.
Here is my Azure code:
import azure.AzureBlockBlobClient;
import common.AzureConf;
import org.apache.log4j.BasicConfigurator;
import java.io.IOException;
public class AzureExample {
private AzureBlockBlobClient azureBlockBlobClient;
private static final org.slf4j.Logger log = org.slf4j.LoggerFactory.getLogger(AzureExample.class);
public AzureExample() {
azureBlockBlobClient = new AzureBlockBlobClient(AzureConf.ACCOUNT_NAME,AzureConf.ACCOUNT_KEY, AzureConf.CONTAINER_NAME);
azureBlockBlobClient.createContainer();
}
public static void main(String... args) throws IOException {
BasicConfigurator.configure();
System.setProperty("proxySet", "true");
System.setProperty("proxyHost", "127.0.0.1");
System.setProperty("proxyPort", "9090");
System.setProperty("javax.net.ssl.trustStore", "C:\\data\\keys\\FiddlerKeystore");
System.setProperty("javax.net.ssl.trustStorePassword", "password");
new AzureExample();
System.exit(0);
}
}
Here is the client that connects to Azure:
public AzureBlockBlobClient(String accountName, String accountKey, String containerName) {
this.accountName = accountName;
this.accountKey = accountKey;
this.containerName = containerName;
init();
}
private void init() {
log.info("Init AzureBlockBlobClient started...");
try {
SharedKeyCredentials creds = new SharedKeyCredentials(accountName, accountKey);
serviceURL = new ServiceURL(new URL("https://" + accountName + ".blob.core.windows.net/"),
StorageURL.createPipeline(creds, new PipelineOptions()));
containerURL = serviceURL.createContainerURL(containerName);
}catch (InvalidKeyException e){
log.error("Authentication error while trying to access storage account", e);
}catch (MalformedURLException e) {
log.error("Invalid Service URL", e);
e.printStackTrace();
}catch (Exception e) {
e.printStackTrace();
log.error("Error initializing AzureBlockBlobClient", e);
}
log.info("Init AzureBlockBlobClient Done!");
}
public void createContainer(){
try {
// Let's create a container using a blocking call to Azure Storage
// If container exists, we'll catch and continue
log.info("Creating container {}." , containerName);
ContainerCreateResponse response = containerURL.create(null, null, null).blockingGet();
log.info("Container Create Response was {}." , response.statusCode());
}
catch (RestException e){
if (e instanceof RestException && e.response().statusCode() != 409) {
log.error("Error Creating container", e);
} else {
log.info("Container {} already exists, resuming...", containerName);
}
}
}
And this is where my constants are:
public interface AzureConf {
String ACCOUNT_KEY ="<SomeAccountKey>";
String ACCOUNT_NAME = "storage";
String CONTAINER_NAME = "My-container";
}
This is my maven pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>examples</groupId>
<artifactId>spark-azure-storage</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<junit.version>4.12</junit.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-azure</artifactId>
<version>2.7.1</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-storage</artifactId>
<version>2.0.0</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-storage-blob</artifactId>
<version>10.1.0</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>2.2.3</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.16</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.typesafe.akka/akka-actor -->
<dependency>
<groupId>com.microsoft.rest.v2</groupId>
<artifactId>client-runtime</artifactId>
<version>2.0.0</version>
<!-- I have to exclude the following dependencies and include version 2.9.7 of them,
otherwise I get a NoSuchMethodError -->
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.16</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.9.7</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.9.7</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.9.7</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"></transformer>
</transformers>
</configuration>
</plugin>
</plugins>
</build>
Any help getting this to work would be appreciated.
Thank you in advance.
According to the Oracle official document Java Networking and Proxies, proxySet, proxyHost and proxyPort are not recognized Java system properties.
Please use https.proxyHost and https.proxyPort instead; that works for me.
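Applied to the AzureExample main method above, the change would look roughly like this (same values, only the standard property names):

public static void main(String... args) throws IOException {
    BasicConfigurator.configure();
    // Standard Java HTTPS proxy properties; proxySet/proxyHost/proxyPort are not read by the JVM.
    System.setProperty("https.proxyHost", "127.0.0.1");
    System.setProperty("https.proxyPort", "9090");
    // Fiddler's root certificate still has to be trusted for HTTPS interception.
    System.setProperty("javax.net.ssl.trustStore", "C:\\data\\keys\\FiddlerKeystore");
    System.setProperty("javax.net.ssl.trustStorePassword", "password");
    new AzureExample();
    System.exit(0);
}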
I am having the following issue while working with Cucumber and the Cucumber extended reports. I am using a method that takes a screenshot when a step in a scenario fails and then embeds that screenshot in the Cucumber detailed report. The feature works fine, except that it generates multiple screenshots when I only want one screenshot of the page where the failure occurred. If anyone knows how to solve this issue, please shed some light; I would really appreciate it.
Thanks in advance
This is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.moneris.automation</groupId>
<artifactId>md01_automation_demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>merchant_direct</name>
<profiles>
<profile>
<id>md_01_automation</id>
</profile>
</profiles>
<dependencies>
<!-- This is what I added for the extended report BEGIN -->
<dependency>
<groupId>com.github.mkolisnyk</groupId>
<artifactId>cucumber-runner</artifactId>
<version>1.3.3</version>
</dependency>
<dependency>
<groupId>com.github.mkolisnyk</groupId>
<artifactId>cucumber-reports</artifactId>
<version>1.3.3</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>3.8</version>
</dependency>
<!-- https://mvnrepository.com/artifact/info.cukes/cucumber-testng -->
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-testng</artifactId>
<version>1.2.5</version>
</dependency>
<dependency>
<groupId>com.vimalselvam</groupId>
<artifactId>cucumber-extentsreport</artifactId>
<version>3.0.2</version>
</dependency>
<dependency>
<groupId>com.relevantcodes</groupId>
<artifactId>extentreports</artifactId>
<version>2.41.2</version>
</dependency>
<dependency>
<groupId>com.aventstack</groupId>
<artifactId>extentreports</artifactId>
<version>3.1.5</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-java</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-junit</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.sun</groupId>
<artifactId>tools</artifactId>
<version>1.7.0.13</version>
<scope>system</scope>
<systemPath>${env.JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>gherkin</artifactId>
<version>2.12.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.freemarker</groupId>
<artifactId>freemarker</artifactId>
<version>2.3.26-incubating</version>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.14.3</version>
<scope>test</scope>
</dependency>
<!-- This is what I added for the extended report END -->
<!-- <dependency> <groupId>info.cukes</groupId> <artifactId>cucumber-java</artifactId>
<version>1.2.4</version> <scope>test</scope> </dependency> -->
<!-- <dependency> <groupId>info.cukes</groupId> <artifactId>cucumber-junit</artifactId>
<version>1.2.4</version> <scope>test</scope> </dependency> -->
<!-- https://mvnrepository.com/artifact/info.cukes/cucumber-picocontainer -->
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.12.0</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.17</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-io -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-io</artifactId>
<version>1.3.2</version>
</dependency>
<!-- It needs this dependency to run TestNG from command line -->
<dependency>
<groupId>com.beust</groupId>
<artifactId>jcommander</artifactId>
<version>1.72</version>
</dependency>
<!-- To take the screenshots of the entire page -->
<dependency>
<groupId>ru.yandex.qatools.ashot</groupId>
<artifactId>ashot</artifactId>
<version>1.5.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.assertthat/selenium-shutterbug -->
<dependency>
<groupId>com.assertthat</groupId>
<artifactId>selenium-shutterbug</artifactId>
<version>0.7</version>
</dependency>
<!-- Library to compare two text files -->
<dependency>
<groupId>com.github.wumpz</groupId>
<artifactId>diffutils</artifactId>
<version>2.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.hp.gagawa/gagawa -->
<dependency>
<groupId>com.hp.gagawa</groupId>
<artifactId>gagawa</artifactId>
<version>1.0.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.3</version>
<configuration>
<descriptor>src/test/resources/assembly.xml</descriptor>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<archive>
<manifest>
<mainClass>com.moneris.automation.main.Execute</mainClass>
</manifest>
</archive>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<fork>true</fork>
<executable>C:\Program Files\Java\jdk1.8.0_171\bin\javac.exe</executable>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<testFailureIgnore>true</testFailureIgnore>
<includes>
<include>**/*TestRunner.java</include>
</includes>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-report-plugin</artifactId>
<version>2.20</version>
</plugin>
<plugin>
<groupId>net.masterthought</groupId>
<artifactId>maven-cucumber-reporting</artifactId>
<version>2.0.0</version>
<executions>
<execution>
<id>execution</id>
<phase>test</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<projectName>Merchant Direct</projectName>
<outputDirectory>${project.build.directory}/site/cucumber-reports</outputDirectory>
<cucumberOutput>${project.build.directory}/Extended-Report/cucumber.json</cucumberOutput>
<skippedFails>true</skippedFails>
<enableFlashCharts>true</enableFlashCharts>
<buildNumber>1</buildNumber>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
This is my Runner class:
package com.moneris.automation.runner;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import org.apache.commons.lang.StringUtils;
import org.testng.annotations.AfterSuite;
import com.github.mkolisnyk.cucumber.reporting.CucumberDetailedResults;
import com.github.mkolisnyk.cucumber.reporting.CucumberResultsOverview;
import com.github.mkolisnyk.cucumber.runner.BeforeSuite;
import com.github.mkolisnyk.cucumber.runner.ExtendedCucumberOptions;
import com.github.mkolisnyk.cucumber.runner.ExtendedTestNGRunner;
import com.moneris.automation.counter.HelperClass;
import com.moneris.automation.stepsDefinitions.Drivers;
import com.moneris.automation.utilities.ConfigUtility;
import cucumber.api.CucumberOptions;
@ExtendedCucumberOptions(jsonReport = "target/Extended-Report/cucumber.json",
retryCount = 0,
detailedReport = true,
detailedAggregatedReport = true,
overviewReport = true,
coverageReport = true,
featureOverviewChart = true,
overviewChartsReport = true,
jsonUsageReport = "target/Extended-Report/cucumber-usage.json",
usageReport = true,
systemInfoReport = true,
toPDF = true,
screenShotLocation="./../",
screenShotSize = "85%",
outputFolder = "target/Extended-Report")
@CucumberOptions(
strict=true,
features = "resources/features",
glue = {"com.moneris.automation.stepsDefinitions"},
plugin = {
"pretty:STDOUT","html:target/reports/cucumber-pretty",
"json:target/Extended-Report/cucumber.json",
"usage:target/Extended-Report/cucumber-usage.json"
}
,
monochrome = true
)
public class TestRunner extends ExtendedTestNGRunner {
@org.testng.annotations.AfterClass
public static void setup() throws IOException {
Drivers.getDriver().quit();
if (Drivers.browser.equals("chrome"))
Runtime.getRuntime().exec("taskkill /F /IM ChromeDriver.exe");
}
@BeforeSuite
public void generateDirectories() {
ConfigUtility.prop();
String resultPath = ConfigUtility.get("resultPath");
File dirs = new File(resultPath);
dirs.mkdirs();
Date date = new Date();
String browserFolderName = new SimpleDateFormat("yyyyMMdd_HH.mm.ss")
.format(date)+
"_"+StringUtils
.capitalize(
Drivers.browser);
File browserDir = new File(resultPath+"\\"+browserFolderName);
browserDir.mkdirs();
HelperClass fc = HelperClass.getInstance();
fc.browserFolderName = browserFolderName;
}
@AfterSuite
public void generateReport() {
ConfigUtility.prop();
String resultPath = ConfigUtility.get("resultPath");
HelperClass hc = HelperClass.getInstance();
CucumberResultsOverview results = new CucumberResultsOverview();
results.setOutputDirectory(resultPath+"/"+hc.browserFolderName);
results.setOutputName("cucumber-results-overview");
results.setSourceFile("target/Extended-Report/cucumber.json");
try {
results.execute();
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
CucumberDetailedResults detailedResult = new CucumberDetailedResults();
detailedResult.setOutputDirectory(resultPath+"/"+hc.browserFolderName);
detailedResult.setOutputName("cucumber-detailed");
detailedResult.setSourceFile("target/Extended-Report/cucumber.json");
detailedResult.setScreenShotLocation("./Error_Screenshots/");
detailedResult.setScreenShotWidth("100%");
try {
detailedResult.execute();
} catch (Exception e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
}
}
This is the code I am using to take the screenshot on failure:
@After
public void tearDown(Scenario scenario) {
if (Drivers.getDriver() == null || Drivers.getDriver().toString().contains("(null)"))
return;
if (scenario.isFailed() || scenario.getStatus().equalsIgnoreCase("failed")) {
final byte[] screenshot = ((TakesScreenshot) Drivers.getDriver()).getScreenshotAs(OutputType.BYTES);
scenario.embed(screenshot, "image/png");
}
}
This is how the report looks; I want only one screenshot.
Recently we moved to HDP 2.5, which has Kafka 0.10.0 and Spark 1.6.2, so I modified my pom and some of the APIs to work with the new Kafka. I can run the code, but I do not see any messages coming in. I have added a code snippet below and also posted my pom. I am not sure what is going wrong here. Can someone please help?
SparkConf conf = new SparkConf().setMaster("local[2]").setAppName(
"SparkApp");
JavaStreamingContext jssc = new JavaStreamingContext(conf,
Durations.seconds(2));
Map<String, Integer> topicMap = new HashMap<String, Integer>();
topicMap.put(this.topic, this.numThreads);
Map<String, String> kafkaParams = new HashMap<>();
kafkaParams.put("metadata.broker.list", kfkBroker);
kafkaParams.put("zookeeper.connect", zkBroker);
kafkaParams.put("group.id", "default");
kafkaParams.put("fetch.message.max.bytes", "60000000");
JavaPairReceiverInputDStream<String, String> kafkaInStream = KafkaUtils.createStream(
jssc,
String.class,
String.class,
kafka.serializer.StringDecoder.class,
kafka.serializer.StringDecoder.class,
kafkaParams,
topicMap,
StorageLevel.MEMORY_AND_DISK());
kafkaInStream.foreachRDD(new VoidFunction<JavaPairRDD<String, String>>()
{
/**
*
*/
private static final long serialVersionUID = 1L;
public void call(JavaPairRDD<String, String> v1) throws Exception
{
System.out.println("inside call.. JavaPairRDD size " + v1.count());
for (Tuple2<String, String> test : v1.collect())
{
this.eventMessage.setMessage(test._2);
}
}
});
I always get the output "inside call.. JavaPairRDD size 0", which indicates that Spark is not reading any data. I tried pushing some data into the topic through the console producer, but that did not help.
Here is my pom.xml (only dependencies added)
<dependencies>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.10.1.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.10 -->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.10.1.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.6.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.10</artifactId>
<version>1.6.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20160810</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.3</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.101tec</groupId>
<artifactId>zkclient</artifactId>
<version>0.8</version>
</dependency>
</dependencies>
spark-streaming-kafka_2.10 only works with the Kafka 0.8+ client. You can still use the Kafka 0.8+ client to connect to a 0.10+ cluster, but you lose some performance.
I suggest that you just use --packages when submitting your application, so you can avoid putting Kafka in your dependencies. E.g.,
bin/spark-submit --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.2 ...
I am trying to convert a Word document to PDF using Apache POI, but I get an exception:
java.lang.NoSuchMethodError: org.apache.poi.util.POILogger.log(I[Ljava/lang/Object;)V
This is my code:
InputStream is = new FileInputStream(new File("D:\\2161-1041-5-157.docx"));
XWPFDocument document = new XWPFDocument(is);
// 2) Prepare Pdf options
PdfOptions options = PdfOptions.create();
// 3) Convert XWPFDocument to Pdf
OutputStream out = new FileOutputStream(new File("D:\\2161-1041-5-157.pdf"));
PdfConverter.getInstance().convert(document, out, options);
The full exception and call stack is:
java.lang.NoSuchMethodError: org.apache.poi.util.POILogger.log(I[Ljava/lang/Object;)V
    at org.apache.poi.openxml4j.opc.PackageRelationshipCollection.parseRelationshipsPart(PackageRelationshipCollection.java:314)
    at org.apache.poi.openxml4j.opc.PackageRelationshipCollection.<init>(PackageRelationshipCollection.java:164)
    at org.apache.poi.openxml4j.opc.PackageRelationshipCollection.<init>(PackageRelationshipCollection.java:132)
    at org.apache.poi.openxml4j.opc.PackagePart.loadRelationships(PackagePart.java:561)
    at org.apache.poi.openxml4j.opc.PackagePart.<init>(PackagePart.java:109)
    at org.apache.poi.openxml4j.opc.PackagePart.<init>(PackagePart.java:80)
    at org.apache.poi.openxml4j.opc.PackagePart.<init>(PackagePart.java:125)
    at org.apache.poi.openxml4j.opc.ZipPackagePart.<init>(ZipPackagePart.java:78)
    at org.apache.poi.openxml4j.opc.ZipPackage.getPartsImpl(ZipPackage.java:237)
    at org.apache.poi.openxml4j.opc.OPCPackage.getParts(OPCPackage.java:696)
    at org.apache.poi.openxml4j.opc.OPCPackage.open(OPCPackage.java:280)
    at org.apache.poi.util.PackageHelper.open(PackageHelper.java:37)
    at org.apache.poi.xwpf.usermodel.XWPFDocument.<init>(XWPFDocument.java:128)
    at org.open.word.POIWordToPDF.createPDF(POIWordToPDF.java:27)
    at org.open.word.POIWordToPDF.main(POIWordToPDF.java:18)
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.14</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>3.11</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml-schemas</artifactId>
<version>3.14</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-scratchpad</artifactId>
<version>3.14</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>ooxml-schemas</artifactId>
<version>1.3</version>
</dependency>
Okay, this is probably an ID10T error somewhere, but I am just not seeing it. I have only a shell of the test, yet the methods get, status and content show as unresolved. I don't know whether I am missing something in the pom or an import, but I am missing something somewhere.
Here is the unit test shell.
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;
import org.springframework.http.MediaType;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.web.WebAppConfiguration;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;
import com.dstbs.prime.service.interfaces.AccountServiceI;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations={"file:src/test/test-context.xml"})
@WebAppConfiguration
public class AccountControllerTest
{
@Mock
private AccountServiceI acctSrvc;
private MockMvc mockMvc;
@Before
public void setup() {
// Process mock annotations
MockitoAnnotations.initMocks(this);
// Setup Spring test in standalone mode
this.mockMvc = MockMvcBuilders.standaloneSetup(new AccountController()).build();
}
//BELOW SAYS THAT get(), status() and content() are unresolved.
@Test
public void testGetAccount() throws Exception {
mockMvc.perform(get("/account").accept(MediaType.parseMediaType("application/json")))
.andExpect(status().isOk())
.andExpect(content().contentType("application/json"));
}
}
Here are the Spring and Mockito pom entries I have:
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-oxm</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${org.springframework-version}</version>
<exclusions>
<!-- Exclude Commons Logging in favor of SLF4j -->
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context-support</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.9.5</version>
<scope>test</scope>
</dependency>
You need to include the necessary static imports.
See the Static Imports section of the Spring Reference Manual for details.
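For the get, status and content methods used in the test above, those static imports are typically the standard Spring MVC Test ones:

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

With those in place, get("/account"), status().isOk() and content().contentType(...) resolve against the spring-test dependency already in your pom.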