I am trying to use Spark and Cassandra through Spring in NetBeans and I get an error:
type=Internal Server Error, status=500
Failed to open native connection to Cassandra at {127.0.0.1}:9042.
Spark and Cassandra were working fine before I tried to integrate Spring. My Cassandra database already contains data, which I read and process through Spark. Basically, I want to print the results (a matrix) on a /welcome page through a RestController.
Here is my really simple File Structure:
[screenshot: project file structure]
Here is my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.mycompany</groupId>
<artifactId>my-app</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<debug>true</debug>
</configuration>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.0.0.RELEASE</version>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<!--Spring dependencies-->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.9.0</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.2</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>2.0.0.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>5.0.4.RELEASE</version>
</dependency>
<!--Spark dependencies-->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.2.1</version>
</dependency>
<!--Cassandra dependencies-->
<!--Spark cassandra connector dependencies-->
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.11</artifactId>
<version>2.0.7</version>
</dependency>
</dependencies>
</project>
The Spark context and session initialization:
@Configuration
public class Sparkstart {
    @Bean
    public SparkSession sparksession() {
        SparkSession sp = SparkSession
                .builder()
                .master("local[*]")
                .appName("preprocessing")
                .config("spark.cassandra.connection.host", "127.0.0.1")
                .getOrCreate();
        return sp;
    }

    @Bean
    public JavaSparkContext sc() {
        JavaSparkContext sc = new JavaSparkContext(sparksession().sparkContext());
        return sc;
    }
}
The class where I take the data from the Cassandra database:
@Component
public class Aftersparkstart {
    @Autowired
    private SparkSession sp;
    @Autowired
    private JavaSparkContext sc;
    @Autowired
    private Pearsonclass prs;

    public Matrix start() {
        List<String> desclist = new ArrayList<>();
        desclist.add(some data); // placeholder values, elided in the original
        desclist.add(some data);
        Dataset<Row> peakset = sp.read().format("org.apache.spark.sql.cassandra")
                .options(new HashMap<String, String>() {
                    {
                        put("keyspace", "mdb");
                        put("table", "filepeaks");
                    }
                })
                .load()
                .select(col("directoryname"), col("description"), col("intensity"))
                .filter(col("description").isin(desclist.toArray()));
        Dataset<Row> finalpeaks = peakset
                .groupBy(peakset.col("description"), peakset.col("directoryname"))
                .avg("intensity")
                .orderBy(asc("directoryname"), asc("description"));
        Matrix r = prs.pearsonmethod(finalpeaks, dirlist, desclist);
        return r;
    }
}
And the class where the Spark processing takes place:
@Component
public class Pearsonclass {
    public Matrix pearsonmethod(Dataset<Row> peaks, List<String> dirlist, List<String> desclist) {
        // ...stuff...
        return r2;
    }
}
And finally the RestController:
@RestController
public class Firstcontroller {
    @Autowired
    private Aftersparkstart str;

    @RequestMapping("/welcome")
    public Matrix welcome() {
        //return wlc.retrievemsg();
        return str.start();
    }
}
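(A side note on returning the matrix from /welcome: Spark's Matrix type may not serialize cleanly through Jackson. A minimal sketch, assuming the org.apache.spark.mllib.linalg.Matrix type, of a hypothetical helper that is not in the original code, flattening it into a JSON-friendly array first:)

// Hypothetical helper: flatten an MLlib Matrix into double[][] so that
// Jackson can serialize the /welcome response as nested JSON arrays.
private double[][] toRows(Matrix m) {
    double[][] rows = new double[m.numRows()][m.numCols()];
    for (int i = 0; i < m.numRows(); i++) {
        for (int j = 0; j < m.numCols(); j++) {
            rows[i][j] = m.apply(i, j); // apply(i, j) returns the (i, j) entry
        }
    }
    return rows;
}

The welcome() method could then return toRows(str.start()) instead of the raw Matrix.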
I am pretty sure I am missing something in the dependencies but I don't know what!
Got it! I just upgraded my Cassandra version from 3.11.0 to 3.11.2. The problem was a JDK incompatibility with Cassandra: I am on JDK 1.8.0_162 (8u162), which the previous Cassandra version did not get along with!
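(For reference, the exact JVM version the app runs on, which is what mattered for the Cassandra compatibility here, can be printed from Java itself:)

// Print the runtime Java version, e.g. "1.8.0_162"
System.out.println(System.getProperty("java.version"));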
Related
I have an issue with Azure block blob storage. If I access the storage and create a container from a jar, it works fine; if I run it from a spark-submit command, it doesn't. I'm trying to capture the traffic between my code and Azure to see where it goes wrong, but the problem is that Fiddler doesn't capture that kind of traffic, although it does capture traffic to other sites such as www.google.com.
This works:
import java.net.*;
import java.io.*;

public class Example {
    public static void main(String[] args) throws Exception {
        System.setProperty("proxySet", "true");
        System.setProperty("proxyHost", "127.0.0.1");
        System.setProperty("proxyPort", "9090");
        System.setProperty("javax.net.ssl.trustStore", "C:\\data\\keys\\FiddlerKeystore");
        System.setProperty("javax.net.ssl.trustStorePassword", "password");
        URL x = new URL("https://www.google.com");
        HttpURLConnection hc = (HttpURLConnection) x.openConnection();
        hc.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2");
        InputStream is = hc.getInputStream();
        int u = 0;
        byte[] kj = new byte[1024];
        while ((u = is.read(kj)) != -1) {
            System.out.write(kj, 0, u);
        }
        is.close();
    }
}
Now, if I do the same with the Azure code, Fiddler doesn't capture anything.
Here is my Azure code:
import azure.AzureBlockBlobClient;
import common.AzureConf;
import org.apache.log4j.BasicConfigurator;

import java.io.IOException;

public class AzureExample {
    private AzureBlockBlobClient azureBlockBlobClient;
    private static final org.slf4j.Logger log = org.slf4j.LoggerFactory.getLogger(AzureExample.class);

    public AzureExample() {
        azureBlockBlobClient = new AzureBlockBlobClient(AzureConf.ACCOUNT_NAME, AzureConf.ACCOUNT_KEY, AzureConf.CONTAINER_NAME);
        azureBlockBlobClient.createContainer();
    }

    public static void main(String... args) throws IOException {
        BasicConfigurator.configure();
        System.setProperty("proxySet", "true");
        System.setProperty("proxyHost", "127.0.0.1");
        System.setProperty("proxyPort", "9090");
        System.setProperty("javax.net.ssl.trustStore", "C:\\data\\keys\\FiddlerKeystore");
        System.setProperty("javax.net.ssl.trustStorePassword", "password");
        new AzureExample();
        System.exit(0);
    }
}
Here is the client that connects to Azure:
public AzureBlockBlobClient(String accountName, String accountKey, String containerName) {
    this.accountName = accountName;
    this.accountKey = accountKey;
    this.containerName = containerName;
    init();
}

private void init() {
    log.info("Init AzureBlockBlobClient started...");
    try {
        SharedKeyCredentials creds = new SharedKeyCredentials(accountName, accountKey);
        serviceURL = new ServiceURL(new URL("https://" + accountName + ".blob.core.windows.net/"),
                StorageURL.createPipeline(creds, new PipelineOptions()));
        containerURL = serviceURL.createContainerURL(containerName);
    } catch (InvalidKeyException e) {
        log.error("Authentication error while trying to access storage account", e);
    } catch (MalformedURLException e) {
        log.error("Invalid Service URL", e);
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
        log.error("Error initializing AzureBlockBlobClient", e);
    }
    log.info("Init AzureBlockBlobClient Done!");
}

public void createContainer() {
    try {
        // Create a container using a blocking call to Azure Storage.
        // If the container already exists, we catch the 409 and continue.
        log.info("Creating container {}.", containerName);
        ContainerCreateResponse response = containerURL.create(null, null, null).blockingGet();
        log.info("Container Create Response was {}.", response.statusCode());
    } catch (RestException e) {
        if (e.response().statusCode() != 409) {
            log.error("Error Creating container", e);
        } else {
            log.info("Container {} already exists, resuming...", containerName);
        }
    }
}
And this is where my constants are:
public interface AzureConf {
    String ACCOUNT_KEY = "<SomeAccountKey>";
    String ACCOUNT_NAME = "storage";
    String CONTAINER_NAME = "My-container";
}
This is my Maven pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>examples</groupId>
<artifactId>spark-azure-storage</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<junit.version>4.12</junit.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-azure</artifactId>
<version>2.7.1</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-storage</artifactId>
<version>2.0.0</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-storage-blob</artifactId>
<version>10.1.0</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>io.reactivex.rxjava2</groupId>
<artifactId>rxjava</artifactId>
<version>2.2.3</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.16</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.typesafe.akka/akka-actor -->
<dependency>
<groupId>com.microsoft.rest.v2</groupId>
<artifactId>client-runtime</artifactId>
<version>2.0.0</version>
<!-- I have to exclude the following dependencies and include version 2.9.7 of them,
otherwise I get a NoSuchMethodError -->
<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
</exclusion>
<exclusion>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.7.16</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.9.7</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.9.7</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.9.7</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.2.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"></transformer>
</transformers>
</configuration>
</plugin>
</plugins>
</build>
Any help to get this to work?
Thank you in advance
According to the official Oracle document Java Networking and Proxies, the system properties proxySet, proxyHost and proxyPort do not exist.
Please use https.proxyHost and https.proxyPort instead; those work for me.
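A minimal sketch of the corrected setup, using the same host and port as in the question:

// Documented JVM proxy properties (see "Java Networking and Proxies"):
System.setProperty("https.proxyHost", "127.0.0.1");
System.setProperty("https.proxyPort", "9090");
// For plain HTTP traffic, http.proxyHost and http.proxyPort are the equivalents.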
I am having the following issue while working with Cucumber and Cucumber extended reports. I use a method that takes a screenshot when a step in a scenario fails and then embeds that screenshot in the Cucumber detailed report. The feature works, except that it generates multiple screenshots when I only want one screenshot of the page where the failure occurred. If anyone knows how to solve this, please shed some light; I would really appreciate it.
Thanks in advance.
This is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.moneris.automation</groupId>
<artifactId>md01_automation_demo</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>merchant_direct</name>
<profiles>
<profile>
<id>md_01_automation</id>
</profile>
</profiles>
<dependencies>
<!-- This is what I added for the extended report BEGIN -->
<dependency>
<groupId>com.github.mkolisnyk</groupId>
<artifactId>cucumber-runner</artifactId>
<version>1.3.3</version>
</dependency>
<dependency>
<groupId>com.github.mkolisnyk</groupId>
<artifactId>cucumber-reports</artifactId>
<version>1.3.3</version>
<type>pom</type>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.opencsv</groupId>
<artifactId>opencsv</artifactId>
<version>3.8</version>
</dependency>
<!-- https://mvnrepository.com/artifact/info.cukes/cucumber-testng -->
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-testng</artifactId>
<version>1.2.5</version>
</dependency>
<dependency>
<groupId>com.vimalselvam</groupId>
<artifactId>cucumber-extentsreport</artifactId>
<version>3.0.2</version>
</dependency>
<dependency>
<groupId>com.relevantcodes</groupId>
<artifactId>extentreports</artifactId>
<version>2.41.2</version>
</dependency>
<dependency>
<groupId>com.aventstack</groupId>
<artifactId>extentreports</artifactId>
<version>3.1.5</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-java</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-junit</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.sun</groupId>
<artifactId>tools</artifactId>
<version>1.7.0.13</version>
<scope>system</scope>
<systemPath>${env.JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>gherkin</artifactId>
<version>2.12.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.freemarker</groupId>
<artifactId>freemarker</artifactId>
<version>2.3.26-incubating</version>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.14.3</version>
<scope>test</scope>
</dependency>
<!-- This is what I added for the extended report END -->
<!-- <dependency> <groupId>info.cukes</groupId> <artifactId>cucumber-java</artifactId>
<version>1.2.4</version> <scope>test</scope> </dependency> -->
<!-- <dependency> <groupId>info.cukes</groupId> <artifactId>cucumber-junit</artifactId>
<version>1.2.4</version> <scope>test</scope> </dependency> -->
<!-- https://mvnrepository.com/artifact/info.cukes/cucumber-picocontainer -->
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.12.0</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.17</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-io -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-io</artifactId>
<version>1.3.2</version>
</dependency>
<!-- This dependency is needed to run TestNG from the command line -->
<dependency>
<groupId>com.beust</groupId>
<artifactId>jcommander</artifactId>
<version>1.72</version>
</dependency>
<!-- To take the screenshots of the entire page -->
<dependency>
<groupId>ru.yandex.qatools.ashot</groupId>
<artifactId>ashot</artifactId>
<version>1.5.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.assertthat/selenium-shutterbug -->
<dependency>
<groupId>com.assertthat</groupId>
<artifactId>selenium-shutterbug</artifactId>
<version>0.7</version>
</dependency>
<!-- Library to compare two text files -->
<dependency>
<groupId>com.github.wumpz</groupId>
<artifactId>diffutils</artifactId>
<version>2.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.hp.gagawa/gagawa -->
<dependency>
<groupId>com.hp.gagawa</groupId>
<artifactId>gagawa</artifactId>
<version>1.0.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.3</version>
<configuration>
<descriptor>src/test/resources/assembly.xml</descriptor>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<archive>
<manifest>
<mainClass>com.moneris.automation.main.Execute</mainClass>
</manifest>
</archive>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<fork>true</fork>
<executable>C:\Program Files\Java\jdk1.8.0_171\bin\javac.exe</executable>
<source>1.8</source>
<target>1.8</target>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<testFailureIgnore>true</testFailureIgnore>
<includes>
<include>**/*TestRunner.java</include>
</includes>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-report-plugin</artifactId>
<version>2.20</version>
</plugin>
<plugin>
<groupId>net.masterthought</groupId>
<artifactId>maven-cucumber-reporting</artifactId>
<version>2.0.0</version>
<executions>
<execution>
<id>execution</id>
<phase>test</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<projectName>Merchant Direct</projectName>
<outputDirectory>${project.build.directory}/site/cucumber-reports</outputDirectory>
<cucumberOutput>${project.build.directory}/Extended-Report/cucumber.json</cucumberOutput>
<skippedFails>true</skippedFails>
<enableFlashCharts>true</enableFlashCharts>
<buildNumber>1</buildNumber>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
This is my Runner class:
package com.moneris.automation.runner;
import java.io.File;
import java.io.IOException;
import java.text.SimpleDateFormat;
import java.util.Date;
import org.apache.commons.lang.StringUtils;
import org.testng.annotations.AfterSuite;
import com.github.mkolisnyk.cucumber.reporting.CucumberDetailedResults;
import com.github.mkolisnyk.cucumber.reporting.CucumberResultsOverview;
import com.github.mkolisnyk.cucumber.runner.BeforeSuite;
import com.github.mkolisnyk.cucumber.runner.ExtendedCucumberOptions;
import com.github.mkolisnyk.cucumber.runner.ExtendedTestNGRunner;
import com.moneris.automation.counter.HelperClass;
import com.moneris.automation.stepsDefinitions.Drivers;
import com.moneris.automation.utilities.ConfigUtility;
import cucumber.api.CucumberOptions;
@ExtendedCucumberOptions(jsonReport = "target/Extended-Report/cucumber.json",
        retryCount = 0,
        detailedReport = true,
        detailedAggregatedReport = true,
        overviewReport = true,
        coverageReport = true,
        featureOverviewChart = true,
        overviewChartsReport = true,
        jsonUsageReport = "target/Extended-Report/cucumber-usage.json",
        usageReport = true,
        systemInfoReport = true,
        toPDF = true,
        screenShotLocation = "./../",
        screenShotSize = "85%",
        outputFolder = "target/Extended-Report")
@CucumberOptions(
        strict = true,
        features = "resources/features",
        glue = {"com.moneris.automation.stepsDefinitions"},
        plugin = {
                "pretty:STDOUT", "html:target/reports/cucumber-pretty",
                "json:target/Extended-Report/cucumber.json",
                "usage:target/Extended-Report/cucumber-usage.json"
        },
        monochrome = true
)
public class TestRunner extends ExtendedTestNGRunner {
    @org.testng.annotations.AfterClass
    public static void setup() throws IOException {
        Drivers.getDriver().quit();
        if (Drivers.browser.equals("chrome"))
            Runtime.getRuntime().exec("taskkill /F /IM ChromeDriver.exe");
    }

    @BeforeSuite
    public void generateDirectories() {
        ConfigUtility.prop();
        String resultPath = ConfigUtility.get("resultPath");
        File dirs = new File(resultPath);
        dirs.mkdirs();
        Date date = new Date();
        String browserFolderName = new SimpleDateFormat("yyyyMMdd_HH.mm.ss").format(date)
                + "_" + StringUtils.capitalize(Drivers.browser);
        File browserDir = new File(resultPath + "\\" + browserFolderName);
        browserDir.mkdirs();
        HelperClass fc = HelperClass.getInstance();
        fc.browserFolderName = browserFolderName;
    }

    @AfterSuite
    public void generateReport() {
        ConfigUtility.prop();
        String resultPath = ConfigUtility.get("resultPath");
        HelperClass hc = HelperClass.getInstance();
        CucumberResultsOverview results = new CucumberResultsOverview();
        results.setOutputDirectory(resultPath + "/" + hc.browserFolderName);
        results.setOutputName("cucumber-results-overview");
        results.setSourceFile("target/Extended-Report/cucumber.json");
        try {
            results.execute();
        } catch (Exception e) {
            e.printStackTrace();
        }
        CucumberDetailedResults detailedResult = new CucumberDetailedResults();
        detailedResult.setOutputDirectory(resultPath + "/" + hc.browserFolderName);
        detailedResult.setOutputName("cucumber-detailed");
        detailedResult.setSourceFile("target/Extended-Report/cucumber.json");
        detailedResult.setScreenShotLocation("./Error_Screenshots/");
        detailedResult.setScreenShotWidth("100%");
        try {
            detailedResult.execute();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
This is the code I am using to take the screenshot on failure:
@After
public void tearDown(Scenario scenario) {
    if (Drivers.getDriver() == null || Drivers.getDriver().toString().contains("(null)"))
        return;
    if (scenario.isFailed() || scenario.getStatus().equalsIgnoreCase("failed")) {
        // Capture the browser state and embed it into the Cucumber JSON report
        final byte[] screenshot = ((TakesScreenshot) Drivers.getDriver()).getScreenshotAs(OutputType.BYTES);
        scenario.embed(screenshot, "image/png");
    }
}
This is how the report looks. I want only one screenshot.
Recently we moved to HDP 2.5, which has Kafka 0.10.0 and Spark 1.6.2, so I modified my pom and some of the APIs to work with the new Kafka. The code runs, but I do not see any messages coming in. I have added a code snippet below, along with my pom. I am not sure what is going wrong here; can someone please help?
SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("SparkApp");
JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(2));

Map<String, Integer> topicMap = new HashMap<String, Integer>();
topicMap.put(this.topic, this.numThreads);

Map<String, String> kafkaParams = new HashMap<>();
kafkaParams.put("metadata.broker.list", kfkBroker);
kafkaParams.put("zookeeper.connect", zkBroker);
kafkaParams.put("group.id", "default");
kafkaParams.put("fetch.message.max.bytes", "60000000");

JavaPairReceiverInputDStream<String, String> kafkaInStream = KafkaUtils.createStream(
        jssc,
        String.class,
        String.class,
        kafka.serializer.StringDecoder.class,
        kafka.serializer.StringDecoder.class,
        kafkaParams,
        topicMap,
        StorageLevel.MEMORY_AND_DISK());

kafkaInStream.foreachRDD(new VoidFunction<JavaPairRDD<String, String>>() {
    private static final long serialVersionUID = 1L;

    public void call(JavaPairRDD<String, String> v1) throws Exception {
        System.out.println("inside call.. JavaPairRDD size " + v1.count());
        for (Tuple2<String, String> test : v1.collect()) {
            this.eventMessage.setMessage(test._2);
        }
    }
});
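One thing worth checking, in case it is not in a part of the code omitted here: the streaming context has to be started, or foreachRDD is only registered and no batches ever run:

jssc.start();            // actually begin consuming from Kafka
jssc.awaitTermination(); // keep the driver alive while the stream runs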
I always get the output "inside call.. JavaPairRDD size 0", which indicates that Spark is not reading any data. I tried pushing some data into the topic through the console producer, but that did not help.
Here is my pom.xml (only dependencies added)
<dependencies>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.10.1.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.10 -->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.10.1.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.6.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.10</artifactId>
<version>1.6.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20160810</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.3</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.101tec</groupId>
<artifactId>zkclient</artifactId>
<version>0.8</version>
</dependency>
</dependencies>
spark-streaming-kafka_2.10 only works with the Kafka 0.8+ client. You can still use the Kafka 0.8+ client to connect to a 0.10+ cluster, but you lose some performance.
I suggest that you just use --packages to submit your application, so you avoid declaring Kafka in your dependencies. E.g.,
bin/spark-submit --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.2 ...
I wrote a Spark app; the code is:
package com.demo;
import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONObject;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.FlatMapFunction;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.api.java.function.PairFunction;
import scala.Tuple2;
import java.util.ArrayList;
import java.util.List;
/**
* Created by sdvdxl on 2016/3/14.
*/
public class SparkCalcDemo {
    private static final String HADOOP_URL = "hdfs://10.10.1.110:8020/";

    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("test").setMaster("local[1]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        JavaRDD<String> textFile = sc.textFile(HADOOP_URL + "/xd/dataset1/2016/03/14/15/01", 1);
        JavaRDD<String> words = textFile.flatMap(new FlatMapFunction<String, String>() {
            public Iterable<String> call(String s) {
                List<String> list = new ArrayList<String>();
                JSONObject jobj = JSON.parseObject(new String(org.apache.commons.codec.binary.Base64.decodeBase64(s.substring(1, s.length() - 1))));
                list.add(jobj.getString("name"));
                list.add(jobj.getString("random"));
                return list;
            }
        });
        JavaPairRDD<String, Integer> pairs = words.mapToPair(new PairFunction<String, String, Integer>() {
            public Tuple2<String, Integer> call(String s) {
                return new Tuple2<String, Integer>(s, 1);
            }
        });
        JavaPairRDD<String, Integer> counts = pairs.reduceByKey(new Function2<Integer, Integer, Integer>() {
            public Integer call(Integer a, Integer b) {
                return a + b;
            }
        });
        counts.foreach(tuple2 ->
                System.out.println(tuple2._1 + " : " + tuple2._2));
    }
}
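One caveat on the final foreach, worth noting for the cluster case: System.out.println inside foreach runs on the executors, so when the job runs against the standalone master (spark://localhost:7077) rather than local[1], the output goes to the worker logs, not the driver console. A minimal sketch that collects the counts to the driver before printing (fine here, since the result is small):

// Bring the (word, count) pairs back to the driver so the output shows up
// in the driver's console/log even when running on a standalone cluster.
for (Tuple2<String, Integer> t : counts.collect()) {
    System.out.println(t._1 + " : " + t._2);
}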
And the pom content is:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>kafka-demo</groupId>
<artifactId>kafka-demo</artifactId>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.7</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.alibaba</groupId>
<artifactId>fastjson</artifactId>
<version>1.2.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.6.1</version>
<!--<exclusions>
<exclusion>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-scala_2.10</artifactId>
</exclusion>
</exclusions>-->
</dependency>
<!--dependency>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-scala_2.10</artifactId>
<version>2.7.2</version>
</dependency>-->
<dependency>
<groupId>commons-codec</groupId>
<artifactId>commons-codec</artifactId>
<version>1.10</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-base64</artifactId>
<version>2.16.2</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<repositories>
<repository>
<id>oschina</id>
<url>http://maven.oschina.net/content/groups/public</url>
</repository>
<repository>
<id>mavenspring</id>
<url>http://maven.springframework.org/release</url>
</repository>
<repository>
<id>jcenter</id>
<url>http://jcenter.bintray.com</url>
</repository>
<repository>
<id>spring-milestones</id>
<name>Spring Milestones</name>
<url>http://repo.spring.io/milestone</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
<repository>
<id>spring-release</id>
<name>Spring Releases</name>
<url>http://repo.spring.io/libs-release</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
<repository>
<id>spring-snapshots</id>
<name>Spring Snapshots</name>
<url>http://repo.spring.io/snapshot</url>
<snapshots>
<enabled>true</enabled>
</snapshots>
<releases>
<enabled>false</enabled>
</releases>
</repository>
</repositories>
</project>
When I submit it through spark-submit it works well, but when I submit it through Spring XD with job create --name sparkAppDemo --definition "sparkapp --mainClass=com.demo.SparkCalcDemo --appJar=/home/spark/spark-app.jar --master=spark://localhost:7077" --deploy followed by job launch sparkAppDemo, there is no result output and the job's status in the Spring XD admin UI stays STARTED; it looks like something is wrong and it hangs.
Two screenshots are attached here: [job status page] and [job detail page].
Okay, this is probably an ID10T error somewhere, but I am just not seeing it. I have only a shell of the test so far, yet the methods get(), status() and content() show as unresolved. I don't know what I am missing, whether in the pom or in an import.
Here is the unit test shell.
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;
import org.springframework.http.MediaType;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.web.WebAppConfiguration;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;
import com.dstbs.prime.service.interfaces.AccountServiceI;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"file:src/test/test-context.xml"})
@WebAppConfiguration
public class AccountControllerTest {
    @Mock
    private AccountServiceI acctSrvc;

    private MockMvc mockMvc;

    @Before
    public void setup() {
        // Process mock annotations
        MockitoAnnotations.initMocks(this);
        // Set up the Spring test in standalone mode
        this.mockMvc = MockMvcBuilders.standaloneSetup(new AccountController()).build();
    }

    // BELOW SAYS THAT get(), status() and content() are unresolved.
    @Test
    public void testGetAccount() throws Exception {
        mockMvc.perform(get("/account").accept(MediaType.parseMediaType("application/json")))
                .andExpect(status().isOk())
                .andExpect(content().contentType("application/json"));
    }
}
Here are the Spring and Mockito pom entries I have:
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-oxm</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-jdbc</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-core</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${org.springframework-version}</version>
<exclusions>
<!-- Exclude Commons Logging in favor of SLF4j -->
<exclusion>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context-support</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-tx</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>${org.springframework-version}</version>
</dependency>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-all</artifactId>
<version>1.9.5</version>
<scope>test</scope>
</dependency>
You need to include the necessary static imports.
See the Static Imports section of the Spring Reference Manual for details.
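For the test shown above, these are the static imports that make get(), status() and content() resolve:

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;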