Migrating from JSF 2.2 to JSF 2.3

I want to migrate from JSF 2.2 to JSF 2.3. The application was working fine when I had these two JSF 2.2 dependencies:
<dependency>
    <groupId>org.apache.myfaces.core</groupId>
    <artifactId>myfaces-api</artifactId>
    <version>2.2.0</version>
</dependency>
<dependency>
    <groupId>org.apache.myfaces.core</groupId>
    <artifactId>myfaces-impl</artifactId>
    <version>2.2.0</version>
</dependency>
But when I replaced them with these dependencies:
<dependency>
    <groupId>org.apache.myfaces.core</groupId>
    <artifactId>myfaces-api</artifactId>
    <version>2.3-next-M3</version>
</dependency>
<dependency>
    <groupId>org.apache.myfaces.core</groupId>
    <artifactId>myfaces-impl</artifactId>
    <version>2.3-next-M3</version>
</dependency>
<dependency>
    <groupId>javax.enterprise</groupId>
    <artifactId>cdi-api</artifactId>
    <version>1.2</version>
    <scope>provided</scope>
</dependency>
I got an error saying:
No Factories configured for this Application
A typical config looks like this;
<listener>
    <listener-class>org.apache.myfaces.webapp.StartupServletContextListener</listener-class>
</listener>
Here is the Servlet Registration Bean:
@Bean
public ServletRegistrationBean facesServlet() {
    FacesServlet servlet = new FacesServlet();
    ServletRegistrationBean registration = new ServletRegistrationBean(servlet, "*.jsf");
    registration.setName("Faces Servlet");
    registration.setLoadOnStartup(1);
    registration.setMultipartConfig(new MultipartConfigElement((String) null));
    return registration;
}
The ServletContext Initializer:
@Bean
public ServletContextInitializer servletContextInitializer() {
    return servletContext -> {
        servletContext.setInitParameter("javax.faces.FACELETS_SKIP_COMMENTS", Boolean.TRUE.toString());
        servletContext.setInitParameter("primefaces.FONT_AWESOME", Boolean.FALSE.toString());
        servletContext.setInitParameter("javax.faces.FACELETS_LIBRARIES", "/WEB-INF/primefaces-california.taglib.xml");
        servletContext.setInitParameter("primefaces.THEME", "california-#{guestPreferences.theme}");
        servletContext.setInitParameter("javax.faces.PROJECT_STAGE", "PRODUCTION");
        servletContext.setInitParameter("javax.faces.STATE_SAVING_METHOD", "server");
    };
}
And I am using the Tomcat server:
<dependency>
    <groupId>org.apache.tomcat.embed</groupId>
    <artifactId>tomcat-embed-jasper</artifactId>
</dependency>
I tried to add the StartupServletContextListener in web.xml, but nothing happens.
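One suspicion I have (not yet verified): with Spring Boot's embedded Tomcat, web.xml is not processed at all, so the listener may need to be registered programmatically instead, for example at the top of the ServletContextInitializer lambda above:
// untested sketch: register the MyFaces startup listener programmatically,
// since embedded Tomcat ignores web.xml
servletContext.addListener(org.apache.myfaces.webapp.StartupServletContextListener.class);
ServletContext.addListener(Class) is standard Servlet 3.0 API, and StartupServletContextListener is the class named in the error message, so this should at least be a starting point.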

Related

Apache POI Excel XLSX save as PDF

I currently work with the Spring Framework. I was asked to produce a report as PDF output that retrieves data from a SQL database, but the project I'm working on uses Apache POI, where the output is Excel XLSX.
Is there a way to convert the Excel XLSX format to PDF?
I already have Apache POI and recently added iTextPDF.
Apache POI
<dependency>
    <groupId>org.apache.poi</groupId>
    <artifactId>poi-ooxml</artifactId>
    <version>3.15</version>
</dependency>
iTextPDF
<properties>
    <itext.version>7.2.3</itext.version>
</properties>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>itext7-core</artifactId>
    <version>${itext.version}</version>
    <type>pom</type>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>kernel</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>io</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>layout</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>forms</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>pdfa</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>sign</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>barcodes</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>font-asian</artifactId>
    <version>${itext.version}</version>
</dependency>
<dependency>
    <groupId>com.itextpdf</groupId>
    <artifactId>hyph</artifactId>
    <version>${itext.version}</version>
</dependency>
EDIT: This is the code that generates the XLSX file:
@RequestMapping(value = "/download", method = RequestMethod.POST)
@PreAuthorize("hasAuthority('CTRL_REPORT_READ')")
@ResponseBody
public void download(@Valid @ModelAttribute ReportForm reportForm, BindingResult result, RedirectAttributes redirectAttrs, HttpServletRequest request, HttpServletResponse response) throws IOException {
    logger.debug("IN: reimbursement/download report-POST");
    XSSFWorkbook workbook = null;
    try {
        workbook = new XSSFWorkbook();
        downloadExcel(reportForm, workbook);
        response.setContentType("application/vnd.ms-excel");
        SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMdd_HHmmss");
        String fileName = "Reimbursement Report " + sdf.format(new Date()) + ".xlsx";
        response.setHeader("Content-disposition", "attachment; filename=" + fileName);
        workbook.write(response.getOutputStream());
    } catch (Exception e) {
        logger.error("Error occurs when adding new data in method reimbursement with error message : " + e.getMessage());
        redirectAttrs.addFlashAttribute(AppDataConstant.ERROR_FLASH_RESP, "Download reimbursement report failed");
    } finally {
        if (workbook != null) {
            workbook.close();
        }
    }
}
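In case it helps anyone answer: as far as I can tell, neither library ships a direct XLSX-to-PDF converter, so my plan is to walk the workbook with POI and re-render each cell into an iText table. A minimal, untested sketch (the file names are placeholders; real sheets would also need cell styles, column widths, and merged regions handled):
import java.io.FileInputStream;
import org.apache.poi.ss.usermodel.DataFormatter;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.xssf.usermodel.XSSFSheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;
import com.itextpdf.kernel.pdf.PdfDocument;
import com.itextpdf.kernel.pdf.PdfWriter;
import com.itextpdf.layout.Document;
import com.itextpdf.layout.element.Paragraph;
import com.itextpdf.layout.element.Table;

public class XlsxToPdfSketch {
    public static void main(String[] args) throws Exception {
        try (XSSFWorkbook workbook = new XSSFWorkbook(new FileInputStream("report.xlsx"))) {
            XSSFSheet sheet = workbook.getSheetAt(0);
            int columns = sheet.getRow(0).getLastCellNum();
            Table table = new Table(columns);
            // DataFormatter renders cells roughly the way Excel displays them
            // (formulas would additionally need a FormulaEvaluator)
            DataFormatter formatter = new DataFormatter();
            for (Row row : sheet) {
                for (int c = 0; c < columns; c++) {
                    // CREATE_NULL_AS_BLANK keeps the PDF grid aligned when cells are missing
                    table.addCell(new Paragraph(formatter.formatCellValue(
                            row.getCell(c, Row.MissingCellPolicy.CREATE_NULL_AS_BLANK))));
                }
            }
            Document document = new Document(new PdfDocument(new PdfWriter("report.pdf")));
            document.add(table);
            document.close(); // also closes the underlying PdfDocument
        }
    }
}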
Thank you very much in advance.

Capture Azure block blob traffic with Fiddler

I have an issue with Azure block blob storage: if I access the storage and create a container from a jar, it works fine, but if I run it via a spark-submit command, it doesn't. I'm trying to capture the traffic between my code and Azure to see where it goes wrong, but Fiddler doesn't capture that kind of traffic, although it does capture traffic to other sites like www.google.com.
This works:
import java.net.*;
import java.io.*;

public class Example
{
    public static void main(String[] args) throws Exception
    {
        System.setProperty("proxySet", "true");
        System.setProperty("proxyHost", "127.0.0.1");
        System.setProperty("proxyPort", "9090");
        System.setProperty("javax.net.ssl.trustStore", "C:\\data\\keys\\FiddlerKeystore");
        System.setProperty("javax.net.ssl.trustStorePassword", "password");
        URL x = new URL("https://www.google.com");
        HttpURLConnection hc = (HttpURLConnection) x.openConnection();
        hc.setRequestProperty("User-Agent", "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.874.121 Safari/535.2");
        InputStream is = hc.getInputStream();
        int u = 0;
        byte[] kj = new byte[1024];
        while ((u = is.read(kj)) != -1)
        {
            System.out.write(kj, 0, u);
        }
        is.close();
    }
}
Now, if I do the same with the Azure code, Fiddler doesn't capture anything.
Here is my Azure code:
import azure.AzureBlockBlobClient;
import common.AzureConf;
import org.apache.log4j.BasicConfigurator;
import java.io.IOException;

public class AzureExample {
    private AzureBlockBlobClient azureBlockBlobClient;
    private static final org.slf4j.Logger log = org.slf4j.LoggerFactory.getLogger(AzureExample.class);

    public AzureExample() {
        azureBlockBlobClient = new AzureBlockBlobClient(AzureConf.ACCOUNT_NAME, AzureConf.ACCOUNT_KEY, AzureConf.CONTAINER_NAME);
        azureBlockBlobClient.createContainer();
    }

    public static void main(String... args) throws IOException {
        BasicConfigurator.configure();
        System.setProperty("proxySet", "true");
        System.setProperty("proxyHost", "127.0.0.1");
        System.setProperty("proxyPort", "9090");
        System.setProperty("javax.net.ssl.trustStore", "C:\\data\\keys\\FiddlerKeystore");
        System.setProperty("javax.net.ssl.trustStorePassword", "password");
        new AzureExample();
        System.exit(0);
    }
}
Here is the client that connects to Azure:
public AzureBlockBlobClient(String accountName, String accountKey, String containerName) {
    this.accountName = accountName;
    this.accountKey = accountKey;
    this.containerName = containerName;
    init();
}

private void init() {
    log.info("Init AzureBlockBlobClient started...");
    try {
        SharedKeyCredentials creds = new SharedKeyCredentials(accountName, accountKey);
        serviceURL = new ServiceURL(new URL("https://" + accountName + ".blob.core.windows.net/"),
                StorageURL.createPipeline(creds, new PipelineOptions()));
        containerURL = serviceURL.createContainerURL(containerName);
    } catch (InvalidKeyException e) {
        log.error("Authentication error while trying to access storage account", e);
    } catch (MalformedURLException e) {
        log.error("Invalid Service URL", e);
        e.printStackTrace();
    } catch (Exception e) {
        e.printStackTrace();
        log.error("Error initializing AzureBlockBlobClient", e);
    }
    log.info("Init AzureBlockBlobClient Done!");
}

public void createContainer() {
    try {
        // Let's create a container using a blocking call to Azure Storage
        // If container exists, we'll catch and continue
        log.info("Creating container {}.", containerName);
        ContainerCreateResponse response = containerURL.create(null, null, null).blockingGet();
        log.info("Container Create Response was {}.", response.statusCode());
    } catch (RestException e) {
        // the instanceof check was redundant inside a catch (RestException e) block
        if (e.response().statusCode() != 409) {
            log.error("Error Creating container", e);
        } else {
            log.info("Container {} already exists, resuming...", containerName);
        }
    }
}
And this is where my constants are:
public interface AzureConf {
    String ACCOUNT_KEY = "<SomeAccountKey>";
    String ACCOUNT_NAME = "storage";
    String CONTAINER_NAME = "My-container";
}
This is my Maven pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>examples</groupId>
    <artifactId>spark-azure-storage</artifactId>
    <version>1.0-SNAPSHOT</version>
    <properties>
        <junit.version>4.12</junit.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-azure</artifactId>
            <version>2.7.1</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>com.microsoft.azure</groupId>
            <artifactId>azure-storage</artifactId>
            <version>2.0.0</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>com.microsoft.azure</groupId>
            <artifactId>azure-storage-blob</artifactId>
            <version>10.1.0</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit.version}</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>io.reactivex.rxjava2</groupId>
            <artifactId>rxjava</artifactId>
            <version>2.2.3</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>1.7.16</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/com.typesafe.akka/akka-actor -->
        <dependency>
            <groupId>com.microsoft.rest.v2</groupId>
            <artifactId>client-runtime</artifactId>
            <version>2.0.0</version>
            <!-- I have to exclude the following dependencies and include version 2.9.7 of them,
                 otherwise I get NoSuchMethodError -->
            <exclusions>
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-annotations</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.fasterxml.jackson.core</groupId>
                    <artifactId>jackson-databind</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.16</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
            <version>2.9.7</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-annotations</artifactId>
            <version>2.9.7</version>
        </dependency>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.9.7</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>exec-maven-plugin</artifactId>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                    <transformers>
                        <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                            <resource>reference.conf</resource>
                        </transformer>
                        <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                        <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"/>
                    </transformers>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Any help to get this to work?
Thank you in advance.
According to the official Oracle document Java Networking and Proxies, proxySet, proxyHost and proxyPort are not among the supported Java system properties.
Please use https.proxyHost and https.proxyPort instead; they work for me.
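For example, the proxy setup in main() above would become something like this (keeping the 9090 port from the question; Fiddler's default listening port is 8888):
// route HTTPS traffic through the Fiddler proxy
System.setProperty("https.proxyHost", "127.0.0.1");
System.setProperty("https.proxyPort", "9090");
// only needed if plain-HTTP calls should go through Fiddler too
System.setProperty("http.proxyHost", "127.0.0.1");
System.setProperty("http.proxyPort", "9090");
And when the same code runs under spark-submit, the driver JVM would presumably need those flags as well, e.g. (an untested sketch):
bin/spark-submit --conf "spark.driver.extraJavaOptions=-Dhttps.proxyHost=127.0.0.1 -Dhttps.proxyPort=9090" ...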

Spring Spark Cassandra - Whitelabel Error Page

I am trying to use Spark and Cassandra through Spring in NetBeans and I get an error:
type=Internal Server Error, status=500
Failed to open native connection to Cassandra at {127.0.0.1}:9042.
Spark and Cassandra were working just fine before I tried to integrate Spring. There is already data in my Cassandra database, which I read and process through Spark. Basically, I want to print the results (a matrix) on a /welcome page through a RestController.
Here is my really simple file structure:
[screenshot of the project structure]
Here is my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mycompany</groupId>
    <artifactId>my-app</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>2.3.2</version>
                <configuration>
                    <debug>true</debug>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <version>2.0.0.RELEASE</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>repackage</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>
    <dependencies>
        <!-- Spring dependencies -->
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.9.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.8.2</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
            <version>2.0.0.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-core</artifactId>
            <version>5.0.4.RELEASE</version>
        </dependency>
        <!-- Spark dependencies -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.2.1</version>
        </dependency>
        <!-- Cassandra dependencies -->
        <!-- Spark Cassandra connector dependencies -->
        <dependency>
            <groupId>com.datastax.spark</groupId>
            <artifactId>spark-cassandra-connector_2.11</artifactId>
            <version>2.0.7</version>
        </dependency>
    </dependencies>
</project>
The Spark context and session initialization:
@Configuration
public class Sparkstart {
    @Bean
    public SparkSession sparksession() {
        SparkSession sp = SparkSession
                .builder()
                .master("local[*]")
                .appName("preprocessing")
                .config("spark.cassandra.connection.host", "127.0.0.1")
                .getOrCreate();
        return sp;
    }
    @Bean
    public JavaSparkContext sc() {
        JavaSparkContext sc = new JavaSparkContext(sparksession().sparkContext());
        return sc;
    }
}
The class where I read the data from the Cassandra database:
@Component
public class Aftersparkstart {
    @Autowired
    private SparkSession sp;
    @Autowired
    private JavaSparkContext sc;
    @Autowired
    private Pearsonclass prs;

    public Matrix start() {
        List<String> desclist = new ArrayList<>();
        desclist.add(/* some data */);
        desclist.add(/* some data */);
        Dataset<Row> peakset = sp.read().format("org.apache.spark.sql.cassandra")
                .options(new HashMap<String, String>() {
                    {
                        put("keyspace", "mdb");
                        put("table", "filepeaks");
                    }
                })
                .load().select(col("directoryname"), col("description"), col("intensity")).filter(col("description").isin(desclist.toArray()));
        Dataset<Row> finalpeaks = peakset.groupBy(peakset.col("description"), peakset.col("directoryname")).avg("intensity").orderBy(asc("directoryname"), asc("description"));
        Matrix r = prs.pearsonmethod(finalpeaks, dirlist, desclist);
        return r;
    }
}
And the class where the processing by Spark takes place:
@Component
public class Pearsonclass {
    public Matrix pearsonmethod(Dataset<Row> peaks, List<String> dirlist, List<String> desclist) {
        // ...stuff...
        return r2;
    }
}
And finally the RestController:
@RestController
public class Firstcontroller {
    @Autowired
    private Aftersparkstart str;
    @RequestMapping("/welcome")
    public Matrix welcome() {
        //return wlc.retrievemsg();
        return str.start();
    }
}
I am pretty sure I am missing something in the dependencies but I don't know what!
Got it! I just upgraded my Cassandra version from 3.11.0 to 3.11.2. The problem was JDK incompatibility with Cassandra: I have 1.8.0_162-8u162, with which the previous Cassandra version didn't get along!

Unable to receive any messages in Kafka 0.10.0 with Spark stream 1.6.2

Recently we moved to HDP 2.5, which has Kafka 0.10.0 and Spark 1.6.2. So I modified my pom and some of the APIs to work with the new Kafka. I can run the code, but I do not see any messages coming in. I have added a code snippet below, and I have also posted my pom. I am not sure what is going wrong here. Can someone please help?
SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("SparkApp");
JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(2));
Map<String, Integer> topicMap = new HashMap<String, Integer>();
topicMap.put(this.topic, this.numThreads);
Map<String, String> kafkaParams = new HashMap<>();
kafkaParams.put("metadata.broker.list", kfkBroker);
kafkaParams.put("zookeeper.connect", zkBroker);
kafkaParams.put("group.id", "default");
kafkaParams.put("fetch.message.max.bytes", "60000000");
JavaPairReceiverInputDStream<String, String> kafkaInStream = KafkaUtils.createStream(
        jssc,
        String.class,
        String.class,
        kafka.serializer.StringDecoder.class,
        kafka.serializer.StringDecoder.class,
        kafkaParams,
        topicMap,
        StorageLevel.MEMORY_AND_DISK());
kafkaInStream.foreachRDD(new VoidFunction<JavaPairRDD<String, String>>()
{
    private static final long serialVersionUID = 1L;

    public void call(JavaPairRDD<String, String> v1) throws Exception
    {
        System.out.println("inside call.. JavaPairRDD size " + v1.count());
        for (Tuple2<String, String> test : v1.collect())
        {
            this.eventMessage.setMessage(test._2);
        }
    }
});
I always get the output "inside call.. JavaPairRDD size 0", which indicates that Spark is not reading any data. I tried pushing some data into the topic through the console producer, but that did not help.
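For reference, this is roughly how I verify the topic outside of Spark (broker host/port and topic name are placeholders; HDP clusters often use port 6667 rather than the usual 9092):
bin/kafka-console-producer.sh --broker-list <broker-host>:6667 --topic <topic>
bin/kafka-console-consumer.sh --bootstrap-server <broker-host>:6667 --topic <topic> --from-beginning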
Here is my pom.xml (only dependencies added)
<dependencies>
    <dependency>
        <groupId>log4j</groupId>
        <artifactId>log4j</artifactId>
        <version>1.2.17</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka-clients</artifactId>
        <version>0.10.1.1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka_2.10 -->
    <dependency>
        <groupId>org.apache.kafka</groupId>
        <artifactId>kafka_2.10</artifactId>
        <version>0.10.1.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.6.2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka_2.10</artifactId>
        <version>1.6.2</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.json</groupId>
        <artifactId>json</artifactId>
        <version>20160810</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.7.3</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>com.101tec</groupId>
        <artifactId>zkclient</artifactId>
        <version>0.8</version>
    </dependency>
</dependencies>
spark-streaming-kafka_2.10 only works with the Kafka 0.8+ client. You can still use the Kafka 0.8+ client to connect to a 0.10+ cluster, but you lose some performance.
I suggest that you just use --packages when submitting your application, to avoid bundling Kafka in your dependencies. E.g.,
bin/spark-submit --packages org.apache.spark:spark-streaming-kafka_2.10:1.6.2 ...

Testing spring controllers and setting up Junit test

Okay, this is probably an ID10T error somewhere, but I just am not seeing it. I have just a shell of the test, but the methods get, status and content are reported as unresolved. I don't know what I am missing, either in the pom or in an import. I am missing something somewhere, just not seeing it.
Here is the unit test shell.
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;
import org.springframework.http.MediaType;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.test.context.web.WebAppConfiguration;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.setup.MockMvcBuilders;
import com.dstbs.prime.service.interfaces.AccountServiceI;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations={"file:src/test/test-context.xml"})
@WebAppConfiguration
public class AccountControllerTest
{
    @Mock
    private AccountServiceI acctSrvc;
    private MockMvc mockMvc;

    @Before
    public void setup() {
        // Process mock annotations
        MockitoAnnotations.initMocks(this);
        // Setup Spring test in standalone mode
        this.mockMvc = MockMvcBuilders.standaloneSetup(new AccountController()).build();
    }

    // BELOW SAYS THAT get(), status() and content() are unresolved.
    @Test
    public void testGetAccount() throws Exception {
        mockMvc.perform(get("/account").accept(MediaType.parseMediaType("application/json")))
               .andExpect(status().isOk())
               .andExpect(content().contentType("application/json"));
    }
}
Here are the Spring and Mockito POM entries I have:
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-orm</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-oxm</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-jdbc</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-core</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context</artifactId>
    <version>${org.springframework-version}</version>
    <exclusions>
        <!-- Exclude Commons Logging in favor of SLF4j -->
        <exclusion>
            <groupId>commons-logging</groupId>
            <artifactId>commons-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-context-support</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-tx</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-web</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-webmvc</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-test</artifactId>
    <version>${org.springframework-version}</version>
</dependency>
<dependency>
    <groupId>org.mockito</groupId>
    <artifactId>mockito-all</artifactId>
    <version>1.9.5</version>
    <scope>test</scope>
</dependency>
You need to include the necessary static imports.
See the Static Imports section of the Spring Reference Manual for details.
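For the test above, the missing static imports are:
import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.content;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;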
