I am trying to save real-time tweets to Cassandra; however, Eclipse shows an error marker on this line:
CassandraJavaUtil.javaFunctions(tweets).writerBuilder("Twitter","tweets", CassandraJavaUtil.mapToRow(tweets.class)).saveToCassandra();
Eclipse does not recognize mapToRow() and javaFunctions() even though I've imported all the packages.
I am using the 1.4.0 Cassandra connector to stay compatible with my Spark version.
I've seen people using this approach,
javaFunctions(, .class).saveToCassandra("java_api", "sales");
but the documentation says it is deprecated in newer versions.
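For reference, here is a minimal sketch of the newer, non-deprecated writerBuilder style from the connector's Java API; the Tweet bean and the keyspace/table names are placeholders, and the static imports are exactly the symbols Eclipse has to resolve:

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

import org.apache.spark.api.java.JavaRDD;

public class TweetSaver {
    // Hypothetical bean; its getters/setters must match the table's columns.
    public static class Tweet {
        private String id;
        private String text;
        public String getId() { return id; }
        public void setId(String id) { this.id = id; }
        public String getText() { return text; }
        public void setText(String text) { this.text = text; }
    }

    public static void save(JavaRDD<Tweet> tweets) {
        // Note: mapToRow takes a class literal (Tweet.class), not the RDD variable.
        javaFunctions(tweets)
                .writerBuilder("twitter", "tweets", mapToRow(Tweet.class))
                .saveToCassandra();
    }
}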
<dependencies>
<dependency>
<groupId>org.twitter4j</groupId>
<artifactId>twitter4j-core</artifactId>
<version>3.0.3</version>
</dependency>
<!--Spark Cassandra Connector -->
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector_2.10</artifactId>
<version>1.4.0-M1</version>
</dependency>
<dependency>
<groupId>com.datastax.spark</groupId>
<artifactId>spark-cassandra-connector-java_2.10</artifactId>
<version>1.4.0-M1</version>
</dependency>
<!--Spark -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.4.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.4.0</version>
</dependency>
<!-- Twitter -->
<dependency>
<groupId>org.twitter4j</groupId>
<artifactId>twitter4j-stream</artifactId>
<version>3.0.3</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-twitter_2.10</artifactId>
<version>1.4.0</version>
</dependency>
The failing method belongs to SparkSession; its name is getOrCreate().
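For context, the call site in SparkSqlJob presumably looks roughly like this sketch (the app name is a placeholder); the getOrCreate() at the end is what triggers the error:

import org.apache.spark.sql.SparkSession;

public class SparkSqlJob {
    public static void main(String[] args) {
        // getOrCreate() is where the NoSuchMethodError below is thrown.
        SparkSession spark = SparkSession.builder()
                .appName("SparkSqlJob") // placeholder
                .getOrCreate();
        spark.stop();
    }
}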
The detailed exception is:
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.JsonMappingException.<init>(Ljava/io/Closeable;Ljava/lang/String;)V
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule$(JacksonModule.scala:46)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:17)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:718)
at org.apache.spark.util.JsonProtocol$.<init>(JsonProtocol.scala:62)
at org.apache.spark.util.JsonProtocol$.<clinit>(JsonProtocol.scala)
at org.apache.spark.scheduler.EventLoggingListener.initEventLog(EventLoggingListener.scala:89)
at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:84)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:610)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2690)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:949)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:943)
at com.hiido.server.service.impl.SparkSqlJob.executing(SparkSqlJob.java:56)
at com.hiido.server.service.impl.SparkSqlJob.main(SparkSqlJob.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:737)
Someone said it happens because of a version conflict, but I disagree, because I have checked my versions: spark-core_2.12 is 3.2.1, Jackson is 2.12.3, and the Spark distribution is 3.2.1-bin-hadoop2.7.
I have no idea what causes this problem. It only happens on the Spark cluster; when I run locally, it's fine.
Thanks.
Additional information:
This is my pom.xml; I am only showing the dependencies, sorry.
<properties>
<java.version>1.8</java.version>
<geospark.version>1.2.0</geospark.version>
<spark.compatible.version>2.3</spark.compatible.version>
<!--<spark.version>2.3.4</spark.version>-->
<spark.version>3.2.1</spark.version>
<hadoop.version>2.7.2</hadoop.version>
<geotools.version>19.0</geotools.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- https://mvnrepository.com/artifact/io.netty/netty-all -->
<dependency>
<groupId>io.netty</groupId>
<artifactId>netty-all</artifactId>
<version>4.1.68.Final</version>
</dependency>
<!-- <dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.4.4</version>
</dependency>-->
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<version>2.12.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.12.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.12.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>commons-compiler</artifactId>
<version>3.0.8</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.12.15</version>
</dependency>
<!-- geospark -->
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>geospark</artifactId>
<version>${geospark.version}</version>
</dependency>
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>geospark-sql_${spark.compatible.version}</artifactId>
<version>${geospark.version}</version>
</dependency>
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>geospark-viz_${spark.compatible.version}</artifactId>
<version>${geospark.version}</version>
</dependency>
<!-- geospark -->
<dependency>
<groupId>org.apache.sedona</groupId>
<artifactId>sedona-core-3.0_2.12</artifactId>
<version>1.1.1-incubating</version>
</dependency>
<dependency>
<groupId>org.apache.sedona</groupId>
<artifactId>sedona-sql-3.0_2.12</artifactId>
<version>1.1.1-incubating</version>
</dependency>
<dependency>
<groupId>org.apache.sedona</groupId>
<artifactId>sedona-viz-3.0_2.12</artifactId>
<version>1.1.1-incubating</version>
</dependency>
<dependency>
<groupId>org.datasyslab</groupId>
<artifactId>sernetcdf</artifactId>
<version>0.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<!-- <version>2.3.4</version>-->
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.12</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
<version>3.2.1</version>
<!--<scope>provided</scope>-->
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.12</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.12</artifactId>
<version>3.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-mapreduce-client-core</artifactId>
<version>${hadoop.version}</version>
<scope>${dependency.scope}</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
<scope>${dependency.scope}</scope>
</dependency>
<dependency>
<groupId>net.sourceforge.javacsv</groupId>
<artifactId>javacsv</artifactId>
<version>2.0</version>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>42.2.5</version>
</dependency>
<dependency>
<groupId>org.codehaus.janino</groupId>
<artifactId>janino</artifactId>
<version>3.0.8</version>
</dependency>
<dependency>
<groupId>org.geotools</groupId>
<artifactId>gt-grid</artifactId>
<version>${geotools.version}</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.locationtech.spatial4j/spatial4j -->
<dependency>
<groupId>org.locationtech.spatial4j</groupId>
<artifactId>spatial4j</artifactId>
<version>0.8</version>
</dependency>
<!-- JTS is essentially only used for polygons. -->
<!-- https://mvnrepository.com/artifact/org.locationtech.jts/jts-core -->
<dependency>
<groupId>org.locationtech.jts</groupId>
<artifactId>jts-core</artifactId>
<version>1.18.1</version>
</dependency>
</dependencies>
This is most likely caused by a wrong Spark and Hadoop packaging strategy in your Maven pom.xml.
Your pom.xml includes many packages that shouldn't be in compile scope, e.g., the Spark and Hadoop dependencies. Spark clusters usually ship all of these libraries already, and if you mistakenly bundle them in your jar, it becomes unpredictable which Jackson version is used at runtime. Please change them to provided scope. What Spark and Sedona developers usually do is use compile scope for local testing and switch to provided scope when deploying to a cluster.
You usually don't need to include Hadoop dependencies at all, since the Spark jars already pull in many Hadoop dependencies; duplicating them leads to many version conflicts in Jackson.
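A common way to implement that compile-versus-provided switch is a property-driven scope; here is a sketch (the profile id and property name are my own, the coordinates come from your pom):

<properties>
    <!-- default for local runs -->
    <spark.scope>compile</spark.scope>
</properties>
<profiles>
    <profile>
        <id>cluster</id>
        <properties>
            <spark.scope>provided</spark.scope>
        </properties>
    </profile>
</profiles>
<!-- then, on every Spark and Hadoop dependency: -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.2.1</version>
    <scope>${spark.scope}</scope>
</dependency>

Building with mvn package -Pcluster then keeps Spark and Hadoop out of the jar you ship to the cluster.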
Your GeoSpark dependency is wrong. Please remove both your old GeoSpark and Sedona dependencies and follow the instructions here: https://sedona.apache.org/setup/maven-coordinates/#use-sedona-fat-jars
Here is a runnable example of a Sedona + Spark project: https://github.com/apache/incubator-sedona/blob/master/examples/sql/build.sbt#L59 . Although it is written in sbt, a POM follows the same logic.
Please pay close attention to the dependency scopes and the exclude sections.
I am getting an AbstractMethodError exception while creating a JavaStreamingContext.
My dependency POM is below; I am unable to find the clue. Can anyone suggest what's going wrong here?
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.3.1</version>
<scope>provided</scope>
</dependency>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.3.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.2.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<version>2.0.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka -->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.8.2.2</version>
</dependency>
Exception in thread "main" java.lang.AbstractMethodError
at org.apache.spark.util.ListenerBus$class.$init$(ListenerBus.scala:35)
at org.apache.spark.streaming.scheduler.StreamingListenerBus.<init>(StreamingListenerBus.scala:30)
at org.apache.spark.streaming.scheduler.JobScheduler.<init>(JobScheduler.scala:57)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:184)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:76)
at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:130)
You're mixing several versions of Spark here.
First of all, if you're using Apache Spark 2.3.1 and Kafka 0.10+, I suggest the following:
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.3.1</version>
<scope>provided</scope>
</dependency>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.3.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.5</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<!-- Keep the same Spark version as before -->
<version>2.3.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<!-- Keep the same Spark version as before -->
<version>2.3.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka -->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<!-- Support for Kafka 0.8 is deprecated as of Spark 2.3.1; you already added the Kafka 0.10+ dependencies above -->
<version><your_kafka_version_0.10+></version>
</dependency>
It would be nice to know how you build and deploy your application; depending on your runtime environment, you may want to add the provided scope to some dependencies to prevent conflicts between the built package and the existing environment.
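For example, if your cluster already ships Spark, the streaming artifacts can be marked the same way as spark-core above (a sketch, applied to the versions suggested here):

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.3.1</version>
    <!-- provided: the cluster's Spark distribution supplies this at runtime -->
    <scope>provided</scope>
</dependency>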
Hope it helps
I use XGBoost with Spark.
Scala version: 2.11.8
Java version: 1.7
Spark version: 2.1.0
Maven dependencies:
<dependencies>
<!-- https://mvnrepository.com/artifact/ml.dmlc/xgboost4j-spark -->
<dependency>
<groupId>ml.dmlc</groupId>
<artifactId>xgboost4j-spark</artifactId>
<version>0.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
I use IDEA; when I write the code there is no error, but when I package my project into a jar, it shows that error.
The reason is a missing xgboost4j dependency (you can check with mvn dependency:tree); just add it in Maven:
<dependency>
<groupId>ml.dmlc</groupId>
<artifactId>xgboost4j</artifactId>
<version>0.8</version>
</dependency>
I am having trouble running a SQL statement that loads data into a partitioned table through HiveContext. I did set dynamic partitioning to true, but I am still having the issue.
SQL:
insert overwrite table target_table PARTITION (column1,column2) select * , deletion_flag ,'2018-12-23' as date_feed from source_table
Hive setConf:
hiveContext.setConf("hive.exec.dynamic.partition","true")
hiveContext.setConf("hive.exec.max.dynamic.partitions","2048")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
Error:
org.apache.hadoop.hive.ql.metadata.Hive.loadDynamicPartitions(org.apache.hadoop.fs.Path, java.lang.String, java.util.Map, boolean, int, boolean, boolean, boolean
Maven dependencies:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.10</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>1.1.0</version>
</dependency>
Thanks
I solved the issue after getting all the Maven dependencies from the Cloudera repo.
<dependencies>
<!-- Scala and Spark dependencies -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.0-cdh5.9.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.6.0-cdh5.9.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.10</artifactId>
<version>1.6.0-cdh5.9.2</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.hive/hive-exec -->
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>1.1.0-cdh5.9.2</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.10</artifactId>
<version>3.0.0-SNAP4</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.10</artifactId>
<version>1.4.1</version>
</dependency>
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.10</artifactId>
<version>1.4.0</version>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-xml_2.10</artifactId>
<version>0.2.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk</artifactId>
<version>1.0.12</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
<version>1.11.172</version>
</dependency>
<dependency>
<groupId>com.github.scopt</groupId>
<artifactId>scopt_2.10</artifactId>
<version>3.2.0</version>
</dependency>
<dependency>
<groupId>javax.mail</groupId>
<artifactId>mail</artifactId>
<version>1.4</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>maven-hadoop</id>
<name>Hadoop Releases</name>
<url>https://repository.cloudera.com/content/repositories/releases/</url>
</repository>
<repository>
<id>cloudera-repos</id>
<name>Cloudera Repos</name>
<url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
</repositories>
I'm trying to get Spring Boot working with PrimeFaces. I followed this example: https://github.com/Zergleb/Spring-Boot-JSF-Example . I tried to move it from Gradle to Maven because other parts of the project use Maven, but I always get
java.lang.NoClassDefFoundError: javax/servlet/jsp/JspFactory
While searching I found the solution to "java.lang.ClassNotFoundException: javax.servlet.jsp.jstl.core.Config", but it didn't work for me.
This is my dependency list from the POM file:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.sun.faces</groupId>
<artifactId>jsf-impl</artifactId>
<version>2.2.8-02</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.sun.faces</groupId>
<artifactId>jsf-api</artifactId>
<version>2.2.8-02</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.primefaces</groupId>
<artifactId>primefaces</artifactId>
<version>5.0</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>javax.el</groupId>
<artifactId>el-api</artifactId>
<version>2.2</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.glassfish.web</groupId>
<artifactId>el-impl</artifactId>
<version>2.2</version>
<scope>compile</scope>
</dependency>
<!-- Spring web -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-web</artifactId>
</dependency>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>9.4-1201-jdbc41</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-rest</artifactId>
</dependency>
<dependency>
<groupId>org.neo4j</groupId>
<artifactId>neo4j-cypher-compiler-2.1</artifactId>
<version>2.1.5</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-web</artifactId>
<version>3.2.8.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-config</artifactId>
<version>3.2.8.RELEASE</version>
</dependency>
</dependencies>
I also tried downloading the needed libraries and putting them into WEB-INF/lib, but that didn't work. I'm using IntelliJ 14.
This combination of dependencies helped me get Spring Boot up and running with JSF.
These libraries provide the embedded Tomcat with the necessary classes to handle the JSF servlet requests and EL expressions correctly.
<dependency>
<groupId>javax.servlet.jsp</groupId>
<artifactId>jsp-api</artifactId>
<version>2.2</version>
</dependency>
<dependency>
<groupId>com.sun.el</groupId>
<artifactId>el-ri</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
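Besides the dependencies, a Spring Boot + JSF setup typically also has to register the FacesServlet itself; here is a minimal sketch of that registration (Boot 1.x package shown; from Boot 1.4 on, ServletRegistrationBean lives in org.springframework.boot.web.servlet):

import javax.faces.webapp.FacesServlet;
import org.springframework.boot.context.embedded.ServletRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JsfConfig {
    // Maps the FacesServlet to *.xhtml so the embedded Tomcat serves Facelets pages.
    @Bean
    public ServletRegistrationBean facesServletRegistration() {
        ServletRegistrationBean registration =
                new ServletRegistrationBean(new FacesServlet(), "*.xhtml");
        registration.setLoadOnStartup(1);
        return registration;
    }
}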
I am able to run it using mvn spring-boot:run at the command line. In IntelliJ IDEA we can run it as a Maven project, putting "spring-boot:run" in the command-line parameter field. It runs perfectly fine with Tomcat 7, Spring Boot 1.2.3, JSF Mojarra 2.2.11, and PrimeFaces 6.1. The following needed to be added to the pom.xml:
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.3</version>
<scope>provided</scope>
</dependency>
<!-- https://mvnrepository.com/artifact/javax.servlet/javax.servlet-api -->
<!--<dependency>
<groupId>javax.servlet</groupId>
<artifactId>javax.servlet-api</artifactId>
<version>3.1.0</version>
<scope>provided</scope>
</dependency>-->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jsp-api</artifactId>
<version>2.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
I had this problem and changed the provided scope (which does not ship the jar to Tomcat) to compile scope (which adds spring-boot-starter-tomcat's jars to WEB-INF/lib) in the pom.xml.
After this change, Tomcat could start my application without any exceptions.
I hope this helps others solve this kind of exception.
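In pom.xml terms, the change described above looks roughly like this (the version is managed by the Spring Boot parent):

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-tomcat</artifactId>
    <!-- compile (the default) bundles the embedded Tomcat jars with the app
         instead of expecting an external container to provide them -->
    <scope>compile</scope>
</dependency>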