Why isn't my PDDL compiling correctly, resulting in a parse error? - planning

I am new to PDDL, so I don't understand very well what's going on, but I have this issue: it seems that the blackbox planner isn't able to parse my PDDL code.
Here is the domain:
(define (domain vacuum)
  (:requirements :strips)
  (:predicates
    (clean ?x)
    (in_room ?x ?r)
    (door_to ?r1 ?r2)
    (dirty ?x)
  )
  (:action moveToAction
    :parameters (?ob ?r1 ?r2)
    :precondition (and
      (in_room ?ob ?r1)
      (door_to ?r1 ?r2)
    )
    :effect (and
      (not (in_room ?r1))
      (in_room ?ob ?r2)
    )
  )
  (:action cleanAction
    :parameters (?r1)
    :precondition (and
      (dirty ?r1)
    )
    :effect (and
      (clean ?r1)
    )
  )
)
and here is the problem:
(define (problem vacuum)
  (:domain vacuum)
  (:objects vacuum kitchen corridor bedroom1 bedroom2 bathroom living_room)
  (:init
    (dirty kitchen)
    (dirty bedroom1)
    (dirty bedroom2)
    (dirty living_room)
    (dirty bathroom)
    (dirty corridor)
    (in_room vacuum living_room)
    (not in_room vacuum kitchen)
    (not in_room vacuum bedroom1)
    (not in_room vacuum bedroom2)
    (not in_room vacuum bathroom)
    (not in_room vacuum corridor)
    (door_to bathroom corridor)
    (door_to kitchen corridor)
    (door_to living_room corridor)
    (door_to bedroom1 corridor)
    (door_to bedroom2 corridor)
    (door_to corridor bathroom)
    (door_to corridor kitchen)
    (door_to corridor living_room)
    (door_to corridor bedroom1)
    (door_to corridor bedroom2)
  )
  (:goal
    (and (clean living_room))
  )
)
The exact error that the solver gives me is:
parse error
Error occurred at or near line 40
If I understood correctly, the parser reads the domain file first and the problem file after that, so the error should be in the problem file, and in particular in the :init section. The precise line should be (dirty corridor).
I do not understand what I did wrong. Did I miss the syntax completely, or did I just mess something up?

You have to remove the negated predicates from the initial state of the problem: under the Closed World Assumption, every fact not listed in :init is assumed to be false, so negative facts must not (and, with plain :strips, cannot) be stated there. In addition, you have a small typo in the domain: in the effect of moveToAction, (not (in_room ?r1)) should be (not (in_room ?ob ?r1)), since in_room takes two variables, not just one!
With those fixes, your domain / problem will look like:
Domain:
(define (domain vacuum)
  (:requirements :strips)
  (:predicates
    (clean ?x)
    (in_room ?x ?r)
    (door_to ?r1 ?r2)
    (dirty ?x)
  )
  (:action moveToAction
    :parameters (?ob ?r1 ?r2)
    :precondition (and
      (in_room ?ob ?r1)
      (door_to ?r1 ?r2)
    )
    :effect (and
      (not (in_room ?ob ?r1))
      (in_room ?ob ?r2)
    )
  )
  (:action cleanAction
    :parameters (?r1)
    :precondition (and
      (dirty ?r1)
    )
    :effect (and
      (clean ?r1)
    )
  )
)
Problem:
(define (problem vacuum)
  (:domain vacuum)
  (:objects vacuum kitchen corridor bedroom1 bedroom2 bathroom living_room)
  (:init
    (dirty kitchen)
    (dirty bedroom1)
    (dirty bedroom2)
    (dirty living_room)
    (dirty bathroom)
    (dirty corridor)
    (in_room vacuum living_room)
    (door_to bathroom corridor)
    (door_to kitchen corridor)
    (door_to living_room corridor)
    (door_to bedroom1 corridor)
    (door_to bedroom2 corridor)
    (door_to corridor bathroom)
    (door_to corridor kitchen)
    (door_to corridor living_room)
    (door_to corridor bedroom1)
    (door_to corridor bedroom2)
  )
  (:goal
    (and (clean living_room))
  )
)
Output Plan:
(:action cleanaction
  :parameters (living_room)
  :precondition (and
    (dirty living_room)
  )
  :effect (and
    (clean living_room)
  )
)
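To see why the negated literals in :init are unnecessary, here is a minimal plain-Python sketch of Closed World Assumption semantics (the state representation and helper are illustrative, not a real planner): the initial state is just a set of positive ground facts, and a negated literal holds exactly when the fact is absent from the set.

```python
# Minimal sketch of the Closed World Assumption (illustrative, not a real planner).
# The initial state is a set of positive ground facts; negation is "absence from the set".

init_state = {
    ("dirty", "kitchen"),
    ("in_room", "vacuum", "living_room"),
    ("door_to", "living_room", "corridor"),
}

def holds(state, fact, negated=False):
    """A positive literal holds iff the fact is in the state;
    a negated literal holds iff the fact is NOT in the state (CWA)."""
    return (fact not in state) if negated else (fact in state)

# (in_room vacuum living_room) is listed in the state, so it is true:
print(holds(init_state, ("in_room", "vacuum", "living_room")))            # True
# (in_room vacuum kitchen) is simply absent, so it is already false --
# no (not ...) literal is needed in :init:
print(holds(init_state, ("in_room", "vacuum", "kitchen"), negated=True))  # True
```

This is why listing only the positive facts, as in the corrected problem file above, is sufficient.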

Related

Error while trying to insert into Hive table with Spark 3.1.2: Can only write data to relations with a single path

The insert statement from the Java code:
INSERT INTO table1 PARTITION (part1,part2) SELECT * FROM TEMP_TABLE_APPEND TEMP_TABLE
The exception stack trace:
org.apache.spark.sql.AnalysisException: Can only write data to relations with a single path.
at org.apache.spark.sql.execution.datasources.DataSourceAnalysis$$anonfun$apply$1.applyOrElse(DataSourceStrategy.scala:202)
at org.apache.spark.sql.execution.datasources.DataSourceAnalysis$$anonfun$apply$1.applyOrElse(DataSourceStrategy.scala:150)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$2(AnalysisHelper.scala:108)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:74)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDown$1(AnalysisHelper.scala:108)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:221)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown(AnalysisHelper.scala:106)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDown$(AnalysisHelper.scala:104)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDown(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:73)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:72)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:29)
at org.apache.spark.sql.execution.datasources.DataSourceAnalysis$.apply(DataSourceStrategy.scala:150)
at org.apache.spark.sql.execution.datasources.DataSourceAnalysis$.apply(DataSourceStrategy.scala:58)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:216)
at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
at scala.collection.immutable.List.foldLeft(List.scala:89)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:213)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:205)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:205)
at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:196)
at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:190)
at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:155)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:183)
at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:183)
at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:174)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:228)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:173)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73)
at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:143)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:143)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63)
at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:98)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
In case anyone else has this problem...
The issue was a parameter in spark-defaults.conf:
spark.sql.hive.manageFilesourcePartitions=false
It was set to false for some reason (the default is true).
Commenting out this line solved the problem.

spark sql ambiguity with reserved keyword and regex

I have set spark.sql.parser.quotedRegexColumnNames=true to use regex column expressions in Spark SQL.
The same query contains the reserved keyword timestamp.
Here is the query:
SELECT cast(delta.day as string) as day,
`timestamp` as `timestamp`,
delta.`(p_[0-9]+)`
FROM test_table delta
Executing the query fails with the following exception:
Caused by: org.apache.spark.sql.AnalysisException: Invalid usage of '*' in expression 'alias';
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.failAnalysis(CheckAnalysis.scala:42)
at org.apache.spark.sql.catalyst.analysis.Analyzer.failAnalysis(Analyzer.scala:95)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$expandStarExpression$1.applyOrElse(Analyzer.scala:1021)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$expandStarExpression$1.applyOrElse(Analyzer.scala:997)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:278)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:278)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:277)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.expandStarExpression(Analyzer.scala:997)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveReferences$$buildExpandedProjectList$1.apply(Analyzer.scala:982)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveReferences$$buildExpandedProjectList$1.apply(Analyzer.scala:977)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.immutable.List.flatMap(List.scala:355)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveReferences$$buildExpandedProjectList(Analyzer.scala:977)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$9.applyOrElse(Analyzer.scala:905)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$9.applyOrElse(Analyzer.scala:900)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$apply$1.apply(AnalysisHelper.scala:90)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$apply$1.apply(AnalysisHelper.scala:90)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:89)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:900)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:758)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:87)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:84)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:84)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:76)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:76)
at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:127)
at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:121)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:106)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:651)
Removing the alias
as `timestamp`
from the SELECT query makes it execute successfully. I believe there is an ambiguity in resolving the difference between a regex and a reserved keyword.
Any help to resolve this is much appreciated.
The easiest way to reproduce the error is to execute the following queries in spark-shell:
scala> spark.sql("SET spark.sql.parser.quotedRegexColumnNames=true").show(false)
+---------------------------------------+-----+
|key |value|
+---------------------------------------+-----+
|spark.sql.parser.quotedRegexColumnNames|true |
+---------------------------------------+-----+
scala> spark.sql("select `timestamp` as `timestamp` from test_table").show();
I got in touch with the Cloudera team and learned that this feature is currently being worked on:
https://issues.apache.org/jira/browse/SPARK-12139?focusedCommentId=17346581&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17346581

Apache FileUtils stopped working in XPages application

For years I have been using Apache Commons IO FileUtils in XPages. In one of my applications, the FileUtils (v2.5) suddenly stopped working a few days ago. There has neither been a Domino update, nor have there been any changes applied to the application's build path. Whenever I try to call a function from FileUtils in server-side JavaScript, exceptions like this are thrown:
Java method 'writeStringToFile(java.io.File, string, string, boolean)' on java class 'org.apache.commons.io.FileUtils' not found
Java method 'moveFile(java.io.File, java.io.File)' on java class 'org.apache.commons.io.FileUtils' not found
...even though the respective methods exist and the code used to work fine.
I have restarted the Domino server multiple times, tried to clean and build the project multiple times, removed and re-added the commons-io binary multiple times from/to the build path, but nothing fixed the problem.
Does anybody have an idea how this could be fixed?
Update: The stacktrace of the exceptions looks like this:
Java method 'writeStringToFile(java.io.File, string, string, boolean)' on java class 'org.apache.commons.io.FileUtils' not found
at com.ibm.jscript.types.JavaAccessObject.call(JavaAccessObject.java:377)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.ASTTree.ASTTry.interpret(ASTTry.java:109)
at com.ibm.jscript.std.FunctionObject._executeFunction(FunctionObject.java:261)
at com.ibm.jscript.std.FunctionObject.executeFunction(FunctionObject.java:185)
at com.ibm.jscript.std.FunctionObject.call(FunctionObject.java:171)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.std.FunctionObject._executeFunction(FunctionObject.java:261)
at com.ibm.jscript.std.FunctionObject.executeFunction(FunctionObject.java:185)
at com.ibm.jscript.std.FunctionObject.call(FunctionObject.java:171)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.ASTTree.ASTAssign.interpret(ASTAssign.java:91)
at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:90)
at com.ibm.jscript.std.FunctionObject._executeFunction(FunctionObject.java:261)
at com.ibm.jscript.std.FunctionObject.executeFunction(FunctionObject.java:185)
at com.ibm.jscript.std.FunctionObject.call(FunctionObject.java:171)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.ASTTree.ASTReturn.interpret(ASTReturn.java:49)
at com.ibm.jscript.std.FunctionObject._executeFunction(FunctionObject.java:261)
at com.ibm.jscript.std.FunctionObject.executeFunction(FunctionObject.java:185)
at com.ibm.jscript.std.FunctionObject.call(FunctionObject.java:171)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.ASTTree.ASTAssign.interpret(ASTAssign.java:91)
at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:79)
at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:85)
at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:85)
at com.ibm.jscript.std.FunctionObject._executeFunction(FunctionObject.java:261)
at com.ibm.jscript.std.FunctionObject.executeFunction(FunctionObject.java:185)
at com.ibm.jscript.std.FunctionObject.call(FunctionObject.java:171)
at com.ibm.jscript.std.FunctionPrototype$FunctionMethod.call(FunctionPrototype.java:169)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.ASTTree.ASTAssign.interpret(ASTAssign.java:91)
at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:85)
at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:85)
at com.ibm.jscript.std.FunctionObject._executeFunction(FunctionObject.java:261)
at com.ibm.jscript.std.FunctionObject.executeFunction(FunctionObject.java:185)
at com.ibm.jscript.std.FunctionObject.call(FunctionObject.java:171)
at com.ibm.jscript.std.FunctionPrototype$FunctionMethod.call(FunctionPrototype.java:169)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.ASTTree.ASTAssign.interpret(ASTAssign.java:91)
at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
at com.ibm.jscript.ASTTree.ASTIf.interpret(ASTIf.java:85)
at com.ibm.jscript.ASTTree.ASTBlock.interpret(ASTBlock.java:100)
at com.ibm.jscript.ASTTree.ASTTry.interpret(ASTTry.java:109)
at com.ibm.jscript.std.FunctionObject._executeFunction(FunctionObject.java:261)
at com.ibm.jscript.std.FunctionObject.executeFunction(FunctionObject.java:185)
at com.ibm.jscript.std.FunctionObject.call(FunctionObject.java:171)
at com.ibm.jscript.types.FBSObject.call(FBSObject.java:161)
at com.ibm.jscript.ASTTree.ASTCall.interpret(ASTCall.java:197)
at com.ibm.jscript.ASTTree.ASTProgram.interpret(ASTProgram.java:119)
at com.ibm.jscript.ASTTree.ASTProgram.interpretEx(ASTProgram.java:139)
at com.ibm.jscript.JSExpression._interpretExpression(JSExpression.java:435)
at com.ibm.jscript.JSExpression.access$1(JSExpression.java:424)
at com.ibm.jscript.JSExpression$2.run(JSExpression.java:414)
at java.security.AccessController.doPrivileged(AccessController.java:686)
at com.ibm.jscript.JSExpression.interpretExpression(JSExpression.java:410)
at com.ibm.jscript.JSExpression.evaluateValue(JSExpression.java:251)
at com.ibm.jscript.JSExpression.evaluateValue(JSExpression.java:234)
at com.ibm.xsp.javascript.JavaScriptInterpreter.interpret(JavaScriptInterpreter.java:222)
at com.ibm.xsp.binding.javascript.JavaScriptMethodBinding.invoke(JavaScriptMethodBinding.java:111)
at com.ibm.xsp.component.UIViewRootEx.invokePhaseMethodBinding(UIViewRootEx.java:1735)
at com.ibm.xsp.controller.FacesControllerImpl.invokePhaseMethodBinding(FacesControllerImpl.java:450)
at com.ibm.xsp.controller.FacesControllerImpl.access$0(FacesControllerImpl.java:444)
at com.ibm.xsp.controller.FacesControllerImpl$ViewPhaseListener.beforePhase(FacesControllerImpl.java:533)
at com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:197)
at com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:120)
at com.ibm.xsp.controller.FacesControllerImpl.render(FacesControllerImpl.java:270)
at com.ibm.xsp.webapp.FacesServlet.serviceView(FacesServlet.java:260)
at com.ibm.xsp.webapp.FacesServletEx.serviceView(FacesServletEx.java:157)
at com.ibm.xsp.webapp.FacesServlet.service(FacesServlet.java:159)
at com.ibm.xsp.webapp.FacesServletEx.service(FacesServletEx.java:138)
at com.ibm.xsp.webapp.DesignerFacesServlet.service(DesignerFacesServlet.java:103)
at com.ibm.designer.runtime.domino.adapter.ComponentModule.invokeServlet(ComponentModule.java:588)
at com.ibm.domino.xsp.module.nsf.NSFComponentModule.invokeServlet(NSFComponentModule.java:1335)
at com.ibm.designer.runtime.domino.adapter.ComponentModule$AdapterInvoker.invokeServlet(ComponentModule.java:865)
at com.ibm.designer.runtime.domino.adapter.ComponentModule$ServletInvoker.doService(ComponentModule.java:808)
at com.ibm.designer.runtime.domino.adapter.ComponentModule.doService(ComponentModule.java:577)
at com.ibm.domino.xsp.module.nsf.NSFComponentModule.doService(NSFComponentModule.java:1319)
at com.ibm.domino.xsp.module.nsf.NSFService.doServiceInternal(NSFService.java:662)
at com.ibm.domino.xsp.module.nsf.NSFService.doService(NSFService.java:482)
at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.doService(LCDEnvironment.java:357)
at com.ibm.designer.runtime.domino.adapter.LCDEnvironment.service(LCDEnvironment.java:313)
at com.ibm.domino.xsp.bridge.http.engine.XspCmdManager.service(XspCmdManager.java:272)

Crash in Spark-SQL for windowed function

Using Spark 2.3.2 and Spark-SQL, the following query 'b' fails:
import spark.implicits._
val dataset = Seq((30, 2.0), (20, 3.0), (19, 20.0)).toDF("age", "size")
import functions._
val a0 = dataset.withColumn("rank", rank() over Window.partitionBy('age).orderBy('size))
val a1 = a0.agg(avg('rank))
//a1.show()
//OK
//same thing but in one expression, crashes:
val b = dataset.agg(functions.avg(functions.rank().over(Window.partitionBy('age).orderBy('size))))
AFAIK this is pretty weird, but it is a legitimate SQL query:
I'm defining a column that is the result of a windowing function,
then taking the average.
Doing it via an intermediate column works, but doing it in a single expression makes Catalyst crash with a stack overflow:
Exception in thread "main" java.lang.StackOverflowError
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1.apply(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1.apply(TreeNode.scala:109)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.catalyst.trees.TreeNode.find(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1$$anonfun$apply$1.apply(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1$$anonfun$apply$1.apply(TreeNode.scala:109)
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1.apply(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1.apply(TreeNode.scala:109)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.catalyst.trees.TreeNode.find(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1$$anonfun$apply$1.apply(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1$$anonfun$apply$1.apply(TreeNode.scala:109)
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1.apply(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$find$1.apply(TreeNode.scala:109)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.catalyst.trees.TreeNode.find(TreeNode.scala:109)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractWindowExpressions$.org$apache$spark$sql$catalyst$analysis$Analyzer$ExtractWindowExpressions$$hasWindowFunction(Analyzer.scala:1757)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractWindowExpressions$$anonfun$71.apply(Analyzer.scala:1781)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractWindowExpressions$$anonfun$71.apply(Analyzer.scala:1781)
at scala.collection.TraversableLike$$anonfun$partition$1.apply(TraversableLike.scala:314)
at scala.collection.TraversableLike$$anonfun$partition$1.apply(TraversableLike.scala:314)
at scala.collection.immutable.List.foreach(List.scala:392)
at scala.collection.TraversableLike$class.partition(TraversableLike.scala:314)
at scala.collection.AbstractTraversable.partition(Traversable.scala:104)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractWindowExpressions$.org$apache$spark$sql$catalyst$analysis$Analyzer$ExtractWindowExpressions$$extract(Analyzer.scala:1781)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractWindowExpressions$$anonfun$apply$28.applyOrElse(Analyzer.scala:1950)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractWindowExpressions$$anonfun$apply$28.applyOrElse(Analyzer.scala:1925)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:266)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:272)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformDown$1.apply(TreeNode.scala:272)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:272) [...]
Is this a known issue?
I'm not 100% sure my query is correct, but it should at least not crash Catalyst, as it crashes even before I'm able to evaluate my query.
avg() takes a column name as its argument, but here rank() passes the actual column data, so the column name cannot be found. It works the same as:
dataset
  .select(rank().over(Window.partitionBy("age").orderBy("size")).as("rank_XX"))
  .agg(avg("rank_XX"))
  .show()
Output:
+------------+
|avg(rank_XX)|
+------------+
| 1.0|
+------------+
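For intuition, here is a small plain-Python sketch (no Spark) of what the two-step version computes on the example data: each age value forms its own partition containing a single row, so every rank is 1 and the average is 1.0, matching the output above.

```python
from collections import defaultdict

rows = [(30, 2.0), (20, 3.0), (19, 20.0)]  # (age, size), as in the example

# Partition by age, order each partition by size, assign a SQL-style rank().
partitions = defaultdict(list)
for age, size in rows:
    partitions[age].append(size)

ranks = []
for sizes in partitions.values():
    for i, _ in enumerate(sorted(sizes)):
        ranks.append(i + 1)  # rank within the partition (no ties in this data)

avg_rank = sum(ranks) / len(ranks)
print(avg_rank)  # 1.0 -- every partition has exactly one row
```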

java.util.zip.ZipException: invalid distance too far back

I am new to Spring Boot. I have developed an app which uses an external jar file containing an Apache Spark application. My app runs without any error on my local computer. However, when creating a fat jar file and trying to run it on a server, I have faced many issues.
This external jar file is quite big (160K KB). From my searches and Stack Overflow, I have figured out that there are some binding files (e.g. logging, SAXParser, security, etc.) in both Spring Boot and Spark, resulting in conflicts that prevent running the jar file in a different environment.
To remedy this, I manually opened the external jar file with 7-Zip and deleted the binding files, such as javax/security and org/slf4j/impl/StaticLoggerBinder.class. As a result, I can now start my code on another server.
First of all, does this approach make sense? If not, what should be the way?
In my code, when I run the part related to the external jar file, I get the following error. According to similar posts on Stack Overflow, the jar files may be corrupted. However, I did not figure out how to remedy this issue.
Any suggestion will be highly appreciated.
2017-04-14 18:22:10.222 ERROR 17324 --- [nio-8080-exec-7] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Handler dispatch failed; nested exception is java.lang.ExceptionInInitializerError] with root cause
java.util.zip.ZipException: invalid distance too far back
at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:164) ~[na:1.8.0_121]
at org.springframework.boot.loader.jar.ZipInflaterInputStream.read(ZipInflaterInputStream.java:52) ~[mlee.jar:0.0.1-SNAPSHOT]
at java.io.FilterInputStream.read(FilterInputStream.java:107) ~[na:1.8.0_121]
at java.util.Properties$LineReader.readLine(Properties.java:435) ~[na:1.8.0_121]
at java.util.Properties.load0(Properties.java:353) ~[na:1.8.0_121]
at java.util.Properties.load(Properties.java:341) ~[na:1.8.0_121]
at scala.util.PropertiesTrait$$anonfun$scalaProps$1.apply$mcV$sp(Properties.scala:37) ~[cosinelsh-1.0.jar:na]
at scala.util.PropertiesTrait$class.quietlyDispose(Properties.scala:43) ~[cosinelsh-1.0.jar:na]
at scala.util.PropertiesTrait$class.scalaProps(Properties.scala:37) ~[cosinelsh-1.0.jar:na]
at scala.util.Properties$.scalaProps$lzycompute(Properties.scala:16) ~[cosinelsh-1.0.jar:na]
at scala.util.Properties$.scalaProps(Properties.scala:16) ~[cosinelsh-1.0.jar:na]
at scala.util.PropertiesTrait$class.scalaPropOrNone(Properties.scala:65) ~[cosinelsh-1.0.jar:na]
at scala.util.Properties$.scalaPropOrNone(Properties.scala:16) ~[cosinelsh-1.0.jar:na]
at scala.util.PropertiesTrait$class.$init$(Properties.scala:77) ~[cosinelsh-1.0.jar:na]
at scala.util.Properties$.<init>(Properties.scala:16) ~[cosinelsh-1.0.jar:na]
at scala.util.Properties$.<clinit>(Properties.scala) ~[cosinelsh-1.0.jar:na]
at scala.compat.Platform$.<init>(Platform.scala:112) ~[cosinelsh-1.0.jar:na]
at scala.compat.Platform$.<clinit>(Platform.scala) ~[cosinelsh-1.0.jar:na]
at scala.Array$.copy(Array.scala:105) ~[cosinelsh-1.0.jar:na]
at scala.collection.immutable.HashMap$HashTrieMap.updated0(HashMap.scala:335) ~[cosinelsh-1.0.jar:na]
at scala.collection.immutable.HashMap.$plus(HashMap.scala:57) ~[cosinelsh-1.0.jar:na]
at scala.collection.immutable.HashMap.$plus(HashMap.scala:36) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.MapBuilder.$plus$eq(MapBuilder.scala:28) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.MapBuilder.$plus$eq(MapBuilder.scala:24) ~[cosinelsh-1.0.jar:na]
at scala.collection.generic.Growable$$anonfun$$plus$plus$eq$1.apply(Growable.scala:48) ~[cosinelsh-1.0.jar:na]
at scala.collection.generic.Growable$$anonfun$$plus$plus$eq$1.apply(Growable.scala:48) ~[cosinelsh-1.0.jar:na]
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34) ~[cosinelsh-1.0.jar:na]
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.MapBuilder.$plus$plus$eq(MapBuilder.scala:24) ~[cosinelsh-1.0.jar:na]
at scala.collection.TraversableLike$class.$plus$plus(TraversableLike.scala:157) ~[cosinelsh-1.0.jar:na]
at scala.collection.AbstractTraversable.$plus$plus(Traversable.scala:105) ~[cosinelsh-1.0.jar:na]
at scala.collection.immutable.HashMap.$plus(HashMap.scala:60) ~[cosinelsh-1.0.jar:na]
at scala.collection.immutable.Map$Map4.updated(Map.scala:172) ~[cosinelsh-1.0.jar:na]
at scala.collection.immutable.Map$Map4.$plus(Map.scala:173) ~[cosinelsh-1.0.jar:na]
at scala.collection.immutable.Map$Map4.$plus(Map.scala:158) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.MapBuilder.$plus$eq(MapBuilder.scala:28) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.MapBuilder.$plus$eq(MapBuilder.scala:24) ~[cosinelsh-1.0.jar:na]
at scala.collection.generic.Growable$$anonfun$$plus$plus$eq$1.apply(Growable.scala:48) ~[cosinelsh-1.0.jar:na]
at scala.collection.generic.Growable$$anonfun$$plus$plus$eq$1.apply(Growable.scala:48) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) ~[cosinelsh-1.0.jar:na]
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48) ~[cosinelsh-1.0.jar:na]
at scala.collection.mutable.MapBuilder.$plus$plus$eq(MapBuilder.scala:24) ~[cosinelsh-1.0.jar:na]
at scala.collection.generic.GenMapFactory.apply(GenMapFactory.scala:47) ~[cosinelsh-1.0.jar:na]
at scala.sys.package$.env(package.scala:61) ~[cosinelsh-1.0.jar:na]
at org.apache.spark.util.Utils$.<init>(Utils.scala:855) ~[cosinelsh-1.0.jar:na]
at org.apache.spark.util.Utils$.<clinit>(Utils.scala) ~[cosinelsh-1.0.jar:na]
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58) ~[cosinelsh-1.0.jar:na]
at org.apache.spark.SparkConf.<init>(SparkConf.scala:52) ~[cosinelsh-1.0.jar:na]
at com.soundcloud.lsh.Main5$.SpringLsh(Main5.scala:29) ~[cosinelsh-1.0.jar:na]
at com.soundcloud.lsh.Main5.SpringLsh(Main5.scala) ~[cosinelsh-1.0.jar:na]
at softuniBlog.controller.ReferencesController.refFormProcess(References
Controller.java:88) ~[classes!/:0.0.1-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.
0_121]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.
java:62) ~[na:1.8.0_121]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAcces
sorImpl.java:43) ~[na:1.8.0_121]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_121]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvok
e(InvocableHandlerMethod.java:205) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEAS
E]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeF
orRequest(InvocableHandlerMethod.java:133) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.
6.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocabl
eHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:116) ~[spring-
webmvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingH
andlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827) ~[sprin
g-webmvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingH
andlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738) ~[spring-web
mvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapt
er.handle(AbstractHandlerMethodAdapter.java:85) ~[spring-webmvc-4.3.6.RELEASE.ja
r!/:4.3.6.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(Dispatch
erServlet.java:963) ~[spring-webmvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.servlet.DispatcherServlet.doService(Dispatche
rServlet.java:897) ~[spring-webmvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.processRequest(Frame
workServlet.java:970) ~[spring-webmvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServ
let.java:872) ~[spring-webmvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:648) ~[tomcat
-embed-core-8.5.11.jar!/:8.5.11]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkSer
vlet.java:846) ~[spring-webmvc-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729) ~[tomcat
-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(Appl
icationFilterChain.java:230) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationF
ilterChain.java:165) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52
) ~[tomcat-embed-websocket-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(Appl
icationFilterChain.java:192) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationF
ilterChain.java:165) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:317) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.access.intercept.FilterSecurityInter
ceptor.invoke(FilterSecurityInterceptor.java:127) ~[spring-security-web-4.2.1.RE
LEASE.jar!/:4.2.1.RELEASE]
at org.springframework.security.web.access.intercept.FilterSecurityInter
ceptor.doFilter(FilterSecurityInterceptor.java:91) ~[spring-security-web-4.2.1.R
ELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.access.ExceptionTranslationFilter.do
Filter(ExceptionTranslationFilter.java:114) ~[spring-security-web-4.2.1.RELEASE.
jar!/:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.session.SessionManagementFilter.doFi
lter(SessionManagementFilter.java:137) ~[spring-security-web-4.2.1.RELEASE.jar!/
:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.authentication.AnonymousAuthenticati
onFilter.doFilter(AnonymousAuthenticationFilter.java:111) ~[spring-security-web-
4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.servletapi.SecurityContextHolderAwar
eRequestFilter.doFilter(SecurityContextHolderAwareRequestFilter.java:170) ~[spri
ng-security-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.savedrequest.RequestCacheAwareFilter
.doFilter(RequestCacheAwareFilter.java:63) ~[spring-security-web-4.2.1.RELEASE.j
ar!/:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.authentication.AbstractAuthenticatio
nProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200) ~[sp
ring-security-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.authentication.logout.LogoutFilter.d
oFilter(LogoutFilter.java:116) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.2.1.R
ELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.csrf.CsrfFilter.doFilterInternal(Csr
fFilter.java:124) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerR
equestFilter.java:107) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.header.HeaderWriterFilter.doFilterIn
ternal(HeaderWriterFilter.java:64) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.2
.1.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerR
equestFilter.java:107) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.context.SecurityContextPersistenceFi
lter.doFilter(SecurityContextPersistenceFilter.java:105) ~[spring-security-web-4
.2.1.RELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.context.request.async.WebAsyncManage
rIntegrationFilter.doFilterInternal(WebAsyncManagerIntegrationFilter.java:56) ~[
spring-security-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerR
equestFilter.java:107) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.
doFilter(FilterChainProxy.java:331) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.
2.1.RELEASE]
at org.springframework.security.web.FilterChainProxy.doFilterInternal(Fi
lterChainProxy.java:214) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE
]
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChai
nProxy.java:177) ~[spring-security-web-4.2.1.RELEASE.jar!/:4.2.1.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(D
elegatingFilterProxy.java:346) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(Delegat
ingFilterProxy.java:262) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(Appl
icationFilterChain.java:192) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationF
ilterChain.java:165) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(
RequestContextFilter.java:99) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerR
equestFilter.java:107) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(Appl
icationFilterChain.java:192) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationF
ilterChain.java:165) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInter
nal(HttpPutFormContentFilter.java:105) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RE
LEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerR
equestFilter.java:107) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(Appl
icationFilterChain.java:192) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationF
ilterChain.java:165) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInterna
l(HiddenHttpMethodFilter.java:81) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE
]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerR
equestFilter.java:107) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(Appl
icationFilterChain.java:192) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationF
ilterChain.java:165) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterIntern
al(CharacterEncodingFilter.java:197) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELE
ASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerR
equestFilter.java:107) ~[spring-web-4.3.6.RELEASE.jar!/:4.3.6.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(Appl
icationFilterChain.java:192) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationF
ilterChain.java:165) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperV
alve.java:198) ~[tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextV
alve.java:96) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(Authentica
torBase.java:474) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.j
ava:140) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.j
ava:79) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineVal
ve.java:87) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.jav
a:349) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java
:783) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLig
ht.java:66) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(Abstract
Protocol.java:798) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpo
int.java:1434) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBas
e.java:49) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.
java:1142) [na:1.8.0_121]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor
.java:617) [na:1.8.0_121]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskTh
read.java:61) [tomcat-embed-core-8.5.11.jar!/:8.5.11]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_121]
Thanks in advance!
Some recommendations:
If you have duplicate versions of a jar, or any other jar conflict, fix it by removing the extra jar so that only one version remains on the classpath.
Verify that you have the exact same Java version on both your local and server systems.
You may have a corrupted jar. Rebuild it after resolving the conflict.
Ensure that both environments are identical and try again.
Resource Link: https://stackoverflow.com/a/16596749/2293534
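To check the first recommendation concretely, you can scan the jars on the classpath for class files that appear in more than one jar, since duplicate classes are a typical cause of linkage errors like the one in the trace above. Here is a minimal sketch using only Python's standard-library `zipfile` (the function name and jar paths are illustrative, not from any specific tool):

```python
# Sketch: report .class entries that occur in more than one jar.
# Jars are just zip archives, so zipfile can list their entries.
import zipfile
from collections import defaultdict

def find_duplicate_classes(jar_paths):
    """Return {class entry: [jars containing it]} for entries seen in 2+ jars."""
    seen = defaultdict(list)
    for jar in jar_paths:
        with zipfile.ZipFile(jar) as zf:
            for name in zf.namelist():
                if name.endswith(".class"):
                    seen[name].append(jar)
    # Keep only the classes that are provided by more than one jar.
    return {cls: jars for cls, jars in seen.items() if len(jars) > 1}
```

Run it over the jars in your deployment's lib directory; any non-empty result points at the conflicting jar to remove or exclude from the build.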