I am sending a valid CMIS request to Documentum 7.1 and getting a Bad Request response:
GET /emc-cmis/resources/repositories/myrepo/path?p=%2FResources&filter=cmis%3AobjectId%2Ccmis%3Aname%2Ccmis%3AcontentStreamFileName%2Ccmis%3AcontentStreamLength%2Ccmis%3AlastModificationDate%2Ccmis%3AlastModifiedBy%2Ccmis%3Apath%2Ccmis%3AbaseTypeId%2Ccmis%3AobjectTypeId&includeAllowableActions=true&includePolicyIds=false&includeRelationships=none&includeACL=false&renditionFilter= HTTP/1.1
User-Agent: Apache Chemistry DotCMIS
Authorization: Basic dDE6dDE=
Host: documentum:8080
HTTP/1.1 400 Bad Request
Server: Apache-Coyote/1.1
Content-Type: text/plain;charset=UTF-8
Transfer-Encoding: chunked
Date: Fri, 21 Nov 2014 06:45:01 GMT
Connection: close
[CMIS AtomPub binding]
STATUS CODE:
400
EXCEPTION:
filterNotValid
ERROR:
Invalid property definition ID was supplied : cmis:contentStreamFileName
STACK TRACE:
org.cmis.ws.CmisException: Invalid property definition ID was supplied : cmis:contentStreamFileName
at com.emc.documentum.fs.cmis.impl.dfs.utils.CmisExceptionFactory.create(CmisExceptionFactory.java:24)
at com.emc.documentum.fs.cmis.impl.dfs.utils.CmisExceptionBuilder.build(CmisExceptionBuilder.java:31)
at com.emc.documentum.fs.cmis.impl.dfs.converter.object.DataObjectConverter.convertProperty(DataObjectConverter.java:241)
at com.emc.documentum.fs.cmis.impl.dfs.converter.object.DataObjectConverter.convertProperties(DataObjectConverter.java:212)
at com.emc.documentum.fs.cmis.impl.dfs.converter.object.DataObjectConverter.getCmisPropertiesByFilter(DataObjectConverter.java:202)
at com.emc.documentum.fs.cmis.impl.dfs.converter.object.DataObjectConverter.toCmisObject(DataObjectConverter.java:124)
at com.emc.documentum.fs.cmis.impl.dfs.action.GetObjectByPathAction.execute(GetObjectByPathAction.java:73)
at com.emc.documentum.fs.cmis.impl.dfs.action.GetObjectByPathAction.execute(GetObjectByPathAction.java:45)
at com.emc.documentum.fs.cmis.impl.filter.LinkedActionFilterChain.run(LinkedActionFilterChain.java:23)
at com.emc.documentum.fs.cmis.impl.filter.RequestValidationFilter.doFilter(RequestValidationFilter.java:16)
at com.emc.documentum.fs.cmis.impl.filter.LinkedActionFilterChain.run(LinkedActionFilterChain.java:21)
at com.emc.documentum.fs.cmis.rs.impl.resource.ObjectPathResource.getProperties(ObjectPathResource.java:66)
at sun.reflect.GeneratedMethodAccessor78.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$ResponseOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:205)
at com.sun.jersey.server.impl.model.method.dispatch.ResourceJavaMethodDispatcher.dispatch(ResourceJavaMethodDispatcher.java:75)
at com.sun.jersey.server.impl.uri.rules.HttpMethodRule.accept(HttpMethodRule.java:288)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:134)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.SubLocatorRule.accept(SubLocatorRule.java:134)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.ResourceClassRule.accept(ResourceClassRule.java:108)
at com.sun.jersey.server.impl.uri.rules.RightHandPathRule.accept(RightHandPathRule.java:147)
at com.sun.jersey.server.impl.uri.rules.RootResourceClassesRule.accept(RootResourceClassesRule.java:84)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1483)
at com.sun.jersey.server.impl.application.WebApplicationImpl._handleRequest(WebApplicationImpl.java:1414)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1363)
at com.sun.jersey.server.impl.application.WebApplicationImpl.handleRequest(WebApplicationImpl.java:1353)
at com.sun.jersey.spi.container.servlet.WebComponent.service(WebComponent.java:414)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:537)
at com.sun.jersey.spi.container.servlet.ServletContainer.service(ServletContainer.java:708)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at com.emc.documentum.fs.cmis.rs.impl.web.SetCharacterEncodingFilter.doFilter(SetCharacterEncodingFilter.java:104)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:501)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:98)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:950)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1040)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:607)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:315)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
I am 100% sure my request is valid according to the CMIS specification; it is actually generated by Apache Chemistry DotCMIS.
Documentum answers GetRepository requests correctly.
What is the problem?
Here is the filter request parameter, decoded:
cmis:objectId
cmis:name
cmis:contentStreamFileName (the one which seems to be problematic)
cmis:contentStreamLength
cmis:lastModificationDate
cmis:lastModifiedBy
cmis:path
cmis:baseTypeId
cmis:objectTypeId
If I remove the contentStreamFileName filter, Documentum complains about contentStreamLength. If I remove this filter too, I finally get a correct response:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<atom:entry xmlns:atom="http://www.w3.org/2005/Atom" xmlns:app="http://www.w3.org/2007/app" xmlns:cmis="http://docs.oasis-open.org/ns/cmis/core/200908/" xmlns:cmisra="http://docs.oasis-open.org/ns/cmis/restatom/200908/" xmlns:cmism="http://docs.oasis-open.org/ns/cmis/messaging/200908/" xmlns:ns6="http://wadl.dev.java.net/2009/02" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<atom:id>http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130</atom:id>
<atom:title type="text">Resources</atom:title>
<atom:author>
<atom:name>myauthor</atom:name>
</atom:author>
<atom:summary type="text">dm_cabinet object</atom:summary>
<atom:published>2014-10-28T01:19:52.000+09:00</atom:published>
<atom:updated>2014-10-27T17:28:14.000+09:00</atom:updated>
<atom:link type="application/atomsvc+xml" rel="service" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo"/>
<atom:link type="application/atom+xml;type=entry" rel="self" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130" cmisra:id="0c00000180000130"/>
<atom:link type="application/atom+xml;type=entry" rel="edit" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130" cmisra:id="0c00000180000130"/>
<atom:link type="application/cmisallowableactions+xml" rel="http://docs.oasis-open.org/ns/cmis/link/200908/allowableactions" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130/actions"/>
<atom:link type="application/cmisacl+xml" rel="http://docs.oasis-open.org/ns/cmis/link/200908/acl" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130/acl"/>
<atom:link type="application/atom+xml;type=entry" rel="describedby" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/types/dm_cabinet" cmisra:id="dm_cabinet"/>
<atom:content>0c00000180000130</atom:content>
<atom:link type="application/atom+xml;type=entry" rel="up" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130/parents"/>
<atom:link type="application/atom+xml;type=feed" rel="down" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130/children"/>
<atom:link type="application/cmistree+xml" rel="down" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130/descendants"/>
<atom:link type="application/atom+xml;type=feed" rel="http://docs.oasis-open.org/ns/cmis/link/200908/foldertree" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130/tree"/>
<atom:link type="application/atom+xml;type=feed" rel="http://docs.oasis-open.org/ns/cmis/link/200908/relationships" href="http://192.168.209.100:8080/emc-cmis/resources/repositories/myrepo/objects/0c00000180000130/relationships"/>
<app:edited>2014-10-27T17:28:14.000+09:00</app:edited>
<cmisra:object>
<cmis:properties>
<cmis:propertyId queryName="cmis:objectId" displayName="Object Id" localName="r_object_id" propertyDefinitionId="cmis:objectId">
<cmis:value>0c00000180000130</cmis:value>
</cmis:propertyId>
<cmis:propertyString queryName="cmis:name" displayName="Name" localName="object_name" propertyDefinitionId="cmis:name">
<cmis:value>Resources</cmis:value>
</cmis:propertyString>
<cmis:propertyDateTime queryName="cmis:lastModificationDate" displayName="Last Modification Date" localName="r_modify_date" propertyDefinitionId="cmis:lastModificationDate">
<cmis:value>2014-10-27T17:28:14.000+09:00</cmis:value>
</cmis:propertyDateTime>
<cmis:propertyString queryName="cmis:lastModifiedBy" displayName="Last Modified By" localName="r_modifier" propertyDefinitionId="cmis:lastModifiedBy">
<cmis:value>dmadmin</cmis:value>
</cmis:propertyString>
<cmis:propertyString queryName="cmis:path" displayName="Folder Path" localName="r_folder_path" propertyDefinitionId="cmis:path">
<cmis:value>/Resources</cmis:value>
</cmis:propertyString>
<cmis:propertyId queryName="cmis:baseTypeId" displayName="Base Type Id" localName="" propertyDefinitionId="cmis:baseTypeId">
<cmis:value>cmis:folder</cmis:value>
</cmis:propertyId>
<cmis:propertyId queryName="cmis:objectTypeId" displayName="Object Type ID" localName="r_object_type" propertyDefinitionId="cmis:objectTypeId">
<cmis:value>dm_cabinet</cmis:value>
</cmis:propertyId>
</cmis:properties>
<cmis:allowableActions>
<cmis:canDeleteObject>false</cmis:canDeleteObject>
<cmis:canUpdateProperties>true</cmis:canUpdateProperties>
<cmis:canGetFolderTree>true</cmis:canGetFolderTree>
<cmis:canGetProperties>true</cmis:canGetProperties>
<cmis:canGetObjectRelationships>true</cmis:canGetObjectRelationships>
<cmis:canGetObjectParents>true</cmis:canGetObjectParents>
<cmis:canGetFolderParent>true</cmis:canGetFolderParent>
<cmis:canGetDescendants>true</cmis:canGetDescendants>
<cmis:canMoveObject>true</cmis:canMoveObject>
<cmis:canApplyPolicy>false</cmis:canApplyPolicy>
<cmis:canGetAppliedPolicies>false</cmis:canGetAppliedPolicies>
<cmis:canRemovePolicy>false</cmis:canRemovePolicy>
<cmis:canGetChildren>true</cmis:canGetChildren>
<cmis:canCreateDocument>true</cmis:canCreateDocument>
<cmis:canCreateFolder>true</cmis:canCreateFolder>
<cmis:canCreateRelationship>true</cmis:canCreateRelationship>
<cmis:canDeleteTree>false</cmis:canDeleteTree>
<cmis:canGetACL>false</cmis:canGetACL>
<cmis:canApplyACL>false</cmis:canApplyACL>
</cmis:allowableActions>
</cmisra:object>
</atom:entry>
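For reference, the request that finally succeeds (the filter without the two content-stream properties) can be reproduced outside DotCMIS with plain HTTP. Here is a minimal sketch using Python's requests library; the host, repository name and credentials are the ones from the trace above and would need to be adapted:
import requests

# getObjectByPath against the AtomPub binding, with the reduced property filter
# that Documentum accepts (cmis:contentStreamFileName and cmis:contentStreamLength removed).
base = "http://documentum:8080/emc-cmis/resources/repositories/myrepo"
params = {
    "p": "/Resources",
    "filter": ",".join([
        "cmis:objectId", "cmis:name",
        "cmis:lastModificationDate", "cmis:lastModifiedBy",
        "cmis:path", "cmis:baseTypeId", "cmis:objectTypeId",
    ]),
    "includeAllowableActions": "true",
    "includeRelationships": "none",
}
response = requests.get(base + "/path", params=params, auth=("t1", "t1"))
print(response.status_code)   # 200 expected with the reduced filter
print(response.text[:300])    # beginning of the Atom entry shown above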
What I did (Structured Streaming)
./bin/pyspark
spark
static = spark.read.json("/data/activity-data/")
dataSchema = static.schema
streaming = spark.readStream.schema(dataSchema).option("maxFilesPerTrigger", 1)\
    .json("/data/activity-data")
activityCounts = streaming.groupBy("gt").count()
Then I got this huge error.
Could you help me troubleshoot it?
Error:
org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$3.applyOrElse(CheckAnalysis.scala:110)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$3.applyOrElse(CheckAnalysis.scala:107)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:278)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:278)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:277)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:93)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:93)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1.apply(QueryPlan.scala:105)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1.apply(QueryPlan.scala:105)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:104)
at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:116)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1$2.apply(QueryPlan.scala:121)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:121)
at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$2.apply(QueryPlan.scala:126)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:126)
at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUp(QueryPlan.scala:93)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:107)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:85)
at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:95)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:108)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
at org.apache.spark.sql.RelationalGroupedDataset.toDF(RelationalGroupedDataset.scala:65)
at org.apache.spark.sql.RelationalGroupedDataset.count(RelationalGroupedDataset.scala:237)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
If you have duplicate keys, you will get this error.
You can also refer to https://issues.apache.org/jira/browse/SPARK-10925
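If the duplicate-key explanation above refers to duplicate column names (as in SPARK-10925), a quick check you can run on the static DataFrame before building the streaming aggregation is sketched below; it only assumes the static DataFrame from the steps above:
from collections import Counter

# Any column name that appears more than once in the inferred schema
# is a candidate for the analysis failure described above.
duplicates = [name for name, count in Counter(static.columns).items() if count > 1]
print(duplicates)   # an empty list means no duplicate column names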
I wanted to convert a large .csv file into .parquet format using pyspark.
I am using Python 3. I tried changing the compression codec, as suggested in a similar thread, but I still get the same error.
This is the code I am using to read the file and save it into parquet format.
spark.conf.set("spark.sql.parquet.compression.codec", "gzip")
df.write.parquet("adobe20180615.parquet")
I get the following error:
Py4JJavaError: An error occurred while calling o1071.parquet.
: org.apache.spark.SparkException: Job aborted.
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:196)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:159)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:668)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:668)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:276)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:270)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:228)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:557)
at sun.reflect.GeneratedMethodAccessor86.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 41.0 failed 1 times, most recent failure: Lost task 0.0 in stage 41.0 (TID 146, localhost, executor driver): java.io.IOException: (null) entry in command string: null chmod 0644 C:\Users\b35884\Documents\Python Scripts\Spark\adobe.parquet\_temporary\0\_temporary\attempt_20190411174312_0041_m_000000_0\part-00000-e2381e5d-0a9d-407e-8bcb-52d589f7569a-c000.gz.parquet
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:398)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:461)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.parquet.hadoop.util.HadoopOutputFile.create(HadoopOutputFile.java:74)
at org.apache.parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:248)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:390)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:349)
at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.<init>(ParquetOutputWriter.scala:37)
at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anon$1.newInstance(ParquetFileFormat.scala:151)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:120)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:108)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:233)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:168)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1887)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1875)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1874)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:166)
... 32 more
Caused by: java.io.IOException: (null) entry in command string: null chmod 0644 C:\Users\b35884\Documents\Python Scripts\Spark\adobe.parquet\_temporary\0\_temporary\attempt_20190411174312_0041_m_000000_0\part-00000-e2381e5d-0a9d-407e-8bcb-52d589f7569a-c000.gz.parquet
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:733)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:225)
at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:209)
at org.apache.hadoop.fs.RawLocalFileSystem.createOutputStreamWithMode(RawLocalFileSystem.java:307)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:296)
at org.apache.hadoop.fs.RawLocalFileSystem.create(RawLocalFileSystem.java:328)
at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.<init>(ChecksumFileSystem.java:398)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:461)
at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:440)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:911)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:892)
at org.apache.parquet.hadoop.util.HadoopOutputFile.create(HadoopOutputFile.java:74)
at org.apache.parquet.hadoop.ParquetFileWriter.<init>(ParquetFileWriter.java:248)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:390)
at org.apache.parquet.hadoop.ParquetOutputFormat.getRecordWriter(ParquetOutputFormat.java:349)
at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.<init>(ParquetOutputWriter.scala:37)
at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$$anon$1.newInstance(ParquetFileFormat.scala:151)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.newOutputWriter(FileFormatDataWriter.scala:120)
at org.apache.spark.sql.execution.datasources.SingleDirectoryDataWriter.<init>(FileFormatDataWriter.scala:108)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.org$apache$spark$sql$execution$datasources$FileFormatWriter$$executeTask(FileFormatWriter.scala:233)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:169)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$$anonfun$write$1.apply(FileFormatWriter.scala:168)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
... 1 more
My problem was that I had installed pyspark with pip from inside the Jupyter notebook (since that had worked for all other packages, and it seemed to install successfully). Apparently that does not install everything needed for pyspark to be fully functional. I installed it manually following this tutorial and everything works:
https://www.youtube.com/watch?v=WQErwxRTiW0
The cause might be one of the reasons below (a minimal Windows setup sketch follows this list):
The user does not have permission to save the file
HADOOP_HOME is not set as an environment variable (Windows)
winutils.exe is not available in the Hadoop bin folder (Windows)
SPARK_HOME is not set as an environment variable
A folder with the same name already exists
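For the Windows-specific items (HADOOP_HOME and winutils.exe), here is a minimal sketch of the setup. The C:\hadoop path and the input file name adobe20180615.csv are assumptions and must be adjusted, and winutils.exe has to be present in that bin folder already:
import os

# HADOOP_HOME and PATH must point at a folder containing bin\winutils.exe
# and must be set before the SparkSession (and the JVM behind it) is created.
os.environ["HADOOP_HOME"] = r"C:\hadoop"              # assumed install location
os.environ["PATH"] += os.pathsep + r"C:\hadoop\bin"   # folder holding winutils.exe

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("csv-to-parquet").getOrCreate()
df = spark.read.csv("adobe20180615.csv", header=True, inferSchema=True)  # assumed input file name
spark.conf.set("spark.sql.parquet.compression.codec", "gzip")
# mode("overwrite") also covers the case where a folder with the same name already exists.
df.write.mode("overwrite").parquet("adobe20180615.parquet")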
This question already has answers here: Spark Error: expected zero arguments for construction of ClassDict (for numpy.core.multiarray._reconstruct)
I have a DataFrame column with an array of strings. I've tried creating a udf and using numpy to permute (unit is the column name):
import numpy as np
from pyspark.sql.functions import udf
from pyspark.sql.types import ArrayType, StringType
def permute(row):
    return np.random.permutation(row)
udfPermute = udf(permute, ArrayType(StringType()))
print(units.withColumn("shuffled", udfPermute("unit")).head(5))
Py4JJavaError: An error occurred while calling o4246.collectToPython.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 871.0 failed 1 times, most recent failure: Lost task 0.0 in stage 871.0 (TID 1224, localhost, executor driver): net.razorvine.pickle.PickleException: expected zero arguments for construction of ClassDict (for numpy.core.multiarray._reconstruct)
at net.razorvine.pickle.objects.ClassDictConstructor.construct(ClassDictConstructor.java:23)
at net.razorvine.pickle.Unpickler.load_reduce(Unpickler.java:707)
at net.razorvine.pickle.Unpickler.dispatch(Unpickler.java:175)
at net.razorvine.pickle.Unpickler.load(Unpickler.java:99)
at net.razorvine.pickle.Unpickler.loads(Unpickler.java:112)
at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1$$anonfun$apply$6.apply(BatchEvalPythonExec.scala:156)
at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1$$anonfun$apply$6.apply(BatchEvalPythonExec.scala:155)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:826)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:826)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:333)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
at org.apache.spark.sql.Dataset$$anonfun$collectToPython$1.apply$mcI$sp(Dataset.scala:2745)
at org.apache.spark.sql.Dataset$$anonfun$collectToPython$1.apply(Dataset.scala:2742)
at org.apache.spark.sql.Dataset$$anonfun$collectToPython$1.apply(Dataset.scala:2742)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2765)
at org.apache.spark.sql.Dataset.collectToPython(Dataset.scala:2742)
at sun.reflect.GeneratedMethodAccessor77.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:280)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:745)
Caused by: net.razorvine.pickle.PickleException: expected zero arguments for construction of ClassDict (for numpy.core.multiarray._reconstruct)
at net.razorvine.pickle.objects.ClassDictConstructor.construct(ClassDictConstructor.java:23)
at net.razorvine.pickle.Unpickler.load_reduce(Unpickler.java:707)
at net.razorvine.pickle.Unpickler.dispatch(Unpickler.java:175)
at net.razorvine.pickle.Unpickler.load(Unpickler.java:99)
at net.razorvine.pickle.Unpickler.loads(Unpickler.java:112)
at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1$$anonfun$apply$6.apply(BatchEvalPythonExec.scala:156)
at org.apache.spark.sql.execution.python.BatchEvalPythonExec$$anonfun$doExecute$1$$anonfun$apply$6.apply(BatchEvalPythonExec.scala:155)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:377)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:231)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$2.apply(SparkPlan.scala:225)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:826)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$25.apply(RDD.scala:826)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
... 1 more
How can I accomplish this?
You are returning a NumPy array; you need to return a Python list instead.
Change your UDF as below and it should work:
def permute(row):
    return np.random.permutation(row).tolist()
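For completeness, here is a minimal sketch of the corrected UDF wired back into the call from the question (units and unit are the names used above):
import numpy as np
from pyspark.sql.functions import udf
from pyspark.sql.types import ArrayType, StringType

def permute(row):
    # Convert the NumPy array back to a plain Python list so Spark can
    # serialize it as ArrayType(StringType()).
    return np.random.permutation(row).tolist()

udfPermute = udf(permute, ArrayType(StringType()))
units.withColumn("shuffled", udfPermute("unit")).show(5, truncate=False)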
Someone was playing with chmod or chown in order to copy the WAR files into the TomEE installation directory without deleting the old files first. From that moment on we have had the following problem:
JAX-RS annotated methods that consume multipart no longer work; for example, the following method is never reached:
@POST
@Path("test")
@Consumes(MediaType.MULTIPART_FORM_DATA)
@Produces(MediaType.APPLICATION_JSON)
public String test(
        @Multipart(value = "a", required = false) Attachment a,
        @Multipart(value = "b", required = false) Attachment b) {
    System.out.println("THIS IS NEVER PRINTED");
    return "THIS IS NEVER CALLED RETURNED";
}
And after 20 seconds (I assume the 20 seconds come from the OS configuration) we got the following:
Dec 31, 2015 11:19:43 PM org.apache.cxf.jaxrs.impl.WebApplicationExceptionMapper toResponse
WARNING: javax.ws.rs.WebApplicationException: org.apache.cxf.interceptor.Fault: Read timed out
at org.apache.cxf.jaxrs.utils.JAXRSUtils.readFromMessageBody(JAXRSUtils.java:1040)
at org.apache.cxf.jaxrs.utils.JAXRSUtils.processParameter(JAXRSUtils.java:610)
at org.apache.cxf.jaxrs.utils.JAXRSUtils.processParameters(JAXRSUtils.java:574)
at org.apache.cxf.jaxrs.interceptor.JAXRSInInterceptor.processRequest(JAXRSInInterceptor.java:242)
at org.apache.cxf.jaxrs.interceptor.JAXRSInInterceptor.handleMessage(JAXRSInInterceptor.java:91)
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:263)
at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)
at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:240)
at org.apache.openejb.server.cxf.rs.CxfRsHttpListener.doInvoke(CxfRsHttpListener.java:227)
at org.apache.tomee.webservices.CXFJAXRSFilter.doFilter(CXFJAXRSFilter.java:94)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.filters.CorsFilter.handleNonCORS(CorsFilter.java:438)
at org.apache.catalina.filters.CorsFilter.doFilter(CorsFilter.java:179)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:241)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:208)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:220)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:122)
at org.apache.tomee.catalina.OpenEJBValve.invoke(OpenEJBValve.java:44)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:505)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:170)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:103)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:957)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:116)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:423)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1079)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:620)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:316)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.cxf.interceptor.Fault: Read timed out
at org.apache.cxf.interceptor.AttachmentInInterceptor.handleMessage(AttachmentInInterceptor.java:66)
at org.apache.cxf.jaxrs.ext.MessageContextImpl.createAttachments(MessageContextImpl.java:253)
at org.apache.cxf.jaxrs.ext.MessageContextImpl.get(MessageContextImpl.java:75)
at org.apache.cxf.jaxrs.impl.tl.ThreadLocalMessageContext.get(ThreadLocalMessageContext.java:38)
at org.apache.cxf.jaxrs.utils.multipart.AttachmentUtils.getMultipartBody(AttachmentUtils.java:90)
at org.apache.cxf.jaxrs.utils.multipart.AttachmentUtils.getAttachments(AttachmentUtils.java:95)
at org.apache.cxf.jaxrs.provider.MultipartProvider.readFrom(MultipartProvider.java:147)
at org.apache.cxf.jaxrs.utils.JAXRSUtils.readFromMessageBody(JAXRSUtils.java:1032)
... 34 more
Caused by: java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:152)
at java.net.SocketInputStream.read(SocketInputStream.java:122)
at org.apache.coyote.http11.InternalInputBuffer.fill(InternalInputBuffer.java:535)
at org.apache.coyote.http11.InternalInputBuffer.fill(InternalInputBuffer.java:504)
at org.apache.coyote.http11.InternalInputBuffer$InputStreamInputBuffer.doRead(InternalInputBuffer.java:566)
at org.apache.coyote.http11.filters.IdentityInputFilter.doRead(IdentityInputFilter.java:137)
at org.apache.coyote.http11.AbstractInputBuffer.doRead(AbstractInputBuffer.java:341)
at org.apache.coyote.Request.doRead(Request.java:431)
at org.apache.catalina.connector.InputBuffer.realReadBytes(InputBuffer.java:290)
at org.apache.tomcat.util.buf.ByteChunk.substract(ByteChunk.java:390)
at org.apache.catalina.connector.InputBuffer.readByte(InputBuffer.java:304)
at org.apache.catalina.connector.CoyoteInputStream.read(CoyoteInputStream.java:106)
at java.io.FilterInputStream.read(FilterInputStream.java:83)
at java.io.FilterInputStream.read(FilterInputStream.java:83)
at java.io.PushbackInputStream.read(PushbackInputStream.java:139)
at org.apache.cxf.attachment.AttachmentDeserializer.readTillFirstBoundary(AttachmentDeserializer.java:251)
at org.apache.cxf.attachment.AttachmentDeserializer.initializeRootMessage(AttachmentDeserializer.java:122)
at org.apache.cxf.attachment.AttachmentDeserializer.initializeAttachments(AttachmentDeserializer.java:92)
at org.apache.cxf.interceptor.AttachmentInInterceptor.handleMessage(AttachmentInInterceptor.java:64)
... 41 more
We thought it was because of the file size, so we tried again with two 0-byte files and got an ERR_CONNECTION_RESET.
We have tried everything, from a fresh TomEE installation to deleting everything in the project except the class with the method shown above.
Any idea what could have happened?
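One way to narrow it down is to call the endpoint directly, bypassing any browser or proxy. Here is a minimal sketch using Python's requests library; the URL is hypothetical and must point at the deployed resource:
import requests

# Post two tiny multipart parts named "a" and "b", matching the @Multipart
# parameters of the test method above. If the server still times out while
# reading, the original client is not the cause.
url = "http://localhost:8080/myapp/rest/test"   # hypothetical deployment URL
files = {
    "a": ("a.txt", b"hello a", "text/plain"),
    "b": ("b.txt", b"hello b", "text/plain"),
}
response = requests.post(url, files=files, timeout=30)
print(response.status_code, response.text)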
While deploying an EAR in JBoss 6 it throws a RuntimeException. Please help me figure out how to solve this error.
11:14:14,684 ERROR [ProfileDeployAction] Failed to add deployment: FirstGen.ear: org.jboss.deployers.spi.DeploymentException: Exception determining structure: AbstractVFSDeployment(FirstGen.ear)
at org.jboss.deployers.spi.DeploymentException.rethrowAsDeploymentException(DeploymentException.java:49) [:2.2.0.GA]
at org.jboss.deployers.structure.spi.helpers.AbstractStructuralDeployers.determineStructure(AbstractStructuralDeployers.java:85) [:2.2.0.GA]
at org.jboss.deployers.plugins.main.MainDeployerImpl.determineStructure(MainDeployerImpl.java:1106) [:2.2.0.GA]
at org.jboss.deployers.plugins.main.MainDeployerImpl.determineDeploymentContext(MainDeployerImpl.java:417) [:2.2.0.GA]
at org.jboss.deployers.plugins.main.MainDeployerImpl.addDeployment(MainDeployerImpl.java:367) [:2.2.0.GA]
at org.jboss.deployers.plugins.main.MainDeployerImpl.addDeployment(MainDeployerImpl.java:277) [:2.2.0.GA]
at org.jboss.system.server.profileservice.deployers.MainDeployerPlugin.addDeployment(MainDeployerPlugin.java:77) [:6.0.0.Final]
at org.jboss.profileservice.dependency.ProfileControllerContext$DelegateDeployer.addDeployment(ProfileControllerContext.java:133) [:0.2.2]
at org.jboss.profileservice.dependency.ProfileDeployAction.deploy(ProfileDeployAction.java:132) [:0.2.2]
at org.jboss.profileservice.dependency.ProfileDeployAction.installActionInternal(ProfileDeployAction.java:94) [:0.2.2]
at org.jboss.kernel.plugins.dependency.InstallsAwareAction.installAction(InstallsAwareAction.java:54) [jboss-kernel.jar:2.2.0.GA]
at org.jboss.kernel.plugins.dependency.InstallsAwareAction.installAction(InstallsAwareAction.java:42) [jboss-kernel.jar:2.2.0.GA]
at org.jboss.dependency.plugins.action.SimpleControllerContextAction.simpleInstallAction(SimpleControllerContextAction.java:62) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.action.AccessControllerContextAction.install(AccessControllerContextAction.java:71) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractControllerContextActions.install(AbstractControllerContextActions.java:51) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractControllerContext.install(AbstractControllerContext.java:379) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractController.install(AbstractController.java:2044) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractController.incrementState(AbstractController.java:1083) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractController.executeOrIncrementStateDirectly(AbstractController.java:1322) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:1246) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractController.resolveContexts(AbstractController.java:1139) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:939) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.dependency.plugins.AbstractController.change(AbstractController.java:654) [jboss-dependency.jar:2.2.0.GA]
at org.jboss.profileservice.dependency.ProfileActivationWrapper$BasicProfileActivation.start(ProfileActivationWrapper.java:190) [:0.2.2]
at org.jboss.profileservice.dependency.ProfileActivationWrapper.start(ProfileActivationWrapper.java:87) [:0.2.2]
at org.jboss.profileservice.dependency.ProfileActivationService.activateProfile(ProfileActivationService.java:215) [:0.2.2]
at org.jboss.profileservice.dependency.ProfileActivationService.activate(ProfileActivationService.java:159) [:0.2.2]
at org.jboss.profileservice.bootstrap.AbstractProfileServiceBootstrap.activate(AbstractProfileServiceBootstrap.java:112) [:0.2.2]
at org.jboss.profileservice.resolver.BasicResolverFactory$ProfileResolverFacade.deploy(BasicResolverFactory.java:87) [:0.2.2]
at org.jboss.profileservice.bootstrap.AbstractProfileServiceBootstrap.start(AbstractProfileServiceBootstrap.java:91) [:0.2.2]
at org.jboss.system.server.profileservice.bootstrap.BasicProfileServiceBootstrap.start(BasicProfileServiceBootstrap.java:132) [:6.0.0.Final]
at org.jboss.system.server.profileservice.bootstrap.BasicProfileServiceBootstrap.start(BasicProfileServiceBootstrap.java:56) [:6.0.0.Final]
at org.jboss.bootstrap.impl.base.server.AbstractServer.startBootstraps(AbstractServer.java:827) [jboss-bootstrap-impl-base.jar:2.1.0-alpha-5]
at org.jboss.bootstrap.impl.base.server.AbstractServer$StartServerTask.run(AbstractServer.java:417) [jboss-bootstrap-impl-base.jar:2.1.0-alpha-5]
at java.lang.Thread.run(Thread.java:662) [:1.6.0_27]
Caused by: java.lang.RuntimeException: Error determining structure: FirstGen.ear
at org.jboss.deployment.EARStructure.doDetermineStructure(EARStructure.java:300) [:6.0.0.Final]
at org.jboss.deployers.vfs.plugins.structure.AbstractVFSArchiveStructureDeployer.determineStructure(AbstractVFSArchiveStructureDeployer.java:60) [:2.2.0.GA]
at org.jboss.deployers.vfs.plugins.structure.StructureDeployerWrapper.determineStructure(StructureDeployerWrapper.java:73) [:2.2.0.GA]
at org.jboss.deployers.vfs.plugins.structure.VFSStructuralDeployersImpl.doDetermineStructure(VFSStructuralDeployersImpl.java:197) [:2.2.0.GA]
at org.jboss.deployers.vfs.plugins.structure.VFSStructuralDeployersImpl.determineStructure(VFSStructuralDeployersImpl.java:222) [:2.2.0.GA]
at org.jboss.deployers.structure.spi.helpers.AbstractStructuralDeployers.determineStructure(AbstractStructuralDeployers.java:77) [:2.2.0.GA]
... 33 more
Caused by: org.jboss.xb.binding.JBossXBException: Failed to parse source: xml_stream#14,15
at org.jboss.xb.binding.parser.sax.SaxJBossXBParser.parse(SaxJBossXBParser.java:224) [jbossxb.jar:2.0.3.GA]
at org.jboss.xb.binding.parser.sax.SaxJBossXBParser.parse(SaxJBossXBParser.java:193) [jbossxb.jar:2.0.3.GA]
at org.jboss.xb.binding.UnmarshallerImpl.unmarshal(UnmarshallerImpl.java:171) [jbossxb.jar:2.0.3.GA]
at org.jboss.deployment.EARStructure.doDetermineStructure(EARStructure.java:169) [:6.0.0.Final]
... 38 more
Caused by: org.xml.sax.SAXException: The content of element type "application" must match "(icon?,display-name,description?,module+,security-role*)". # unknown[14,15]
at org.jboss.xb.binding.parser.sax.SaxJBossXBParser.error(SaxJBossXBParser.java:416) [jbossxb.jar:2.0.3.GA]
at org.apache.xerces.util.ErrorHandlerWrapper.error(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.dtd.XMLDTDValidator.handleEndElement(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.dtd.XMLDTDValidator.endElement(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.XMLNSDocumentScannerImpl.scanEndElement(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) [xercesImpl.jar:6.0.0.Final]
at org.jboss.xb.binding.parser.sax.SaxJBossXBParser.parse(SaxJBossXBParser.java:209) [jbossxb.jar:2.0.3.GA]
... 41 more
The order of the display-name and description elements in application.xml was changed between J2EE 1.3 and 1.4 (with the move from DTDs to schema).
This message:
The content of element type "application" must match "(icon?,display-name,description?,module+,security-role*)"
tells me that you have an incorrectly formed J2EE 1.3 application.xml file with:
<!DOCTYPE application PUBLIC
"-//Sun Microsystems, Inc.//DTD J2EE Application 1.3//EN"
"http://java.sun.com/dtd/application_1_3.dtd">
at the top.
I suggest you update this to a more modern schema such as:
<application xmlns="http://java.sun.com/xml/ns/javaee"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/javaee
http://java.sun.com/xml/ns/javaee/application_5.xsd"
version="5">
for JBossAS 6.
The order of your elements will then be correct.
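If you want to double-check the element order before redeploying, here is a small sketch (assuming the EAR is unpacked and the descriptor sits at META-INF/application.xml):
import xml.etree.ElementTree as ET

tree = ET.parse("META-INF/application.xml")   # assumed path inside the unpacked EAR
root = tree.getroot()
# Strip any namespace prefix so the tag names read the same for DTD and schema variants.
order = [child.tag.split("}")[-1] for child in root]
print(order)
# Under the J2EE 1.3 DTD the required order is:
# icon?, display-name, description?, module+, security-role*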